59 Data Protection Predictions from 33 Experts for 2024

For our 5th annual Insight Jam LIVE!, Solutions Review editors sourced this resource guide of data protection predictions for 2024 from Insight Jam, its new community of enterprise tech experts.

Note: Data protection predictions are listed in the order we received them.

Data Protection Predictions from Experts for 2024


Bobby Cornwell, Vice President Strategic Partner Enablement & Integration at SonicWall

Expect to See New Regulations for Reporting Breaches

“In 2024, incoming cybersecurity regulations will force businesses to be more transparent about their breaches and attacks. Forthcoming legislation such as the EU’s NIS2 Directive and the Cyber Resilience Act will impose more stringent standards for cyber protection and establish clear reporting timelines in the event of a breach. As these directives take effect, businesses will be made to share with their partners and suppliers early identifications of system vulnerabilities or face fines. The aim of this is to prevent cybercriminals from inflicting widespread damage across multiple businesses. In 2024, it will be crucial to optimize the transparency afforded by these regulations, and by dragging cybercriminals out into the open, authorities can more effectively curtail their illicit activity.”

Samir Zaveri, Practice Manager – Package Led Transformation at Wipro

Future of Data Protection in Hybrid Cloud Deployments

“On one hand, hybrid cloud adoption will continue to grow exponentially, and on the other hand, organizations are looking to repatriate workloads back to private clouds. Data protection in a single cloud environment was already a challenge, and with data distributed across multiple clouds and cloud service providers, the challenge has grown even more. Today, organizations maintain a one-to-one mapping between source clouds and data backup and disaster recovery sites, which leads to multiple standard operating procedures and multiple points of data theft, along with inconsistent recovery SLAs.”

To overcome these challenges, organizations will need to adopt new capabilities

“Full workload portability is one of them. With portability, organizations have the ability to deploy workloads across different cloud service providers without having to adapt to each environment and with no changes needed to the application or the infrastructure. Portability will give organizations a way to consolidate multiple sources into one single data backup and disaster recovery site, as well as consolidate standard operating procedures (SOPs), all with consistent recovery SLAs.”

Eyal Arazi, Cloud Security Manager at Radware

Migration to the cloud will slow down as companies reverse course

“During the past few years, there has been a rapid adoption of multi-cloud strategies, with organizations often using three, four and even five different cloud environments. The problem, however, is that while organizations accumulated more cloud platforms, they ignored the problems of cross-cloud security consistency, visibility and management.

A recent survey commissioned by Radware suggests that now organizations are starting to reverse course. Despite the ongoing discussion about “the great cloud migration” and the abandonment of on-premises environments, approximately three quarters of organizations not only still use these environments but expect usage to increase in the next 12 months. Based on the report, look for more companies to consolidate their cloud environments from three or more to one or two platforms in 2024. While there will be consolidation around the cloud, most organizations will continue to maintain a combination of public cloud and on prem platforms, resulting in a hybrid environment.”

Andrew Moloney, Chief Strategy Officer at SoftIron

Cloud strategy moves from “fashionable to rational” 

“Moving from an era when proposing a full-scale migration to the public cloud was a sure-fire way to promotion, the current maturation of the market, economic conditions, shifting performance requirements, and a dramatic simplification in building private cloud-native infrastructure will see a much more rational approach. Underpinning this will be a broader understanding of the difference between cloud “the model” and cloud “the place”, where how applications are built is decoupled from where they operate.”

A sovereign cloud shake out 

“We predict that many of the “pseudo” sovereign cloud projects – those that rely on obfuscated infrastructure and/or local third parties to operate them to provide a veneer of sovereignty – will not gain traction. AWS, late to the party to offer such a service and having recently launched its European Sovereign Cloud, may well be delivering too little, too late. Instead, those that offer true sovereign resilience – enabling nation-states to build, operate, inspect, and audit their own infrastructure on their own terms and turf – will become the preferred option.”

VMware acquisition accelerates the adoption of cloud-native infrastructure 

“Forced into seeking credible alternatives to VMware for virtualized infrastructure in on-prem data centers, existing VMware customers will take the opportunity to revisit their cloud strategy and make the rational decision to shift to a fully cloud-native infrastructure – one able to consolidate and simplify existing virtualized on-prem workloads while delivering true private cloud going forward. Finally, they will be able to deliver what VMware and Nutanix have promised for years but have never quite been able to deliver.”

A renaissance for Private Cloud 

“Partially related to our VMware prediction, and to the availability of cloud-native infrastructure that changes the economics of private cloud, the evolution of a more rational cloud strategy will see Cloud Centers of Excellence (CCoEs) and FinOps professionals grasp the opportunity to get an apples-to-apples comparison not just across public clouds, but now between public and private cloud. New open standards released in 2024, such as FOCUS, will help to enable this.

At the same time, shifts to distributed cloud architectures, enabling workloads to move from the edge to the core and back, will elevate the need to make private clouds more than just basic virtualized infrastructure.”

The death of “Hyper-Converged” 

“Hyper-converged infrastructure has already been effectively abandoned by its greatest exponent, Nutanix. The independent and elastic scaling limitations inherent in these architectures, plus their failure to fully deliver a cloud-native environment without significant integrations and third parties, will see hyperconvergence relegated from the data center to smaller, departmental-type solutions only.”

Software-defined fades; hardware and hard tech get sexy again

“Fuelled by the hype around AI and the investments being made in the processing power to support it, we predict we’ll see a resurgence of interest in hardware innovation and in the hard tech required to support it. A new generation of start-ups will disrupt the innovation inertia that has characterized IT infrastructure design over the last couple of decades.”

Thomas Chauchefoin, Vulnerability Research at Sonar

AI-Assisted attacks to become more sophisticated and automated

“IT security attacks leveraging AI are expected to become more sophisticated and automated. Hackers will likely use AI to analyze vast amounts of data and launch targeted attacks. AI-driven phishing attacks, capable of generating highly convincing and personalized messages that trick users into revealing sensitive information, may increase. Furthermore, AI-powered malware could adapt and evolve in real time, making it more challenging for traditional antimalware detection systems to keep up.”

Jim Liddle, Chief Innovation Officer at Nasuni

2024 will be a make-or-break year for data intelligence  

“Following the booming interest in AI in 2023, enterprises will face increased pressure from their boards to leverage AI to gain a competitive edge. That rush for an AI advantage is surfacing deeper data infrastructure issues that have been mounting for years. Before they can integrate AI effectively, organizations will first have to address how they collect, store, and manage their unstructured data, particularly at the edge.

AI doesn’t work in a vacuum and it’s just one part of the broader data intelligence umbrella. Many organizations have already implemented data analytics, machine learning and AI into their sales, customer support, and similar low-hanging initiatives, but struggle to integrate the technology in more sophisticated, high-value applications. 

Visibility, for example, is a crucial and often-overlooked first step towards data intelligence. A shocking number of companies store massive volumes of data simply because they don’t know what’s in it or whether they need it. Is the data accurate and up-to-date? Is it properly classified and ‘searchable’? Is it compliant? Does it contain personally identifiable information (PII), protected health information (PHI), or other sensitive information? Is it available on-demand or archived?

In the coming year, companies across the board will be forced to come to terms with the data quality, governance, access, and storage requirements of AI before they can move forward with digital transformation or improvement programs to give them the desired competitive edge.” 
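
Liddle’s visibility questions are, at bottom, an inventory problem. As a minimal sketch of that first step (not any vendor’s product), the Python below walks a directory tree and flags files that appear to contain PII using a few illustrative regex patterns; the path and the patterns are assumptions for demonstration only.

```python
import re
from pathlib import Path

# Hypothetical patterns: a real classification tool would use far more robust
# detectors (and cover PHI, card numbers, national ID formats, and so on).
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def scan_tree(root: str) -> dict[str, dict[str, int]]:
    """Walk a directory and count likely-PII matches per file."""
    findings: dict[str, dict[str, int]] = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file: skip it rather than fail the whole scan
        hits = {name: len(pat.findall(text)) for name, pat in PII_PATTERNS.items()}
        hits = {name: count for name, count in hits.items() if count}
        if hits:
            findings[str(path)] = hits
    return findings

if __name__ == "__main__":
    # "./shared_data" is an illustrative path, not a real mount point.
    for file, hits in scan_tree("./shared_data").items():
        print(file, hits)
```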

2024 will be the year of reckoning for both ransomware and compliance 

“The risk of ransomware and sophisticated attacks is ever-growing and will continue to spread internationally in 2024. Preventing the theft, encryption, misuse, or exposure of sensitive data will remain a daily concern for organizations indefinitely. Multi-layer protection has quickly become a matter of hygiene and even companies that invested in sophisticated, global ransomware protection products will need a belt and braces approach in the form of network, application, and access security, coupled with rapid data recovery solutions. 

Ransomware has typically been more prevalent in the US, with larger organizations and their larger data sets presenting more attractive targets for bad actors. In 2024, we’ll see more ransomware incidents in the UK as government agencies, health services, and critical infrastructure in both countries continue to lack the technology and funding to build adequate data protection and recovery capabilities. 

Organizations that haven’t addressed their data protection and recovery posture are now risking both security and compliance headaches, as regulatory penalties and recovery costs often outmatch ransom payouts. Europe still leads in data governance and regulation with the likes of GDPR, but legislation like the California Consumer Privacy Act (CCPA) is quickly spreading across the US. By delaying their investment in protection and compliance solutions until forced to, many large organizations will soon face the possibility of steep penalties, ransom demands, and business disruption simultaneously.” 

Russ Kennedy, Chief Product Officer at Nasuni

Enterprises will embrace hybrid infrastructure or fall behind 

“The next revolution in data will occur at the edge. After years of conflicting definitions and uncertainty, today’s leading businesses are realizing the necessity of truly hybrid infrastructure. To remain competitive in a data-driven world, enterprises need high performance processing at the edge, where data is generated, in combination with the scale, capacity, and advanced tools available in the cloud.  

Traditionally, large companies have used legacy storage vendors and traditional backup solutions to store and protect petabyte volumes of data. These legacy infrastructures are a performance bottleneck and can’t support the pace of growth, as analyst firm William Blair recently highlighted.

Over the next few years, we’ll see more organizations realize it’s not one or the other, but a combination of edge and cloud storage. According to Gartner, 50 percent of critical infrastructure applications will reside outside of the public cloud through 2027. Manufacturers, for example, need to quickly capture and consolidate the critical data coming from their physical systems and processes across the world, while keeping and leveraging that data for analytics year after year. Ready or not, we’ll see this edge-cloud mechanism force organizations to adopt and embrace truly hybrid infrastructure and ultimately transform their ability to drive more effective innovations and respond in a more agile way to customers’ evolving needs.”

Organizations will continue to grapple with data infrastructure to support hybrid work long after the pandemic 

“The genie is out of the bottle and hybrid or remote is here to stay. Though the greatest economic upheavals have hopefully passed, we’re seeing the residual effects. Many companies are still trying to design or optimize infrastructure to accommodate hybrid work and reconfigured supply chains.  

Though organizations worked quickly to spin up the necessary systems, they simply weren’t designed to support thousands of remote workers. Inevitably, workers started using whatever tools necessary to collaborate, and many businesses saw a significant increase in shadow IT tools outside of sanctioned corporate IT programs. As we enter 2024, IT organizations are still grappling with the effects of remote work on top of mounting pressure to reduce costs and regain control of their disparate and sprawling corporate data assets. 

Some have tried to remedy the issue by mandating employees back into the office, but to attract and retain appropriate talent, businesses will need to provide enhanced multi-team collaboration options and the data infrastructure to scale it. Those that have the right data access solutions in place to streamline processes and remote collaboration will succeed in the hybrid work economy.” 

Matt Waxman, Senior Vice President and GM for Data Protection at Veritas Technologies

The first end-to-end AI-powered robo-ransomware attack will usher in a new era of cybercrime pain for organizations

“Nearly two-thirds (65 percent) of organizations experienced a successful ransomware attack over the past two years in which an attacker gained access to their systems. While startling in its own right, this is even more troubling when paired with recent developments in artificial intelligence (AI). Already, tools like WormGPT make it easy for attackers to improve their social engineering with AI-generated phishing emails that are much more convincing than those we’ve previously learned to spot. In 2024, cybercriminals will put AI into full effect with the first end-to-end AI-driven autonomous ransomware attacks. Beginning with robocall-like automation, eventually AI will be put to work identifying targets, executing breaches, extorting victims and then depositing ransoms into attackers’ accounts, all with alarming efficiency and little human interaction.”

Targeted cell-level data corruption will make ransomware more dangerous than ever

“As more organizations become better prepared to recover from ransomware attacks without paying ransoms, cybercriminals will be forced to continue evolving. In 2024, we expect hackers to turn to targeted cell-level data corruption attacks—code secretly implanted deep within a victim’s database that lies in wait to covertly alter or corrupt specific but undisclosed data if the target refuses to pay a ransom. The real threat is that victims will not know what data—if any, the hackers could be bluffing—has been altered or corrupted until after the repercussions set in, thus effectively rendering all their data untrustworthy. The only solution is to ensure they have secure copies of their data that they are 100 percent certain are uncorrupted and can be rapidly restored.”

Adaptive data protection will autonomously fight hackers without organizations lifting a finger

“More than two-thirds of organizations are looking to boost their cyber resiliency with the help of AI. But, given AI’s dual nature as a force for both good and bad, the question going forward will be whether organizations’ AI-powered protection can evolve ahead of hackers’ AI-powered attacks. Part of that evolution in 2024 will be the emergence of AI-driven adaptive data protection. AI tools will be able to constantly monitor for changes in behavioral patterns to see if users might have been compromised. If the AI detects unusual activity, it can respond autonomously to increase the level of protection, for example by initiating more regular backups, sending them to differently optimized targets, and overall creating a safer environment in defense against bad actors.”
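
As a rough illustration of the adaptive pattern Waxman describes (a sketch, not Veritas’s implementation), a scheduler might shorten the backup interval whenever observed change activity drifts far above its recent baseline. The thresholds, window size, and change-rate feed below are assumed for demonstration.

```python
import statistics
from collections import deque

class AdaptiveBackupScheduler:
    """Toy example: shorten the backup interval when change activity looks anomalous."""

    def __init__(self, base_interval_min: int = 240, window: int = 24):
        self.base_interval_min = base_interval_min
        self.history: deque[float] = deque(maxlen=window)  # recent change rates

    def next_interval(self, changes_per_hour: float) -> int:
        """Return minutes until the next backup, given the latest observed change rate."""
        if len(self.history) >= 5:
            mean = statistics.mean(self.history)
            stdev = statistics.pstdev(self.history) or 1.0
            z = (changes_per_hour - mean) / stdev
            if z > 3:        # strongly anomalous: back up much more often
                interval = self.base_interval_min // 8
            elif z > 2:      # mildly anomalous: tighten the schedule
                interval = self.base_interval_min // 2
            else:
                interval = self.base_interval_min
        else:
            interval = self.base_interval_min  # not enough history yet
        self.history.append(changes_per_hour)
        return max(interval, 15)  # never schedule more often than every 15 minutes

# A sudden burst of changes (for example, mass encryption) tightens the schedule.
sched = AdaptiveBackupScheduler()
for rate in [100, 110, 95, 105, 98, 102, 5000]:
    print(rate, "->", sched.next_interval(rate), "minutes")
```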

Generative AI-focused data compliance regulations will impact adoption

“For all its potential use cases, generative AI also carries heavy risks, not the least of which are data privacy concerns. Organizations that fail to put proper guardrails in place to stop employees from potentially breaching existing privacy regulations through the inappropriate use of generative AI tools are playing a dangerous game that is likely to bring significant consequences. Over the past 12 months, the average organization that experienced a data breach resulting in regulatory noncompliance shelled out more than US$336,000 in fines. Right now, most regulatory bodies are focused on how existing data privacy laws apply to generative AI, but as the technology continues to evolve, expect generative AI-specific legislation in 2024 that applies rules directly to these tools and the data used to train them.”

For every organization that makes the jump to the cloud, another will develop an on-premises datacenter as hybrid cloud equilibrium sets in

“The percentage of data stored in the cloud versus on-premises has steadily grown to the point where it is estimated that 57 percent of data is now stored in the cloud, with 43 percent on-premises. That growth has come from both mature companies with on-premises foundations making the jump to the cloud, and newer companies building their infrastructure in the cloud from the ground up. But both categories of organizations are learning that, for all its benefits, the cloud is not ideally suited for all applications and data. This is leading many companies that made the jump to the cloud to partially repatriate their data, and cloud-native companies to supplement their cloud infrastructure with on-premises computing and storage resources. As a result, in 2024, we’ll see hybrid cloud equilibrium—for every organization that makes the move to the cloud, another will build an on-premises datacenter.”

Cassius Rhue, VP of Customer Experience at SIOS Technology

Application high availability becomes universal

“As reliance on applications continues to rise, IT teams will be pressured to deliver efficient high availability for applications once considered non-essential. Once reserved for mission-critical systems, such as SQL Server, Oracle, SAP, and HANA, application high availability – typically delivered with HA clustering technology – will become a requirement for more systems, applications, and services throughout the enterprise.”

Cloud and OS agnostic high availability becomes an expected requirement for most applications

“IT teams will look for application HA solutions that are consistent across operating systems and clouds, reducing complexity and improving cost-efficiency. As the need for HA rises, companies running applications in both on-prem and cloud environments, as well as those running applications in both Windows and Linux environments, will look to streamline their application environments with HA solutions that deliver a consistent user interface across all of their environments, along with matching cloud and OS technical support and services from the HA vendor.”

The trend toward migration to SAP HANA is likely to continue in 2024

“The mandatory 2027 migration will push more companies to migrate to SAP HANA. As companies migrate to SAP HANA there will be an increased need for more sophisticated and flexible high availability and disaster recovery solutions that help them bridge the gap between existing systems and the new, more modern systems that take advantage of SAP HANA’s capabilities. Organizations will look for HA solutions that help them find ways to take advantage of emerging technologies and accelerate digital transformation, while not losing the HA and DR capabilities that continue to arise.”

Automation becomes more common in high availability and disaster recovery efforts as data and analytics increase complexity

“As the volume and variety of data as well as the channels through which data are collected increase, organizations will require more information about why faults/failures occurred and how to address potential issues. Automation and orchestration tools will play a central role, streamlining root cause analysis, improving intelligent responses, and enhancing disaster recovery processes to further reduce downtime and enhance data availability.”

The focus on data security and compliance will intensify

“The focus on data retention, security, and access controls will intensify prompting organizations to integrate enhanced security measures deeper into their high availability and disaster recovery solutions, services, and strategies. As the volume and variety of data as well as the channels through which data are collected and processed increase, organizations will require more security measures to be baked into their solutions.”

Sophisticated storage and DR strategies will become crucial to the demands of an increasingly dynamic and data-driven business landscape

“As the volume of unstructured data continues to surge, storage solutions are expected to prioritize scalability, tiered performance, and accessibility. Enterprises will also adopt more sophisticated and resilient DR strategies using multiple high availability (HA) nodes, and DR technologies that understand the complexity of tiered storage solutions. Cloud storage is expected to continue its ascendancy, with organizations increasingly relying on scalable and flexible cloud solutions to meet their expanding data requirements. At the same time, a growing number of companies will look to move workloads out of the cloud to on-prem environments in favor of more predictable costs and greater control over their environments.”

Justin Borgman, Co-Founder and CEO at Starburst

All things make a comeback and on-prem storage is having a resurgence

“Companies including Dell have heavily invested in their EMC portfolio. Enterprise customers will continue to recognize that enhancing on-premise storage hardware presents the faster path to mitigating rising cloud expenses. This modernization will allow companies to manage data gravity for on-premise data that cannot be easily relocated, ensuring a more efficient approach.”

Haoyuan Li, Founder and CEO at Alluxio

Hybrid and Multi-cloud Acceleration

“In 2024, the adoption of hybrid and multi-cloud strategies is expected to accelerate, both for strategic and tactical reasons. From a strategic standpoint, organizations will aim to avoid vendor lock-in and will want to retain sensitive data on-premises while still utilizing the scalable resources offered by cloud services. Tactically, due to the continued scarcity of GPUs, companies will seek to access GPUs or specific resources and services that are unique to certain cloud providers. A seamless combination of cross-region and cross-cloud services will become essential, enabling businesses to enhance performance, flexibility, and efficiency without compromising data sovereignty.”

From Specialized Storage to Optimized Commodity Storage for AI Platform

“The growth of AI workloads has driven the adoption of specialized high-performance computing (HPC) storage optimized for speed and throughput. But in 2024, we expect a shift towards commoditized storage. Cloud object stores, NVMe flash, and other storage solutions will be optimized for cost-efficient scalability. The high cost and complexity of specialized storage will give way to flexible, cheaper, easy-to-manage commodity storage tailored for AI needs, allowing more organizations to store and process data-intensive workloads using cost-effective solutions.”

Jimmy Tam, CEO at Peer Software

Active-Passive High Availability Practices Evolve – Active-Active Has its Moment

“Without continuous availability and real-time access to data, businesses risk losing out to competitors, making decisions with inaccurate information, and more. So it is no wonder that CIOs are starting to demand more from their data centers. In the coming 12 months, it is likely that many IT leaders will start to adopt active-active capabilities, improving performance by distributing the workload across several nodes to allow access to the resources of all servers. 

By moving away from active-passive technologies that simply don’t make the most of the available servers and often require manual intervention during outages, CIOs will ensure that data is actionable wherever it resides, is as close as possible to the end-user for performance, and that the load of data processing is spread across all compute and storage nodes whether it be at the edge, in the data center, or in the cloud.”

The storage industry will start to productize AI and ML 

“AI and Machine Learning have so much promise, but they’re not being adopted as quickly as anyone in the industry anticipated. There’s a clear reason why: users simply don’t know how to realize the technologies’ full potential. Beyond ChatGPT, which is easy to use and incredibly popular, there’s no real out-of-the-box product for enterprise storage customers. So unless organizations have a data scientist on hand to help them navigate the intricacies of AI and ML, they’re very likely to hold off when it comes to implementing any kind of solution.

This presents a great opportunity for the storage industry and the smart companies are already starting to think about it. Through 2024, we’ll see the beginning of the productization of AI and ML. Ready-to-use packages will be developed so that users can easily understand what the technologies can help them achieve, while being straightforward to set up and run. Then watch, as AI and ML adoption increases.”

Virtual Desktop Infrastructure is here to stay – but much will move back on-premise

“When Covid hit, VDIs were the reason many of us could continue to work. They offered users a flexible, consistent experience from wherever they logged in and became a lynchpin for organizations during the days of lockdown. But there was an issue: the hardware was difficult to get hold of. And the urgency we all became so used to during the pandemic meant there was no time to wait for the supply chain to right itself, so CIOs turned to the cloud. 

Don’t get me wrong, the cloud has clear benefits. It is easy to implement, and it is elastic in nature, quickly responding to and growing with our needs. But it can be very expensive and, because cloud providers tend to charge for each transaction, costs can be difficult to predict. As availability returns to the hardware supply chain, we will see a shift towards migrating highly transactional workloads back on-premise. Unhappy with writing blank checks, CFOs will rightly start to ask CIOs to demonstrate ROI and explain the cost difference between cloud and on-premise.”

JB Baker, Vice President of Marketing & Product Management at ScaleFlux

Sustainable Data Storage Becomes a Priority

“With sustainability rising as an urgent priority across industries, data storage solutions will be under increasing pressure to reduce their environmental impact. Organizations are ramping up investments in energy-efficient technologies to meet emissions requirements and goals. Data storage, projected to account for 14 percent of the global carbon footprint by 2040, will be a key focus area.

To minimize the footprint of the data center, storage leaders will need to look beyond device-level specs and take a solution-wide view. The criteria will expand to encompass data compression, energy expenditure, workload optimization, and more. The goal is to maximize efficiency and minimize power consumption across the storage infrastructure. As sustainability becomes a competitive differentiator, we will see rapid innovation in “green” data storage technologies, architectures, and management techniques. The storage domain will play a critical role in driving the sustainability transformation.”

Anand Babu, Co-Founder & CEO at MinIO

Unstructured data becomes a core enterprise challenge

“Over the last few years, we have seen explosive growth in the semi-structured data world (log files, models, snapshots, artifactory code) which has, in turn, driven the growth of object storage.”

“In 2024, we’ll see an enterprise explosion of truly unstructured data (audio, video, meeting recordings, talks, presentations) as AI applications take flight. This is highly ‘learnable’ content from an AI perspective and gathering it into the AI data lake will greatly enhance the intelligence capacity of the enterprise as a whole, but it also comes with unique challenges.”

“There are distinct challenges with maintaining performance at tens of petabytes. Those generally cannot be solved with traditional SAN/NAS solutions — they require the attributes of a modern, highly performant object store. This is why most AI/ML technologies (e.g., OpenAI, Anthropic, Kubeflow) leverage object stores, and why most databases are moving to be object storage centric.”

Jon France, CISO at ISC2

We’ll see an evolution, rather than a revolution of regulations

“The regulatory landscape will continue to stay hot – I think we’ll see more regulations governing AI and privacy in particular, and we’ll likely see more backlash around reporting requirements and a push for agencies to define what should actually be reported and at what thresholds of materiality. However, I don’t see a major overhaul coming. Instead, I think what we’ll see is sectors grappling with the tangible effects of the requirements that have been introduced. We’re no longer looking at these regulations as being on the horizon…in 2024, they’ll have to be adhered to. With this, I hope to see increased harmonization of regulations globally, so that multinational companies don’t run into navigational issues of not knowing which regulations and policies to follow and which don’t apply. We’re starting to see increased communication on a global scale, but we’re not there yet. It may be wishful thinking, but I predict we’ll see major global powers collaborating on what a cyber secure world should look like, and making policy decisions based on those discussions.”

Mark Cassetta, Chief Product Officer at Axiomatics

With recent legislation, the security market is poised to shift focus

“2023 saw a number of notable startups emerge, many incorporating Generative AI into their offerings. As fast as enterprises have started to adopt the technology, legislation has been discussed and shared to try to further protect the US identity and economy. This means that in 2024, just as the mobile (MDM) and cloud platform (CASB) shifts created their own security categories, we will see a security category very quickly and formally emerge for Generative AI.”

Giorgio Regni, CTO at Scality

HDDs will live on, despite predictions of a premature death

“Some all-flash vendors prognosticate the end of spinning disk (HDD) media in the coming years. While flash media and solid state drives (SSDs) have clear benefits when it comes to latency, are making major strides in density, and the cost per GB is declining, we see HDDs holding a 3-5x density/cost advantage over high-density SSDs through 2028.

Therefore, the current call for HDD end-of-life is akin to the tape-is-dead arguments from 20 years ago. In a similar way, HDDs will likely survive for the foreseeable future as they continue to provide workload-specific value.”

End users will discover the value of unstructured data for AI

“The meteoric rise of large language models (LLMs) over the past year highlights the incredible potential they hold for organizations of all sizes and industries. They primarily leverage structured, or text-based, training data. In the coming year, businesses will discover the value of their vast troves of unstructured data, in the form of images and other media.

This unstructured data will become a useful source of insights through AI/ML tooling for image recognition applications in healthcare, surveillance, transportation, and other business domains. Organizations will store petabytes of unstructured data in scalable “lakehouses” that can feed this unstructured data to AI-optimized services in the core, edge and public cloud as needed to gain insights faster.”

Ransomware detection will be the next advancement in data protection solutions

“In recent years, the tech industry has made tremendous strides in protecting data against all manner of threats, including increasingly destructive malware and ransomware. This is exemplified by the rise of immutability in data protection and data storage solutions, especially for backup data.

While data protection and restoration are a major cornerstone that serves as a critical last line of defense in a layered cybersecurity infrastructure, new advancements in AI-generated ransomware detection capabilities will emerge in data protection and storage solutions in 2024.”

Managed services will become key to resolving the complexity of hybrid cloud

“Multi-cloud is a reality today for most enterprises, in their use of multiple SaaS and IaaS offerings from different vendors. However, the use of on-premises and public cloud in a single application or workload has become mired in the complexities of different application deployment models and multiple vendor APIs and orchestration frameworks.

While this has inhibited the powerful agility and cost-reduction promises of the hybrid-cloud model, throughout the coming year, organizations will increasingly leverage the experience and skills of managed service providers (MSPs) to solve these complexity issues and help them achieve business value and ROI.”

Amer Deeba, CEO and Co-Founder at Normalyze

SEC Regulations Will Impact The One Area We Don’t Want to Talk About: Your Data 

“As we know, the new SEC transparency requirements and ruling now require public companies to disclose cybersecurity posture annually and cyber incidents within four days after determining an incident was material. In 2024, this major policy shift will have a significant effect on one key area: data, forcing businesses to think about security with data at the forefront. In response, enterprises will dedicate both effort and budget to support the SEC’s data-first strategy – implementing best practices that assure shareholders that their company’s most valuable asset – data – is protected. In 2024, companies will need to discover where their data resides and who can access it, while proactively remediating risks that have the highest monetary impact in the event of a breach. When faced with this dilemma, companies will lean on automation, specifically end-to-end, automated solutions that center on a holistic approach.

The recent ALPHV/Black Cat and MeridianLink breach underscores the importance for businesses of understanding exactly what data they have, where it lives, and how it is protected. In order to answer critical questions with confidence in the event of a breach and lower the probability of a breach, companies need to build better defenses. The risk of exposure/tagging is not novel, but with these new disclosure requirements, securing the target of such attacks – the data – has gone from a good first practice to an absolute necessity.  Being proactive means that if a breach does occur, you can respond quickly, answer these critical questions, be in compliance with the SEC requirements, and most importantly — respond. To summarize, in 2024 we’ll see organizations separated by their approach to data security. With these regulations, there is no alternative. Organizations must effectively remediate risks to lucrative sensitive data before breaches occur. Only this will allow organizations to respond decisively and confidently if an incident occurs.” 

To Address the Influx of Data, Security Teams Must Approach Data Security Like a Team Sport

“As AI booms, the industry is facing increasing complexity and an influx of data, and companies are grappling with how to keep it all secure. At the height of AI technology adoption, companies will need to refocus in 2024 on what matters most – protecting their data as it gets used by machine learning models and new AI technologies. Businesses need to change their approach: the success of the coming year for organizations big and small will come back to how they do so. The challenges that this will bring require the profound depth and efficiencies of AI and automated processes to ensure protection of cloud-resident sensitive data. As demands around data change in 2024, organizations will need to invest in their security and cloud ops teams, approaching data security like a team sport and building more efficient shared responsibility models to better protect data. These teams can then regain visibility of all data stores within an enterprise’s cloud or on-premises environment and trace possible attack paths, overprovisioned access, and risks that can lead to data exposure. Only by identifying their approach to data, ensuring permissions and privileges, and efficiently implementing AI will companies enable their teams to be successful in 2024.”

Raul Martynek, CEO at DataBank

The AI cloud wars between hyperscalers will take center stage

“With Google’s latest investment in Anthropic, together with Microsoft’s stake in OpenAI as well as Nvidia’s support for GPU-as-a-service players like CoreWeave, we are beginning to see the emerging outlines of a new phase of competition in the public cloud driven by differentiated AI GPU clouds. In 2024, these new competition dynamics will take center stage as big tech seeks to outcompete each other in the race to realize artificial general intelligence. Nvidia will emerge as a giant competing on the same level as the ranks of Google, Microsoft and AWS. With its cutting-edge GPUs, I see Nvidia emerging as a very capable threat to big tech’s dominance in the public cloud space.”

Miroslav Klivansky, Global Practice Leader at Pure Storage

“In 2024, we will start to creep into GenAI’s trough of disillusionment (Gartner’s Hype Cycle defines this as a period when interest wanes as experiments and implementations fail to deliver) and will eventually industrialize the use of AI. As we shift from the hype brought on by AI tools with a consumer-friendly UX, we’ll see companies better understand, invest in, and apply AI-specific solutions to their business needs.”

“In 2024, we can expect AI to optimize energy efficiency across energy-hungry industries (e.g., manufacturing) as it will be integral in optimizing the process and money savings. Deploying LLMs for inference at scale will also lead to surprisingly high power bills, leading companies to review their data center efficiency strategies and ESG initiatives.”

“One of the industries most ripe for innovation with the help of AI is healthcare. Not only does it have the potential to improve diagnostics, but it also can improve medical devices and automate administrative tasks. The latter will likely be disrupted first because these systems are already electronically managed and their tasks can be quickly automated.”

The rate of AI innovation will slow down in the next year

“Over the last several years, AI innovation has been fueled by information sharing and open-source development. However, as companies increasingly invest in AI to give them a competitive edge and regulatory bodies seek to unpack the potential around AI’s broader impact, companies will likely be more aggressive when it comes to protecting their IP.”

Kurt Markley, Managing Director, Americas at Apricorn

Cyber Resilience

“The rapid growth of AI is helping bad actors more quickly create and deploy ransomware tools across a host of industries. It’s been reported that generative AI has helped to double ransomware attacks against industries such as healthcare, municipalities and education between August 2022 and July 2023. Also concerning is the rate at which organizations choose to pay a ransom in order to secure their data. One research report shows that nearly half of respondents have a security policy in place to pay a ransom, with 45% admitting that bad actors still exposed their data even after paying the ransom.

Ransomware isn’t just a threat; in many instances it’s an inevitability. No data is too low-value and no organization is too small. The alarmingly high rate of paying a ransom and still having data exposed means that IT leaders have to take back control and put practices in place to protect their data and save their capital budget. It means that IT leaders can’t afford to slack off regarding cyber resilience.

While almost all IT leaders say they factor in data backups as part of their cyber security strategies, research we conducted earlier this year found that only one in four follow a best practice called the 3-2-1 rule, in which they keep three copies of data on two different formats, one of which is stored offsite and encrypted. Furthermore, this same research found that more than half of respondents kept their backups for 120 days or less, far shorter than the average 287 days it takes to detect a breach.

Given the likelihood that AI-driven ransomware will impact far higher numbers of organizations, it will be more important than ever in 2024 that organizations have a strong cyber resiliency plan in place that relies on two things: encryption of data and storage of it for an appropriate amount of time. IT leaders need to embrace the 3-2-1 rule and must encrypt their own data before bad actors steal it and encrypt it against them.”
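
The 3-2-1 rule and the retention guidance above are mechanical enough to check automatically. The record format in the following sketch is hypothetical; it simply encodes the rule as Markley states it (at least three copies, on at least two formats, at least one offsite and encrypted) plus a retention floor tied to the roughly 287-day average breach-detection window.

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    medium: str          # e.g. "disk", "tape", "object-storage"
    offsite: bool
    encrypted: bool
    retention_days: int

def check_3_2_1(copies: list[BackupCopy], min_retention_days: int = 287) -> list[str]:
    """Return a list of policy gaps; an empty list means the inventory satisfies the rule."""
    gaps = []
    if len(copies) < 3:
        gaps.append(f"only {len(copies)} copies (need 3)")
    if len({c.medium for c in copies}) < 2:
        gaps.append("copies are not spread across two different formats")
    if not any(c.offsite and c.encrypted for c in copies):
        gaps.append("no offsite, encrypted copy")
    if any(c.retention_days < min_retention_days for c in copies):
        gaps.append(f"some copies are retained less than {min_retention_days} days")
    return gaps

inventory = [
    BackupCopy("disk", offsite=False, encrypted=True, retention_days=120),
    BackupCopy("object-storage", offsite=True, encrypted=True, retention_days=365),
]
print(check_3_2_1(inventory) or "3-2-1 policy satisfied")
```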

Data Management Within Security Policy

“Data is no longer a byproduct of what an organization’s users create; it is the most valuable asset organizations have. Businesses, agencies and organizations have invested billions of dollars over the past decade to move their data assets to the cloud; the demand is so high that Gartner expects that public-cloud end user spending will reach $600B this year. These organizations made the move to the cloud, at least in part, because of a perception that the cloud was more secure than traditional on-prem options.

It’s estimated that 30 percent of cloud data assets contain sensitive information. All that data makes the cloud a juicy target and we expect that 2024 will continue to show that bad actors are cunning, clever and hard-working when it comes to pursuing data. The industry has seen triple the number of hacking groups attacking the cloud, with high-profile successes against VMware servers and the U.S. Pentagon taking place this year.

As IT teams spend more on moving and storing data in the cloud, organizations must spend the next 12 – 24 months auditing, categorizing and storing it accordingly. They need to gain deeper visibility into what data they have stored in the cloud, how data relates to each other, and if it is still meaningful to the operations of the organization. In doing so, they are advised to create specific security policies about how, where and for how long they store their data. These policies, when actively enforced, will help organizations better protect their most valuable asset – their data.”

Brian Land, VP of Global Sales Engineering at Lucidworks

Navigating the Privacy Terrain

“In 2024, brands are gearing up to face new challenges around privacy and ethics with the end of third-party cookies and the advent of new large language models (LLMs). This means they’ll be shaking things up in how they market and handle consumer data privacy. For example, they’ll have to find new methods for collecting user data and be more transparent about how they’re collecting that data. And when it comes to managing LLMs, they will adopt advanced encryption and secure data storage practices to safeguard user information. Rest assured, they’re working hard to get it right – making sure they follow the rules while still keeping consumers engaged and happy.”

Matt Watts, CTO at NetApp

There’s No Such Thing as Perfection

“If your company thinks the cloud will ease every IT woe your team is experiencing, you’ll want to think again. It won’t and it can’t. Migrations in hybrid multicloud environments strain both budgets and team resources and you’ll need to find ways to optimize operations both as you move to the cloud and every day thereafter. According to a recent report on data complexity, approximately 75 percent of global tech executives in the throes of cloud migration note they still have a sizable number of workloads remaining on-premises (between 30 percent and 80 percent). For most companies, maintaining an increasingly complex IT infrastructure will remain challenging as cost pressures mount alongside demands for greater innovation. In 2024, we’ll see companies abandon unrealistic ideas of creating the “perfect” cloud environment as they move toward an intelligent data infrastructure (IDI). IDI is the union of unified data storage with fully integrated data management that delivers both security and observability with a single pane of glass interface so you can store, control, and use data more easily no matter what applications, cloud services, or databases you’re using. Companies that choose IDI will experience greater agility in adapting to market conditions. With a more nimble infrastructure, IT can spend its time on innovation, skill building, and development that align with business priorities rather than simply maintaining their cloud environments.”

The Uncomfortable Truth: You’ve Already Had a Breach

“Today’s cyber threat landscape requires constant vigilance as you try to guess who, when, and how the next bad actor will attack. Whether an employee clicks on the wrong link, or an organized gang of cyber criminals are the culprit, you’ll need to have the right tools to quickly alert you of an attack so you can recover quickly. And, while preventing attacks is always the goal, the ability to keep bad actors out indefinitely is now a statistical anomaly. In fact, it’s predicted that by 2031 ransomware attacks will occur every 2 seconds at a cost of $265 billion each year. Because of this, 87 percent of C-suite and board-level executives see protecting themselves from ransomware as a high, or top, priority. And stolen data isn’t your biggest concern after an attack. It’s the lost productivity and business continuity as systems are repaired and data restored to get your business up and running again. In 2024, we’ll see more investment in IT security that ensures systems are secure by design and keeps business disruption to a minimum when there is an attack. Security infrastructures that include immutable data backups will add to peace of mind and mitigate downtime when cyber incidents are investigated.”

James Beecham, Founder and CEO at ALTR

While AI and LLMs continue to increase in popularity, so will the potential danger

“With the rapid rise of AI and LLMs in 2023, the business landscape has undergone a profound transformation, marked by innovation and efficiency. But this quick ascent has also given rise to concerns about the utilization and the safeguarding of sensitive data. Unfortunately, early indications reveal that the data security problem will only intensify next year. When prompted effectively, LLMs are adept at extracting valuable insight from training data, but this poses a unique set of challenges that require modern technical solutions. As the use of AI and LLMs continues to grow in 2024, it will be essential to balance the potential benefits with the need to mitigate risks and ensure responsible use. 

Without stringent data protection over the data that AI has access to, there is a heightened risk of data breaches that can result in financial losses, regulatory fines, and severe damage to the organization’s reputation. There is also a dangerous risk of insider threats within organizations, where trusted personnel can exploit AI and LLM tools for unauthorized data sharing whether it was done maliciously or not, potentially resulting in intellectual property theft, corporate espionage, and damage to an organization’s reputation.  

In the coming year, organizations will combat these challenges by implementing comprehensive data governance frameworks, including data classification, access controls, anonymization, frequent audits and monitoring, regulatory compliance, and consistent employee training. Also, SaaS-based data governance and data security solutions will play a critical role in keeping data protected, as they enable organizations to fit these controls into their existing framework without roadblocks.”

With increased data sharing, comes increased risk

“Two things will drive an increased need for governance and security in 2024. First, the need to share sensitive data outside of traditional on-premise systems means that businesses need increased real-time auditing and protection. It’s no surprise that sharing sensitive data outside the traditional four walls creates additional risks that need to be mitigated, so next year, businesses need – and want – to ensure that they have the right governance policies in place to protect it. 

The other issue is that new data sets are starting to move to the cloud and need to be shared. The cloud is an increasingly popular platform for this, as it provides a highly scalable and cost-effective way to store and share data. However, as data moves to the cloud, businesses need to ensure that they have the right security policies in place to protect data, and that these policies are being followed. This includes ensuring that data is encrypted both at rest and in transit, and that the right access controls are in place to ensure that only authorized users can access the data. 

In 2024, to reduce these security risks, businesses will make even more of an effort to protect their data no matter where it resides.”
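
To make the “encrypted at rest and in transit” point concrete, at-rest encryption can be applied before data ever leaves the organization. The sketch below uses the widely available cryptography package’s Fernet recipe; the payload is a placeholder, key management (KMS, rotation, access policy) is deliberately out of scope, and in-transit protection would normally come from TLS on the transport itself.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key would come from a KMS or vault with rotation and access
# policies; generating it inline here only keeps the example self-contained.
key = Fernet.generate_key()
fernet = Fernet(key)

plaintext = b"customer_id,email\n42,jane@example.com\n"  # placeholder payload
ciphertext = fernet.encrypt(plaintext)                    # encrypt before upload or sharing

# ...ciphertext is what gets written to cloud storage or shared externally...

recovered = fernet.decrypt(ciphertext)                    # authorized consumers only
assert recovered == plaintext
```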

Rodman Ramezanian, Global Cloud Threat Lead at Skyhigh Security

Data Security and Privacy Concerns

“Organizations are increasingly concerned about the security and privacy of their data in the cloud. On-premises infrastructures tend to give organizations more control over their data.”

Andrew Hollister, CISO & VP Labs R&D at LogRhythm

Generative AI adoption will lead to major confidential data risks

“The cybersecurity landscape will confront a similar challenge with generative AI as it did previously with cloud computing. Just as there was initially a lack of understanding regarding the shared responsibility model associated with cloud computing, we find ourselves in a situation where gen AI adoption lacks clarity. Many are uncertain about how to effectively leverage gen AI, where its true value lies, and when and where it should not be employed. This predicament is likely to result in a significant risk of confidential information breaches through gen AI platforms.”

Angel Vina, CEO & Founder at Denodo

Organizations Will Need to Manage Cloud Costs More Effectively

“As businesses continue to shift data operations to the cloud, they face a significant hurdle: the relentless, unsustainable escalation of cloud data expenses. For the year ahead, the mandate is not just to rein in these rising costs but to do so while maintaining high-quality service and competitive performance. Surging cloud hosting and data management costs are preventing companies from effectively forecasting and budgeting, and the previously reliable costs of on-premises data storage have become overshadowed by the volatile pricing structures of the cloud.

Addressing this financial strain requires businesses to thoroughly analyze cloud expenses and seek efficiencies without sacrificing performance. This involves a detailed examination of data usage patterns, pinpointing areas of inefficiency, and a consideration for more cost-effective storage options. To manage cloud data costs effectively, firms need to focus on the compute consumed by queries and the associated data egress volumes, tabulating the usage of datasets, and optimizing storage solutions. These efforts are enhanced by adopting financial operations (FinOps) principles, which blend financial accountability with the cloud’s flexible spending model.

By regularly monitoring expenditures, forecasting costs, and implementing financial best practices in cloud management, organizations can balance cost savings and operational efficacy, ensuring that their data strategies are economically and functionally robust. In 2024, we will see a significant rise in the use of FinOps dashboards to better manage cloud data charges.”
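
The per-dataset accounting Vina describes (query compute plus egress) lends itself to a simple FinOps-style rollup. The usage records, rates, and field names below are illustrative assumptions; real figures would come from a provider’s billing export or the warehouse’s query history.

```python
from collections import defaultdict

# Hypothetical rates; substitute the provider's actual pricing.
COMPUTE_RATE_PER_HOUR = 2.50  # dollars per compute-hour
EGRESS_RATE_PER_GB = 0.09     # dollars per GB transferred out

usage_records = [  # would normally be read from a billing export or query history
    {"dataset": "sales_orders", "compute_hours": 120.0, "egress_gb": 800.0},
    {"dataset": "clickstream", "compute_hours": 540.0, "egress_gb": 4200.0},
    {"dataset": "hr_archive", "compute_hours": 2.0, "egress_gb": 1.5},
]

costs = defaultdict(float)
for rec in usage_records:
    costs[rec["dataset"]] += (
        rec["compute_hours"] * COMPUTE_RATE_PER_HOUR
        + rec["egress_gb"] * EGRESS_RATE_PER_GB
    )

# Rank datasets by spend so the biggest optimization targets surface first.
for dataset, cost in sorted(costs.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{dataset:<14} ${cost:,.2f}")
```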

Kevin Keaton, CIO/CISO at Red Cell Partners

Shifting Cyber Regulations Will Change the Status Quo

“The new SEC rules on cyber that go active in December for public companies are causing, and will continue to cause, significant changes in how boards operate with regards to cyber risks – and I expect that navigating these rules will be the biggest cybersecurity challenge businesses will face in 2024.”

Eric Herzog, Chief Marketing Officer at Infinidat

Triple play of cyber resiliency, detection, and recovery to create an overall corporate cybersecurity strategy 

“The convergence of cyber resilience, detection, and recovery on a single storage platform is fueling a trend for 2024 around higher levels of cybersecurity for enterprise storage. Reliance solely on backup is no longer enough to secure storage systems. Primary storage has become a main target of cybercriminals for the most insidious and hard-to-detect ransomware and malware attacks that wreak costly havoc on enterprises. Combining resilience (the ability to instill defensive security measures to repel attacks), detection (the ability to know when data is corrupted and whether a known good copy of data is free of ransomware or malware), and recovery (the ability to bounce back) from cyberattacks is the key to hardening storage infrastructure.

This trend to better secure storage systems is highly important because of the continued exponential increase in cyberattacks against enterprises of all types and in all industries.  Cybercrime is predicted to grow from $8 trillion worldwide in 2023 to more than $10 trillion in 2025. Cybercriminals attempted nearly 500 million ransomware attacks last year, marking the second-highest year ever recorded for ransomware attacks globally, and in the 2023 Fortune CEO survey of “Threats” to their companies, CEOs named cybersecurity their #2 concern. Ransomware attacks also represented 12 percent of breaches of critical infrastructure in the last year.

The convergence of cyber resilience, detection, and recovery on an integrated storage platform is an advancement over the past, commonly-used approach of disparate tools and technologies trying to combat cyberattacks in silos. Improving the security defenses of cyber storage for enterprises eliminates the vulnerabilities of the silos. It makes the cyber capabilities more air-tight and ensures a rapid recovery of data within minutes to thwart cybercriminals, nullifying ransom demands and preventing (or minimizing) any downtime or damage to the business. Ignoring this trend in 2024 could greatly harm an enterprise, especially one that doesn’t even know cybercriminals are lurking in their data infrastructure, no matter how good their other cybersecurity defenses are.”

Stacy Hayes, Co-Founder and EVP at Assured Data Protection

More channel players to use specialists for managed services 

“The managed services model will become increasingly attractive to traditional VARs in 2024, especially with more and more businesses looking to buy cloud and IT services on a usage basis. But making the transition from a traditional VAR to a provider of managed services is easier said than done. It’s not that VARs aren’t capable of diversifying, far from it; it’s just that the switch requires a fundamental shift in the way VARs do business, and that isn’t something you can change overnight. These large organizations are not built for this new world model. The in-house build and integration of new technology and go-to-market models takes too long and is too expensive to implement. VARs simply don’t have the people, the flexibility, or the know-how. With the economic headwinds as they are, Opex is king and no one has the Capex or the appetite for big in-house builds.

It is becoming increasingly difficult for VARs to provide a large portfolio of products and services to the standards customers demand. The speed at which the market moves and the growing reliance on data only add to those demands. It is evident that channel businesses are struggling to deliver what their customers want, whether on-premises or in the cloud. It is a common topic, and one I believe means VARs need to clearly understand what they can deliver themselves and what they need to outsource. Outsourcing through white labelling is a great way to deliver a high-quality and diverse portfolio to customers.

MSPs that have the know-how to use utility-based models effectively, that can execute immediately, that have experts in the space, and that deliver services tailored for the vendor, customer, and end user will be the partners of choice for VARs in 2024.”

Brian Dutton, Director, US Sales and Client Services at Assured Data Protection

More businesses to spend upfront for managed services to beat inflation 

“Businesses are becoming more cost-conscious as prices for cloud and SaaS services keep rising in line with inflation. Every time the large vendors and hyperscalers pass on costs to the customer, company CFOs and finance directors find themselves asking IT the question, ‘where can we cut costs?’ This creates a dilemma for IT teams, who are left wondering how to keep the lights on and execute new digital and cloud strategies on a smaller budget. That is why so many have switched to an OPEX model that covers core capabilities, including DR and backup, based on a consumption model that is paid for in monthly installments. It has allowed them to cut CAPEX, operate on a per-TB model rather than wasting valuable data center resources, and focus their efforts on business priorities.

The impact and cost savings are tangible, but it’s also thrown a lifeline to SMBs and government organizations that simply don’t have the budget or infrastructure to support investment in new DR and backup solutions. The managed service option has become the preferred choice for large enterprises that have to prioritize transformation projects, as well as for SMEs, local schools, and municipalities with budget limitations. We expect more businesses to adopt the utility-based model that managed service providers offer for cloud-based data management. It lightens the load on teams, while reducing risk and guaranteeing uptime and business continuity in the event of a disaster, data breach, or ransomware attack. Another byproduct of this trend we’ve experienced is companies prepared to pay for services upfront, locking costs in for 6-12 months, or longer, to protect themselves against inflation. This makes financial sense, especially if you’re cash-rich now and want to ensure your data is protected over the long term when market volatility can affect prices elsewhere. We expect this to become the norm next year and for the foreseeable future.”
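The upfront-payment logic is simple arithmetic. The following Python sketch, using entirely hypothetical figures, shows why locking in a rate can beat paying month to month while prices rise with inflation.

    def total_monthly_cost(base_monthly_rate, months, annual_inflation):
        """Total paid when the per-month price rises with inflation, compounded monthly."""
        monthly_inflation = (1 + annual_inflation) ** (1 / 12) - 1
        return sum(base_monthly_rate * (1 + monthly_inflation) ** m for m in range(months))

    def total_upfront_cost(base_monthly_rate, months):
        """Total paid when the rate is locked in by paying upfront."""
        return base_monthly_rate * months

    if __name__ == "__main__":
        rate, term, inflation = 5_000.0, 12, 0.06   # hypothetical: $5k/month, 12 months, 6% annual inflation
        rising = total_monthly_cost(rate, term, inflation)
        locked = total_upfront_cost(rate, term)
        print(f"Pay monthly with inflation: ${rising:,.0f}")
        print(f"Locked-in upfront:          ${locked:,.0f}")
        print(f"Estimated saving:           ${rising - locked:,.0f}")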

Andrew Eva, Director, CIO at Assured Data Protection

Scope three emissions compliance set to drive uptake of disaster recovery managed services 

“Sustainability is an issue that impacts every part of the economy, and increasingly, the technology sector is being held to account for its carbon emissions. Until recently, organizations have mostly had to concern themselves with two key emission classifications: scope one (emissions the organization is directly responsible for) and scope two (indirect emissions, such as electricity). Now, though, we’re seeing the impact of scope three emissions being felt. That is, all other emissions associated with an organization’s activities, including its supply chain. While scope three emissions aren’t yet legally enforceable, they are being widely adopted by large organizations, as legislation is inevitable and there’s a widespread desire to get ahead of the issue. We’re now seeing their impact filter down to smaller organizations.

This is an issue for the DR sector, and organizations that are leaders in sustainability are recognizing the challenge and the value of outsourcing this function to an MSP. Eliminating the need for data backup via a second site, which is costly to operate, doesn’t always utilize the latest power-efficient hardware, and is responsible for significant carbon emissions, makes ESG compliance a lot more manageable. There’s also recognition that this isn’t simply offloading the problem, because MSP DR solutions achieve economies of scale by servicing multiple organizations via a shared facility, making them carbon-efficient for customers. Given the rate at which scope three is permeating, we expect to see more organizations adopt outsourced DR services. Both existing and future business for MSPs depends on helping customers and partners achieve ESG compliance.”

4 Questions IT Managers Can Ask to Strengthen Data Backup and Resiliency https://solutionsreview.com/backup-disaster-recovery/questions-it-managers-can-ask-to-strengthen-data-backup-and-resiliency/ Fri, 20 Oct 2023 21:31:05 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=6209 Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise tech. In this feature, Apricorn‘s Kurt Markley offers four data backup and resilience questions to ask right now. IT leaders face an escalating array of challenges. The landscape of evolving digital threats, coupled with the pandemic-induced surge in remote […]

Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise tech. In this feature, Apricorn‘s Kurt Markley offers four data backup and resilience questions to ask right now.

IT leaders face an escalating array of challenges. The landscape of evolving digital threats, coupled with the pandemic-induced surge in remote and hybrid work, has exposed organizations to an increasing number of vulnerabilities. The exponential growth of generative AI applications, too, is cause for alarm, as tools like ChatGPT and Google Bard are making it easier to create and deploy ransomware attacks.

Additionally, it’s all too common for IT leaders to lose sight of the big picture while heads down at work, which increases the risk of being slow to respond and unprepared to get back up and running in the event of a security crisis.  

So, if you’re an IT manager, what’s the best way to assess the current state of affairs and prepare for what lies ahead?

Data Backup and Resiliency Questions

Prioritize Data Backup and Resiliency 

Begin by focusing on data backups and resiliency as your first line of defense. This ensures that your organization possesses current copies of its most crucial data, safeguarding it against potential disasters. As for why this should be a top priority, look no further than the news, which regularly reports on cybersecurity breaches and ransomware attacks. No matter your business, these incidents can be devastating and affect stakeholders in the short and long term. For example, an attack on a healthcare organization would disrupt healthcare IT systems, affecting patients and staff, and would likely incur hundreds of millions of dollars in recovery costs, too.

Still, despite the known risks and cautionary tales, recent research by Apricorn reveals a concerning statistic: 99 percent of IT decision-makers struggle to recover data when disaster strikes, even when they have a backup strategy in place. Furthermore, more than 70 percent of IT leaders have had to recover data from backups, with 26 percent unable to fully restore it. In other words, IT professionals are well-aware of the scale of these threats, yet they have not fully mastered prevention or recovery. It is high time to regain control.  

To evaluate your own organization’s preparedness, and to identify opportunities to enhance your data backup and resiliency, start by asking these four questions: 

“Are We Sticking to the 3-2-1 Rule?” 

Not all backups are created equal. The 3-2-1 rule is a simple, but vital practice: maintain three copies of your data on two different media, with one copy stored offsite, encrypted, and offline. Opt for secure storage of local backups on portable hardware-encrypted external devices. Additionally, emphasizing encryption at every location ensures maximum data control, regardless of the disaster scenario. 
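As a rough illustration, the following Python sketch checks a hypothetical backup inventory against the 3-2-1 rule described above. The copy attributes and inventory format are assumptions made purely for the example.

    def check_3_2_1(copies):
        """Evaluate a list of backup copies against the 3-2-1 rule.

        Each copy is a dict such as:
        {"media": "disk", "offsite": False, "encrypted": True, "offline": False}
        """
        issues = []
        if len(copies) < 3:
            issues.append(f"only {len(copies)} copies (the rule calls for at least 3)")
        if len({c["media"] for c in copies}) < 2:
            issues.append("all copies share one media type (the rule calls for at least 2)")
        offsite = [c for c in copies if c["offsite"]]
        if not offsite:
            issues.append("no offsite copy")
        elif not any(c["encrypted"] and c["offline"] for c in offsite):
            issues.append("no offsite copy that is both encrypted and offline")
        return issues or ["3-2-1 rule satisfied"]

    if __name__ == "__main__":
        inventory = [
            {"media": "disk", "offsite": False, "encrypted": True, "offline": False},
            {"media": "tape", "offsite": True, "encrypted": True, "offline": True},
            {"media": "cloud", "offsite": True, "encrypted": True, "offline": False},
        ]
        for finding in check_3_2_1(inventory):
            print(finding)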

“Have We Defined our Backup and Recovery Plan?” 

While IT managers understand the importance of a backup and resiliency plan, they often fall short in its clear definition, communication, and documentation. Take the time to comprehensively outline your plan, then share it with your team. Specify who should be alerted in various situations and establish a clear chain of command for times when leaders are unavailable. A well-documented, shared, and accessible plan significantly reduces risk and streamlines problem resolution, particularly in the aftermath of a DDoS or ransomware attack. 

“How Often are We Checking In?”  

Unlike some aspects of business, when it comes to cybersecurity, there’s no such thing as “one-and-done.” Even if you diligently follow the 3-2-1 rule and define your backup and resiliency plan, ongoing monitoring and improvement are essential. Develop a plan for regular reviews of your multilayered strategy. Consistently back up your data, including offsite and offline copies, and conduct rigorous testing of data recovery processes. Frequent testing prevents you from becoming a statistic and increases the likelihood of a successful restoration in the event of a breach. 

“Are We Auditing What We’re Storing?” 

Regular audits ensure the data being backed up is intact and has not been corrupted or altered. This is crucial for ensuring that, in the event of a data loss, the backup can be relied upon to be restored. Audits also help identify what’s being stored and what is no longer needed. Whatever is outdated or no longer relevant can be removed from the cloud or the backup, which saves on storage costs. Whether evaluating backups for testing, compliance, or even capacity planning, performing audits is a proactive step that helps IT leaders maintain control of their data and potentially safeguard against unforeseen events.
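One way to picture such an audit is a small script that re-verifies checksums and flags stale items. The sketch below is hypothetical Python assuming a simple catalog format; real backup products expose this kind of information through their own catalogs and APIs.

    import hashlib
    from datetime import datetime, timedelta

    def sha256_of(path):
        """Compute the SHA-256 digest of a file, reading it in 1 MB chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as handle:
            for chunk in iter(lambda: handle.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def audit_backup(catalog, retention_days=365):
        """Flag corrupted or stale items in a backup catalog.

        Each catalog entry is assumed to look like:
        {"path": "/backups/db.bak", "expected_sha256": "...", "last_needed": datetime(...)}
        """
        findings = []
        cutoff = datetime.now() - timedelta(days=retention_days)
        for item in catalog:
            if sha256_of(item["path"]) != item["expected_sha256"]:
                findings.append(f"CORRUPTED: {item['path']}")
            if item["last_needed"] < cutoff:
                findings.append(f"STALE, candidate for removal: {item['path']}")
        return findings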

Final Thoughts 

Adopting a strategic approach to data backup and resiliency empowers organizations to enhance data control, mitigate unauthorized data access, and expedite recovery in the face of data breaches, attacks, or losses. As a busy IT professional, asking these four questions is an important starting point to safeguard your organization from costly consequences down the road, and boost resiliency, earning the gratitude of your customers and company alike. 

Retailers Must Use SaaS Safely to Protect their Bottom Line https://solutionsreview.com/backup-disaster-recovery/retailers-must-use-saas-safely-to-protect-their-bottom-line/ Thu, 19 Oct 2023 13:59:02 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=6189 Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise tech. In this feature, Zerto‘s Global Director of Technical Product Marketing Kevin Cole offers commentary on why retailers must use SaaS safely to protect the bottom line. The growth of the global Software as a Service (SaaS) market […]

Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise tech. In this feature, Zerto‘s Global Director of Technical Product Marketing Kevin Cole offers commentary on why retailers must use SaaS safely to protect the bottom line.

The growth of the global Software as a Service (SaaS) market has been explosive, making it one of the technology industry’s most impressive success stories in recent years. According to McKinsey, the global SaaS market is now valued at $3 trillion, and their estimations show it could increase to a whopping $10 trillion by 2030. The retail industry is one of the largest users of business-to-business technologies like SaaS. A plethora of critical retail software, including order management and fulfillment systems and communication tools, lives in SaaS apps in the cloud.

However, there is a key limitation to the SaaS model that companies are not always aware of. Adopting Software as a Service can put retailers at risk of significant data loss, as most SaaS providers operate on a shared responsibility basis and offer only basic data protection functionality. Many companies assume SaaS providers will handle data protection completely, but this is not the case: providers can usually deliver basic data security but lack comprehensive protection plans and strategies. Retailers using SaaS often find that their data, which they assumed to be safeguarded and recoverable, was not kept as secure as they thought.

SaaS Data Protection

Shared Responsibility in SaaS

The disconnect in expectations here is reasonable, given that one of the core aspects of the SaaS model is that the provider takes on the customer’s technological responsibility and provides it to them as a service. But just because a cloud-based service is adopted, the responsibility for data protection is not automatically taken on by the provider.

This is why it is so important to closely consider the parameters of the shared responsibility model. While signing up with a SaaS provider usually means a range of key technology priorities will be addressed (such as physical security, the operating system, and other factors which should be listed in each Service Level Agreement), protection of users and data is rarely included and remains the responsibility of the customer. Unless specifically built into the contract upfront, viruses and malware, insider threats, and issues caused by human or configuration error are usually not covered by the SaaS provider. If this is not accounted for, a disaster recovery situation can lead to data loss for the company.

For retailers in particular, a data breach can be extremely costly. The 2022 IBM Cost of a Data Breach Report revealed that the average data breach cost for retailers in 2022 was $3.28 million. Additionally, in the retail industry, the impact of a data breach goes far beyond just the financial cost. Loss of consumer confidence can severely damage a company’s bottom line and brand name for years.

Data Protection in a Multi-SaaS Environment

Organizations can take action to make sure they are not leaving themselves vulnerable to data loss. One of the most crucial factors to consider when creating a data protection strategy is SaaS complexity. Data protection becomes significantly more complex the more SaaS applications are used, particularly when extracting data requires proprietary tools.

Industry data shows that in 2022, organizations used an average of 130 SaaS applications each. Many retailers use different iterations of the same SaaS application to manage multiple regions within their supply chain and various product lines across the chain. This means that data is split across a diverse range of SaaS providers, who all store that data on their own data center infrastructure or in the cloud, using different vendors and technology stacks. If a retailer has 50 separate instances of its Customer Relationship Management (CRM) or ticketing system, each iteration of the application is vulnerable.

How Vendor-Agnostic Solutions Can Help

The key objective in handling data protection needs to be creating an isolated, tamperproof copy of the data and data objects contained in each SaaS application and workload. Implementing one vendor-agnostic backup solution is easier than trying to use multiple different backup solutions across SaaS platforms, each with its own user interface and architecture.

A unified platform will remove layers of administrative complexity and users will benefit from a streamlined data protection solution that provides one view of all the data sets across the organization’s SaaS portfolio. This platform should also provide automated backup and recovery capabilities, especially for key enterprise SaaS apps like Google Workspace, Microsoft 365, Salesforce, and others.

With all these capabilities, users can protect their application data against risks such as ransomware attacks and accidental data deletion, using a scalable and secure protection method with granular data recovery. When an issue arises, data can either be moved or restored to the same SaaS vendors. Organizations can also create additional immutable copies of backups stored in an independent cloud dedicated to data protection and not rely on large hyperscalers. This ends up being hugely beneficial to data protection and issues such as compliance.
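To illustrate the idea of a single, vendor-agnostic layer across many SaaS applications, here is a minimal Python sketch. The connector interface, the in-memory store, and the object format are all hypothetical; a real solution would plug in each SaaS vendor’s export APIs and an immutable cloud store.

    from abc import ABC, abstractmethod

    class SaaSConnector(ABC):
        """Common interface every SaaS application connector implements."""

        @abstractmethod
        def export_objects(self):
            """Return the application's data objects in a normalized form."""

    class InMemoryImmutableStore:
        """Toy stand-in for an isolated backup store; every backup run is append-only."""

        def __init__(self):
            self._versions = {}

        def write_immutable(self, app_name, objects):
            self._versions.setdefault(app_name, []).append(list(objects))

        def latest(self, app_name):
            return self._versions[app_name][-1]

    class BackupOrchestrator:
        """Backs up every registered SaaS application through one common interface."""

        def __init__(self, store):
            self.store = store
            self.connectors = {}

        def register(self, app_name, connector):
            self.connectors[app_name] = connector

        def run_backup(self):
            for app_name, connector in self.connectors.items():
                self.store.write_immutable(app_name, connector.export_objects())

    class CrmConnector(SaaSConnector):
        """Hypothetical connector for one CRM instance."""

        def export_objects(self):
            return [{"type": "contact", "id": 1, "name": "Example Customer"}]

    if __name__ == "__main__":
        orchestrator = BackupOrchestrator(InMemoryImmutableStore())
        orchestrator.register("crm-emea", CrmConnector())
        orchestrator.run_backup()
        print(orchestrator.store.latest("crm-emea"))

The point of the design is that adding a fiftieth instance of the same application means registering one more connector, not learning another backup tool.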

As SaaS adoption continues to accelerate rapidly, data protection strategies need to evolve with them and address new challenges. Retailers benefit from creating a vendor-agnostic SaaS data protection strategy which provides all the benefits of a SaaS and the confidence that their data is safe and recoverable, no matter what happens.

Regulation Can Only Do So Much: It’s Time to Build for Better Data Privacy https://solutionsreview.com/backup-disaster-recovery/regulation-can-only-do-so-much-its-time-to-build-for-better-data-privacy/ Thu, 19 Oct 2023 13:58:24 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=6193 Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise tech. In this feature, Inrupt VP of Trust and Digital Ethics Davi Ottenheimer offers commentary on why regulation can only do so much in the era of data privacy. More than two-thirds of consumers globally are concerned about […]

Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise tech. In this feature, Inrupt VP of Trust and Digital Ethics Davi Ottenheimer offers commentary on why regulation can only do so much in the era of data privacy.

More than two-thirds of consumers globally are concerned about their online data privacy. It’s a statistic that shouldn’t surprise anyone — unless they expected it to be even higher.

Recognizing the need to set standards and hold companies accountable for responsible data collection, governments around the world have enacted a slew of privacy laws designed to protect our digital selves.

Regulation is a necessary step. The problem is that it’s often reactive. It comes in the wake of major privacy breaches and it’s designed to course-correct society away from the worst version of our future.

But while privacy laws are informative and valuable, they cannot deliver a working solution to online privacy alone. Laws are not technical enough to prescribe exactly how to deliver the kind of practical solutions that grant users peace of mind about how businesses handle their data.

So rather than scrambling to comply with every new regulation that emerges, companies should see these laws as an opportunity for innovation. Legislation everywhere sends the same essential message: Current data management systems and technical infrastructure are failing to deliver meaningful data control and transparency.

It’s past time for the industry to take the hint and start building better privacy. Re-designing how data is stored, accessed, and controlled will bridge the gap between a baseline of legal compliance and user-centric privacy engineering best practices. Ultimately, the result will be increased trust and security between companies and users. Better yet, we can build the foundation for more powerful, individualized services and products that people actually want to use.

Data Privacy Regulations

Where Data Privacy Regulations Fall Short for Consumers

As it stands now, a significant gap exists between what privacy laws are aiming to prevent and their practical implications for users expecting bigger changes to the privacy landscape.

Laws like Europe’s GDPR or California’s Consumer Privacy Act enshrine user rights like the ability to opt out of targeted advertising, obtain a copy of their data, or request that their data be deleted. But they don’t give instructions to engineering departments about how to make any of these actions inherently accessible and standardized for users.

Sharing personal information like name, phone number, and date of birth is the price we pay for the ability to access nearly any online service. Unfortunately, once a user checks the box agreeing to a website’s terms and conditions, there’s no going back. If they change their mind, the only reasonable next step is to email the company and request that their data be deleted. This can be likened to someone being dragged asleep onto a ship that sets sail. When they wake up in the middle of the ocean, the only two options are to remain a captive or jump overboard.

Even if consumers can remember every company that has copies of their personal data — from online retailers and financial websites to public utilities and everything in between — it requires a significant amount of time and energy to investigate, manually submit objections, and follow up on data privacy requests.

For users, that certainly doesn’t feel like transparency or control. It feels more like having no choice at all.

Meanwhile, companies tasked with complying with privacy regulations are getting stuck in proprietary and dead-end implementations. As new laws emerge and existing rules evolve, companies feel pressured into resource-intensive projects that reorganize consumer data to comply with the latest standards but don’t actually achieve basic safety and usability measures. A costly rearrangement of deck chairs on the Titanic is not an approach anyone really wants to see.

The growing number of laws correctly restricting how companies receive authorization to use customer information should be driving us all toward a new way of thinking about compliance. Instead of repeatedly requesting consent for new use cases and re-collecting data they already possess, what if companies could use a performance-oriented consent mechanism that also reduces risks to privacy?

Building Trust with Data Operationalized for Privacy

Both users’ and companies’ pain points can be solved by operationalizing data for privacy in a streamlined way that enables users to easily manage their shared data. But it requires adopting new technology that organizes data around people, not around applications or data warehouses.

A user-centric data architecture makes it possible to achieve user visibility and control over personal data because data isn’t fragmented across giant hidden silos. Instead, each user’s data is housed in their own personal data store they can access. In this setup, users have the ability to see how companies are processing or manipulating their data. They can grant or revoke consent at any time for individual use cases.

This model accomplishes the objectives of privacy regulations by providing users with transparency and control, while also benefiting companies by making data more accessible and actionable for business decision-making. The end result is increased trust between organizations and consumers because it respects users’ basic rights, such as ownership over and the liberty to control personal information.
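A toy Python sketch of this user-centric pattern might look like the following, where each user’s data store records per-company, per-use-case consent and logs every access attempt. The class and field names are illustrative assumptions, not a reference to any real product.

    from datetime import datetime

    class PersonalDataStore:
        """Toy model of a user-centric data store with per-use-case consent."""

        def __init__(self, owner):
            self.owner = owner
            self.data = {}          # e.g., {"email": "...", "health_records": {...}}
            self.consents = {}      # (company, use_case) -> True/False
            self.access_log = []    # transparency: every access attempt is recorded

        def grant(self, company, use_case):
            self.consents[(company, use_case)] = True

        def revoke(self, company, use_case):
            self.consents[(company, use_case)] = False

        def access(self, company, use_case, field):
            allowed = self.consents.get((company, use_case), False)
            self.access_log.append((datetime.now(), company, use_case, field, allowed))
            if not allowed:
                raise PermissionError(f"{company} has no consent for '{use_case}'")
            return self.data.get(field)

    if __name__ == "__main__":
        store = PersonalDataStore("alice")
        store.data["email"] = "alice@example.com"
        store.grant("acme-health", "risk-screening")
        print(store.access("acme-health", "risk-screening", "email"))
        store.revoke("acme-health", "risk-screening")   # consent can be withdrawn at any time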

Data that is operationalized for privacy achieves:

  • Control: Users gain the ability to exercise meaningful technology-based control over how their data is used and can make choices about consent at any moment. Simultaneously, companies gain the ability to access consumer information in a controlled ecosystem that eliminates the need to build complex data infrastructure that’s counter-productive to individual rights. For smaller companies and startups, users inherently maintaining control eliminates many barriers to entry that come from navigating complex privacy regulation solutions.
  • Transparency: Users can see what’s happening to their data at all times and gain a clearer understanding of how companies are processing it. Transparency enables them to make informed decisions about shared data, which helps them feel comfortable sharing their data for new purposes. Likewise, providing transparency helps companies eliminate persistent data silos while simplifying the process of gaining consumer consent to operationalize data for new purposes.
  • Trust: When technology delivers user control and transparency, consumers are no longer compelled, without real alternatives, to blindly trust companies’ claims that they are responsible stewards of personal data. This newfound sense of security encourages consumers to place more trust in companies when it comes to data usage, and removes exit barriers should users decide to opt-out down the road.

For instance, a healthcare provider may want to use AI to crawl consumers’ health records to surface potential risks that might otherwise go unnoticed, resulting in significant benefits for both consumers and providers. Still, many consumers would be understandably cautious about granting AI access to their records because of the risk of never being able to remove consent. But with data technology built for privacy in place, consumers are more likely to trust companies to access their data knowing they can benefit from more privacy-centric AI and change or revoke access at any point.

Delivering on the Promise of Privacy Requires New Data Technology

Privacy regulations worldwide emphasize what users value and need to make our society function best: control, transparency, and trust. These laws serve as important safeguards to spur innovations in technology, but they aren’t enough to ensure true privacy on their own without innovators building the next generation of tools.

To fully deliver on the promise of privacy regulations, we need new technology that is designed for privacy and puts control back in the hands of consumers. Vendors building this technology will foster trust between consumers and companies, opening the door for organizations to operationalize more data for more purposes and better decision-making.

Ultimately, it’s the synergy of legal frameworks and user-centric, privacy-first data technology that will bridge the gap between privacy regulations’ intent and the kind of practical implementation that will usher in both an important expansion of knowledge as well as higher levels of privacy.

GRC as a Service: The Future of Governance and Risk Management https://solutionsreview.com/backup-disaster-recovery/6145-2/ Fri, 15 Sep 2023 11:19:28 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=6145 GRC (governance, risk, and compliance) has long been a static, check-the-box approach for organizations that can be stressful and burdensome. As cyber threats continue to grow in sophistication and number, organizations face the daunting and repeated challenge of ensuring compliance with ever changing regulations. For many, the traditional methods of audits and assessments take shape […]

GRC (governance, risk, and compliance) has long been a static, check-the-box approach for organizations that can be stressful and burdensome. As cyber threats continue to grow in sophistication and number, organizations face the daunting and repeated challenge of ensuring compliance with ever changing regulations.

For many, the traditional methods of audits and assessments take shape as a reactive 11th-hour hustle, one that tends to be expensive while only providing a point-in-time report with limited value.

Organizations that are tired of this approach would do well to consider GRC as a Service (GRCaaS). This approach transforms compliance into an operational program, making it a more proactive and constructive, business-as-usual function. Here are six major complaints I hear about GRC, and how GRCaaS addresses them:

  1. We find ourselves duplicating efforts. GRCaaS is particularly well-suited to organizations with multiple compliance frameworks. PCI and HITRUST are both great examples: If you are a healthcare organization that accepts credit cards, those controls overlap with each other to an extent, because at their core they use the same security compliance framework (an example would be NIST CSF). Handling these frameworks separately means doing some of the same things twice. GRCaaS removes such inefficiencies. With a GRCaaS approach, evidence is collected over a 12-month period and uploaded to a central repository. The items that live there can be used to satisfy multiple compliance requirements, reducing the documentation burden and the tendency to have to pull the same information repeatedly.
  2. It feels like an annual fire drill. Many GRCaaS engagements are multi-year in nature, and that can make the entire approach to GRC one that gets progressively easier over time. Since all documentation and evidence lives in a central location, when year two or year three rolls around, businesses can see what was uploaded the previous year, review it, and make any necessary changes. They know what is coming up from an evidence standpoint and what controls are going to be reviewed down the line. That enables organizations to see their progress and maturity over time while also maintaining real-time, continuous compliance that evolves as they do.
  3. Standalone reports are only half-helpful. Too often, GRC sprints end with the organization receiving a list of 50 items that need to be fixed. That type of reporting can be tough to take action on because of the challenge of prioritizing so much information at one time. With GRCaaS, as things are assessed, any gaps or findings are logged into a portal. The result is a list of actionable remediation items that can be worked on throughout the year. By its nature, the GRCaaS model provides ongoing touch points and drivers—as well as ongoing and consistent resources. Instead of being handed a report and left to fix things on their own, organizations have an experienced GRC resource available every step of the way.
  4. It is tough to coordinate tasks across departments. GRCaaS forces everyone to manage and track GRC-related activities in one location. In doing so, it reduces the siloed feeling that GRC can sometimes produce. Organizations typically experience an increase in collaboration due to the centralized nature of relying on one platform: There is a single place to add comments, send items back for review, or have conversations.
  5. It can sap resources—people and money. “How am I going to get this done in six weeks by the deadline?” It is a question too many organizations find themselves asking when it comes to adherence to regulatory compliance, and the answer can be an expensive one. Sometimes getting it done within a tight window requires extra resources—and extra costs. GRCaaS reduces that fire drill to make compliance more cost effective. Organizations that transition to GRCaaS are also able to free up internal resources. Instead of setting aside resources for eight hours a week and disrupting currently scheduled tasks, teams can plan well in advance, allowing companies to prioritize their needs appropriately and accomplish other initiatives.
  6. We are always one step behind. GRCaaS typically manifests as a multi-year engagement. As new controls and requirements are introduced, they are embedded into that as-a-service model without having to spin up another contract or start another engagement. GRCaaS is fiercely forward-looking. If something is coming down the pike two years from now, new controls will be integrated into the portal, allowing businesses to be proactive, preemptively eliminate future stress, and mature their security posture.

With GRCaaS, compliance becomes an enhancement that can help direct an organization rather than hinder it, enabling it to better monitor contracts, decide on internal controls, build business continuity plans, plan cybersecurity investments, and more.
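The evidence-reuse idea in point one above can be sketched in a few lines of Python: map each control to the frameworks it helps satisfy, and any evidence uploaded once to the central repository counts toward all of them. The control names and framework mappings below are illustrative assumptions only.

    # Hypothetical mapping of shared controls to the frameworks they help satisfy.
    CONTROL_FRAMEWORKS = {
        "access-reviews": {"PCI DSS", "HITRUST"},
        "encryption-at-rest": {"PCI DSS", "HITRUST", "NIST CSF"},
        "incident-response-plan": {"NIST CSF", "HITRUST"},
    }

    def evidence_coverage(evidence_repository):
        """Report which frameworks are covered by evidence uploaded once per control.

        evidence_repository: {"access-reviews": ["2024-Q1-review.pdf"], ...}
        """
        coverage = {}
        for control, files in evidence_repository.items():
            if files:
                for framework in CONTROL_FRAMEWORKS.get(control, set()):
                    coverage.setdefault(framework, set()).add(control)
        return coverage

    if __name__ == "__main__":
        repo = {"access-reviews": ["q1-review.pdf"], "encryption-at-rest": ["kms-policy.pdf"]}
        for framework, controls in evidence_coverage(repo).items():
            print(framework, "covered by", sorted(controls))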

Are Your Backups Safe From Ransomware? Your 8-Point Checklist for Backup Security & Data Protection https://solutionsreview.com/backup-disaster-recovery/are-your-backups-safe-from-ransomware-your-8-point-checklist-for-backup-security-data-protection/ Thu, 14 Sep 2023 13:31:35 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=5965 Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise technology. In this feature, Continuity‘s CTO Doron Pinhas offers The average cost of recovery from a ransomware attack has more than doubled in a year, according to a Sophos survey. The global report also shows that just 8% […]

Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise technology. In this feature, Continuity’s CTO Doron Pinhas offers an eight-point checklist for securing backups and protecting data against ransomware.

The average cost of recovery from a ransomware attack has more than doubled in a year, according to a Sophos survey.

The global report also shows that just 8% of organizations manage to get back all of their data after paying the ransom.

With the increased number and sophistication of ransomware attacks, it’s not a matter of if, but when. And when it does happen, your ability to recover clean and up-to-date backup files is your last line of defense.

  • The Conti ransomware gang has developed novel tactics to demolish backups. The majority of targets who pay the ransom are motivated by the need to restore their data.
  • The ransomware gang, Hive, is known to seek out and delete any backups to prevent them from being used by the victim to recover their data.

In this new Dummies Guide to Ransomware Resiliency for Enterprise Storage & Backup, discover the new threat tactics, and get a list of practical tips and solutions to secure these critical systems, protect your data, and ensure recoverability.

Backup & Security Checklist

Your 8-Point Checklist To Secure Your Backups

A ransomware attack is a horrible time to discover that your backups are not secure, so to help, here’s an 8-point checklist to determine whether your backups are sufficiently secured, and whether data is fully protected.

Do your security incident-response plans include cyberattacks on your backups? If so, what’s included:

  • Recovery from a complete wipe of a storage array
  • Recovery from a complete corruption of the SAN fabric configuration
  • Recovery from ransomware

Is there a complete inventory of your storage and backup devices that includes the current security status for each one? (A minimal sketch of such an inventory appears after this checklist.)

  • All backups, archive environments, storage arrays (block, file, object), and SAN switches
  • Storage software versions (storage OS, firmware deployed), and, in particular: patching status, known CVEs, and actual resolution status
  • What is backed up? Where? How?
  • Which storage & backup protocols are allowed? Are all obsolete and insecure protocols disabled

Is there comprehensive and secure event logging and auditing of your backups?

  • Including: central log services, redundant and tamper-proof records, and redundant and reliable time service

Are you able to audit the configuration changes?

  • e.g., what changed and when – in device configuration, storage mapping, and access control?

Is there a well-documented, and enforced separation of duties for your backups?

  • e.g., separate admins for storage, backup, and disaster recovery in each environment

Are all storage and backup administrative-access mechanisms documented?

  • e.g., which APIs are open, how many central storage management systems can control each storage device, and are there any servers or OS instances that can control storage

Are existing mechanisms for ransomware protection, air-gapping, and copy-locking used?

  • Is there an audit process to verify they are correctly deployed at all times?

Is the security of your backups regularly audited?

  • Does this audit process include: SAN communication devices, storage arrays (block, file, object), server-based SAN, and backup?
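As referenced in the inventory question above, here is a minimal, hypothetical Python sketch of what a storage and backup inventory with a per-device security status could look like. The fields and placeholder values are illustrative; a real inventory would be fed from discovery tools and vendor advisories.

    from dataclasses import dataclass, field

    @dataclass
    class StorageDevice:
        """One row in a storage/backup inventory, with its current security status."""
        name: str
        kind: str                 # e.g., "backup server", "block array", "SAN switch"
        os_version: str
        unpatched_cves: list = field(default_factory=list)
        insecure_protocols: list = field(default_factory=list)   # e.g., ["telnet", "SMBv1"]

    def inventory_report(devices):
        """Flag devices with unresolved CVEs or obsolete protocols still enabled."""
        findings = []
        for device in devices:
            if device.unpatched_cves:
                findings.append(f"{device.name}: unpatched CVEs {device.unpatched_cves}")
            if device.insecure_protocols:
                findings.append(f"{device.name}: insecure protocols enabled {device.insecure_protocols}")
        return findings or ["No outstanding inventory findings"]

    if __name__ == "__main__":
        devices = [
            StorageDevice("backup-01", "backup server", "10.2", unpatched_cves=["CVE-YYYY-NNNNN"]),  # placeholder CVE ID
            StorageDevice("array-07", "block array", "9.8", insecure_protocols=["SMBv1"]),
        ]
        for finding in inventory_report(devices):
            print(finding)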

Take the 2-minute Ransomware Resiliency Assessment for Backups, and get your own maturity score and practical recommendations – to help protect your data, and ensure recoverability.

The Most Overlooked Security Issues Facing the Financial Services https://solutionsreview.com/backup-disaster-recovery/the-most-overlooked-security-issues-facing-the-financial-services/ Fri, 08 Sep 2023 21:28:33 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=5967 Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise technology. In this feature, Continuity‘s John Meakin offers Data is a major part of the role of any CISO. When it comes to the financial services industry, data is even more important and valuable than in other industries. […]

The post The Most Overlooked Security Issues Facing the Financial Services appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>

Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise technology. In this feature, Continuity’s John Meakin offers his perspective on the most overlooked security issues facing the financial services industry.

Data is a major part of the role of any CISO. When it comes to the financial services industry, data is even more important and valuable than in other industries. Securing storage and backup systems isn’t always obvious and isn’t always the focus of many CISOs or their teams. I admit that it wasn’t part of my focus until quite recently.

So, what is the big picture of securing storage and backup? Is this a Cinderella area in the pursuit of business security? How can you prepare? And where do you go from here? I will share with you my views in this article.

Security & the Financial Sector

All Eyes On Storage And Backup

It’s no secret that modern security is focused on data, particularly in the financial services industry. The rise – and sophistication – of ransomware attacks has been documented by all parties concerned.

From industry publications like Bleeping Computer

“The ALPHV ransomware operation exploits Veritas Backup Exec bugs for initial access. U.S. Cybersecurity and Infrastructure Security Agency (CISA) adds these 3 security issues to its list.”

…to analysts like Gartner

“Harden the components of enterprise backup and recovery infrastructure against attacks by routinely examining backup application, storage and network access and comparing this against expected or baseline activity.”

…to governments finally addressing the issue, like in last year’s White House memo:

“Test the security of your systems and your ability to defend against a sophisticated attack.”

Ransomware is focused on data. As such, the key to mitigating (and ideally neutralizing) that threat is to secure data in storage and backup.

We tend to think of backups as the final layer of protection against ransomware, though in reality they are simply another repository of data in storage, ready to be harvested if not appropriately secured.

This begs the question: are we as CISOs and security leaders currently focused on the most pressing risks?

The Unspoken Gap

The value of business data is growing annually in virtually every organization. Malicious actors recognize this fact, so data-centered attacks continue to grow both in number and sophistication.

Are we really rising to this challenge as CISOs and security leaders? Have we spent enough time analyzing and reinforcing those darker parts of our storage and backup infrastructure that any smart threat would target? This industry-wide oversight is exactly why so many of these attacks succeed.

There are other myths that many CISOs and security leaders believe which feed the current exponential growth of attacks and further demonstrate the industry’s continued failure to harden storage and backup systems. They are the greatest current oversight in cybersecurity.

The Shift In Voice And Focus Of The Financial Services CISO

The truth? In a cloud-fuelled world, storage layers deserve as much attention as computing and networking layers. Cloud providers offer cloud storage as a separate service, carrying a separate set of risks – access keys in AWS S3 storage, for example.

Storage security issues aren’t limited to the cloud either; they spread across the full spectrum of hybrid and on-premise infrastructures. All these modes of storage constitute separate systems, but for whatever reason they haven’t enjoyed the same attention from infrastructure and security experts as those on other layers.

The need for change is also reflected in this Financial Services Research Report, which analyzed the state of storage & backup security:

  • Two-thirds believe that a storage attack will have ‘significant’ or ‘catastrophic’ impacts.
  • 60% are not confident in their ability to recover from a ransomware attack.
  • Two-thirds say securing backups and storage was addressed in recent external audits.

Heading For A Better Future… But How? 4 Steps For Success

Now that we understand the problem, what’s the solution?

Many CISOs already follow the steps below when faced with a new threat. While I’m not here to teach anyone to fish, it’s nonetheless critical to revisit the fundamentals to ensure we’re covering the growing storage and backup security problem in the correct and thorough way it deserves. At this point we’re playing a game of catch-up, and can ill afford missteps.

Use the following four steps to form the foundation of your storage and backup security approach:

1.   Education

The first step is to understand the capabilities of your storage and backup devices. In your real environment, what do you have (not just in theory): which vendors do you use, how are their technologies deployed, and how are roles and responsibilities defined?

An excellent place to begin in this phase is to perform an initial assessment of your storage and backup security. This assessment will detail any risks identified and include the corrective steps for remediation.

The NIST Special Publication 800-209; Security Guidelines for Storage Infrastructure (co-authored by Continuity) is an excellent resource for those looking to develop their storage infrastructure knowledge. It provides a thorough overview of current storage technologies and their relative risk landscapes.

2.   Definition

Once you get the lay of the land, you should define ‘secure enough’ baselines for your storage and backup environments. These baselines need to be detailed, since the environments are complex and the attack surface is convoluted.

For instance: what kind of roles are needed? What kind of controls do we want to have? What level of auditing do we expect?

Once you define these baselines, it’s much easier for the storage admins to ensure they’re fully implemented, audited, and monitored.

We also need to define threats and robust security protocols.

3.   Implementation

With knowledge accrued and threats defined, the rubber needs to meet the road. Now comes the stage of implementing the controls that were previously defined. Please note: usually when the initial gap analysis is done (remember step 1), you end up with a long list of deviations. Now’s the time to iron them out.

Automating these changes is key to keeping remediation wrapped tightly around the attack surface.

Another practice I recommend here is to build KPIs and automatic measurements for the predefined baselines, in order to make sure they are always met.
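As a rough sketch of that idea, the following Python compares each system’s current settings against a predefined baseline and computes a simple compliance KPI. The baseline keys and values are assumptions for illustration; real baselines would be far more detailed and vendor-specific.

    # Hypothetical 'secure enough' baseline for storage and backup systems.
    BASELINE = {
        "immutability_enabled": True,
        "mfa_for_admins": True,
        "obsolete_protocols_disabled": True,
        "audit_logging": "central",
    }

    def baseline_deviations(current_config):
        """List every setting that deviates from the baseline for one system."""
        return [
            f"{setting}: expected {expected!r}, found {current_config.get(setting)!r}"
            for setting, expected in BASELINE.items()
            if current_config.get(setting) != expected
        ]

    def compliance_kpi(configs):
        """KPI: share of systems that fully meet the baseline."""
        if not configs:
            return 0.0
        compliant = sum(1 for config in configs if not baseline_deviations(config))
        return compliant / len(configs)

    if __name__ == "__main__":
        systems = [
            {"immutability_enabled": True, "mfa_for_admins": True,
             "obsolete_protocols_disabled": True, "audit_logging": "central"},
            {"immutability_enabled": False, "mfa_for_admins": True,
             "obsolete_protocols_disabled": False, "audit_logging": "local"},
        ]
        print(f"Baseline compliance: {compliance_kpi(systems):.0%}")
        print(baseline_deviations(systems[1]))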

So, in essence, at this stage security leaders must:

  • Harden storage
  • Implement controls

4.   Ongoing risk management

Storage and backup security demands active, ongoing risk management. As threats continue to evolve, so must we. In order to keep up, lean on:

  • Measurement
  • Reporting
  • Automation

While the above steps might seem obvious, until now their implementation within the area of storage and backup has been less so. CISOs need to be in dialogue with the IT infrastructure teams to ensure that this set of risks is being taken as seriously as it needs to be.

Conclusion

Data is the bread and butter of the 21st century. And just as these valuable resources have always been securely stored and protected, so must an organization make significant investments in data protection, and storage and backup hardening. CISOs have the skill to do it; many simply lack the know-how. The problem needs to be reframed in the minds of security experts, and fast, as the problem of ransomware is already beginning to become a runaway train.

The CISOs Guide to Storage & Backup Cyber Resiliency https://solutionsreview.com/backup-disaster-recovery/the-cisos-guide-to-storage-backup-cyber-resiliency/ Fri, 08 Sep 2023 21:19:20 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=5968 Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise technology. In this feature, Continuity‘s CTO Doron Pinhas offers CISOs rely on information from across the organization about security, particularly from the various IT departments. Unfortunately, the information being fed to CISOs about the state of cybersecurity risk […]

The post The CISOs Guide to Storage & Backup Cyber Resiliency appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>

Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise technology. In this feature, Continuity’s CTO Doron Pinhas offers a guide to storage and backup cyber resiliency for CISOs.

CISOs rely on information from across the organization about security, particularly from the various IT departments. Unfortunately, the information being fed to CISOs about the state of cybersecurity risk is incomplete. There is a blind spot present – a gaping hole. Data about the security posture of their storage and backup systems is either woefully deficient or missing entirely.

That is one of the reasons why CISOs set strategy and approve the procurement of solutions to keep data and systems safe, yet the organization continues to suffer from breaches and attacks. Despite implementing vulnerability management, extended detection and response (XDR), threat monitoring, security information and event management (SIEM), and other technologies, they always seem to be one step behind the cybercriminal fraternity. That state of affairs is likely to remain until the inherent risk posed by vulnerable storage and backup systems is addressed.

CISO Guide to Storage & Backup

False Sense of Security

Part of the problem is that storage and backup systems are thought of as back-end and assumed not to pose the same level of risk as other layers of IT closer to the perimeter. This can lull storage admins, infrastructure managers, and CISOs into a false sense of security.

This is a misconception, and a dangerous one at that. The average enterprise storage device has around 15 vulnerabilities or security misconfigurations. Of these, three are considered a high or critical risk. Therefore, it is vitally important that CISOs understand the magnitude of the threat posed by insecure storage and backup systems and what they need to do about it.

Earlier this year, we interviewed 8 CISOs to get their insights on new data protection methods and the importance of securing storage & backup, including: John Meakin, Former CISO at GlaxoSmithKline and Deutsche Bank, Joel Fulton, Former CISO at Symantec and Splunk, Endré Jarraux Walls, CISO at Customers Bank, and George Eapen, Group CIO (and former CISO) at Petrofac.

Using the Wrong Tools

There are scores of vulnerability scanners, patch management, and configuration management systems in existence. Organizations rely on them to locate areas of potential weakness, remediate them, and deploy patches to resolve known vulnerabilities. These systems do a great job at inventorying and scanning networks, operating systems (OSes) and enterprise applications. But they are typically sketchy when it comes to inventorying and assessing storage and backup issues.

Shockingly, they often miss security misconfigurations and Common Vulnerability and Exposures (CVEs) on popular storage systems from the likes of Dell EMC, NetApp, or Pure, and backup systems from the likes of Veeam, Rubrik, and Veritas. Yet such systems host the crown jewels of enterprise data.

Superficial scans of storage and backup infrastructure can lead CISOs to believe that these systems lie outside the reach of cybercriminals. Nothing could be further from the truth. Hackers are notorious for finding ways to obtain privileges to user accounts and finding their way into storage and backup systems. From there, they can wreak havoc.

The State of Storage and Backup Vulnerabilities

The fact is that hundreds of active security misconfigurations and CVEs currently exist in various storage and backup systems. Our research shows that on average, about 20% of storage devices are currently exposed. That means they are wide open to attack from ransomware and other forms of malware.

A study of enterprise storage devices detected more than 6,000 discrete storage vulnerabilities, backup misconfigurations, and other security issues. At the device level, the average storage device is riddled with vulnerabilities, some of them severe. In addition, there are currently about 70 CVEs in storage environments that could be used to exfiltrate files, initiate denial-of-service attacks, take ownership of files, and block devices. Many of these CVEs are several months old. A few of them are a year or more old. This means that approved patches exist but are not deployed.

Don’t think the bad guys aren’t aware of this. They prefer the easiest possible route into the enterprise. Why come up with a genius plan to breach defenses when all you need to do is scan for some common vulnerabilities and mount an incursion from there?

Storage Security Features Not Implemented

Modern storage devices often include ransomware detection and prevention capabilities. Some include the capability to lock retained copies, protect critical data from tampering and deletion, and air gap data. However, in breach after breach, such features were found to either be misconfigured or not implemented at all – leaving the organization exposed.

Misconfigured backup and storage systems impact cybersecurity in other ways. Zoning and masking mistakes may leave LUNs accessible to unintended hosts. Replicated copies and snapshots may not be properly secured. Audit logging misconfigurations make it more difficult for the organization to detect brute force attacks and spot anomalous behavior patterns. They can also impede forensic investigation and curtail recovery efforts. And a surprising number of storage and backup systems still operate with their original default administrative passwords. These factory settings can be easily exploited by unauthorized employees and malicious actors to inflict serious damage.

These are just a few of the many security challenges that are present within enterprise infrastructure. There are many other areas to check. The bottom line is that storage and backup systems generally have a significantly weaker security posture than the compute and network infrastructure layers. It is a ticking time bomb ripe for exploitation by criminal gangs.
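A handful of the misconfigurations described above lend themselves to simple automated checks. The Python sketch below is purely illustrative, assuming a flattened device-configuration record; the default-credential list is a generic placeholder, not drawn from any vendor.

    # Illustrative factory-default credential pairs; not drawn from any specific vendor.
    KNOWN_DEFAULT_CREDENTIALS = {("admin", "admin"), ("admin", "password"), ("root", "changeme")}

    def misconfiguration_findings(device):
        """Flag a few basic storage/backup misconfigurations in one device record.

        device: {"name": ..., "admin_user": ..., "admin_password": ...,
                 "audit_logging_enabled": bool, "snapshots_locked": bool}
        """
        findings = []
        if (device["admin_user"], device["admin_password"]) in KNOWN_DEFAULT_CREDENTIALS:
            findings.append(f"{device['name']}: factory-default administrative credentials in use")
        if not device["audit_logging_enabled"]:
            findings.append(f"{device['name']}: audit logging disabled")
        if not device["snapshots_locked"]:
            findings.append(f"{device['name']}: retained copies/snapshots are not locked")
        return findings

    if __name__ == "__main__":
        device = {"name": "array-03", "admin_user": "admin", "admin_password": "admin",
                  "audit_logging_enabled": False, "snapshots_locked": True}
        for finding in misconfiguration_findings(device):
            print(finding)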

How to Harden Storage and Backup Security

Storage and backup systems must be fully secured to protect data and ensure recoverability. StorageGuard finds the security risks that other vulnerability management tools miss. Developed specifically for storage and backup systems, its automated risk detection engines check for thousands of possible security misconfigurations and vulnerabilities at the storage system and backup system level that might pose a security threat to enterprises data. It analyzes block, object, and IP storage systems, SAN/NAS, storage management servers, storage appliances, virtual SAN, storage networking switches, data protection appliances, storage virtualization systems, and backup devices.

Continuity’s StorageGuard ensures these systems will never be the weakest link in cybersecurity. Its comprehensive approach to scanning storage and backup systems offers complete visibility into blind spots, automatically prioritizes the most urgent risks, and helps remediate them.


The post The CISOs Guide to Storage & Backup Cyber Resiliency appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

3 Ways Business Leaders Can Help Create New AI Regulations https://solutionsreview.com/backup-disaster-recovery/ai-regulations-are-coming-here-are-3-ways-business-leaders-can-help-to-formulate-the-new-rules/ Fri, 01 Sep 2023 20:53:34 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=6111

Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise technology. In this feature, Cohesity‘s Greg Statton offers commentary on upcoming and new AI regulations and how enterprises can help write the rules.

The calls for regulating artificial intelligence (AI) are getting stronger. Developers, business leaders, academics and politicians on both sides of the aisle are calling for new rules aimed at making AI more accountable. This summer, Senate Majority Leader Chuck Schumer unveiled a legislative framework for regulating the new technology and announced a series of “AI Insight Forums” with top AI experts to explore how these regulations would work. Even self-described libertarian business leader Elon Musk has called for regulations around AI.

As questions surrounding AI regulation move from ‘if’ to ‘when’ and ‘how’, business leaders need to be proactive in helping the federal government formulate and roll out these new rules. A failure to participate in the process could threaten businesses’ ability to use AI to innovate, create new products, and compete with foreign entities. AI is poised to radically transform the world around us, and the time to act is now. Business leaders simply can’t watch the situation play out from the sidelines.

 

Restoring Trust by Building a Culture of Responsible AI

As AI continues to evolve, the federal government is concerned about safety, security and trust – including potential risks associated with misuse of the technology, such as biosecurity and cybersecurity threats, and broader societal effects like protecting Americans’ rights and physical safety. Data sits at the heart of these concerns (AI is only as good as the data you feed into it, of course), and the quality and management of that data directly influence the outcomes and potential risks that come with AI.

Business leaders can partner with the government to address these concerns by committing to common-sense, responsible AI development and deployment. This includes ensuring products are safe before launching them into the marketplace, building security and privacy safeguards directly into AI engines and earning the public’s trust by building a culture of transparency and accountability throughout their organizations. In parallel, businesses should prioritize research on the societal risks posed by AI systems – collaborating with governments, civil society and academia to share information and best practices for managing AI risks.

Here are three ways businesses can proactively address these concerns and create a culture of responsible AI in their organizations:

Develop & Implement Internal AI Ethics Guidelines

The business community can offer the government a set of best practices and guidelines by developing its own set of internal rules and regulations. This requires establishing a clear set of ethical principles that focus on transparency, fairness, privacy, security and accountability. Most companies today already adhere to clear policies around risk and bias, so it wouldn’t be much of a stretch to extend these policies across all data in the organization.

As policies change over time, it’s important to look back and curate older data sets that are still relevant to make sure new, evolving principles are being applied. As models learn, even small, imperceptible biases can create huge problems down the road – just ask Apple. Building and applying these guidelines internally ensures that, when government regulations roll out on a grander scale, they will already have been tested in practice.

Encourage Collaboration & Knowledge Sharing

As AI democratizes, organizations should foster a culture of collaboration and knowledge sharing within the organization and with key stakeholders – such as employees, partners, customers and the public as a whole. The pace of innovation is faster than the world has ever seen, but this agility has expanded threat surfaces beyond traditional perimeters and threatens to create information silos within organizations. Knowledge workers across disciplines have different ideas and use cases for AI that engineers or developers haven’t considered. Openly encouraging cross-functional collaboration makes it easier to monitor and control AI development and use throughout the organization while breaking down silos.

Provide AI Ethics Training and Education

AI has the power to make employees’ lives easier, but it comes with great responsibility. Businesses need to make sure their workers understand the risks associated with using public and proprietary AI tools – especially the risks surrounding adversarial attacks, privacy preservation and bias mitigation. Clear guidelines around the kind of data employees can feed into AI engines help protect personally identifiable information (PII), intellectual property (IP) and other trade secrets. Consent is also important – making sure customers, employees and other stakeholders are comfortable with their data being used to train AI models. The last thing businesses want is for employees to go rogue and use AI on their own, outside the guidelines set out by the organization. The lack of visibility and control would be a recipe for disaster and set a dangerous precedent at a time when federal regulations are being formed.
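One hedged example of turning such a guideline into something enforceable is a redaction step that strips obvious PII from text before it ever reaches an AI engine. This is a minimal sketch; the patterns below are illustrative and far from a complete PII taxonomy.

```python
# Minimal sketch: redact obvious PII from text before it is sent to any AI engine.
# The regular expressions are illustrative, not a complete PII taxonomy.

import re

PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace matched PII with a labeled placeholder before the text leaves the org."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

prompt = "Summarize the complaint from jane.doe@example.com, SSN 123-45-6789."
print(redact(prompt))
# -> Summarize the complaint from [EMAIL REDACTED], SSN [SSN REDACTED].
```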

Get Proactive to Get Ahead

Public trust is eroding across the board, according to the Edelman Trust Barometer, and governments are looking to increase regulations to get back on track. AI is firmly in their crosshairs, and business leaders need to get out in front of the process to provide clear ethical and procedural guidelines around the new technology’s development and deployment. This includes developing their own responsible AI principles internally, encouraging collaboration across the organization and ensuring all stakeholders are up to date on clear guidelines around protecting PII, IP and other critical data points. AI is going to change the world, and businesses have an opportunity to show regulators that they are serious about taking responsibility for using the technology in a safe, ethical manner.

 

The post 3 Ways Business Leaders Can Help Create New AI Regulations appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

Four Key Steps a CIO Should Take after a Ransomware Attack https://solutionsreview.com/backup-disaster-recovery/key-steps-a-cio-should-take-after-a-ransomware-attack/ Fri, 01 Sep 2023 20:53:24 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=6109

Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise technology. In this feature, Veeam CIO Nate Kurtz offers a commentary on key steps enterprises need to take after they’ve suffered a ransomware attack.

The infamous MOVEit vulnerability threatening enterprises everywhere has, of late, begun exposing companies that don’t even use the tool, simply because their business partners do. Cyberattacks are proliferating with concerning ease and speed, and not everyone is prepared for them.

As a CIO myself, I’m keenly aware of the pressures CIOs face, and I have worked alongside Veeam’s own CISO to develop a strategic, targeted response to cyberattacks. What I’ve found is that there are four crucial measures in an effective post-attack response.

After a Ransomware Attack

Observe

When faced with a ransomware attack, our first instinct from a security perspective is to eliminate the threat and resolve the issue. Truthfully, this isn’t the best move.

Instead, a CIO should first focus on quickly isolating the bad actor within the environment. Sequestering them without removal is helpful because 1) it prevents the bad actor from harming other parts of the environment, and 2) it allows you to observe their actions. Eliminating or resolving the threat is tempting, but it often forfeits the opportunity to analyze the threat actor’s actions, which can reveal a lot about their intent, target, and strategy, as well as the company’s own vulnerabilities. It is also critical to understand the extent of the compromise from both a systems and a data perspective.

Critical observation will help CIOs gain a better understanding of how the threat actor operated, and down the line, this knowledge will also help develop a proactive approach for the next ransomware attack.
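As a rough illustration of the “sequester, don’t remove” idea, the sketch below assumes a Linux gateway you control sits between the compromised host and the rest of the environment: it logs the host’s traffic (so observation can continue) while blocking it from reaching other segments. The IP address is a placeholder, and in practice containment would usually run through your EDR or network tooling rather than a hand-run script.

```python
# Minimal sketch, assuming a Linux gateway between the compromised host and the
# rest of the environment. Logs the host's attempts and blocks onward traffic.
# Not a substitute for EDR containment; test in a lab before using anywhere real.

import subprocess

def quarantine(host_ip: str) -> None:
    # Each `iptables -I` pushes a rule to the top of the FORWARD chain, so we
    # insert DROP first and LOG second: the final order is LOG (records the
    # attempt) followed by DROP (stops the traffic).
    rules = [
        ["iptables", "-I", "FORWARD", "-s", host_ip, "-j", "DROP"],
        ["iptables", "-I", "FORWARD", "-s", host_ip, "-j", "LOG",
         "--log-prefix", "quarantine: "],
    ]
    for rule in rules:
        subprocess.run(rule, check=True)

quarantine("10.0.5.23")  # placeholder address of the compromised host
```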

Correct

Now that you have a comprehensive understanding of how the attacker infiltrated your company, you can take corrective measures.

What do ‘corrective measures’ entail? Namely, removing the threat, patching up the attack vector, recovering systems and data, and addressing any other damage the attacker may have caused. Once a CIO has done the necessary footwork to obtain valuable data on attacker intent, behavior patterns, knowledge, and impact, it’s high time the attacker be eliminated. In the observation stage, the attacker is siloed off to prevent them from accessing and harming more of the company’s data and processes. Deploy the tools required for removal, and do so knowing that the attacker will not be able to immediately return through the original breach or any other readily visible vulnerability.

Once the attacker’s presence has been removed, a CIO can review the damage done in full, checking through valuable data, backups, and logs to determine what is missing, whether it can be recovered or restored from a copy, and what may require further action.
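One way to ground that review is to compare critical backup files against an out-of-band manifest of hashes captured before the incident. The sketch below assumes such a manifest exists; the path and hash value are placeholders.

```python
# Minimal sketch, assuming an out-of-band manifest of SHA-256 hashes was kept
# for critical backup files. Flags backups that are missing or were altered
# (for example, encrypted by ransomware). Paths and hashes are placeholders.

import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backups(manifest: dict[str, str]) -> None:
    for filename, expected in manifest.items():
        p = Path(filename)
        if not p.exists():
            print(f"MISSING   {filename}")
        elif sha256_of(p) != expected:
            print(f"TAMPERED  {filename}")
        else:
            print(f"OK        {filename}")

# Hypothetical manifest captured before the incident (placeholder hash)
verify_backups({
    "/backups/db-2023-08-30.dump": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
})
```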

Prevent

With the threat actor removed and the breach secured, CIOs can kick off preventative measures to avoid undergoing such an attack again. A scan of your security controls will help identify any immediate gaps or vulnerabilities in your attack surface.

While an attacker may not return to the scene of the crime for another go, knowing their point of attack can help patch the vulnerability and protect against another threat. In reviewing the criminal profile stemming from the attack, as a CIO, you must focus on the key variables at play: the target, the attacker’s identity, the actions they took, and the impact they caused. These factors are crucial to determining next steps to reduce future risks. Identify the pattern of behavior to determine if similar activity could cause another, or wider, breach.
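A simple way to look for that pattern is to sweep collected logs for the indicators observed during the incident. The sketch below assumes logs have been centralized to a directory; the indicator values and path are illustrative.

```python
# Minimal sketch: sweep centralized logs for indicators observed during the
# incident (source IPs, tool names, suspicious paths) to see whether the same
# pattern of behavior appears elsewhere. Indicator values are illustrative.

from pathlib import Path

INDICATORS = ["203.0.113.7", "rclone.exe", "/tmp/.payload"]

def sweep(log_dir: str) -> None:
    for log_file in Path(log_dir).glob("*.log"):
        for line_no, line in enumerate(log_file.read_text(errors="ignore").splitlines(), 1):
            for ioc in INDICATORS:
                if ioc in line:
                    print(f"{log_file.name}:{line_no} matched {ioc}: {line.strip()}")

sweep("/var/log/collected")  # hypothetical central log drop
```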

Security vulnerabilities are often seen as technical issues, but the biggest risk is the people working within the organization. Most attackers enter companies through social engineering – phishing scams or the like, preying on the distracted employee. In cases where such a lapse leads to an attack, you can immediately restrict or lock down the affected employees’ access to avoid further harm.
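If the compromise traces back to specific accounts, that containment step might look like the sketch below. It assumes local Linux accounts and uses `usermod -L`; in most enterprises the equivalent action would go through the identity provider’s admin console or API. The usernames are placeholders.

```python
# Minimal sketch: lock a set of affected local Linux accounts. In practice the
# same step would usually go through the identity provider. Usernames illustrative.

import subprocess

def lock_accounts(usernames: list[str]) -> None:
    for user in usernames:
        # -L disables password-based login without deleting the account,
        # so the account and its artifacts remain available for investigation.
        subprocess.run(["usermod", "-L", user], check=True)
        print(f"locked {user}")

lock_accounts(["jdoe", "asmith"])  # accounts caught by the phishing campaign
```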

Only when you have taken all the precautionary measures above to reduce or eliminate further threats can you move on to stage four: relaying the news.

Notify

It’s never fun breaking the news of a ransomware attack to your stakeholders. But transparency is valuable to retaining trust and loyalty while keeping the industry informed about emerging threats.

You must be purposeful in your notification. Sharing everything without a plan not only risks the company reputation, but also leaves you vulnerable to future attacks. Instead, start by reaching out to key parties – the board, the company’s legal team, and business stakeholders. If there has been a loss or theft of customer data, this can open the door to legal repercussions. Coordinate with your legal team and board to align on messaging and what information on the attack can be shared, with whom, and when.

It can take days to weeks to address an attack sequentially and thoughtfully. By this time, you will likely have the information to provide and be able to reassure customers of your company’s commitment to protecting their data and of the actionable steps taken to prevent further attacks. Doing so demonstrates that you value your customers and helps retain their loyalty and trust.

What Comes Next?

While ransomware attackers don’t normally target the same gap twice, they can, and likely will, strike again. Taking a backward-looking approach and securing only already-breached zones is not going to be effective for long. Instead, CIOs should consider the potential vulnerabilities and targets and get in front of them before an attack can occur.

In the end, CIOs who follow the post-ransomware attack procedure, in whatever capacity, should operate with a primary goal in mind: to secure the future of the company.


The post Four Key Steps a CIO Should Take after a Ransomware Attack appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.
