Alex Nimigean

Big Data is Bringing Big Challenges

Updated: Apr 9, 2020


Data is transforming medicine, transportation, and many other industries. In theory, data is great. In practice, as our reliance on it grows and big data gets bigger every day, the real challenge is making sense of which data matters and how best to use it to solve critical business problems.

In just a few years, big data has advanced from scattered experimental projects to mission-critical status in digital enterprises, and its importance continues to grow. According to IDC, by 2020 organizations able to analyze all relevant data and deliver actionable information will earn $430 billion more than their less analytically oriented peers.

Veritas Data Insight gives you the analytics, tracking, and reporting necessary to deliver organizational accountability for file use and security. Designed to manage the needs of organizations with petabytes of data and billions of files, Data Insight integrates with archiving and security solutions to prevent data loss and ensure policy-based data retention.

  • Automate governance through workflows and customization

  • Drive efficiencies and cost savings in your unstructured data environment

  • Maintain regulatory compliance for information access, use, and retention (a minimal retention-check sketch follows this list)

  • Protect confidential information from unauthorized use and exposure
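
To make the retention idea concrete, here is a minimal sketch, in plain Python rather than Data Insight itself, of how a policy-based retention scan over an unstructured file share might work. The root path, retention window, and report format are illustrative assumptions, not the Data Insight API.

    # retention_scan.py - toy illustration of a policy-based retention check.
    # This is NOT Data Insight; paths and thresholds are made-up examples.
    import os
    import time
    from pathlib import Path

    RETENTION_DAYS = 7 * 365            # hypothetical seven-year policy
    SCAN_ROOT = Path("/mnt/fileshare")  # hypothetical unstructured share

    def expired_files(root, retention_days):
        """Yield files whose last-modified time exceeds the retention window."""
        cutoff = time.time() - retention_days * 86400
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = Path(dirpath) / name
                try:
                    if path.stat().st_mtime < cutoff:
                        yield path
                except OSError:
                    continue  # skip files that vanish or deny access mid-scan

    if __name__ == "__main__":
        for path in expired_files(SCAN_ROOT, RETENTION_DAYS):
            # A real governance workflow would archive the file or route it
            # for review; here we only report it.
            print(f"RETENTION EXPIRED: {path}")

A product like Data Insight layers reporting, ownership analytics, and automated workflows on top of this basic idea, but the core question every retention policy answers is the same: which files have outlived their mandated lifetime, and what should happen to them next.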

Big data’s climb to the top rung of the information ladder means it must be treated with care. Backup, once an afterthought for big-data volumes, is now essential and must be completed promptly and reliably. IT leaders are taking heed: in an ESG survey on the top five data center modernization priorities for 2018, 31% of respondents indicated that improving data backup and recovery was a high priority. Increasingly, backup and recovery resources will be dedicated to big data.

To back up data effectively, many businesses will have to make significant operational changes. Legacy architectures that were sufficient for earlier-generation workloads must be revamped with modern architectures to protect environments running big data workloads such as Hadoop.

Veritas NetBackup is designed to meet the needs of big-data backup. It includes the Veritas NetBackup Parallel Streaming Framework, built for large, scale-out, multinode cluster workloads in Hadoop environments. By backing up the Hadoop Distributed File System (HDFS) natively, the agentless NetBackup 8.1 eliminates the need for complex workarounds. Agentless architecture matters in modern environments: with no agent footprint on the cluster nodes, there are no per-node agents to manage, which saves time and money, and Hadoop upgrades raise no agent-compatibility concerns.
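
NetBackup's internals are proprietary, but the underlying idea of backing up HDFS natively, through the file system's own interfaces rather than agents on every node, can be sketched with stock Hadoop tooling. The sketch below assumes a snapshottable HDFS directory and a reachable backup cluster; the paths and cluster URI are placeholders, and this is a generic illustration, not the Parallel Streaming Framework itself.

    # hdfs_backup_sketch.py - agentless HDFS backup using stock Hadoop
    # tooling (snapshots + distcp). Generic illustration only; cluster
    # URIs and paths are hypothetical.
    import subprocess
    import time

    SOURCE_DIR = "/data/warehouse"                     # hypothetical HDFS path
    BACKUP_URI = "hdfs://backup-cluster:8020/backups"  # hypothetical target

    def run(cmd):
        """Run a Hadoop CLI command from an edge node, failing loudly."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def backup(source_dir, backup_uri):
        snap = f"bk-{int(time.time())}"
        # One-time setup: mark the directory snapshottable (needs HDFS admin).
        run(["hdfs", "dfsadmin", "-allowSnapshot", source_dir])
        # Take a consistent, point-in-time snapshot of the directory.
        run(["hdfs", "dfs", "-createSnapshot", source_dir, snap])
        # Copy the frozen snapshot in parallel. distcp fans the work out as a
        # cluster job, so data moves from the datanodes themselves with no
        # backup agent installed on any node.
        run(["hadoop", "distcp",
             f"{source_dir}/.snapshot/{snap}", f"{backup_uri}/{snap}"])
        # Drop the snapshot once the copy has landed.
        run(["hdfs", "dfs", "-deleteSnapshot", source_dir, snap])

    if __name__ == "__main__":
        backup(SOURCE_DIR, BACKUP_URI)

Because the snapshot is a metadata-only operation and the copy runs as a distributed job, nothing has to be installed, patched, or upgraded on the cluster nodes; that is the operational payoff the agentless approach is after.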

Conclusions

Big-data analytics have emerged from the test lab to take their place in the enterprise mainstream, establishing a track record of strategic value at many leading corporations. With business success or failure hanging in the balance, organizations must analyze large quantities of data in real time.

The infrastructure supporting big-data backup and recovery must adapt to this mission-critical role. The Veritas NetBackup Parallel Streaming Framework overcomes the obstacles inherent in backing up big-data environments. Veritas NetBackup 8.1 delivers faster performance while reducing risk and consuming fewer storage resources, opening new horizons for digital enterprises in their quest to gain a strategic edge from big-data analytics.
