
News by Topic

data volumes

Results 1 - 25 of 103
By: TeamQuest     Published Date: Apr 09, 2014
TeamQuest Director of Global Services Per Bauer explains how to manage services in relation to servers, storage, network, power and floor space. Understand costing data, incidents, business transaction volumes, and demand forecasts. Watch this short video to learn more about Optimization in a Box and how to quickly improve your ability to optimize business services today.
Tags : teamquest, virtualization, it professionals, optimize software, distributed environment, data center, performance
     TeamQuest
By: McAfee     Published Date: Nov 07, 2014
According to the "Needle in a Datastack" report, companies are vulnerable to security breaches because they are unable to adequately analyze or store Big Data. These ever-growing volumes of events, along with data about assets, threats, users and other relevant information, have created a major Big Data challenge for security teams. To solve this challenge, companies have moved away from traditional data management architectures and adopted systems dedicated to managing security data in the era of advanced persistent threats (APTs).
Tags : siem, big security data, big data security, security information, advanced threats, advanced persistent threats, apt, security intelligence
     McAfee
By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
Challenge: It is not uncommon for SAP system copies, including any post-editing, to take several days to complete. Meanwhile, testing, development and training activities come to a standstill, and the large number of manual tasks in the entire process ties up highly skilled SAP BASIS staff.
Opportunity: Enterprises are looking to automation as a way to accelerate SAP system copies and free up staff. However, this is only one part of the problem: what further complicates the system copy process is the need to safeguard sensitive data and manage huge data volumes while also ensuring that the data used in non-production systems adequately reflects the data in production systems, so the quality of development, testing and training activities is not compromised.
Benefits: This white paper explains how a considerable portion of the SAP system copy process can be automated using the CA Automic Automated System Copy for SAP solution and SNP T-Bone, helping enterprises become more agile.
Tags : 
     CA Technologies_Business_Automation
By: AWS - ROI DNA     Published Date: Aug 09, 2018
In today's big data digital world, your organization produces large volumes of data with great velocity. Generating value from this data and guiding decision making require quick capture, analysis and action. Without strategies to turn data into insights, the data loses its value and insights become irrelevant. Real-time data integration and analytics tools play a crucial role in harnessing your data so you can enable business and IT stakeholders to make evidence-based decisions.
Tags : 
     AWS - ROI DNA
By: MarkLogic     Published Date: Jun 09, 2017
Today, data is big, fast, varied and constantly changing. As a result, organizations are managing hundreds of systems and petabytes of data. However, many organizations are unable to get the most value from their data because they’re using relational database management systems (RDBMS) to solve problems those systems weren’t designed to fix. Why change? In this white paper, we dive into the details of why relational databases are ill-suited to handle the massive volumes of disparate, varied, and changing data that organizations have in their data centers. It is for this reason that leading organizations are going beyond relational to embrace new kinds of databases. And when they do, the results can be dramatic.
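As an illustration of the schema-flexibility argument, here is a minimal sketch (not MarkLogic's API; the records and the SQLite document table are invented for the example) of storing varied records as documents so that new fields never force a schema migration:

```python
import json
import sqlite3

# Heterogeneous records: each has a different shape, and new fields appear over time.
records = [
    {"id": 1, "type": "order",  "sku": "A-100", "qty": 3},
    {"id": 2, "type": "sensor", "device": "pump-7", "readings": [0.4, 0.7, 0.9]},
    {"id": 3, "type": "note",   "author": "ops", "text": "checked valve"},
]

# Document approach: store each record as-is; adding a new field later requires no
# schema migration, unlike adding a column to a rigid relational table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO docs (id, body) VALUES (?, ?)",
    [(r["id"], json.dumps(r)) for r in records],
)

# Query across the varied documents without a predefined column for every field.
for (body,) in conn.execute("SELECT body FROM docs"):
    doc = json.loads(body)
    print(doc["type"], doc.get("qty", "-"))
```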
Tags : 
     MarkLogic
By: MarkLogic     Published Date: Nov 07, 2017
Today, data is big, fast, varied and constantly changing. As a result, organizations are managing hundreds of systems and petabytes of data. However, many organizations are unable to get the most value from their data because they’re using relational database management systems (RDBMS) to solve problems those systems weren’t designed to fix. Why change? In this white paper, we dive into the details of why relational databases are ill-suited to handle the massive volumes of disparate, varied, and changing data that organizations have in their data centers. It is for this reason that leading organizations are going beyond relational to embrace new kinds of databases. And when they do, the results can be dramatic.
Tags : 
     MarkLogic
By: IBM     Published Date: Aug 05, 2014
There is a lot of discussion in the press about Big Data. Big Data is traditionally defined in terms of the three V’s of Volume, Velocity, and Variety. In other words, Big Data is often characterized as high-volume, streaming, and including semi-structured and unstructured formats. Healthcare organizations have produced enormous volumes of unstructured data, such as the notes by physicians and nurses in electronic medical records (EMRs). In addition, healthcare organizations produce streaming data, such as from patient monitoring devices. Now, thanks to emerging technologies such as Hadoop and streams, healthcare organizations are in a position to harness this Big Data to reduce costs and improve patient outcomes. However, this Big Data has profound implications from an Information Governance perspective. In this white paper, we discuss Big Data Governance from the standpoint of three case studies.
Tags : ibm, data, big data, information, healthcare, governance, technology
     IBM
By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data—structured and unstructured—can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
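A minimal sketch of the "store as-is" idea, assuming Amazon S3 as the central store via boto3; the bucket name, object keys and sample records are hypothetical, and AWS credentials plus an existing bucket are assumed:

```python
import json
import boto3

s3 = boto3.client("s3")          # assumes AWS credentials are already configured
BUCKET = "example-data-lake"     # hypothetical bucket acting as the central store

# Structured data lands as-is, under a prefix describing its source, not a schema.
event = {"user": "u-123", "page": "/pricing", "ts": "2018-11-02T10:15:00Z"}
s3.put_object(
    Bucket=BUCKET,
    Key="raw/clickstream/2018/11/02/event-0001.json",
    Body=json.dumps(event).encode("utf-8"),
)

# Unstructured data (a raw log line) lands next to it, unchanged.
s3.put_object(
    Bucket=BUCKET,
    Key="raw/app-logs/2018/11/02/server-01.log",
    Body=b"2018-11-02 10:15:01 INFO request handled in 42ms\n",
)
# Categorisation, processing and analysis happen later, against the raw objects.
```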
Tags : data, lake, amazon, web, services, aws
     AWS
By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data in a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data - structured and unstructured - can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
     AWS
By: Amazon Web Services     Published Date: Oct 09, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyse data that addresses many of these challenges. Data Lakes allow an organization to store all of their data, structured and unstructured, in one, centralized repository.
Tags : cost effective, data storage, data collection, security, compliance, platform, big data, it resources
     Amazon Web Services
By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of its data, structured and unstructured, in one centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand. Download to find out more now.
Tags : 
     Amazon Web Services
By: IBM     Published Date: May 28, 2014
Different types of data have different data retention requirements. In establishing information governance and database archiving policies, take a holistic approach by understanding where the data exists, classifying the data, and archiving the data. IBM InfoSphere Optim™ Archive solution can help enterprises manage and support data retention policies by archiving historical data and storing that data in its original business context, all while controlling growing data volumes and improving application performance. This approach helps support long-term data retention by archiving data in a way that allows it to be accessed independently of the original application.
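As a generic illustration of policy-driven archiving (not the InfoSphere Optim product itself; the tables and retention cutoff are invented), the sketch below copies rows older than a retention cutoff into an archive table and removes them from the active table, leaving the archived data queryable on its own:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, order_date TEXT, amount REAL);
    CREATE TABLE orders_archive (id INTEGER PRIMARY KEY, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES (1, '2008-03-15', 120.0),
                              (2, '2013-11-02',  75.5),
                              (3, '2014-04-01', 310.0);
""")

RETENTION_CUTOFF = "2012-01-01"  # policy: only data newer than this stays active

# Archive: copy historical rows into the archive table, then trim the active table.
conn.execute("INSERT INTO orders_archive SELECT * FROM orders WHERE order_date < ?",
             (RETENTION_CUTOFF,))
conn.execute("DELETE FROM orders WHERE order_date < ?", (RETENTION_CUTOFF,))
conn.commit()

print("active rows:  ", conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])
print("archived rows:", conn.execute("SELECT COUNT(*) FROM orders_archive").fetchone()[0])
```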
Tags : ibm, data retention, information governance, archiving, historical data, integrating big data, governing big data, integration
     IBM
By: EMC Corporation     Published Date: Feb 13, 2015
Clutch Group provides litigation, compliance, and legal services to major Fortune 500 companies in sectors with large legal exposure (Finance, Life Sciences, Natural Resources, etc.). By using purpose-built technology solutions to extract deep insights from large data volumes, the company transformed its business approach of helping client firms make better legal decisions.
Tags : all-flash-array, purpose-built technology, large data volumes, business approach, legal decisions
     EMC Corporation
By: Mimecast     Published Date: Oct 11, 2018
Information management is getting harder. Organizations face increasing data volumes, more stringent legal and regulatory record-keeping requirements, stricter privacy rules, increasing threat of breaches and decreasing employee productivity. Companies are also finding that their old-fashioned, legacy archive strategies are increasingly ineffective. This is driving many organizations to rethink their approach, developing more modern Information Governance strategies.
Tags : 
     Mimecast
By: CrowdStrike     Published Date: Feb 01, 2017
One of the biggest challenges to effectively stopping breaches lies in sifting through vast amounts of data to find the subtle clues that indicate an attack is imminent or underway. As modern computer systems generate billions of events daily, the amount of data to analyze can reach petabytes. Compounding the problem, the data is often unstructured, discrete and disconnected. As a result, organizations struggle to determine how individual events may be connected to signal an impending attack. Download the white paper to learn:
• How to detect known and unknown threats by applying high-volume graph-based technology, similar to the ones developed by Facebook and Google
• How CrowdStrike solved this challenge by building its own proprietary graph data model
• How CrowdStrike Threat Graph™ collects and analyzes massive volumes of security-related data to stop breaches
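To illustrate the general idea of graph-based event correlation (an invented sketch, not CrowdStrike Threat Graph's actual model; the events and linkage rules are hypothetical), the snippet below links discrete events that share a host or user and groups them into connected chains of activity:

```python
from collections import defaultdict

# Discrete security events; each looks benign in isolation.
events = [
    {"id": "e1", "host": "ws-42", "user": "alice", "action": "login"},
    {"id": "e2", "host": "ws-42", "user": "alice", "action": "new_process"},
    {"id": "e3", "host": "srv-7", "user": "alice", "action": "file_copy"},
    {"id": "e4", "host": "ws-99", "user": "bob",   "action": "login"},
]

# Build an undirected graph: connect events that share a host or a user.
graph = defaultdict(set)
for a in events:
    for b in events:
        if a["id"] != b["id"] and (a["host"] == b["host"] or a["user"] == b["user"]):
            graph[a["id"]].add(b["id"])
            graph[b["id"]].add(a["id"])

def components(nodes, graph):
    """Group nodes into connected components; each is a candidate chain of activity."""
    seen, groups = set(), []
    for n in nodes:
        if n in seen:
            continue
        stack, group = [n], []
        while stack:
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            group.append(cur)
            stack.extend(graph[cur] - seen)
        groups.append(sorted(group))
    return groups

print(components([e["id"] for e in events], graph))
# [['e1', 'e2', 'e3'], ['e4']] -- e1..e3 link into one chain via shared host/user.
```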
Tags : 
     CrowdStrike
By: IBM     Published Date: Jul 05, 2018
Data is the lifeblood of business. And in the era of digital business, the organizations that utilize data most effectively are also the most successful. Whether structured, unstructured or semi-structured, rapidly increasing data quantities must be brought into organizations, stored and put to work to enable business strategies. Data integration tools play a critical role in extracting data from a variety of sources and making it available for enterprise applications, business intelligence (BI), machine learning (ML) and other purposes. Many organizations seek to enhance the value of data for line-of-business managers by enabling self-service access. This is increasingly important as large volumes of unstructured data from Internet-of-Things (IoT) devices are presenting organizations with opportunities for game-changing insights from big data analytics. A new survey of 369 IT professionals, from managers to directors and VPs of IT, by BizTechInsights on behalf of IBM reveals the challe
Tags : 
     IBM
By: IBM     Published Date: Jul 09, 2018
Data is the lifeblood of business. And in the era of digital business, the organizations that utilize data most effectively are also the most successful. Whether structured, unstructured or semi-structured, rapidly increasing data quantities must be brought into organizations, stored and put to work to enable business strategies. Data integration tools play a critical role in extracting data from a variety of sources and making it available for enterprise applications, business intelligence (BI), machine learning (ML) and other purposes. Many organizations seek to enhance the value of data for line-of-business managers by enabling self-service access. This is increasingly important as large volumes of unstructured data from Internet-of-Things (IoT) devices are presenting organizations with opportunities for game-changing insights from big data analytics. A new survey of 369 IT professionals, from managers to directors and VPs of IT, by BizTechInsights on behalf of IBM reveals the challe
Tags : 
     IBM
By: SAS     Published Date: Aug 03, 2016
Data visualization is the visual and interactive exploration and graphic representation of data of any size, type (structured and unstructured) or origin. Visualizations help people see things that were not obvious to them before. Even when data volumes are very large, patterns can be spotted quickly and easily. Visualizations convey information in a universal manner and make it simple to share ideas with others.
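As a small illustration of spotting patterns in large data volumes (the data is synthetic and the plot is a generic matplotlib sketch, not a SAS example), a density plot surfaces structure that a raw table or an over-plotted scatter would hide:

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic "large" dataset: two overlapping clusters of 100,000 points each.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100_000), rng.normal(3, 0.5, 100_000)])
y = np.concatenate([rng.normal(0, 1, 100_000), rng.normal(2, 0.5, 100_000)])

# A hexagonal-bin density plot makes the two clusters obvious at this volume,
# where a plain scatter plot would saturate into a single blob.
plt.hexbin(x, y, gridsize=60, cmap="viridis")
plt.colorbar(label="points per bin")
plt.xlabel("x")
plt.ylabel("y")
plt.title("Density view of 200,000 points")
plt.show()
```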
Tags : best practices, data visualization, data, technology
     SAS
By: HP     Published Date: May 14, 2014
In today’s world of explosive data volumes, midmarket companies face a growing number of challenges when it comes to safeguarding data, and today's approaches to backup and data protection are falling short. The Brief outlines HP solutions to meet IT needs for backup appliances that tightly integrate deduplication, data protection and recovery.
Tags : hewlett packard, data, backup, data protection, recovery, deduplication
     HP
By: HP     Published Date: Feb 02, 2015
Ever-increasing data volumes, driven by constant growth in both structured and unstructured data and coupled with the ever-decreasing cost of storage capacity on a per-GB basis, continue to put a strain on corporate backup capabilities. While other backup and data optimization technologies offer some relief, deduplicating backup appliances have become the go-to solution. They provide a quick, largely non-disruptive plug-and-play solution that alleviates backup pain and reduces storage consumption by up to 20x, and they have become a proven frontrunner in the ongoing battle to improve the backup experience.
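For intuition about how deduplication achieves such ratios, here is an illustrative sketch (not the appliance's actual algorithm; the chunk size and backup sizes are invented): each backup is split into fixed-size chunks, chunks are hashed, and only unique chunks are stored, so repeated full backups of mostly unchanged data consume little extra space.

```python
import hashlib
import os

CHUNK_SIZE = 4096   # fixed-size chunking for simplicity; real appliances differ
store = {}          # chunk hash -> chunk; only unique chunks are kept on "disk"

def ingest(backup: bytes) -> int:
    """Chunk one backup image, keep unique chunks only, return its logical size."""
    for i in range(0, len(backup), CHUNK_SIZE):
        chunk = backup[i:i + CHUNK_SIZE]
        store.setdefault(hashlib.sha256(chunk).hexdigest(), chunk)
    return len(backup)

# Four weekly full backups: the same 1 MB base image plus a little new data each week.
base = os.urandom(1_000_000)
logical = sum(ingest(base + os.urandom(50_000)) for _ in range(4))

stored = sum(len(c) for c in store.values())
print(f"logical {logical} bytes, stored {stored} bytes, ratio {logical / stored:.1f}:1")
```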
Tags : 
     HP
By: HP     Published Date: Feb 11, 2015
Ever-increasing data volumes, driven by constant growth in both structured and unstructured data and coupled with the ever-decreasing cost of storage capacity on a per-GB basis, continue to put a strain on corporate backup capabilities. While other backup and data optimization technologies offer some relief, deduplicating backup appliances have become the go-to solution. They provide a quick, largely non-disruptive plug-and-play solution that alleviates backup pain and reduces storage consumption by up to 20x, and they have become a proven frontrunner in the ongoing battle to improve the backup experience.
Tags : 
     HP
By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Compression algorithms reduce the number of bits needed to represent a data set. The higher the compression ratio, the more space this data-reduction technique saves. In our OLTP test, the Unity array achieved a compression ratio of 3.2:1 on the database volumes, while the 3PAR array averaged 1.3:1. In the DataMart load test, the 3PAR array reached 1.4:1 on the database volumes, while the Unity array recorded 1.3:1.
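To make those ratios concrete, here is a tiny worked example (the 10 TB logical size is an assumption for illustration, not a figure from the test) showing how much physical capacity each ratio implies:

```python
def stored_tb(logical_tb: float, ratio: float) -> float:
    """Physical capacity consumed at a given compression ratio (ratio : 1)."""
    return logical_tb / ratio

LOGICAL_TB = 10.0  # assumed logical size of the database volumes (illustrative)

for name, ratio in [("Unity, OLTP", 3.2), ("3PAR, OLTP", 1.3),
                    ("3PAR, DataMart", 1.4), ("Unity, DataMart", 1.3)]:
    used = stored_tb(LOGICAL_TB, ratio)
    print(f"{name:<16} {ratio}:1 -> {used:.1f} TB stored, {LOGICAL_TB - used:.1f} TB saved")
```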
Tags : 
     Dell PC Lifecycle
By: Arcserve     Published Date: May 29, 2015
Today, data volumes are growing exponentially and organizations of every size are struggling to manage what has become a very expensive and complex problem. It causes real issues such as:
• Overprovisioning their backup infrastructure to anticipate rapid future growth.
• Legacy systems can’t cope and backups take too long or are incomplete.
• Companies miss recovery point objectives and recovery time targets.
• Backups overload infrastructure and network bandwidth.
• Not embracing new technologies, such as cloud backup, because there is too much data to transfer over wide area networks.
Tags : backup infrastructure, legacy systems, overload infrastructure, cloud, security
     Arcserve