News by Topic

data volumes

By: Akamai Technologies     Published Date: Apr 13, 2018
Cyber attackers are targeting the application programming interfaces (APIs) used by businesses to share data with customers. Consumer mobile adoption, electronic goods and services, and high volumes of data have led businesses to use APIs for data exchange. Unfortunately, attackers can also use APIs to access or deny service to valuable data and systems. This white paper explores strategies for protecting APIs. You’ll learn about APIs, how and why these endpoints are targets for web application attacks, security models, and how Akamai can help.
Tags : api, security, interface, businesses, data, mobile, adoption
     Akamai Technologies
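The Akamai abstract above turns on why API endpoints that share data with customers become attack targets. As a purely illustrative sketch (not drawn from the white paper), the snippet below shows the most basic positive security model for such an endpoint: refuse to serve data unless the caller presents a known API key, compared in constant time. The key store, client names and keys are hypothetical.

```python
# Illustrative sketch only: gate an API endpoint behind known API keys so it
# cannot be used anonymously to read or deny service to backend data.
import hmac

# Hypothetical key store; a real deployment would use a secrets manager or
# an API gateway rather than an in-memory dict.
VALID_API_KEYS = {"partner-a": "k3y-a-secret", "partner-b": "k3y-b-secret"}

def is_authorized(client_id: str, presented_key: str) -> bool:
    """Return True only if the client presents the exact key on record."""
    expected = VALID_API_KEYS.get(client_id)
    if expected is None:
        return False
    # Constant-time comparison avoids leaking key material through timing.
    return hmac.compare_digest(expected, presented_key)

if __name__ == "__main__":
    print(is_authorized("partner-a", "k3y-a-secret"))  # True  -> serve the request
    print(is_authorized("partner-a", "wrong-key"))     # False -> respond 401/403
    print(is_authorized("unknown", "anything"))        # False -> respond 401/403
```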
By: Amazon Web Services     Published Date: Oct 09, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. Data Lakes allow an organization to store all of their data, structured and unstructured, in one centralized repository.
Tags : cost effective, data storage, data collection, security, compliance, platform, big data, it resources
     Amazon Web Services
By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of their data, structured and unstructured, in one centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand. Download to find out more now.
Tags : 
     Amazon Web Services
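Both AWS abstracts above rest on the same idea: a data lake stores data as-is in one central repository, with no predefined schema, and the schema is applied only when a question is asked. The sketch below shows that pattern with Amazon S3 via boto3; the bucket name, prefixes and sample records are assumptions for illustration, not values from the papers.

```python
# Minimal "store as-is, decide schema later" sketch for a data lake on Amazon S3.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-lake"  # assumption: an existing bucket you control

# Land heterogeneous records in their raw form -- no upfront schema required.
clickstream_event = {"user": "u123", "action": "view", "ts": "2018-07-25T10:00:00Z"}
s3.put_object(
    Bucket=BUCKET,
    Key="raw/clickstream/2018/07/25/event-0001.json",
    Body=json.dumps(clickstream_event).encode("utf-8"),
)

support_email = "Subject: order delayed\n\nMy order has not arrived yet..."
s3.put_object(
    Bucket=BUCKET,
    Key="raw/email/2018/07/25/msg-0001.txt",
    Body=support_email.encode("utf-8"),
)

# Later, query engines such as Athena or EMR can read these prefixes and apply
# whatever schema the question of the day requires.
```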
By: Arcserve     Published Date: May 29, 2015
Today, data volumes are growing exponentially and organizations of every size are struggling to manage what has become a very expensive and complex problem. It causes real issues such as:
• Overprovisioning the backup infrastructure to anticipate rapid future growth.
• Legacy systems that can’t cope, so backups take too long or are incomplete.
• Missed recovery point objectives and recovery time targets.
• Backups that overload infrastructure and network bandwidth.
• Reluctance to embrace new technologies, such as cloud backup, because there is too much data to transfer over wide area networks.
Tags : backup infrastructure, legacy systems, overload infrastructure, cloud, security
     Arcserve
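To put numbers on the last issue in the Arcserve list above, a quick back-of-the-envelope calculation shows why moving full backups over a wide area network blows through backup windows. The data sizes, link speed and efficiency factor are assumptions for illustration, not figures from the paper.

```python
# Rough arithmetic: how long does a full backup take to cross a WAN link?
def transfer_hours(data_tb: float, wan_mbps: float, efficiency: float = 0.8) -> float:
    """Hours needed to move data_tb terabytes over a wan_mbps link at the given efficiency."""
    bits = data_tb * 8e12               # 1 TB (decimal) = 8e12 bits
    usable_bps = wan_mbps * 1e6 * efficiency
    return bits / usable_bps / 3600

print(round(transfer_hours(20, 500), 1))  # ~111 hours -- no nightly window survives this
print(round(transfer_hours(2, 500), 1))   # ~11 hours  -- still tight for a single night
```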
By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data in a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data - structured and unstructured - can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
     AWS
By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data—structured and unstructured—can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
Tags : data, lake, amazon, web, services, aws
     AWS
By: AWS     Published Date: Nov 28, 2018
Financial institutions run on data: collecting it, analyzing it, delivering meaningful insights, and taking action in real time. As data volumes increase, organizations demand a scalable analytics platform that can meet the needs of data scientists and business users alike. However, managing an on-premises analytics environment for a large and diverse user base can become time-consuming, costly, and unwieldy. Tableau Server on Amazon Web Services (AWS) is helping major Financial Services organizations shift data visualization and analytics workloads to the cloud. The result is fewer hours spent on manual work and more time to ask deeper questions and launch new data analyses, with easily-scalable support for large numbers of users. In this webinar, you’ll hear how one major asset management company made the shift to cloud data visualization with Tableau Server on AWS. Discover lessons learned, best practices tailored to Financial Services organizations, and starting tactics for scalable analytics on the cloud.
Tags : 
     AWS
By: AWS - ROI DNA     Published Date: Aug 09, 2018
In today’s big data world, your organization produces large volumes of data at great velocity. Generating value from this data and guiding decision making require quick capture, analysis and action. Without strategies to turn data into insights, the data loses its value and the insights become irrelevant. Real-time data integration and analytics tools play a crucial role in harnessing your data so that business and IT stakeholders can make evidence-based decisions.
Tags : 
     AWS - ROI DNA
By: CA Mainframe     Published Date: Sep 12, 2008
Increasing infrastructure complexity has led to unprecedented growth in enterprise systems management data on all systems, including the mainframe, and SMF data is becoming more complex to manage. Download this Technology Brief to learn how to address the complexity of managing SMF data.
Tags : smf, service management, data management, infrastructure, systems management, ca mainframe, mainframe
     CA Mainframe
By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
Challenge: It is not uncommon for SAP system copies, including any post-editing, to take several days to complete. Meanwhile, testing, development and training activities come to a standstill, and the large number of manual tasks in the entire process ties up highly skilled SAP BASIS staff.
Opportunity: Enterprises are looking to automation as a way to accelerate SAP system copies and free up staff. However, this is only one part of the problem: what further complicates the system copy process is the need to safeguard sensitive data and manage huge data volumes while also ensuring that the data used in non-production systems adequately reflects the data in production systems, so the quality of development, testing and training activities is not compromised.
Benefits: This white paper explains how a considerable portion of the SAP system copy process can be automated using the CA Automic Automated System Copy for SAP solution and SNP T-Bone, helping enterprises become more agile.
Tags : 
     CA Technologies_Business_Automation
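One complication named above is safeguarding sensitive data while keeping non-production copies realistic. The sketch below illustrates the general idea of deterministic masking during a copy; it is a hypothetical example, not the CA Automic Automated System Copy or SNP T-Bone implementation, and the field names are assumptions.

```python
# Hypothetical sketch: mask sensitive fields while copying production records
# into a test/training system, so the copy stays realistic but not revealing.
import hashlib

SENSITIVE_FIELDS = {"customer_name", "iban", "email"}  # assumed list of fields to mask

def mask_value(value: str) -> str:
    """Deterministic pseudonym: the same input always yields the same token."""
    return "MASKED-" + hashlib.sha256(value.encode("utf-8")).hexdigest()[:10]

def copy_record(record: dict) -> dict:
    """Return a record suitable for a non-production system."""
    return {key: (mask_value(value) if key in SENSITIVE_FIELDS else value)
            for key, value in record.items()}

prod_row = {"order_id": 1042, "customer_name": "Erika Muster",
            "iban": "DE89370400440532013000", "email": "erika@example.com",
            "amount": 249.90}
print(copy_record(prod_row))
```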
By: CDW     Published Date: Aug 04, 2016
As data volumes grow, you need more than just storage space. Let us help you orchestrate a solution that brings you the scalability and agility you need to move your organization forward. Storage needs are changing rapidly, and legacy appliances and processes just can’t keep up. Old systems are running slowly and filling up fast. At CDW, we can help you evolve your storage with a smart solution that’s ready for what lies ahead.
Tags : data, technology, storage, best practices, best solutions
     CDW
By: Cisco     Published Date: Jan 15, 2015
While this dramatic growth occurs, projections call for the cloud to account for nearly two-thirds of data center traffic and for cloud-based workloads to quadruple relative to traditional servers. That adds another element to the picture: changing traffic patterns. Under a cloud model, a university, for example, can build its network to handle average traffic volumes and then offload data to a public cloud service on higher-traffic days when demand dictates, such as when it’s time to register for the next semester of classes.
Tags : cloud, growth, traffic, projection, account, network
     Cisco
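The university example above is the classic cloud-bursting pattern: provision on-premises capacity for average demand and spill only the excess to a public cloud on peak days. A toy sketch of that routing decision follows; the capacity and demand figures are assumptions for illustration.

```python
# Toy cloud-bursting decision: keep traffic on-prem up to provisioned capacity,
# and offload only the overflow to a public cloud service.
ON_PREM_CAPACITY_GBPS = 10.0  # assumption: sized for average traffic volumes

def route_traffic(demand_gbps: float) -> dict:
    """Split demand between on-premises capacity and public cloud overflow."""
    on_prem = min(demand_gbps, ON_PREM_CAPACITY_GBPS)
    burst = max(0.0, demand_gbps - ON_PREM_CAPACITY_GBPS)
    return {"on_prem_gbps": on_prem, "cloud_gbps": burst}

print(route_traffic(6.0))   # ordinary day: everything stays on-prem
print(route_traffic(23.5))  # registration day: 13.5 Gbps bursts to the cloud
```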
By: Cloudian     Published Date: Feb 15, 2018
We are critically aware of the growth in stored data volumes putting pressure on IT budgets and services delivery. Burgeoning volumes of unstructured data commonly drive this ongoing trend. However, growth in database data can be expected as well, as enterprises capture and analyze data from the myriad of wireless devices now being connected to the Internet. As a result, stored data growth will accelerate. Object-based storage systems are now available that demonstrate these characteristics. While they have a diverse set of use cases, we see several vendors now positioning them as on-premises targets for backups. In addition, these vendors see the integration of object-based data protection storage with cloud storage resources as a key enabler of performance at scale, cost savings, and administrative efficiency.
Tags : 
     Cloudian
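Because the abstract positions object storage as an on-premises backup target with cloud integration, a brief sketch follows of a backup object written to an S3-compatible endpoint (Cloudian HyperStore exposes an S3-compatible API). The endpoint URL, credentials, bucket and file name are placeholders, not real values.

```python
# Hedged sketch: write a backup image to an on-premises, S3-compatible object store.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.internal",  # placeholder on-prem endpoint
    aws_access_key_id="EXAMPLE_KEY",                       # placeholder credentials
    aws_secret_access_key="EXAMPLE_SECRET",
)

# Store tonight's backup as an object; tiering or replication policies on the
# store can later copy it out to public cloud storage.
with open("nightly-db-backup.tar.gz", "rb") as backup_file:
    s3.put_object(Bucket="backups",
                  Key="db/2018-02-15/nightly-db-backup.tar.gz",
                  Body=backup_file)
```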
By: Cohesity     Published Date: May 09, 2018
The growing importance—and complexity—of data protection means old approaches no longer will get the job done in an era of exploding data volumes and ever-changing business requirements. It’s time to reimagine and reengineer your IT infrastructure for a more efficient, affordable and manageable data protection framework.
Tags : 
     Cohesity
By: CrowdStrike     Published Date: Feb 01, 2017
One of the biggest challenges to effectively stopping breaches lies in sifting through vast amounts of data to find the subtle clues that indicate an attack is imminent or underway. As modern computer systems generate billions of events daily, the amount of data to analyze can reach petabytes. Compounding the problem, the data is often unstructured, discrete and disconnected. As a result, organizations struggle to determine how individual events may be connected to signal an impending attack. Download the white paper to learn:
• How to detect known and unknown threats by applying high-volume graph-based technology, similar to the ones developed by Facebook and Google
• How CrowdStrike solved this challenge by building its own proprietary graph data model
• How CrowdStrike Threat Graph™ collects and analyzes massive volumes of security-related data to stop breaches
Tags : 
     CrowdStrike
By: CrowdStrike     Published Date: May 10, 2018
One of the biggest challenges to effectively stopping breaches lies in sifting through vast amounts of data to find the proverbial “needle in the haystack” – the subtle clues that indicate an attack is imminent or underway. As modern computer systems generate billions of events daily, the amount of data to analyze can reach petabytes. Compounding the problem, the data is often unstructured, discrete and disconnected. As a result, organizations struggle to determine how individual events may be connected to signal an impending attack. In this context, detecting attacks is often difficult, and sometimes impossible. This white paper describes how CrowdStrike solved this challenge by building its own graph data model – the CrowdStrike Threat Graph™ – to collect and analyze extremely large volumes of security-related data, and ultimately, to stop breaches. This revolutionary approach applies massive graph-based technologies, similar to the ones developed by Facebook and Google, to detect known and unknown threats.
Tags : 
     CrowdStrike
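Both CrowdStrike abstracts describe the same core idea: events that look unremarkable on their own become meaningful once connected through shared entities. The sketch below is a deliberately simplified, hypothetical illustration of that graph approach, not the Threat Graph implementation: events are nodes, shared hosts, users or file hashes create edges, and a connected component is a candidate attack chain.

```python
# Simplified illustration of graph-based event correlation.
from collections import defaultdict
from itertools import combinations

events = [  # hypothetical telemetry records
    {"id": "e1", "host": "ws-07",  "user": "alice", "hash": "abc1"},
    {"id": "e2", "host": "ws-07",  "user": "alice", "hash": "def2"},
    {"id": "e3", "host": "srv-db", "user": "alice", "hash": "def2"},
    {"id": "e4", "host": "ws-31",  "user": "bob",   "hash": "zzz9"},
]

# Build an undirected graph: events are nodes; sharing a host, user or hash adds an edge.
graph = defaultdict(set)
for a, b in combinations(events, 2):
    if a["host"] == b["host"] or a["user"] == b["user"] or a["hash"] == b["hash"]:
        graph[a["id"]].add(b["id"])
        graph[b["id"]].add(a["id"])

def component(start: str) -> set:
    """All events reachable from start -- one candidate attack chain."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph[node] - seen)
    return seen

print(component("e1"))  # {'e1', 'e2', 'e3'} -- possible lateral movement, worth a look
print(component("e4"))  # {'e4'}             -- isolated event, likely benign
```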
By: Dell     Published Date: Sep 12, 2013
Information is the engine of business growth in the digital age. Market intelligence, customer information, intellectual property and other data can be harnessed to create a quantifiable competitive advantage for a company and lay the path for future expansion. Yet as critical as data can be, the sweeping year-over-year proliferation in data volumes can quickly overwhelm an IT organization. This is much more than a budget problem: Without an effective data storage strategy, an organization is putting a critical resource at risk.
Tags : data storage management, microsoft windows server, market intelligence, customer information, intellectual property, data, it organizations, storage strategy
     Dell