
News by Topic

data volumes

Results 76 - 100 of 104
By: AWS     Published Date: Nov 28, 2018
Financial institutions run on data: collecting it, analyzing it, delivering meaningful insights, and taking action in real time. As data volumes increase, organizations demand a scalable analytics platform that can meet the needs of data scientists and business users alike. However, managing an on-premises analytics environment for a large and diverse user base can become time-consuming, costly, and unwieldy. Tableau Server on Amazon Web Services (AWS) is helping major financial services organizations shift data visualization and analytics workloads to the cloud. The result is fewer hours spent on manual work and more time to ask deeper questions and launch new data analyses, with easily scalable support for large numbers of users. In this webinar, you’ll hear how one major asset management company made the shift to cloud data visualization with Tableau Server on AWS. Discover lessons learned, best practices tailored to financial services organizations, and starting tactics for scalable analytics on the cloud.
Tags : 
     AWS
By: CrowdStrike     Published Date: May 10, 2018
One of the biggest challenges to effectively stopping breaches lies in sifting through vast amounts of data to find the proverbial “needle in the haystack” – the subtle clues that indicate an attack is imminent or underway. As modern computer systems generate billions of events daily, the amount of data to analyze can reach petabytes. Compounding the problem, the data is often unstructured, discrete and disconnected. As a result, organizations struggle to determine how individual events may be connected to signal an impending attack. In this context, detecting attacks is often difficult, and sometimes impossible. This white paper describes how CrowdStrike solved this challenge by building its own graph data model – the CrowdStrike Threat Graph™ – to collect and analyze extremely large volumes of security-related data, and ultimately, to stop breaches. This revolutionary approach applies massive graph-based technologies, similar to the ones developed by Facebook and Google, to detect known and unknown threats.
Tags : 
     CrowdStrike
By: Arcserve     Published Date: May 29, 2015
Today, data volumes are growing exponentially, and organizations of every size are struggling to manage what has become a very expensive and complex problem. This growth causes real issues such as:
• Overprovisioning the backup infrastructure to anticipate rapid future growth.
• Legacy systems that can’t cope, so backups take too long or are incomplete.
• Missed recovery point objectives and recovery time targets.
• Backups that overload infrastructure and network bandwidth.
• Reluctance to embrace new technologies, such as cloud backup, because there is too much data to transfer over wide area networks.
Tags : backup infrastructure, legacy systems, overload infrastructure, cloud, security
     Arcserve
By: Exablox     Published Date: Jan 27, 2015
When it comes to the increasingly complex task of managing data storage, many small and midsize organizations face even greater challenges than large, global enterprises. Small and midsize companies have ever-increasing volumes of information to manage and secure, and they confront a number of difficulties when it comes to storage. Among the biggest hurdles:
›› Scaling storage as the business grows rapidly
›› Meeting the rising expense of data storage capacity
›› Dealing with the complexity of management and architecture
›› Devoting precious staff time to managing storage and data backup
Whereas larger organizations have significant IT budgets and staff to handle storage-related challenges, small and midsize companies lack the IT resources to dedicate to storage management. Fortunately, there are new approaches to data storage on the market that can help such companies address their data storage needs without requiring dedicated storage management resources, while at the same ti
Tags : scaling storage, data storage capacity, data backup, data protection, data management, exablox, oneblox
     Exablox
By: IBM     Published Date: May 28, 2014
Different types of data have different data retention requirements. In establishing information governance and database archiving policies, take a holistic approach by understanding where the data exists, classifying the data, and archiving the data. IBM InfoSphere Optim™ Archive solution can help enterprises manage and support data retention policies by archiving historical data and storing that data in its original business context, all while controlling growing data volumes and improving application performance. This approach helps support long-term data retention by archiving data in a way that allows it to be accessed independently of the original application.
Tags : ibm, data retention, information governance, archiving, historical data, integrating big data, governing big data, integration
     IBM
By: IBM     Published Date: Aug 05, 2014
There is a lot of discussion in the press about Big Data. Big Data is traditionally defined in terms of the three V’s of Volume, Velocity, and Variety. In other words, Big Data is often characterized as high-volume, streaming, and including semi-structured and unstructured formats. Healthcare organizations have produced enormous volumes of unstructured data, such as the notes by physicians and nurses in electronic medical records (EMRs). In addition, healthcare organizations produce streaming data, such as from patient monitoring devices. Now, thanks to emerging technologies such as Hadoop and streams, healthcare organizations are in a position to harness this Big Data to reduce costs and improve patient outcomes. However, this Big Data has profound implications from an Information Governance perspective. In this white paper, we discuss Big Data Governance from the standpoint of three case studies.
Tags : ibm, data, big data, information, healthcare, governance, technology
     IBM
By: IBM     Published Date: Aug 28, 2014
Data volumes are getting out of control, but choosing the right information lifecycle governance solution can be a huge challenge, with multiple stakeholders, numerous business processes, and extensive solution requirements. Use this requirements kit from the Compliance, Governance and Oversight Council (CGOC) to find the tools and technology you need.
Tags : ilg, data volumes, cgoc, information economics
     IBM
By: IBM     Published Date: Apr 18, 2016
Today, data volumes are exploding in every facet of our lives. Business leaders are eager to harness the power of big data, but before setting out into the big data world, it is important to understand that as opportunities increase, ensuring that source information is trustworthy and protected becomes exponentially more difficult. This paper provides a detailed review of the best practices clients should consider before embarking on their big data integration projects.
Tags : ibm, big data, trusted data, data management, data solutions
     IBM
By: IBM     Published Date: Jul 19, 2016
Data movement and management are major pain points for organizations operating HPC environments. Whether you are deploying a single cluster or managing a diverse research facility, you should be taking a data-centric approach. As data volumes grow and the cost of compute drops, managing data consumes more of the HPC budget and computational time. The need for data-centric HPC architectures grows dramatically as research teams pool their resources to purchase additional capacity and improve overall utilization. Learn more in this white paper about the key considerations when expanding from traditional compute-centric to data-centric HPC.
Tags : ibm, analytics, hpc, big data
     IBM
By: SAS     Published Date: Jan 04, 2019
As the pace of business continues to accelerate, forward-looking organizations are beginning to realize that it is not enough to analyze their data; they must also take action on it. To do this, more businesses are beginning to systematically operationalize their analytics as part of a business process. Operationalizing and embedding analytics means integrating actionable insights into the systems and business processes used to make decisions. These systems might be automated or provide manual, actionable insights. Analytics are currently being embedded into dashboards, applications, devices, systems, and databases. Examples run from simple to complex, and organizations are at different stages of operational deployment. Newer examples of operational analytics include support for logistics, customer call centers, fraud detection, and recommendation engines, to name just a few. Embedding analytics is certainly not new but has been gaining more attention recently as data volumes and the freq
Tags : 
     SAS
By: IBM     Published Date: Jun 20, 2014
Data volumes are getting out of control, but choosing the right ILG solution can be a huge challenge, with multiple stakeholders, numerous business processes, and extensive solution requirements. Use this requirements kit from CGOC to find the tools and technology you need.
Tags : ibm, ilg, information lifecycle governance, information management, ilg solutions, data storage, data management
     IBM
By: IBM     Published Date: Jul 08, 2015
To meet the business imperative for enterprise integration and stay competitive, companies must manage the increasing variety, volume and velocity of new data pouring into their systems from an ever-expanding number of sources. They need to bring all their corporate data together, deliver it to end users as quickly as possible to maximize its value, and integrate it at a more granular level than ever before—focusing on the individual transaction level, rather than on general summary data. As data volumes continue to explode, clients must take advantage of a fully scalable information integration architecture that supports any type of data integration technique such as ETL, ELT (also known as ETL Pushdown), data replication or data virtualization. Read this new whitepaper to learn about the seven essential elements needed to achieve the highest performance.
Tags : 
     IBM
By: IBM     Published Date: Jul 08, 2015
Today, data volumes are exploding in every facet of our lives. Business leaders are eager to harness the power of big data, but before setting out into the big data world, it is important to understand that as opportunities increase, ensuring that source information is trustworthy and protected becomes exponentially more difficult. This paper provides a detailed review of the best practices clients should consider before embarking on their big data integration projects.
Tags : 
     IBM
By: Cisco     Published Date: Jan 15, 2015
While this dramatic growth occurs, projections call for the cloud to account for nearly two-thirds of data center traffic and for cloud-based workloads to quadruple relative to traditional servers. That adds another element to the picture: changing traffic patterns. Under a cloud model, a university, for example, can build its network to handle average traffic volumes but then offload data to a public cloud service on more heavily trafficked days when demand dictates, such as when it’s time to register for the next semester of classes.
Tags : cloud, growth, traffic, projection, account, network
     Cisco
By: HP     Published Date: May 14, 2014
Your data is everywhere. It’s in the cloud, in virtual environments, in remote offices, and on mobile devices. Plus, there’s more of it, and much of it is business-critical, meaning you need to back up larger volumes of data than ever before without adding costs or bandwidth. Even more importantly, you need the ability to restore data quickly in the event of a disaster, failure, or outage. The cost of data protection is higher now than ever.
Tags : hewlett packard, cloud, mobile, remote, data recovery, disaster, failure
     HP
By: Neterion     Published Date: Dec 05, 2006
The relentless growth of data and network-intensive applications such as digital imaging, multimedia content, and broadcast/video continues to drive volumes of enterprise data and network traffic. As growth continues, IT managers are challenged with implementing solutions without interrupting critical business processes.
Tags : network infrastructure, traffic management, bandwidth management, bandwidth, network management, neterion
     Neterion
By: SAS     Published Date: Aug 03, 2016
Data visualization is the visual and interactive exploration and graphic representation of data of any size, type (structured and unstructured) or origin. Visualizations help people see things that were not obvious to them before. Even when data volumes are very large, patterns can be spotted quickly and easily. Visualizations convey information in a universal manner and make it simple to share ideas with others.
Tags : best practices, data visualization, data, technology
     SAS
By: SAS     Published Date: Apr 20, 2017
Hype and hope — Big Data has generated a lot of both. Thanks to an abundance of enterprise information systems, networks, applications and devices that churn out huge volumes of information, government agencies are awash in Big Data. Add to this data growth the emerging trend of the Internet of Things (IoT) — the network of people, data, things and processes that is increasingly linked through automated connections and sensors — and the future of Big Data can seem quite daunting.
Tags : 
     SAS
By: Dell Storage     Published Date: Apr 21, 2011
The Dell Compellent vSphere Plug-In currently under development will enable data center administrators to manage many aspects of Dell Compellent Fluid Data storage directly through the VMware vSphere client. This demo video shows how to leverage the integration to provision and map storage volumes for VMFS datastores, create storage profiles for automated tiering and continuous snapshots, and recover datastores to virtually any point in time - all through a single interface. (4 min)
Tags : dell compellent, server, storage, vmware vsphere, integration, compellent vsphere plug-in, fluid data storage
     Dell Storage
By: Dell Storage     Published Date: Apr 21, 2011
Enterprise Manager provides a single storage resource management (SRM) interface for one or more Compellent SANs. The software also integrates with VMware vCenter to enhance server and storage visibility and manageability. This demo video shows how to use Enterprise Manager to collect VM host and guest storage statistics from vCenter, create, map and rescan storage volumes, and then format the volumes with VMFS to create datastores or assign the volumes as RDMs to a virtual machine. (3 min)
Tags : dell compellent, vcenter integration, compellent enterprise manager, storage resource management, srm, vmware vcenter, server, storage
     Dell Storage
By: IBM     Published Date: Jul 05, 2012
Esri, a U.S. software company, faced growing data volumes as it added new applications. System response slowed, and its existing infrastructure had no room for expansion. Read the white paper to see how the IBM Migration Factory helped Esri consolidate 16 physical servers to two, automate storage management and drastically cut costs.
Tags : ibm, technology, sap, software, migration factory, storage
     IBM
By: HP     Published Date: May 14, 2014
In today’s world of explosive data volumes, midmarket companies face a growing number of challenges when it comes to safeguarding data, and today's approaches to backup and data protection are falling short. This brief outlines HP solutions that meet IT needs for backup appliances tightly integrating deduplication, data protection and recovery.
Tags : hewlett packard, data, backup, data protection, recovery, deduplication
     HP
By: IBM     Published Date: Mar 30, 2017
To meet the business imperative for enterprise integration and stay competitive, companies must manage the increasing variety, volume and velocity of new data pouring into their systems from an ever-expanding number of sources. They need to bring all their corporate data together, deliver it to end users as quickly as possible to maximize its value, and integrate it at a more granular level than ever before—focusing on the individual transaction level, rather than on general summary data. As data volumes continue to explode, clients must take advantage of a fully scalable information integration architecture that supports any type of data integration technique such as extract, transform and load (ETL), data replication or data virtualization.
Tags : data integration, data security, data optimization, data virtualization, database security
     IBM
By: CrowdStrike     Published Date: Feb 01, 2017
One of the biggest challenges to effectively stopping breaches lies in sifting through vast amounts of data to find the subtle clues that indicate an attack is imminent or underway. As modern computer systems generate billions of events daily, the amount of data to analyze can reach petabytes. Compounding the problem, the data is often unstructured, discrete and disconnected. As a result, organizations struggle to determine how individual events may be connected to signal an impending attack. Download the white paper to learn:
• How to detect known and unknown threats by applying high-volume graph-based technology, similar to the ones developed by Facebook and Google
• How CrowdStrike solved this challenge by building its own proprietary graph data model
• How CrowdStrike Threat Graph™ collects and analyzes massive volumes of security-related data to stop breaches
Tags : 
     CrowdStrike
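The core idea behind the graph-based approach described above — linking discrete events by shared attributes so that related activity surfaces as connected components — can be sketched in a few lines. This is a generic illustration with hypothetical events and a simplistic linking rule (shared host); it is not CrowdStrike's actual data model or schema.

```python
from collections import defaultdict

# Hypothetical security events; in practice these arrive as billions of
# discrete, unstructured records per day.
events = [
    {"id": "e1", "host": "web01", "action": "phishing_email_opened"},
    {"id": "e2", "host": "web01", "action": "powershell_spawned"},
    {"id": "e3", "host": "db02",  "action": "login_failure"},
    {"id": "e4", "host": "web01", "action": "credential_dump"},
]

# Build an undirected graph: events that share a host get an edge.
by_host = defaultdict(list)
for ev in events:
    by_host[ev["host"]].append(ev["id"])

adjacency = defaultdict(set)
for ids in by_host.values():
    for a in ids:
        for b in ids:
            if a != b:
                adjacency[a].add(b)

def component(start):
    """Return all event ids reachable from `start` — a candidate attack chain."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(adjacency[node] - seen)
    return seen

# Events that look innocuous in isolation group together once linked.
print(sorted(component("e1")))
```

A production system would link on many attributes (process lineage, file hashes, network connections) and run the traversal over a distributed graph store, but the principle — correlation by connectivity rather than by inspecting events one at a time — is the same.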