News by Topic

data processing

By: TIBCO Software APAC     Published Date: Aug 13, 2018
Big data has raised the bar for data virtualization products. To keep pace, TIBCO® Data Virtualization added a massively parallel processing engine that supports big-data scale workloads. Read this whitepaper to learn how it works.
Tags : 
     TIBCO Software APAC
By: IBM     Published Date: Jun 04, 2018
"The appearance of your reports and dashboards – the actual visual appearance of your data analysis -- is important. An ugly or confusing report may be dismissed, even though it contains valuable insights about your data. Cognos Analytics has a long track record of high quality analytic insight, and now, we added a lot of new capabilities designed to help even novice users quickly and easily produce great-looking and consumable reports you can trust. Watch this webinar to learn: • How you can more effectively communicate with data. • What constitutes an intuitive and highly navigable report • How take advantage of some of the new capabilities in Cognos Analytics to create reports that are more compelling and understandable in less time. • Some of the new and exciting capabilities coming to Cognos Analytics in 2018 (hint: more intelligent capabilities with enhancements to Natural Language Processing, data discovery and Machine Learning)."
Tags : data analysis, data analytics, dashboards
     IBM
By: SAS     Published Date: May 24, 2018
This paper provides an introduction to deep learning, its applications and how SAS supports the creation of deep learning models. It is geared toward data scientists and includes a step-by-step overview of how to build a deep learning model using deep learning methods developed by SAS. You’ll then be ready to experiment with these methods in SAS Visual Data Mining and Machine Learning. See page 12 for more information on how to access a free software trial. Deep learning is a type of machine learning that trains a computer to perform humanlike tasks, such as recognizing speech, identifying images or making predictions. Instead of organizing data to run through predefined equations, deep learning sets up basic parameters about the data and trains the computer to learn on its own by recognizing patterns using many layers of processing. Deep learning is used strategically in many industries.
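As a language-agnostic illustration of the idea the abstract describes, layers of processing that learn patterns from data, here is a minimal sketch of a tiny neural network in plain Python/NumPy. It is not the SAS implementation; the XOR task, layer sizes, and learning rate are arbitrary choices made only for illustration.

# A deliberately tiny "deep" network: two layers of processing learning
# the XOR pattern. Generic NumPy sketch, not SAS Visual Data Mining and
# Machine Learning.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input  -> hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)       # first layer of processing
    out = sigmoid(h @ W2 + b2)     # second layer: the prediction
    # Backpropagation: push the prediction error back through both layers.
    g_out = (out - y) * out * (1 - out)
    g_h = (g_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ g_out
    b2 -= 0.5 * g_out.sum(axis=0)
    W1 -= 0.5 * X.T @ g_h
    b1 -= 0.5 * g_h.sum(axis=0)

print(out.round(2))  # converges toward [[0], [1], [1], [0]]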
Tags : 
     SAS
By: Amazon Web Services     Published Date: Sep 05, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources.
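To make the pattern concrete, here is a hedged sketch in Python of what a Spectrum workflow typically looks like. The cluster endpoint, credentials, IAM role, bucket, schema, and table names are hypothetical placeholders; the exact DDL options are documented by AWS.

# Hedged sketch: querying S3-resident data through Redshift Spectrum.
# All connection details and identifiers below are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.example.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="admin", password="...",
)
conn.autocommit = True  # Redshift external DDL cannot run in a transaction
cur = conn.cursor()

# Register an external schema backed by the AWS Glue Data Catalog.
cur.execute("""
    CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum
    FROM DATA CATALOG DATABASE 'clickstream'
    IAM_ROLE 'arn:aws:iam::123456789012:role/SpectrumRole'
    CREATE EXTERNAL DATABASE IF NOT EXISTS;
""")

# Define a table over Parquet files that stay on S3; nothing is loaded
# into the cluster's own storage.
cur.execute("""
    CREATE EXTERNAL TABLE spectrum.events (
        user_id  BIGINT,
        event_ts TIMESTAMP,
        page     VARCHAR(256)
    )
    STORED AS PARQUET
    LOCATION 's3://my-bucket/events/';
""")

# The query itself is ordinary SQL; Spectrum scans S3 with its own
# compute fleet and the cluster only aggregates the reduced result.
cur.execute("SELECT page, COUNT(*) FROM spectrum.events GROUP BY page;")
print(cur.fetchall())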
Tags : 
     Amazon Web Services
By: Vision Solutions     Published Date: Feb 18, 2008
Continuous member service is an important deliverable for credit unions, and the continued growth in assets and members means that downtime affects a larger base and is therefore potentially much more costly. Learn how new data protection and recovery technologies are making a huge impact on downtime for credit unions that depend on AIX-hosted applications.
Tags : vision, high availability, ibm, aix, cdp, core union
     Vision Solutions
By: SAP     Published Date: Feb 03, 2017
The spatial analytics features of the SAP HANA platform can help you supercharge your business with location-specific data. By analyzing geospatial information, much of which is already present in your enterprise data, SAP HANA helps you pinpoint events, resolve boundaries, locate customers, and visualize routing. Spatial processing functionality is standard with your full-use SAP HANA licenses.
Tags : 
     SAP
By: Cisco EMEA Tier 3 ABM     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever increasing rates. Analytics combine statistics, machine learning, and data preprocessing in order to extract valuable information and insights from big data.
Tags : big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
     Cisco EMEA Tier 3 ABM
By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
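The principle is easy to demonstrate with tools far humbler than an in-memory data platform. The following sketch uses Python's standard sqlite3 module (purely illustrative, with a hypothetical sales.db file) to load a disk-resident table into RAM once, so that every subsequent query runs against memory instead of shuttling to disk.

# Illustrative sketch of the in-memory principle using stdlib sqlite3:
# pay the disk I/O cost once, then serve all queries from RAM.
import sqlite3

disk = sqlite3.connect("sales.db")  # hypothetical on-disk database
disk.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
disk.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 120.0), ("APAC", 95.5), ("AMER", 240.3)])
disk.commit()

mem = sqlite3.connect(":memory:")   # RAM-resident copy
disk.backup(mem)                    # one-time load, not per-query I/O

# Queries now run entirely against memory.
for region, total in mem.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"):
    print(region, total)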
Tags : sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management
     SAP
By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
In midsize and large organizations, critical business processing continues to depend on relational databases including Microsoft® SQL Server. While new tools like Hadoop help businesses analyze oceans of Big Data, conventional relational-database management systems (RDBMS) remain the backbone for online transaction processing (OLTP), online analytic processing (OLAP), and mixed OLTP/OLAP workloads.
Tags : database usage, database management, server usage, data protection
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They’re the engines that drive critical online transaction processing (OLTP) and online analytical processing (OLAP) applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags : cost reduction, oracle database, it operation, online transaction, online analytics
     Hewlett Packard Enterprise
By: Cisco     Published Date: Jun 21, 2016
The Cisco® Hyperlocation Solution is the industry’s first Wi-Fi network-based location system that can help businesses pinpoint a user’s location to within one to three meters, depending on the deployment. Combining innovative RF antenna and module design, faster and more frequent data processing, and a powerful platform for customer engagement, it can help businesses create more personalized and profitable customer experiences.
Tags : 
     Cisco
By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operational data.
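The row-versus-column trade-off the abstract turns on can be sketched in a few lines of generic Python (illustrative data structures only, not Oracle Database In-Memory):

# Row format: each record kept together, ideal for OLTP point lookups.
rows = [
    {"order_id": 1, "customer": "acme",  "total": 120.0},
    {"order_id": 2, "customer": "belle", "total": 95.5},
    {"order_id": 3, "customer": "acme",  "total": 240.3},
]

# Column format: each attribute stored contiguously, ideal for analytics
# that scan one column across millions of records.
cols = {
    "order_id": [1, 2, 3],
    "customer": ["acme", "belle", "acme"],
    "total":    [120.0, 95.5, 240.3],
}

# OLTP-style access touches one whole row...
print(next(r for r in rows if r["order_id"] == 2))

# ...while an analytic aggregate reads a single contiguous column.
# Keeping both layouts in memory is what lets analytics run on live
# operational data instead of a stale warehouse copy.
print(sum(cols["total"]))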
Tags : 
     Oracle CX
By: IBM APAC     Published Date: Apr 27, 2018
While relying on x86 servers and Oracle databases to support their stock trading systems, Wanlian Securities found that processing a rapidly increasing number of transactions quickly became a huge challenge. They shifted to IBM FlashSystem, which helped them cut average response time for their Oracle databases from 10 milliseconds to less than 0.4 milliseconds and improved CPU usage by 15%. Download this case study now.
Tags : 
     IBM APAC
By: Dome9     Published Date: Apr 25, 2018
As of May 2017, according to a report from The Depository Trust & Clearing Corporation (DTCC), which provides financial transaction and data processing services for the global financial industry, cloud computing has reached a tipping point. Today, financial services companies can benefit from the capabilities and cost efficiencies of the cloud. In October 2016, the Federal Deposit Insurance Corporation (FDIC), the Office of the Comptroller of the Currency (OCC) and the Federal Reserve Board (FRB) jointly announced enhanced cyber risk management standards for financial institutions in an Advance Notice of Proposed Rulemaking (ANPR). These proposed standards for enhanced cybersecurity are aimed at protecting the entire financial system, not just the institution. To meet these new standards, financial institutions will require the right cloud-based network security platform for comprehensive security management, verifiable compliance and governance, and active protection of customer data.
Tags : 
     Dome9
By: Oracle     Published Date: Aug 09, 2018
The purpose of IT backup and recovery systems is to avoid data loss and recover quickly, thereby minimizing downtime costs. Traditional storage-centric data protection architectures, such as Purpose-Built Backup Appliances (PBBAs) and the conventional backup and restore processing supporting them, are prone to failure on recovery. This is because the processes, both automated and manual, are too numerous, too complex, and too difficult to test adequately. In turn, this leads to unacceptable levels of failure for today’s mission-critical applications and a poor foundation for digital transformation initiatives. Governments are taking notice: heightened regulatory compliance requirements have implications for data recovery processes and are an unwelcome but timely catalyst for companies to get their recovery houses in order. Onerous malware such as ransomware and other cyber attacks increase the imperative for organizations to have highly granular recovery mechanisms in place that allow clean data to be restored quickly.
Tags : 
     Oracle
By: Dell EMC     Published Date: Nov 09, 2015
This business-oriented white paper summarizes the wide-ranging benefits of the Hadoop platform, highlights common data processing use cases and explores examples of specific use cases in vertical industries. The information presented here draws on the collective experiences of three leaders in the use of Hadoop technologies—Dell and its partners Cloudera and Intel.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Oct 08, 2015
Big data can be observed, in a real sense, by computers processing it and often by humans reviewing visualizations created from it. In the past, humans had to reduce the data, often using techniques of statistical sampling, to be able to make sense of it. Now, new big data processing techniques will help us make sense of it without traditional reduction.
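The "traditional reduction" the paragraph contrasts against is easy to picture. This generic Python sketch (not any specific big data product) estimates a statistic from a random sample instead of scanning everything:

# Traditional data reduction: estimate a statistic from a random sample
# rather than processing the full dataset.
import random

random.seed(42)
full_data = [random.gauss(100.0, 15.0) for _ in range(1_000_000)]

sample = random.sample(full_data, 1_000)    # reduce before analyzing
estimate = sum(sample) / len(sample)
exact = sum(full_data) / len(full_data)     # the full "big data" scan

print(f"sampled mean:   {estimate:.2f}")
print(f"full-scan mean: {exact:.2f}")       # close, but not identical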
Tags : 
     Dell EMC
By: SAS     Published Date: Apr 16, 2015
Former Intel CEO Andy Grove once coined the phrase, “Technology happens.” As true as Grove’s pat aphorism has become, it’s not always good news. Twenty years ago, no one ever got fired for buying IBM. In the heyday of customer relationship management (CRM), companies bought first and asked questions later. Nowadays, executives are being enlightened by the promise of big data technologies and the role data plays in the fact-based enterprise. Leaders in business and IT alike are waking up to the reality that, despite the hype around platforms and processing speeds, their companies have failed to establish sustained processes and skills around data.
Tags : 
     SAS
By: EMC Converged Platforms     Published Date: Oct 22, 2015
Old Dutch Foods, known for its broad selection of snack foods in the Midwestern United States and Canada, was struggling to get the right products to the right places at the right time. Its data center included outdated physical servers, and batch processing meant that inventory would not be updated until the end of the day rather than in real time. In addition, recovering from power outages and disk failures could take up to two weeks. To modernize its data center, Old Dutch Foods invested in EMC Converged Infrastructure, quickly and easily deploying two VCE VBlock® systems that run JD Edwards, MS Exchange, and mobile device apps, along with a backup site carrying replicated applications and data. This enhanced the IT department's responsiveness to the business, allowed a shift to real-time inventory, and reduced CapEx and OpEx costs. Operations were simplified by reducing the person-hours needed for infrastructure maintenance by 75 percent.
Tags : 
     EMC Converged Platforms
By: Adobe     Published Date: Feb 20, 2014
San Diego County District Attorney’s Office accelerates Juvenile Court proceedings using Adobe® Acrobat® Pro in Microsoft SharePoint environment.
Tags : adobe, adobe acrobat pro, file management, software, data management, electronic documentation, paperless processing, pdf documents
     Adobe
By: Adobe     Published Date: Feb 20, 2014
Gain more efficient ways of working with documents and collaborating with others on them.
Tags : adobe, adobe acrobat pro, microsoft applications, collaboration, merging documents, editing documents, pdf to office format, file formatting
     Adobe
By: Oracle     Published Date: Feb 28, 2018
Getting the most from your data, driving innovation, processing orders faster, and reducing operating costs are all essential. And Oracle Exadata is the database platform to deliver these improvements. Read five top reasons for running your business on Oracle Exadata, and find out why other organisations say it was such a good choice for them.
Tags : business, oracle, exadata, organisation, operation
     Oracle
By: This program is brought to you by Oracle and Intel     Published Date: Mar 15, 2018
Getting the most from your data, driving innovation, processing orders faster, and reducing operating costs are all essential. And Oracle Exadata is the database platform to deliver these improvements. Read five top reasons for running your business on Oracle Exadata, and find out why other organisations say it was such a good choice for them.
Tags : 
     This program is brought to you by Oracle and Intel
By: Pure Storage     Published Date: Jan 12, 2018
Data is growing at an amazing rate, and this rapid growth will continue. New techniques in data processing and analytics, including AI, machine learning, and deep learning, allow specially designed applications not only to analyze data but to learn from the analysis and make predictions. Computer systems consisting of multi-core CPUs or GPUs using parallel processing, connected by extremely fast networks, are required to process the data. However, legacy storage solutions are based on architectures that are decades old, cannot scale, and are not well suited to the massive concurrency required by machine learning. Legacy storage is becoming a bottleneck in processing big data, and a new storage technology is needed to meet data analytics performance needs.
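As a generic illustration of that concurrency (Python's standard library only, nothing specific to Pure Storage or GPUs), the sketch below fans a computation out across CPU cores; each added worker is another consumer the storage layer must feed simultaneously:

# Minimal sketch of parallel data processing across CPU cores.
# Each worker crunches one chunk; with many workers, storage must
# feed all of them concurrently.
from multiprocessing import Pool

def process_chunk(chunk):
    # Stand-in for real feature extraction or model scoring.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    step = 1_000_000
    chunks = [range(i, i + step) for i in range(0, 10 * step, step)]
    with Pool() as pool:  # one worker per CPU core by default
        partials = pool.map(process_chunk, chunks)
    print(sum(partials))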
Tags : reporting, artificial intelligence, insights, organization, institution, recognition
     Pure Storage