
News by Topic

database analytics

Results 1 - 25 of 58
By: SAS     Published Date: Aug 28, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies and real-world applications. The report’s survey quantifies user trends and readiness…
Tags : 
     SAS
By: SAP     Published Date: May 18, 2014
This TDWI Checklist Report presents requirements for analytic DBMSs with a focus on their use with big data. Along the way, the report also defines the many techniques and tool types involved. The requirements checklist and definitions can assist users who are currently evaluating analytic databases and/or developing strategies for big data analytics.
Tags : sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management
     SAP
By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They’re the engines that drive critical online transaction (OLTP) and online analytical (OLAP) processing applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags : cost reduction, oracle database, it operation, online transaction, online analytics
     Hewlett Packard Enterprise
By: Oracle CX     Published Date: Oct 19, 2017
Modern technology initiatives are driving IT infrastructure in a new direction. Big data, social business, mobile applications, the cloud, and real-time analytics all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Customers increasingly drive the speed of business, and organizations need to engage with customers on their terms. The need to manage sensitive information with high levels of security, as well as to capture, analyze, and act upon massive volumes of data every hour of every day, has become critical. These challenges will dramatically change the way that IT systems are designed, funded, and run compared to the past few decades. Databases and Java have become the de facto foundation on which modern, cloud-ready applications are built. The massive explosion in the volume, variety, and velocity of data increases the need for secure and effective analytics so that organizations can make better…
Tags : 
     Oracle CX
By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operational data…
Tags : 
     Oracle CX
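The row-versus-column tradeoff described in this entry can be illustrated with a short sketch (plain Python, not Oracle-specific; the order records and layouts are invented for the example):

```python
# Illustrative sketch: the same three order records stored row-wise vs
# column-wise, and why each layout suits a different workload.

# Row format: each record is kept together -- ideal for OLTP, where a
# transaction fetches or updates one whole order at a time.
rows = [
    {"id": 1, "customer": "A", "amount": 120.0},
    {"id": 2, "customer": "B", "amount": 75.5},
    {"id": 3, "customer": "A", "amount": 33.25},
]

# Column format: each attribute is kept together -- ideal for analytics,
# since an aggregate scans one contiguous column instead of whole records.
columns = {
    "id": [1, 2, 3],
    "customer": ["A", "B", "A"],
    "amount": [120.0, 75.5, 33.25],
}

# OLTP-style point lookup favors the row layout:
order = next(r for r in rows if r["id"] == 2)

# Analytic aggregate favors the column layout: only "amount" is touched.
total = sum(columns["amount"])
```

Keeping both layouts in memory, as the entry describes, lets one system serve the point lookup and the aggregate without an ETL hop to a separate warehouse.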
By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL process required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk; second, analysis is constantly being done on stale data. In-memory databases have helped address…
Tags : 
     Oracle CX
By: IBM APAC     Published Date: Nov 22, 2017
A user initiates the call and selects the source language, such as Spanish. (In this example, assume that the target language is set to English.) As the user is talking to the support representative, the audio is converted to text using the Speech to Text service. Then using Language Translator, the text is translated to English. English language text is then sent to the Text to Speech service as input. The output audio message is what the support representative hears. All of this happens in near real time. The text from Speech to Text and the Language Translator service also can be stored in a database for analytics. The same process is repeated in reverse for the audio message sent by support personnel.
Tags : source, language, english, spanish, speech to text, database, analytics, audio message
     IBM APAC
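The translated-call flow described in this entry can be sketched end to end. The three service functions below are stand-ins for IBM Watson's Speech to Text, Language Translator, and Text to Speech services, not the real SDK calls; the tiny translation table exists only to make the sketch runnable:

```python
# Hypothetical sketch of the translated support-call pipeline.
# Each service function is a placeholder, NOT the actual IBM Watson API.

def speech_to_text(audio: bytes, language: str) -> str:
    """Stand-in: a real service would transcribe the caller's audio."""
    return audio.decode("utf-8")  # pretend the audio bytes are the transcript

def translate_text(text: str, source: str, target: str) -> str:
    """Stand-in: a real service would translate between languages."""
    demo = {("es", "en"): {"hola": "hello"}}  # toy table for the sketch
    return demo.get((source, target), {}).get(text, text)

def text_to_speech(text: str, language: str) -> bytes:
    """Stand-in: a real service would synthesize audio from text."""
    return text.encode("utf-8")

def relay_caller_message(audio: bytes, source: str = "es", target: str = "en") -> bytes:
    # 1. Transcribe the caller's audio in the source language.
    text = speech_to_text(audio, source)
    # 2. Translate the transcript to the target language.
    translated = translate_text(text, source, target)
    # (Both the transcript and the translation could be stored in a
    #  database here for later analytics, as the entry notes.)
    # 3. Synthesize the audio the support representative hears.
    return text_to_speech(translated, target)
```

The reverse direction (representative to caller) would run the same three steps with source and target swapped.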
By: Clustrix     Published Date: Sep 04, 2013
Online advertising is a highly competitive and innovative market, driven to new levels by the rise of ad exchanges and real-time bidding alongside traditional ad networks. With advertisers increasingly buying one impression at a time, advertising market growth is soaring. If your database is the bottleneck limiting the growth of your advertising business, this is the white paper for you. Find out how Clustrix will give you access to functionality such as ad segmentation and targeting based on up-to-the-minute campaign performance, as well as instant access to smart data, so your clients can make the right buy decisions. This free whitepaper considers the technical challenges this rise presents for the database, and discusses the unique technology that enables Clustrix to solve these challenges and give your advertising business a competitive advantage.
Tags : technology, clustrix, online advertising, real time bidding, database, analytics
     Clustrix
By: Clustrix     Published Date: Oct 08, 2013
This whitepaper outlines new database technologies that help advertisers remove bottlenecks that slow down applications, improve functions such as ad segmentation and targeting based on up-to-the-minute campaign performance, and give agency clients instant access to smart data.
Tags : technology, clustrix, online advertising, real time bidding, database, analytics, smart data, applications
     Clustrix
By: IBM     Published Date: Oct 14, 2016
This ebook presents six reasons why you should consider a database change, including opinions from industry analysts and real-world customer experiences. Read on to learn more.
Tags : ibm, database, database change, analytics
     IBM
By: Oracle PaaS/IaaS/Hardware     Published Date: Jul 25, 2017
Learn how to create cloud infrastructure that's secure by default and has better core efficiency for Java, database, and big data. Oracle's servers offer hardware acceleration of data analytics and machine learning, with 10X better time-to-insight.
Tags : 
     Oracle PaaS/IaaS/Hardware
By: Oracle PaaS/IaaS/Hardware     Published Date: Jul 25, 2017
With the introduction of Oracle Database In-Memory and servers with the SPARC S7 and SPARC M7 processors, Oracle delivers an architecture where analytics run on live operational databases, not on data subsets in data warehouses. Decision-making is much faster and more accurate because the data is not a stale subset. And for those moving enterprise applications to the cloud, the real-time analytics of the SPARC S7 and SPARC M7 processors are available both in a private cloud on SPARC servers and in Oracle’s Public Cloud in the SPARC cloud compute service. Moving to the Oracle Public Cloud does not compromise the benefits of SPARC solutions. Some examples of utilizing real-time data for business decisions include analysis of supply chain data for order fulfillment and supply optimization, and analysis of customer purchase history for real-time recommendations to customers using online purchasing systems.
Tags : 
     Oracle PaaS/IaaS/Hardware
By: TIBCO Software APAC     Published Date: Aug 15, 2018
TIBCO Spotfire® Data Science is an enterprise big data analytics platform that can help your organization become a digital leader. The collaborative user interface allows data scientists, data engineers, and business users to work together on data science projects. These cross-functional teams can build machine learning workflows in an intuitive web interface with a minimum of code, while still leveraging the power of big data platforms. Spotfire Data Science provides a complete array of tools (from visual workflows to Python notebooks) for the data scientist to work with data of any magnitude, and it connects natively to most sources of data, including Apache™ Hadoop®, Spark®, Hive®, and relational databases. While providing security and governance, the advanced analytics platform allows the analytics team to share and deploy predictive analytics and machine learning insights with the rest of the organization, driving action for the business.
Tags : 
     TIBCO Software APAC
By: Pure Storage     Published Date: Oct 09, 2017
Storing data is critical. Everyone stores data. Today, it’s all about how you use the data you’re storing and if you’re storing the right data. The right mix of data and the ability to analyze it against all data types is driving markets worldwide in what is known as digital transformation. Digital transformation requires storing, accessing, and analyzing all types of data as fast and efficiently as possible. The end goal is to derive insights and gain a competitive advantage by using those insights to move faster and deliver smarter products and services than your competition.
Tags : data management, data system, business development, software integration, resource planning, enterprise management, data collection
     Pure Storage
By: IBM     Published Date: May 17, 2016
Is your data architecture up to the challenge of the big data era? Can it manage workload demands, handle hybrid cloud environments and keep up with performance requirements? Here are six reasons why changing your database can help you take advantage of data and analytics innovations. 
Tags : ibm, business analytics, business intelligence, data, analytics, database
     IBM
By: IBM     Published Date: Jul 05, 2016
This ebook presents six reasons why you should consider a database change, including opinions from industry analysts and real-world customer experiences. Read on to learn more.
Tags : ibm, database, database change, analytics, storage
     IBM
By: IBM     Published Date: Apr 18, 2017
The data integration tool market was worth approximately $2.8 billion in constant currency at the end of 2015, an increase of 10.5% from the end of 2014. The discipline of data integration comprises the practices, architectural techniques and tools that ingest, transform, combine and provision data across the spectrum of information types in the enterprise and beyond — to meet the data consumption requirements of all applications and business processes. The biggest changes in the market from 2015 are the increased demand for data virtualization, the growing use of data integration tools to combine "data lakes" with existing integration solutions, and the overall expectation that data integration will become cloud- and on-premises-agnostic.
Tags : data integration, data security, data optimization, data virtualization, database security, data analytics, data innovation
     IBM
By: IBM     Published Date: Sep 28, 2017
Here are the 6 reasons to change your database:
1. Lower total cost of ownership
2. Increased scalability and availability
3. Flexibility for hybrid environments
4. A platform for rapid reporting and analytics
5. Support for new and emerging applications
6. Greater simplicity
Download now to learn more!
Tags : scalability, hybrid environment, emerging applications, rapid reporting
     IBM
By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : database, applications, data availability, cognitive applications
     Group M_IBM Q1'18
By: IBM     Published Date: May 23, 2017
IBM DB2 with BLU Acceleration helps tackle the challenges presented by big data. It delivers analytics at the speed of thought, always-available transactions, future-proof versatility, disaster recovery and streamlined ease-of-use to unlock the value of data.
Tags : cloud strategy, database projects, disaster recover, geographic reach, large database, ibm, analytics, management optimization
     IBM
By: IBM     Published Date: Apr 19, 2018
This paper presents a cost/benefit case for two industry-leading database platforms for analytics workloads.
Tags : db2, data migration, ibm, oracle
     IBM
By: Amazon Web Services     Published Date: Apr 16, 2018
Since SAP introduced its in-memory database, SAP HANA, customers have significantly accelerated everything from their core business operations to big data analytics. But capitalizing on SAP HANA’s full potential requires computational power and memory capacity beyond the capabilities of many existing data center platforms. To ensure that deployments in the AWS Cloud could meet the most stringent SAP HANA demands, AWS collaborated with SAP and Intel to deliver the Amazon EC2 X1 and X1e instances, part of the Amazon EC2 Memory-Optimized instance family. With four Intel® Xeon® E7 8880 v3 processors (which can power 128 virtual CPUs), X1 offers more memory than any other SAP-certified cloud native instance available today.
Tags : 
     Amazon Web Services
By: NetApp     Published Date: Dec 15, 2014
Organizations of all kinds rely on their relational databases for both transaction processing and analytics, but many still have challenges in meeting their goals of high availability, security, and performance. Whether planning for a major upgrade of existing databases or considering a net new project, enterprise solution architects should realize that the storage capabilities will matter. NetApp’s systems, software, and services offer a number of advantages as a foundation for better operational results.
Tags : database, transaction processing, analytics, enterprise solution architects, storage capabilities, storage
     NetApp
By: LogMeIn     Published Date: Feb 27, 2018
24/7 Self-Service Support Center: Bold360 ai’s 24/7 context-driven support center was implemented, allowing users to instantly discover relevant content from the smart knowledge database. Dynamic FAQs displayed trending topics in real time to speed up customer resolution and discoverability. Real-Time Customer Analytics highlighted unanswered questions, giving Premium Credit instant visibility into missing topics, questions driving ticket volume, and more.
Tags : customer, support, faq, credit
     LogMeIn