Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources in a Data Warehouse for Reporting and Analytics include ETL (Extract, Transform, Load), EAI (Enterprise Application Integration), CDC (Change Data Capture), Data Replication, Data Deduplication, Compression, Big Data technologies such as Hadoop and MapReduce, and Data Warehouse Appliances.
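To make the ETL pattern concrete, here is a minimal sketch of an extract-transform-load job in Python using only the standard library's sqlite3 module. The `orders` source table and `fact_sales` warehouse table are hypothetical names chosen for illustration; a production pipeline would use a dedicated ETL tool, incremental loads, and error handling.

```python
import sqlite3

def etl(source_db: str, warehouse_db: str) -> int:
    """Run a one-shot ETL pass: extract raw orders from the source
    database, aggregate them by region, and load the results into a
    warehouse fact table. Returns the number of rows loaded."""
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(warehouse_db)
    dst.execute("CREATE TABLE IF NOT EXISTS fact_sales (region TEXT, total REAL)")

    # Extract: pull raw order rows from the operational source system.
    rows = src.execute("SELECT region, amount FROM orders").fetchall()

    # Transform: consolidate the raw rows into per-region totals.
    totals = {}
    for region, amount in rows:
        totals[region] = totals.get(region, 0.0) + amount

    # Load: write the consolidated rows into the warehouse fact table.
    dst.executemany("INSERT INTO fact_sales VALUES (?, ?)", totals.items())
    dst.commit()
    loaded = len(totals)
    src.close()
    dst.close()
    return loaded
```

The same extract/transform/load separation underlies the commercial tools above; CDC and replication products differ mainly in replacing the full-table extract with a stream of captured changes.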



Data Warehousing Articles

Pentaho's Business Analytics 4.5 is now certified on Cloudera's latest releases, Cloudera Enterprise 4.0 and CDH4. Pentaho also announced that its visual design studio capabilities have been extended to the Sqoop and Oozie components of Hadoop. "Hadoop is a very broad ecosystem. It is not a single project," Ian Fyfe, chief technology evangelist at Pentaho, tells DBTA. "Sqoop and Oozie are shipped as part of Cloudera's distribution so that is an important part of our support for Cloudera as well - providing that visual support which nobody else in the market does today."

Posted August 23, 2012

SAP AG introduced a new solution to help organizations gain real-time insights into market trends and customer sentiment. The SAP rapid-deployment solution for sentiment intelligence with SAP HANA is intended to allow users to analyze customer sentiment from social networking sites, communities, wikis, blogs and other sources, and combine the information with CRM data. The customers that have had success getting started with big data analytics are the ones that set out to solve a very specific use case or problem, David Jonker, director of marketing for database and technology at SAP, tells DBTA. "The rapid deployment solution for sentiment intelligence does exactly that."

Posted August 22, 2012

Symantec Corp. has partnered with Hortonworks to introduce the new Symantec Enterprise Solution for Hadoop, providing a scalable, resilient data management solution for handling big data workloads. The add-on solution for Symantec's Cluster File System enables Symantec customers to run big data analytics on their existing storage infrastructure.

Posted August 14, 2012

Data warehousing is undergoing the most radical transformation seen since it was first conceived in the 1970s, and brought to market in the late 1980s and 1990s. One reason for this transformation is that data warehouses are on the front lines of the big data explosion. Findings from a new survey of IOUG members indicate that while most companies have well-established data warehouse systems, adoption is still limited within their organizations. This survey, underwritten by Oracle Corporation and conducted by Unisphere Research, a division of Information Today, Inc., included input from 421 data managers and professionals.

Posted August 08, 2012

Syncsort, a global leader in high-performance data integration solutions, has certified its DMExpress data integration software for high-performance loading of Greenplum Database. Syncsort has also joined the Greenplum Catalyst Developer Program. Syncsort DMExpress software delivers extensive connectivity that makes it easy to extract and transform data from nearly any source, and rapidly load it into the massively parallel processing (MPP) Greenplum Database without the need for manual tuning or custom coding. "IT organizations of all sizes are struggling to keep pace with the spiraling infrastructure demands created by the sheer volume, variety and velocity of big data," says Mitch Seigle, vice president, Marketing and Product Management, Syncsort.

Posted July 25, 2012

Datameer has announced a new release of its big data analytics solution, which combines data integration, analytics and visualization of any data type in one application. The new capabilities offered in Datameer 2.0 are in two main categories, Joe Nicholson, vice president of marketing, Datameer, tells DBTA. One is adding new functionality and the other is bringing Hadoop to the desktop with Hadoop natively embedded in two of three new editions of the application.

Posted June 28, 2012

Lucid Imagination, a developer of search, discovery and analytics software based on Apache Lucene and Apache Solr technology, has unveiled LucidWorks Big Data, a fully integrated development stack that combines the advantages of multiple open source projects, including Hadoop, Mahout, R and Lucene/Solr, to provide search, machine learning, recommendation engines and analytics for structured and unstructured content in one solution available in the cloud. "With more and more companies being challenged by the explosive growth of information, as has been widely reported, the vast majority of that content is unstructured or semi-structured text, and traditional business intelligence or traditional analytics methodologies don't come close to addressing the vast percentage of content," Paul Doscher, CEO of Lucid Imagination, tells DBTA.

Posted June 28, 2012

Data analytics vendor Teradata and information management software provider Kalido have introduced a new joint solution that they say will allow customers to build or expand a data warehouse in 90 days or less, providing deeper analytics to users for improved business decision-making. This solution combines the Teradata Data Warehouse Appliance with the Kalido Information Engine, providing customers with a streamlined data consolidation tool that aggregates disparate data into a single unified platform.

Posted June 28, 2012

MapR Technologies will make its distribution for Hadoop available on Google Compute Engine. The combination of the new Google service and the MapR distribution is intended to enable customers to quickly provision large MapR clusters on demand and to take advantage of the scalability of a cloud-based solution. "Off-premise, on-demand computing is an important part of the future for Hadoop," says John Schroeder, CEO and co-founder of MapR Technologies. "MapR is solidifying that future by partnering with Google and leveraging their cost-effective, high performance and scale-out infrastructure."

Posted June 28, 2012

SAP marked the first anniversary of the SAP HANA platform becoming generally available. To celebrate the occasion, SAP AG announced the launch of the SAP HANA Distinguished Engineer program. The new program is focused on promoting SAP HANA expertise in the market and is intended to support a new group of community-driven, hands-on HANA technical professionals.

Posted June 27, 2012

Connotate, Inc., a provider of solutions that help organizations monitor and collect data and content from the web, is partnering with Digital Reasoning, which enables unstructured data analytics at scale, to provide a solution that creates actionable intelligence from fact-based analysis of big data.

Posted June 26, 2012

IBM has introduced a new analytics appliance that is intended to allow organizations to analyze up to 10 petabytes of data in minutes, helping them uncover patterns and trends from large data sets, while meeting compliance mandates. The new IBM Netezza High Capacity Appliance addresses a growing challenge: Banks, insurance companies, healthcare organizations and communications services providers are required by industry regulators to retain massive amounts of data - in some cases up to a decade. And, as data retention laws continue to evolve, organizations are faced with the need to store and analyze ever-expanding "big data" sets that may not be directly related to daily operations, yet still hold potential business value.

Posted June 26, 2012

Companies are scrambling to learn all the various ways they can slice, dice, and mine big data coming in from across the enterprise and across the web. But with the rise of big data - hundreds of terabytes or petabytes of data - comes the challenge of where and how all of this information will be stored. For many organizations, current storage systems - disks, tapes, virtual tapes, clouds, in-memory systems - are not ready for the onslaught, industry experts say. There are new methodologies and technologies coming on the scene that may help address this challenge. But one thing is certain: Whether organizations manage their data in their internal data centers or in the cloud, a lot more storage is going to be needed. As Jared Rosoff, director of customer engagement with 10gen, puts it: "Big data means we need 'big storage.'"

Posted June 13, 2012

Kalido, a provider of agile information management software and a Gold level member in Oracle PartnerNetwork (OPN), has achieved Oracle Exadata Optimized status, demonstrating that Kalido Information Engine 9.0 has been tested and tuned on the Oracle Exadata Database Machine to deliver speed, scalability and reliability to customers. The new status signifies that Kalido's application, already certified to run on Exadata, has been optimized to take advantage of Exadata's features, including its memory, tuning and performance capabilities; as a result, Kalido can deliver better application performance on the Exadata stack, says Bill Hewitt, Kalido president and CEO.

Posted June 13, 2012

Oracle has announced the Sun ZFS Backup Appliance, an integrated, high performance backup solution for Oracle engineered systems, including the Oracle Exadata Database Machine, Oracle Exalogic Elastic Cloud and Oracle SPARC SuperCluster T4-4. According to Oracle, the Sun ZFS Backup Appliance delivers up to 20TB per hour full backup and up to 9.4TB per hour full restore throughputs, the fastest published recovery rates among general purpose storage systems for Oracle engineered systems data protection. Oracle's Sun ZFS Backup Appliance is available in two configurations, High Performance and High Capacity, and comes pre-racked and cabled, helping to eliminate hardware tuning.

Posted June 13, 2012

Hortonworks, a commercial vendor promoting the innovation, development and support of Apache Hadoop, has announced the general availability of Hortonworks Data Platform (HDP) 1.0 which is intended to make Hadoop easy to consume and use in enterprise environments. "With the general availability of Hortonworks Data Platform 1.0, Hortonworks is delivering on its promise to make Apache Hadoop an enterprise viable data platform," says Rob Bearden, CEO of Hortonworks.

Posted June 13, 2012

Cloudera has unveiled the fourth generation of its flagship Apache Hadoop data management platform, Cloudera Enterprise. Cloudera Enterprise 4.0 combines the company's Cloudera Manager software with expert technical support to provide a turnkey system for deploying and managing Hadoop in production. The company also announced the general availability of CDH4 (Cloudera's Distribution Including Apache Hadoop, version 4), resulting from the successful completion of a beta program among its enterprise customers and partner ecosystem and the contributions of Cloudera's engineering team and the greater Apache open source community.

Posted June 06, 2012

Despite IT industry talk about the need for real-time data, a new survey of more than 330 data managers and professionals who are subscribers to Database Trends and Applications reveals that access to current data to support decision making is not actually possible for many companies. At least half of companies represented in the survey indicate that relevant data still takes 24 hours or longer to reach decision makers.

Posted May 23, 2012

The term "big data" refers to the massive amounts of data being generated on a daily basis by businesses and consumers alike - data which cannot be processed using conventional data analysis tools owing to its sheer size and, in many cases, its unstructured nature. Convinced that such data holds the key to improved productivity and profitability, enterprise planners are searching for tools capable of processing big data, and information technology providers are scrambling to develop solutions to accommodate new big data market opportunities.

Posted May 23, 2012

SAP AG has announced a range of innovations. The announcements put heavy emphasis on SAP HANA, and focus on three main areas, David Jonker, director of product marketing - Data Management & Analytics, SAP, tells 5 Minute Briefing. One is working with customers to accelerate their existing investment in the SAP landscape, a second emphasis is big data and analytics, and the third is around sparking new innovation by working with customers, partners and startups to leverage real-time analytics to rethink how business gets done.

Posted May 22, 2012

Attunity Ltd., a provider of information availability software solutions, will host a live webcast titled "From Batch To Blazing - Moving Data In Today's Real-time Business" on Wednesday, May 16, from 1 pm to 2 pm ET. Presented by Thomas J. Wilson, president of Unisphere Research, the webcast will unveil the newly published findings of Unisphere Research's 2012 data integration study, and share key analysis of the results.

Posted May 15, 2012

Hadapt, which provides a big data platform for natively integrating Apache Hadoop and a relational DBMS, and Cloudera, a provider of Apache Hadoop-based data management software, services and training, have announced the integration of Hadapt's Adaptive Analytical Platform with Cloudera Manager and Cloudera's Distribution Including Apache Hadoop (CDH).

Posted May 15, 2012

Organizations are struggling with big data, which they define as any large-size data store that becomes unmanageable by standard technologies or methods, according to a new survey of 264 data managers and professionals who are subscribers to Database Trends and Applications. The survey was conducted by Unisphere Research, a division of Information Today, Inc., in partnership with MarkLogic in January 2012. Among the key findings uncovered by the survey is the fact that unstructured data is on the rise, and ready to engulf current data management systems. Added to that concern, say respondents, is their belief that management does not understand the challenge that is looming, and is failing to recognize the significance of unstructured data assets to the business.

Posted May 09, 2012

TIBCO Software Inc., a provider of infrastructure software on-premise and in the cloud, has announced the latest version of its analytics platform: Spotfire 4.5. This new release will feature visualization-based data discovery and collaboration capabilities for all types of extreme information, including big data. Additionally, Spotfire 4.5 will provide greater data exploration and new predictive analytics capabilities for both structured data and unstructured content, as well as iPad compatibility that allows users to construct "private-branded" Spotfire Analytics solutions. Together, these capabilities are intended to offer organizations the ability to leverage all of their data to better discover and predict business outcomes that can be delivered to a larger population.

Posted May 08, 2012

Oracle addressed the need to make IT infrastructure and business analytics technologies simpler and more efficient in a presentation to OpenWorld Tokyo 2012 attendees that was also made available via live webcast. In addition to presenting its strategy and plans for business analytics, the company also unveiled new additions to its product portfolio. In his keynote address, Oracle president Mark Hurd explained how the business users of tomorrow will require faster and more comprehensive information access. "The true question with analytics is how to get the right information to the right person at the right time to make the right decision," he said.

Posted April 26, 2012

IBM has introduced DB2 10 and InfoSphere Warehouse 10 software that integrates with big data systems, automatically compresses data into tighter spaces to prevent storage sprawl, and slices information from the past, present, and future to eliminate expensive application code. Over the past 4 years, more than 100 clients, 200 business partners, and hundreds of experts from IBM Research and Software Development Labs around the world collaborated to develop the new software.

Posted April 26, 2012

MapR Technologies, Inc., provider of the MapR distribution for Apache Hadoop, has introduced new data connection options for Hadoop to enable a range of data ingress and egress alternatives for customers. These include direct file-based access using standard tools and file-based applications, direct database connectivity, Hadoop specific connectors via Sqoop, Flume and Hive; as well as direct access to popular data warehouses and applications using custom connectors. Additionally, technology providers Pentaho and Talend are partnering with MapR to provide direct integration with MapR's distribution, and MapR has also entered into a partnership with data warehouse and business intelligence platform vendor Tableau Software.

Posted April 26, 2012

Attivio and TIBCO Software Inc. have announced that, as part of a new partnership agreement, the TIBCO Spotfire analytics platform has achieved the highest level of information access available within Attivio's Active Intelligence Engine (AIE). According to Attivio, its AIE ingests all types of structured data and unstructured content, and unlike a traditional data warehouse, does not require relationships between any data or content to be defined in advance of ingestion. In achieving platinum-level certification, Attivio says TIBCO Spotfire has been authenticated by Attivio to be able to present, analyze, access and leverage AIE's AI-SQL capabilities to provide data visualization, enhanced analytics of unstructured content and intuitive search and discovery, all in the same dashboard.

Posted April 10, 2012

1010data, Inc., provider of an internet-based big data warehouse, has announced the launch of a new software tool that enables 1010data's customers to automatically segment and analyze huge consumer transaction databases and produce statistical models with specificity, even to the level of social groups, families and individuals. For the first stage of the launch, 1010data is making the tool available in an invitational beta release for retail, consumer goods, and mobile telecom companies. "In all consumer-driven industries, customers are demanding to be treated as individuals, not boomers, tweeners, or dinks - dual income, no kids," said Tim Negris, vice president of marketing at 1010data.

Posted March 22, 2012

For enterprises grappling with the onslaught of big data, a new platform has emerged from the open source world that promises to provide a cost-effective way to store and process petabytes' worth of information. Hadoop, an Apache project, is already being eagerly embraced by data managers and technologists as a way to manage and analyze mountains of data streaming in from websites and devices. Running data such as weblogs through traditional platforms such as data warehouses or standard analytical toolsets often cannot be cost-justified, as these solutions tend to have high overhead costs. However, organizations are beginning to recognize that such information ultimately can be of tremendous value to the business. Hadoop packages up such data and makes it digestible.
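The programming model behind Hadoop can be illustrated with a toy sketch of MapReduce in plain Python - not Hadoop itself, just the map/shuffle/reduce shape it distributes across a cluster. The word-count task and function names here are the conventional teaching example, not part of any Hadoop API.

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit a (word, 1) pair for every word in every input line,
    # e.g. one weblog line per record.
    for line in records:
        for word in line.split():
            yield (word, 1)

def reduce_phase(pairs):
    # Shuffle: group emitted pairs by key; Reduce: fold each group
    # into a single result by summing its counts.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

def word_count(records):
    """Count word occurrences across all records."""
    return reduce_phase(map_phase(records))
```

In a real Hadoop job the map and reduce functions run in parallel on many nodes, with the framework handling the shuffle, data locality, and failure recovery; the logic the programmer writes is essentially what is shown here.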

Posted March 19, 2012

Oracle has unveiled Oracle Airline Data Model, a standards-based, pre-built database schema to help airlines optimize the collection, storage, and analysis of passenger data from reservations, sales, operations, loyalty, customer service and finance in their data warehouse. Available as an option for Oracle Database 11g Enterprise Edition, the new Oracle Airline Data Model delivers a comprehensive database schema for passenger data, sophisticated analytics, trending and data mining capabilities.

Posted March 14, 2012

Quest Software said it has entered into definitive agreements with affiliates of Insight Venture Partners to become a private company. In the deal, stockholders would receive $23 per share in cash, valuing the company at approximately $2.0 billion. Upon closing, Quest will be a privately held company and will continue to be led by chairman and CEO Vinny Smith and the existing senior management team. "As a private company, we will have increased flexibility to drive innovation across our product lines and execute our long-term strategy," said Smith in a statement released by the company.

Posted March 09, 2012

EMC Corporation has announced version 4.2 of EMC Greenplum Database, which includes a high-performance gNet for Hadoop; simpler, scalable backup with EMC Data Domain Boost; an extension framework and turnkey in-database analytics; language and compatibility enhancements for faster migrations to Greenplum; and targeted performance optimization.

Posted March 01, 2012

The Teradata Data Warehouse Appliance 2690 is now generally available. The new release is designed to deliver double the performance with up to triple the data capacity of its predecessor. "The Teradata Data Warehouse Appliance 2690 is our fifth generation appliance that provides a faster, easier, and greener analytic engine for a wide variety of demanding business intelligence tasks, which has contributed to its rapid customer adoption," says Ed White, general manager, Teradata Appliances, Teradata Corporation.

Posted February 28, 2012

Tableau Software, a provider of business intelligence software, and Cloudera Inc., a provider of Apache Hadoop-based data management software and services, have announced an integration between the two companies' products that provides enterprises with capabilities to more easily extract business insights from their big data without needing the specific technical skills typically required to operate Hadoop. Tableau has developed a Certified Cloudera Connector that is licensed to work with Cloudera's Distribution Including Apache Hadoop (CDH). The new connector is part of the Tableau 7.0 release.

Posted February 24, 2012
