Hadoop

The Apache Hadoop framework for distributed processing of data on commodity hardware is at the center of the big data picture today. Key technologies include the Hadoop Distributed File System (HDFS), YARN, MapReduce, Pig, and Hive, along with security tooling and a growing spectrum of solutions that support business intelligence (BI) and analytics.
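MapReduce, the programming model at the heart of Hadoop, is easiest to see through Hadoop Streaming, which lets any executable serve as mapper and reducer over files stored in HDFS. The following is a minimal word-count sketch in Python; the script names and the input and output paths are illustrative and not tied to any product covered below.

    #!/usr/bin/env python
    # mapper.py -- read lines from stdin and emit "word<TAB>1" for every word.
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print("%s\t%d" % (word.lower(), 1))

    #!/usr/bin/env python
    # reducer.py -- the framework sorts mapper output by key, so counts for a
    # given word arrive together and can be summed in a single pass.
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print("%s\t%d" % (current_word, current_count))
            current_word, current_count = word, int(count)
    if current_word is not None:
        print("%s\t%d" % (current_word, current_count))

A job built from these two scripts is typically submitted with the hadoop-streaming JAR that ships with Hadoop (for example: hadoop jar hadoop-streaming-*.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input /data/in -output /data/out); the exact JAR location varies by distribution.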



Hadoop Articles

Rosslyn Analytics, a provider of big data cloud technology, has announced it is one of the first to offer companies analytics as a service on Azure. The big data cloud analytics platform, powered by Azure, provides self-service management from source to analytics and enables business and IT users to interact with, change, and analyze data using a combination of self-service data integration, cleansing, and enrichment tools along with machine learning and visualization technologies.

Posted May 20, 2015

Oracle is shipping a new big data product called Oracle Big Data Spatial and Graph. Spatial and graph analytics has been available as an option for Oracle Database for more than 10 years, and with this introduction the company is bringing spatial and graph analytics to Hadoop and NoSQL.

Posted May 20, 2015

MapR Technologies, Inc., a provider of a distribution for Apache Hadoop, is including Apache Drill 1.0 in the MapR Distribution.

Posted May 19, 2015

Google white papers have inspired many great open source projects. What has been missing until now, however, has been a way of bringing these technologies together such that any data-centric organization can benefit from the capabilities of each technology across its entire data center, and in new ways not documented by any single white paper. This is called the "Zeta Architecture."

Posted May 19, 2015

The demand for effective data management is intensifying. At the same time, the database market has expanded into a wide array of solutions—from traditional relational database management systems to alternative databases such as NoSQL, NewSQL, cloud, and in-memory offerings.

Posted May 19, 2015

RedPoint Global was founded in 2006 by Dale Renner, Lewis Clemmens, and George Corugedo, who previously had worked together at Accenture. Based in Wellesley, Mass., RedPoint collaborates with clients around the world in 11 different verticals. "We have always been very focused on the data, and recognize that a lot of business problems live and die by the quality of the data," says Corugedo.

Posted May 19, 2015

Hadoop is contributing to the success of data analytics. Anad Rai, IT manager at Verizon Wireless, examined the differences between traditional and big data analytics at Data Summit 2015 in a session titled "Analytics: Traditional Versus Big Data." The presentation, which was part of the IOUG track moderated by Alexis Bauer Kolak, education manager at the IOUG, showed how big data technologies are helping data discovery and improving the transformation of information and knowledge into wisdom.

Posted May 14, 2015

At Data Summit 2015 in New York City, Tony Shan, chief architect at Wipro, gave a talk on the key components of a successful big data methodology and shared lessons learned from real-world big data implementations. According to Shan, a big data framework follows an eight-step process, with specific techniques and methods for each step.

Posted May 14, 2015

The data lake is one of the hottest topics in the data industry today. It is a massive storage reservoir that allows data to be stored in its rawest form. Hadoop Day at Data Summit 2015 concluded with a panel on all things data lake, featuring James Casaletto, solutions architect for MapR; Joe Caserta, president and founder of Caserta Concepts; and George Corugedo, CTO of RedPoint Global Inc.

Posted May 14, 2015

With the influx of big data solutions and technologies comes a bevy of new problems, according to Data Summit 2015 panelists Miles Kehoe, search evangelist at Avalon Consulting, and Anne Buff, business solutions manager for SAS best practices at the SAS Institute. Kehoe and Buff opened the second day of Data Summit with a keynote discussion focusing on resolving data conundrums.

Posted May 14, 2015

To transform data into value, IT must move from thinking about what it does to data, and instead focus on business outcomes and what can be done with the data to advance the business, according to Edd Dumbill, vice president, strategy, Silicon Valley Data Science, who gave the welcome keynote at Data Summit 2015.

Posted May 14, 2015

Splice Machine is partnering with Talend to enable customers to simplify data integration and streamline data workflows on Hadoop. Through this partnership, organizations building operational data lakes with Splice Machine can take advantage of Talend's data integration technology and its data quality capabilities.

Posted May 12, 2015

Pentaho users will now be able to use Apache Spark within Pentaho thanks to a new native integration that enables the orchestration of all Spark jobs. The integration of Spark with Pentaho Data Integration (PDI), an effort initiated by Pentaho Labs, will enable customers to increase productivity, reduce maintenance costs, and dramatically lower the skill sets required as Spark is incorporated into big data projects.

Posted May 12, 2015

Cloudera is now offering support for Capgemini's new reference architecture for the SAP HANA platform and Cloudera Enterprise. "By bringing the power of Cloudera's enterprise data hub offering to the ecosystem in support of SAP HANA, we can enable Capgemini's clients to expand the amount of data they have within their environment in a cost-efficient manner," said Tim Stevens, vice president of corporate and business development at Cloudera.

Posted May 08, 2015

Pivotal has made updates to its big data suite that include upgrades to the Pivotal HD enterprise-grade Apache Hadoop distribution, which is now based on the Open Data Platform core, and performance improvements for Pivotal Greenplum Database.

Posted May 05, 2015

The Spring 2015 release of the SnapLogic Elastic Integration Platform extends the platform's cloud and big data integration capabilities to the Internet of Things (IoT) with support for Message Queuing Telemetry Transport (MQTT), a lightweight machine-to-machine connectivity protocol.
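MQTT itself is straightforward to experiment with outside of any integration platform. Below is a minimal publish/subscribe sketch using the open source Eclipse Paho client for Python (pip install paho-mqtt); the broker host and topic name are illustrative placeholders, and this is not SnapLogic's API.

    # Minimal MQTT publish/subscribe sketch with the Eclipse Paho Python client.
    import paho.mqtt.client as mqtt

    BROKER = "broker.example.com"   # hypothetical MQTT broker
    TOPIC = "sensors/temperature"   # hypothetical topic

    def on_message(client, userdata, msg):
        # Called for every message the broker delivers on a subscribed topic.
        print("received %s on %s" % (msg.payload.decode(), msg.topic))

    client = mqtt.Client()
    client.on_message = on_message
    client.connect(BROKER, 1883, keepalive=60)
    client.subscribe(TOPIC)
    client.publish(TOPIC, "21.5")   # a device would publish readings like this
    client.loop_forever()           # process network traffic and callbacks

Because the protocol is lightweight and broker-based, the same pattern scales from a single sensor to large fleets of devices feeding downstream integration pipelines.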

Posted May 05, 2015

Splice Machine, a provider of a Hadoop RDBMS, announced that it is partnering with mrc (michaels, ross & cole ltd) to allow Splice Machine's Hadoop RDBMS to be certified and integrated with mrc's m-Power platform. "Our partnership with mrc gives businesses a solution that can speed real-time application deployment on Hadoop with the staff and tools they currently have, while also offering affordable scale-out on commodity hardware for future growth," said Monte Zweben, co-founder and CEO, Splice Machine.

Posted April 28, 2015

Pivotal HAWQ is now available on the Hortonworks Data Platform (HDP), enabling the benefits of SQL on Hadoop to be leveraged by enterprises that are investing in HDP. This marks the first time that the features and capabilities of Pivotal HAWQ have been made available outside of Pivotal. The availability aligns with a common Open Data Platform (ODP) Core that allows users to leverage the best-of-breed technology across providers.

Posted April 27, 2015

The future will flourish with machines. We've been told this in pop culture for decades, from the helpful robots of the Jetsons, to the infamous Skynet of the Terminator movies, to the omniscient "computer" of Star Trek. Smart, connected devices will be ubiquitous and it's up to us, the humans, to decide what's next. But the Internet of Things (IoT) is about more than devices and data.

Posted April 23, 2015

SUSE and Veristorm are partnering to provide certified high-performance Hadoop solutions that run directly on Linux on IBM z Systems, IBM Power Systems, and x86-64. Customers with IBM z Systems can team SUSE Linux Enterprise Server for System z with Veristorm zDoop, a commercial distribution of Hadoop supported on mainframes.

Posted April 23, 2015

Many DBAs are now tasked with managing multi-vendor environments, and handling a variety of data types. Increasingly, DBAs are turning to strategies such as database automation to be able to concentrate more on the big picture of moving their enterprises forward.

Posted April 23, 2015

While the new data stores and other software components are generally open source and incur little or no licensing costs, the architecture of the new stacks grows ever more complex, and this complexity is creating a barrier to adoption for more modestly sized organizations.

Posted April 22, 2015

To help organizations answer questions with data spread across disparate analytics systems and data repositories, Teradata has expanded its QueryGrid technologies. "With this announcement we have our foot on the gas pedal," said Imad Birouty, director of product marketing at Teradata. "We have seven updates. We are announcing new connectors that are on their way, announcing that we have delivered on the connectors that we previously announced, and we are refreshing previously released connector versions of the technologies."

Posted April 20, 2015

Unstructured data types and new database management systems are playing an increasing role in the modern data ecosystem, but structured data in relational database management systems (RDBMS) remains the foundation of the information infrastructure in most companies. In fact, structured data still makes up 75% of data under management for more than two-thirds of organizations, with nearly one-third of organizations not yet actively managing unstructured data at all, according to a new survey commissioned by Dell Software and conducted by Unisphere Research, a division of Information Today, Inc.

Posted April 15, 2015

AtScale, Inc. has introduced a platform that will enable interactive, multi-dimensional analysis on Hadoop, directly from standard business intelligence tools such as Microsoft Excel, Tableau Software or QlikView. Dubbed the "AtScale Intelligence Platform," the new offering provides a Hadoop-native analysis server that allows users to analyze big data at full scale and top speed, while leveraging the existing BI tools they already own.

Posted April 14, 2015

Think Big, a Teradata company, has introduced the Dashboard Engine for Hadoop, which enables organizations to access and report on big data in Hadoop-based data lakes to make agile business decisions. "There are endless streams of data from web browsers, set top boxes, and contact centers that often land in Hadoop, but sometimes don't make their way into downstream analytics," said Ron Bodkin, president, Think Big.

Posted April 13, 2015

Oracle has unveiled Oracle Data Integrator for Big Data to help make big data integration more accessible and actionable for customers. The goal of the new data integration capabilities is to unite the disparate communities that have emerged within the Oracle client base, bringing mainstream DBAs and ETL developers together with big data development organizations on a single platform for collaboration, said Jeff Pollock, vice president of product management at Oracle.

Posted April 08, 2015

In order to truly appreciate Apache Drill, it is important to understand the history of the projects in this space, as well as the design principles and the goals of its implementation.

Posted April 08, 2015

How does an organization acknowledge that data is important? An organization does so by enabling and supporting efforts for gathering and persisting information about the organization's data resources.

Posted April 06, 2015

Using StretchDB, an enterprise can "stretch" an on-premises database into the cloud, such that "hot," heavily used data is stored in the on-premises instance of SQL Server, while "cold" and infrequently used data is transparently stored in Azure. A stretched database automatically and transparently manages synchronization and movement of aging data from on-premises to the cloud.
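The idea can be illustrated with a small, hypothetical tiering sweep in Python; this is not Microsoft's implementation, and sqlite3 files simply stand in for the on-premises and cloud tiers.

    # Illustrative age-based tiering sweep: rows older than a cutoff move from a
    # "local" store to a "remote" one. Hypothetical sketch, not StretchDB itself.
    import sqlite3
    from datetime import datetime, timedelta

    local = sqlite3.connect("local.db")    # stand-in for the on-premises database
    remote = sqlite3.connect("remote.db")  # stand-in for the cloud tier
    for db in (local, remote):
        db.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, created TEXT)")

    cutoff = (datetime.utcnow() - timedelta(days=365)).isoformat()

    # Copy cold rows to the remote tier, then remove them from the local tier.
    cold = local.execute(
        "SELECT id, created FROM orders WHERE created < ?", (cutoff,)).fetchall()
    remote.executemany("INSERT INTO orders (id, created) VALUES (?, ?)", cold)
    local.execute("DELETE FROM orders WHERE created < ?", (cutoff,))
    remote.commit()
    local.commit()

In the product described above, that movement and synchronization are managed automatically and transparently by SQL Server rather than by application code.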

Posted April 06, 2015

The Independent Oracle Users Group (IOUG) has been serving Oracle technologists and professionals for more than 20 years, and we are very pleased with how much the community has grown as well as how much IOUG has accomplished. Having said this, we will not rest on our laurels. There are many great opportunities that lie ahead of us. While we set the bar pretty high in 2014 with the establishment of the content-rich blog #IOUGenius, an increased number of Master Classes offered across the nation, and a truly inspirational COLLABORATE 14, you'll be very pleased with what IOUG has in store for 2015.

Posted April 01, 2015

Syncsort is partnering with Impetus Technologies to provide integrated solutions for building real-time, streaming analytics applications integrated with Apache Kafka, RabbitMQ, and other message brokers on Hadoop. While Syncsort has contributed to open source projects for several years, the partnership with Impetus is taking the next step in streamlining data analytics for real-time applications.
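As a rough illustration of what consuming such a stream looks like, here is a minimal Kafka consumer in Python using the open source kafka-python package; the broker address and topic name are hypothetical, and this is not Syncsort's or Impetus's API.

    # Minimal streaming-count sketch with kafka-python (pip install kafka-python).
    from collections import Counter
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "clickstream",                          # hypothetical topic name
        bootstrap_servers="localhost:9092",     # hypothetical broker address
        value_deserializer=lambda v: v.decode("utf-8"))

    counts = Counter()
    for message in consumer:        # blocks, handling events as they arrive
        counts[message.value] += 1  # e.g., running count of events per page
        print(message.value, counts[message.value])

A real streaming-analytics pipeline would replace the running counter with windowed aggregations and push results to a serving layer, but the consume-and-process loop keeps the same basic shape.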

Posted April 01, 2015

Splice Machine has formed a strategic partnership with RedPoint Global, to provide a marketing solution for big data. According to the vendors, with database technology from Splice Machine and cross-channel marketing and data quality technology from RedPoint, the partnership provides a platform that can use big data to enable personalized, real-time interactions to engage customers across channels.

Posted March 31, 2015

Mtelligence Corporation (dba Mtell) and MapR Technologies have introduced a new big data platform called Mtell Reservoir that combines the MapR Distribution including Hadoop, Mtell Previse software, and OpenTSDB (a time-series database). The solution is targeted at the oil and gas industry.

Posted March 31, 2015

Talend has introduced a new solution called Talend Integration Cloud to provide instant, elastic, and secure capacity so IT teams can more easily shift workloads between on-premises and cloud environments. Targeted at SMBs, large enterprises, and IT integration developers, and planned for availability in April, the hosted cloud integration platform will provide a single solution for bulk, batch, and real-time data integration across hundreds of data sources, including Hadoop, Amazon Redshift, and NoSQL databases, said Ashley Stirrup, chief marketing officer for Talend.

Posted March 24, 2015

EMC has unveiled a new fully engineered solution incorporating storage and big data analytics technologies from EMC Information Infrastructure, Pivotal, and VMware. Dubbed "the Federation Business Data Lake (FBDL)," it is designed for speed, self-service, and scalability for the enterprise, enabling organizations to deploy Hadoop and real-time analytics capabilities in as little as 7 days.

Posted March 23, 2015

It looks like 2015 will be an important year for big data and many other technologies such as HTAP and in-memory computing. Many businesses have gone from investigation to experimentation to actual implementation. With installations coming online, and more to come in 2015 and beyond, big data will become more efficient and more customer-focused. Essentially, what many saw as hype will now turn into real implementations.

Posted March 12, 2015

Hadoop distribution provider MapR Technologies has announced the results of testing based on TPCx-HS, the recently released big data benchmark from the TPC (Transaction Processing Performance Council), a series of tests that compares Hadoop architectures across several dimensions. Cisco is also now reselling the MapR Distribution with Cisco UCS, as part of an agreement that includes worldwide marketing, sales, and training.

Posted March 05, 2015

"This acquisition is strategic, synergistic, and will strengthen our leadership in the big data and Hadoop market," said Shimon Alon, Attunity's chairman and CEO, during a conference call this morning discussing his company's purchase of Appfluent, a provider of data usage analytics for big data environments, including data warehousing and Hadoop. "We also expect it to accelerate our revenue growth and to be accretive to earnings." The total purchase price is approximately $18 million, payable in cash and stock, with additional earn-out consideration based on performance milestones.

Posted March 05, 2015

Informatica, a data management company, is collaborating with two major big data players - Capgemini and Pivotal - on a data lake solution. As part of the Business Data Lake ecosystem developed by Capgemini and Pivotal, Informatica will deliver certified technologies for data integration, data quality and master data management (MDM).

Posted March 03, 2015

Xplenty has formed a partnership with Avlino, a big data solutions provider, to further accelerate batch processing within Xplenty's software. With the new partnership, Xplenty and Avlino say they want to reverse a commonly accepted industry statistic concerning data processing: that business users spend 80% of their time preparing data, leaving only 20% for actually analyzing it.

Posted March 03, 2015

Syncsort, a provider of big data and mainframe software, has completed the acquisition of UK-based William Data Systems, a provider of advanced network monitoring and security software products for the IBM z Systems z/OS mainframe platform. According to Syncsort, because the mainframe is a high-volume transactional supercomputer, the networking and security data collected by the William Data product suite is particularly valuable to fast-growing big data and analytical platforms.

Posted March 02, 2015

The "Internet of Things" (IoT) is opening up a new world of data interchange between devices, sensors, and applications, enabling businesses to monitor, in real time, the health and performance of products long after they leave the production premises. At the same time, enterprises now have access to valuable data—again, in real time if desired—on how customers are adopting products and services.

Posted February 25, 2015

Sqrrl, a big data analytics company that develops software to uncover hidden patterns, trends, and links in data, has announced the launch of Sqrrl Enterprise 2.0, coinciding with $7 million in new funding. Sqrrl was developed by former employees of the NSA. "We don't shy away from our NSA heritage; we see it as a strength," stated Ely Kahn, co-founder and director of business development for Sqrrl.

Posted February 24, 2015
