Hadoop Articles
The COLLABORATE 16 conference for Oracle users kicked off with a presentation by Unisphere Research analyst Joe McKendrick, who shared insights from a ground-breaking study of future technology trends among 690 members of three major Oracle users groups.
Posted April 12, 2016
Thanks to the digital business transformation, the world around us is changing—and quickly—to a very consumer- and data-centric economy, where companies must transform to remain competitive and survive. The upshot is that for many companies today, it is a full-on Darwinian experience of survival of the fittest.
Posted April 08, 2016
Qubole, a big data-as-a-service company, is open sourcing its Quark platform, a cost-based SQL optimizer. The Quark project is also available in a SaaS implementation via the Qubole Data Service (QDS).
Posted April 07, 2016
IBM says it is making it easier and faster for organizations to access and analyze data in-place on the IBM z Systems mainframe with a new z/OS Platform for Apache Spark. The platform enables Spark to run natively on the z/OS mainframe operating system.
Posted April 04, 2016
Databricks, the company founded by the creators of Apache Spark, is releasing a new set of APIs that will enable enterprises to automate their Spark infrastructure to accelerate the deployment of production data-driven applications.
Posted April 01, 2016
ManageEngine is introducing a new application performance monitoring solution, enabling IT operations teams in enterprises to gain operational intelligence into big data platforms. Applications Manager enables performance monitoring of Hadoop clusters to minimize downtime and performance degradation. Additionally, the platform's monitoring support for Oracle Coherence provides insights into the health and performance of Coherence clusters and facilitates troubleshooting of issues.
Posted April 01, 2016
It's become almost a standard career path in Silicon Valley: A talented engineer creates a valuable piece of open source software inside a larger organization, then leaves that company to create a new startup to commercialize the open source product. Indeed, this is virtually the plot line of the hilarious HBO comedy series, Silicon Valley. Jay Kreps, a well-known engineer at LinkedIn and creator of the NoSQL database system Voldemort, has such a story.
Posted March 31, 2016
Denodo, a provider of data virtualization software, is releasing Denodo Platform 6.0, further accelerating its "fast data" strategy. "It's a major release for us," said Ravi Shankar, Denodo CMO. There are three important areas, he noted, that nobody else in the industry is focusing on. "This, we hope, will change how data virtualization, and in a broader sense, data integration will shape up this year."
Posted March 31, 2016
NoSQL databases were born out of the need to scale transactional persistence stores more efficiently. In a world where the relational database management system (RDBMS) was king, this was easier said than done.
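For readers unfamiliar with the underlying idea, the sketch below (a generic illustration in Python, not tied to any particular NoSQL product) shows why key-value data models partition so naturally across machines: each key hashes to exactly one shard, so reads and writes never have to coordinate across nodes the way a joined, normalized relational schema would.

```python
# Minimal sketch of hash-based sharding, the mechanism that lets key-value
# stores scale out horizontally. Each "shard" stands in for a separate server.
import hashlib


class ShardedKVStore:
    def __init__(self, num_shards: int = 4):
        self.shards = [dict() for _ in range(num_shards)]

    def _shard_for(self, key: str) -> dict:
        # Hash the key so it always lands on the same shard.
        digest = hashlib.md5(key.encode()).hexdigest()
        return self.shards[int(digest, 16) % len(self.shards)]

    def put(self, key: str, value) -> None:
        self._shard_for(key)[key] = value

    def get(self, key: str):
        return self._shard_for(key).get(key)


store = ShardedKVStore()
store.put("user:42", {"name": "Ada"})
print(store.get("user:42"))  # {'name': 'Ada'}
```

Adding more shards adds capacity almost linearly, which is precisely the property that was hard to achieve with a single relational database.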
Posted March 29, 2016
MapR is now available as part of Bigstep's big data platform-as-a-service, supporting a wide range of Hadoop applications.
Posted March 29, 2016
Reltio is releasing Reltio Cloud 2016.1, an enhanced version of its platform that adds new analytics integration, collaboration, and recommendation capabilities to help companies be right faster.
Posted March 29, 2016
Teradata has introduced a new "design pattern" approach for data lake deployment. The company says its concept of a data lake pattern leverages IP from its client engagements, as well as services and technology to help organizations more quickly and securely get to successful data lake deployment.
Posted March 28, 2016
The rise of big data technologies in enterprise IT is now seen as an inevitability, but adoption has occurred at a slower pace than expected, according to Joe Caserta, president and CEO of Caserta Concepts, a firm focused on big data strategy consulting and technology implementation. Caserta recently discussed the trends in big data projects, the technologies that offer key advantages now, and why he thinks big data is reaching a turning point.
Posted March 23, 2016
Informatica has launched an end-to-end solution to help customers gain greater insight from big data.
Posted March 23, 2016
SAP SE's newest in-memory query engine, SAP HANA Vora, is now generally available, equipping enterprises with contextual analytics across all data stored in Hadoop, enterprise systems, and other distributed data sources.
Posted March 23, 2016
As more businesses leverage applications that are hosted in the cloud, the lines between corporate networks and the internet become blurred. Accordingly, enterprises need to develop an effective strategy for ensuring security. The problem is, many of today's most common approaches simply don't work in this new cloud-based environment.
Posted March 23, 2016
The OAUG volunteers planning COLLABORATE 16: Technology and Applications Forum for the Oracle Community (April 10-14 at Mandalay Bay in Las Vegas) are themselves Oracle users and technologists who innately understand the myriad options and challenges faced by the wider user community in a period of rapid change and transformation. With participation and contributions from all corners of the user community, COLLABORATE offers the information and perspective to make sense of it all.
Posted March 21, 2016
Oracle has released a free and open API and developer kit for its Data Analytics Accelerator (DAX) in SPARC processors through its Software in Silicon Developer Program. "Through our Software in Silicon Developer Program, developers can now apply our DAX technology to a broad spectrum of previously unsolvable challenges in the analytics space because we have integrated data analytics acceleration into processors, enabling unprecedented data scan rates of up to 170 billion rows per second," said John Fowler, executive vice president of Systems, Oracle.
Posted March 16, 2016
Available now, Talend says its Integration Cloud Spring '16 release adds enhancements that help IT organizations execute big data and data integration projects running on AWS Redshift or AWS Elastic MapReduce (EMR) with greater ease, using fewer resources and at reduced cost.
Posted March 16, 2016
Attivio is receiving $31 million in investment financing that will help expand the company as it accelerates its offerings into the big data market.
Posted March 09, 2016
IDERA, a provider of database lifecycle management solutions, is extending its product portfolio by adding Embarcadero Technologies' ER/Studio and DB PowerStudio tools, allowing organizations to rely on a single vendor for all their database lifecycle needs.
Posted March 09, 2016
In a new book titled "Next Generation Databases," Guy Harrison, an executive director of R&D at Dell, shares what every data professional needs to know about the future of databases in a world of NoSQL and big data.
Posted March 08, 2016
Syncsort is introducing new capabilities to its data integration software, DMX-h, that allow organizations to work with mainframe data in Hadoop or Spark in its native format, which is necessary for preserving data lineage and maintaining compliance.
Posted March 07, 2016
Infobright, provider of a columnar database analytics platform, has unveiled its new Infobright Approximate Query (IAQ) solution for large-scale data environments, allowing users to gain insights faster and more efficiently. "This technology is being delivered on the basis of rethinking the business problem and using technology in a very meaningful way to solve problems that would otherwise be unsolvable using a traditional approach," said Don DeLoach, CEO.
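The announcement does not detail how IAQ computes its answers; as a generic illustration of the approximate-query idea, the Python sketch below estimates an aggregate from a uniform random sample rather than a full scan, trading a small, bounded error for a large reduction in rows touched.

```python
# Generic illustration of approximate querying (not IAQ's actual algorithm):
# estimate SUM(amount) over a large table by scanning only a 1% sample.
import random

random.seed(7)
rows = [{"amount": random.uniform(0, 100)} for _ in range(1_000_000)]


def approx_sum(table, column, sample_fraction=0.01):
    sample = random.sample(table, int(len(table) * sample_fraction))
    # Scale the sample's sum back up to the size of the full table.
    return sum(r[column] for r in sample) / sample_fraction


exact = sum(r["amount"] for r in rows)
estimate = approx_sum(rows, "amount")
print(f"exact={exact:,.0f}  estimate={estimate:,.0f}  "
      f"error={abs(estimate - exact) / exact:.2%}")
```

On uniformly distributed data like this, the 1% sample typically lands within a fraction of a percent of the exact answer while reading two orders of magnitude fewer rows.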
Posted February 26, 2016
SAP SE is introducing new predictive capabilities within its platforms with the release of SAP HANA Cloud Platform predictive services 1.0 and SAP Predictive Analytics 2.5.
Posted February 24, 2016
The promise of the data lake is an enduring repository of raw data that can be accessed now and in the future for different purposes. To help companies on their journey to the data lake, Information Builders has unveiled the iWay Hadoop Data Manager, a new solution that provides an interface to generate portable, reusable code for data integration tasks in Hadoop.
Posted February 23, 2016
It is hard to think of a technology more closely identified with the rise of big data than Hadoop. Since its creation, the framework for distributed processing of massive datasets on commodity hardware has had a transformative effect on the way data is collected, managed, and analyzed, and has also grown well beyond its initial scope through a related ecosystem of open source projects. With 2016 marking the 10-year anniversary of Hadoop, Big Data Quarterly chose this time to ask technologists, consultants, and researchers to reflect on what has been achieved in the last decade, and what's ahead on the horizon.
Posted February 18, 2016
Currently, the IT industry is in the midst of a major transition as it moves from the last generation, the internet generation, to the new generation of cloud and big data, said Andy Mendelsohn, Oracle's EVP of Database Server Technologies, who recently talked with DBTA about database products that Oracle is bringing to market to support customers' cloud initiatives. "Oracle has been around a long time. This is not the first big transition we have gone through," said Mendelsohn.
Posted February 17, 2016
Hewlett Packard Enterprise (HPE) has selected the RedPoint Data Management platform to underpin a new HPE Risk Data Aggregation and Reporting (RDAR) integrated solution supporting financial institutions' compliance with BCBS 239.
Posted February 16, 2016
Addressing the shift toward business-user-oriented visual interactive data preparation, Trillium Software has launched a new solution that integrates self-service data preparation with data quality capabilities to improve big data analytics.
Posted February 16, 2016
Say what you will about Oracle, it certainly can't be accused of failing to move with the times. Typically, Oracle comes late to a technology party but arrives dressed to kill.
Posted February 10, 2016
Oracle has introduced a new Big Data Preparation Cloud Service. Despite the increasing talk about the need for companies to become "data-driven," and the perception that people who work with business data spend most of their time on analytics, Oracle contends that in reality many organizations devote much more time and effort to importing, profiling, cleansing, repairing, standardizing, and enriching their data.
Posted February 10, 2016
Looker, provider of a BI platform, has added support for Presto and Spark SQL as well as updates to its support for Impala and Hive.
Posted February 09, 2016
Trifacta, provider of a self-service data preparation platform, is receiving $35 million in growth-stage financing that will be used to continue expanding the company globally and allow for additional projects. "The multi-billion-dollar big data and IoT revolution requires a modern, innovative approach to preparing data and empowering end users," said Ping Li, partner at Accel and director of the company's Big Data Fund. "Trifacta delivers that platform. We've backed Trifacta's incredible team from the beginning and are excited to help the company scale to take advantage of this significant market opportunity."
Posted February 09, 2016
Franz Inc. is releasing an updated version of its AllegroGraph platform along with receiving certification on the latest release of Cloudera Enterprise through the Cloudera Certified Technology Program (CCTP).
Posted February 08, 2016
EMC has begun shipping the latest version of EMC Elastic Cloud Storage (ECS), a multi-purpose, shared global storage platform that scales into exabytes to store both small and large files. ECS was first launched in spring 2014 to provide high-density storage with exabyte scalability in a private cloud.
Posted February 08, 2016
OpsClarity's intelligent monitoring solution now provides monitoring for a growing and popular suite of open source data processing frameworks, including Apache Kafka, Apache Storm, and Apache Spark, as well as datastores such as Elasticsearch, Cassandra, and MongoDB. The solution is intended to enable DevOps teams to gain visibility into how these technologies depend on each other and to troubleshoot performance issues.
Posted February 08, 2016
Hewlett Packard Enterprise (HPE) has announced the availability of HPE Investigative Analytics, a new software solution to enable financial institutions and other organizations in highly regulated industries to use big data technologies to detect patterns, relationships, behaviors, and anomalies across structured and unstructured data stores. The software is aimed at helping companies reduce risk by proactively preventing fraudulent actions.
Posted February 02, 2016
The combination of location data with high-speed data in VoltDB version 6.0 will allow VoltDB to better support businesses that need to maximize the value of real-time data streams, including location data, to make personalized offers to customers and improve decision making.
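As a rough illustration of the pattern being described (a generic Python sketch, not VoltDB's API or SQL geospatial features), the code below checks each incoming position event against a store's offer radius and emits a personalized offer when the customer is nearby; the store and user names are hypothetical.

```python
# Generic sketch of a location-aware decision on a real-time event stream:
# as position events arrive, check whether the user is within a store's
# offer radius and emit a personalized offer if so.
from math import asin, cos, radians, sin, sqrt


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))


STORE = {"name": "Downtown Cafe", "lat": 40.7421, "lon": -73.9911, "radius_km": 0.5}


def on_location_event(event):
    dist = haversine_km(event["lat"], event["lon"], STORE["lat"], STORE["lon"])
    if dist <= STORE["radius_km"]:
        return f"Offer for {event['user']}: 10% off at {STORE['name']}"
    return None


# Simulated stream of position events.
for evt in [{"user": "u1", "lat": 40.7430, "lon": -73.9900},
            {"user": "u2", "lat": 40.7000, "lon": -74.0100}]:
    print(on_location_event(evt))
```

The value of doing this inside an in-memory SQL engine rather than in application code is that the distance check runs where the streaming data already lives, keeping the offer decision within milliseconds of the event.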
Posted January 28, 2016
Paxata's Winter '15 release allows administrators to deploy the data prep platform in heterogeneous environments, including the Hortonworks Data Platform on YARN and multiple versions of Apache Spark. The latest release also improves the way business analysts find, access, and apply data by delivering additional ease-of-use capabilities supported by machine learning innovations, and provides enterprise-grade security and a multi-tenant governance model.
Posted January 27, 2016
ClearStory Data is making advancements and core improvements in the upcoming release of its native Apache Spark platform. With Apache Spark 1.6, ClearStory further speeds exploration of big, diverse data for business users who need unrestricted data discovery and free-form exploration to answer new questions.
Posted January 26, 2016
Rocana has unveiled the latest version of its solution for managing and analyzing event-oriented machine data that introduces new advanced analytics and anomaly detection abilities. In addition to the new features, the platform update also introduces support for Hortonworks along with Cloudera, further deepening the platform's reach.
Posted January 12, 2016
Though MySQL has evolved into a robust database engine that now handles large-scale operations for mission-critical workloads, the complexity of performance tuning can still prove challenging for DBAs, who are increasingly tasked with managing multiple database platforms in production for larger and more complex workloads.
Posted January 06, 2016