Xplenty has announced $4 million in new funding from Bain Capital Ventures, True Ventures, and Rembrandt Venture Partners, with participation from existing Xplenty investors Magma Venture Partners and Waarde Capital.
Posted January 04, 2017
The Data Warehousing Sanity Check
Posted January 04, 2017
Pitney Bowes has joined Hortonworks Partnerworks in the Modern Data Solutions (MDS) partner program. According to the vendors, location-based data, in particular, is becoming more important in how businesses understand their customers because it is one of the most consistent ways to link people, places, and things.
Posted January 04, 2017
Using Data Lake Management Strategies for Big Data Analytics
Posted January 03, 2017
ZeroPoint technology focuses on analyzing documents, email, web content, and server traffic for hazardous content such as malicious code.
Posted December 13, 2016
When software providers consider transitioning to (or at the very least adding) a SaaS offering, they think about the impact on their business of moving from a perpetual license model to a recurring revenue stream. And while it's easy to remember and consider migration costs such as application-level rearchitecture, other upfront and ongoing costs - such as infrastructure and service-related costs - are often severely underestimated.
Posted December 12, 2016
The Modern Heterogeneous Enterprise Data Architecture Takes Shape
Posted December 08, 2016
It has become all too clear that no organization is immune from the risk of a data breach, and that anyone accessing data can pose a threat - including trusted employees and partners. Here, IT executives speculate on the impact of newer technologies such as IoT, blockchain, and cloud, as well as the growing need for data protection, including disaster recovery plans, encryption, and comprehensive oversight.
Posted December 07, 2016
Many providers of cloud services market the idea that all critical computing functions should be run using their public cloud services because this paradigm is the future and the future is now. While we do share that long-term vision, the reality is less impressive, and the solution is not yet complete. Amazon itself does not run 100% of its critical business systems in the AWS Public Cloud, a fact that was revealed in The Wall Street Journal article, "Cloud-Computing Kingpins Slow to Adapt to Own Movement." This is also true for Google, Microsoft, and other top cloud providers.
Posted November 15, 2016
The definition of "data visualization" often varies depending on whom you ask. For some, it's a process of visually transforming data for exploration or analysis. For others, it's a tool to share analytical insights or invite discovery.
Posted November 15, 2016
Data as a service (DaaS) is a business-centric service that transforms raw data into meaningful and reusable data assets, and delivers these data assets on-demand via a standard connectivity protocol in a pre-determined, configurable format and frequency for internal and external consumption.
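The pattern described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's implementation: the records, field names, and supported formats are invented, and the "standard connectivity protocol" is reduced to a plain function call that delivers the same cleaned data asset in the consumer's requested format.

```python
import csv
import io
import json

# Hypothetical raw records, as a DaaS provider might hold them internally.
RAW_SALES = [
    {"region": "EMEA", "units": "120", "revenue": "4500.00"},
    {"region": "APAC", "units": "80", "revenue": "3100.50"},
]

def serve_dataset(records, fmt="json"):
    """Deliver the data asset in a pre-determined, configurable format.

    Raw string values are first transformed into typed, reusable fields;
    the same cleaned asset is then serialized as JSON or CSV on demand.
    """
    cleaned = [
        {"region": r["region"], "units": int(r["units"]),
         "revenue": float(r["revenue"])}
        for r in records
    ]
    if fmt == "json":
        return json.dumps(cleaned)
    if fmt == "csv":
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=["region", "units", "revenue"])
        writer.writeheader()
        writer.writerows(cleaned)
        return buf.getvalue()
    raise ValueError(f"unsupported format: {fmt}")
```

The point of the sketch is the separation of concerns: the transformation into a reusable asset happens once, while the delivery format and frequency remain a per-consumer configuration choice.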
Posted November 04, 2016
New data sources such as sensors, social media, and telematics along with new forms of analytics such as text and graph analysis have necessitated a new data lake design pattern to augment traditional design patterns such as the data warehouse. Unlike the data warehouse - an approach based on structuring and packaging data for the sake of quality, consistency, reuse, ease of use, and performance - the data lake goes in the other direction by storing raw data that lowers data acquisition costs and provides a new form of analytical agility.
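The contrast between the two design patterns can be sketched with a toy example (the sensor event and field names here are invented for illustration): the lake path persists the event verbatim at minimal acquisition cost, while the warehouse path conforms it to a fixed schema before storage.

```python
import json

# A hypothetical incoming sensor event, arriving as raw JSON.
EVENT = {"sensor_id": "t-17", "reading": "21.4C", "ts": 1478131200}

def land_in_lake(raw_event, store):
    """Data lake path: persist the event exactly as received.

    No schema is imposed at write time ("schema on read"), which keeps
    acquisition cheap and preserves fields future analyses may need.
    """
    store.append(json.dumps(raw_event))

def load_into_warehouse(raw_event, table):
    """Data warehouse path: validate and conform before storage.

    The event is parsed into a fixed schema for quality, consistency,
    and reuse; anything that does not fit the model is transformed.
    """
    value, unit = float(raw_event["reading"][:-1]), raw_event["reading"][-1]
    table.append({"sensor_id": raw_event["sensor_id"],
                  "celsius": value if unit == "C" else (value - 32) * 5 / 9,
                  "ts": raw_event["ts"]})
```

The lake's agility comes from deferring the modeling work; the warehouse's ease of use comes from doing that work up front.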
Posted November 03, 2016
A new semantic-based graph data model has emerged within the enterprise. This data model has all of the advantages of the relational data model, but goes even further in providing for more intelligence built into the database itself, enabling greater elasticity to absorb the inevitable changes to data requirements, at cloud scales.
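A toy sketch of the elasticity claim, assuming a subject-predicate-object triple representation (the facts and predicate names below are invented): because every fact is just a triple, a new data requirement is absorbed by asserting new triples rather than migrating a schema.

```python
# Minimal triple store: each fact is a (subject, predicate, object) tuple,
# so the "schema" is simply whatever predicates have been asserted so far.
triples = set()

def assert_fact(subject, predicate, obj):
    triples.add((subject, predicate, obj))

def objects_of(subject, predicate):
    """Return all objects linked to a subject via a predicate."""
    return {o for s, p, o in triples if s == subject and p == predicate}

assert_fact("emp:101", "name", "Ada")
assert_fact("emp:101", "worksFor", "dept:eng")
# A new requirement arrives - tracking badge color - and is absorbed by
# asserting new triples; no ALTER TABLE, no migration window.
assert_fact("emp:101", "badgeColor", "blue")
```

Production semantic databases add indexing, inference, and standards such as RDF and SPARQL on top of this idea, but the elasticity follows from the same triple structure.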
Posted November 02, 2016
Data has become a disruptive force for global businesses and a catalyst for digital transformation. But data can only be leveraged for BI initiatives to the extent it can be accessed and trusted. And, while today's self-service BI and analytics tools satisfy a user's craving for more "consumerized" technology, they often leave an analyst stuck in neutral because the users, first and foremost, cannot find the data they need to perform any analysis.
Posted November 02, 2016
Kinetica, provider of an in-memory database accelerated by GPUs (graphics processing units), has introduced two new software and services offerings designed to help customers ingest and use streaming datasets.
Posted October 31, 2016
The focus of data governance should not be on creating bureaucracy and rules, but instead on business enablement within context of use. To do this, I suggest looking at data governance not as enforcement of a discipline, but instead as a process of guiding a data expedition. Let's look at what a data expedition entails and how data governance will be the guide of this ongoing journey.
Posted October 13, 2016
Data Modeling for the Modern World
Posted October 10, 2016
The rise of big data with new sources of data for analytics represents a new opportunity to put data to work in organizations for a wide range of uses. A developing use case for leveraging data analytics on large datasets is fraud discovery.
Posted October 05, 2016
Choosing when to leverage cloud infrastructure is a topic that should not be taken lightly. There are a few issues that should be considered when debating cloud as part of a business strategy.
Posted October 04, 2016
NoSQL and Hadoop—two foundations of the emerging agile data architecture—have been on the scene for several years now, and, industry observers say, adoption continues to accelerate—especially within mainstream enterprises that weren't necessarily at the cutting edge of technology in the past.
Posted October 04, 2016
At Strata + Hadoop World, Hortonworks showcased its technology solutions for streaming analytics, security, governance, and Apache Spark at scale.
Posted September 30, 2016
Data lakes are quickly transitioning from interesting idea to priority project. A recent study, "Data Lake Adoption and Maturity," from Unisphere Research showed that nearly half of respondents have an approved budget or have requested budget to launch a data lake project. What's driving this rapid rush to the lake?
Posted September 27, 2016
At Strata + Hadoop World, MapR Technologies announced support for microservices that leverage continuous analytics, automated actions, and rapid response to better impact business as it happens. The new capabilities in the MapR Platform range from microservices application monitoring and management to integrated support for agile microservices application development.
Posted September 27, 2016
Big Data 50 - Companies Driving Innovation
Posted September 14, 2016
Conventional wisdom insists that IT will migrate to the cloud entirely at some point. But practical experience shows that enterprises that have invested in legacy architecture that still has many years of life left in it are not likely to rip and replace, at potentially astronomical costs. Instead, implementing a Bimodal IT approach supported by SDDC on integrated systems will allow companies to address scalability needs with agility, while also ensuring the mission-critical functions of their legacy systems are not compromised.
Posted September 12, 2016
The elastic and distributed technologies that run modern applications require a new approach to operations — one that learns about your infrastructure and assists IT operators with maintenance and problem-solving. The interdependencies between new applications are creating chaos in existing systems and surfacing the operational challenges of modern systems. Solutions such as microservices architectures alleviate the scalability pains of centralized proprietary services, but at a tremendous cost in complexity.
Posted August 25, 2016
Getting to Know Hadoop and its Advantages
Posted August 25, 2016
Perhaps the biggest and most overlooked challenge is how to create accurate test data. You're implementing a new system in order to deal with a massive amount of data - perhaps because your relational database can't handle the volume - so it's vitally important to properly test the new system and ensure that it doesn't fall over as soon as the data floods in.
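One minimal way to approach this is to generate a large, repeatable stream of synthetic rows. The sketch below uses an invented schema; accurate test data should mirror the production schema and its statistical skew, not these placeholder fields.

```python
import random
import string

def synthetic_rows(n, seed=42):
    """Yield n synthetic customer rows for volume testing.

    The field names and distributions here are invented for illustration.
    The lognormal draw mimics the heavy skew of real-world spend values,
    and the fixed seed keeps every test run repeatable.
    """
    rng = random.Random(seed)
    for i in range(n):
        yield {
            "id": i,
            "name": "".join(rng.choices(string.ascii_lowercase, k=8)),
            "spend": round(rng.lognormvariate(3, 1), 2),
        }
```

Because the generator is lazy, it can feed millions of rows into a load test without holding them all in memory at once.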
Posted August 23, 2016
Informatica is releasing five new Informatica Cloud offerings in Amazon Web Services Marketplace (AWS Marketplace) to help organizations jumpstart data management projects in the cloud.
Posted August 12, 2016
Paxata is releasing new native, seamless push/pull connectivity options to and from Amazon Web Services (AWS), including the Amazon Redshift data warehouse and Amazon Simple Storage Service (Amazon S3).
Posted August 11, 2016
Nimbus Data is releasing a new all-flash platform for cloud, big data, virtualization, and massive digital content that will offer unprecedented scale and efficiency.
Posted August 09, 2016
1010data, Inc. is making enhancements to its Consumer Insights Platform (CIP) with its latest release. CIP 3.0 supports a wider range of business users by facilitating faster decision-making with a more user-friendly interface, additional reports, and enhanced interactivity built right into key features, according to the company.
Posted August 05, 2016
Overcoming Big Data Integration Challenges
Posted July 21, 2016
Monte Zweben, CEO and co-founder of Splice Machine, which was founded in 2012, talks with Big Data Quarterly about why the company has rolled out an open source Community Edition - and why it is doing so now.
Posted July 18, 2016
GridGain Systems, provider of enterprise-grade In-Memory Data Fabric solutions based on Apache Ignite, is releasing a new edition of its signature platform.
Posted July 05, 2016
RedPoint Global, a provider of data management and customer engagement software, has announced integration with Microsoft Azure HDInsight to support enhanced data management capabilities via Hadoop deployments on Microsoft Azure. RedPoint is a member of the Microsoft Partner Network, and the new integration evolved from its participation in the Microsoft Enterprise Cloud Alliance Program.
Posted June 30, 2016
The next major release of MarkLogic's enterprise NoSQL database platform is expected to be generally available by the end of this year. Gary Bloom, president and CEO of the company, recently reflected on the changing database market and how new features in MarkLogic 9 address evolving requirements for data management in a big data world. "For the first time in years, the industry is going through a generational shift of database technology - and it is a pretty material shift," observed Bloom.
Posted June 30, 2016
Hortonworks, Inc. unveiled new innovations at Hadoop Summit that will improve the Hortonworks Data Platform (HDP), allowing enterprises to accumulate, analyze, and act on data.
Posted June 29, 2016
Qubole unveiled a new feature to its Qubole Data Service (QDS) called auto-caching, a next-generation disk cache for cloud storage systems that works across different data engines.
Posted June 28, 2016
Hortonworks, Inc. is partnering with AtScale to resell AtScale's technology, providing users with the ability to query data from any business intelligence tool without any data movement.
Posted June 28, 2016
MapR Technologies is introducing a new initiative that will help support Hadoop deployments and increase user and administrator productivity.
Posted June 28, 2016
Pepperdata is unveiling a new tool that will evaluate and assess Hadoop clusters and provide visibility into current cluster conditions.
Posted June 27, 2016
At the 2016 Hadoop Summit in San Jose, Teradata announced the certification of multiple BI and visualization solutions on the Teradata Distribution of Presto.
Posted June 27, 2016
Talend has released the newest version of its Data Fabric, an integration platform for both developers and business users whether their applications are on-premises or in the cloud. The updated platform now features enterprise-grade support for data preparation, a solution that reduces the time required for collection and analysis of data.
Posted June 27, 2016
Big data with its increasing volume, velocity, and variety is causing organizations to see their data as a valuable commodity. But central to that value, says Amit Walia, chief product officer of Informatica, is the ability to bring all data together to get a single view. Here, Walia discusses the three key challenges organizations face and what he sees as the biggest disruptor on the horizon.
Posted June 23, 2016
Trifacta, a provider of data wrangling software, is deepening its technical integration with the Hortonworks Data Platform (HDP) and announcing the industry's first certification for Apache Atlas, a data governance and metadata framework for Hadoop.
Posted June 23, 2016
Hortonworks, Inc. is enhancing its Global Professional Services (GPS) program to support and enable Hortonworks Connected Data Platforms customers.
Posted June 16, 2016
Embarking on a Data Expedition for Data Governance
Posted June 09, 2016
Comparing Commercial Versus Open Source Software for Analytics
Posted June 08, 2016