Big Data Articles
To support enterprises' plans to implement SAP Business Suite 4 SAP HANA (SAP S/4HANA), BackOffice Associates, a provider of information governance and data modernization solutions, announced that its data management solutions are available to help enterprises orchestrate the data migration and modernization process for SAP S/4HANA implementations.
Posted May 20, 2015
Rosslyn Analytics, a provider of big data cloud technology, has announced it is one of the first to offer companies analytics as a service on Azure. The big data cloud analytics platform, powered by Azure, provides self-service management from source to analytics and enables business and IT users to interact with, change, and analyze data using a combination of self-service data integration, cleansing, and enrichment tools along with machine learning and visualization technologies.
Posted May 20, 2015
Oracle provides informational resources, including educational events, webcasts, and white papers.
Posted May 20, 2015
IOUG offers webcasts on a range of topics including security strategies, cloud computing, data sharing and many more.
Posted May 20, 2015
Oracle is collaborating with Mirantis to enable Oracle Solaris and Mirantis OpenStack users to accelerate application and database provisioning in private cloud environments via Murano, the application catalog project in the OpenStack ecosystem.
Posted May 20, 2015
Oracle is shipping a new big data product called Oracle Big Data Spatial and Graph. Spatial and graph analytics has been available as an option for Oracle Database for more than 10 years, and with this introduction the company is bringing spatial and graph analytics to Hadoop and NoSQL.
Posted May 20, 2015
MapR Technologies, Inc., a provider of a distribution for Apache Hadoop, is including Apache Drill 1.0 in the MapR Distribution.
Posted May 19, 2015
DataStax, which provides an enterprise distribution of Apache Cassandra, has introduced DataStax Enterprise 4.7 (DSE 4.7), which, the company says, is purpose-built for the stringent performance and availability demands of web, mobile, and Internet of Things (IoT) applications. The new release includes advancements to integrated enterprise search, analytics, in-memory computing, and database management and monitoring to address the increasing shift toward mixed workload deployments.
Posted May 19, 2015
The shortage of skilled talent and data scientists in Western Europe and the U.S. has triggered the question of whether to outsource analytical activities. This need is further amplified by competitive pressure to reduce time to market and lower costs.
Posted May 19, 2015
As the excitement and opportunity provided by big data tools develop, many organizations find their big data initiatives originating outside existing data management policies. As a result, many concepts of formal data governance are either intentionally or unintentionally omitted as these enterprises race to ingest huge new data streams at a feverish pace in the hope of increased insight and new analytic value.
Posted May 19, 2015
Similar to the dot-com revolution, the Internet of Things is the culmination of radical advances in four core technology pillars.
Posted May 19, 2015
Google white papers have inspired many great open source projects. What has been missing until now, however, has been a way of bringing these technologies together such that any data-centric organization can benefit from the capabilities of each technology across its entire data center, and in new ways not documented by any single white paper. This is called the "Zeta Architecture."
Posted May 19, 2015
Business pressures, including cost reduction, scalability, and "just-in-time" application software implementation, are just some of the requirements prompting businesses to "cloudify" at least some aspect of their IT infrastructure.
Posted May 19, 2015
Data-driven companies continue to explore data management technologies that better unify operational, analytical, and other disparate or siloed data in a way that offers tangible business value and data management relief.
Posted May 19, 2015
The demand for effective data management is intensifying. At the same time, the database market has expanded into a wide array of solutions—from traditional relational database management systems to alternative databases such as NoSQL, NewSQL, cloud, and in-memory offerings.
Posted May 19, 2015
Data preparation is gaining considerable visibility as a distinct aspect of data management and analytics work.
Posted May 19, 2015
Just when you thought NoSQL meant the end of SQL, think again, and realize why you need to hold on to your relational database administrator like it was 1999. NoSQL has proven to be a resilient next-generation database technology for increasingly common internet-era specialized workloads. Now approaching a decade after its arrival on the scene, NoSQL is moving beyond architectural marvels to practical tools in the software development toolkit and, in that process, unveiling tried-and-true capabilities formerly known to be the scalpels of the enterprise relational database. Let's go back to the future and take a look at how the DBA is becoming as relevant as ever while NoSQL evolves for the enterprise.
Posted May 19, 2015
RedPoint Global was founded in 2006 by Dale Renner, Lewis Clemmens, and George Corugedo, who previously had worked together at Accenture. Based in Wellesley, Mass., RedPoint collaborates with clients around the world in 11 different verticals. "We have always been very focused on the data, and recognize that a lot of business problems live and die by the quality of the data," says Corugedo.
Posted May 19, 2015
Hadoop is contributing to the success of data analytics. Anand Rai, IT manager at Verizon Wireless, examined the differences between traditional and big data approaches at Data Summit 2015 in a session titled "Analytics: Traditional Versus Big Data." The presentation, which was part of the IOUG track moderated by Alexis Bauer Kolak, education manager at the IOUG, showed how big data technologies are helping data discovery and improving the transformation of information and knowledge into wisdom.
Posted May 14, 2015
At Data Summit 2015 in New York City, Tony Shan, chief architect, Wipro, gave a talk on the key components of a successful big data methodology and shared lessons learned from real world big data implementations. According to Shan, there is an 8-step process for a big data framework with specific techniques and methods.
Posted May 14, 2015
The data lake is one of the hottest topics in the data industry today. It is a massive storage reservoir that allows data to be stored in its rawest forms. Hadoop Day at Data Summit 2015 concluded with a panel on everything data lake featuring James Casaletto, solutions architect for MapR, Joe Caserta, president and founder of Caserta Concepts, and George Corugedo, CTO with RedPoint Global Inc.
Posted May 14, 2015
With the influx of big data solutions and technologies comes a bevy of new problems, according to Data Summit 2015 panelists Miles Kehoe, search evangelist at Avalon Consulting, and Anne Buff, business solutions manager for SAS best practices at the SAS Institute. Kehoe and Buff opened the second day of Data Summit with a keynote discussion focusing on resolving data conundrums.
Posted May 14, 2015
To transform data into value, IT must move from thinking about what it does to data, and instead focus on business outcomes and what can be done with the data to advance the business, according to Edd Dumbill, vice president, strategy, Silicon Valley Data Science, who gave the welcome keynote at Data Summit 2015.
Posted May 14, 2015
If used correctly, machine data can provide a company a significant advantage in understanding user and machine behavior, fighting cybersecurity risks and fraud, and monitoring service levels and customer behavior. In his talk at Data Summit 2015, Dejan Deklich, vice president, engineering platform and cloud at Splunk, discussed issues around machine data analysis and showcased some prominent use cases.
Posted May 13, 2015
In order to break down barriers in creating and storing data, understanding the modern data architecture is key. That was the focus of the Data Summit 2015 presentation by Mike Lamble, CEO at Clarity Solution Group, and Ron Huizenga, product manager at Embarcadero Technologies.
Posted May 12, 2015
Capgemini is extending its long-standing strategic partnership with SAP, allowing Capgemini to act as a single point of contact for customers globally, and delivering SAP products and support services through one consolidated framework. By signing a global value-added reseller (VAR) agreement with SAP, Capgemini is among a select group of global SAP partners that are part of the global program, which has specific entry requirements that include global reach, reseller capabilities and revenue targets.
Posted May 12, 2015
Splice Machine is partnering with Talend to enable customers to simplify data integration and streamline data workflows on Hadoop. Through this partnership, organizations building operational data lakes with Splice Machine can leverage Talend's data integration technology along with its data quality capabilities.
Posted May 12, 2015
Pentaho users will now be able to use Apache Spark within Pentaho thanks to a new native integration solution that will enable the orchestration of all Spark jobs. The integration with Pentaho Data Integration (PDI), an effort initiated by Pentaho Labs, will enable customers to increase productivity, reduce maintenance costs, and dramatically lower the specialized skill sets required as Spark is incorporated into big data projects.
Posted May 12, 2015
HP has made multiple contributions to the OpenStack Kilo release, including new converged storage management automation and new flash storage technologies to support flexible, enterprise-class clouds. HP's storage contributions to the OpenStack Kilo release focus on two strategic goals.
Posted May 11, 2015
Teradata has made enhancements to the Teradata Database's hybrid row and column capabilities to provide quicker access to data stored on columnar tables and drive faster query performance. Other relational database management systems store data tables in rows or columns, and each method offers benefits, depending on the application and type of data. However, the two methods have traditionally been mutually exclusive. Teradata's new hybrid row and column capabilities combine the best of both worlds.
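A toy sketch (not Teradata's implementation) illustrates why the row-versus-column distinction matters: a row store keeps each record's fields together, which suits transactional lookups, while a column store keeps each field's values together, so an analytic query can scan only the columns it touches.

```python
# Toy illustration of row-oriented vs. columnar table layout.
# (Illustrative only -- not how Teradata Database stores data internally.)

rows = [  # row store: one tuple per record
    (1, "north", 100.0),
    (2, "south", 250.0),
    (3, "north", 175.0),
]

columns = {  # the same table in columnar layout: one array per field
    "id":     [1, 2, 3],
    "region": ["north", "south", "north"],
    "amount": [100.0, 250.0, 175.0],
}

def total_amount_rowwise(rows):
    # Must visit every full record even though only one field is needed.
    return sum(r[2] for r in rows)

def total_amount_columnar(columns):
    # Scans a single contiguous array -- the advantage of columnar storage
    # for analytic aggregates.
    return sum(columns["amount"])

assert total_amount_rowwise(rows) == total_amount_columnar(columns) == 525.0
print(total_amount_columnar(columns))  # 525.0
```

A hybrid system lets each table (or partition) use whichever layout fits its workload, rather than forcing one choice database-wide.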
Posted May 08, 2015
Cloudera is now offering support for Capgemini's new reference architecture for the SAP HANA platform and Cloudera Enterprise. "By bringing the power of Cloudera's enterprise data hub offering to the ecosystem in support of SAP HANA, we can enable Capgemini's clients to expand the amount of data they have within their environment in a cost-efficient manner," said Tim Stevens, vice president of corporate and business development at Cloudera.
Posted May 08, 2015
The certification enables Nimble Storage to participate in SAP's program for SAP HANA tailored data center integration using its certified solutions. Through participation in the program, customers can leverage their existing hardware and infrastructure components for their SAP HANA-based environments, providing further choice for organizations even when working in heterogeneous environments.
Posted May 07, 2015
Software AG has made updates to its Terracotta In-Memory Data Management platform. New improvements to Terracotta Open Source Kit 4.3 include distributed storage and off-heap storage. The platform is used to boost performance and scalability and to build real-time applications. Additionally, Terracotta helps developers leverage in-memory storage for current and emerging data workloads.
Posted May 07, 2015
Tableau's cloud analytics solution, Tableau Online, is being upgraded to version 9.0. The new release enables faster performance, and provides additional live database connection support, single sign-on, and other new features designed to help users do more with their data in the cloud. The update brings a complete redesign of Tableau Online to deliver a faster, more scalable, resilient, and extensible platform, with capabilities such as parallel queries, query fusion, vectorization, and smarter query caches that will make Tableau Online as much as 10 times faster.
Posted May 07, 2015
When databases are built from a well-designed data model, the resulting structures provide increased value to the organization. The value derived from the data model exhibits itself in the form of minimized redundancy, maximized data integrity, increased stability, better data sharing, increased consistency, more timely access to data, and better usability.
Posted May 06, 2015
Pivotal has made updates to its big data suite that include upgrades to the Pivotal HD enterprise-grade Apache Hadoop distribution, which is now based on the Open Data Platform core, and performance improvements for Pivotal Greenplum Database.
Posted May 05, 2015
The Spring 2015 release of the SnapLogic Elastic Integration Platform extends the platform's cloud and big data integration capabilities to the Internet of Things (IoT) with support for Message Queuing Telemetry Transport (MQTT), a lightweight machine-to-machine connectivity protocol.
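Part of what makes MQTT lightweight for machine-to-machine links is its compact wire format. As a sketch (following the MQTT 3.1.1 specification, not SnapLogic's code), the "Remaining Length" field of every MQTT packet header uses a variable-length encoding: 7 data bits per byte, with the high bit set when more bytes follow.

```python
# Sketch of MQTT's variable-length "Remaining Length" encoding
# (per the MQTT 3.1.1 spec; illustrative, not SnapLogic's implementation).

def encode_remaining_length(n: int) -> bytes:
    if not 0 <= n <= 268_435_455:  # spec maximum: 4 encoded bytes
        raise ValueError("out of range for MQTT Remaining Length")
    out = bytearray()
    while True:
        byte, n = n % 128, n // 128
        if n > 0:
            byte |= 0x80  # continuation bit: more bytes follow
        out.append(byte)
        if n == 0:
            return bytes(out)

def decode_remaining_length(data: bytes) -> int:
    value, multiplier = 0, 1
    for byte in data:
        value += (byte & 0x7F) * multiplier
        if not byte & 0x80:
            return value
        multiplier *= 128
    raise ValueError("truncated Remaining Length")

assert encode_remaining_length(321) == b"\xc1\x02"  # worked example from the spec
assert decode_remaining_length(b"\xc1\x02") == 321
```

Small payloads thus carry only one length byte of overhead, which is why the protocol suits constrained IoT devices and networks.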
Posted May 05, 2015
Deep Information Sciences has closed $8 million in Series A funding. The round brings the total invested in Deep to $18 million. The funding will assist in the growth of the Deep Engine, which breaks down the performance, speed, and scale limitations of databases to help businesses achieve new insights and opportunities from big data.
Posted May 05, 2015
Dell is partnering with Datawatch Corporation to continue growing its analytics business by integrating Datawatch's interactive visualization and dashboarding capabilities directly into its Statistica advanced analytics platform.
Posted April 30, 2015
CA Workload Automation Advanced Integration 1.0 for SAP Business Warehouse has received SAP certification. Specifically, the SAP Integration and Certification Center has certified that CA Workload Automation Advanced Integration 1.0 integrates with SAP Business Warehouse to provide a unified view for jobs running in both SAP and non-SAP applications.
Posted April 30, 2015
BackOffice Associates' HiT Software division, a provider of data replication and change data capture solutions for heterogeneous database environments, has announced the release of version 8.5 of its flagship product DBMoto.
Posted April 29, 2015
Splice Machine, a provider of a Hadoop RDBMS, announced that it is partnering with mrc (michaels, ross & cole ltd) to allow Splice Machine's Hadoop RDBMS to be certified and integrated with mrc's m-Power platform. "Our partnership with mrc gives businesses a solution that can speed real-time application deployment on Hadoop with the staff and tools they currently have, while also offering affordable scale-out on commodity hardware for future growth," said Monte Zweben, co-founder and CEO, Splice Machine.
Posted April 28, 2015
Embarcadero Technologies, a provider of software solutions for application and database development, has unveiled the new XE7 version of ER/Studio, its flagship data architecture suite.
Posted April 28, 2015
ProfitBricks, a provider of IaaS cloud infrastructure, has announced the release of a Node.js SDK and an SDK for Ruby, written against its recently launched REST API.
Posted April 27, 2015
Cloud technology was a dominant focus at COLLABORATE 15, which took place earlier this month, according to Melissa English, president of the Oracle Applications Users Group (OAUG). "What's on top of everybody's mind is cloud strategy," English noted.
Posted April 27, 2015
Predixion Software, a developer of cloud-based predictive analytics (PA) software, announced that Software AG will lead the company's series D funding round. The company says that this fourth round of funding, which includes participation from existing financial and strategic investors, including GE Software Ventures, will support Predixion's move into the Internet of Things (IoT) analytics market.
Posted April 27, 2015
Pivotal HAWQ is now available on the Hortonworks Data Platform (HDP), enabling the benefits of SQL on Hadoop to be leveraged by enterprises that are investing in HDP. This marks the first time that the features and capabilities of Pivotal HAWQ have been made available outside of Pivotal. The availability aligns with a common Open Data Platform (ODP) Core that allows users to leverage the best-of-breed technology across providers.
Posted April 27, 2015
The future will flourish with machines. We've been told this in pop culture for decades, from the helpful robots of the Jetsons, to the infamous Skynet of the Terminator movies, to the omniscient "computer" of Star Trek. Smart, connected devices will be ubiquitous and it's up to us, the humans, to decide what's next. But the Internet of Things (IoT) is about more than devices and data.
Posted April 23, 2015
SUSE and Veristorm are partnering to provide certified high-performance Hadoop solutions that run directly on Linux on IBM z Systems, IBM Power Systems, and x86-64. Customers with IBM z Systems can team SUSE Linux Enterprise Server for System z with Veristorm zDoop, a commercial distribution of Hadoop supported on mainframes.
Posted April 23, 2015