Big Data Articles
Sepaton, Inc., a provider of backup and recovery platforms, today unveiled a new data protection solution called Sepaton VirtuoSO that is intended to address the challenges organizations face due to the exponential growth of both structured and unstructured information. According to the vendor, the scale-out NAS-based data protection solution provides five times the capacity of competing solutions at initial release, with up to 22 times that capacity planned for future releases.
Posted October 15, 2013
IBM announced new systems and solutions intended to help clients and managed service providers build private and hybrid clouds to get the most out of big data, social, and mobile workloads. These include PureSystems, Power Systems, Smarter Storage Systems, System x, and Technical Computing offerings that provide the flexibility clients need to quickly deploy clouds. "IBM is positioned to compete aggressively for public, private and hybrid cloud computing opportunities," said Tom Rosamilia, senior vice president of IBM Systems & Technology Group and Integrated Supply Chain.
Posted October 14, 2013
With the upcoming release of SQL Server 2014 (SQL2014), Microsoft is making advancements in the area of mission-critical performance. Microsoft wants to stake out this ground not only with performance enhancements in the relational engine, but also with features that support better data availability, security, and data integration. Here's what SQL2014 will do for you in those key areas.
Posted October 09, 2013
To augment its product and service offerings both within and outside of the cloud, TransLattice, a provider of a geographically distributed database management system for enterprise, cloud and hybrid environments, has acquired Red Bank, NJ-based StormDB, Inc. The StormDB database is based on PostgreSQL, and TransLattice expects to incorporate proprietary StormDB capabilities and elements of the open source Postgres-XC into the TransLattice Elastic Database.
Posted October 09, 2013
ScaleOut Software, a provider of in-memory data grids (IMDGs), announced the availability of ScaleOut hServer V2, which incorporates new technology to run Hadoop MapReduce on live data. The new platform gives users the ability to analyze live data using standard Hadoop MapReduce code, in memory and in parallel, without the need to install and manage the Hadoop software stack.
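By way of illustration, "standard Hadoop MapReduce code" means jobs written against the familiar org.apache.hadoop.mapreduce API, such as the classic word-count job sketched below. The sketch uses only generic Hadoop classes; how such a job is pointed at a ScaleOut in-memory grid rather than HDFS is specific to hServer and is not shown here.

```java
// A minimal, standard Hadoop MapReduce word-count job (generic Hadoop API, not
// ScaleOut-specific); shown only to illustrate what "standard MapReduce code" means.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();
        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);          // emit (word, 1)
                }
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            context.write(key, new IntWritable(sum));  // emit (word, total)
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

On a conventional cluster, a job like this is packaged as a jar and submitted with "hadoop jar wordcount.jar WordCount <input> <output>"; ScaleOut's value proposition is that the same mapper and reducer logic can be applied to live, in-memory data instead.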
Posted October 08, 2013
Logi Analytics, a data analytics company, has formed a partnership with Hortonworks, a contributor to and provider of Apache Hadoop technology, to help its customers better address the challenges they are facing as they collect and store more information than ever before. Hortonworks provides the Hortonworks Data Platform, an enterprise-grade Hadoop distribution. Logi Analytics provides web-based BI and analytic applications that can be integrated within applications, systems, and processes. As a result of the new partnership, Logi Analytics customers can connect directly to the Hortonworks Data Platform, allowing them to uncover new insights, distribute valuable information, and improve business decision making.
Posted October 04, 2013
EMC's CEO and chairman Joe Tucci gave a keynote at Oracle OpenWorld 2013 on the transition occurring in IT and the data center of the future. There are four key macro trends driving the transformation in IT, said Tucci. These tremendously disruptive and opportunistic trends include mobility, cloud computing, big data, and social networking. Jeremy Burton, EVP at EMC, cited a recent IOUG-Unisphere Research survey report which showed that the daily DBA activities most on the rise are systems monitoring, performance diagnosis, and managing backup and recovery. Oracle and EMC are integrating their technologies to allow customers to spend less time in the back office so they can devote more time to the front office dealing with more impactful business issues, said Burton.
Posted October 02, 2013
Oracle CEO Larry Ellison made three key announcements in his opening keynote at Oracle OpenWorld, the company's annual conference for customers and partners in San Francisco. Ellison unveiled the Oracle Database In-Memory Option to Oracle Database 12c which he said speeds up query processing by "orders of magnitude," the M6 Big Memory Machine, and the new Oracle Database Backup Logging Recovery Appliance. Explaining Oracle's goals with the new in-memory option, Ellison noted that in the past there have been row-format databases, and column-format databases that are intended to speed up query processing. "We had a better idea. What if we store data in both formats simultaneously?"
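To make the dual-format idea concrete, the option is expected to be switched on per table or partition rather than through application changes. The JDBC sketch below is a rough illustration only: the SALES table, connection details, and the assumption that the option is installed and licensed are all hypothetical, and the exact syntax should be confirmed against Oracle's documentation once the option ships.

```java
// Minimal sketch: enabling Oracle's announced in-memory column store on one table
// and running an analytic query over it. The SALES table, credentials, and the
// availability of the option on this instance are assumptions for illustration.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class InMemoryOptionSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/pdb1", "scott", "tiger");
             Statement stmt = conn.createStatement()) {

            // Mark the table for the in-memory column store; the row format on disk
            // is unchanged, and the columnar copy is maintained alongside it.
            stmt.execute("ALTER TABLE sales INMEMORY PRIORITY HIGH");

            // Analytic scan of the kind the columnar in-memory copy is meant to speed up.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT region, SUM(amount) FROM sales GROUP BY region")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " " + rs.getDouble(2));
                }
            }
        }
    }
}
```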
Posted October 02, 2013
Splunk, provider of a platform for real-time operational intelligence, has introduced Splunk Enterprise 6. Splunk's mission is to make machine data accessible, usable, and valuable to everyone, and, to further that objective, Splunk Enterprise 6 includes features aimed at eliminating the data divide between business and IT.
Posted October 01, 2013
Tokutek, a provider of big data performance solutions, has introduced TokuMX Enterprise Edition with hot backup. This new edition is designed to eliminate backup-related downtime for non-stop big data applications. TokuMX is an open source database and a performance engine for MongoDB. TokuMX delivers support for ACID transactions without any changes to an application, 20x performance improvements and, on the compression side, a 90% reduction in database size. When databases get extremely large, this is a significant money saver, explained Coco Jaenicke, vice president of marketing, Tokutek, in an interview.
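Because TokuMX is positioned as a drop-in replacement engine for MongoDB, existing driver code is supposed to run against it unchanged. The sketch below uses the standard MongoDB Java driver of the era; the host, database, and collection names are hypothetical, and the new hot backup capability is administered on the server side rather than through this client API.

```java
// Minimal sketch: ordinary MongoDB Java driver code, which TokuMX is claimed to
// accept unchanged (compression and transactional semantics are provided by the
// server-side engine). Host, database, and collection names are hypothetical.
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoClient;

public class TokuMxClientSketch {
    public static void main(String[] args) throws Exception {
        MongoClient client = new MongoClient("localhost", 27017); // TokuMX listens like mongod
        try {
            DB db = client.getDB("metrics");
            DBCollection events = db.getCollection("events");

            // Standard insert and query; no TokuMX-specific calls are required.
            events.insert(new BasicDBObject("device", "sensor-42").append("reading", 18.6));
            DBObject first = events.findOne(new BasicDBObject("device", "sensor-42"));
            System.out.println(first);
        } finally {
            client.close();
        }
    }
}
```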
Posted October 01, 2013
MemSQL announced that its distributed in-memory database can now provide JSON analytics, to deliver a consolidated view of structured and semi-structured data, including standard enterprise and social-media data. "This basically fuses SQL and semi-structured data into one system," said Eric Frenkiel, CEO, MemSQL, in an interview.
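MemSQL speaks the MySQL wire protocol, so the consolidated view can be reached from ordinary SQL tooling. The sketch below uses the stock MySQL JDBC driver to run one query spanning a relational column and a field inside a JSON column; the table definition and the "::" JSON field-access shorthand are assumptions based on MemSQL's JSON documentation and should be verified against the current docs.

```java
// Minimal sketch: querying structured and JSON (semi-structured) data side by side
// in MemSQL over its MySQL-compatible protocol. The "tweets" table, the JSON column,
// and the "::" field-access shorthand are assumptions for illustration.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MemSqlJsonSketch {
    public static void main(String[] args) throws Exception {
        // MemSQL is MySQL wire-compatible, so the stock MySQL JDBC driver is used here.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://memsql-host:3306/analytics", "app", "secret");
             Statement stmt = conn.createStatement()) {

            stmt.execute("CREATE TABLE IF NOT EXISTS tweets (" +
                         "  id BIGINT PRIMARY KEY," +
                         "  created DATETIME," +        // standard relational column
                         "  payload JSON" +             // semi-structured column
                         ")");

            // One query over both the relational column and a field inside the JSON payload.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT payload::$user_name AS user_name, COUNT(*) " +
                    "FROM tweets WHERE created > NOW() - INTERVAL 1 DAY " +
                    "GROUP BY payload::$user_name")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " " + rs.getLong(2));
                }
            }
        }
    }
}
```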
Posted September 30, 2013
Embarcadero Technologies, a provider of software solutions for application and database development, has introduced CONNECT, the company's new metadata governance platform. "There is a growing recognition that data governance is the pragmatic approach to the complexities of the enterprise data landscape," Henry Olson, director of product management, Embarcadero, explained in a recent interview. The two key ideas for CONNECT center around what Embarcadero calls "embracing the sprawl," said Olson.
Posted September 30, 2013
Revolution Analytics, which provides production-grade analytics software built upon the open source R statistics language, and Teradata, a provider of analytic data solutions, recently announced that as a result of a deepened partnership, Teradata is now offering in-database R Analytics that are fully parallel and scalable. David Smith, Revolution Analytics' vice president of marketing and community, talks about what's ahead for the companies' joint customers.
Posted September 30, 2013
Dataguise, a provider of data security intelligence and protection solutions, has secured $13 million in Series B funding. According to Dataguise, traditional approaches to securing Hadoop fail because they are too complex, expensive, and incapable of selectively protecting the data that matters in large and diverse environments. Dataguise's solution, DG for Hadoop, aims to provide an efficient and economical method of determining where and how to secure sensitive data in Hadoop. "We are helping people go from a sandbox environment to the production environment," Manmeet Singh, CEO, Dataguise, said in an interview.
Posted September 30, 2013
Syncsort, a software company specializing in high speed sorting products, data integration and backup software and services, has formed a technology partnership with Cloudera. The joint effort is aimed at providing an approach that customers can leverage to exploit critical mainframe data in Hadoop for big data analytics. "We view Cloudera as enabling a lot of the enterprise-level services that Hadoop needs," said Jorge Lopez, director of product marketing, Syncsort, in an interview. "Together, we can really bridge the gap between the mainframe and Hadoop."
Posted September 30, 2013
There is no limit to the potential business-building applications for big data, springing from its capability to provide new, expansive insights never before available to business leaders. However, the new forms of data, along with the speed with which they need to be processed, require significant work on the back end, which many organizations may not yet be ready to tackle. IT leaders agree that to make the most of big data, they will need to redouble efforts to consolidate data environments, bring in new solutions, and revisit data retention policies. These are the conclusions of a new survey of 322 data managers and professionals who are members of the Independent Oracle Users Group (IOUG). The survey was underwritten by Oracle Corp. and conducted by Unisphere Research, a division of Information Today, Inc.
Posted September 26, 2013
RainStor, a provider of an enterprise database for managing and analyzing all historical data, has introduced RainStor FastForward, a new product that enables customers to reinstate data from Teradata tape archives (also known as BAR, for Backup, Archive and Restore) and move it to RainStor for query. The new RainStor FastForward product resolves a pressing challenge for Teradata customers that need to archive their Teradata warehouse data to offline tape, which can make it difficult to access and query that data when business and regulatory users require it, Deirdre Mahon, vice president of marketing, RainStor, explained in an interview.
Posted September 26, 2013
SAP AG has introduced a new offering to speed adoption of the SAP HANA platform for customers running SAP Business Suite software. The offering automates the migration process through a combination of preconfigured software, implementation services, standardized content, and end-user enablement. Delivered as a complete package with a modular approach, the SAP Rapid Deployment Solution allows customers to take advantage of a simple, pre-integrated framework that helps them unlock value quickly, Elvira Wallis, senior vice president of Analytics, Database Technology and Mobile Solutions Packages at SAP, tells 5 Minute Briefing.
Posted September 25, 2013
Attunity Ltd., a provider of information availability software solutions, has released a new version of its data replication software intended to address requirements for big data analytics, business intelligence, business continuity and disaster recovery initiatives. Addressing expanding use cases for the solution, Attunity Replicate 3.0 is engineered to provide secure data transfer over long distances, such as wide area networks (WANs), the cloud, and satellite connections, said Lawrence Schwartz, vice president of marketing at Attunity, in an interview.
Posted September 25, 2013
Glassbeam, a big data applications company specializing in multi-structured machine data analytics, has secured $3 million in additional equity funding. Glassbeam's mission is to help companies and enterprises across a variety of markets understand machine data by providing a platform and set of key values to reduce support costs, increase service revenue and improve products. "It's an exciting market. It is all about how to ride this new wave of connected devices and pings across multiple verticals," Puneet Pandit, founder and CEO, Glassbeam, said in an interview.
Posted September 19, 2013
As part of a multi-year strategic initiative with Oracle, Accenture has created a set of solutions incorporating Oracle Engineered Systems into its data center transformation consulting and outsourcing services.
Posted September 18, 2013
Syncsort, a provider of big data integration solutions, has announced the availability of MFX ZPCopy, a new software product that can offload mainframe copy processing to zIIP engines, and can be licensed separately as an add-on to Syncsort MFX. After examining mainframe processing at several customer sites, Syncsort realized that copy-related processing accounts for hundreds of hours of CPU processing time annually and contributes to batch window bottlenecks, inflating software costs and making it more difficult to meet SLAs, said Jorge Lopez, director of product marketing at Syncsort, in an interview.
Posted September 17, 2013
There may be no more commonly used term in today's IT conversations than "big data." There also may be no more commonly misused term. Here's a look at the truth behind the five most common big data myths, including the misguided but almost universally accepted notion that big data applies only to large organizations dealing with great volumes of data.
Posted September 17, 2013
IBM has introduced an array of new software, system, and services offerings to help organizations manage big data projects. The technology is aimed at helping customers increase confidence in their data, gain business value from it more quickly, and sharpen their skill sets to address big data challenges. "We have to hold that data to the same standards, manage it, and govern it appropriately for the enterprise. You can't drop those standards because it is unstructured data," said Nancy Kopp-Hensley, a director in product marketing and strategy for Big Data Systems at IBM, in an interview.
Posted September 16, 2013
Oracle holds an enviable position in the IT marketplace with a wide array of database systems, development tools, languages, platforms, enterprise applications, and servers. Riding the coattails of this industry giant is a healthy and far-flung ecosystem of software developers, integrators, consultants, and OEMs. These are the partners that will help make or break Oracle's struggle with new forces disrupting the very foundations of IT. And lately, Oracle—long known for its own brand of xenophobia and disdain for direct competitors—has been making a lot of waves by forging new alliances with old foes. This is opening up potentially lucrative new frontiers for business partners at all levels.
Posted September 16, 2013
Pentaho has launched Pentaho Business Analytics 5.0. The new release represents a redesign of the company's data integration and analytics platform, and provides analytics for big data-driven businesses supported by more than 250 new features and improvements. Pentaho Business Analytics 5.0 is the culmination of years of work on development and also incorporates the result of usability studies on the Pentaho Business Analytics interface. "This is really an evolution of our platform as a whole. It is a significant release with a simplified analytics experience," Donna Prlich, Pentaho senior director of product and solution marketing, tells 5 Minute Briefing.
Posted September 12, 2013
If you look at what is really going on in the big data space, it's all about inexpensive open source solutions that are facilitating the modernization of data centers and data warehouses, and at the center of this universe is Hadoop. In the evolution of the big data market, open source is playing a seminal role as the "disruptive technology" challenging the status quo. Additionally, organizations large and small are leveraging these solutions, often based on inexpensive hardware and memory platforms, in the cloud or on premises.
Posted September 11, 2013
Oracle has introduced its latest ZFS Storage Appliances, the ZS3 Series, aimed at enabling customers to improve operational efficiencies, reduce data center costs, and increase business application performance.
Posted September 11, 2013
Cloudera has announced the general availability of Cloudera Search, a search engine for interactive exploration of data stored in the Hadoop Distributed File System (HDFS) and Apache HBase. In addition, the accompanying add-on RTS (Real-time Search) subscription provides technical support, legal indemnification and continual influence over the development of the open source project.
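Cloudera Search is built on Apache Solr, so interactive exploration can be done with standard Solr clients as well as from a browser. The SolrJ sketch below assumes a hypothetical "logs" collection indexed from data in HDFS and hypothetical field names; only the usual Solr query interfaces are used.

```java
// Minimal sketch: querying a Cloudera Search (Apache Solr) collection with the
// standard SolrJ client. The host, collection name ("logs"), and field names are
// hypothetical; Cloudera Search exposes the usual Solr HTTP/SolrJ interfaces.
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;

public class ClouderaSearchSketch {
    public static void main(String[] args) throws Exception {
        HttpSolrServer solr = new HttpSolrServer("http://search-host:8983/solr/logs");

        // An ad hoc, interactive query over indexed log records.
        SolrQuery query = new SolrQuery("level:ERROR AND service:hbase");
        query.setRows(10);
        query.addSort("timestamp", SolrQuery.ORDER.desc);

        QueryResponse response = solr.query(query);
        for (SolrDocument doc : response.getResults()) {
            System.out.println(doc.getFieldValue("timestamp") + " " + doc.getFieldValue("message"));
        }
        solr.shutdown();
    }
}
```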
Posted September 10, 2013
To encourage partners to build, market, and sell software applications on top of technology platforms from SAP, the company has introduced the new SAP PartnerEdge program for Application Development. The new partnering model, a component of the SAP PartnerEdge program, is intended to help partners create and monetize innovative, specific applications in the mobile, cloud, database, or high-performance in-memory areas. Participating partners will also be able to get go-to-market support, including use of the SAP partner logo, free application reviews, and the ability to leverage SAP Store, the online channel from SAP for enterprise applications and services.
Posted August 31, 2013
Building on the momentum of SAP's OEM (original equipment manufacturer) partner base in North America, three OEM partners - AlertEnterprise, Clockwork, and PROS - will offer their customers access to solutions with SAP HANA, a real-time in-memory technology platform. SAP says it has quadrupled the total number of OEM partners licensing SAP HANA in the first 6 months of 2013 compared to the last 6 months of 2012, demonstrating strong partner adoption in North America and globally.
Posted August 31, 2013
Revolution Analytics, a commercial provider of software, services and support for the open source R project, plans to offer increased support for Hadoop as a platform for big data analytics with Cloudera CDH3 and CDH4 in its upcoming release of Revolution R Enterprise 7.0.
Posted August 27, 2013
Progress Software announced availability of new data connectivity and application capabilities as part of its Progress Pacific application platform-as-a-service (aPaaS) for building and managing business applications on any cloud, mobile or social platform. "Pacific is a platform running in the cloud that is targeted at small and medium-size businesses, ISVs and departmental IT," John Goodson, chief product officer at Progress Software, explained in an interview. Instead of requiring a highly trained IT staff to build applications, Goodson says, the platform provides a visual design paradigm that allows users with limited skills to build powerful applications that can quickly connect to any data sources.
Posted August 21, 2013
Pentaho Corporation, a provider of big data analytics and data integration software, has formed an alliance with Splunk Inc. to provide a big data analytics solution that enables business users to analyze machine data that is generated by websites, applications, servers, storage, network, mobile and other devices, including system sensors. As a result of the new alliance, Splunk customers will have the ability to use Pentaho's platform to do two things, said Eddie White, executive vice president, business development at Pentaho, in an interview.
Posted August 13, 2013
NuoDB has announced the last release of its current product version and a technology preview of some upcoming second-generation features available later in 2013. The preview is contained in the free download of the new NuoDB Starlings Release 1.2. The NewSQL approach is gaining greater acceptance, said Barry Morris, founder and CEO of NuoDB, in an interview. "What people are saying back to us is that they are getting all of the features of NoSQL without throwing SQL or transactions away. And that concept is becoming the popular notion of what NewSQL is."
Posted August 13, 2013
Oracle is advancing the role of Java for IoT (Internet of Things) with the latest releases of its Oracle Java Embedded product portfolio - Oracle Java ME Embedded 3.3 and Oracle Java ME Software Development Kit (SDK) 3.3, a complete client Java runtime and toolkit optimized for microcontrollers and other resource-constrained devices. Oracle is also introducing the Oracle Java Platform Integrator program to provide partners with the ability to customize Oracle Java ME Embedded products to reach different device types and market segments. "We see IoT as the next big wave that will hit the industry," Oracle's Peter Utzschneider, vice president of product management, explained during a recent interview.
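Applications for Oracle Java ME Embedded are small, typically headless programs that reuse the standard MIDlet lifecycle. The sketch below shows that lifecycle with a placeholder polling task; real sensor or GPIO access would go through the platform's device access APIs, which are not shown, and the class name and timing values are purely illustrative.

```java
// Minimal sketch: a small headless application for a Java ME Embedded device,
// built on the standard MIDlet lifecycle from javax.microedition.midlet.
// The periodic "sensor poll" is a placeholder; real device I/O would use the
// platform's device access APIs (not shown here).
import java.util.Timer;
import java.util.TimerTask;
import javax.microedition.midlet.MIDlet;

public class SensorPollerIMlet extends MIDlet {
    private Timer timer;

    public void startApp() {
        timer = new Timer();
        timer.schedule(new TimerTask() {
            public void run() {
                // Placeholder: read a sensor or GPIO pin and report the value.
                System.out.println("polling sensor...");
            }
        }, 0, 60000); // poll every 60 seconds
    }

    public void pauseApp() {
        // Headless devices rarely pause, but the lifecycle hook must exist.
    }

    public void destroyApp(boolean unconditional) {
        if (timer != null) {
            timer.cancel();
        }
    }
}
```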
Posted August 07, 2013
More "things" are now connected to the internet than people, a phenomenon dubbed The Internet of Things. Fueled by machine-to-machine (M2M) data, the Internet of Things promises to make our lives easier and better, from more efficient energy delivery and consumption to mobile health innovations where doctors can monitor patients from afar. However, the resulting tidal wave of machine-generated data streaming in from smart devices, sensors, monitors, meters, etc., is testing the capabilities of traditional database technologies. They simply can't keep up; or when they're challenged to scale, are cost-prohibitive.
Posted August 07, 2013
Protegrity USA, Inc., a provider of end-to-end data security solutions, has announced general availability of release 6.5 of its Data Security Platform. This latest release expands the Protegrity Big Data Protector capabilities to include support and certification on many Apache Hadoop distributions. In addition, the new File Protector Gateway Server provides another option for fine-grain data protection of sensitive data before it enters Hadoop or other data stores.
Posted August 06, 2013
Datawatch Corporation, a provider of information optimization solutions, introduced new server and management automation capabilities for its Datawatch Monarch Professional, Datawatch Data Pump and Datawatch Enterprise Server products. Datawatch says the new releases of its flagship information optimization software will enable businesses to better secure, simplify and accelerate their big data and business intelligence applications, and also extend the technology to more users.
Posted August 06, 2013
Syncsort, a provider of big data integration solutions, is expanding its partner program to recruit regional systems integrators (RSIs) that have big data practices. The company is looking for RSIs that have specialized systems integration solutions and services expertise that will add value for customers using DMX-h, DMX and MFX ETL and Sort for a variety of use cases to sort, integrate and process big data in support of critical business intelligence and analytics.
Posted August 06, 2013