Data Integration

Traditional approaches to combining disparate data types into a cohesive, unified, organized view rely on manual coding and scripting. The need for Real-Time Business Intelligence and the ability to leverage a wider variety of data sources are driving companies to embrace new ways to achieve Data Integration, including Data Virtualization, Master Data Management, and Integration Automation.



Data Integration Articles

In what is being hailed as the biggest tech merger ever, Dell Inc. and EMC Corp. today formally announced they have signed a definitive agreement under which Dell will acquire EMC. The total transaction is valued at $67 billion. The deal is expected to close in the second or third quarter of Dell's fiscal year, which ends February 3, 2017 (i.e., between May and October 2016). The industry is going through a "tremendous transformation," with the old style of IT being "pretty quickly disrupted," yet this rapid change is also presenting "incredibly rich" opportunities, said Joe Tucci, chairman and chief executive officer of EMC, during a conference call with media and industry analysts.

Posted October 12, 2015

MapR Technologies has added native JSON support to the MapR-DB NoSQL database. The in-Hadoop document database will allow developers to quickly deliver scalable applications that also leverage continuous analytics on real-time data. A developer preview of MapR-DB with sample code is available for download, and the new capabilities are expected to reach general availability in Q4 2015.

Posted October 07, 2015

Ever since Linux became a viable server operating system, organizations have been looking to all kinds of open source software (OSS) to save on license and maintenance costs and to enjoy the benefits of an open platform that invites innovation. If you're considering MySQL or another open source DBMS, whether as your primary database or to operate alongside existing commercial systems such as Oracle or Microsoft SQL Server, here are seven things to keep in mind.

Posted October 07, 2015

The Agile methodology is great for getting turgid development teams to start working faster and more coherently. With Agile, which focuses on more rapid, incremental deliverables and cross-departmental collaboration, the bureaucratic plaque is flushed from the information technology groups' arteries. But there is a dark side to Agile approaches.

Posted October 07, 2015

Prior to SQL Server 2016, currently in CTP, your main method for encrypting a SQL Server application was to use a feature called Transparent Data Encryption. TDE provides strong encryption, but with some shortcomings. First, you have to encrypt an entire database. No granularity is offered at a lower level, such as encrypting specific tables or certain data within a table. Second, TDE encrypts only data at rest, in files. Data in memory or in flight between the application and the server is unencrypted. Enter Always Encrypted.
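
As a rough illustration of the shift, the sketch below is a minimal, hypothetical example of declaring an Always Encrypted column from client code. It assumes pyodbc, a Microsoft ODBC driver for SQL Server that supports the ColumnEncryption connection-string keyword, and a pre-existing column encryption key; the server, database, table, and key names are placeholders, not anything described in this article.

# A minimal sketch, not the article's example: Always Encrypted declares encryption
# per column, and the client-side driver encrypts/decrypts values transparently.
# Assumes a column encryption key named CEK1 already exists in the target database.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"              # hypothetical driver/server names
    "Server=myserver;Database=mydb;Trusted_Connection=yes;"
    "ColumnEncryption=Enabled;"                             # enables transparent encrypt/decrypt
)
cur = conn.cursor()

# Unlike TDE, granularity is per column: only SSN is encrypted here, not the whole database.
cur.execute("""
    CREATE TABLE dbo.Patients (
        PatientID int IDENTITY PRIMARY KEY,
        SSN char(11) COLLATE Latin1_General_BIN2
            ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK1,
                            ENCRYPTION_TYPE = DETERMINISTIC,
                            ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256')
    )
""")
conn.commit()

With ColumnEncryption enabled, parameterized inserts and queries against the SSN column are encrypted by the driver before they leave the client and decrypted only on the way back, so values remain protected in server memory and on the wire as well as at rest.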

Posted October 07, 2015

Too little emphasis overall is placed on the integrity and recoverability of the data—and too much is placed on performance. Yes, performance is probably the most visible aspect of database systems, at least from the perspective of the end user. But the underlying assumption of the end user is always that they want to access accurate and, usually, up-to-date data. But what good does it do to quickly access the wrong data? Anybody can provide rapid access to the wrong data!

Posted October 07, 2015

There's unrelenting pressure on businesses to compete on analytics and to be able to anticipate customer needs and trends ahead of the curve. Enterprises are looking to expand BI and analytics capabilities as far and wide as technologies and budgets will allow them to go. As a result, the continuing advance of analytic capabilities across the enterprise has reached a "tipping point."

Posted October 07, 2015

IT suppliers and data managers are experiencing a major pain point: managing logged data efficiently. The availability of NoSQL open source software has enabled enterprises to collect large volumes of data from different sources, and software companies have implemented "call back home" features that allow their software to send information to data collection centers under various parameters, creating additional runtime configurations and data traffic. And as the Internet of Things and a "connected everything" approach to business become increasingly popular, more and more data will flow in and out of data management systems, leaving IT managers muddled with millions of pieces of data they must properly manage and store.

Posted October 07, 2015

Magnitude Software, a provider of enterprise information management (EIM) software, has released new product versions designed to improve every component of the Noetix operational reporting solution for Oracle E-Business Suite. Now available, NoetixViews 6.5 for Oracle E-Business Suite features incremental regeneration for global views, as well as additional enhancements. Before incremental regeneration, the only option for implementing NoetixViews Workbench customizations was a full regeneration of the views. Incremental regeneration processes only the NoetixViews Workbench changes made since the last full or incremental regeneration, reducing the wait time to minutes; end users benefit from quicker access to data and faster time to business decisions, while IT requires fewer resources and less planning.

Posted October 07, 2015

Oracle has announced Oracle SOA Cloud Service and Oracle API Manager Cloud Service, new additions to the Oracle Cloud Platform for Integration. The two cloud services join Oracle's other iPaaS services, including Oracle Integration Cloud, which was announced in June. The two new releases are part of Oracle's ongoing process of augmenting and covering integration use cases to address the variety of different user requirements, according to Amit Zavery, senior vice president of Oracle Cloud Platform. "These are two different offerings for two different use cases."

Posted October 07, 2015

Embarcadero Technologies, a provider of software solutions for application and database development, recently unveiled DB PowerStudio 2016, a suite of database tools that provides database managers and data professionals with comprehensive administration, development, performance and monitoring capabilities across multiple platforms.

Posted October 07, 2015

Basho Technologies today announced Basho Riak TS, a distributed NoSQL database that is designed to enable analysis of massive amounts of sequenced, unstructured data generated from the Internet of Things (IoT) and other time series data sources.

Posted October 06, 2015

SnapLogic made three announcements during Strata + Hadoop World in NYC, including a collaboration with Microsoft, new product updates, and the development of new connectors.

Posted October 06, 2015

At Strata + Hadoop World 2015, Attunity announced the release of Attunity Replicate Express, a downloadable edition of its data replication and loading software. The solution, which answers a growing demand for more accessible real-time big data analytics, is freely available to download online. The new solution supports ingesting data to and from Oracle, SQL Server, and Hadoop Data Lakes for test and development environments.

Posted October 06, 2015

Built on Hadoop, Kyvos gives business users and analysts the ability to query billions of rows of data within seconds. Kyvos' technology allows users to pre-process data and build cubes on Hadoop for faster performance and instant responses. With this partnership, Kyvos can connect Tableau users to their Hadoop data within minutes, the companies say. "It's a benefit to Tableau because it opens up the data that's available to the business user through Tableau and improves the response time," said Ajay Anand, vice president of product management and marketing at Kyvos. "They've been very supportive with what we are trying to do."

Posted October 06, 2015

One of the noticeable changes this year at Strata + Hadoop World 2015 was the rise of Apache Spark, an engine for large scale data processing. In recent months, many companies have extended support to Spark, which can be complementary to Hadoop, but can also be deployed without it.
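
As a small illustration of that last point, the following minimal PySpark sketch (a generic example, not tied to any vendor mentioned here) runs Spark in local mode with no Hadoop cluster at all; the input file name is a placeholder.

# A minimal sketch of Spark running standalone in local mode, without Hadoop.
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("wordcount").setMaster("local[*]")   # local mode, no cluster
sc = SparkContext(conf=conf)

lines = sc.textFile("data.txt")                          # placeholder input file
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))         # classic word count

for word, n in counts.take(10):                          # show a small sample of results
    print("{0}: {1}".format(word, n))

sc.stop()

Pointing textFile at an hdfs:// path instead would read the same data from a Hadoop cluster, which is what makes Spark complementary to Hadoop rather than dependent on it.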

Posted October 05, 2015

Syncsort is continuing to grow its platform's capabilities by announcing new integration with two active open source platforms, Apache Kafka and Apache Spark, enabling users to better handle real-time, large-scale data processing, analytics, and feeds.

Posted October 01, 2015

At Strata + Hadoop World 2015, SAP showcased its portfolio of big data solutions, including the HANA platform, which offers real-time integration of big data and information held in Hadoop with business processes and operational systems; Lumira and SAP BI tools, which enable data discovery on Hadoop along with data wrangling capabilities; SAP Data Services; and the newest SAP product for the Hadoop world, HANA Vora, which takes advantage of an in-memory query engine for Apache Spark and Hadoop to speed queries. SAP HANA Vora can be used standalone or in concert with the SAP HANA platform to extend enterprise-grade analytics to Hadoop clusters and provide enriched, interactive analytics on Hadoop and HANA data.

Posted October 01, 2015

At Strata + Hadoop World, TIBCO announced the availability of the Spotfire Cloud's data discovery and advanced analytics connector to Apache Spark SQL, along with a commercial integration with SparkR. The Spark SQL direct connector is now available in TIBCO Spotfire Cloud, and will also be incorporated in the next TIBCO Spotfire on-premises release.

Posted October 01, 2015

Objectivity, which recently introduced ThingSpan, a purpose-built information fusion platform intended to simplify and accelerate companies' ability to deploy and derive value from industrial Internet of Things (IoT) applications, has announced plans to support Intel's TAP (Trusted Analytics Platform) at Strata + Hadoop World in NYC. ThingSpan is aimed at helping companies "that are drowning in data but thirsty for answers in time," said Jay Jarrell, CEO and president of Objectivity, during an interview at the conference.

Posted September 30, 2015

At Strata + Hadoop World in New York City, Talend, a provider of data integration software for the cloud and big data, is announcing a new version of its platform, now offering support for Apache Spark and Spark Streaming. Talend 6 will leverage over 100 Spark components to deliver rapid data processing speed and enable any company to convert streaming big data or IoT sensor information into immediate actionable insights.

Posted September 30, 2015

DataTorrent is teaming up with two big companies to provide better security and make adoption of Hadoop easier. DataTorrent is partnering with Cisco to allow integration between its DataTorrent RTS platform and Cisco's Application Centric Infrastructure (ACI) through the Application Policy Infrastructure Controller (APIC), offering a unified architecture for enterprises to manage their big data applications along with network and security. DataTorrent is also integrating its platform with Microsoft Azure HDInsight via the Microsoft Azure Marketplace.

Posted September 29, 2015

Paxata, provider of an adaptive data preparation platform, is partnering with Cisco, creating a jointly developed solution dubbed Cisco Data Preparation (CDP). "We are delighted to partner with a world-class organization like Cisco as we continue to fulfill our vision of bringing Adaptive Data Preparation to every analyst in the enterprise," said Prakash Nanduri, co-founder and CEO of Paxata.

Posted September 29, 2015

Pentaho is updating its platform to help users blend data more efficiently and manage the analytic data pipeline. "We've learned so much over the last couple of years from our big data customers and customers that have scaled and seen the value of big data and their environments," said Donna Prlich, vice president of products solutions and marketing at Pentaho. "We're really looking at our product line and saying, ‘Where do we take this and where does it need to go?' In 6.0 it's really all about putting big data to work."

Posted September 29, 2015

Arcadia Data, a provider of a unified visual analytics and business intelligence (BI) platform for big data, is releasing Arcadia Enterprise, a solution that runs natively in Hadoop. The company says the platform bypasses the restrictions of legacy BI and visualization tools by allowing users to work directly with their data on Hadoop. "We give the analyst the ability to do free-form exploration of the highest granularity of data in the Hadoop system," said Priyank Patel, co-founder and chief product officer at Arcadia.

Posted September 29, 2015

The Hortonworks DataFlow (HDF) support subscription is now available. HDF, powered by Apache NiFi, a top-level open source project, is intended to help organizations take advantage of data related to the Internet of Anything (IoAT), making it easier to automate and secure data flows and to collect, conduct, and curate real-time business insights and actions derived from any data, from anything, anywhere. "By flowing that data into HDP, our customers are able to rapidly bring these new data elements under management in a completely secure and purely open way," said Tim Hall, vice president of product management at Hortonworks.

Posted September 29, 2015

IBM introduced a new cloud security technology that helps safeguard the increasing use of "bring-your-own" cloud-based apps at work. Cloud Security Enforcer combines cloud identity management (Identity-as-a-Service) with the ability for companies to discover outside apps being accessed by employees, including those they are using on their mobile devices. These combined capabilities enable companies to equip their workforce with a secure way to access and use the apps that they want.

Posted September 28, 2015

IBM expanded its array of APIs, technologies, and tools for developers who are creating products, services and applications embedded with Watson. Over the past 2 years, the Watson platform has evolved from one API and a limited set of application-specific deep Q&A capabilities to more than 25 APIs powered by over 50 technologies.

Posted September 28, 2015

Pentaho, a Hitachi Data Systems company, and Melissa Data, a provider of global contact data quality solutions, have formed a partnership to create new data quality plug-ins for Pentaho's big data integration and analytics platform.

Posted September 24, 2015

Cambridge Semantics, a provider of data solutions driven by semantic web technology, has formed an alliance with MarkLogic, which provides enterprise NoSQL database technology. According to the vendors, the partnership will help organizations rapidly store, access, visualize, and act upon diverse data to create scalable, semantic-driven data management and investigative analytics applications at a fraction of the time and cost of traditional approaches.

Posted September 24, 2015

MemSQL, a provider of real-time databases for transactions and analytics, has announced Spark Streamliner, an integrated Spark solution to give enterprises immediate access to real-time analytics.

Posted September 24, 2015

Traditional data warehousing models and open source alternatives such as Apache Hadoop and Storm have been touted as solutions to a variety of "big data" challenges. However, utilities have found that these approaches cannot handle the scale and complexity of data generated in industrial environments. Additionally, they fail to provide the real-time analysis and situational awareness that utilities need to improve decision making or address critical events in real-time, such as optimizing crews during outages and severe weather events.

Posted September 24, 2015

There are various terms being bandied about that describe the new world data centers are entering—from the "third platform" to the "digital enterprise" to the "always-on" organization. Whatever the terminology, it's clear there is a monumental shift underway. Business and IT leaders alike are rethinking their approaches to technology, rethinking their roles in managing this technology, and, ultimately, rethinking their businesses. The underlying technologies supporting this movement—social, mobile, data analytics, and cloud—are also causing IT leaders to rethink the way in which database systems are being developed and deployed.

Posted September 24, 2015

Ricoh Company, Ltd, the printing and document management company, is using Oracle's SPARC T5 servers with Oracle ZFS Storage Appliances and Oracle Database to analyze data from RICOH @Remote, its real-time support service for customers in more than 100 countries and regions. To unify the data across the company for accurate views and analysis, Ricoh built a private cloud accessible by the entire remote service group that allows high speed access from internal networks.

Posted September 24, 2015

StreamSets Inc., a company that aims to speed access to enterprise big data, has closed a $12.5 million round of Series A funding. The single biggest barrier to a successful enterprise analytics platform is the effective and efficient ingest of data, the company says.

Posted September 24, 2015

SAP is releasing a set of enhancements for one of its platforms that it says will change the way enterprises interact with customers. A portfolio of SAP hybris tools is being introduced to enable in-the-moment customer profiling, digital commerce and community development, allowing an organization's front office to stay connected with the frequently shifting needs of its customers and prospects.

Posted September 23, 2015

To help companies get more value from big data, SAP has introduced HANA Vora, a new in-memory computing engine that leverages and extends the Apache Spark execution framework to provide enriched, interactive analytics on Hadoop. HANA Vora is a completely new product built from the ground up and is aimed at better processing of data to make business decisions.

Posted September 23, 2015

Looker, which provides a data discovery and analytics platform, has announced a new offering. According to the vendor, Looker Blocks are reusable, customizable apps that encapsulate components of business logic, such as churn prediction or lifetime value metrics, and that can be assembled and tailored to address company-wide data needs as well as industry-specific requirements.

Posted September 22, 2015

Percona, a provider of enterprise-class MySQL and MongoDB solutions and services, is introducing Percona Server for MongoDB, an enhanced, open source, fully compatible drop-in replacement for MongoDB Community Edition that offers additional enterprise features and lower total cost of ownership. Completely open source and free to download and use, Percona Server for MongoDB offers all the features of MongoDB 3.0 Community Edition, along with two additional storage engine options - PerconaFT and Facebook's RocksDB - and enterprise-grade functionality, including Simple Authentication and Security Layer (SASL) and advanced auditing.

Posted September 22, 2015

MapR Technologies, Inc., a provider of a distribution for Apache Hadoop, has extended its support for SAS, a provider of business analytics software and services. According to the vendors, the collaboration between SAS and MapR provides advanced analytics with ease of data preparation and integration with legacy systems, assurance of SLAs, and security and data governance compliance. Additionally, joint customers can cost-effectively grow their big data storage infrastructure without relying on storage area network (SAN) or network-attached storage (NAS).

Posted September 22, 2015

Splunk, a provider of real-time operational intelligence solutions, has announced a new version of the Splunk Enterprise platform and the availability of a new solution that targets the complexity of IT intelligence.

Posted September 22, 2015

Platfora, a data discovery platform native to Hadoop, is reaching another milestone for its solution, bringing modern data preparation and real-time capabilities to a wide audience.

Posted September 22, 2015

Oracle announced new releases of Oracle Transportation Management and Oracle Global Trade Management, which help customers optimize their logistics and global trade compliance operations. The solution provides comprehensive transportation and global trade management functionality on a unified platform, spanning transportation, compliance and customs processes across industries and geographies.

Posted September 16, 2015

Marketers continue to become more technologically astute as their roles evolve, but one tool remains under-utilized: data management platforms. These platforms provide numerous ways for marketers to slice and dice data to develop compelling ways to reach customers, often in real time. However, when used incorrectly, these platforms can push marketing campaigns off-course - which is why marketers must get smart on what the platforms are designed to do and how to use them most effectively.

Posted September 16, 2015

Tableau is releasing an upgraded version of its comprehensive data visualization platform, introducing a new mobile app and a web data connector that will allow developers to do more with their data.

Posted September 15, 2015

WebAction unveiled several key advancements that will help expand its brand and platform, allowing the company to continue to derive value and insights from big data. The company has a new $21 million round of series B funding, led by Intel Capital, and is rebranding its streaming analytics platform as "Striim," pronounced as "stream."

Posted September 15, 2015

IBM is acquiring StrongLoop, Inc., a software provider, to help developers connect enterprise applications to mobile, Internet of Things (IoT) and web applications in the cloud. StrongLoop is a provider of application development software - known as enterprise Node.js - that enables software developers to build applications using APIs (application programming interfaces). Financial terms and conditions were not released.

Posted September 14, 2015

VMware recently made a slew of announcements spanning cloud deployments, data centers, and devices. The company introduced what it calls the first fully automated software suite for delivering the software-defined data center as an integrated system. Serving as a foundation of VMware's Unified Hybrid Cloud platform, the solution, called VMware EVO SDDC, will extend the virtualization principles of abstraction, pooling, and automation across all data center resources and services.

Posted September 14, 2015
