Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources into a data warehouse for reporting and analytics include ETL (extract, transform, load), EAI (enterprise application integration), CDC (change data capture), data replication, data deduplication, compression, big data technologies such as Hadoop and MapReduce, and data warehouse appliances.
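As a minimal sketch of the ETL pattern named above, the example below extracts rows from a source, normalizes them, and loads them into a target store. The source and target are plain in-memory structures chosen for illustration; real pipelines would read from and write to actual systems.

```python
# Minimal illustration of the ETL (extract, transform, load) pattern.

def extract(source_rows):
    """Extract: read raw records from the source system."""
    return list(source_rows)

def transform(rows):
    """Transform: normalize field names and types for the warehouse schema."""
    return [
        {"customer_id": int(r["id"]), "name": r["name"].strip().title()}
        for r in rows
    ]

def load(rows, warehouse):
    """Load: upsert transformed rows into the target, keyed by id."""
    for r in rows:
        warehouse[r["customer_id"]] = r
    return warehouse

source = [{"id": "1", "name": " alice smith "}, {"id": "2", "name": "BOB JONES"}]
warehouse = load(transform(extract(source)), {})
print(warehouse[1]["name"])  # Alice Smith
```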



Data Warehousing Articles

Redgate is releasing a major upgrade for its popular tool, SQL Prompt, helping developers increase the speed and quality of their SQL coding and take advantage of the most recent improvements in SQL Server.

Posted November 11, 2019

Rockset, the serverless search and analytics company that enables real-time SQL on NoSQL data, is introducing the capability to build interactive, live Tableau dashboards on NoSQL data without requiring users to write a single line of code.

Posted November 08, 2019

The data warehouse and data lake each solve different business problems and impose their own unique challenges. Organizations shouldn't write off data warehouses—as they evolve, they are taking on new roles in digital enterprises. Data lakes may add a great deal of flexibility to an enterprise data strategy, but they are supported by fast-breaking technologies that require constant vigilance.

Posted November 06, 2019

Dotscience is offering new platform advancements that make deploying and monitoring machine learning models on Kubernetes clusters simple and accessible to data scientists. New Dotscience Deploy and Monitor features dramatically simplify the act of deploying ML models to Kubernetes and setting up monitoring dashboards for the deployed models with cloud-native tools Prometheus and Grafana.

Posted November 01, 2019

Every organization needs a data warehouse. A data warehouse has never been a one-size-fits-all kind of solution. Variations exist and should be accepted.

Posted October 31, 2019

The speed and performance of your production database systems encompass a wide range of parameters and decisions made well before implementation. DBAs need to understand the options available and the factors that impact performance and development with each DBMS option, and work to keep the IT organization up to speed and educated on all of the available choices.

Posted October 31, 2019

There is a sea change underway in enterprise architecture. Just a few years ago, enterprise administrators were fearful of the security implications of trusting an outside provider to protect their data assets. Although security is still a cloud concern—one which predominates at the time of cloud migration, and even grows stronger post-implementation—the use of cloud platforms has gained widespread acceptance.

Posted October 31, 2019

Oracle has identified a need for "augmented" analytics, leveraging machine learning and AI throughout the analytics process to help drive up the impact and value of data, and enable knowledge workers to uncover more insights. Recently, Bruno Aziza, group VP, Oracle Analytics, described this new phase in analytics, the role that cloud plays in making it possible, and what the capabilities will enable for customers.

Posted October 16, 2019

The cloud, ethics in artificial intelligence, privacy, and compliance were some of the topics that attendees were concerned about at Strata Data NY as data professionals converged at the Jacob Javits Center in New York City from September 24 - 26.

Posted October 15, 2019

Advanced analytics software provider Mode announced the availability of its first instant, responsive data engine, Helix. Helix creates a shared backbone between modern business intelligence and interactive data science. By combining these workflows, data scientists no longer have to choose between shipping fast, one-off answers and building dashboards for broader coverage.

Posted October 14, 2019

GoodData, provider of end-to-end analytics solutions, announced it is now leveraging Amazon Redshift, a cloud data warehouse from Amazon Web Services (AWS). The combination of GoodData and Amazon Redshift can change the way enterprise and product teams make business decisions, reflecting the trend toward integrating analytics into business applications so that every customer, partner, and team member is empowered with data and insights at the point of work.

Posted October 11, 2019

DataStax, the company behind a database built on Apache Cassandra, is opening early access to the DataStax Change Data Capture (CDC) Connector for Apache Kafka. The DataStax CDC Connector for Apache Kafka gives developers bidirectional data movement between DataStax, Cassandra, and Kafka clusters. CDC is designed to capture and forward insert, update, and delete activity applied to tables (column families).

Posted October 10, 2019

Collibra, the Data Intelligence company, is making platform-wide upgrades to improve access to critical data and hasten time to insight. The new release introduces machine learning enhancements to Collibra Catalog and marks the availability of Collibra Privacy & Risk, a sustainable approach to compliance with modules for CCPA and GDPR.

Posted October 09, 2019

Has the meaning of big data changed? Many agree that data no longer has to be "big" to meet today's evolving requirements. In particular, open source and cloud tools and platforms have brought data-driven sensibilities into organizations that previously did not have such expertise, making big data more accessible.

Posted September 26, 2019

The big cloud vendors tout many reasons for running IT infrastructure in the cloud. A very prominent benefit is "accelerated innovation and delivery." That's a powerful selling point because every IT manager I have ever known wants to deliver better results, faster, and at lower cost. However, it seems that the less IT managers know about doing actual hands-on IT work, the more demanding they are.

Posted September 26, 2019

Yellowbrick Data, a provider of next-generation enterprise data warehousing, is releasing the Yellowbrick Cloud Data Warehouse and the Yellowbrick Cloud Disaster Recovery (Cloud DR) service. The Yellowbrick Cloud Data Warehouse has been operating in enterprise production environments since early 2019, and both new products leverage the power of the Yellowbrick Data Warehouse.

Posted September 25, 2019

Splunk, a provider of the data-to-everything platform, is making several advancements to pricing, partner, and investment initiatives designed to help customers make smarter business decisions.

Posted September 25, 2019

Although adopting advanced analytics is on the radar for most organizations these days, it is important to understand some of the problems that can occur as you implement analytics projects. Perhaps the most important obstacle to overcome is ensuring buy-in from your organization's leaders.

Posted September 16, 2019

In order to protect your organization, it is critical to watch over the elements that have been built, keep processes running, and stay on top of change. Spend the time and resources necessary to properly maintain the solutions for which you are responsible. The time spent on such upkeep will be less than the time spent playing catch-up after too many unattended changes have led to bad results.

Posted September 03, 2019

We've reached the point where hybrid cloud arrangements have become commonplace in enterprises, and with this trend come implications for databases and data management. The rise of both hybrid and multi-cloud platforms means data needs to be managed in new ways, industry experts point out. And there are lingering questions about which data should go into the cloud, and which should stay on-premises.

Posted August 29, 2019

In previous articles, we looked at creating and managing Oracle Cloud Infrastructure (OCI) database and compute instances through the web UI. The UI is ideal for a new user and occasional management of a cloud environment. When a cloud administrator is managing and automating dozens or hundreds of instances, manually clicking through the UI becomes untenable. Oracle supports and maintains a number of developer tools to solve this problem.

Posted August 21, 2019

Franz Inc., an early adopter of Artificial Intelligence (AI) and provider of Semantic Graph Database technology for Knowledge Graph Solutions, is now supporting Shapes Constraint Language (SHACL), a World Wide Web Consortium (W3C) specification for validating graph-based data against a set of conditions.

Posted August 15, 2019

Social media, the Internet of Things, demands for mobile access, and real-time insights are just some of the factors that have increased the pressure on organizations to change how data is managed. As a result, there have never been so many data management choices to deal with it all.

Posted August 14, 2019

An effective approach to processing and transforming large datasets is likely composed of multiple steps. The large dataset will likely be split apart into several smaller sets, perhaps in a couple of differing fashions with a common and understandable theme. But there should not be too many split-apart variants; rather, as with the three bears, it should be just the right number of smaller datasets. And then, similar to solving a Rubik's Cube, a twist or two at the very end brings all the new and old datapoints together in a complete and organized fashion.
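The split-process-recombine approach described above can be sketched as follows; the dataset, the partitioning theme (region), and the aggregation are invented for illustration.

```python
# Sketch: split a large dataset into smaller themed subsets,
# transform each independently, then recombine at the end.

records = [
    {"region": "east", "amount": 10},
    {"region": "west", "amount": 20},
    {"region": "east", "amount": 5},
]

# Split: partition by a common, understandable theme (here, region).
partitions = {}
for rec in records:
    partitions.setdefault(rec["region"], []).append(rec)

# Process: transform each smaller set independently.
totals = {region: sum(r["amount"] for r in rows)
          for region, rows in partitions.items()}

# Recombine: the final "twist" brings the pieces back together.
combined = sorted(totals.items())
print(combined)  # [('east', 15), ('west', 20)]
```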

Posted August 07, 2019

Here is a list of common wait types and techniques that every DBA (or wannabe DBA) should know. While there are many more wait types than listed here, understanding these will give you a leg up when it comes to optimizing and tuning your SQL Server database performance.

Posted August 07, 2019

While change has always been a part of the database credo, the growing emphasis on data-driven decision making in today's economy has resulted in a dizzying plethora of technologies and methodologies entering the market. The number and scope of game-changing technologies are too numerous to mention, and one thing is certain: Database management will never be the same. We have identified some of the most promising technology initiatives, based on discussions with and input from data experts from across the industry spectrum, gathering their views on the key technologies—well-known or under the radar—that are worth watching.

Posted July 25, 2019

IBM has launched a data prep solution designed to help clients improve their DataOps processes so they can have data ready for AI more quickly and efficiently. Jointly developed with data prep software provider Trifacta, the new InfoSphere solution is engineered to work in conjunction with clients' existing data environments, including data lakes.

Posted July 08, 2019

Oracle has introduced the new Autonomous Database Dedicated service to provide customers with more control for their most mission-critical workloads. "Our Autonomous Database Dedicated service eliminates the concerns enterprise customers previously had about security, isolation, and operational policies when moving to cloud," said Juan Loaiza, executive vice president, Mission-Critical Database Technologies, Oracle.

Posted July 03, 2019

source{d}, a data platform for the software development life cycle (SDLC), is releasing a new Enterprise Edition with built-in visualization, management capabilities, and advanced analytic functions. source{d} enables enterprises to aggregate all SDLC data sources into one data lake where they can easily extract, load and transform source code, version control data, project tracking data, build systems data, configuration files and more.

Posted July 02, 2019

EnterpriseDB has been acquired by Great Hill Partners, a private equity firm. Financial terms of the private transaction were not disclosed. "This acquisition comes at a time when the Postgres market is exploding," said Ed Boyajian, president and CEO of EnterpriseDB. Michael Stonebraker, a pioneer in database technology and original architect of what is now Postgres, has been named technical advisor; and database expert and entrepreneur Andy Palmer has been named to the board of directors.

Posted July 01, 2019

Scale Computing, a provider of edge computing solutions, announced that the KVM-based hypervisor in the HC3 product family is now fully supported by Parallels Remote Application Server 17 (Parallels RAS). When combined with Parallels RAS, Scale Computing HC3 enables administrators to rapidly provision and manage virtual machines (VMs) thin clones centrally from the Parallels RAS Console to make VDI solutions faster, more affordable, and easier to use.

Posted June 18, 2019

AI, machine learning, and predictive analytics are often used interchangeably, even by the most data-intensive organizations, but there are subtle, yet important, differences between them. Machine learning is a type of AI that enables machines to process data and learn on their own, without constant human supervision. Predictive analytics uses collected data to predict future outcomes based on historical data.

Posted June 10, 2019

What are the practices and procedures that you have found to be most helpful to automate in administering your databases? Yes, I know that automation has been a standard claim for most DBMS vendors, as well as third-party DBA tool vendors, for many years. But are you really anywhere closer to an "on demand," "lights-out," "24/7" database environment yet?

Posted June 10, 2019

CDC can greatly reduce the amount of data processed, but the cost is that the processes themselves become more complicated and overall storage may be higher. Costs are moved around: the final level of processing focuses only on the minimal changes, and this minimization is the efficiency to be gained. Moving forward, using the data becomes standardized and ultimately straightforward.
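The core idea behind CDC, as described above, is to emit only what changed rather than reprocessing every row. The toy diff below (snapshot-comparison style, with invented data) illustrates that principle; production CDC tools typically read the database's transaction log instead of comparing snapshots.

```python
# Illustrative change data capture: compare two snapshots of a table and
# emit only insert/update/delete events, not the full dataset.

def capture_changes(old, new):
    """Diff two {key: row} snapshots into a list of CDC events."""
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("insert", key, row))
        elif old[key] != row:
            events.append(("update", key, row))
    for key, row in old.items():
        if key not in new:
            events.append(("delete", key, row))
    return events

before = {1: "alice", 2: "bob", 3: "carol"}
after = {1: "alice", 2: "bobby", 4: "dave"}
events = capture_changes(before, after)
print(events)
```

Three events come out (an update, an insert, and a delete) instead of the four rows a full reload would process.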

Posted June 10, 2019
