Database Management Articles
Just about every company with a DBMS has that binder full of corporate and/or IT standards. That one over there in the corner with the cobwebs on it - the one that you only use when you need an excuse to avoid work. Okay, well, maybe it's not quite that bad. Your standards documents could be on the company intranet or some other online mechanism (but chances are there will be virtual cobwebs on your online standards manuals, too).
Posted August 05, 2013
NuoDB, provider of a NewSQL database, has announced the beta program for a new tool that facilitates migration from MySQL, Microsoft SQL Server, IBM DB2, PostgreSQL and Oracle RDBMSs. The migration tool is open source and available for free download on the NuoDB DevCenter or on GitHub.
Posted July 30, 2013
IBM says it is accelerating its Linux on Power initiative with the new PowerLinux 7R4 server as well as new software and middleware applications geared for big data, analytics and next generation Java applications in an open cloud environment. According to IBM, the new PowerLinux 7R4 server, built on the same Power Systems platform running IBM's Watson cognitive computing solution, can provide clients the performance required for the new business-critical and data-intensive workloads increasingly being deployed in Linux environments. IBM is also expanding the portfolio of software for Power Systems with the availability of IBM Cognos Business Intelligence and EnterpriseDB database software, each optimized for Linux on Power.
Posted July 30, 2013
After four years of operating BigCouch in production, Cloudant has merged the BigCouch code back into the open source Apache CouchDB project. Cloudant provides a database-as-a-service, and CouchDB serves as the foundation of Cloudant's technology. The company developed BigCouch, an open source variant of CouchDB, to support large-scale, globally distributed applications. There are three main reasons Cloudant is doing this, Adam Kocoloski, co-founder and CTO at Cloudant, told 5 Minute Briefing in an interview.
Posted July 30, 2013
SAP has launched Sybase ASE (Adaptive Server Enterprise) 15.7 service pack 100 (SP100) to provide higher performance and scalability as well as improved monitoring and diagnostic capabilities for very large database environments. "The new release adds features in three areas to drive transactional environments to even more extreme levels. We really see ASE moving increasingly into extreme transactions and to do that we have organized the feature set around the three areas," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview with 5 Minute Briefing.
Posted July 25, 2013
Oracle has announced the latest 12c releases of its Cloud Application Foundation, which integrates application server and in-memory data grid capabilities into a foundation for cloud computing, representing "a major release of our middleware infrastructure," Mike Lehmann, Oracle vice president of product management, tells 5 Minute Briefing. The focus for the products is to provide mission-critical cloud infrastructure, and a lot of work has been done around native cloud capabilities, says Lehmann.
Posted July 17, 2013
The Oracle database provides intriguing possibilities for storing, manipulating, and streaming multimedia data in enterprise-class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.
Posted July 17, 2013
MemSQL, a provider of real-time analytics, announced the availability of MemSQL 2.1, which includes new features and enhancements to enable customers to access, explore and increase the value of data, regardless of size or file format. To meet the demands posed by increasing amounts of data and data types, MemSQL has updated its analytics platform to enable customers to receive real-time results on analytical queries across both real-time and historical datasets.
Posted July 16, 2013
One of the principles within relational theory is that each entity's row, or tuple, be uniquely identifiable. This means the defined structure includes some combination of attributes whose populated values serve to identify an individual row within the table/relation. These attributes form the candidate key(s) for the structure. If a structure has a single candidate key, it serves as the primary key; if a structure has multiple candidate keys, one of them is designated as the primary key. When building up a logical design, primary keys should be identified by the actual data points in play.
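To make the idea concrete, here is a minimal sketch using Python's sqlite3 module; the employee table, its columns, and its data are hypothetical and used only to illustrate candidate keys versus the designated primary key.

    import sqlite3

    # Hypothetical structure: employee_number and email are both candidate keys.
    # employee_number is designated the primary key; email, the remaining
    # candidate key, is enforced with a UNIQUE constraint.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE employee (
            employee_number INTEGER PRIMARY KEY,   -- designated primary key
            email           TEXT NOT NULL UNIQUE,  -- alternate candidate key
            full_name       TEXT NOT NULL
        )
    """)
    conn.execute("INSERT INTO employee VALUES (1001, 'pat@example.com', 'Pat Smith')")

    # Repeating either candidate-key value is rejected, which is exactly what
    # keeps every row uniquely identifiable.
    try:
        conn.execute("INSERT INTO employee VALUES (1001, 'lee@example.com', 'Lee Jones')")
    except sqlite3.IntegrityError as exc:
        print("Duplicate candidate-key value rejected:", exc)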
Posted July 09, 2013
Oracle Database 12c is available for download from Oracle Technology Network (OTN). First announced by Oracle CEO Larry Ellison during his keynote at Oracle OpenWorld 2012, Oracle Database 12c introduces a new multi-tenant architecture that simplifies the process of consolidating databases onto the cloud, enabling customers to manage many databases as one without changing their applications. During the OpenWorld keynote, Ellison described Oracle Database 12c as "the first multi-tenant database in the world" and said it provides "a fundamentally new architecture" to "introduce the notion of a container database" with the ability to plug multiple separate, private databases into that single container.
Posted July 09, 2013
Dell Software has introduced the latest version of the Dell KACE K1000 Management Appliance, which now includes integrated software asset management to boost software license compliance, while helping lower IT costs. The K1000 adds automated software asset identification, tracking and optimization to its capabilities for managing the deployment, operation and retirement of software assets. The need for the appliance is being fueled by a range of factors, including the influx of new technologies such as cloud computing, virtualization, and BYOD, which are adding complexity in terms of systems management, Lisa Richardson, senior product marketing manager for Endpoint Systems Management, Dell Software, tells 5 Minute Briefing.
Posted July 08, 2013
SAP AG announced this week that version 16 of the company's Sybase software has achieved a Guinness World Record for loading and indexing big data. In cooperation with BMMsoft, HP and Red Hat, SAP Sybase IQ 16 achieved an audited result of 34.3 terabytes per hour, surpassing the previous record of 14 terabytes per hour achieved by the same team using an earlier version of SAP Sybase IQ. The latest version of the real-time analytics server and enterprise data warehouse (EDW) provides a new, fully parallel data loading capability and a next-generation column store, enabling the jump in big data performance.
Posted June 27, 2013
RainStor, a provider of an enterprise database for managing and analyzing historical data, says it has combined the latest data security technologies in a comprehensive product update that has the potential to rapidly increase adoption of Apache Hadoop for banks, communications providers and government agencies.
Posted June 27, 2013
The amount of data being generated, captured and analyzed worldwide is increasing at a rate that was inconceivable a few years ago. Exciting new technologies and methodologies are evolving to address this phenomenon of science and culture, creating huge new opportunities. These new technologies are also fundamentally changing the way we look at and use data. The rush to monetize "big data" makes the appeal of various "solutions" undeniable.
Posted June 27, 2013
Oracle announced the general availability of MySQL Cluster 7.3, which adds foreign key support, a new NoSQL JavaScript Connector for node.js, and an auto-installer to make setting up clusters easier. MySQL Cluster is an open source, auto-sharded, real-time, ACID-compliant transactional database with no single point of failure, designed for advanced web, cloud, social and mobile applications. "Foreign key support has been a longstanding feature request from day-one," Tomas Ulin, vice president of MySQL Engineering at Oracle, tells 5 Minute Briefing.
Posted June 19, 2013
Dell has released Toad for Oracle 12.0, which provides developers and DBAs with a key new capability: a seamless connection to the Toad World user community, so they no longer have to exit the tool and open a browser to gain access to the community. "The actual strength of the product has always been the input of users," John Whittaker, senior director of marketing for the Information Management Group at Dell Software, tells 5 Minute Briefing. The new ability to access the Toad World community from within Toad enables database professionals to browse, search, ask questions and start discussions directly in the Toad forums, all while using Toad.
Posted June 19, 2013
These are heady times for data products vendors and their enterprise customers. When business leaders talk about success these days, they often are alluding to a new-found appreciation for their data environments. It can even be said that the tech vendors that are making the biggest difference in today's business world are no longer software companies at all; rather, they are "data" companies, with all that implies. Enterprises are reaching out to vendors for help in navigating through the fast-moving, and often unforgiving, digital realm. The data vendors that are leading their respective markets are those that know how to provide the tools, techniques, and hand-holding needed to manage and sift through gigabytes', terabytes', and petabytes' worth of data to extract tiny but valuable nuggets of information to guide business leaders as to what they should do next.
Posted June 19, 2013
The grain of a fact table is determined by the dimensions with which the fact is associated. For example, should a fact have associations with a Day dimension, a Location dimension, a Customer dimension, and a Product dimension, then the usual assumption would be for the fact's metrics to be described as being at a "by Day," "by Location," "by Customer," "by Product" level. Evidence of this specific level of granularity is the fact table's primary key being the composite of the Day dimension key, Location dimension key, Customer dimension key, and Product dimension key. However, this granularity and these relationships are easily disrupted.
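A minimal sketch of that composite-key evidence, again using Python's sqlite3 module; the star-schema table and column names are hypothetical.

    import sqlite3

    # Hypothetical star schema: the sales fact is at the "by Day, by Location,
    # by Customer, by Product" grain, so its primary key is the composite of
    # the four dimension keys.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE day_dim      (day_key      INTEGER PRIMARY KEY, calendar_date TEXT);
        CREATE TABLE location_dim (location_key INTEGER PRIMARY KEY, city TEXT);
        CREATE TABLE customer_dim (customer_key INTEGER PRIMARY KEY, customer_name TEXT);
        CREATE TABLE product_dim  (product_key  INTEGER PRIMARY KEY, product_name TEXT);

        CREATE TABLE sales_fact (
            day_key      INTEGER REFERENCES day_dim,
            location_key INTEGER REFERENCES location_dim,
            customer_key INTEGER REFERENCES customer_dim,
            product_key  INTEGER REFERENCES product_dim,
            sales_amount REAL,
            PRIMARY KEY (day_key, location_key, customer_key, product_key)
        );
    """)
    print("Grain: one sales_fact row per Day x Location x Customer x Product combination")

Adding a dimension key to, or dropping one from, that composite immediately changes the grain, which is one way the granularity and relationships get disrupted.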
Posted June 13, 2013
Although storage management can be an afterthought for the DBA, it really shouldn't be. Storage issues are vitally important and, unless managed appropriately, can be very costly. The cost of managing storage can be as much as 10 times higher than the initial cost of acquiring the storage, and the growth rate for disk storage was 37% between 1996 and 2007. Even so, it is common for storage-related issues to be relegated to the back burner by DBAs, but every database professional should understand modern storage basics.
Posted June 13, 2013
At the recent Oracle users conference COLLABORATE 13, IOUG hosted an evening hands-on lab for attendees. The turnout was great—even after a full day of conference sessions. What topic drew such a dedicated crowd? Two words: Performance Tuning.
Posted June 13, 2013
There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.
Posted June 13, 2013
Mike Ruane, president and CEO of Revelation Software, will demonstrate O4W (OpenInsight for Web) Mobile at user group meetings and conferences across the country and the U.K. this summer and early fall. In the presentations, Ruane's team will show attendees how easy it is to build a mobile application with Revelation's software by actually allowing them to build a mobile pizza-ordering application, Robert Catalano, director of sales at Revelation, tells DBTA.
Posted June 13, 2013
Trillium Software, a provider of enterprise data quality software and services, and Collibra, a provider of data governance software, have formed an alliance to address the demand for data governance initiatives and enable customers to leverage data as a strategic asset.
Posted June 04, 2013
As a leader in the data modeling space, CA ERwin is privileged to be an integral part of organizations' key strategic initiatives such as business intelligence and analytics, data governance, or data quality—many of which revolve around data. At CA Technologies, we understand that data runs your business, and we've put a strong focus on developing a solution that can act as an "information hub" for these initiatives.
Posted June 03, 2013
Percona makes MySQL faster and more reliable for nearly 2,000 customers worldwide. Founded in 2006 to provide MySQL Consulting services, we've grown rapidly with the addition of MySQL Support, Remote DBA, Training, and Server Development services. Our global workforce of nearly 100 now provides 24x7, worldwide coverage to our customer base of leading MySQL users.
Posted June 03, 2013
For close to 30 years, I have been actively involved with the database community. From one of the first Oracle Database Administrators, to President of the Independent Oracle Users Group; from one of the Founders of the Professional Association of SQL Server, to one of the original Oracle*Press authors; from Oracle ACE to VMware vExpert for database virtualization, I have seen an amazing succession of changes take place in our industry.
Posted June 03, 2013
Embarcadero Technologies gives 97% of the world's top 2000 companies the tools needed to address the biggest challenges in data management. Facing significant growth in complexity, diversity and volume of enterprise data, companies worldwide are increasingly turning to data governance as a strategic solution. Helping our customers manage this complexity and close the "governance gap" has been a major driver of innovation in our products.
Posted June 03, 2013
Scott Hayes is an IBM DB2 LUW Performance Expert, IBM DB2 GOLD Consultant, IBM Information Management Champion, US patent inventor, published author, blogger on DB2 LUW performance topics, and popular frequent speaker at IBM IOD and IDUG Conferences. He started DBI Software in July 2005 with one simple mission: "Help People!" Eight years later, this simple mission is still DBI's #1 core value, though Mr. Hayes admits, "We are better at helping people with DB2 than their marriages or cars."
Posted June 03, 2013
The business world depends on data. Continuent's Tungsten software offers a database-as-a-service that gives cost-effective MySQL databases the enterprise-class clustering and replication required by business-critical applications. Continuent Tungsten provides 24x7 data availability, performance scaling, and simple management without risky application changes or database upgrades. Continuent Tungsten operates at a fraction of the cost of commercial DBMS servers like Oracle and MS SQL Server.
Posted June 03, 2013
Clustrix is the leading scale-out SQL database engineered for the cloud. The combination of big data and cloud computing has broken the legacy database, creating an industry transition to new scale-out database platforms. Clustrix has delivered on a complete reinvention of the relational database, and the result is a highly differentiated platform that is ideal for the industry transition to cloud computing.
Posted June 03, 2013
Over the past year it has been great to see the "Big Data" moniker lose some of its glamour as a catch-all phrase. Fortunately, Hadoop's role is now well understood: processing massive amounts of old data for analytics use cases that are not sensitive to latency. But increasing numbers of companies are building new applications that drive real competitive advantage and disrupt established markets (on the Web, over mobile, or in the Cloud), generating massive amounts of "Big Data." In these new cases, latency matters.
Posted June 03, 2013
NuoDB, Inc., a provider of a cloud data management system offering SQL compliance and guaranteed ACID transactions, has introduced the NuoDB Starlings Release 1.1. Following on from its 1.0 release in January, NuoDB's Starlings Release 1.1 focuses on overall usability in three key areas, Seth Proctor, NuoDB chief architect, tells DBTA. The enhancements focus on greater Microsoft Windows support, general performance and stability, and an improved development and management experience in the web console, says Proctor.
Posted May 23, 2013
Cloudant, a provider of a globally distributed database-as-a-service (DBaaS), has announced $12 million in Series B funding from Devonshire Investors, the private equity firm affiliated with Fidelity Investments; Rackspace Hosting, the open cloud leader; and Toba Capital. The company also announced that current investors (Avalon Ventures, In-Q-Tel, and Samsung Venture Investment Corporation) purchased additional shares. The funding will be used to support Cloudant's global expansion and grow the company's support, service, and go-to-market strategies.
Posted May 21, 2013
Key findings from a new study, "Big Data Opportunities," will be presented at Big Data Boot Camp at the Hilton New York. Big Data Boot Camp will kick off at 9 am on Tuesday, May 21, with a keynote from John O'Brien, founder and principal of Radiant Advisors, on the dynamics and current issues being faced in today's big data analytic implementations. Directly after the opening address, David Jonker, senior director of Big Data Marketing, SAP, will showcase the results of the new big data survey, which revealed a variety of practical approaches that organizations are adopting to manage and capitalize on big data. The study was conducted by Unisphere Research, a division of Information Today, Inc., and sponsored by SAP.
Posted May 16, 2013
Data is not sedentary. Once data has been created, organizations tend to move it around to support many different purposes—different applications, different geographies, different users, different computing environments, and different DBMSs. Data is copied and transformed and cleansed and duplicated and stored many times throughout the organization. Different copies of the same data are used to support transaction processing and analysis; test, quality assurance, and operational systems; day-to-day operations and reporting; data warehouses, data marts, and data mining; and distributed databases. Controlling this vast sea of data falls on the DBA who uses many techniques and technologies to facilitate data movement and distribution.
Posted May 09, 2013
When you decide to undertake your own benchmarking project, it is a strongly recommended best practice to write up a benchmarking plan. A benchmark must produce results that are both reliable and repeatable, so that the conclusions drawn from them are predictable and actionable. Keeping the "reliable and repeatable" mantra in mind necessitates a few extra steps.
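As one illustration of the "reliable and repeatable" idea, here is a minimal sketch in Python against a throwaway SQLite database; the table, the query, and the run counts are hypothetical choices for illustration, not a prescribed methodology.

    import sqlite3
    import statistics
    import time

    # Hypothetical workload: build the same dataset every run, execute the same
    # query repeatedly, and report summary statistics instead of a single timing,
    # so results can be reproduced and compared across runs.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO orders (amount) VALUES (?)",
                     [(i * 0.5,) for i in range(100_000)])

    def run_once():
        start = time.perf_counter()
        conn.execute("SELECT COUNT(*), AVG(amount) FROM orders").fetchone()
        return time.perf_counter() - start

    timings = [run_once() for _ in range(20)]  # repeat to expose run-to-run variance
    print(f"median {statistics.median(timings):.6f}s, stdev {statistics.stdev(timings):.6f}s")

Recording the full list of timings, not just the best run, is part of what a written benchmarking plan should call for.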
Posted May 09, 2013