Data Modeling

Data modeling and database design and development – including popular approaches such as Agile and Waterfall – provide the basis for the visualization and management of business data in support of initiatives such as big data analytics, business intelligence, data governance, data security, and other enterprise-wide data-driven objectives.



Data Modeling Articles

Database management systems support numerous date and time functions - and while the date-related functions are many, they do not go far enough. One date-driven circumstance often encountered involves objects that need a date range associated with them. While there are some exceptions, this need generally ends up implemented via two distinct date columns—one signaling the "start" and the other designating the "end." Perhaps, should the creative juices of DBMS builders flow, such things as numeric-range data types could be created in addition to a date-range data type. Who knows where things could end up?
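
A minimal sketch of the common two-column pattern, written here in Python with SQLite; the table and column names are purely illustrative. (PostgreSQL's built-in daterange and numrange types are among the exceptions alluded to above.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The typical workaround: two distinct date columns standing in for a
# single date-range value.
conn.execute("""
    CREATE TABLE product_price (
        product_id      INTEGER NOT NULL,
        list_price      NUMERIC NOT NULL,
        effective_date  DATE    NOT NULL,  -- the "start" of the range
        expiration_date DATE,              -- the "end"; NULL means open-ended
        PRIMARY KEY (product_id, effective_date),
        CHECK (expiration_date IS NULL OR expiration_date >= effective_date)
    )
""")
conn.execute("INSERT INTO product_price VALUES (1, 19.99, '2013-01-01', '2013-06-30')")
conn.execute("INSERT INTO product_price VALUES (1, 24.99, '2013-07-01', NULL)")

# Every "as of" lookup must restate the range logic by hand.
row = conn.execute("""
    SELECT list_price FROM product_price
    WHERE product_id = 1
      AND effective_date <= '2013-08-15'
      AND (expiration_date IS NULL OR expiration_date >= '2013-08-15')
""").fetchone()
print(row[0])  # 24.99
```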

Posted September 11, 2013

Data models attempt to express the business rules of an organization. A good data model reflects the semantics used within an organization to such an extent that business people within that organization can relate to and easily agree with what is being expressed. In this regard, the data modeler's goal is to mirror the organization's concepts back to the people within the organization. The goal is not to force an organization into a "standard" data model, nor is it to abstract everything into a master model that will never need to change even if the business rules shift drastically.

Posted September 03, 2013

One of the principles within relational theory is that each entity's row or tuple be uniquely identifiable. This means the defined structure includes some combination of attributes whose populated values serve to identify an individual row within the table/relation. This attribute, or combination of attributes, is a candidate key for the structure. If a structure has a single candidate key, it serves as the primary key; if a structure has multiple candidate keys, one of them is designated as the primary key. When building up a logical design, primary keys should be identified by the actual data points in play.
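
As a sketch of the distinction, the hypothetical table below has two candidate keys; one is designated the primary key and the other is still enforced as an alternate key (Python with SQLite, names invented for illustration).

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Two candidate keys: employee_number and national_id each uniquely
# identify a row. One is designated the primary key; the other remains
# enforced as an alternate key.
conn.execute("""
    CREATE TABLE employee (
        employee_number INTEGER PRIMARY KEY,   -- designated primary key
        national_id     TEXT NOT NULL UNIQUE,  -- alternate candidate key
        full_name       TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO employee VALUES (100, '123-45-6789', 'Pat Smith')")

# Duplicating either candidate key's value is rejected.
try:
    conn.execute("INSERT INTO employee VALUES (101, '123-45-6789', 'Lee Jones')")
except sqlite3.IntegrityError as exc:
    print(exc)  # UNIQUE constraint failed: employee.national_id
```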

Posted August 06, 2013

Dell has released Toad for Oracle 12.0, which provides developers and DBAs with a key new capability - a seamless connection to the Toad World user community, so they no longer have to exit the tool and open a browser to gain access to the community. "The actual strength of the product has always been the input of users," John Whittaker, senior director of marketing for the Information Management Group at Dell Software, tells 5 Minute Briefing. The new ability to access the Toad World community from within Toad enables database professionals to browse, search, ask questions and start discussions directly in the Toad forums, all while using Toad.

Posted June 19, 2013

The grain of a fact table is derived from the dimensions with which the fact is associated. For example, should a fact have associations with a Day dimension, a Location dimension, a Customer dimension, and a Product dimension, then the fact would usually be described as being at a "by Day," "by Location," "by Customer," "by Product" metrics level. Evidence of this specific level of granularity is seen in the fact table's primary key being the composite of the Day dimension key, Location dimension key, Customer dimension key, and Product dimension key. However, this granularity and these relationships are easily disrupted.
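
A sketch of such a fact table in Python with SQLite, where the composite primary key declares and enforces the grain; all names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE day_dim      (day_key      INTEGER PRIMARY KEY, calendar_date DATE);
    CREATE TABLE location_dim (location_key INTEGER PRIMARY KEY, city TEXT);
    CREATE TABLE customer_dim (customer_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE product_dim  (product_key  INTEGER PRIMARY KEY, sku  TEXT);

    -- The composite primary key documents and enforces the
    -- by-Day/Location/Customer/Product grain: only one fact row may
    -- exist per combination of the four dimension keys.
    CREATE TABLE sales_fact (
        day_key      INTEGER NOT NULL REFERENCES day_dim,
        location_key INTEGER NOT NULL REFERENCES location_dim,
        customer_key INTEGER NOT NULL REFERENCES customer_dim,
        product_key  INTEGER NOT NULL REFERENCES product_dim,
        sales_amount NUMERIC NOT NULL,
        units_sold   INTEGER NOT NULL,
        PRIMARY KEY (day_key, location_key, customer_key, product_key)
    );
""")
```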

Posted June 13, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 13, 2013

It seems that juggling is the most useful of all skills when embarking on a data warehousing project. During the discovery and analysis phase, the workload grows insanely large, like some mutant science fiction monster. Pressures to deliver can encourage rampant corner-cutting to move quickly, while the need to provide value urges caution in order not to throw out the proverbial baby with the bath water as the project speeds along. Change data capture is one glaring example of the necessary juggling and balancing.

Posted May 22, 2013

Datawatch Corporation, provider of information optimization solutions, has announced a strategic partnership with Lavastorm Analytics, an analytics software vendor, to provide customers the ability to expand their use of unstructured and semi-structured data sources when developing analytic applications.

Posted May 07, 2013

Dimensions are the workhorses of a multidimensional design. They are used to manage the numeric content being analyzed. It is through the use of dimensions that the metrics can be sliced, diced, drilled down, filtered and sorted. Many people relate to dimensions by thinking of them as reference tables. Such thoughts aren't exactly accurate. A dimension groups together the textual/descriptor columns within a rationalized business category. Therefore, much of the content coming from relational tables may be sourced from reference tables, but the relationship between each source reference table and the targeted dimension is unlikely to be one-to-one. These grouped-format dimensions often contain one or more hierarchies of related data items used within the OLAP queries supported by the structures.
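
As a rough illustration, the hypothetical product dimension below groups descriptor columns that might originate in several source reference tables, and carries a category > subcategory > product hierarchy used for slicing (Python with SQLite; names invented).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- One dimension grouping descriptors that could come from several
    -- source reference tables (product, subcategory, category).
    CREATE TABLE product_dim (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        subcategory  TEXT,
        category     TEXT
    );
    CREATE TABLE sales_fact (
        product_key  INTEGER REFERENCES product_dim,
        sales_amount NUMERIC
    );
    INSERT INTO product_dim VALUES (1, 'Cola 12oz',  'Soft Drinks', 'Beverages');
    INSERT INTO product_dim VALUES (2, 'Dark Roast', 'Coffee',      'Beverages');
    INSERT INTO sales_fact VALUES (1, 100), (2, 250);
""")

# Slicing the metric at the category level of the hierarchy.
for row in conn.execute("""
    SELECT d.category, SUM(f.sales_amount)
    FROM sales_fact f JOIN product_dim d USING (product_key)
    GROUP BY d.category
"""):
    print(row)  # ('Beverages', 350)
```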

Posted April 10, 2013

Do not allow well-meaning but confused proponents to obscure concepts related to normalization and dimensional design. Under a normalized approach, one usually would not expect numeric data items and textual data items to fall into different logical relations when they relate to the same entity object. Yet within a multidimensional approach, that is exactly what happens. Multidimensional design and normalized design are not the same, and one should not claim that both approaches were used and that they resulted in the same data model.
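
A small sketch of the contrast, with invented names (Python with SQLite): the normalized relation keeps an order's numeric and textual attributes together, while the dimensional version splits the measure into the fact and the descriptor into a dimension.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Normalized: numeric and textual attributes of the order live together.
conn.execute("""
    CREATE TABLE order_normalized (
        order_id     INTEGER PRIMARY KEY,
        order_status TEXT,     -- textual descriptor
        order_amount NUMERIC   -- numeric measure
    )
""")

# Dimensional: the same attributes are split by kind; the measure lands
# in the fact, the descriptor lands in a dimension.
conn.executescript("""
    CREATE TABLE order_status_dim (
        status_key   INTEGER PRIMARY KEY,
        order_status TEXT
    );
    CREATE TABLE order_fact (
        order_id     INTEGER,
        status_key   INTEGER REFERENCES order_status_dim,
        order_amount NUMERIC
    );
""")
```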

Posted March 14, 2013

Establishing a data warehousing or business intelligence environment initiates a process that works its way through the operational applications and data sources across an enterprise. This process focuses not only on identifying the important data elements the business lives and breathes, but also on rationally explaining these elements to business intelligence users.

Posted February 27, 2013

Sonata Software, an IT consulting and software services provider headquartered in Bangalore, India, has announced its center of excellence (CoE) for Exalytics, Oracle's engineered system designed for high performance data analysis, modeling and planning.

Posted February 20, 2013

Multidimensional design involves dividing the world into dimensions and facts. However, like many aspects of language, the term "fact" is used in multiple ways. Initially, the term referred to the table structure housing the numeric values for the metrics to be analyzed. But "fact" is also used to refer to the metric values themselves. Therefore, when the unusual circumstance arises wherein a fact table is defined that contains no specific numeric measures, such a structure is referred to by the superficially oxymoronic characterization of a "factless fact."
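
A minimal sketch of a factless fact (Python with SQLite, illustrative names): the table records only that an event occurred, and analysis counts rows rather than summing a measure.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# No numeric measure at all: each row simply records that a student
# attended a class on a given day.
conn.execute("""
    CREATE TABLE attendance_fact (
        day_key     INTEGER NOT NULL,
        student_key INTEGER NOT NULL,
        class_key   INTEGER NOT NULL,
        PRIMARY KEY (day_key, student_key, class_key)
    )
""")
conn.execute("INSERT INTO attendance_fact VALUES (20130201, 7, 3)")
conn.execute("INSERT INTO attendance_fact VALUES (20130201, 9, 3)")

# Analysis counts occurrences rather than summing a metric column.
count = conn.execute("""
    SELECT COUNT(*) FROM attendance_fact
    WHERE day_key = 20130201 AND class_key = 3
""").fetchone()[0]
print(count)  # 2
```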

Posted February 12, 2013

Within the information technology sector, the term architect gets thrown around quite a lot. There are software architects, infrastructure architects, application architects, business intelligence architects, data architects, information architects, and more. It seems as if any area may include someone with an "architect" status. Certainly, when laying out plans for a physical building, an architect has a specific meaning and role. But within IT, "architect" is used in a much fuzzier manner.

Posted December 11, 2012

Micro Focus, a provider of enterprise application modernization solutions, announced it is shipping a new release of its COBOL application migration toolset, with support for Microsoft's latest operating system and integrated development environment. Visual COBOL 2.1 also includes many enhancements designed to facilitate an improved application developer experience, as well as deliver an upgrade path for core business applications. "With support for the Windows 8 platform, Visual COBOL provides even greater developer productivity, collaboration, and quality improvements while reducing their costs and time to market," Ed Airey, Micro Focus product marketing director for COBOL Products, tells 5 Minute Briefing.

Posted December 10, 2012

A new educational webcast examines the results of the 2012 IOUG Test, Development & QA Survey, and covers the best practices and issues that it highlights. Mining the data assets gathered from all corners of the enterprise - including transactions, customer data, employee input, and information about market conditions - has been essential to companies in uncovering new opportunities. But in the rush to deliver results, many IT and development departments take shortcuts within the testing process, taking live data right out of production environments to run through testing, development, and quality assurance processes.

Posted November 21, 2012

In writing a definition for an entity, an attribute, or any other element within a database design, the desired end is a descriptive text that is clear, factual and concise. Semantics are an ambiguous and often painful tool to employ. Balancing the need for clarity against the desire to avoid redundancy can be a juggling act that is hard to accomplish. One might not easily recognize what is complete, versus what is lacking, versus what has gone too far. But even so, if within a definition one finds oneself listing valid values and decoding each value's meaning, then one has likely already moved beyond what is "concise." Lists of values easily add bulk and verbiage to a definition, yet such lists do not usually increase its quality.

Posted November 13, 2012

The beauty of a truly wonderful database design is its ability to serve many masters. And good database designers are able to empathize with those who will use their designs. In business intelligence settings, three perspectives deserve consideration when composing designs.

Posted November 06, 2012

Software operates the products and services that we use and rely on in our daily lives, and it is often the competitive differentiator for a business. As software increases in size, complexity, and importance to the business, so do the business demands on development teams. Developers are increasingly accountable for delivering more innovation, under shorter development cycles, without negatively impacting quality. Compounding this complexity is today's norm of geographically distributed teams and code coming in from third-party teams. With so many moving parts, it's difficult for management to get visibility across their internal and external supply chain. Yet without early warning of potential quality risks that could impact release schedules or create long-term technical debt, there may be little time to actually do something about them before the business or customers are impacted.

Posted October 24, 2012

CA Technologies has announced a major new release of the ERwin data modeling solution. This new release, the second in less than a year, provides a collaborative data modeling environment to manage enterprise data using an intuitive, graphical interface. It helps improve data re-use, optimize system quality, accelerate time-to-benefit, and enable appropriate information governance—key objectives for IT organizations serving companies in today's highly competitive and closely regulated markets.

Posted October 16, 2012

It is an understatement to say we're witnessing growth reminiscent of Moore's Law — which states the number of transistors on a chip will double approximately every two years — as we seek to manage the explosion of big data. Given the impact this new wealth of information has on hundreds of millions of business transactions, there's an urgent need to look beyond traditional insight-generation tools and techniques. It's critical we develop new tools and skills to extract the insights that organizations seek through predictive analytics.

Posted October 16, 2012

An educational and interactive webcast will review the findings of the 2012 IOUG Test, Development and QA Survey and discuss the best practices and issues that it highlights. This IOUG study was conducted by Unisphere Research, a division of Information Today, Inc., and sponsored by IBM. Presented by Kimberly Madia, WW product marketing manager at IBM, and Thomas Wilson, president and CEO, Unisphere Research, the webcast will be held Thursday, September 27, from 12 - 1 PM CDT. Webcast attendees will receive a copy of the study report.

Posted September 26, 2012

It seems easy to fall into a state where projects and activities assume such a soft focus that routine takes control: one simply performs the necessary tasks automatically, no questions are raised regarding what is moving through the work-life production line, and everyone is essentially asleep at the switch. Certainly, we may keep one eye open to ensure that, within a broad set of parameters, all is well; but as long as events are basically coloring inside the lines, we continue allowing things to just move along. In this semi-somnambulant state we can easily add columns to tables, or even add new entities and tables, or triggers and procedures, to our databases, and then at some point down the road have someone turn to us and ask, "Why this?" or, "What does this really mean?" At that point, we surprise ourselves with the discovery that the only answer we have is that someone else told us it was what we needed - and we do not really understand why it was needed.

Posted September 11, 2012

The whole world can be divided into two groups: splitters and lumpers. Design battles are waged across conference rooms as debates rage over whether to split or to lump. Splitters take a group of items and divide them into sub-groups and sub-sub-groups, occasionally going so far as to end with each lowest level becoming a group of one. On the other side of the design fence, lumpers combine items until everything is abstracted into group objects covering very broad territory, such as a "Party" construct, or ultimately an "Object" object. Within data modeling, arguments arise over whether to sub-type an entity, or lumping is debated as the grain of a multidimensional fact is proposed. This debate underlies much of the decision-making involved in determining what domains to create within a data model. The split-versus-lump issue is ubiquitous and universal. The question arises across many kinds of choices beyond the entity definition, table grain, or domain grain mentioned above; it is at the heart of deliberations over establishing functions, overriding methods, or composing an organizational structure.
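
For a concrete, if contrived, taste of the two camps (Python with SQLite, names invented): the splitter models customers and employees as separate tables, while the lumper abstracts both into a single "Party" table with a type discriminator.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The splitter's design: one table per kind of party, each carrying its
# own specific attributes.
conn.executescript("""
    CREATE TABLE customer (
        customer_id  INTEGER PRIMARY KEY,
        name         TEXT,
        credit_limit NUMERIC
    );
    CREATE TABLE employee (
        employee_id INTEGER PRIMARY KEY,
        name        TEXT,
        hire_date   DATE
    );
""")

# The lumper's design: one broad "Party" table with a discriminator,
# keeping only the attributes common to every kind of party.
conn.execute("""
    CREATE TABLE party (
        party_id   INTEGER PRIMARY KEY,
        party_type TEXT CHECK (party_type IN ('CUSTOMER', 'EMPLOYEE')),
        name       TEXT
    )
""")
```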

Posted July 11, 2012

SAP has announced that the PowerBuilder Developers Conference will be held October 15-19, 2012, at the Venetian Resort Hotel in Las Vegas, concurrently with SAP TechEd Las Vegas 2012. The conference will consist of an opening keynote at SAP TechEd, followed by a PowerBuilder general session and PowerBuilder technical breakout sessions.

Posted June 27, 2012

In the dim, dark past of data warehousing, there was a time when the argument was put forward that "history does not change." It was posited that once a piece of data was received by the data warehouse, it was sacrosanct and nonvolatile. A fact record, once processed, was to remain unchanged forever. Dimensions, due to their descriptive nature, could be changed following the prescribed Type 1, 2, or 3 update strategies, but that was all. The expectation was that, due to their very nature, fact tables would become huge and, being huge, would give poor update performance - performance so poor that updates would be virtually impossible to enact.
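
As a refresher on the dimension side, here is a minimal Type 2 update sketch (Python with SQLite, invented names): the current row is expired and a new row inserted, so history is preserved rather than overwritten.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer_dim (
        customer_key    INTEGER PRIMARY KEY,  -- surrogate key
        customer_id     INTEGER,              -- natural key
        city            TEXT,
        effective_date  DATE,
        expiration_date DATE,                 -- NULL marks the current row
        is_current      INTEGER
    )
""")
conn.execute("""
    INSERT INTO customer_dim VALUES (1, 500, 'Boston', '2010-01-01', NULL, 1)
""")

# Customer 500 moves to Denver: Type 2 expires the old row and adds a
# new one, rather than overwriting the city in place (which would be Type 1).
conn.execute("""
    UPDATE customer_dim
    SET expiration_date = '2012-05-31', is_current = 0
    WHERE customer_id = 500 AND is_current = 1
""")
conn.execute("""
    INSERT INTO customer_dim VALUES (2, 500, 'Denver', '2012-06-01', NULL, 1)
""")

for row in conn.execute("SELECT * FROM customer_dim ORDER BY customer_key"):
    print(row)  # both the Boston and Denver rows survive
```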

Posted June 13, 2012

It seems only reasonable that what one person can do, others can learn. On the other hand, taking people through training does not usually result in the creation of great new database administrators (DBAs). It often appears as if those who are exceptional at the craft operate at higher levels as they dive into a problem. Can training alone provide folks with the attention to detail, the urge to keep digging, or the ability to recall minutiae that allow them to rise from simply holding the DBA title to becoming a great DBA? Or must the genetic potential exist first, so that one falls into the DBA occupation and astounds those around them? It is very hard to say with any degree of certainty whether great DBAs are made or born; yet again, the battle between nature and nurture arises.

Posted June 06, 2012

Embarcadero Technologies has introduced a new version of its database management and development platform, DB Power Studio XE3, which offers enhancements to further improve the performance and availability of databases.

Posted June 06, 2012

Plans are underway for an event specifically focused on Sybase PowerBuilder and tools that will be separate from TechEd but held at the same time and location, according to Christine Weber, marketing manager, Events, at Sybase, an SAP company. There will also be close to 100 hours of sessions specifically focused on Sybase database and analytics products at SAP TechEd 2012, Weber tells 5 Minute Briefing. "It is a good portion and it is focused on the traditional kind of content that we have always done with Sybase." Sybase-specific content will include tips and tricks on how to use existing products, as well as previews of what's ahead in new product releases. Now that the call for papers has closed, Sybase is going through its approval process for the presentations. Well over 200 presentations were submitted - "a good problem to have," Weber notes.

Posted May 22, 2012

10gen, the company behind MongoDB, has announced its support for MongoDB with Node.js. This includes an official Node.js driver as well as commercial support from 10gen for MongoDB-backed applications developed with Node.js. Node.js joins the existing set of programming languages and environments 10gen supports, including Java, PHP, C#, Ruby, Python, C++, C, Perl, Scala, Haskell and Erlang. Launched in 2009 and sponsored by Joyent, JavaScript-based Node.js is designed to help developers build data-intensive, real-time applications that support large numbers of concurrent users and devices.

Posted May 01, 2012

Quest Software has released version 11.5 of its Toad for Oracle software, the flagship product in the Toad portfolio of productivity software for database developers, DBAs, and analysts. Drawing on community feedback from its two million users, Quest has introduced a number of new features and improvements, most notably a new social intelligence component. Toad for Oracle 11.5, Quest contends, will allow users to take advantage of the best ideas and practices from the community and further increase user productivity.

Posted April 25, 2012

Sybase has announced that Ford Motor Company will centralize all of its logical and physical modeling functions with SAP Sybase PowerDesigner, the data modeling software and metadata management solution for data, information, and enterprise architectures. The solution provides the capability to generate Data Definition Language (DDL) for Ford Motor Company's database platforms, including leading databases such as Sybase ASE, DB2, SQL Server, Teradata, and Oracle.

Posted April 25, 2012

Solution development work is usually accomplished via projects, or a combination of programs and projects. This project perspective often leads to thoughts of documentation as project-owned. And while many documents are project-specific, such as timelines and resource plans, not everything is. Unless projects are established in a fashion whereby each is very limited in scope to the creation or enhancement of a single application or system, specification and design documents belong to the final solution and not to the project.

Posted April 11, 2012

1010data, Inc., provider of an internet-based big data warehouse, has announced the launch of a new software tool that enables 1010data's customers to automatically segment and analyze huge consumer transaction databases and produce statistical models with specificity, even to the level of social groups, families and individuals. For the first stage of the launch, 1010data is making the tool available in an invitational beta release for retail, consumer goods, and mobile telecom companies. "In all consumer-driven industries, customers are demanding to be treated as individuals, not boomers, tweeners, or dinks - dual income, no kids," said Tim Negris, vice president of marketing at 1010data.

Posted March 22, 2012
