Erwin Data Modeler Tutorial
'Data Modeling Interview Questions' are useful for preparing for Data Modeler / Data Architect interviews. These questions cover dimensional and ER modeling with ERwin and are useful for beginners as well as experienced data modeling professionals.
CA ERwin Data Modeler
CA ERwin Data Modeler is a program that provides a powerful way to visualize data from multiple sources across the organization, increasing efficiency through reuse and standards while also improving data quality and providing a unified view of strategic data assets. It supports forward and reverse engineering, Complete Compare, and design layers.
Q) Erwin Tutorial
AllFusion Erwin Data Modeler, commonly known as Erwin, is a powerful and leading data modeling tool from Computer Associates. Computer Associates delivers several software products for enterprise management, storage management, security, application life cycle management, data management, and business intelligence.
Erwin makes database creation very simple by generating DDL (SQL) scripts from a data model using its Forward Engineering technique, or it can be used to create data models from an existing database using its Reverse Engineering technique.
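For illustration, here is a minimal sketch of the kind of DDL script Forward Engineering might generate from a simple two-entity model; the table and column names are hypothetical and will differ for any real model and target database.

-- Hypothetical Forward Engineering output for a CUSTOMER / ORDER model
CREATE TABLE CUSTOMER (
    CUSTOMER_KEY   INTEGER       NOT NULL,
    CUSTOMER_NAME  VARCHAR(100)  NOT NULL,
    CONSTRAINT CUSTOMER_PK01 PRIMARY KEY (CUSTOMER_KEY)
);

CREATE TABLE CUSTOMER_ORDER (
    ORDER_KEY      INTEGER       NOT NULL,
    CUSTOMER_KEY   INTEGER       NOT NULL,
    ORDER_DT       DATE          NOT NULL,
    CONSTRAINT CUSTOMER_ORDER_PK01 PRIMARY KEY (ORDER_KEY),
    CONSTRAINT CUSTOMER_ORDER_FK01 FOREIGN KEY (CUSTOMER_KEY)
        REFERENCES CUSTOMER (CUSTOMER_KEY)
);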
The Erwin workspace consists of the following main areas:
•Logical: In this view, the data model represents business requirements such as entities, attributes, etc.
•Physical: In this view, the data model represents physical structures such as tables, columns, datatypes, etc.
•ModelMart: Many users can work on the same data model concurrently.
Q) What can be done with Erwin?
1. Logical, physical and dimensional data models can be created.
2. Data models can be created from existing systems (RDBMS, DBMS, files, etc.).
3. Different versions of a data model can be compared.
4. A data model and a database can be compared.
5. SQL scripts can be generated to create databases from the data model.
6. Reports can be generated in different file formats like .html, .rtf, and .txt.
7. Data models can be opened and saved in several different file types like .er1, .ert, .bpx, .xml, .ers, .sql, .cmt, .df, .dbf, and .mdb files.
8. By using ModelMart, concurrent users can work on the same data model.
In order to create data models in Erwin, you need to have AllFusion Erwin Data Modeler installed on your system. If you have installed ModelMart, then more than one user can work on the same model.
Q) What is the Data Modeling Development Cycle?
Gathering Business Requirements - First Phase: Data modelers have to interact with business analysts to get the functional requirements and with end users to find out the reporting needs.
Conceptual Data Modeling (CDM) - Second Phase:
This data model includes all major entities and relationships; it does not contain much detail about attributes and is often used in the INITIAL PLANNING PHASE.
Logical Data Modeling (LDM) - Third Phase:
This is the actual implementation of the conceptual model as a logical data model. A logical data model is the version of the model that represents all of the business requirements of an organization.
Physical Data Modeling (PDM) - Fourth Phase:
This is a complete model that includes all required tables, columns, relationships, and database properties for the physical implementation of the database.
Database - Fifth Phase:
DBAs instruct the data modeling tool to generate SQL code from the physical data model. The SQL code is then executed on the server to create the databases.
Q) Standardization Needs in Data Modeling
Several data modelers may work on the different subject areas of a data model, and all of them should use the same naming conventions when writing definitions and business rules.
Nowadays, business-to-business (B2B) transactions are quite common, and standardization helps in understanding the business in a better way. Inconsistency across column names and definitions would create chaos across the business.
For example, when a data warehouse is designed, it may get data from several source systems, and each source may have its own names, data types, etc. These anomalies can be eliminated if proper standardization is maintained across the organization.
Table Names Standardization:
Giving a full name to a table gives an idea of the data it holds. Generally, do not abbreviate table names; however, this may differ according to the organization's standards. If a table name's length exceeds the database limits, then abbreviate it. Some general guidelines are listed below that may be used as a prefix or suffix for the table.
Examples:
Lookup – LKP - Used for Code, Type tables by which a fact table can be directly accessed.
e.g. Credit Card Type Lookup – CREDIT_CARD_TYPE_LKP
Fact – FCT - Used for transaction tables:
e.g. Credit Card Fact - CREDIT_CARD_FCT
Cross Reference – XREF – Tables that resolve many-to-many relationships.
e.g. Credit Card Member XREF – CREDIT_CARD_MEMBER_XREF
History – HIST - Tables that store history.
e.g. Credit Card Retired History – CREDIT_CARD_RETIRED_HIST
Statistics – STAT - Tables that store statistical information.
e.g. Credit Card Web Statistics – CREDIT_CARD_WEB_STAT
Column Names Standardization:
Some general guidelines are listed below that may be used as a prefix or suffix for the column.
Examples:
Key – KEY - System-generated surrogate key.
e.g. Credit Card Key – CRDT_CARD_KEY
Identifier – ID - Character column that is used as an identifier.
e.g. Credit Card Identifier – CRDT_CARD_ID
Code – CD - Numeric or alphanumeric column that is used as an identifying attribute.
e.g. State Code – ST_CD
Description – DESC - Description for a code, identifier or a key.
e.g. State Description – ST_DESC
Indicator – IND – to denote indicator columns.
e.g. Gender Indicator – GNDR_IND
Database Parameters Standardization:
Some general guidelines are listed below that may be used for other physical parameters.
Examples:
Index – IDX – for index names.
e.g. Credit Card Fact IDX01 – CRDT_CARD_FCT_IDX01
Primary Key – PK – for Primary key constraint names.
e.g. Credit Card Fact PK01 – CRDT_CARD_FCT_PK01
Alternate Keys – AK – for Alternate key names.
e.g. Credit Card Fact AK01 – CRDT_CARD_FCT_AK01
Foreign Keys – FK – for Foreign key constraint names.
e.g. Credit Card Fact FK01 – CRDT_CARD_FCT_FK01
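As a hypothetical illustration of the conventions above, the DDL fragment below names a lookup table, a fact table, their columns, keys, and an index using the suggested suffixes; all object names are invented for the example.

-- Lookup (LKP) and fact (FCT) tables named per the standards above
CREATE TABLE CREDIT_CARD_TYPE_LKP (
    CRDT_CARD_TYPE_CD    VARCHAR(10)   NOT NULL,   -- code (CD)
    CRDT_CARD_TYPE_DESC  VARCHAR(100)  NOT NULL,   -- description (DESC)
    CONSTRAINT CREDIT_CARD_TYPE_LKP_PK01 PRIMARY KEY (CRDT_CARD_TYPE_CD)
);

CREATE TABLE CREDIT_CARD_FCT (
    CRDT_CARD_KEY      INTEGER      NOT NULL,      -- surrogate key (KEY)
    CRDT_CARD_ID       VARCHAR(20)  NOT NULL,      -- identifier (ID)
    CRDT_CARD_TYPE_CD  VARCHAR(10)  NOT NULL,
    ACTIVE_IND         CHAR(1)      NOT NULL,      -- indicator (IND)
    CONSTRAINT CRDT_CARD_FCT_PK01 PRIMARY KEY (CRDT_CARD_KEY),     -- PK constraint name
    CONSTRAINT CRDT_CARD_FCT_FK01 FOREIGN KEY (CRDT_CARD_TYPE_CD)  -- FK constraint name
        REFERENCES CREDIT_CARD_TYPE_LKP (CRDT_CARD_TYPE_CD)
);

CREATE INDEX CRDT_CARD_FCT_IDX01 ON CREDIT_CARD_FCT (CRDT_CARD_TYPE_CD);  -- index name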
Q)Steps to create a Data Model
These are general guidelines for creating a standard data model; in practice, a data model may not be created in exactly this sequence. Based on the enterprise's requirements, some of the steps may be excluded, or additional steps included.
Sometimes a data modeler may be asked to develop a data model based on an existing database. In that situation, the data modeler has to reverse engineer the database to create the data model.
1» Get Business requirements.
2» Create High Level Conceptual Data Model.
3» Create Logical Data Model.
4» Select target DBMS where data modeling tool creates the physical schema.
5» Create standard abbreviation document according to business standard.
6» Create domain.
7» Create Entity and add definitions.
8» Create attribute and add definitions.
9» Based on the analysis, try to create surrogate keys, super types and sub types.
10» Assign datatype to attribute. If a domain is already present then the attribute should be attached to the domain.
11» Create primary or unique keys to attribute.
12» Create check constraint or default to attribute.
13» Create unique index or bitmap index to attribute.
14» Create foreign key relationship between entities.
15» Create Physical Data Model.
16» Add database properties to physical data model.
17» Create SQL Scripts from Physical Data Model and forward that to DBA.
18» Maintain Logical & Physical Data Model.
19» For each release (version of the data model), try to compare the present version with the previous version of the data model. Similarly, try to compare the data model with the database to find out the differences.
20» Create a change log document for differences between the current version and previous version of the data model.
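To make steps 9 through 13 concrete, here is a hypothetical DDL fragment showing the kind of objects those steps produce (a surrogate key, datatypes, a primary key, a default, a check constraint, and a unique index). The names are invented for the example; a foreign key (step 14) would be added as in the earlier Forward Engineering sketch, and bitmap indexes are specific to databases such as Oracle.

-- Hypothetical result of steps 9-13 for a MEMBER entity
CREATE TABLE MEMBER (
    MEMBER_KEY  INTEGER      NOT NULL,                  -- surrogate key (step 9)
    MEMBER_ID   VARCHAR(20)  NOT NULL,                  -- attribute with datatype (step 10)
    ST_CD       CHAR(2)      NOT NULL,
    GNDR_IND    CHAR(1)      DEFAULT 'U' NOT NULL       -- default value (step 12)
                CHECK (GNDR_IND IN ('M', 'F', 'U')),    -- check constraint (step 12)
    CONSTRAINT MEMBER_PK01 PRIMARY KEY (MEMBER_KEY)     -- primary key (step 11)
);

CREATE UNIQUE INDEX MEMBER_IDX01 ON MEMBER (MEMBER_ID); -- unique index (step 13)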
Q)Data Modeler Role
Business Requirement Analysis:
» Interact with Business Analysts to get the functional requirements.
» Interact with end users and find out the reporting needs.
» Conduct interviews and brainstorming discussions with the project team to get additional requirements.
» Gather accurate data by data analysis and functional analysis.
Development of data model:
» Create standard abbreviation document for logical, physical and dimensional data models.
» Create logical, physical and dimensional data models(data warehouse data modelling).
» Document logical, physical and dimensional data models (data warehouse data modelling).
Reports:
» Generate reports from data model.
Review:
» Review the data model with functional and technical team.
Creation of database:
» Create SQL code from the data model and coordinate with DBAs to create the database.
» Check that data models and databases are in sync.
Support & Maintenance:
» Assist developers, ETL, BI team and end users to understand the data model.
» Maintain change log for each data model.
Q) What is Conceptual Data Modeling?
A conceptual data model includes all major entities and relationships and does not contain much detailed information about attributes; it is often used in the INITIAL PLANNING PHASE.
The conceptual data model is created by gathering business requirements from various sources like business documents, discussions with functional teams, business analysts, subject matter experts, and end users who do the reporting on the database. Data modelers create the conceptual data model and forward it to the functional team for review.
Conceptual Data Model - Highlights
•CDM is the first step in constructing a data model using a top-down approach and is a clear and accurate visual representation of the business of an organization.
•CDM visualizes the overall structure of the database and provides high-level information about the subject areas or data structures of an organization.
•CDM discussion starts with the main subject areas of an organization, and then all the major entities of each subject area are discussed in detail.
•CDM comprises entity types and relationships. The relationships between the subject areas and the relationships between the entities in a subject area are drawn with symbolic notation (IDEF1X or IE). In a data model, cardinality represents the relationship between two entities, i.e., a one-to-one, one-to-many, or many-to-many relationship between the entities.
•CDM contains data structures that have not yet been implemented in the database.
•In the CDM discussion, technical as well as non-technical teams project their ideas for building a sound logical data model.
Q)What is Enterprise Data Modeling?
The development of a common consistent view and understanding of data elements and their relationships across the enterprise is referred to as Enterprise Data Modeling. This type of data modeling provides access to information scattered throughout an enterprise under the control of different divisions or departments with different databases and data models.
Enterprise Data Modeling is sometimes called the global business model, and the entire information about the enterprise is captured in the form of entities.
Data Model Highlights
When an enterprise logical data model is transformed to a physical data model, super types and sub types may not carry over as is; i.e., the logical and physical structures of super types and sub types may be entirely different. A data modeler has to change them according to the physical and reporting requirements.
When an enterprise logical data model is transformed to a physical data model, the length of table names, column names, etc. may exceed the maximum number of characters allowed by the database. So a data modeler has to manually edit the physical names according to the database's or organization's standards.
One of the important things to note is the standardization of the data model. Since the same attribute may be present in several entities, the attribute names and data types should be standardized, and a conformed dimension should be used to connect to the same attribute present in several tables.
A standard abbreviation document is a must so that all data structure names are consistent across the data model.
Q) Logical vs Physical Data Model
When a data modeler works with a client, the title may be logical data modeler, physical data modeler, or a combination of both. A logical data modeler designs the data model to suit business requirements, creates and maintains the lookup data, compares versions of the data model, maintains the change log, and generates reports from the data model, whereas a physical data modeler has to know about the source and target database properties.
A physical data modeler should have the technical know-how to create data models from existing databases, to tune the data models with referential integrity, alternate keys, and indexes, and to match indexes to SQL code. It is also good if the physical data modeler knows about replication, clustering, and so on.
The differences between a logical data model and a physical data model are shown below.
Logical vs Physical Data Modeling
A logical data model (LDM) represents business information and defines business rules, whereas a physical data model (PDM) represents the physical implementation of the model in a database.

LDM object          PDM object
Entity              Table
Attribute           Column
Primary Key         Primary Key Constraint
Alternate Key       Unique Constraint or Unique Index
Inversion Entry     Non-Unique Index
Rule                Check Constraint, Default Value
Relationship        Foreign Key
Definition          Comment
Q)Relational vs Dimensional
Relational data modeling is used in OLTP systems, which are transaction oriented, whereas dimensional data modeling is used in OLAP systems, which are analytical. In a data warehouse environment, the staging area is designed on OLTP concepts, since data has to be normalized, cleansed, and profiled before being loaded into the data warehouse or data mart. In an OLTP environment, lookups are stored as independent, detailed tables, whereas these independent tables are merged into a single dimension in an OLAP environment such as a data warehouse.
Relational vs Dimensional
RDM: Data is stored in an RDBMS
DDM: Data is stored in an RDBMS or in multidimensional databases
RDM: Tables are the units of storage
DDM: Cubes are the units of storage
RDM: Data is normalized and used for OLTP; optimized for OLTP processing
DDM: Data is denormalized and used in data warehouses and data marts; optimized for OLAP
RDM: Several tables and chains of relationships among them
DDM: Few tables; fact tables are connected to dimension tables
RDM: Volatile (several updates) and time variant
DDM: Non-volatile and time invariant
RDM: Detailed level of transactional data
DDM: Summary of bulky transactional data (aggregates and measures) used in business decisions
Q)Data Warehouse & Data Mart
A data warehouse is a relational/multidimensional database that is designed for query and analysis rather than transaction processing. A data warehouse usually contains historical data that is derived from transaction data. It separates analysis workload from transaction workload and enables a business to consolidate data from several sources.
In addition to a relational/multidimensional database, a data warehouse environment often consists of an ETL solution, an OLAP engine, client analysis tools, and other applications that manage the process of gathering data and delivering it to business users.
There are three types of data warehouses:
1. Enterprise Data Warehouse - An enterprise data warehouse provides a central database for decision support throughout the enterprise.
2. ODS (Operational Data Store) - This has a broad, enterprise-wide scope, but unlike a real enterprise data warehouse, its data is refreshed in near real time and used for routine business activity.
3. Data Mart - A data mart is a subset of a data warehouse that supports a particular region, business unit, or business function.
Data warehouses and data marts are built on dimensional data modeling where fact tables are connected with dimension tables. This is most useful for users to access data since a database can be visualized as a cube of several dimensions. A data warehouse provides an opportunity for slicing and dicing that cube along each of its dimensions.
Data Mart: A data mart is a subset of data warehouse that is designed for a particular line of business, such as sales, marketing, or finance. In a dependent data mart, data can be derived from an enterprise-wide data warehouse. In an independent data mart, data can be collected directly from sources.
Q)Star Schema in detail
In general, an organization earns money by selling a product or by providing services related to that product. An organization may be located at one place or may have several branches.
When we consider the example of an organization selling products throughout the world, the four major dimensions are product, location, time, and organization. With this example, we will try to provide a detailed explanation of the STAR SCHEMA.
Q)What is Star Schema?
A star schema is a relational database schema for representing multidimensional data. It is the simplest form of data warehouse schema and contains one or more dimension and fact tables. It is called a star schema because the entity-relationship diagram between the dimensions and fact tables resembles a star, where one fact table is connected to multiple dimensions. The center of the star schema consists of a large fact table that points towards the dimension tables. The advantages of a star schema are easy slicing of data, improved query performance, and easy understanding of the data.
Steps in designing Star Schema
•Identify a business process for analysis (like sales).
•Identify measures or facts (sales dollar).
•Identify dimensions for the facts (product dimension, location dimension, time dimension, organization dimension).
•List the columns that describe each dimension (region name, branch name, etc.).
•Determine the lowest level of summary in the fact table (sales dollar).
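Following these steps for the sales example, a hypothetical star schema might look like the sketch below: one fact table with foreign keys to the dimension tables and the sales dollar measure at the lowest level of summary. Only two of the four dimensions are shown; location and organization would follow the same pattern, and all names are illustrative.

-- Two of the four dimensions for the sales star schema
CREATE TABLE PRODUCT_DIM (
    PRODUCT_KEY    INTEGER       NOT NULL PRIMARY KEY,
    PRODUCT_NAME   VARCHAR(100)  NOT NULL,
    CATEGORY_NAME  VARCHAR(100)                      -- hierarchy kept inside the dimension
);

CREATE TABLE TIME_DIM (
    TIME_KEY     INTEGER  NOT NULL PRIMARY KEY,
    CALENDAR_DT  DATE     NOT NULL,
    MONTH_NM     VARCHAR(20),
    QUARTER_NM   VARCHAR(20),
    YEAR_NBR     INTEGER
);

-- Fact table: foreign keys to each dimension plus the measure
CREATE TABLE SALES_FCT (
    PRODUCT_KEY   INTEGER        NOT NULL REFERENCES PRODUCT_DIM (PRODUCT_KEY),
    TIME_KEY      INTEGER        NOT NULL REFERENCES TIME_DIM (TIME_KEY),
    SALES_DOLLAR  DECIMAL(12,2)  NOT NULL,
    PRIMARY KEY (PRODUCT_KEY, TIME_KEY)              -- composite key made up of the foreign keys
);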
Important aspects of Star Schema & Snow Flake Schema
•In a star schema every dimension will have a primary key.
•In a star schema, a dimension table will not have any parent table.
•Whereas in a snowflake schema, a dimension table will have one or more parent tables.
•Hierarchies for the dimensions are stored in the dimension table itself in a star schema.
•Whereas hierarchies are broken into separate tables in a snowflake schema. These hierarchies help to drill the data down from the topmost level to the lowermost level.
Glossary:
Hierarchy
A logical structure that uses ordered levels as a means of organizing data. A hierarchy can be used to define data aggregation; for example, in a time dimension, a hierarchy might be used to aggregate data from the Month level to the Quarter level, from the Quarter level to the Year level. A hierarchy can also be used to define a navigational drill path, regardless of whether the levels in the hierarchy represent aggregated totals or not.
Level
A position in a hierarchy. For example, a time dimension might have a hierarchy that represents data at the Month, Quarter, and Year levels.
Fact Table
A table in a star schema that contains facts and is connected to dimensions. A fact table typically has two types of columns: those that contain facts and those that are foreign keys to dimension tables. The primary key of a fact table is usually a composite key made up of all of its foreign keys.
A fact table might contain either detail level facts or facts that have been aggregated (fact tables that contain aggregated facts are often instead called summary tables). A fact table usually contains facts with the same level of aggregation.
Q)Snowflake Schema in detail
A snowflake schema describes a star schema structure that has been normalized through the use of outrigger tables, i.e., dimension table hierarchies are broken out into simpler tables. In the star schema example we had 4 dimensions (location, product, time, organization) and a fact table (sales).
In the snowflake version of that example there are 4 dimension tables, 4 lookup tables, and 1 fact table, because the hierarchies (category, branch, state, and month) are broken out of the dimension tables (PRODUCT, ORGANIZATION, LOCATION, and TIME) respectively and stored separately. In OLAP, this snowflake approach increases the number of joins and hurts performance when retrieving data. A few organizations normalize the dimension tables to save space, but since dimension tables hold relatively little data, the snowflake approach is often avoided.
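As a hypothetical sketch of the same idea, the product dimension below is snowflaked: its category hierarchy is moved into an outrigger lookup table, so reporting on categories now costs one more join. The names follow the star schema sketch above (including SALES_FCT) and are illustrative only.

-- Category hierarchy broken out of the product dimension (snowflaked)
CREATE TABLE CATEGORY_LKP (
    CATEGORY_KEY   INTEGER       NOT NULL PRIMARY KEY,
    CATEGORY_NAME  VARCHAR(100)  NOT NULL
);

CREATE TABLE PRODUCT_DIM (
    PRODUCT_KEY   INTEGER       NOT NULL PRIMARY KEY,
    PRODUCT_NAME  VARCHAR(100)  NOT NULL,
    CATEGORY_KEY  INTEGER       NOT NULL REFERENCES CATEGORY_LKP (CATEGORY_KEY)
);

-- Sales by category now requires the extra join through the outrigger table
SELECT c.CATEGORY_NAME, SUM(f.SALES_DOLLAR) AS SALES_DOLLAR
FROM   SALES_FCT f
JOIN   PRODUCT_DIM p ON p.PRODUCT_KEY = f.PRODUCT_KEY
JOIN   CATEGORY_LKP c ON c.CATEGORY_KEY = p.CATEGORY_KEY
GROUP BY c.CATEGORY_NAME;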
Q)ETL Tools what to learn?
With the help of ETL tools, we can create powerful target data warehouses without much difficulty. The following are the various areas we have to know and learn in order to use ETL tools.
Software:
» How to install ETL tool on server/client?
Working with an ETL Tool:
» How to work with various options like the designer, mappings, workflows, scheduling, etc.?
» How to work with sources like DBMSs, relational source databases, files, ERPs, etc., and import the source definitions?
» How to import data from data modeling tools, applications, etc.?
» How to work with targets like DBMSs, relational databases, files, ERPs, etc., and import the target definitions?
» How to create target definitions?
» How to create mappings between source definitions and target definitions?
» How to create transformations?
» How to cleanse the source data?
» How to create dimensions, slowly changing dimensions, cubes, etc.?
» How to create and monitor workflows?
» How to configure, monitor, and run the debugger?
» How to view and generate metadata reports?
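To make one of these items concrete, here is a hypothetical SQL sketch of what an ETL job does when it loads a Type 2 slowly changing dimension: the current row is expired and a new row is inserted whenever a tracked attribute changes in the staging table. The table and column names (STG_CUSTOMER, CUSTOMER_DIM, etc.) are invented for the example, and a real ETL tool would typically generate the surrogate key and this logic for you.

-- Step 1: expire the current dimension row when a tracked attribute changed
UPDATE CUSTOMER_DIM
SET    CURRENT_IND = 'N',
       EFF_END_DT  = CURRENT_DATE
WHERE  CURRENT_IND = 'Y'
AND    EXISTS (SELECT 1
               FROM   STG_CUSTOMER s
               WHERE  s.CUSTOMER_ID = CUSTOMER_DIM.CUSTOMER_ID
               AND    s.ADDRESS    <> CUSTOMER_DIM.ADDRESS);

-- Step 2: insert a new current row for new customers and for changed ones
INSERT INTO CUSTOMER_DIM (CUSTOMER_ID, ADDRESS, EFF_START_DT, EFF_END_DT, CURRENT_IND)
SELECT s.CUSTOMER_ID, s.ADDRESS, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   STG_CUSTOMER s
WHERE  NOT EXISTS (SELECT 1
                   FROM   CUSTOMER_DIM d
                   WHERE  d.CUSTOMER_ID = s.CUSTOMER_ID
                   AND    d.CURRENT_IND = 'Y');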