(General Reporting Tips) If a query refers to more than one table, all columns should be prefixed by a descriptor (table name or alias) - Answer- Using descriptors ensures you have unambiguous column references, preventing issues that can occur when two tables contain columns with the same name.
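A minimal sketch of this tip, using an in-memory SQLite stand-in (the table and column names are hypothetical, modeled on Caboodle naming conventions, not actual Caboodle columns):

```python
import sqlite3

# Two Caboodle-style tables that share a column name (PatientDurableKey),
# which makes unqualified references to that column ambiguous.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE PatientDim (PatientDurableKey INTEGER, Name TEXT);
    CREATE TABLE EncounterFact (EncounterKey INTEGER, PatientDurableKey INTEGER);
    INSERT INTO PatientDim VALUES (1, 'Ash');
    INSERT INTO EncounterFact VALUES (100, 1);
""")

# Every column is prefixed by a short table alias, so there is no doubt
# which table each PatientDurableKey reference comes from.
rows = conn.execute("""
    SELECT ef.EncounterKey, pd.Name
    FROM EncounterFact AS ef
    INNER JOIN PatientDim AS pd
        ON ef.PatientDurableKey = pd.PatientDurableKey
""").fetchall()
print(rows)  # [(100, 'Ash')]
```

Without the `ef.`/`pd.` prefixes, a bare `PatientDurableKey` in the join or select list could refer to either table and may raise an "ambiguous column" error.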
Chapter 1. (Study Checklist) Caboodle Console - Answer- The Caboodle Console is a web application housed on the Caboodle server. It includes the following:
Dictionary
Dictionary Editor
Executions
Work Queue
Configuration
Chapter 1. (Study Checklist) Data Warehouse - Answer- In a data warehouse, multiple sources may load data pertaining to a single entity. This means that more than one package may populate a given row in a Caboodle table. As a result, there may be multiple business key values associated with a single entity in a Caboodle table.
Chapter 1. (Study Checklist) ETL - Answer- Extract, Transform, Load
Chapter 1. (Study Checklist) SSIS Package - Answer- The architecture of Caboodle includes a staging database and a reporting database. Data is extracted from source systems (like Clarity), transformed in the staging database, and presented for users in the reporting database. This movement of data is realized via a set of SQL Server Integration Services (SSIS) packages.
Chapter 1. (Study Checklist) Data Lineage - Answer- Generally, data lineage refers to the process of identifying the source of a specific piece of information. In Caboodle, data lineage is defined at the package level.
Chapter 1. (Study Checklist) Star Schema - Answer- The standard schema for a dimensional data model. The name refers to the image of a fact table surrounded by many linked dimension tables, which loosely resembles a star.
The Caboodle data model structure is based on a "star schema" ‐ where one central fact table will join to many associated lookup or dimension tables. This structure provides the foundation of the Caboodle data model.
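The star shape above can be sketched as one fact table joined to several dimension tables. This is an in-memory SQLite illustration with hypothetical table names modeled on Caboodle conventions, not actual Caboodle tables:

```python
import sqlite3

# Star-schema sketch: a central fact table (the reportable event)
# joined to two dimension tables that provide context for the event.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE PatientDim    (PatientKey INTEGER, Name TEXT);
    CREATE TABLE DepartmentDim (DepartmentKey INTEGER, DepartmentName TEXT);
    CREATE TABLE EncounterFact (EncounterKey INTEGER,
                                PatientKey INTEGER,
                                DepartmentKey INTEGER);
    INSERT INTO PatientDim    VALUES (1, 'Ash');
    INSERT INTO DepartmentDim VALUES (10, 'Cardiology');
    INSERT INTO EncounterFact VALUES (100, 1, 10);
""")

# The fact row is the event; each joined dimension adds context to it.
row = conn.execute("""
    SELECT ef.EncounterKey, pd.Name, dd.DepartmentName
    FROM EncounterFact AS ef
    JOIN PatientDim    AS pd ON pd.PatientKey    = ef.PatientKey
    JOIN DepartmentDim AS dd ON dd.DepartmentKey = ef.DepartmentKey
""").fetchone()
print(row)  # (100, 'Ash', 'Cardiology')
```

A real fact table typically joins to many more dimensions; the pattern is the same, with each dimension hanging off the fact by its surrogate key.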
Chapter 1. (Study Checklist) DMC - Answer- DATA MODEL COMPONENT
No table in Caboodle "stands alone." Each is considered part of a Data Model Component, which refers to the collection of metadata tables that support the ETL process and reporting views stored in the FullAccess schema.
Each DMC gets a type. Strict table naming conventions are followed in Caboodle, so that a table's suffix provides information about its structure and purpose.
These suffixes are:
· Dim for dimensions (e.g. PatientDim)
· Fact for facts (e.g. EncounterFact)
· Bridge for bridges (e.g. DiagnosisBridge)
· DataMart for data marts (e.g. HospitalReadmissionDataMart)
· AttributeValueDim for EAV tables (e.g. PatientAttributeValueDim)
· X for custom tables (e.g. CustomFactX)
Chapter 1. (Study Checklist) Staging Database - Answer- The Caboodle database into which records are loaded by SSIS packages and stored procedures.
Chapter 1. (Study Checklist) Reporting Database - Answer- The Caboodle database that presents data to users for reporting. Data is extracted from source systems (like Clarity), transformed in the staging database, and presented for users in the reporting database. This movement of data is realized via a set of SQL Server Integration Services (SSIS) packages.
Chapter 1. (Study Checklist) Dbo Schema - Answer- STAGING DATABASE
Import tables and Mapping tables live here. This is primarily used by administrators for moving data into Caboodle.
REPORTING DATABASE
The dbo schema stores reporting data and acts as the data source for SlicerDicer. The Caboodle Dictionary reflects the contents of the dbo schema.
Chapter 1. (Study Checklist) FullAccess Schema - Answer- STAGING DATABASE
The FullAccess schema does not exist on the Staging database.
REPORTING DATABASE
The FullAccess schema houses views that simplify reporting. FullAccess should be your default schema when reporting.
(ETL Terms) Execution - Answer- An execution is the process that extracts data from a source system using packages, transforms the data in the staging database, and loads it to Caboodle for reporting. You create and run executions in the Caboodle Console.
(ETL Terms) Extract - Answer- Extracts to Caboodle from Clarity can be either backfill or incremental. Backfill extracts load or reload every row in a table from Clarity, whereas incremental extracts load only changed rows. Existing data is available while extracts are in progress.
(ETL Terms) Package - Answer- A package is a definition of an extract of data from one specific source to a specific import table. For example, a fact might have packages for Epic inpatient data, Epic outpatient data, and several non-Epic data sources. Packages are defined in SSIS .dtsx files.
Chapter 1. (Study Checklist) Identify key characteristics of the dimensional data model. - Answer- MADE for report writers.
· Simpler and more intuitive.
· Easily extensible.
· More performant.
Chapter 1. (Study Checklist) Identify documentation resources for reporting out of Caboodle - Answer- Caboodle Dictionary
Reporting with Caboodle document
Caboodle ER diagram
Chapter 1. (Study Checklist) Identify reporting needs that best fit Caboodle - Answer- Custom data packages can be written by Caboodle developers to accommodate your organization's reporting needs.
(General Reporting Tips) Add a filter to most queries to exclude Caboodle's special rows for unspecified, not applicable, and deleted records, which have surrogate keys of -1, -2, and -3 - Answer- Include only rows where the key is greater than 0.
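A sketch of this filter, using an in-memory SQLite stand-in (the table name and placeholder row labels are hypothetical; only the -1/-2/-3 surrogate keys come from the card above):

```python
import sqlite3

# Dimension table seeded with Caboodle's three special rows
# (surrogate keys -1, -2, -3 for unspecified, not applicable, and
# deleted records) plus one real row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE PatientDim (PatientKey INTEGER, Name TEXT);
    INSERT INTO PatientDim VALUES (-1, '*Unspecified'),
                                  (-2, '*Not Applicable'),
                                  (-3, '*Deleted'),
                                  (1, 'Ash');
""")

# "Key greater than 0" excludes all three special rows in one predicate.
rows = conn.execute(
    "SELECT PatientKey, Name FROM PatientDim WHERE PatientKey > 0"
).fetchall()
print(rows)  # [(1, 'Ash')]
```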
(General Reporting Tips) Caboodle has a numbers table, NumbersDim, that you can use as needed in your reports - Answer- NumbersDim contains the integers from 1 to 1,000,000, which you can reference to help manipulate strings and complete other processes. If you need more than 1,000,000 rows to accomplish a task, you can refer to NumbersDim multiple times in your query.
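The cross-join trick for exceeding the table's row count can be sketched with a miniature tally table in SQLite (the demo table holds only 1 through 10, and the column name NumberValue is an assumption, not confirmed from the Caboodle Dictionary):

```python
import sqlite3

# Miniature NumbersDim-style tally table: the integers 1..10
# (Caboodle's real NumbersDim holds 1..1,000,000).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE NumbersDim (NumberValue INTEGER)")
conn.executemany("INSERT INTO NumbersDim VALUES (?)",
                 [(n,) for n in range(1, 11)])

# Referencing the table twice (a cross join with itself) multiplies
# the row count: 10 x 10 = 100 rows from a 10-row table. The same
# pattern takes the real NumbersDim past 1,000,000 rows.
count = conn.execute("""
    SELECT COUNT(*)
    FROM NumbersDim AS a
    CROSS JOIN NumbersDim AS b
""").fetchone()[0]
print(count)  # 100
```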
Chapter 1. (Study Checklist) How does Epic data flow into Caboodle - Answer- Epic data moves between several databases before it gets to Caboodle.
CHRONICLES flows into CLARITY via ETL. After transformation, the data is stored in a relational database on a separate server. Even though the structure of the Chronicles and Clarity databases differ significantly, the ETL process preserves the relationships mapped in Chronicles.
CLARITY flows into CABOODLE via ETL. Data is extracted from Clarity, transformed in the staging database, and presented for users in the reporting database. This movement of data is realized via a set of SQL Server Integration Services (SSIS) packages.
Chapter 1. (Study Checklist) How does Non-Epic data flow into Caboodle - Answer- The Caboodle developer designs custom DMCs (Data Model Components) and writes SSIS
packages to bring additional data into the warehouse. This may be additional Epic data from Clarity or non‐Epic data from 3rd party sources.
CHAPTER 1. (Reviewing the Chapter) What are the differences between a normalized and dimensional data model? - Answer- In a normalized data model, the focus is on not repeating data, which reduces the size of the
database.
In a dimensional data model, the focus is on ease of reporting and uses the star schema, which focuses on a central fact table pertaining to a reportable event and surrounding dimension tables providing context for the event.
CONTEXT...It gives CONTEXT
CHAPTER 1. (Reviewing the Chapter) Briefly define the roles of the Caboodle report writer, administrator, and developer. - Answer- · The Caboodle report writer queries data that already exists in the database. They use their knowledge of the tools and the source database(s) to conduct research into the necessary data points required for a given report.
· The Caboodle administrator uses the Caboodle Console to manage and monitor the ETL process. They troubleshoot ETL errors and handle configuration steps for the database.
· The Caboodle developer designs custom DMCs (Data Model Components) and writes SSIS packages to bring additional data into the warehouse. This may be additional Epic data from Clarity or non‐Epic data from 3rd party sources.
CHAPTER 1. (Reviewing the Chapter) TRUE or FALSE: Naming conventions are enforced in Caboodle. - Answer- True. Naming conventions, such as the Fact suffix, are enforced in Caboodle.
CHAPTER 1. (Reviewing the Chapter) What is the relationship between SlicerDicer and Caboodle? - Answer- SlicerDicer is Epic's self-service reporting tool in Hyperspace that dynamically queries Caboodle data.
Chapter 1. (After-Class Exercise) Wh