Latest MCSE 70-467 Real Exam Download 21-30


 

QUESTION 21

You are designing a SQL Server Integration Services (SSIS) solution that will load multiple Online Transactional Processing (OLTP) data sources into a SQL Server data mart. You have the following requirements:

Ensure that the process supports the creation of an exception report that details possible duplicate key values, null ratios within columns, and column-length distributions of values.

Ensure that users can generate the exception report in an XML format. Use the minimum development effort.

You need to design the SSIS solution to meet the requirements.

What should you do? (More than one answer choice may achieve the goal. Select the BEST answer.)


A. Use a Data Profiling task. Use a Data Flow task to extract the XML output of the Data Profiling task into a SQL Server table. Query the table to view the exceptions.

B. Use Data Flow tasks to process the clean data.

C. Use a Data Profiling task. Read the exceptions in Data Profile Viewer.

D. Design a stored procedure that examines data for common dirty data patterns. Use an Execute SQL task.

 

Correct Answer: C

 

 

QUESTION 22

You are modifying a star schema data mart that feeds order data from a SQL Azure database into a SQL Server Analysis Services (SSAS) cube. The data mart contains two large tables that include flags and indicators for some orders. There are 100 different flag columns, each with 10 different indicator values. Some flags reuse indicators. The tables both have a granularity that matches the fact table.

 

You have the following requirements:

 

Allow users to slice data by all flags and indicators.

Modify the date dimension table to include a surrogate key of a numeric data type and add the surrogate key to the fact table.

Use the most efficient design strategy for cube processing and queries.

 

You need to modify the schema.

What should you do? (More than one answer choice may achieve the goal. Select the BEST answer.)

A. Define the surrogate key as an INT data type. Combine the distinct flag/indicator combinations into a single dimension.

B. Define the surrogate key as an INT data type. Create a single fact dimension in each table for its flags and indicators.

C. Define the surrogate key as a BIGINT data type. Combine the distinct flag/indicator combinations into a single dimension.

D. Define the surrogate key as a BIGINT data type. Create a single fact dimension in each table for its flags and indicators.

 

Correct Answer: A
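Why A is the best fit: a date dimension surrogate key never needs the range of a BIGINT, and consolidating the distinct flag/indicator combinations into one junk dimension keeps the fact table narrow and cube processing efficient. The sketch below only illustrates the idea; the table and column names (DimOrderFlags, FlagKey, Flag01, Flag02) are assumptions, not names from the scenario.

-- Hypothetical junk dimension holding only the flag/indicator combinations that occur in the data.
CREATE TABLE dbo.DimOrderFlags
(
    FlagKey INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Flag01  TINYINT NOT NULL,   -- indicator value for the first flag
    Flag02  TINYINT NOT NULL    -- indicator value for the second flag; repeat per flag column
);

-- The fact table then carries a single INT foreign key to the junk dimension, for example:
-- ALTER TABLE dbo.FactSales
--     ADD FlagKey INT NOT NULL
--         CONSTRAINT FK_FactSales_DimOrderFlags REFERENCES dbo.DimOrderFlags (FlagKey);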

 

 

QUESTION 23

You need to configure the Scenario attribute to ensure that business users appropriately query the Sales Plan measure.

What should you do? (Each correct answer presents part of the solution. Choose all that apply.)

 

A. Set the AttributeHierarchyVisible property to False.
B. Set the IsAggregatable property to False.

C. Set the Usage property to Parent.

D. Set the DefaultMember property to the Forecast member.
E. Set the AttributeHierarchyEnabled property to False.

F. Set the RootMemberIf property to ParentIsMissing.

 

Correct Answer: BD

 

 

QUESTION 24

Topic 2, Tailspin Toys

 

Background

 

You are the business intelligence (BI) solutions architect for Tailspin Toys.

You produce solutions by using SQL Server 2012 Business Intelligence edition and Microsoft SharePoint Server 2010 Service Pack 1 (SP1) Enterprise edition.

Technical Background

Extract, transform, load (ETL) processes populate the data warehouse every 24 hours.

ETL Processes

 

One SQL Server Integration Services (SSIS) package is designed and developed to populate each data warehouse table. The primary source of data is extracted from a SQL Azure database.

Secondary data sources include a Microsoft Dynamics CRM 2011 on-premises database.

 

ETL developers develop packages by using the SSIS project deployment model. The ETL developers are responsible for testing the packages and producing a deployment file. The deployment file is given to the ETL administrators. The ETL administrators belong to a Windows security group named SSISOwners that maps to a SQL Server login named SSISOwners.

 

Data Models

 

The IT department has developed and manages two SQL Server Analysis Services (SSAS) BI Semantic Model (BISM) projects: Sales Reporting and Sales Analysis. The Sales Reporting database has been developed as a tabular project. The Sales Analysis database has been developed as a multidimensional project. Business analysts use PowerPivot for Microsoft Excel to produce self-managed data models based directly on the data warehouse or the corporate data models, and publish the PowerPivot workbooks to a SharePoint site.

 

The sole purpose of the Sales Reporting database is to support business user reporting and ad hoc analysis by using Power View. The database is configured for DirectQuery mode and all model queries result in SSAS querying the data warehouse. The database is based on the entire data warehouse.

 

 

The Sales Analysis database consists of a single SSAS cube named Sales. The Sales cube has been developed to support sales monitoring, analysis, and planning. The Sales cube metadata is shown in the following graphic.

 

 

Details of specific Sales cube dimensions are described in the following table.

 

 

 

The Sales measure group is based on the FactSales table. The Sales Plan measure group is based on the FactSalesPlan table. The Sales Plan measure group has been configured with a multidimensional OLAP (MOLAP) writeback partition. Both measure groups use MOLAP partitions, and aggregation designs are assigned to all partitions. Because the volumes of data in the data warehouse are large, an incremental processing strategy has been implemented.

 

The Sales Variance calculated member is computed by subtracting the Sales Plan forecast amount from Sales. The Sales Variance % calculated member is computed by dividing Sales Variance by Sales. The cube’s Multidimensional Expressions (MDX) script does not set any color properties.

 

Analysis and Reporting

 

SQL Server Reporting Services (SSRS) has been configured in SharePoint integrated mode.

 

A business analyst has created a PowerPivot workbook named Manufacturing Performance that integrates data from the data warehouse and manufacturing data from an operational database hosted in SQL Azure. The workbook has been published in a PowerPivot Gallery library in SharePoint Server and does not contain any reports. The analyst has scheduled daily data refresh from the SQL Azure database. Several SSRS reports are based on the PowerPivot workbook, and all reports are configured with a report execution mode to run on demand.

 

Recently, users have noticed that data in the PowerPivot workbooks published to SharePoint Server is not being refreshed. The SharePoint administrator has identified that the Secure Store Service target application used by the PowerPivot unattended data refresh account has been deleted.

 

Business Requirements

ETL Processes


All ETL administrators must have full privileges to administer and monitor the SSIS catalog, and to import and manage projects.
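A minimal sketch of one common way to meet this requirement with the SSIS catalog, assuming SQL Server 2012's SSISDB database and its built-in ssis_admin role:

USE SSISDB;
GO
-- Map the SSISOwners login to a database user in the catalog database
-- and grant it full catalog administration rights via the ssis_admin role.
CREATE USER [SSISOwners] FOR LOGIN [SSISOwners];
ALTER ROLE [ssis_admin] ADD MEMBER [SSISOwners];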

 

Data Models

 

The budget and forecast values must never be accumulated when querying the Sales cube. Queries should return the forecast sales values by default.

 

Business users have requested that a single field named SalespersonName be made available to report the full name of the salesperson in the Sales Reporting data model.

 

Writeback is used to initialize the budget sales values for a future year and is based on a weighted allocation of the sales achieved in the previous year.

 

Analysis and Reporting

 

Reports based on the Manufacturing Performance PowerPivot workbook must deliver data that is no more than one hour old.

 

Management has requested a new report named Regional Sales. This report must be based on the Sales cube and must allow users to filter by a specific year and present a grid with every region on the columns and the Products hierarchy on the rows. The hierarchy must initially be collapsed and allow the user to drill down through the hierarchy to analyze sales. Additionally, sales values that are less than $5000 must be highlighted in red.

 

Technical Requirements

 

Data Warehouse

 

Business logic in the form of calculations should be defined in the data warehouse to ensure consistency and availability to all data modeling experiences.

The schema design should remain as denormalized as possible and should not include unnecessary columns.

The schema design must be extended to include the product dimension data.
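As an illustration of the first requirement above (business logic defined in the data warehouse), the SalespersonName field requested by the business could be implemented as a computed column so every downstream model sees the same calculation. The table and column names below are assumptions; the case study does not specify them.

-- Hypothetical: define the full-name calculation once, in the warehouse schema.
ALTER TABLE dbo.DimSalesperson
    ADD SalespersonName AS (FirstName + N' ' + LastName) PERSISTED;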

ETL Processes

Package executions must log only data flow component phases and errors.
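One way this requirement is typically met when packages are executed from the SSIS catalog is to set the execution's LOGGING_LEVEL system parameter to 2 (Performance), which records data flow component phases along with errors and warnings. The sketch below assumes illustrative folder, project, and package names.

DECLARE @execution_id BIGINT;
DECLARE @logging_level SMALLINT = 2;   -- 2 = Performance logging level

-- Create a catalog execution for the package (folder/project/package names are assumptions).
EXEC SSISDB.catalog.create_execution
    @folder_name  = N'DW',
    @project_name = N'LoadDataWarehouse',
    @package_name = N'LoadFactSales.dtsx',
    @execution_id = @execution_id OUTPUT;

-- Restrict logging to the Performance level for this execution.
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type     = 50,
    @parameter_name  = N'LOGGING_LEVEL',
    @parameter_value = @logging_level;

EXEC SSISDB.catalog.start_execution @execution_id;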

 

Data Models

 

Processing time for all data models must be minimized.

A key performance indicator (KPI) must be added to the Sales cube to monitor sales performance. The KPI trend must use the Standard Arrow indicator to display improving, static, or deteriorating Sales Variance % values compared to the previous time period.

 

Analysis and Reporting

 

IT developers must create a library of SSRS reports based on the Sales Reporting database. A shared SSRS data source named Sales Reporting must be created in a SharePoint data connections library.

 

You need to extend the schema design to store the product dimension data. Which design should you use?

To answer, drag the appropriate table or tables to the correct location or locations in the answer area. (Fill from left to right. Answer choices may be used once, more than once, or not at all.)

 

 

 

 

complement-4 (exhibit):


 

(exhibit image)

 

complement-5 (exhibit):


 

(exhibit image)

 

complement-6 (exhibit):


 

(exhibit image)

 

Select and Place:


 

(exhibit image)

 

Correct Answer:

(exhibit image)

QUESTION 25

You are validating whether a SQL Server Integration Services (SSIS) package named Master.dtsx in the SSIS catalog is executing correctly.

You need to display the number of rows in each buffer passed between each data flow component of the package.

Which three actions should you perform in sequence? (To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.)

 

Select and Place:

(exhibit image)

 

Correct Answer:

(exhibit image)
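The usual technique behind this answer is to execute Master.dtsx from the catalog with the Verbose logging level and then read the buffer row counts from the catalog views. A sketch of the final query, assuming a hypothetical execution ID returned when the execution was created:

-- Rows passed between each pair of data flow components for one execution.
-- Requires the execution to have run with the Verbose logging level.
DECLARE @execution_id BIGINT = 12345;   -- hypothetical; use the ID returned by catalog.create_execution

SELECT  dataflow_path_name,
        source_component_name,
        destination_component_name,
        rows_sent
FROM    SSISDB.catalog.execution_data_statistics
WHERE   execution_id = @execution_id
ORDER BY created_time;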

QUESTION 26

You are developing the database schema for a SQL Server Analysis Services (SSAS) BI Semantic Model (BISM). The BISM will be based on the schema displayed in the following graphic.

 

You have the following requirements:

 

Ensure that queries of the data model correctly display average student age by class and average class level by student.

Minimize development effort.

 

You need to design the data model. What should you do? (More than one answer choice may achieve the goal. Select the BEST answer.)


Exhibit:

(exhibit image)

 

A. Create a multidimensional project and define measures and a reference relationship.
B. Create a tabular project and define calculated columns.

C. Create a multidimensional project and define measures and a many-to-many dimensional relationship.
D. Create a tabular project and define measures.

 

Correct Answer: C
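Because the schema graphic is not reproduced here, the relational shape can only be assumed: a Student table, a Class table, and a bridge table joining them many-to-many. The query below is a sketch of the result the model must reproduce for one of the two requirements (average student age by class); all object names are assumptions.

-- Average student age per class through an assumed many-to-many bridge table.
SELECT  c.ClassName,
        AVG(CAST(s.Age AS DECIMAL(5, 2))) AS AvgStudentAge
FROM    dbo.Class AS c
        JOIN dbo.StudentClass AS sc ON sc.ClassID  = c.ClassID
        JOIN dbo.Student      AS s  ON s.StudentID = sc.StudentID
GROUP BY c.ClassName;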

 

 

QUESTION 27

DRAG DROP
You are designing a SQL Server Reporting Services (SSRS) solution. An existing report aggregates data from a SQL Server database in a chart.

You need to use the chart in a new report and ensure that other users can use the chart in their reports. Which three actions should you perform in sequence?

(To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.)

 

Select and Place:

(exhibit image)

 

Correct Answer:


 

(exhibit image)

 

 

 

QUESTION 28

You are creating a SQL Server Integration Services (SSIS) package to populate a fact table from a source table. The fact table and source table are located in a SQL Azure database.

The source table has a price field and a tax field. The OLE DB source uses the data access mode of Table. You have the following requirements:

The fact table must populate a column named TotalCost that computes the sum of the price and tax columns.

Before the sum is calculated, any records that have a price of zero must be discarded.

You need to create the SSIS package in SQL Server Data Tools. In what sequence should you order four of the listed components for the data flow task?

(To answer, move the appropriate components from the list of components to the answer area and arrange them in the correct order.)

 

Select and Place:

(exhibit image)

 

Correct Answer:


 

(exhibit image)
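A data flow meeting these requirements is typically built as an OLE DB source, a Conditional Split that discards rows with a price of zero, a Derived Column that computes TotalCost as price + tax, and an OLE DB destination. As a rough relational equivalent of that logic (the source table name is an assumption; the scenario only says "a source table"):

-- Relational sketch of the data flow: zero-price rows are discarded
-- before TotalCost is derived as the sum of price and tax.
SELECT  price,
        tax,
        price + tax AS TotalCost
FROM    dbo.SourceTable   -- hypothetical name
WHERE   price <> 0;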

 

 

 

QUESTION 29

You are designing a SQL Server Integration Services (SSIS) package to execute 12 Transact-SQL (T-SQL) statements on a SQL Azure database. The T-SQL statements may be executed in any order. The T-SQL statements have unpredictable execution times. You have the following requirements:

The package must maximize parallel processing of the T-SQL statements.

After all the T-SQL statements have completed, a Send Mail task must notify administrators.

You need to design the SSIS package. Which three actions should you perform in sequence?

(To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.)

 

Select and Place:

(exhibit image)

 

Correct Answer:


 

(exhibit image)

 

 

 

QUESTION 30

You are configuring the partition storage settings for a SQL Server Analysis Services (SSAS) cube. The partition storage must meet the following requirements:

 

Optimize the storage of source data and aggregations in the cube.

Use proactive caching.

Drop cached data that is more than 30 minutes old.

Update the cache when data changes, with a silence interval of 10 seconds.

 

You need to select the partition storage setting. Which setting should you select? To answer, select the appropriate setting in the answer area.

 

Hot Area:

(exhibit image)

 

Correct Answer:


 

(exhibit image)


 
