[Free] 2018(June) Ensurepass Microsoft 70-767 Dumps with VCE and PDF 1-10


Implementing a SQL Data Warehouse

Question No: 1

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have the following line-of-business solutions:

-> ERP system

-> Online WebStore

-> Partner extranet

One or more Microsoft SQL Server instances support each solution. Each solution has its own product catalog. You have an additional server that hosts SQL Server Integration Services (SSIS) and a data warehouse. You populate the data warehouse with data from each of the line-of-business solutions. The data warehouse does not store primary key values from the individual source tables.

The database for each solution has a table named Products that stores product information. The Products table in each database uses a separate and unique key for product records. Each table shares a column named ReferenceNr between the databases. This column is used to create queries that involve more than one solution.

You need to load data from the individual solutions into the data warehouse nightly. The following requirements must be met:

-> If a change is made to the ReferenceNr column in any of the sources, set the value of IsDisabled to True and create a new row in the Products table.

-> If a row is deleted in any of the sources, set the value of IsDisabled to True in the data warehouse.

Solution: Perform the following actions:

-> Enable the Change Tracking feature for the Products table in the three source databases.

-> Query the CHANGETABLE function from the sources for the deleted rows.

-> Set the IsDisabled column to True on the data warehouse Products table for the listed rows.

Does the solution meet the goal?

A. Yes

B. No

Answer: B

Explanation:

We must check for updated rows, not just deleted rows.

References: https://www.timmitchell.net/post/2016/01/18/getting-started-with-change-tracking-in-sql-server/
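The required check could be sketched in T-SQL as follows. The Products table name comes from the scenario, but the @last_sync_version value and the ProductID key column are illustrative assumptions:

```sql
-- Illustrative sketch: pull BOTH updates and deletes since the last load.
-- @last_sync_version and ProductID are assumed names, not from the question.
DECLARE @last_sync_version bigint = 100;  -- version saved after the previous nightly load

SELECT CT.ProductID,
       CT.SYS_CHANGE_OPERATION        -- 'I' = insert, 'U' = update, 'D' = delete
FROM   CHANGETABLE(CHANGES dbo.Products, @last_sync_version) AS CT
WHERE  CT.SYS_CHANGE_OPERATION IN ('U', 'D');
```

The proposed solution queries only the deleted rows, so the ReferenceNr-change requirement is never handled.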

Question No: 2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a Microsoft Azure SQL Data Warehouse instance that must be available six months a day for reporting.

You need to pause the compute resources when the instance is not being used.

Solution: You use SQL Server Management Studio (SSMS).

Does the solution meet the goal?

A. Yes

B. No

Answer: B

Explanation:

To pause a SQL Data Warehouse database, use any of these individual methods:

-> Pause compute with the Azure portal

-> Pause compute with PowerShell

-> Pause compute with REST APIs

References:

https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-manage-compute-overview
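For reference, a pause issued from PowerShell (using the AzureRM module current at the time) might look like the following; the resource names are placeholders:

```powershell
# Pause compute for a SQL Data Warehouse database (placeholder resource names).
Suspend-AzureRmSqlDatabase `
    -ResourceGroupName "ResourceGroup1" `
    -ServerName "Server01" `
    -DatabaseName "Database02"
```

SSMS is not on the supported list, which is why the solution does not meet the goal.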

Question No: 3

Note: This question is part of a series of questions that use the same or similar answer choices. An answer choice may be correct for more than one question in the series. Each question is independent of the other questions in this series. Information and details provided in a question apply only to that question.

You are a database administrator for an e-commerce company that runs an online store. The company has the databases described in the following table.

[Exhibit image]

Product prices are updated and are stored in a table named Products on DB1. The Products table is deleted and refreshed each night from MDS by using a Microsoft SQL Server Integration Services (SSIS) package. None of the data sources are sorted.

You need to update the SSIS package to add current prices to the Products table. What should you use?

A. Lookup transformation

B. Merge transformation

C. Merge Join transformation

D. MERGE statement

E. Union All transformation

F. Balanced Data Distributor transformation

G. Sequence container

H. Foreach Loop container

Answer: D

Explanation:

In the current release of SQL Server Integration Services, the SQL statement in an Execute SQL task can contain a MERGE statement. This MERGE statement enables you to accomplish multiple INSERT, UPDATE, and DELETE operations in a single statement.

References: https://docs.microsoft.com/en-us/sql/integration-services/control-flow/merge-in-integration-services-packages
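A minimal sketch of such a MERGE follows; the staging table and column names are assumptions, not given in the question:

```sql
-- Illustrative MERGE run from an Execute SQL task: apply current prices from
-- a staging table. dbo.Products_Staging, ProductID, and Price are assumed names.
MERGE dbo.Products AS tgt
USING dbo.Products_Staging AS src
    ON tgt.ProductID = src.ProductID
WHEN MATCHED AND tgt.Price <> src.Price THEN
    UPDATE SET tgt.Price = src.Price
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ProductID, Price) VALUES (src.ProductID, src.Price);
```

Because MERGE joins on a key rather than interleaving two sorted streams, it does not require the sorted inputs that the Merge and Merge Join transformations do.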

Question No: 4 DRAG DROP

Note: This question is part of a series of questions that use the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.

You have a Microsoft SQL Server data warehouse instance that supports several client applications.

The data warehouse includes the following tables: Dimension.SalesTerritory, Dimension.Customer, Dimension.Date, Fact.Ticket, and Fact.Order. The Dimension.SalesTerritory and Dimension.Customer tables are frequently updated. The Fact.Order table is optimized for weekly reporting, but the company wants to change it daily. The Fact.Order table is loaded by using an ETL process. Indexes have been added to the table over time, but the presence of these indexes slows data loading.

All data in the data warehouse is stored on a shared SAN. All tables are in a database named DB1. You have a second database named DB2 that contains copies of production data for a development environment. The data warehouse has grown and the cost of storage has increased. Data older than one year is accessed infrequently and is considered historical.

You have the following requirements:

-> Implement table partitioning to improve the manageability of the data warehouse and to avoid the need to repopulate all transactional data each night. Use a partitioning strategy that is as granular as possible.

-> Partition the Fact.Order table and retain a total of seven years of data.

-> Partition the Fact.Ticket table and retain seven years of data. At the end of each month, the partition structure must apply a sliding window strategy to ensure that a new partition is available for the upcoming month, and that the oldest month of data is archived and removed.

-> Optimize data loading for the Dimension.SalesTerritory, Dimension.Customer, and Dimension.Date tables.

-> Incrementally load all tables in the database and ensure that all incremental changes are processed.

-> Maximize the performance during the data loading process for the Fact.Order partition.

-> Ensure that historical data remains online and available for querying.

-> Reduce ongoing storage costs while maintaining query performance for current data.

You are not permitted to make changes to the client applications. You need to configure the Fact.Order table.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

[Exhibit image]

Answer:

[Exhibit image]

Explanation:

[Exhibit image]

From scenario: Partition the Fact.Order table and retain a total of seven years of data. Maximize the performance during the data loading process for the Fact.Order partition.

Step 1: Create a partition function.

Using CREATE PARTITION FUNCTION is the first step in creating a partitioned table or index.

Step 2: Create a partition scheme based on the partition function.

Step 3: Execute an ALTER TABLE command to specify the partition function.

References: https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-tables-partition
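The three steps might be sketched as follows for an on-premises table. The boundary values, filegroup mapping, and column names are illustrative, not from the question:

```sql
-- Step 1: partition function (one boundary per period; values are examples).
CREATE PARTITION FUNCTION pfOrderDate (date)
AS RANGE RIGHT FOR VALUES ('2017-01-01', '2017-02-01', '2017-03-01');

-- Step 2: partition scheme mapped to the function.
CREATE PARTITION SCHEME psOrderDate
AS PARTITION pfOrderDate ALL TO ([PRIMARY]);

-- Step 3: place the table on the partition scheme. For an existing table this
-- is typically done by rebuilding its clustered index on the scheme; a new
-- table is shown here only for brevity.
CREATE TABLE Fact.[Order]
(
    OrderKey  bigint NOT NULL,
    OrderDate date   NOT NULL
) ON psOrderDate (OrderDate);
```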

Question No: 5 HOTSPOT

You have a Microsoft SQL Server Integration Services (SSIS) package that contains a Data Flow task as shown in the Data Flow exhibit. (Click the Exhibit button.)

[Exhibit image]

You install Data Quality Services (DQS) on the same server that hosts SSIS and deploy a knowledge base to manage customer email addresses. You add a DQS Cleansing transform to the Data Flow as shown in the Cleansing exhibit. (Click the Exhibit button.)

[Exhibit image]

You create a Conditional Split transform as shown in the Splitter exhibit. (Click the Exhibit button.)

[Exhibit image]

You need to split the output of the DQS Cleansing task to obtain only Correct values from the EmailAddress column.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

[Exhibit image]

Answer:

[Exhibit image]

Explanation:

[Exhibit image]

The DQS Cleansing component takes input records, sends them to a DQS server, and gets them back corrected. The component can output not only the corrected data, but also additional columns that may be useful to you, such as the status columns. There is one status column for each mapped field, and another one that aggregates the status for the whole record. This record status column can be very useful in some scenarios, especially when records are further processed in different ways depending on their status. In such cases, it is recommended to use a Conditional Split component below the DQS Cleansing component, and configure it to split the records into groups based on the record status (or based on other columns, such as a specific field status).

References: https://blogs.msdn.microsoft.com/dqs/2011/07/18/using-the-ssis-dqs-cleansing-component/
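For example, the Conditional Split outputs might test the per-field status column with SSIS expressions like the following; the exact column name depends on how the EmailAddress field is mapped in the DQS Cleansing transform:

```
[EmailAddress_Status] == "Correct"     -- route only Correct values
[EmailAddress_Status] == "Corrected"   -- route values DQS fixed automatically
```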

Question No: 6 DRAG DROP

You have a series of analytic data models and reports that provide insights into the participation rates for sports at different schools. Users enter information about sports and participants into a client application. The application stores this transactional data in a Microsoft SQL Server database. A SQL Server Integration Services (SSIS) package loads the data into the models.

When users enter data, they do not consistently apply the correct names for the sports. The following table shows examples of the data entry issues.

[Exhibit image]

You need to improve the quality of the data.

Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

[Exhibit image]

Answer:

[Exhibit image]

Explanation:

[Exhibit image]

References: https://docs.microsoft.com/en-us/sql/data-quality-services/perform-knowledge-discovery

Question No: 7

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have the following line-of-business solutions:

-> ERP system

-> Online WebStore

-> Partner extranet

One or more Microsoft SQL Server instances support each solution. Each solution has its own product catalog. You have an additional server that hosts SQL Server Integration Services (SSIS) and a data warehouse. You populate the data warehouse with data from each of the line-of-business solutions. The data warehouse does not store primary key values from the individual source tables.

The database for each solution has a table named Products that stores product information. The Products table in each database uses a separate and unique key for product records. Each table shares a column named ReferenceNr between the databases. This column is used to create queries that involve more than one solution.

You need to load data from the individual solutions into the data warehouse nightly. The following requirements must be met:

-> If a change is made to the ReferenceNr column in any of the sources, set the value of IsDisabled to True and create a new row in the Products table.

-> If a row is deleted in any of the sources, set the value of IsDisabled to True in the data warehouse.

Solution: Perform the following actions:

-> Enable the Change Tracking feature for the Products table in the source databases.

-> Query the CHANGETABLE function from the sources for the updated rows.

-> Set the IsDisabled column to True for the listed rows that have the old ReferenceNr value.

-> Create a new row in the data warehouse Products table with the new ReferenceNr value.

Does the solution meet the goal?

A. Yes

B. No

Answer: B

Explanation:

We must check for deleted rows, not just updated rows.

References: https://www.timmitchell.net/post/2016/01/18/getting-started-with-change-tracking-in-sql-server/

Question No: 8 DRAG DROP

Note: This question is part of a series of questions that use the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.

You have a Microsoft SQL Server data warehouse instance that supports several client applications.

The data warehouse includes the following tables: Dimension.SalesTerritory, Dimension.Customer, Dimension.Date, Fact.Ticket, and Fact.Order. The Dimension.SalesTerritory and Dimension.Customer tables are frequently updated. The Fact.Order table is optimized for weekly reporting, but the company wants to change it daily. The Fact.Order table is loaded by using an ETL process. Indexes have been added to the table over time, but the presence of these indexes slows data loading.

All data in the data warehouse is stored on a shared SAN. All tables are in a database named DB1. You have a second database named DB2 that contains copies of production data for a development environment. The data warehouse has grown and the cost of storage has increased. Data older than one year is accessed infrequently and is considered historical.

You have the following requirements:

-> Implement table partitioning to improve the manageability of the data warehouse and to avoid the need to repopulate all transactional data each night. Use a partitioning strategy that is as granular as possible.

-> Partition the Fact.Order table and retain a total of seven years of data.

-> Partition the Fact.Ticket table and retain seven years of data. At the end of each month, the partition structure must apply a sliding window strategy to ensure that a new partition is available for the upcoming month, and that the oldest month of data is archived and removed.

-> Optimize data loading for the Dimension.SalesTerritory, Dimension.Customer, and Dimension.Date tables.

-> Incrementally load all tables in the database and ensure that all incremental changes are processed.

-> Maximize the performance during the data loading process for the Fact.Order partition.

-> Ensure that historical data remains online and available for querying.

-> Reduce ongoing storage costs while maintaining query performance for current data.

You are not permitted to make changes to the client applications.

You need to optimize data loading for the Dimension.Customer table.

Which three Transact-SQL segments should you use to develop the solution? To answer, move the appropriate Transact-SQL segments from the list of Transact-SQL segments to the answer area and arrange them in the correct order.

NOTE: You will not need all of the Transact-SQL segments.

[Exhibit image]

Answer:

[Exhibit image]

Explanation:

[Exhibit image]

Step 1: USE DB1

From Scenario: All tables are in a database named DB1. You have a second database named DB2 that contains copies of production data for a development environment.

Step 2: EXEC sys.sp_cdc_enable_db

Before you can enable a table for change data capture, the database must be enabled. To enable the database, use the sys.sp_cdc_enable_db stored procedure. sys.sp_cdc_enable_db has no parameters.

Step 3: EXEC sys.sp_cdc_enable_table @source_schema = N'schema', etc.

sys.sp_cdc_enable_table enables change data capture for the specified source table in the current database.

Partial syntax:

sys.sp_cdc_enable_table

[ @source_schema = ] 'source_schema' ,

[ @source_name = ] 'source_name'

[ , [ @capture_instance = ] 'capture_instance' ]

[ , [ @supports_net_changes = ] supports_net_changes ]

Etc.

References: https://docs.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sys-sp-cdc-enable-table-transact-sql

https://docs.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sys-sp-cdc-enable-db-transact-sql
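Put together, the three segments might read as follows. The database, schema, and table names come from the scenario; the @role_name and @supports_net_changes values are illustrative choices, not given in the question:

```sql
USE DB1;
GO
-- Enable change data capture at the database level (no parameters).
EXEC sys.sp_cdc_enable_db;
GO
-- Enable change data capture for the dimension table.
EXEC sys.sp_cdc_enable_table
    @source_schema        = N'Dimension',
    @source_name          = N'Customer',
    @role_name            = NULL,  -- no gating role (illustrative choice)
    @supports_net_changes = 1;     -- allow net-change queries (needs a PK)
GO
```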

Question No: 9

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are developing a Microsoft SQL Server Integration Services (SSIS) project. The project consists of several packages that load data warehouse tables.

You need to extend the control flow design for each package to use the following control flow while minimizing development efforts and maintenance:

[Exhibit image]

Solution: You add the control flow to a script task. You add an instance of the script task to the storage account in Microsoft Azure.

Does the solution meet the goal?

A. Yes

B. No

Answer: B

Explanation:

A package consists of a control flow and, optionally, one or more data flows. You create the control flow in a package by using the Control Flow tab in SSIS Designer.

References: https://docs.microsoft.com/en-us/sql/integration-services/control-flow/control-flow

Question No: 10 DRAG DROP

You deploy a Microsoft SQL Server database that contains a staging table named EmailAddress_Import. Each night, a bulk process will import customer information from an external database, cleanse the data, and then insert it into the EmailAddress table. Both tables contain a column named EmailAddressValue that stores the email address.

You need to implement the logic to meet the following requirements:

-> Email addresses that are present in the EmailAddress_Import table but not in the EmailAddress table must be inserted into the EmailAddress table.

-> Email addresses that are not in the EmailAddress_Import table but are present in the EmailAddress table must be deleted from the EmailAddress table.

How should you complete the Transact-SQL statement? To answer, drag the appropriate Transact-SQL segments to the correct locations. Each Transact-SQL segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

[Exhibit image]

Answer:

[Exhibit image]

Explanation:

[Exhibit image]

Box 1: EmailAddress

The EmailAddress table is the target.

Box 2: EmailAddress_Import

The EmailAddress_Import table is the source.

Box 3: NOT MATCHED BY TARGET

Box 4: NOT MATCHED BY SOURCE

References: https://docs.microsoft.com/en-us/sql/t-sql/statements/merge-transact-sql
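A plausible completed statement, using only the table and column names given in the question:

```sql
-- Insert rows present only in the staging table; delete rows present only in
-- the target. The MERGE join key is the email address value itself.
MERGE dbo.EmailAddress AS tgt
USING dbo.EmailAddress_Import AS src
    ON tgt.EmailAddressValue = src.EmailAddressValue
WHEN NOT MATCHED BY TARGET THEN
    INSERT (EmailAddressValue) VALUES (src.EmailAddressValue)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```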
