Lead2pass New Updated Exam Questions
https://www.getfreevce.com/easily-pass-70-458-exam-with-the-help-of-lead2pass-new-microsoft-70-458-vce-and-pdf-dumps-141-150.html
Export date: Sun Dec 22 16:19:11 2024 / +0000 GMT

Easily Pass 70-458 Exam with the Help of Lead2pass New Microsoft 70-458 VCE and PDF Dumps (141-150)


Lead2pass delivers the best possible study guide, updated regularly, to help you take your Microsoft 70-458 exam. The Microsoft 70-458 real exam questions are planned and researched by IT experts who are deeply involved in the IT field. They do their best to create a concise and logical study guide from their data. Using Lead2pass products will not only help you pass the exam but also secure a bright future for you.

QUESTION 141
You are creating a Data Quality Services (DQS) solution.
You must provide statistics on the accuracy of the data.
You need to use DQS profiling to obtain the required statistics.
Which DQS activity should you use?

A.    Cleansing
B.    Knowledge Discovery
C.    Matching Policy
D.    Matching

Answer: A

QUESTION 142
You are installing the Data Quality Server component of Data Quality Services.
You need to provision the hardware and install the software for the server that runs the Data Quality Server.
You must ensure that the minimum Data Quality Server prerequisites are met.
What should you do?

A.    Install Microsoft Internet Explorer 6.0 SP1 or later.
B.    Install SQL Server 2012 Database Engine.
C.    Install Microsoft SharePoint Server 2010 Enterprise Edition with PowerPivot.
D.    Install the .NET Framework 4.0.

Answer: B

QUESTION 143
You are designing a SQL Server Integration Services (SSIS) 2012 package that imports data from a Windows Azure SQL Database database into a SQL Server database.
The SSIS package has the following requirements:
- Every night, a very large amount of data is imported into the staging database.
- Package processing time must be minimized.
- The package must run on its own dedicated server when it is deployed to production.
- Transaction log activity when data is imported must be minimized.
You need to design the package to meet the requirements.
Which destination component should you use?

A.    Raw File
B.    OLE DB
C.    ODBC
D.    Bulk Insert

Answer: B
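The OLE DB destination meets the minimal-logging requirement only when it is configured for fast load with a table lock. A sketch of the relevant destination properties (a configuration fragment; exact values depend on the target table and load size):

```
AccessMode                  : OpenRowset Using FastLoad  -- "Table or view - fast load" in the editor
FastLoadOptions             : TABLOCK                    -- table lock enables minimally logged bulk inserts
FastLoadMaxInsertCommitSize : 2147483647                 -- commit size; tune for very large loads
```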

QUESTION 144
Hotspot Question
You are developing a SQL Server Integration Service (SSIS) package.
The package loads a customer dimension table by using a data flow task.
Changes to the customer attributes must be tracked over time.
You need to produce a checksum value to identify the rows that have changed since the last Extract, Transform and Load (ETL) process execution.
You need to use the least amount of development effort to achieve this goal.
Which transformation should you use? (To answer, select the appropriate transformation in the answer area.)

[Image: hotspot answer area listing the available data flow transformations]

Answer:

[Image: answer selection in the hotspot]

QUESTION 145
You are designing a SQL Server Integration Services (SSIS) data flow to load sales transactions from a source system into a data warehouse hosted on Windows Azure SQL Database.
One of the columns in the data source is named ProductCode.
Some of the data to be loaded will reference products that need special processing logic in the data flow.
You need to enable separate processing streams for a subset of rows based on the source product code.
Which Data Flow transformation should you use?

A.    Script Component
B.    Conditional Split
C.    Multicast
D.    Source Assistant

Answer: A
Explanation:
The Script component extends the data flow capabilities of Microsoft Integration Services packages with custom code written in Microsoft Visual Basic or Microsoft Visual C# that is compiled and executed at package run time. The Script component simplifies the development of a custom data flow source, transformation, or destination when the sources, transformations, and destinations included with Integration Services do not fully satisfy your requirements.

QUESTION 146
You are developing a SQL Server Integration Services (SSIS) package to load data into a Windows Azure SQL Database database.
The package consists of several data flow tasks.
The package has the following auditing requirements:
- If a data flow task fails, a Transact-SQL (T-SQL) script must be executed.
- The T-SQL script must be executed only once per data flow task that fails, regardless of the nature of the error.
You need to ensure that auditing is configured to meet these requirements.
What should you do?

A.    Use an event handler for OnError for the package.
B.    Use an event handler for OnError for each data flow task.
C.    Use an event handler for OnTaskFailed for the package.
D.    View the job history for the SQL Server Agent job.
E.    View the All Messages subsection of the All Executions report for the package.
F.    Store the System::SourceID variable in the custom log table.
G.    Store the System::ServerExecutionID variable in the custom log table.
H.     Store the System::ExecutionInstanceGUID variable in the custom log table.
I.      Enable the SSIS log provider for SQL Server for OnError in the package control flow.
J.     Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.
K.      Deploy the project by using dtutil.exe with the /COPY DTS option.
L.     Deploy the project by using dtutil.exe with the /COPY SQL option.
M.     Deploy the .ispac file by using the Integration Services Deployment Wizard.
N.    Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored procedure.
O.    Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored procedure.
P.    Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and SSISDB.catalog.start_execution stored procedures.
Q.    Create a table to store error information. Create an error output on each data flow
destination that writes OnError event text to the table.
R.    Create a table to store error information. Create an error output on each data flow
destination that writes OnTaskFailed event text to the table.

Answer: C

QUESTION 147
You are developing a SQL Server Integration Services (SSIS) project with multiple packages to copy data to a Windows Azure SQL Database database.
An automated process must validate all related Environment references, parameter data types, package references, and referenced assemblies.
The automated process must run on a regular schedule.
You need to establish the automated validation process by using the least amount of administrative effort.
What should you do?

A.    Use an event handler for OnError for the package.
B.    Use an event handler for OnError for each data flow task.
C.    Use an event handler for OnTaskFailed for the package.
D.    View the job history for the SQL Server Agent job.
E.    View the All Messages subsection of the All Executions report for the package.
F.    Store the System::SourceID variable in the custom log table.
G.    Store the System::ServerExecutionID variable in the custom log table.
H.     Store the System::ExecutionInstanceGUID variable in the custom log table.
I.      Enable the SSIS log provider for SQL Server for OnError in the package control flow.
J.     Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.
K.      Deploy the project by using dtutil.exe with the /COPY DTS option.
L.     Deploy the project by using dtutil.exe with the /COPY SQL option.
M.     Deploy the .ispac file by using the Integration Services Deployment Wizard.
N.    Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored procedure.
O.    Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored procedure.
P.    Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and SSISDB.catalog.start_execution stored procedures.
Q.    Create a table to store error information. Create an error output on each data flow
destination that writes OnError event text to the table.
R.    Create a table to store error information. Create an error output on each data flow
destination that writes OnTaskFailed event text to the table.

Answer: N
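Answer N can be scheduled as a single T-SQL job step. A minimal sketch, assuming a catalog folder named ETL and a project named CustomerLoad (both hypothetical names); catalog.validate_project checks environment references, parameter data types, package references, and referenced assemblies in one call:

```sql
-- Run in the SSISDB database from a SQL Server Agent T-SQL job step.
EXEC SSISDB.catalog.validate_project
    @folder_name       = N'ETL',          -- assumption: catalog folder name
    @project_name      = N'CustomerLoad', -- assumption: project name
    @validate_type     = 'F',             -- full validation
    @environment_scope = 'A';             -- validate all environment references
```

Scheduling this step in a SQL Server Agent job satisfies the "regular schedule" requirement with no custom code.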

QUESTION 148
You are developing a SQL Server Integration Services (SSIS) project by using the Project Deployment Model.
The project will be deployed to an SSIS catalog folder where Environments have already been created.
You need to deploy the project.
What should you do?

A.    Use an event handler for OnError for the package.
B.    Use an event handler for OnError for each data flow task.
C.    Use an event handler for OnTaskFailed for the package.
D.    View the job history for the SQL Server Agent job.
E.    View the All Messages subsection of the All Executions report for the package.
F.    Store the System::SourceID variable in the custom log table.
G.    Store the System::ServerExecutionID variable in the custom log table.
H.     Store the System::ExecutionInstanceGUID variable in the custom log table.
I.      Enable the SSIS log provider for SQL Server for OnError in the package control flow.
J.     Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.
K.      Deploy the project by using dtutil.exe with the /COPY DTS option.
L.     Deploy the project by using dtutil.exe with the /COPY SQL option.
M.     Deploy the .ispac file by using the Integration Services Deployment Wizard.
N.    Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored procedure.
O.    Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored procedure.
P.    Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and SSISDB.catalog.start_execution stored procedures.
Q.    Create a table to store error information. Create an error output on each data flow
destination that writes OnError event text to the table.
R.    Create a table to store error information. Create an error output on each data flow
destination that writes OnTaskFailed event text to the table.

Answer: M

QUESTION 149
You are developing a SQL Server Integration Services (SSIS) package to load data into a data warehouse. The package consists of several data flow tasks.
The package experiences intermittent errors in the data flow tasks.
If any data flow task fails, all package error information must be captured and written to a SQL Server table by using an OLE DB connection manager.
You need to ensure that the package error information is captured and written to the table.
What should you do?

A.    Use an event handler for OnError for the package.
B.    Use an event handler for OnError for each data flow task.
C.    Use an event handler for OnTaskFailed for the package.
D.    View the job history for the SQL Server Agent job.
E.    View the All Messages subsection of the All Executions report for the package.
F.    Store the System::SourceID variable in the custom log table.
G.    Store the System::ServerExecutionID variable in the custom log table.
H.     Store the System::ExecutionInstanceGUID variable in the custom log table.
I.      Enable the SSIS log provider for SQL Server for OnError in the package control flow.
J.     Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.
K.      Deploy the project by using dtutil.exe with the /COPY DTS option.
L.     Deploy the project by using dtutil.exe with the /COPY SQL option.
M.     Deploy the .ispac file by using the Integration Services Deployment Wizard.
N.    Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored procedure.
O.    Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored procedure.
P.    Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and SSISDB.catalog.start_execution stored procedures.
Q.    Create a table to store error information. Create an error output on each data flow
destination that writes OnError event text to the table.
R.    Create a table to store error information. Create an error output on each data flow
destination that writes OnTaskFailed event text to the table.

Answer: I
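With the SSIS log provider for SQL Server enabled for the OnError event, log entries are written to the dbo.sysssislog table in the database that the OLE DB connection manager points to. A sketch of a query over that table (the table and columns are those the log provider creates):

```sql
SELECT source, sourceid, executionid, starttime, message
FROM dbo.sysssislog
WHERE event = N'OnError'
ORDER BY starttime DESC;
```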

QUESTION 150
You are developing a SQL Server Integration Services (SSIS) package to implement an incremental data load strategy.
The package reads data from a source system. Depending on the value in a source column, the package redirects rows to one of five different data flow paths.
You need to add a data flow transformation to support the package redirection.
Which data flow transformation should you use?

A.    Lookup
B.    Multicast
C.    Conditional Split
D.    Pivot

Answer: C
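A Conditional Split transformation evaluates an ordered list of SSIS expressions and sends each row to the first output whose condition is true; rows that match no condition go to the default output. A sketch of the five conditions (column name and values are hypothetical):

```
Path1 : [SourceColumn] == "VALUE1"
Path2 : [SourceColumn] == "VALUE2"
Path3 : [SourceColumn] == "VALUE3"
Path4 : [SourceColumn] == "VALUE4"
Default output : all remaining rows
```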

Being a Microsoft 70-458 certified professional guarantees a better future and success for you. Lead2pass Microsoft certification preparation exams will prove to be the most helpful step toward that Microsoft goal. You can easily find Microsoft preparation material for all certifications at Lead2pass by visiting our website below:

http://www.lead2pass.com/70-458.html

Post date: 2015-01-20 09:58:34

This page was exported from Lead2pass New Updated Exam Questions [ https://www.getfreevce.com ]