SSIS Expert Quiz
SSIS – SQL Server Integration Services: This SSIS Expert Quiz contains a set of 60 questions that will help you clear any exam designed for the expert level.
1) Which are the correct statements with respect to the SSISDB catalog? (A query sketch over the catalog views follows the options.)
Option A) Each instance of SQL Server can have one catalog
Option B) Each instance of SQL Server can have multiple catalogs
Option C) Each catalog can have zero or more folders
Option D) Each folder can have zero or more projects and zero or more environments
- Option A, C & D
- Option B, C & D
- Option A & C
- Option B & C
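For reference, the folder/project/environment hierarchy described above can be browsed through the SSISDB catalog views. This is a minimal sketch, assuming the SSISDB catalog already exists on the instance:

```sql
-- Browse the SSISDB catalog hierarchy: each folder can hold projects and environments.
SELECT f.name AS folder_name,
       p.name AS project_name,
       e.name AS environment_name
FROM SSISDB.catalog.folders AS f
LEFT JOIN SSISDB.catalog.projects     AS p ON p.folder_id = f.folder_id
LEFT JOIN SSISDB.catalog.environments AS e ON e.folder_id = f.folder_id
ORDER BY f.name, p.name, e.name;
```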
2) During the development phase the protection level of the packages is set to the default value, EncryptSensitiveWithUserKey. The packages now need to be deployed to production, the operations team is performing the deployment, and once deployed only the operations team should have access. What is required to achieve this?
- No change is required with the protection level of the packages
- The dev team needs to change the protection level to EncryptSensitiveWithPassword and share the password with the operations team. Once deployed, the operations team changes the protection level
- The default protection level EncryptSensitiveWithUserKey can be changed by the operations team themselves, since they have admin privileges
- Request the developer to log in with their credentials and help the operations team with the deployment
3) The ___________ protection level uses a key that is based on the current user profile to encrypt the whole package. Only the user who created or exported the package can open the package in SSIS Designer or run the package by using the dtexec command prompt utility.
- EncryptSensitiveWithPassword
- EncryptSensitiveWithUserKey
- EncryptAllWithUserKey
- EncryptAllWithPassword
4) The development of the SSIS packages is complete, and the execution of the packages needs to be automated with SQL Server Agent. Which role or roles provide permission to execute all packages with SQL Server Agent? (A role-membership sketch follows the options.)
Option A) db_ssisadmin
Option B) db_ssisltduser
Option C) db_ssisoperator
- Option A & B
- Option B & C
- Option A & C
- Option A
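For context, db_ssisadmin, db_ssisltduser, and db_ssisoperator are fixed database-level roles in msdb. A minimal sketch of granting membership, where [AgentUser] is a hypothetical msdb database user mapped to the SQL Server Agent proxy account:

```sql
USE msdb;
-- [AgentUser] is a placeholder; it must already exist as a user in msdb.
ALTER ROLE db_ssisoperator ADD MEMBER [AgentUser];
```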
5) Which one is not a standard SQL Server Integration Services role?
- db_ssisadmin
- db_ssisltduser
- db_ssisoperator
- db_ssisreadonly
6) Which action is not supported by db_ssisltduser role?
- Enumerate all packages.
- View all packages.
- View own packages.
- Execute own packages.
7) When you configure a package to use configurations, checkpoints, and logging, the package stores this information outside the package. The information that is stored outside the package might contain sensitive data. While securing that data, which option is incorrect?
- To protect configurations that the package saves to SQL Server database tables, use SQL Server security features
- To protect logs that the package saves to SQL Server database tables, use SQL Server security features.
- To control access to files, use the access control lists (ACLs)
- Set Protection Level to secure the sensitive data
8) Which feature is supported in SQL Server Integration Services 2012?
- Visual Studio Tools for Applications (VSTA)
- The options for connecting shapes on the design surface of the Control Flow tab and Data Flow tab
- The options to configure a data viewer to display data in a histogram, scatter plot, or column chart
- Data Source Views
9) A customer is migrating SSIS packages built with 2005 to 2012, and some of the packages have ActiveX scripts. When upgrading the packages, what is the expected outcome? Choose the right option.
- SSIS packages are upgraded without any issues
- Need to rewrite ActiveX Script functionality
- Use SSIS package upgrade wizard
- Partial upgrade is possible
10) Which control flow task supports executing XMLA commands?
- Analysis Services Execute DDL Task
- Data Flow Task
- File System Task
- Execute Process Task
11) While configuring the Analysis Services Execute DDL Task, there are options available to specify the connection and the source type. Choose the INVALID source type for specifying the DDL statements.
- Direct Input
- File Connection
- Variable
- SQL Connection
12) What does XMLA stand for, in relation to SSAS objects?
- XML for Analytics
- XML for Analysis
- XML for Adhoc Analysis
- XML for Addition
13) Which task runs prediction queries based on models built in Analysis Services?
- Data Mining Query Task
- Analysis Services Execute DDL Task
- Analysis Services Processing Task
- Data Profiling Task
14) The Back Up Database task supports various recovery models and backup types. Which is NOT a standard recovery model or backup type supported by the Back Up Database task? (A T-SQL sketch follows the options.)
- Simple
- Bulk-Logged
- Full
- Partial
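For context, the recovery model is a database setting and the Back Up Database task ultimately issues BACKUP statements. A hedged T-SQL sketch with placeholder names (SalesDW and the backup path are assumptions):

```sql
-- Set the recovery model (a database-level setting) and take a full backup.
ALTER DATABASE SalesDW SET RECOVERY BULK_LOGGED;

BACKUP DATABASE SalesDW
    TO DISK = N'D:\Backup\SalesDW.bak'
    WITH INIT, NAME = N'SalesDW full backup';
```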
15) You need to design a package to remove backup files or maintenance plan reports on a specified server, and also to remove a specific file or a group of files in a folder. Which task supports this implementation?
- History Cleanup Task
- Maintenance Cleanup Task
- Rebuild Index Task
- Shrink Database Task
16) Which task helps to reduce the size of SQL Server database data and log files? (See the T-SQL sketch after the options.)
- Execute T-SQL Statement Task
- Update Statistics Task
- Maintenance Cleanup Task
- Shrink Database Task
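For reference, the maintenance task in question wraps a DBCC statement. A minimal sketch, where SalesDW and the 10 percent target free space are placeholder values:

```sql
-- Shrink data and log files, leaving 10 percent free space in the database.
DBCC SHRINKDATABASE (SalesDW, 10);
```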
17) To configure the Data Flow task for better performance, you can modify the task’s DefaultBufferSize property. What is the default value set for the buffer size?
- 10 MB
- 100 MB
- 1000 MB
- 10000 MB
18) To configure the Data Flow task for better performance, you can modify the task’s DefaultBufferMaxRows property. What is the default value set for the maximum number of rows?
- 10 rows
- 100 rows
- 1000 rows
- 10000 rows
19) Parallel execution improves performance on computers that have multiple physical or logical processors. To support parallel execution, Integration Services uses a few properties to improve performance. Choose the correct option(s).
Option A) EngineThreads
Option B) DefaultBufferSize
Option C) MaxConcurrentExecutables
- Option A & B
- Option B & C
- Option A & C
- Option A, B & C
20) The data flow engine begins the task of sizing its buffers by calculating the estimated size of a single row of data. Then it multiplies the estimated size of a row by the value of DefaultBufferMaxRows to obtain a preliminary working value for the buffer size. Which statement is INCORRECT with respect to adjusting the size of buffers? (A worked calculation follows the options.)
- If the result is more than the value of DefaultBufferSize, the engine reduces the number of rows
- If the result is less than the internally-calculated minimum buffer size, the engine increases the number of rows
- If the result falls between the minimum buffer size and the value of DefaultBufferSize, the engine sizes the buffer as close as possible to the estimated row size times the value of DefaultBufferMaxRows
- When sufficient memory is available, you should use a larger number of small buffers
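The worked calculation below illustrates the sizing rule described in the question, written as plain T-SQL arithmetic. The 1,200-byte estimated row size is an assumed figure; 10 MB and 10,000 rows are the Data Flow task defaults:

```sql
DECLARE @EstimatedRowSizeBytes  int = 1200;       -- assumed bytes per row
DECLARE @DefaultBufferMaxRows   int = 10000;      -- Data Flow task default
DECLARE @DefaultBufferSizeBytes int = 10485760;   -- 10 MB default

SELECT CAST(@EstimatedRowSizeBytes AS bigint) * @DefaultBufferMaxRows AS PreliminaryBufferBytes,
       CASE WHEN CAST(@EstimatedRowSizeBytes AS bigint) * @DefaultBufferMaxRows > @DefaultBufferSizeBytes
            THEN @DefaultBufferSizeBytes / @EstimatedRowSizeBytes   -- engine reduces the row count
            ELSE @DefaultBufferMaxRows
       END AS RowsPerBuffer;                      -- here 12,000,000 bytes > 10 MB, so rows drop to 8,738
```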
21) To configure individual data flow components for better performance, there are some general guidelines that you can follow. There are also specific guidelines for each type of data flow component: source, transformation, and destination. Choose the option which DOES NOT help to improve performance.
- Set the IsSorted property on the output of an upstream data flow component to True
- Performing the sort operation in the Data Flow task enhances performance
- Specify the Data Flow task runs in optimized mode (RunInOptimizedMode property)
- When you use an OLE DB source to retrieve data from a view, select “SQL command” as the data access mode and enter a SELECT statement
22) Source tables change over time. A data mart or data warehouse that is based on those tables needs to reflect these changes. However, a process that periodically copies a snapshot of the entire source consumes too much time and resources. SQL Server has an inbuilt feature, supported from SQL Server 2008 onwards, that addresses this. Identify the process or feature. (An enabling-script sketch follows the options.)
- Include timestamp columns in the database tables
- Define triggers in the database
- Include a script or process to identify the changes
- Enable Change Data Capture feature in the database
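For context, enabling the feature named in the options is done with system stored procedures. A minimal sketch, assuming a hypothetical SalesDW database and dbo.Customer table:

```sql
USE SalesDW;  -- placeholder database
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Customer',   -- placeholder source table
     @role_name     = NULL;          -- no gating role for change-data access
```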
23) When implementing Change Data Capture using SQL Server Integration Services, there are a few steps in creating the packages. Identify the step which involves the Data Flow task.
- Retrieving and Understanding the Change Data
- Specifying an Interval of Change Data
- Determining Whether the Change Data Is Ready
- Preparing to Query for the Change Data
24) When implementing Change Data Capture using SQL Server Integration Services, there are a few steps in creating the packages. Identify the step which involves the control flow.
- Retrieving and Understanding the Change Data
- Determining Whether the Change Data Is Ready
- Processing Inserts, Updates, and Deletes
- Applying the Changes to the Destination
25) What is NOT TRUE with respect to minimally logged operations?
- It keeps track of extent allocations and metadata changes only
- Is often faster than a fully logged operation if logging is the bottleneck
- Fewer writes go to the transaction log, resulting in a much smaller log file with a lighter I/O requirement
- Cannot participate in a transaction
26) Which is a command-line utility tool for performing bulk load?
- BULK INSERT
- BCP
- Integration Services Data Destinations to perform bulk load
- SELECT INTO method to perform bulk load
27) Which is the fastest method to bulk load data when Integration Services runs on the same instance of SQL Server?
- Using Integration Services with SQL Server Destination
- Integration Services with OLE DB Destination
- Using BULK Insert command
- BCP
28) If your source data is a table inside SQL Server, which option is a quick and easy one to bulk load an empty, non-partitioned table? (Sample statements follow the options.)
- Using BCP
- Using SELECT INTO or INSERT … SELECT
- Using BULK Insert command
- Using Integration Services Data Destinations
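A hedged sketch of the two patterns named in the options, with placeholder table names; WITH (TABLOCK) assumes a heap target under the SIMPLE or BULK_LOGGED recovery model so the load can be minimally logged:

```sql
-- Create and load a new table in one statement.
SELECT * INTO dbo.Stage_Customer FROM dbo.Customer;

-- Load an existing empty table with a bulk-load-friendly table lock.
INSERT INTO dbo.Target_Customer WITH (TABLOCK)
SELECT * FROM dbo.Customer;
```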
29) What are the different package configuration types supported by Integration Services?
Option A) XML Configuration File
Option B) Text File
Option C) Environment Variable
Option D) Registry Entry
- Option A, B & C
- Option B, C & D
- Option A, C & D
- Option A, B, C & D
30) During a bulk load there are typical waits, which need to be gathered to analyze the root cause and to further investigate and eliminate bottlenecks. In such a bulk-loading scenario it is observed that the “input data is too slow”. To verify this, which wait type’s stats data needs to be collected for analysis?
- PAGEIOLATCH_<X>
- PREEMPTY_COM_<X>
- OLEDB
- ASYNC_NETWORK_IO
31) During a bulk load there are typical waits, which need to be gathered to analyze the root cause and to further investigate and eliminate bottlenecks. In such a bulk-loading scenario it is observed that the “network cannot keep up”. To verify this, which wait type’s stats data needs to be collected for analysis? (A wait-stats query sketch follows the options.)
- IMPROV_IO
- ASYNC_NETWORK_IO
- WRITELOG
- PAGELATCH_UP
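A minimal sketch of collecting the wait statistics referred to in questions 30 and 31; sys.dm_os_wait_stats is cumulative since the last service restart or manual reset:

```sql
SELECT wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type IN (N'OLEDB', N'ASYNC_NETWORK_IO', N'WRITELOG')
   OR wait_type LIKE N'PAGEIOLATCH%'
ORDER BY wait_time_ms DESC;
```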
32) Which performance object counter helps to measure the bandwidth you are getting from the NICs in the server?
- Bytes Total / sec
- Bulk copy rows / sec
- Log bytes Flushed / Sec
- Disk Read Bytes / sec
33) Which performance object counter helps to measure the number of rows coming into the database? (A DMV query sketch follows the options.)
- Disk Read Bytes / sec
- % Processor time – Total
- Bulk copy rows / sec
- Bytes Total / sec
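SQL Server-side counters such as Bulk Copy Rows/sec can also be read from T-SQL; counters owned by the operating system (Bytes Total/sec, Disk Read Bytes/sec, % Processor Time) are read from Performance Monitor instead. A minimal sketch:

```sql
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = N'Bulk Copy Rows/sec';
```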
34) When installing the SQL Server Integration Services engine on a 64-bit development environment, under which directory path are the 64-bit features installed?
- Program Files (x86)
- Program Files
- Program Files (32)
- Program Files (64)
35) With SQL Server 2012 and above, the project deployment model option is available. Which is NOT an applicable statement for the project deployment model? (A deployment-script sketch follows the options.)
- CLR integration is required on the DB engine
- Package is the Unit of deployment
- Project deployment file is referred with .ispac extension
- Packages and parameters are deployed to the SSIS Catalog
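A hedged sketch of deploying a project deployment file (.ispac) to the SSIS Catalog with the catalog stored procedures; the folder name, project name, and file path are placeholders:

```sql
DECLARE @ProjectBinary varbinary(max) =
    (SELECT BulkColumn
     FROM OPENROWSET(BULK N'C:\Deploy\ETLProject.ispac', SINGLE_BLOB) AS ispac);

EXEC SSISDB.catalog.deploy_project
     @folder_name    = N'Finance',       -- placeholder folder
     @project_name   = N'ETLProject',    -- placeholder project
     @project_stream = @ProjectBinary;
```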
36) Which of the following is used to enumerate the result set of an XML document?
- Foreach Item
- Foreach Nodelist
- Foreach ADO.NET
- Foreach SMO
37) Which of the following provides structure in packages?
- precedence constraints that connect the executables
- Containers
- Transformations connected through paths
- Annotations
38) Which of the following is not a valid case while developing a package?
- You cannot use a Foreach Loop container inside another Foreach Loop container
- You cannot call the Data Flow task as the first task in a package
- You should have at least two tasks in a package
- None of the above
39) Which of the following is a maintenance task for SSIS administrators?
- Transfer Database Task
- Execute T-SQL Statement Task
- Transfer Master Stored Procedures Task
- Execute Process Task
40) Which of the following is not present in the metadata for external columns?
- name
- datatype
- length
- value
41) An error output for a source component contains
- Same columns as the regular output
- Only error information columns
- Same columns as the regular output and error information columns
- No error output available in source component
42) If you want to load the country details stored in a flat file into SQL Server and Oracle databases, which transformation will you use?
- conditional split transformation
- lookup transformation
- multicast transformation
- merge join transformation
43) “The Slowly Changing Dimension transformation directs these rows to an output named Changing Attributes Updates Output.” What type of SCD change is supported here?
- Type 1
- Type 2
- Type 3
- Type 4
44) Which type is not supported by Slowly Changing Dimension transformation
- Type 1
- Type 2
- Type 3
- Type 4
45) The Inferred Member Updates output of the Slowly Changing Dimension transformation is used to
- Update the dimension data which is already referenced by the fact table
- Update the dimension key in fact table
- Used to insert the new dimension record
- Used to insert new fact record
46) The Fuzzy Lookup transformation requires a reference table in
- Only SQL Server
- Any RDBMS
- Only Files
- Fuzzy Lookup transformation does not require any reference table
47) Fuzzy Lookup transformation creates a new table for
- match lookup data
- match index table
- exact match data
- mismatch data
48) What is the range of the similarity score in the Fuzzy Grouping transformation?
- 0 to 1
- 0 to 100
- 1 to 10
- 1,2,3,4
49) To identify duplicate rows, what similarity threshold will you set in the Fuzzy Grouping transformation?
- closer to 1
- closer to 0
- closer to 100
- none of the above
50) The Term Extraction transformation extracts terms from
- Any language text data
- English text
- Unicode Data
- All of the above
51) If an apostrophe is in a word like “bicycle’s”, the Term Extraction Transformation will output this word as
- bicycle
- bicycles
- bicycle’s
- Term Extraction Transformation cannot identify a word with apostrophe
52) If you want to mine the highest number of tweets you made about a person from your tweet archive, which transformation will you use?
- Term Extraction Transformation
- Fuzzy Lookup Transformation
- Fuzzy Grouping Transformation
- Term Lookup Transformation
53) To find the number of times a term in the lookup table occurs in an input column, which transformation is used?
- Term Extraction Transformation
- Fuzzy Lookup Transformation
- Fuzzy Grouping Transformation
- Term Lookup Transformation
54) The lookup table used in the Term Lookup transformation must be a table in
- SQL Server
- SQL Server or Access
- RDBMS
- file
55) The Data Mining Query transformation is used to run
- T-SQL queries against data mining models
- DMX queries against data mining models
- T-SQL queries against OLTP tables and data mining models
- All of the above
56) One Data Mining Query Transformation can execute multiple prediction queries if
- The models are built on different data mining structures
- The models are built on the same data mining structure
- The Data Mining Query transformation cannot execute multiple prediction queries
- both a & b
57) The Pivot transformation is used to
- Remove duplicates in a denormalized data set
- Turn a denormalized data set into a normalized data set
- Turn a normalized data set into a denormalized data set
- none of the above
58) Which transformation can be used to transpose rows to columns?
- PIVOT Transformation
- UNPIVOT Transformation
- DERIVED COLUMN Transformation
- Can’t use a transformation, use T-SQL instead
59) Consider a table containing employees’ salary increment data stored with the corresponding year in a Year column. If you want to get the salary increment data for each employee in a single row, with every year as a separate column, which PivotUsage value do you need to set in the Pivot transformation? (A T-SQL analogue follows the options.)
- 0
- 1
- 2
- 3
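A T-SQL analogue of the shape described in the question, assuming a hypothetical dbo.SalaryIncrement(EmployeeID, [Year], Increment) table; the SSIS Pivot transformation produces the same layout by assigning PivotUsage values to the set key, pivot key, and pivoted value columns:

```sql
SELECT EmployeeID, [2021], [2022], [2023]
FROM (SELECT EmployeeID, [Year], Increment
      FROM dbo.SalaryIncrement) AS src
PIVOT (SUM(Increment) FOR [Year] IN ([2021], [2022], [2023])) AS p;
```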
60) An error output for a source component contains
- PIVOT Transformation
- UNPIVOT Transformation
- Both
- None