
Hitachi HCE-5920 Hitachi Vantara Certified Specialist - Pentaho Data Integration Implementation Exam Practice Test

Page: 1 / 6
Total 60 questions

Hitachi Vantara Certified Specialist - Pentaho Data Integration Implementation Questions and Answers

Question 1

In a PDI transformation, you are retrieving data from a large lookup table using a Database Lookup step. To improve performance, you enable caching in the step and use the 'Load all data from table' option.

In this scenario, which three statements are correct about the data flow of the Database Lookup step? (Choose three.)

Options:

A. When caching is enabled, only rows with matching lookup values will be passed through.

B. There must be enough allocated heap space to store the lookup fields in memory.

C. Cached comparisons are case sensitive.

D. Every input row must have only one matching row in the lookup table.

E. Only one matching row is used from the lookup table.
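
For option B in particular, 'Load all data from table' keeps the entire lookup set in the JVM heap, so the PDI client may need a larger heap than the default. A minimal sketch, assuming the heap sizes below are only illustrative, of raising the heap before launching Spoon (the PDI launch scripts read PENTAHO_DI_JAVA_OPTIONS):

    # Give the PDI JVM enough heap to hold the fully cached lookup table
    # (sizes are illustrative assumptions; tune to your data volume)
    export PENTAHO_DI_JAVA_OPTIONS="-Xms2g -Xmx8g"
    ./spoon.sh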

Question 2

Which PDI step or entry processes data within the Hadoop cluster?

Options:

A. the Hadoop File Output step

B. the Hadoop File Input step

C. the Pentaho MapReduce entry

D. the Hadoop Copy Files entry

Question 3

To simplify the migration of a PDI solution from one environment to another, you need to externalize all database connection details.

Which two methods will satisfy this requirement? (Choose two.)


Options:

A. Use the Data Source Wizard to generate the connection details.

B. Create a properties file that includes the connection details and use the Set Variables step to load these properties.

C. Use Manage Data Source to create a new connection in Spoon.

D. Set the connection details in the Kettle properties file.
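
As context for options B and D, a minimal sketch of externalized connection details; the property names and values are hypothetical, and the same ${...} references work whether the properties come from kettle.properties or from a file loaded with the Set Variables step:

    # kettle.properties (or a project-specific .properties file)
    # property names below are illustrative assumptions
    STAGING_DB_HOST=dbserver01
    STAGING_DB_PORT=5432
    STAGING_DB_NAME=warehouse
    STAGING_DB_USER=etl_user

    # Database connection dialog fields then reference the variables:
    # Host Name:     ${STAGING_DB_HOST}
    # Port Number:   ${STAGING_DB_PORT}
    # Database Name: ${STAGING_DB_NAME}
    # User Name:     ${STAGING_DB_USER}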

Question 4

You want to make a dynamic PDI transformation that is driven by variables that are loaded from a properties file.

Which free-form text fields within a step can be configured with variables?

Options:

A. Any free-form text field with a 'V' next to it

B. Any free-form text field with a '@' sign next to it

C. Any free-form text field with a '$' sign next to it

D. Any free-form text field with the variable name entered in all caps
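
As an illustration of the syntax accepted by such variable-enabled fields (the variable names and path are hypothetical), a filename field might be populated as:

    ${INPUT_DIR}/sales_${RUN_DATE}.csv

At run time PDI resolves INPUT_DIR and RUN_DATE from kettle.properties, job or transformation parameters, or a prior Set Variables step.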

Question 5

Which three file formats are splittable on HDFS? (Choose three.)

Options:

A. txt

B. xml

C. Parquet

D. xlsx

E. Avro

Question 6

A customer has an existing PDI job that calls a transformation. They want to execute the transformation through Spark on their Hadoop cluster.

Which change must be made to satisfy this requirement?

Options:

A. Change the Parameters of the transformation

B. Change the Run Configuration of the transformation

C. Change the Logging options of the transformation

D. Change the Arguments of the transformation

Question 7

You are migrating to a new version of the Pentaho server and you want to import your old repository.

What are two methods to accomplish this task? (Choose two.)

Options:

A. Use the pan script.

B. Use the import-export script.

C. Use Spoon: Tools > Repository > Import Repository.

D. Use the encr script.
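
As a hedged sketch of option B: the exact flags of the import-export script vary by server version and should be checked against its built-in help, so everything below (URL, credentials, paths, and flag names) is an illustrative assumption:

    # Import a repository export file into the new Pentaho Server
    ./import-export.sh --import \
        --url=http://localhost:8080/pentaho \
        --username=admin --password=password \
        --path=/public \
        --file-path=/tmp/old_repository_export.zip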

Question 8

You have used four copies of the Sort rows step in parallel on the local JVM using the 'Change number of copies to start' option.

Each of the sorted streams needs to be merged together to ensure the proper sort sequence.

Which step should be used in this scenario?

Options:

A. the Sorted Merge step

B. the Merge Join step

C. the Dummy step

D. the Append Streams step

Question 9

A transformation is running in a production environment and you want to monitor it in real time.

Which tool should you use?

Options:

A. Pentaho Operations Mart

B. Kettle status page

C. Log4j

D. Monitoring tab
