Snowflake ARA-R01 SnowPro Advanced: Architect Recertification Exam Practice Test

Total 162 questions

SnowPro Advanced: Architect Recertification Exam Questions and Answers

Question 1

An Architect has a design where files arrive every 10 minutes and are loaded into a primary database table using Snowpipe. A secondary database is refreshed every hour with the latest data from the primary database.

Based on this scenario, what Time Travel query options are available on the secondary database?

Options:

A.

A query using Time Travel in the secondary database is available for every hourly table version within the retention window.

B.

A query using Time Travel in the secondary database is available for every hourly table version within and outside the retention window.

C.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) in the retention window.

D.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) and outside the retention window.
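
For reference, Time Travel queries against a table version inside the retention window look like the following (a minimal sketch; the table name and timestamp are hypothetical):

SELECT *
FROM orders AT (TIMESTAMP => '2021-06-01 08:00:00'::TIMESTAMP_LTZ);

-- Or relative to the current time:
SELECT *
FROM orders AT (OFFSET => -3600);  -- state as of one hour ago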

Question 2

Files arrive in an external stage every 10 seconds from a proprietary system. The files range in size from 500 KB to 3 MB. The data must be accessible by dashboards as soon as it arrives.

How can a Snowflake Architect meet this requirement with the LEAST amount of coding? (Choose two.)

Options:

A.

Use Snowpipe with auto-ingest.

B.

Use a COPY command with a task.

C.

Use a materialized view on an external table.

D.

Use the COPY INTO command.

E.

Use a combination of a task and a stream.
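
As context for the options above, a Snowpipe with auto-ingest is defined roughly as follows (a sketch; the pipe, table, stage, and format names are hypothetical):

CREATE PIPE sales_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_sales
  FROM @ext_stage/sales/
  FILE_FORMAT = (TYPE = 'JSON');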

Question 3

A user, analyst_user, has been granted the analyst_role and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.

What steps should be taken to allow access only from approved IP addresses? (Select TWO).

Options:

A.

ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY = 'ANALYST_POLICY';

B.

ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';

C.

ALTER USER ANALYST_USER SET NETWORK_POLICY = '10.1.1.20';

D.

USE ROLE SECURITYADMIN;

CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

E.

USE ROLE USERADMIN;

CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY

ALLOWED_IP_LIST = ('10.1.1.20');
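
Taken together, the intended flow is to create the policy with a sufficiently privileged role and then attach it to the user; a minimal sketch combining the statements above:

USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';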

Question 4

How can the Snowflake context functions be used to help determine whether a user is authorized to see data that has column-level security enforced? (Select TWO).

Options:

A.

Set masking policy conditions using current_role targeting the role in use for the current session.

B.

Set masking policy conditions using is_role_in_session targeting the role in use for the current account.

C.

Set masking policy conditions using invoker_role targeting the executing role in a SQL statement.

D.

Determine if there are ownership privileges on the masking policy that would allow the use of any function.

E.

Assign the accountadmin role to the user who is executing the object.
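
For illustration, a masking policy conditioned on the session's active role might look like this (a sketch; the policy, role, table, and column names are hypothetical):

CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('HR_ANALYST') THEN val ELSE '*** MASKED ***' END;

ALTER TABLE employees MODIFY COLUMN email SET MASKING POLICY email_mask;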

Question 5

A company is using a Snowflake account in Azure. The account has SAML SSO set up using ADFS as a SCIM identity provider. To validate Private Link connectivity, an Architect performed the following steps:

* Confirmed Private Link URLs are working by logging in with a username/password account

* Verified DNS resolution by running nslookups against Private Link URLs

* Validated connectivity using SnowCD

* Disabled public access using a network policy set to use the company’s IP address range

However, the following error message is received when using SSO to log into the company account:

IP XX.XXX.XX.XX is not allowed to access snowflake. Contact your local security administrator.

What steps should the Architect take to resolve this error and ensure that the account is accessed using only Private Link? (Choose two.)

Options:

A.

Alter the Azure security integration to use the Private Link URLs.

B.

Add the IP address in the error message to the allowed list in the network policy.

C.

Generate a new SCIM access token using system$generate_scim_access_token and save it to Azure AD.

D.

Update the configuration of the Azure AD SSO to use the Private Link URLs.

E.

Open a case with Snowflake Support to authorize the Private Link URLs’ access to the account.

Question 6

When using the Snowflake Connector for Kafka, what data formats are supported for the messages? (Choose two.)

Options:

A.

CSV

B.

XML

C.

Avro

D.

JSON

E.

Parquet

Question 7

What is a characteristic of Role-Based Access Control (RBAC) as used in Snowflake?

Options:

A.

Privileges can be granted at the database level and can be inherited by all underlying objects.

B.

A user can use a "super-user" access along with securityadmin to bypass authorization checks and access all databases, schemas, and underlying objects.

C.

A user can create managed access schemas to support future grants and ensure only schema owners can grant privileges to other roles.

D.

A user can create managed access schemas to support current and future grants and ensure only object owners can grant privileges to other roles.

Question 8

An Architect has been asked to clone schema STAGING as it looked one week ago, Tuesday June 1st at 8:00 AM, to recover some objects.

The STAGING schema has 50 days of retention.

The Architect runs the following statement:

CREATE SCHEMA STAGING_CLONE CLONE STAGING at (timestamp => '2021-06-01 08:00:00');

The Architect receives the following error: Time travel data is not available for schema STAGING. The requested time is either beyond the allowed time travel period or before the object creation time.

The Architect then checks the schema history and sees the following:

CREATED_ON          | NAME    | DROPPED_ON
2021-06-02 23:00:00 | STAGING | NULL
2021-05-01 10:00:00 | STAGING | 2021-06-02 23:00:00

How can cloning the STAGING schema be achieved?

Options:

A.

Undrop the STAGING schema and then rerun the CLONE statement.

B.

Modify the statement: CREATE SCHEMA STAGING_CLONE CLONE STAGING at (timestamp => '2021-05-01 10:00:00');

C.

Rename the STAGING schema and perform an UNDROP to retrieve the previous STAGING schema version, then run the CLONE statement.

D.

Cloning cannot be accomplished because the STAGING schema version was not active during the proposed Time Travel time period.
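
A sketch of the rename-and-undrop sequence described in option C (the new schema name is hypothetical):

ALTER SCHEMA STAGING RENAME TO STAGING_CURRENT;
UNDROP SCHEMA STAGING;  -- restores the version dropped on 2021-06-02
CREATE SCHEMA STAGING_CLONE CLONE STAGING AT (TIMESTAMP => '2021-06-01 08:00:00'::TIMESTAMP_LTZ);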

Question 9

Data is being imported and stored as JSON in a VARIANT column. Query performance was fine, but most recently, poor query performance has been reported.

What could be causing this?

Options:

A.

There were JSON nulls in the recent data imports.

B.

The order of the keys in the JSON was changed.

C.

The recent data imports contained fewer fields than usual.

D.

There were variations in string lengths for the JSON values in the recent data imports.

Question 10

A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).

What is the recommended sequence of operations that must be followed to meet this requirement?

Options:

A.

1. Create a share and add the database privileges to the share

2. Create a new listing on the Snowflake Marketplace

3. Alter the listing and add the share

4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace

B.

1. Ask the customer to create a new Snowflake account in Azure EAST US 2 (Virginia)

2. Create a share and add the database privileges to the share

3. Alter the share and add the customer's Snowflake account to the share

C.

1. Create a new Snowflake account in Azure East US 2 (Virginia)

2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared

3. Create a share and add the database privileges to the share

4. Alter the share and add the customer's Snowflake account to the share

D.

1. Create a reader account in Azure East US 2 (Virginia)

2. Create a share and add the database privileges to the share

3. Add the reader account to the share

4. Share the reader account's URL and credentials with the customer
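
As context, the replication-then-share approach in option C maps to statements of roughly this shape (a sketch; organization, account, and object names are hypothetical):

-- In the AWS us-west-2 account:
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_eastus2_acct;

-- In the company's new Azure East US 2 account:
CREATE DATABASE sales_db AS REPLICA OF myorg.aws_uswest2_acct.sales_db;
ALTER DATABASE sales_db REFRESH;

CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = customer_acct;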

Question 11

An Architect needs to allow a user to create a database from an inbound share.

To meet this requirement, the user’s role must have which privileges? (Choose two.)

Options:

A.

IMPORT SHARE;

B.

IMPORT PRIVILEGES;

C.

CREATE DATABASE;

D.

CREATE SHARE;

E.

IMPORT DATABASE;
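
For context, creating a database from an inbound share involves statements along these lines (a sketch; the role, account, and share names are hypothetical):

GRANT IMPORT SHARE ON ACCOUNT TO ROLE consumer_role;
GRANT CREATE DATABASE ON ACCOUNT TO ROLE consumer_role;

USE ROLE consumer_role;
CREATE DATABASE shared_sales FROM SHARE provider_acct.sales_share;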

Question 12

When using the COPY INTO <table> command with the CSV file format, how does the MATCH_BY_COLUMN_NAME parameter behave?

Options:

A.

It expects a header to be present in the CSV file, which is matched to a case-sensitive table column name.

B.

The parameter will be ignored.

C.

The command will return an error.

D.

The command will return a warning stating that the file has unmatched columns.

The COPY INTO <table> command is used to load data from staged files into an existing table in Snowflake. The command supports various file formats, such as CSV, JSON, Avro, ORC, Parquet, and XML [1].

MATCH_BY_COLUMN_NAME is a copy option that enables loading semi-structured data into separate columns in the target table that match corresponding columns represented in the source data; it accepts the values CASE_SENSITIVE, CASE_INSENSITIVE, and NONE (the default) [2]. The option only applies to semi-structured formats such as JSON, Avro, ORC, Parquet, and XML. It does not apply to CSV, which is considered structured data, so it has no effect when loading CSV files [2].

References:

  • [1] COPY INTO <table> | Snowflake Documentation
  • [2] MATCH_BY_COLUMN_NAME | Snowflake Documentation
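
To illustrate where the option does apply, a semi-structured load might look like this (a sketch; the table, stage, and format names are hypothetical):

COPY INTO target_table
FROM @my_stage/data/
FILE_FORMAT = (TYPE = 'PARQUET')
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
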
    Question 13

    Which columns can be included in an external table schema? (Select THREE).

    Options:

    A.

    VALUE

    B.

    METADATA$ROW_ID

    C.

    METADATA$ISUPDATE

    D.

    METADATA$FILENAME

    E.

    METADATA$FILE_ROW_NUMBER

    F.

    METADATA$EXTERNAL_TABLE_PARTITION
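
    As a reference point, external table pseudocolumns can be queried directly (a sketch; the table name is hypothetical):

    SELECT METADATA$FILENAME,
           METADATA$FILE_ROW_NUMBER,
           VALUE
    FROM ext_sales_table
    LIMIT 10;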

    Question 14

    The diagram shows the process flow for Snowpipe auto-ingest with Amazon Simple Notification Service (SNS) with the following steps:

    Step 1: Data files are loaded in a stage.

    Step 2: An Amazon S3 event notification, published by SNS, informs Snowpipe, by way of Amazon Simple Queue Service (SQS), that files are ready to load. Snowpipe copies the files into a queue.

    Step 3: A Snowflake-provided virtual warehouse loads data from the queued files into the target table based on parameters defined in the specified pipe.

    If an AWS Administrator accidentally deletes the SQS subscription to the SNS topic in Step 2, what will happen to the pipe that references the topic to receive event messages from Amazon S3?

    Options:

    A.

    The pipe will continue to receive the messages as Snowflake will automatically restore the subscription to the same SNS topic and will recreate the pipe by specifying the same SNS topic name in the pipe definition.

    B.

    The pipe will no longer be able to receive the messages and the user must wait for 24 hours from the time when the SNS topic subscription was deleted. Pipe recreation is not required as the pipe will reuse the same subscription to the existing SNS topic after 24 hours.

    C.

    The pipe will continue to receive the messages as Snowflake will automatically restore the subscription by creating a new SNS topic. Snowflake will then recreate the pipe by specifying the new SNS topic name in the pipe definition.

    D.

    The pipe will no longer be able to receive the messages. To restore the system immediately, the user needs to manually create a new SNS topic with a different name and then recreate the pipe by specifying the new SNS topic name in the pipe definition.

    Question 15

    An Architect needs to improve the performance of reports that pull data from multiple Snowflake tables, join, and then aggregate the data. Users access the reports using several dashboards. There are performance issues on Monday mornings between 9:00am and 11:00am, when many users check the sales reports.

    The size of the group has increased from 4 to 8 users, and waiting times to refresh the dashboards have increased significantly. Currently this workload is being served by a virtual warehouse with the following parameters:

    AUTO_RESUME = TRUE, AUTO_SUSPEND = 60, SIZE = Medium

    What is the MOST cost-effective way to increase the availability of the reports?

    Options:

    A.

    Use materialized views and pre-calculate the data.

    B.

    Increase the warehouse to size Large and set auto_suspend = 600.

    C.

    Use a multi-cluster warehouse in maximized mode with 2 size Medium clusters.

    D.

    Use a multi-cluster warehouse in auto-scale mode with 1 size Medium cluster, and set min_cluster_count = 1 and max_cluster_count = 4.
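
    For reference, option D corresponds to a warehouse configuration of this shape (a sketch; the warehouse name is hypothetical):

    ALTER WAREHOUSE reporting_wh SET
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 4
      SCALING_POLICY = 'STANDARD';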

    Question 16

    Which data models can be used when modeling tables in a Snowflake environment? (Select THREE).

    Options:

    A.

    Graph model

    B.

    Dimensional/Kimball

    C.

    Data lake

    D.

    Inmon/3NF

    E.

    Bayesian hierarchical model

    F.

    Data vault

    Question 17

    Why might a Snowflake Architect use a star schema model rather than a 3NF model when designing a data architecture to run in Snowflake? (Select TWO).

    Options:

    A.

    Snowflake cannot handle the joins implied in a 3NF data model.

    B.

    The Architect wants to remove data duplication from the data stored in Snowflake.

    C.

    The Architect is designing a landing zone to receive raw data into Snowflake.

    D.

    The BI tool needs a data model that allows users to summarize facts across different dimensions, or to drill down from the summaries.

    E.

    The Architect wants to present a simple flattened single view of the data to a particular group of end users.

    Question 18

    Which Snowflake architecture recommendation needs multiple Snowflake accounts for implementation?

    Options:

    A.

    Enable a disaster recovery strategy across multiple cloud providers.

    B.

    Create external stages pointing to cloud providers and regions other than the region hosting the Snowflake account.

    C.

    Enable zero-copy cloning among the development, test, and production environments.

    D.

    Enable separation of the development, test, and production environments.

    Question 19

    A company wants to deploy its Snowflake accounts inside its corporate network with no visibility on the internet. The company is using a VPN infrastructure and Virtual Desktop Infrastructure (VDI) for its Snowflake users. The company also wants to re-use the login credentials set up for the VDI to eliminate redundancy when managing logins.

    What Snowflake functionality should be used to meet these requirements? (Choose two.)

    Options:

    A.

    Set up replication to allow users to connect from outside the company VPN.

    B.

    Provision a unique company Tri-Secret Secure key.

    C.

    Use private connectivity from a cloud provider.

    D.

    Set up SSO for federated authentication.

    E.

    Use a proxy Snowflake account outside the VPN, enabling client redirect for user logins.

    Question 20

    A new user user_01 is created within Snowflake. The following two commands are executed:

    Command 1 -> show grants to user user_01;

    Command 2 -> show grants on user user_01;

    What inferences can be made about these commands?

    Options:

    A.

    Command 1 defines which user owns user_01

    Command 2 defines all the grants which have been given to user_01

    B.

    Command 1 defines all the grants which are given to user_01

    Command 2 defines which user owns user_01

    C.

    Command 1 defines which role owns user_01

    Command 2 defines all the grants which have been given to user_01

    D.

    Command 1 defines all the grants which are given to user_01

    Command 2 defines which role owns user_01

    Question 21

    There are two databases in an account, named fin_db and hr_db, which contain payroll and employee data, respectively. Accountants and Analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db because the database is maintained by human resources personnel.

    An Architect needs to create a read-only role for certain employees working in the human resources department.

    Which permission sets must be granted to this role?

    Options:

    A.

    USAGE on database hr_db, USAGE on all schemas in database hr_db, SELECT on all tables in database hr_db

    B.

    USAGE on database hr_db, SELECT on all schemas in database hr_db, SELECT on all tables in database hr_db

    C.

    MODIFY on database hr_db, USAGE on all schemas in database hr_db, USAGE on all tables in database hr_db

    D.

    USAGE on database hr_db, USAGE on all schemas in database hr_db, REFERENCES on all tables in database hr_db
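
    Option A maps to grants of this shape (a sketch; the role name is hypothetical):

    CREATE ROLE hr_read_only;
    GRANT USAGE ON DATABASE hr_db TO ROLE hr_read_only;
    GRANT USAGE ON ALL SCHEMAS IN DATABASE hr_db TO ROLE hr_read_only;
    GRANT SELECT ON ALL TABLES IN DATABASE hr_db TO ROLE hr_read_only;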

    Question 22

    A company needs to have the following features available in its Snowflake account:

    1. Support for Multi-Factor Authentication (MFA)

    2. A minimum of 2 months of Time Travel availability

    3. Database replication between different regions

    4. Native support for JDBC and ODBC

    5. Customer-managed encryption keys using Tri-Secret Secure

    6. Support for Payment Card Industry Data Security Standards (PCI DSS)

    In order to provide all the listed services, what is the MINIMUM Snowflake edition that should be selected during account creation?

    Options:

    A.

    Standard

    B.

    Enterprise

    C.

    Business Critical

    D.

    Virtual Private Snowflake (VPS)

    Question 23

    At which object type level can the APPLY MASKING POLICY, APPLY ROW ACCESS POLICY and APPLY SESSION POLICY privileges be granted?

    Options:

    A.

    Global

    B.

    Database

    C.

    Schema

    D.

    Table
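
    For context, these APPLY privileges are account-level grants of this shape (a sketch; the role name is hypothetical):

    GRANT APPLY MASKING POLICY ON ACCOUNT TO ROLE governance_role;
    GRANT APPLY ROW ACCESS POLICY ON ACCOUNT TO ROLE governance_role;
    GRANT APPLY SESSION POLICY ON ACCOUNT TO ROLE governance_role;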

    Question 24

    Which Snowflake objects can be used in a data share? (Select TWO).

    Options:

    A.

    Standard view

    B.

    Secure view

    C.

    Stored procedure

    D.

    External table

    E.

    Stream

    Question 25

    An Architect is designing a solution that will be used to process changed records in an orders table. Newly-inserted orders must be loaded into the f_orders fact table, which will aggregate all the orders by multiple dimensions (time, region, channel, etc.). Existing orders can be updated by the sales department within 30 days after the order creation. In case of an order update, the solution must perform two actions:

    1. Update the order in the f_orders fact table.

    2. Load the changed order data into the special table ORDER_REPAIRS.

    This table is used by the Accounting department once a month. If the order has been changed, the Accounting team needs to know the latest details and perform the necessary actions based on the data in the order_repairs table.

    What data processing logic design will be the MOST performant?

    Options:

    A.

    Use one stream and one task.

    B.

    Use one stream and two tasks.

    C.

    Use two streams and one task.

    D.

    Use two streams and two tasks.

    Question 26

    An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

    What is the reason for this?

    Options:

    A.

    The query is processing a very large dataset.

    B.

    The query has overly complex logic.

    C.

    The query is queued for execution.

    D.

    The query is reading from remote storage.

    Question 27

    An Architect is designing a data lake with Snowflake. The company has structured, semi-structured, and unstructured data. The company wants to save the data inside the data lake within the Snowflake system. The company is planning on sharing data among its corporate branches using Snowflake data sharing.

    What should be considered when sharing the unstructured data within Snowflake?

    Options:

    A.

    A pre-signed URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with no time limit for the URL.

    B.

    A scoped URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 24-hour time limit for the URL.

    C.

    A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 7-day time limit for the URL.

    D.

    A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with the "expiration_time" argument defined for the URL time limit.
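
    For reference, the URL-generating functions referenced in the options are invoked like this (a sketch; the stage and file names are hypothetical):

    SELECT BUILD_SCOPED_FILE_URL(@docs_stage, 'reports/q1.pdf');   -- scoped URL, expires after 24 hours
    SELECT BUILD_STAGE_FILE_URL(@docs_stage, 'reports/q1.pdf');    -- permanent file URL
    SELECT GET_PRESIGNED_URL(@docs_stage, 'reports/q1.pdf', 3600); -- pre-signed URL with expiration in seconds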

    Question 28

    Which query will identify the specific days and virtual warehouses that would benefit from a multi-cluster warehouse to improve the performance of a particular workload?

    [Options A through D are SQL queries that were shown as images and are not reproduced here.]

    Options:

    A.

    Option A

    B.

    Option B

    C.

    Option C

    D.

    Option D

    Question 29

    A company's Architect needs to find an efficient way to get data from an external partner, who is also a Snowflake user. The current solution is based on daily JSON extracts that are placed on an FTP server and uploaded to Snowflake manually. The files are changed several times each month, and the ingestion process needs to be adapted to accommodate these changes.

    What would be the MOST efficient solution?

    Options:

    A.

    Ask the partner to create a share and add the company's account.

    B.

    Ask the partner to use the data lake export feature and place the data into cloud storage where Snowflake can natively ingest it (schema-on-read).

    C.

    Keep the current structure but request that the partner stop changing files, instead only appending new files.

    D.

    Ask the partner to set up a Snowflake reader account and use that account to get the data for ingestion.

    Question 30

    A retailer's enterprise data organization is exploring the use of Data Vault 2.0 to model its data lake solution. A Snowflake Architect has been asked to provide recommendations for using Data Vault 2.0 on Snowflake.

    What should the Architect tell the data organization? (Select TWO).

    Options:

    A.

    Change data capture can be performed using the Data Vault 2.0 HASH_DIFF concept.

    B.

    Change data capture can be performed using the Data Vault 2.0 HASH_DELTA concept.

    C.

    Using the multi-table insert feature in Snowflake, multiple Point-in-Time (PIT) tables can be loaded in parallel from a single join query from the data vault.

    D.

    Using the multi-table insert feature, multiple Point-in-Time (PIT) tables can be loaded sequentially from a single join query from the data vault.

    E.

    There are performance challenges when using Snowflake to load multiple Point-in-Time (PIT) tables in parallel from a single join query from the data vault.
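
    A minimal sketch of a Snowflake multi-table insert loading two PIT tables from a single join query (all names are hypothetical):

    INSERT ALL
      INTO pit_customer_daily  (hub_key, load_dts) VALUES (hub_key, load_dts)
      INTO pit_customer_weekly (hub_key, load_dts) VALUES (hub_key, load_dts)
    SELECT h.hub_key AS hub_key, s.load_dts AS load_dts
    FROM hub_customer h
    JOIN sat_customer s ON h.hub_key = s.hub_key;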

    Question 31

    A user named USER_01 needs access to create a materialized view on the schema EDW.STG_SCHEMA. How can this access be provided?

    Options:

    A.

    GRANT CREATE MATERIALIZED VIEW ON SCHEMA EDW.STG_SCHEMA TO USER USER_01;

    B.

    GRANT CREATE MATERIALIZED VIEW ON DATABASE EDW TO USER USER_01;

    C.

    GRANT ROLE NEW_ROLE TO USER USER_01;

    GRANT CREATE MATERIALIZED VIEW ON SCHEMA EDW.STG_SCHEMA TO NEW_ROLE;

    D.

    GRANT ROLE NEW_ROLE TO USER_01;

    GRANT CREATE MATERIALIZED VIEW ON EDW.STG_SCHEMA TO NEW_ROLE;

    Question 32

    Which of the following objects can be cloned in Snowflake?

    Options:

    A.

    Permanent table

    B.

    Transient table

    C.

    Temporary table

    D.

    External tables

    E.

    Internal stages

    Question 33

    What step will improve the performance of queries executed against an external table?

    Options:

    A.

    Partition the external table.

    B.

    Shorten the names of the source files.

    C.

    Convert the source files' character encoding to UTF-8.

    D.

    Use an internal stage instead of an external stage to store the source files.
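
    A partitioned external table is defined roughly as follows, deriving the partition column from the file path (a sketch; the names and path layout are hypothetical):

    CREATE EXTERNAL TABLE ext_sales (
      sale_date DATE AS TO_DATE(SPLIT_PART(METADATA$FILENAME, '/', 3), 'YYYY-MM-DD')
    )
    PARTITION BY (sale_date)
    LOCATION = @ext_stage/sales/
    FILE_FORMAT = (TYPE = 'PARQUET');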

    Question 34

    A user has the appropriate privilege to see unmasked data in a column.

    If the user loads this column data into another column that does not have a masking policy, what will occur?

    Options:

    A.

    Unmasked data will be loaded in the new column.

    B.

    Masked data will be loaded into the new column.

    C.

    Unmasked data will be loaded into the new column but only users with the appropriate privileges will be able to see the unmasked data.

    D.

    Unmasked data will be loaded into the new column and no users will be able to see the unmasked data.

    Question 35

    An Architect needs to design a solution for building environments for development, test, and pre-production, all located in a single Snowflake account. The environments should be based on production data.

    Which solution would be MOST cost-effective and performant?

    Options:

    A.

    Use zero-copy cloning into transient tables.

    B.

    Use zero-copy cloning into permanent tables.

    C.

    Use CREATE TABLE ... AS SELECT (CTAS) statements.

    D.

    Use a Snowflake task to trigger a stored procedure to copy data.
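
    Option A corresponds to statements of this shape (a sketch; the object names are hypothetical):

    CREATE TRANSIENT TABLE dev_orders CLONE prod_db.sales.orders;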

    Question 36

    How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

    Options:

    A.

    Shared databases are read-only.

    B.

    Shared databases must be refreshed in order for new data to be visible.

    C.

    Shared databases cannot be cloned.

    D.

    Shared databases are not supported by Time Travel.

    E.

    Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.

    F.

    Shared databases can also be created as transient databases.

    Question 37

    The following DDL command was used to create a task based on a stream (the DDL is shown as an image that is not reproduced here):

    Assuming MY_WH is set to AUTO_SUSPEND = 60 and is used exclusively for this task, which statement is true?

    Options:

    A.

    The warehouse MY_WH will be made active every five minutes to check the stream.

    B.

    The warehouse MY_WH will only be active when there are results in the stream.

    C.

    The warehouse MY_WH will never suspend.

    D.

    The warehouse MY_WH will automatically resize to accommodate the size of the stream.
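
    A stream-based task of the shape implied by the question typically looks like this (a sketch; all names other than MY_WH are hypothetical):

    CREATE TASK process_orders_task
      WAREHOUSE = MY_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO orders_target SELECT * FROM orders_stream;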

    Question 38

    An Architect has chosen to separate their Snowflake Production and QA environments using two separate Snowflake accounts.

    The QA account is intended to run and test changes on data and database objects before pushing those changes to the Production account. All database objects and data in the QA account, including privileges, must be an exact copy of those in the Production account on at least a nightly basis.

    Which is the LEAST complex approach to use to populate the QA account with the Production account’s data and database objects on a nightly basis?

    Options:

    A.

    1) Create a share in the Production account for each database

    2) Share access to the QA account as a Consumer

    3) The QA account creates a database directly from each share

    4) Create clones of those databases on a nightly basis

    5) Run tests directly on those cloned databases

    B.

    1) Create a stage in the Production account

    2) Create a stage in the QA account that points to the same external object-storage location

    3) Create a task that runs nightly to unload each table in the Production account into the stage

    4) Use Snowpipe to populate the QA account

    C.

    1) Enable replication for each database in the Production account

    2) Create replica databases in the QA account

    3) Create clones of the replica databases on a nightly basis

    4) Run tests directly on those cloned databases

    D.

    1) In the Production account, create an external function that connects into the QA account and returns all the data for one specific table

    2) Run the external function as part of a stored procedure that loops through each table in the Production account and populates each table in the QA account

    Question 39

    A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

    The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region.

    How should the company's Architect configure the data share?

    Options:

    A.

    1. Create a share.

    2. Add objects to the share.

    3. Add a consumer account to the share for the vendor to access.

    B.

    1. Create a share.

    2. Create a reader account for the vendor to use.

    3. Add the reader account to the share.

    C.

    1. Create a new role called db_share.

    2. Grant the db_share role privileges to read data from the company database and schema.

    3. Create a user for the vendor.

    4. Grant the db_share role to the vendor's users.

    D.

    1. Promote an existing database in the company's local account to primary.

    2. Replicate the database to Snowflake on Azure in the West Europe region.

    3. Create a share and add objects to the share.

    4. Add a consumer account to the share for the vendor to access.

    Question 40

    Which command will create a schema without Fail-safe and will restrict object owners from passing on access to other users?

    Options:

    A.

    create schema EDW.ACCOUNTING WITH MANAGED ACCESS;

    B.

    create schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

    C.

    create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 1;

    D.

    create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

    Question 41

    Company A has recently acquired company B. The Snowflake deployment for company B is located in the Azure West Europe region.

    As part of the integration process, an Architect has been asked to consolidate company B's sales data into company A's Snowflake account which is located in the AWS us-east-1 region.

    How can this requirement be met?

    Options:

    A.

    Replicate the sales data from company B's Snowflake account into company A's Snowflake account using cross-region data replication within Snowflake. Configure a direct share from company B's account to company A's account.

    B.

    Export the sales data from company B's Snowflake account as CSV files, and transfer the files to company A's Snowflake account. Import the data using Snowflake's data loading capabilities.

    C.

    Migrate company B's Snowflake deployment to the same region as company A's Snowflake deployment, ensuring data locality. Then perform a direct database-to-database merge of the sales data.

    D.

    Build a custom data pipeline using Azure Data Factory or a similar tool to extract the sales data from company B's Snowflake account. Transform the data, then load it into company A's Snowflake account.

    Question 42

    Which system functions does Snowflake provide to monitor clustering information within a table? (Choose two.)

    Options:

    A.

    SYSTEM$CLUSTERING_INFORMATION

    B.

    SYSTEM$CLUSTERING_USAGE

    C.

    SYSTEM$CLUSTERING_DEPTH

    D.

    SYSTEM$CLUSTERING_KEYS

    E.

    SYSTEM$CLUSTERING_PERCENT
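
    For reference, the documented clustering monitoring functions are invoked like this (a sketch; the table and column names are hypothetical):

    SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date)');
    SELECT SYSTEM$CLUSTERING_DEPTH('sales', '(sale_date)');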

    Question 43

    An Architect needs to design a data unloading strategy for Snowflake that will be used with the COPY INTO <location> command.

    Which configuration is valid?

    Options:

    A.

    • Location of files: Snowflake internal location
    • File formats: CSV, XML
    • File encoding: UTF-8
    • Encryption: 128-bit

    B.

    • Location of files: Amazon S3
    • File formats: CSV, JSON
    • File encoding: Latin-1 (ISO-8859)
    • Encryption: 128-bit

    C.

    • Location of files: Google Cloud Storage
    • File formats: Parquet
    • File encoding: UTF-8
    • Compression: gzip

    D.

    • Location of files: Azure ADLS
    • File formats: JSON, XML, Avro, Parquet, ORC
    • Compression: bzip2
    • Encryption: User-supplied key
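
    For context, an unload with COPY INTO <location> looks roughly like this (a sketch; the stage and table names are hypothetical):

    COPY INTO @unload_stage/extract/data_
    FROM sales
    FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP')
    HEADER = TRUE;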

    Question 44

    A user can change object parameters using which of the following roles?

    Options:

    A.

    ACCOUNTADMIN, SECURITYADMIN

    B.

    SYSADMIN, SECURITYADMIN

    C.

    ACCOUNTADMIN, USER with PRIVILEGE

    D.

    SECURITYADMIN, USER with PRIVILEGE

    Question 45

    A data platform team creates two multi-cluster virtual warehouses with the AUTO_SUSPEND value set to NULL on one, and '0' on the other. What would be the execution behavior of these virtual warehouses?

    Options:

    A.

    Setting a '0' or NULL value means the warehouses will never suspend.

    B.

    Setting a '0' or NULL value means the warehouses will suspend immediately.

    C.

    Setting a '0' or NULL value means the warehouses will suspend after the default of 600 seconds.

    D.

    Setting a '0' value means the warehouses will suspend immediately, and NULL means the warehouses will never suspend.

    Question 46

    An Architect is using SnowCD to investigate a connectivity issue.

    Which system function will provide a list of endpoints that the network must be able to access to use a specific Snowflake account, leveraging private connectivity?

    Options:

    A.

    SYSTEM$ALLOWLIST()

    B.

    SYSTEM$GET_PRIVATELINK

    C.

    SYSTEM$AUTHORIZE_PRIVATELINK

    D.

    SYSTEM$ALLOWLIST_PRIVATELINK()
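
    For reference, these system functions take no arguments and return JSON describing the endpoints to allow (a sketch):

    SELECT SYSTEM$ALLOWLIST();             -- endpoints for public connectivity
    SELECT SYSTEM$ALLOWLIST_PRIVATELINK(); -- endpoints for private connectivity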

    Question 47

    A table, EMP_TBL, has three records as shown (image not reproduced here).

    The following variables are set for the session (image not reproduced here):

    Which SELECT statements will retrieve all three records? (Select TWO).

    Options:

    A.

    SELECT * FROM $tbl_ref WHERE $col_ref IN ('Name1','Nam2','Name3');

    B.

    SELECT * FROM EMP_TBL WHERE identifier($col_ref) IN ('Name1','Name2','Name3');

    C.

    SELECT * FROM identifier WHERE NAME IN ($var1, $var2, $var3);

    D.

    SELECT * FROM identifier($tbl_ref) WHERE ID IN ('var1','var2','var3');

    E.

    SELECT * FROM $tbl_ref WHERE $col_ref IN ($var1, $var2, $var3);
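
    For context, session variables and IDENTIFIER() compose as in options B and E (a sketch; the variable values are hypothetical):

    SET tbl_ref = 'EMP_TBL';
    SET col_ref = 'NAME';
    SET (var1, var2, var3) = ('Name1', 'Name2', 'Name3');

    SELECT * FROM IDENTIFIER($tbl_ref)
    WHERE IDENTIFIER($col_ref) IN ($var1, $var2, $var3);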

    Question 48

    A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.

    Which requirements will be addressed with this approach? (Choose two.)

    Options:

    A.

    There needs to be fewer objects per tenant.

    B.

    Security and Role-Based Access Control (RBAC) policies must be simple to configure.

    C.

    Compute costs must be optimized.

    D.

    Tenant data shape may be unique per tenant.

    E.

    Storage costs must be optimized.
