From Domo to Snowflake: Setting Up Write Access

In the first article in this series, we configured the Snowflake environment for connecting Domo to Snowflake using Cloud Amplifier. Then we set up read access to Snowflake. This article covers writeback with Cloud Amplifier.

Connect Domo Cloud Amplifier

As in the last article, this walkthrough assumes you have at least demo Snowflake and Domo accounts. The screenshots are from my demo accounts – yours may look different but the steps to implement are the same.

First, log in to your Domo account and click the Data icon in the top menu bar. Next, click the Data Warehouse icon in the left menu bar. Now, click the Domo icon in the upper right corner of the “Data Wheel”. Click the + button to add a new cloud account. Next, click the Snowflake icon and then click “Add New Integration”.

Initial Domo Setup

Configure Snowflake Connection

The screen that opens after choosing “Add New Integration” is where you’ll need all the Snowflake information we configured in the first article in this series. The required items for the connection are listed below:

  1. Integration Name: a unique integration name in Domo. For this demo, I’m going to use DOMO_WRITER because this integration will only be used for writing to Snowflake.
  2. Snowflake Connection URL: the URL for your Snowflake account, which looks like XXXXXXXX.snowflakecomputing.com, where the X’s represent your account. Do not include https:// in the URL. You can get this information from your Snowflake account through Snowsight.
  3. Snowflake Username: the username of the account that connects to Snowflake. I’ll use the writer service account created previously: DOMO_WRITER_SVC_01.
  4. Snowflake Password: the password for the account above.

Additional Settings

In the connection setup screen there are two additional settings you may want to change based upon your use case.

  1. Check for data updates every: this setting controls how frequently Domo checks for data changes in Snowflake. Matching this value to the frequency of your Snowflake data loads will help control Snowflake costs, since every “check” from Domo consumes Snowflake credits to query metadata for changes. If your data loads only once a day, there’s no need for Domo to check every 15 minutes.
  2. Cache TTL: this sets how long the data remains cached in Domo.
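
These metadata checks are visible on the Snowflake side. As a sketch, a query along these lines against the ACCOUNT_USAGE share (which requires appropriate privileges and assumes the service account name from the first article) counts how many queries Domo issues per day:

/*ILLUSTRATIVE: COUNT QUERIES ISSUED BY THE DOMO SERVICE ACCOUNT PER DAY */
SELECT DATE_TRUNC('DAY', START_TIME) AS QUERY_DAY,
       COUNT(*) AS QUERY_COUNT
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE USER_NAME = 'DOMO_WRITER_SVC_01'
GROUP BY QUERY_DAY
ORDER BY QUERY_DAY DESC;

If the counts look high relative to your load schedule, lengthening the check interval is an easy cost win.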

After completing the fields as shown, click the Next button to continue with setup.

Snowflake Setup in Domo

Configuring Write Operations

Once the connection is configured, Domo will show the list of compute warehouses the account has access to. In the operations selection box, check the LOAD and QUERY options. These options allow our connection to both read from and write to Snowflake.

Choosing Operation for Snowflake Warehouse

Setup Write Integration

This demo is primarily about establishing the WRITEBACK portion of Cloud Amplifier. Next, we’ll click “Configure Write Access”. Domo will walk us through the additional SQL setup in Snowflake needed to allow writeback. Note that we’ll need the Snowflake role and database name we created in the initial setup article.

Enter the role and database name configured in the initial setup, then click Generate SQL. Domo will generate three SQL commands to run in your Snowflake account. The first should look familiar, as we already ran it as part of our initial setup. The second and third commands need to be copied and executed in Snowflake to set up the integration between Snowflake and Domo.
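
The exact commands are generated per account, so don’t copy the shape below verbatim; it’s only meant to show that the integration command follows Snowflake’s standard storage integration syntax, with the AWS role ARN and S3 location supplied by Domo (the values below are placeholders, not real ones):

CREATE STORAGE INTEGRATION DOMO_DB_DOMO_INTEGRATION
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = 'S3'
    ENABLED = TRUE
    STORAGE_AWS_ROLE_ARN = '<ARN PROVIDED BY DOMO>'
    STORAGE_ALLOWED_LOCATIONS = ('<S3 LOCATION PROVIDED BY DOMO>');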

We’ll need several pieces of information from the DESC command in Snowflake to finish the connection. After configuring the storage integration, run the DESC query below and click the Next button. In the fields provided, copy/paste the values from Snowflake into Domo for the AWS Connection, then click Next.

DESC INTEGRATION DOMO_DB_DOMO_INTEGRATION;
SELECT "property","property_value" FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "property" IN ('STORAGE_AWS_IAM_USER_ARN','STORAGE_AWS_EXTERNAL_ID');

Finally, copy the last SQL command and execute it in Snowflake to complete the setup of the stages Domo needs to write to Snowflake. After doing so, click Next. Domo will show one last screen asking you to confirm that Domo can make changes to your Snowflake account. Check the box and click Next.
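
Before confirming, you can sanity-check on the Snowflake side that the stages were created. This assumes the DOMO_DB database from the initial setup article:

SHOW STAGES IN DATABASE DOMO_DB;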

Writing Data From Domo to Snowflake

With the writeback connection set up, we can now use Domo’s Magic ETL to write data back to Snowflake. Setup in Magic ETL is the same as writing back to Domo, except that in the target dataset we’ll specify DOMO_WRITER as the cloud target.

Snowflake Writeback from Domo

Domo will write the data back to the database defined in the connection; however, at the time of this article, specifying the schema location for the data is not supported. Domo will create a new schema in the database and then write the data there.
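
If downstream processes expect the data in a specific schema, one workaround is a view in the schema you want that points at the table Domo creates. The schema and table names below are placeholders for whatever Domo generates in your account:

/*ILLUSTRATIVE WORKAROUND - SCHEMA AND TABLE NAMES ARE PLACEHOLDERS */
CREATE OR REPLACE VIEW DOMO_DB.DOMO_READ_SCH.WRITEBACK_DATA AS
    SELECT * FROM DOMO_DB.<DOMO_CREATED_SCHEMA>.WRITEBACK_DATA;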

Building Magic ETL to Write to Snowflake
SnowflakeTable after Magic ETL

Wrapping Up

In this article, we walked through setting up Domo for write access to Snowflake using Cloud Amplifier. With so many built-in Domo connectors (Twitter, Facebook, Excel, and many others), imagine the possibilities of pairing Domo’s ETL tools to import data into Snowflake, leveraging the power of Snowflake’s data processing capabilities, and finally using Domo’s visualization tools for display.

Snowflake and Domo work well together and the integrations are getting even better. Hopefully in the near future we’ll be able to specify the database and schema for writing back to Snowflake.

Follow me on LinkedIn and Medium for more content on Data Management and demos including Snowflake, Streamlit, and SQL Server.

From Snowflake to Domo: Setting Up Seamless Read Access

In the first article in this series, we configured the Snowflake environment for connecting Domo to Snowflake using Cloud Amplifier. This article covers the Domo configuration for read access to Snowflake; writeback with Cloud Amplifier requires some additional configuration in the Snowflake account, and I’ll cover that in another article. Let’s get Domo’d!

Connect Domo Cloud Amplifier

As in the last article, this walkthrough assumes you have at least demo Snowflake and Domo accounts. The screenshots are from my demo accounts – yours may look different but the steps to implement are the same.

First, log in to your Domo account and click the Data icon in the top menu bar. Next, click the Data Warehouse icon in the left menu bar. Now, click the Domo icon in the upper right corner of the “Data Wheel”. Click the + button to add a new cloud account. Next, click the Snowflake icon and then click “Add New Integration”.

Initial Domo Setup

Configure Snowflake Connection

The screen that opens after choosing “Add New Integration” is where you’ll need all the Snowflake information we configured in the first article in this series. The required items for the connection are listed below:

  1. Integration Name: a unique integration name in Domo. For this demo, I’m going to use DOMO_READER because this integration will only be used for reading from Snowflake.
  2. Snowflake Connection URL: the URL for your Snowflake account, which looks like XXXXXXXX.snowflakecomputing.com, where the X’s represent your account. Do not include https:// in the URL. You can get this information from your Snowflake account through Snowsight.
  3. Snowflake Username: the username of the account that connects to Snowflake. I’ll use the reader service account created previously: DOMO_READER_SVC_01.
  4. Snowflake Password: the password for the account above.

Additional Settings

In the connection setup screen there are two additional settings you may want to change based upon your use case.

  1. Check for data updates every: this setting controls how frequently Domo checks for data changes in Snowflake. Matching this value to the frequency of your Snowflake data loads will help control Snowflake costs, since every “check” from Domo consumes Snowflake credits to query metadata for changes. If your data loads only once a day, there’s no need for Domo to check every 15 minutes.
  2. Cache TTL: this sets how long the data remains cached in Domo.

After completing the fields as shown, click the Next button to continue with setup.

Snowflake Cloud Amplifier Connection

Configuring Read Operations

Once the connection is configured, Domo will show the list of compute warehouses the account has access to. In the operations selection box, check the QUERY option. This option allows our connection to read from Snowflake tables and views. After selecting QUERY, click Next.

Choosing Operation for Snowflake Warehouse

Choosing Tables to Connect

This demo only establishes the READ portion of Cloud Amplifier; we’ll configure WRITE in another demo. Next, we’ll click “Choose Tables to Connect”. Domo will connect to Snowflake and populate the list of databases, schemas, and tables/views our account has access to. Choose the tables/views you want to connect and then click “Create Datasets”. Domo will then create a Dataset for each table/view you chose.
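
If a table or view you expect is missing from the list, it’s usually a grants issue. As a quick check in Snowflake (using the roles and schema names from the first article), inspect what the reader role can see, and grant SELECT on existing objects if needed; FUTURE grants only cover objects created after the grant:

SHOW GRANTS TO ROLE DOMO_READER_ROLE;

/*FUTURE GRANTS ONLY COVER OBJECTS CREATED AFTER THE GRANT */
GRANT SELECT ON ALL TABLES IN SCHEMA DOMO_DB.DOMO_READ_SCH TO ROLE DOMO_READER_ROLE;
GRANT SELECT ON ALL VIEWS IN SCHEMA DOMO_DB.DOMO_READ_SCH TO ROLE DOMO_READER_ROLE;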

Connecting Snowflake Tables to Domo

Monitoring Domo Connection in Snowflake

So, we’ve connected Domo to Snowflake. Now what? As you use the datasets for cards, Magic ETL, etc. in Domo, queries are sent back to Snowflake to load and transform the data. For example, this simple Magic ETL in Domo generated 21 SQL commands in Snowflake when I ran the preview.

Domo Magic ETL
Sample of Queries Generated by Domo Magic ETL
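
To see exactly what Domo ran, you can pull the recent statements for the reader service account from QUERY_HISTORY. This sketch assumes the DOMO_READER_SVC_01 account from the first article and requires access to the ACCOUNT_USAGE share:

SELECT START_TIME, QUERY_TYPE, TOTAL_ELAPSED_TIME, QUERY_TEXT
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE USER_NAME = 'DOMO_READER_SVC_01'
ORDER BY START_TIME DESC
LIMIT 25;

Note that ACCOUNT_USAGE views can lag by up to 45 minutes; for immediate results, the QUERY_HISTORY table function in INFORMATION_SCHEMA works as well.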

The Magic ETL shown generates a Domo dataset. In a later demo, I’ll show how to write this back to Snowflake using Magic ETL.

Wrapping Up

In this article, we walked through setting up Domo for read access to Snowflake using Cloud Amplifier. We started by logging into Domo, navigating to the Data Warehouse interface, and creating a new integration with Snowflake by configuring essential settings like the connection URL, username, and password. We also covered optimizing performance by adjusting data update frequency and cache TTL.

After establishing the connection, we configured read operations by selecting the QUERY option and choosing specific tables to create datasets within Domo. This setup allows for dynamic data visualization and analysis. Stay tuned for the next article, where we’ll explore configuring writeback capabilities to Snowflake.

Follow me on LinkedIn and Medium for more content on Data Management and demos including Snowflake, Streamlit, and SQL Server.

Integrating Domo with Snowflake: Role Setup and Environment Preparation

Domo? That’s new to this blog – well, yes and no. If you look at my Credentials page, you’ll see that I have also earned some Domo Certifications. But what is Domo?

Domo revolutionizes business intelligence by transforming data into actionable insights, seamlessly integrating with existing systems to deliver real-time analytics. Oh great! Yet another BI platform. Well, yes. But there’s more to Domo than just BI. I won’t go into all the cool things Domo does to differentiate itself from other BI platforms here; instead, I’m going to focus on one of their newer features: Domo Cloud Amplifier for Snowflake. This article focuses on setting up most of the required environment in Snowflake to support both read and write operations between Snowflake and Domo. Most? Yes, there are some aspects of the write connection that are Domo account-specific; I’ll cover those in another article.

All Snowflake and Domo integrations shown in this series are done using free demo accounts from both products.

Snowflake Roles

First, we’ll create three Domo-specific roles in Snowflake. Doing so now helps separate the read and write processes in our integration. The script below creates the following roles:

  1. DOMO_READER_ROLE: Primary read role from Domo.
  2. DOMO_WRITER_ROLE: Allows read/write capabilities from Domo.
  3. DOMO_ADMIN_ROLE: Parent to the read and write roles.

USE ROLE SECURITYADMIN;

/*DOMO READ ROLE */
CREATE OR REPLACE ROLE DOMO_READER_ROLE;
/*DOMO WRITE ROLE */
CREATE OR REPLACE ROLE DOMO_WRITER_ROLE;
/*DOMO ADMIN ROLE */
CREATE OR REPLACE ROLE DOMO_ADMIN_ROLE;
/*CREATE HIERARCHY */
GRANT ROLE DOMO_READER_ROLE TO ROLE DOMO_ADMIN_ROLE;
GRANT ROLE DOMO_WRITER_ROLE TO ROLE DOMO_ADMIN_ROLE;
GRANT ROLE DOMO_ADMIN_ROLE TO ROLE SYSADMIN;

/*ASSUME CURRENT USER WILL BE DOMO ADMIN */
SET CURR_USER = CURRENT_USER();
GRANT ROLE DOMO_ADMIN_ROLE TO USER IDENTIFIER($CURR_USER);
UNSET CURR_USER;
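
A quick way to verify the hierarchy is to list the grants on the admin role; DOMO_READER_ROLE and DOMO_WRITER_ROLE should both appear in the output:

SHOW GRANTS TO ROLE DOMO_ADMIN_ROLE;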

Snowflake Compute Warehouses

For Domo to read and/or write data while connected to Snowflake, we’ll need compute warehouses in Snowflake to provide the computing power for our queries. The script below creates two extra small warehouses – one for read and one for write – and assigns grants to the appropriate roles created above.

USE ROLE ACCOUNTADMIN;
/*READ WAREHOUSE */
CREATE OR REPLACE WAREHOUSE DOMO_READ_XS_WH
    WAREHOUSE_SIZE= XSMALL
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 1
    AUTO_SUSPEND = 60
    AUTO_RESUME = TRUE;

/*GRANT USAGE TO READER ROLE*/
GRANT USAGE ON WAREHOUSE DOMO_READ_XS_WH TO ROLE DOMO_READER_ROLE;

/*WRITE WAREHOUSE */
CREATE OR REPLACE WAREHOUSE DOMO_WRITE_XS_WH
    WAREHOUSE_SIZE= XSMALL
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 1
    AUTO_SUSPEND = 60
    AUTO_RESUME = TRUE;

/*GRANT USAGE TO WRITER ROLE*/
GRANT USAGE ON WAREHOUSE DOMO_WRITE_XS_WH TO ROLE DOMO_WRITER_ROLE;

Service Accounts for Domo

For simplicity in this demo, we’ll establish two new users as service accounts for our connection to Domo. My demo accounts are not connected to any third-party user authentication tools like Azure AD, so using service accounts keeps things simple for this demo. We’ll add each account to the appropriate Snowflake role as well.

/*DOMO READ SERVICE ACCOUNT */
USE ROLE SECURITYADMIN;
CREATE USER DOMO_READER_SVC_01
    PASSWORD = 'STRONG PASSWORD'
    DISPLAY_NAME = 'DOMO READER SERVICE ACCOUNT'
    FIRST_NAME = 'DOMO READER'
    LAST_NAME = 'SERVICE ACCOUNT'
    DEFAULT_ROLE = 'DOMO_READER_ROLE'
    DEFAULT_WAREHOUSE = 'DOMO_READ_XS_WH';
GRANT ROLE DOMO_READER_ROLE TO USER DOMO_READER_SVC_01;

/*DOMO READ WRITE ACCOUNT */
USE ROLE SECURITYADMIN;
CREATE USER DOMO_WRITER_SVC_01
    PASSWORD = 'STRONG PASSWORD'
    DISPLAY_NAME = 'DOMO WRITER SERVICE ACCOUNT'
    FIRST_NAME = 'DOMO WRITER'
    LAST_NAME = 'SERVICE ACCOUNT'
    DEFAULT_ROLE = 'DOMO_WRITER_ROLE'
    DEFAULT_WAREHOUSE = 'DOMO_WRITE_XS_WH';
GRANT ROLE DOMO_WRITER_ROLE TO USER DOMO_WRITER_SVC_01;

Domo DB

Finally, create a new database and schemas for read and write in Snowflake as well as apply proper grants to the roles for each.

USE ROLE ACCOUNTADMIN;
/*CREATE DB */
CREATE OR REPLACE DATABASE DOMO_DB;
/*MAKE DOMO ADMIN OWNER*/
GRANT OWNERSHIP ON DATABASE DOMO_DB TO ROLE DOMO_ADMIN_ROLE;
GRANT ALL ON DATABASE DOMO_DB TO ROLE DOMO_ADMIN_ROLE;
GRANT ALL ON FUTURE SCHEMAS IN DATABASE DOMO_DB TO ROLE DOMO_ADMIN_ROLE;

/*CREATE READ SCHEMA */
CREATE OR REPLACE SCHEMA DOMO_READ_SCH;
/*APPLY GRANTS TO READ ROLE */
GRANT USAGE ON DATABASE DOMO_DB TO ROLE DOMO_READER_ROLE;
GRANT USAGE ON SCHEMA DOMO_DB.DOMO_READ_SCH TO ROLE DOMO_READER_ROLE;
GRANT SELECT ON FUTURE TABLES IN SCHEMA DOMO_DB.DOMO_READ_SCH TO ROLE DOMO_READER_ROLE;
GRANT SELECT ON FUTURE VIEWS IN SCHEMA DOMO_DB.DOMO_READ_SCH TO ROLE DOMO_READER_ROLE;

/*CREATE WRITE SCHEMAS */
CREATE OR REPLACE SCHEMA DOMO_UTIL;
CREATE OR REPLACE SCHEMA S3_STAGES;
GRANT ALL ON DATABASE DOMO_DB TO ROLE DOMO_WRITER_ROLE;
GRANT ALL ON SCHEMA DOMO_DB.DOMO_UTIL TO ROLE DOMO_WRITER_ROLE;
GRANT ALL ON SCHEMA DOMO_DB.S3_STAGES TO ROLE DOMO_WRITER_ROLE;

Load Sample Data

Snowflake provides sample data in trial accounts through a share, and we could connect Domo to that sample database directly; however, in a real-world application, you’d likely have your data in a Domo-specific location like the one we configured above. The script below copies the sample tables and data from the TPCH_SF1 schema into our DOMO_READ_SCH schema to simulate what a production Domo/Snowflake environment might look like.

USE ROLE DOMO_ADMIN_ROLE;
USE DATABASE DOMO_DB;
USE SCHEMA DOMO_READ_SCH;
USE WAREHOUSE DOMO_WRITE_XS_WH;

SET TARGET_DB = 'DOMO_DB';
SET TARGET_SCH = 'DOMO_READ_SCH';

DECLARE
    DYN_SQL STRING;
BEGIN
    LET rsTables RESULTSET :=(
        SELECT
            CONCAT_WS('.',TABLE_CATALOG,TABLE_SCHEMA,TABLE_NAME) AS SRC_TBL,
            CONCAT_WS('.',$TARGET_DB,$TARGET_SCH,TABLE_NAME) AS TGT_TBL
            FROM SNOWFLAKE_SAMPLE_DATA.INFORMATION_SCHEMA.TABLES
            WHERE TABLE_SCHEMA = 'TPCH_SF1'
        );
    LET cTables CURSOR for rsTables;
    FOR t IN cTables DO
        DYN_SQL := 'CREATE OR REPLACE TABLE '|| t.TGT_TBL || ' AS SELECT * FROM ' || t.SRC_TBL;
        EXECUTE IMMEDIATE(:DYN_SQL);
    END FOR;
END;
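
To confirm the copy worked, check the target schema’s metadata. The TPCH_SF1 schema contains eight tables (CUSTOMER, LINEITEM, NATION, ORDERS, PART, PARTSUPP, REGION, SUPPLIER), so you should see the same eight here with non-zero row counts:

SELECT TABLE_NAME, ROW_COUNT
FROM DOMO_DB.INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'DOMO_READ_SCH'
ORDER BY TABLE_NAME;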

Wrapping Up and What’s Next

And there you have it—a solid foundation for integrating Domo with Snowflake by establishing critical roles and preparing the Snowflake environment.

In the next article, we’ll dive into connecting Domo to Snowflake and leveraging sample data to illustrate how the integration works in practice. We’ll walk through the steps to configure the connection, ensuring seamless data flow and real-time analytics. Stay tuned to see how Domo’s Cloud Amplifier for Snowflake can transform your data insights and drive better decision-making.

Remember, this series uses free demo accounts from both Snowflake and Domo, making it accessible for you to follow along and experiment on your own. Keep an eye out for the next post where we’ll bring it all together and showcase the real power of combining Domo and Snowflake.

The full set of SQL scripts is available on my GitHub Repo.

Follow me on LinkedIn and Medium for more content on Data Management and demos including Snowflake, Streamlit, and SQL Server.