Bulk Creating Fleets with the Shipyard API - Part 2 with Different Vendors

Steven Johnson
Last month, we released a post explaining how to bulk create Fleets using the Shipyard API. However, that post used the same vendors for each of the three Fleets it created. We know that not every workflow in your business uses the same vendors over and over again.

In this post, we will modify the script to handle a change of vendors and then create the Fleets programmatically with the Shipyard API. We've kept the first two vendors, Fivetran and dbt, the same in each Fleet. However, we know that some organizations have multiple BI tools, so we will vary the BI tool included in each Fleet. Let's dive into the details of building this out!

1. Building an Initial Fleet in Shipyard

As mentioned above, we are going to continue with a three-Vessel Fleet that has Fivetran and dbt as the first two Vessels, with the third Vessel being a variable BI tool. Instead of only including Tableau as the BI tool, we are going to offer three options for the last Vessel: Tableau, Domo, or Mode.

The inputs for our first two Vessels are filled in with dummy data that can be changed easily later on.

2. Grabbing our Fleet's YAML

Now that we have our initial flow created, we need to take the YAML from Shipyard into our coding environment. To do this, we click the YAML editor button, copy the code into a text editor, and save it as tutorial_new_vendor.yaml. You will also need to delete the id fields for the Fleet and each Vessel so that Shipyard generates new ids for the Fleets created with the API call later. Here is the YAML if you want to follow along:

name: Fleet for Tutorial - New Vendors
vessels:
    Execute Fivetran Sync:
        source:
            blueprint: Fivetran - Execute Sync
            inputs:
                FIVETRAN_API_KEY: YOUR_FIVETRAN_API_KEY
                FIVETRAN_API_SECRET: YOUR_API_SECRET
                FIVETRAN_CONNECTOR_ID: YOUR_CONNECTOR_ID
            type: BLUEPRINT
        guardrails:
            retry_count: 1
            retry_wait: 0s
            runtime_cutoff: 4h0m0s
            exclude_exit_code_ranges:
        notifications:
            emails: []
            after_error: true
            after_on_demand: false
    Execute dbt Cloud Job:
        source:
            blueprint: dbt Cloud - Execute Job
            inputs:
                DBT_ACCOUNT_ID: YOUR_ACCOUNT_ID
                DBT_API_KEY: YOUR_DBT_API_KEY
                DBT_JOB_ID: YOUR_JOB_ID
            type: BLUEPRINT
        guardrails:
            retry_count: 1
            retry_wait: 0s
            runtime_cutoff: 4h0m0s
            exclude_exit_code_ranges:
                - "200"
                - "201"
                - "211"
                - "212"
        notifications:
            emails: []
            after_error: true
            after_on_demand: false
connections:
    Execute Fivetran Sync:
        Execute dbt Cloud Job: SUCCESS
notifications:
    emails: []
    after_error: true
    after_on_demand: false
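If you'd rather strip the id fields programmatically instead of deleting them in the editor, a recursive helper like this sketch works (shown against a miniature stand-in for the full Fleet YAML above):

```python
import yaml

def strip_ids(node):
    """Recursively drop 'id' keys so Shipyard generates fresh ones."""
    if isinstance(node, dict):
        node.pop('id', None)
        for value in node.values():
            strip_ids(value)
    elif isinstance(node, list):
        for item in node:
            strip_ids(item)
    return node

# Miniature stand-in for the contents of tutorial_new_vendor.yaml
raw = """
id: fleet-123
name: Fleet for Tutorial - New Vendors
vessels:
    Execute Fivetran Sync:
        id: vessel-456
        source:
            type: BLUEPRINT
"""
fleet = strip_ids(yaml.safe_load(raw))
```

In the real workflow you would `yaml.safe_load` the copied file instead of the inline string.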

3. Grabbing YAML for Each BI Tool

Now that we have the base YAML that we will start with for each company, we need to get the YAML for each vendor Blueprint that we will add in later. We will fill each required field with dummy values to be replaced later as well. The snippets you see below have been converted from YAML into Python dictionaries and include the connection definition needed to hook the new Vessel up downstream of the dbt Cloud Vessel.
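If you'd rather not hand-translate the YAML into dictionaries, `yaml.safe_load` does the conversion for you. This sketch uses an abbreviated copy of the Tableau Blueprint's YAML as a string:

```python
import yaml

# Abbreviated copy of a vendor Blueprint's YAML, pasted as a string
vendor_yaml = """
Trigger Tableau Datasource Refresh:
    source:
        blueprint: Tableau - Trigger Datasource Refresh
        type: BLUEPRINT
"""
vendor_dict = yaml.safe_load(vendor_yaml)
```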

Tableau

tableau = {
    'Trigger Tableau Datasource Refresh': {
        'source': {
            'blueprint': 'Tableau - Trigger Datasource Refresh',
            'inputs': {
                'TABLEAU_DATASOURCE_NAME': 'YOUR_DS_NAME',
                'TABLEAU_PASSWORD': 'YOUR_PASSWORD',
                'TABLEAU_PROJECT_NAME': 'YOUR_PROJECT_NAME',
                'TABLEAU_SERVER_URL': 'YOUR_SERVER_URL',
                'TABLEAU_SIGN_IN_METHOD': 'username_password',
                'TABLEAU_SITE_ID': 'YOUR_SITE_ID',
                'TABLEAU_USERNAME': 'YOUR_USERNAME',
            },
            'type': 'BLUEPRINT',
        },
        'guardrails': {
            'retry_count': 1,
            'retry_wait': '0s',
            'runtime_cutoff': '1h0m0s',
            'exclude_exit_code_ranges': ['200-205'],
        },
        'notifications': {
            'emails': [],
            'after_error': True,
            'after_on_demand': False,
        },
    }
}
tableau_connection = {'Execute dbt Cloud Job': {'Trigger Tableau Datasource Refresh': 'SUCCESS'}}

Mode

mode = {
    'Trigger Mode Report Refresh': {
        'source': {
            'type': 'BLUEPRINT',
            'blueprint': 'Mode - Trigger Report Refresh',
            'inputs': {
                'MODE_TOKEN_ID': 'YOUR_MODE_TOKEN_ID',
                'MODE_TOKEN_PASSWORD': 'YOUR_MODE_TOKEN_PASSWORD',
                'MODE_WORKSPACE_NAME': 'YOUR_MODE_WORKSPACE_NAME',
                'MODE_REPORT_ID': 'YOUR_MODE_REPORT_ID',
            },
        },
        'guardrails': {
            'retry_count': 0,
            'retry_wait': '0h0m0s',
            'runtime_cutoff': '1h0m0s',
            'exclude_exit_code_ranges': ['200', '203', '204'],
        },
        'notifications': {
            'emails': ['your_email@email_service.com'],
            'after_error': True,
            'after_on_demand': False,
        },
    }
}
mode_connection = {'Execute dbt Cloud Job': {'Trigger Mode Report Refresh': 'SUCCESS'}}

Domo

domo = {
    'Refresh Domo Dataset': {
        'source': {
            'type': 'BLUEPRINT',
            'blueprint': 'Domo - Refresh Dataset',
            'inputs': {
                'DOMO_ACCESS_TOKEN': 'YOUR_DOMO_ACCESS_TOKEN',
                'DOMO_CLIENT_ID': 'YOUR_DOMO_CLIENT_ID',
                'DOMO_SECRET_KEY': 'YOUR_DOMO_SECRET_KEY',
                'DOMO_INSTANCE': 'YOUR_DOMO_INSTANCE',
                'DOMO_DATASET_ID': 'YOUR_DOMO_DATASET_ID',
            },
        },
        'guardrails': {
            'retry_count': 0,
            'retry_wait': '0h0m0s',
            'runtime_cutoff': '1h0m0s',
            'exclude_exit_code_ranges': ['200', '201', '203', '204', '205', '210', '211', '207'],
        },
        'notifications': {
            'emails': ['your_email@email_service.com'],
            'after_error': True,
            'after_on_demand': False,
        },
    }
}
domo_connection = {'Execute dbt Cloud Job': {'Refresh Domo Dataset': 'SUCCESS'}}
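Each of these vendor dictionaries gets merged into the base Fleet the same way, so a small helper can do the merging. This is a sketch; `fleet`, `mode_vessel`, and `mode_link` are abbreviated stand-ins for the full dictionaries above:

```python
def add_bi_tool(data, vessel, connection):
    """Merge a BI-tool Vessel and its connection into a Fleet dict."""
    data['vessels'].update(vessel)
    data['connections'].update(connection)
    return data

# Abbreviated stand-ins for the full Fleet and Mode dictionaries
fleet = {'vessels': {'Execute dbt Cloud Job': {}}, 'connections': {}}
mode_vessel = {'Trigger Mode Report Refresh': {'source': {'type': 'BLUEPRINT'}}}
mode_link = {'Execute dbt Cloud Job': {'Trigger Mode Report Refresh': 'SUCCESS'}}

add_bi_tool(fleet, mode_vessel, mode_link)
```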

4. Defining Companies and Their Credentials

At this point, I needed to gather each company's authentication details and compile them into a dictionary the script can access. You can find the dictionaries containing fictional information about these businesses below. While they didn't mind me sharing their credentials publicly, please refrain from using them for any services out of respect for their privacy. In the dictionaries, you can see that the BI tool is listed for each business; this is what we will use later to add that vendor to their Fleet.

clients = {
    "pizza_planet": {
        "COMPANY_NAME": "pizza_planet",
        "BI_TOOL": "tableau",
        "FIVETRAN_API_KEY": "pizza",
        "FIVETRAN_API_SECRET": "planet",
        "FIVETRAN_CONNECTOR_ID": "pizza_sync",
        "DBT_ACCOUNT_ID": "buzz",
        "DBT_API_KEY": "lightyear",
        "DBT_JOB_ID": "toinfinity",
        "TABLEAU_DATASOURCE_NAME": "forky",
        "TABLEAU_PASSWORD": "isnottrash",
        "TABLEAU_PROJECT_NAME": "toys_data",
        "TABLEAU_SERVER_URL": "https://toys.online.tableau.com/",
        "TABLEAU_SITE_ID": "toydevelopment",
        "TABLEAU_USERNAME": "andy"
    },
    "gusteau": {
        "COMPANY_NAME": "gusteau",
        "BI_TOOL": "mode",
        "FIVETRAN_API_KEY": "remy",
        "FIVETRAN_API_SECRET": "skinner",
        "FIVETRAN_CONNECTOR_ID": "emilie",
        "DBT_ACCOUNT_ID": "zucchini",
        "DBT_API_KEY": "pepper",
        "DBT_JOB_ID": "onion",
        "MODE_TOKEN_ID": "crepe",
        "MODE_TOKEN_PASSWORD": "creme_brulee",
        "MODE_WORKSPACE_NAME": "wine",
        "MODE_REPORT_ID": "https://gusteau.online.tableau.com/"
    },
    "buy_n_large": {
        "COMPANY_NAME": "buy_n_large",
        "BI_TOOL": "domo",
        "FIVETRAN_API_KEY": "wall_e",
        "FIVETRAN_API_SECRET": "eve",
        "FIVETRAN_CONNECTOR_ID": "plant",
        "DBT_ACCOUNT_ID": "lightbulb",
        "DBT_API_KEY": "trash",
        "DBT_JOB_ID": "recycling",
        "DOMO_ACCESS_TOKEN": "clean_up",
        "DOMO_CLIENT_ID": "earth",
        "DOMO_SECRET_KEY": "space",
        "DOMO_INSTANCE": "https://buy_n_large.online.tableau.com/",
        "DOMO_DATASET_ID": "ship"
    }
}

Based on the BI tool specified in each company's dictionary and the starting YAML, each company will end up with a three-Vessel Fleet, shown below:

Pizza Planet
Gusteau
Buy N Large

5. Creating and Running a Script to Generate Fleets

The Python script needed contains two sections. The first section is substituting the credentials from each business into the YAML that we created earlier in Shipyard and inserting the BI tool into the Fleet YAML. The second section sends a request to the Shipyard API to create the Fleets. If you would like to take a look at the script in one piece, check it out on GitHub. Let's take a look at each section:

YAML Generation

import yaml
import requests


company_list = ['pizza_planet', 'gusteau', 'buy_n_large']

for company in company_list:
    # Start from a fresh copy of the base Fleet for each company
    with open('tutorial_new_vendor.yaml', 'r') as f:
        data = yaml.safe_load(f)

    fivetran_inputs = data['vessels']['Execute Fivetran Sync']['source']['inputs']
    dbt_inputs = data['vessels']['Execute dbt Cloud Job']['source']['inputs']

    data['name'] = f'{clients[company]["COMPANY_NAME"]} Fleet'
    fivetran_inputs['FIVETRAN_API_KEY'] = clients[company]['FIVETRAN_API_KEY']
    fivetran_inputs['FIVETRAN_API_SECRET'] = clients[company]['FIVETRAN_API_SECRET']
    fivetran_inputs['FIVETRAN_CONNECTOR_ID'] = clients[company]['FIVETRAN_CONNECTOR_ID']
    dbt_inputs['DBT_ACCOUNT_ID'] = clients[company]['DBT_ACCOUNT_ID']
    dbt_inputs['DBT_API_KEY'] = clients[company]['DBT_API_KEY']
    dbt_inputs['DBT_JOB_ID'] = clients[company]['DBT_JOB_ID']

    bi_tool = clients[company]['BI_TOOL']
    if bi_tool == 'tableau':
        data['vessels'].update(tableau)
        data['connections'].update(tableau_connection)
        tableau_inputs = data['vessels']['Trigger Tableau Datasource Refresh']['source']['inputs']
        tableau_inputs['TABLEAU_DATASOURCE_NAME'] = clients[company]['TABLEAU_DATASOURCE_NAME']
        tableau_inputs['TABLEAU_PASSWORD'] = clients[company]['TABLEAU_PASSWORD']
        tableau_inputs['TABLEAU_PROJECT_NAME'] = clients[company]['TABLEAU_PROJECT_NAME']
        tableau_inputs['TABLEAU_SERVER_URL'] = clients[company]['TABLEAU_SERVER_URL']
        tableau_inputs['TABLEAU_SITE_ID'] = clients[company]['TABLEAU_SITE_ID']
        tableau_inputs['TABLEAU_USERNAME'] = clients[company]['TABLEAU_USERNAME']
    elif bi_tool == 'domo':
        data['vessels'].update(domo)
        data['connections'].update(domo_connection)
        domo_inputs = data['vessels']['Refresh Domo Dataset']['source']['inputs']
        domo_inputs['DOMO_ACCESS_TOKEN'] = clients[company]['DOMO_ACCESS_TOKEN']
        domo_inputs['DOMO_CLIENT_ID'] = clients[company]['DOMO_CLIENT_ID']
        domo_inputs['DOMO_SECRET_KEY'] = clients[company]['DOMO_SECRET_KEY']
        domo_inputs['DOMO_INSTANCE'] = clients[company]['DOMO_INSTANCE']
        domo_inputs['DOMO_DATASET_ID'] = clients[company]['DOMO_DATASET_ID']
    elif bi_tool == 'mode':
        data['vessels'].update(mode)
        data['connections'].update(mode_connection)
        mode_inputs = data['vessels']['Trigger Mode Report Refresh']['source']['inputs']
        mode_inputs['MODE_TOKEN_ID'] = clients[company]['MODE_TOKEN_ID']
        mode_inputs['MODE_TOKEN_PASSWORD'] = clients[company]['MODE_TOKEN_PASSWORD']
        mode_inputs['MODE_WORKSPACE_NAME'] = clients[company]['MODE_WORKSPACE_NAME']
        mode_inputs['MODE_REPORT_ID'] = clients[company]['MODE_REPORT_ID']

    with open(f'{clients[company]["COMPANY_NAME"]}_fleet.yaml', 'w') as f:
        yaml.dump(data, f, sort_keys=False, default_flow_style=False)

You can see in the script that we loop through each company to swap its credentials and job information into the YAML. The script also looks up the company's BI tool in its dictionary and adds that Vessel to the Fleet YAML. Once those values are in place, we dump the result into a new YAML file called {company}_fleet.yaml.
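Because every Blueprint input starts out as a YOUR_* placeholder, a quick check like this hypothetical helper (not part of the original script) can catch any credential the loop failed to substitute before the file is sent to the API:

```python
def unfilled_placeholders(yaml_text):
    """Return lines that still contain a dummy 'YOUR_*' value."""
    return [line.strip() for line in yaml_text.splitlines() if 'YOUR_' in line]

# Abbreviated example of a generated Fleet file's contents
sample = """name: pizza_planet Fleet
FIVETRAN_API_KEY: pizza
TABLEAU_PASSWORD: YOUR_PASSWORD"""
leftovers = unfilled_placeholders(sample)
```

In the loop, you could read each `{company}_fleet.yaml` back and assert that this returns an empty list.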

Fleet Creation

shipyard_api_key = 'YOUR_API_KEY_HERE'
org_id = 'YOUR_ORG_ID_HERE'
project_id = 'YOUR_PROJECT_ID_HERE'


headers = {
    'X-Shipyard-API-Key': f'{shipyard_api_key}',
    'Content-Type': 'application/x-www-form-urlencoded',
}

with open(f'{clients[company]["COMPANY_NAME"]}_fleet.yaml', 'rb') as f:
    api_data = f.read()

response = requests.put(
    f'https://api.app.shipyardapp.com/orgs/{org_id}/projects/{project_id}/fleets',
    headers=headers,
    data=api_data,
)

The second part runs alongside the first in the same loop. Here, we use the Python requests package to send the YAML we just generated to the Shipyard API, which then creates the Fleets. Once the script finishes, we can check Shipyard to see the newly created Fleets.
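The snippet above doesn't inspect the API's reply. When adapting it, a small guard like this hypothetical helper (consult the Shipyard API docs for the exact status codes and response bodies) makes a failed Fleet creation obvious instead of silent:

```python
def check_fleet_response(status_code, body):
    """Raise with context if a Fleet creation request did not succeed."""
    if not 200 <= status_code < 300:
        raise RuntimeError(f'Fleet creation failed ({status_code}): {body}')
    return True

# Inside the loop you might call:
# check_fleet_response(response.status_code, response.text)
```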

Next Steps

With both parts of this series now available, the possibilities for what you can create with the Shipyard API are endless. For example, you can:

  • Run a script when a new customer comes on board to create a Fleet for them.
  • Bulk create repetitive Fleets for your organization.
  • Bulk edit Fleets when your organization decides to switch vendors.

We would love to see what you are building with Shipyard. If you've been able to get rid of a monotonous task using the Shipyard API, let us know on LinkedIn.