Upload a database to your Obviously AI account

These endpoints allow you to upload a database to your Obviously AI account. The response object consists of two IDs:

  • process_id - used with /add-data/status to fetch the most recent status of your uploaded database.
  • dataset_id - a unique identifier string for the database uploaded to your Obviously AI account. It is used with the /predict endpoints to train a model.
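
The two IDs above might be handled as in this minimal sketch; the base URL and the query-parameter name passed to /add-data/status are assumptions, not taken from this page:

python
# Hypothetical sketch of using the two IDs from the upload response.
BASE_URL = "https://api.obviously.ai"  # assumed base URL

def parse_upload_response(response_json):
    """Pull the two identifiers out of the upload response object."""
    return response_json["process_id"], response_json["dataset_id"]

def status_url(process_id):
    """Build the /add-data/status URL used to poll upload progress.

    The query-parameter name `process_id` is an assumption.
    """
    return f"{BASE_URL}/add-data/status?process_id={process_id}"

# Example: a response object shaped like the one described above.
resp = {"process_id": "proc_123", "dataset_id": "ds_456"}
process_id, dataset_id = parse_upload_response(resp)
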

Note:

db_type can take any of the following values:

  • mysql
  • sqlserver
  • postgress
  • redshift
  • bigquery
  • snowflake

To upload a BigQuery database, use the following payload:

payload = {
    "display_name": "<display name for the uploaded dataset>",
    "db_type": "<type of the database>",
    "bigquery_credentials": {
        "project_id": "<unique id of the project>",
        "dataset_name": "<dataset name>",
        "service_account_json": "<json associated with bigquery account>"
    }
}

To upload a Snowflake database using password authentication, use the following payload:

payload = {
    "display_name": "<display name for the uploaded dataset>",
    "db_type": "<type of the database>",
    "credentials": {
        "host": "<snowflake account host>",
        "username": "<username>",
        "password": "<password>",
        "db_name": "<database name>",
        "additional_config": {
            "warehouse": "<warehouse_name>"  # Optional
        }
    }
}

To authenticate with a key pair instead of a password, use the following payload:

payload = {
    "display_name": "<display name for the uploaded dataset>",
    "db_type": "<type of the database>",
    "credentials": {
        "host": "<snowflake account host>",
        "username": "<username>",
        "db_name": "<database name>",
        "additional_config": {
            "private_key": "<private_key>",
            "private_key_passphrase": "<private_key_passphrase>",  # Optional, if applicable
            "warehouse": "<warehouse_name>"  # Optional
        }
    }
}
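
Any of the payloads above would then be sent to the upload endpoint. This sketch deliberately takes the transport as a callable so it stays testable without a live account; the real endpoint path and auth headers are not shown on this page, so consult the API reference for those:

python
import json

def upload_database(payload, send):
    """Serialize a payload and hand it to a transport callable.

    `send` is any callable that POSTs a JSON body to the upload endpoint
    and returns the parsed JSON response (this indirection is an
    illustrative assumption, not part of the API).
    """
    response = send(json.dumps(payload))
    # The response carries the two IDs described at the top of this page.
    return response["process_id"], response["dataset_id"]
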
