
Code Sharing

All Python modules inside srcDir can be imported just as they would be if the plugin were executing locally. When a plugin operation is executed, srcDir is the current working directory, so all imports must be relative to srcDir regardless of the path of the module doing the import.

Please refer to Python's documentation on modules to learn more about modules and imports.


Assume we have the following file structure:

├── plugin_config.yml
├── schema.json
└── src
    ├── operations
    │   ├── __init__.py
    │   └── discovery.py
    ├── resources
    │   ├── __init__.py
    │   ├──
    │   ├──
    │   └── list_schemas.sql
    └── utils
        ├── __init__.py
        └── execution_util.py
Any module in the plugin could import with from utils import execution_util.
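As a sketch of how those srcDir-relative imports resolve, the layout above can be rebuilt in a temporary directory and put on sys.path, which stands in for the platform making srcDir the working directory. All file contents here are hypothetical stand-ins:

```python
import os
import sys
import tempfile

# Hypothetical on-disk layout mirroring the structure above, built in a
# temporary directory so the import behavior can be observed locally.
src = tempfile.mkdtemp()
for pkg in ("operations", "utils"):
    os.makedirs(os.path.join(src, pkg))
    open(os.path.join(src, pkg, "__init__.py"), "w").close()

with open(os.path.join(src, "utils", "execution_util.py"), "w") as f:
    f.write("GREETING = 'hello'\n")

# operations/discovery.py imports the helper relative to srcDir,
# not relative to its own package:
with open(os.path.join(src, "operations", "discovery.py"), "w") as f:
    f.write("from utils import execution_util\n")

sys.path.insert(0, src)  # stands in for the platform using srcDir as cwd
from operations import discovery

print(discovery.execution_util.GREETING)  # hello
```

If discovery.py instead tried a package-relative import such as from ..utils import execution_util, it would fail, because imports are resolved from srcDir.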


Since the platform uses Python 2.7, every directory needs to have an __init__.py file in it; otherwise the modules and resources in the folder will not be found at runtime. For more information on __init__.py files refer to Python's documentation on packages.

Note that the srcDir in the plugin config file (src in this example) does not need an __init__.py file.

Assume schema.json contains:

{
    "repositoryDefinition": {
        "type": "object",
        "properties": {
            "name": { "type": "string" }
        },
        "nameField": "name",
        "identityFields": ["name"]
    },
    "sourceConfigDefinition": {
        "type": "object",
        "required": ["name"],
        "additionalProperties": false,
        "properties": {
            "name": { "type": "string" }
        },
        "nameField": "name",
        "identityFields": ["name"]
    }
}
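The sourceConfigDefinition requires name and, via additionalProperties: false, rejects unknown keys. The platform enforces this itself; as a rough local illustration of what those two keywords imply, here is a hand-rolled stand-in for that validation (not the SDK's validator):

```python
import json

# The sourceConfigDefinition fragment from schema.json above.
SCHEMA = json.loads("""
{
    "sourceConfigDefinition": {
        "type": "object",
        "required": ["name"],
        "additionalProperties": false,
        "properties": { "name": { "type": "string" } }
    }
}
""")


def check(config, defn):
    # Hand-rolled stand-in covering only 'required' and
    # 'additionalProperties'; the platform's real validation is richer.
    missing = [k for k in defn.get("required", []) if k not in config]
    if defn.get("additionalProperties", True):
        extra = []
    else:
        extra = [k for k in config if k not in defn["properties"]]
    return not missing and not extra


defn = SCHEMA["sourceConfigDefinition"]
print(check({"name": "public"}, defn))          # True
print(check({"name": "public", "x": 1}, defn))  # False: unknown key 'x'
print(check({}, defn))                          # False: 'name' missing
```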

To keep the code cleaner, this plugin does two things:

  1. Splits discovery logic into its own module, discovery.py.
  2. Uses two helper functions, execute_sql and execute_shell, in utils/execution_util.py to abstract all remote execution.

When the platform needs to execute a plugin operation, it always calls into the function decorated by the entryPoint object. The rest of the control flow is determined by the plugin. In order to split logic, the decorated function must delegate into the appropriate module. Below is an example of delegating into discovery.py to handle repository and source config discovery:

from operations import discovery

from dlpx.virtualization.platform import Plugin

plugin = Plugin()

@plugin.discovery.repository()
def repository_discovery(source_connection):
    return discovery.find_installs(source_connection)


@plugin.discovery.source_config()
def source_config_discovery(source_connection, repository):
    return discovery.find_schemas(source_connection, repository)

Note that discovery.py is in the operations package, so it is imported with from operations import discovery.

In discovery.py the plugin delegates even further to split business logic away from remote execution. utils/execution_util.py deals with remote execution and error handling so discovery.py can focus on business logic. Note that discovery.py still needs to know the format of the return value from each script.

from dlpx.virtualization import libs

from generated.definitions import RepositoryDefinition, SourceConfigDefinition
from utils import execution_util

def find_installs(source_connection):
    installs = execution_util.execute_shell(source_connection, '')

    # Assume 'installs' is a comma separated list of the names of Postgres installations.
    install_names = installs.split(',')
    return [RepositoryDefinition(name=name) for name in install_names]

def find_schemas(source_connection, repository):
    schemas = execution_util.execute_sql(source_connection, repository.name, 'list_schemas.sql')

    # Assume 'schemas' is a comma separated list of the schema names.
    schema_names = schemas.split(',')
    return [SourceConfigDefinition(name=name) for name in schema_names]
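
One caveat with parsing like the above: shell output typically ends with a trailing newline, so a bare split(',') leaves whitespace attached to the last name. A slightly more defensive parse, sketched with a hypothetical parse_names helper:

```python
def parse_names(output):
    # Strip surrounding whitespace/newlines from each entry, and drop
    # empty entries so trailing commas or blank output do not produce
    # phantom names.
    return [n.strip() for n in output.split(',') if n.strip()]


print(parse_names("postgres-9.6, postgres-10\n"))  # ['postgres-9.6', 'postgres-10']
print(parse_names("\n"))                           # []
```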


Even though discovery.py is in the operations package, the import for execution_util is still relative to the srcDir specified in the plugin config file. execution_util is in the utils package, so it is imported with from utils import execution_util. execution_util.py has two functions, execute_sql and execute_shell. execute_sql takes the name of a SQL script in resources/ and executes it through a psql wrapper script, also in resources/. execute_shell takes the name of a shell script in resources/ and executes it.

import pkgutil

from dlpx.virtualization import libs

def execute_sql(source_connection, install_name, script_name):
    psql_script = pkgutil.get_data("resources", "")
    sql_script = pkgutil.get_data("resources", script_name)

    result = libs.run_bash(
        source_connection, psql_script, variables={"SCRIPT": sql_script}, check=True
    )
    return result.stdout

def execute_shell(source_connection, script_name):
    script = pkgutil.get_data("resources", script_name)

    result = libs.run_bash(source_connection, script, check=True)
    return result.stdout
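
Both helpers locate their scripts with pkgutil.get_data, which resolves file names relative to an importable package and returns the contents as bytes. A local sketch of that lookup, using a hypothetical temporary resources package:

```python
import os
import pkgutil
import sys
import tempfile

# Build a tiny importable 'resources' package in a temp directory
# (hypothetical layout; the SQL text is a stand-in).
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "resources"))
open(os.path.join(root, "resources", "__init__.py"), "w").close()
with open(os.path.join(root, "resources", "list_schemas.sql"), "w") as f:
    f.write("SELECT schema_name FROM information_schema.schemata;")

sys.path.insert(0, root)
# get_data imports the package, then reads the file next to it as bytes.
data = pkgutil.get_data("resources", "list_schemas.sql")
print(data.decode())
```

This is why resources/ needs an __init__.py: pkgutil.get_data only works against a package it can import.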


Both execute_sql and execute_shell pass check=True, which causes an error to be raised if the script's exit code is non-zero. For more information refer to the run_bash documentation.
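
run_bash is part of the SDK and cannot be exercised locally, but its check semantics mirror the standard library. As an analogy only (not the SDK API), subprocess.check_call raises on a non-zero exit code in the same way:

```python
import subprocess
import sys

# Locally, subprocess.check_call raises CalledProcessError on a non-zero
# exit code -- the same contract that check=True gives run_bash.
try:
    subprocess.check_call([sys.executable, "-c", "import sys; sys.exit(3)"])
    raised = False
except subprocess.CalledProcessError as err:
    raised = True
    exit_code = err.returncode

print(raised, exit_code)  # True 3
```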