Connector shape

Introduction

The connector shape is used to define which connector instance should be used for sending or receiving data, and then which endpoint.

Any connector instances that have been added for your installed connectors are available to associate with a connector shape. Any endpoints configured for the underlying connector will be available for selection once you've confirmed which instance you're using.

If you need more information about the relationship between connectors and instances, please see our Connectors & instances page.

A single incoming payload for any process flow shape should not exceed 500MB.

We recommend processing multiple, smaller payloads rather than one single payload (1000 x 0.5MB payloads are more efficient than 1 x 500MB payload!).

For payloads up to 500MB, consider adding a flow control shape to batch data into multiple, smaller payloads. Payloads exceeding 500MB should be batched at source.

Accessing connector shape settings

When you add a connector shape to a process flow, the connector settings panel is displayed immediately, so you can choose which of your connector instances to use, and which endpoint.

To view/update the settings for an existing connector shape, click the associated 'cog' icon to access the settings panel - for example:

Configuring settings for a connector shape


Follow the steps below to configure a connector shape.

Step 1 Click the select a source integration field and choose the instance that you want to use - for example:

All connector instances configured for your company are available for selection. Connectors and their associated instances are added via the manage connectors page.

If you select an instance that's associated with a database connector, subsequent connector settings will vary from those detailed here - please see Configuring a database connection for more information.

Step 2 Select the endpoint that you want to use - for example:

All endpoints associated with the parent connector for this instance are available for selection.

Step 3 Depending on how your selected endpoint is configured, you may be required to provide values for one or more variables.

Step 4 Save your changes.

Step 5 Once your selected instance and endpoint settings are saved, go back to edit settings:

Now you can access any optional filter options that are available - for example:

Available filters and variables - and whether or not they are mandatory - will vary, depending on how the connector is configured.

Step 6 The request timeout setting allows you to override the default number of seconds allowed before a request to this endpoint is deemed to have failed - for example:

The default setting is taken from the underlying connector endpoint setup and should only be changed if you have a technical reason for doing so, or if you receive a timeout error when processing a particularly large payload.

Step 7 Set error-handling options (retries, backoff, allow unsuccessful statuses) as required. Available options are summarised later on this page.

Step 8 Set the payload wrapping option (raw, first, or wrapped) as appropriate for the data received from the previous step. This setting determines how the payload that gets pushed should be handled; available options are summarised later on this page.

Step 9 If required, you can set response handling options (save response as payload, save response in payload, expect an empty response). These options - and the endpoint methods each applies to - are summarised later on this page.

Step 10 Save your changes.

Valid FTP commands

Valid FTP commands for the SFTP connector are summarised in the following sections:

  • File retrieval

  • File operations

  • Information & metadata

File retrieval

get_and_move
Retrieve a file, load its content into the process flow as a payload, then move the file to another directory on the remote server. If multiple files are retrieved, multiple payloads are generated (one payload per file).

pluck
Retrieve file(s), load content into the process flow as payload(s) and then delete the file(s) from the remote server. If multiple files are retrieved, multiple payloads are generated (one payload per file).

File operations

append
Add content from target file(s) to the end of source file(s).

copy
Copy file(s) from one directory to another directory on the remote server (so the file exists in both directories).

rename
Rename file(s). This operation can be combined with moving the files.

delete
Remove file(s).

put_non_empty
Upload file(s) only if they have content (i.e. avoid uploading empty files).

Information & metadata

list
List files in a directory on the remote server. Information is loaded into the process flow as a single payload.

size
Get file sizes from a directory on the remote server. Information is loaded into the process flow as a single payload.

modified
Get the last modified date/time (UTC format) from a directory on the remote server. Information is loaded into the process flow as a single payload.

mime
Get/detect the MIME type of file(s). A boolean value is returned for each file.

exists
Check if a file exists.

Configuring a database connection

Introduction

To configure a database connection, add a connector shape to your process flow and select the required source instance and source query for an existing database connector:

Having selected a source instance, you'll know if you're working with a database connector because the subsequent field requires you to choose a source query rather than a source endpoint.

Generally, database connector settings work on the same principles as 'normal' connectors, but there are differences depending on whether you're using a query that receives or sends data.

Receive queries

When a connector shape is configured with a receive type query (i.e. you're receiving data from a database), you'll see settings sections for variables, error handling, and response handling:

These options are summarised below.

Receive options

Variables - Custom
Any query variables defined for the selected query are displayed here. Variables may be mandatory (denoted with an asterisk), in which case a value must be present at runtime, or optional.

Error handling - Retries
Sets the number of retries that will be attempted if a connection can't be made. You can define a value between 0 and 2. The default setting is 1.

Response handling - Save response in payload
Set this option to on to save the response from the completed operation IN the payload for subsequent processing.

This option provides the ability to access the response body via payload parameters. By default, the response is saved in a field named response; however, when the save response in payload option is toggled on, you can specify your preferred field name.

Response data

When a process flow runs using a receive type query, received data is returned in a payload. You can view this data in logs - for example:

The number of payloads received is determined by pagination options defined in the query setup, and whether these options are referenced in the associated query.

Send queries

When a connector shape is configured with a send type query (i.e. you're sending data to a database), you'll see settings sections for variables, database, error handling, and response handling:

These options are summarised below.

Send options

Variables - Custom
Any query variables defined for the selected query are displayed here. Variables may be mandatory (denoted with an asterisk), in which case a value must be present at runtime, or optional.

Database - Override query
You can enter your own database query here, which will be run instead of the source query already selected.

Before using an override query you should ensure that:

  • target columns exist in the database

  • target columns are configured (in the database) to accept null values.

Database - Items path
If your incoming payload contains a single array of items, enter the field name here. In doing so, the process flow loops through each item in the array and performs multiple inserts as one operation.

For more information please see About item paths.

Error handling - Retries
Sets the number of retries that will be attempted if a connection can't be made. You can define a value between 0 and 2. The default setting is 1.

About item paths

The items path field is used to run the associated query for multiple items (i.e. database rows) in a single operation, by looping through all items in a given array:

Here, specify the field name in your payload associated with the array to be targeted.

Example (simple array)

In the example below, our payload includes one products field with items in an array:

To send all these items into our database, the items path should be set to:

products

Example (nested arrays)

In the example below, our payload includes a top-level products array with nested accessories and footwear arrays:

To send all accessories items into our database, the items path should be set to:

products.0.accessories

To send all footwear items into our database, the items path should be set to:

products.1.footwear
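To make the dot-notation concrete, here is a minimal sketch (in Python, purely illustrative - not the platform's implementation) of how an items path such as products.1.footwear resolves against a payload:

```python
from functools import reduce

def resolve_items_path(payload, items_path):
    """Resolve a dot-notation items path against a payload,
    treating numeric segments as array indexes."""
    def step(node, segment):
        return node[int(segment)] if segment.isdigit() else node[segment]
    return reduce(step, items_path.split("."), payload)

payload = {
    "products": [
        {"accessories": [{"sku": "BAG-003"}, {"sku": "BAG-004"}]},
        {"footwear": [{"sku": "SHOE-001"}, {"sku": "SHOE-002"}, {"sku": "BOOT-001"}]},
    ]
}

# Each item in the resolved array would become one insert operation.
for item in resolve_items_path(payload, "products.1.footwear"):
    print(item["sku"])  # SHOE-001, SHOE-002, BOOT-001
```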

The items path field can't be used to target individual payload items. For this, you'd need an alternative approach - for example, you might use an override query to target the required item.

Example (target specific item)

In the example below, our payload includes a top-level products array with nested accessories and footwear arrays:

To insert the FIRST item in the footwear array, we'd leave the items path field blank and enter the following override query:

Response data

When a process flow runs using a send type query, the default behaviour is for the response from your database to be returned as the payload. You can view this in logs - for example:

If you want to see the data that was passed in, toggle the save response in payload option to on.

Related pages

  • Building a database connector

  • Working with queries


Error handling options

Retries

Sets the number of retries that will be attempted if a connection can't be made. You can define a value between 0 and 2. The default setting is 1.

Backoff

If you're experiencing connection issues due to rate limiting, it can be useful to increase the backoff value. This sets a delay (in seconds) before a retry is attempted.

You can define a value between 1 and 10 seconds. The default setting is 1.

Allow unsuccessful statuses

If you want the process flow to continue even if the connection response is unsuccessful, toggle this option on. The default setting is off.
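Taken together, these settings behave as sketched below (a hypothetical Python helper for illustration only - the platform applies these settings internally):

```python
import time

def send_with_retries(send, retries=1, backoff=1, allow_unsuccessful=False):
    """Attempt `send` once, plus up to `retries` retries, sleeping
    `backoff` seconds between attempts. If every attempt fails and
    allow_unsuccessful is on, return None so the flow can continue."""
    attempts = retries + 1
    for attempt in range(attempts):
        try:
            return send()
        except ConnectionError:
            if attempt == attempts - 1:  # last attempt failed
                if allow_unsuccessful:
                    return None
                raise
            time.sleep(backoff)  # wait before the next retry

# Example: a connection that fails once, then succeeds on the retry.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] == 1:
        raise ConnectionError("rate limited")
    return "ok"

print(send_with_retries(flaky, retries=1, backoff=1))  # ok
```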

Payload wrapping options

Raw

Push the payload exactly as it is pulled - no modifications are made.

First

This setting handles cases where your destination system won't process array objects, but your source system sends everything (even single records) as an array. So, [{customer_1}] is pushed as {customer_1}.

Generally, if your process flow is pulling multiple records from a source connection but later pushing just a single record into a destination connection, you should set payload wrapping to first.

When multiple records are pulled, they are written to the payload as an array. If you then strip out a single record to be pushed, that single record will - typically - still be wrapped in an array. Most systems will not accept single records as an array, so we need to 'unwrap' our customer record before it gets pushed.

Wrapped

This setting handles cases where your destination system is expecting a payload to be wrapped in an array, but your payload contains a series of 'unwrapped' objects.

The most likely scenario for this is where you have a complex process flow which is assembling a payload from different routes.

Setting payload wrapping to wrapped will wrap the entire payload as an array object. So, {customer_1},{customer_2},{customer_3} is pushed as [{customer_1},{customer_2},{customer_3}].
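The three options can be sketched as follows (illustrative Python only - not the platform's implementation):

```python
def apply_payload_wrapping(payload, mode):
    """Sketch of the raw / first / wrapped payload wrapping options."""
    if mode == "raw":
        return payload  # push exactly as pulled
    if mode == "first":
        # unwrap a single-record array: [{...}] -> {...}
        return payload[0] if isinstance(payload, list) and payload else payload
    if mode == "wrapped":
        # wrap the entire payload as an array object
        return payload if isinstance(payload, list) else [payload]
    raise ValueError(f"unknown payload wrapping mode: {mode}")

print(apply_payload_wrapping([{"customer": 1}], "first"))    # {'customer': 1}
print(apply_payload_wrapping({"customer": 1}, "wrapped"))    # [{'customer': 1}]
```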

Response handling options

Save response as payload
Set this option to on to save the response from the completed operation as a payload for subsequent processing. Applies to the POST, PUT, PATCH and DELETE endpoint methods.

Save response in payload
Set this option to on to save the response from the completed operation IN the payload for subsequent processing. Applies to the GET, POST, PUT, PATCH and DELETE endpoint methods.

This option provides the ability to access the response body via payload variables. This can be useful for cases where an API returns a successful response despite an error - by inspecting response information from the payload itself, you can determine whether or not a request is successful. By default, the response is saved in a field named response - for example:

However, when the save response in payload option is toggled on, you can specify your preferred field name - for example:

Expect an empty response
Set this option to on if you are happy for the process flow to continue if no response is received from this request. Applies to the POST and GET endpoint methods.

Sample payload for the simple array example:

{
  "products": [
    {
      "id": 4001,
      "sku": "BAG-001",
      "colour": "Blue",
      "category": "Bags",
      "quantity": 11
    },
    {
      "id": 4002,
      "sku": "BAG-002",
      "colour": "Green",
      "category": "Bags",
      "quantity": 11
    },
    {
      "id": 4003,
      "sku": "BAG-003",
      "colour": "Pink",
      "category": "Bags",
      "quantity": 12
    },
    {
      "id": 4004,
      "sku": "BAG-004",
      "colour": "Red",
      "category": "Bags",
      "quantity": 10
    }
  ]
}
Sample payload for the nested arrays example:

{
  "products": [
    {
      "accessories": [
        {
          "id": 4,
          "sku": "BAG-003",
          "colour": "Green",
          "category": "Bags",
          "quantity": 11
        },
        {
          "id": 5,
          "sku": "BAG-004",
          "colour": "Pink",
          "category": "Bags",
          "quantity": 12
        } 
      ]
    },
    {
      "footwear": [
        {
          "id": 6,
          "sku": "SHOE-001",
          "colour": "Black",
          "category": "Shoes",
          "quantity": 20
        },
        {
          "id": 7,
          "sku": "SHOE-002",
          "colour": "White",
          "category": "Shoes",
          "quantity": 15
        },
        {
          "id": 8,
          "sku": "BOOT-001",
          "colour": "Brown",
          "category": "Boots",
          "quantity": 10
        }
      ]
    }
  ]
}
Sample payload for the target-specific-item example:

{
  "products": [
    {
      "accessories": [
        {
          "id": 4,
          "sku": "BAG-003",
          "colour": "Green",
          "category": "Bags",
          "quantity": 11
        },
        {
          "id": 5,
          "sku": "BAG-004",
          "colour": "Pink",
          "category": "Bags",
          "quantity": 12
        } 
      ]
    },
    {
      "footwear": [
        {
          "id": 6,
          "sku": "SHOE-001",
          "colour": "Black",
          "category": "Shoes",
          "quantity": 20
        },
        {
          "id": 7,
          "sku": "SHOE-002",
          "colour": "White",
          "category": "Shoes",
          "quantity": 15
        },
        {
          "id": 8,
          "sku": "BOOT-001",
          "colour": "Brown",
          "category": "Boots",
          "quantity": 10
        }
      ]
    }
  ]
}
Override query to insert the first item in the footwear array:

INSERT INTO products (id, sku, colour, category, quantity) VALUES (:products.1.footwear.0.id, :products.1.footwear.0.sku, :products.1.footwear.0.colour, :products.1.footwear.0.category, :products.1.footwear.0.quantity)
Sample response saved in the payload:

{"id":123,"response":{"id":123}}

Response handling

Save response as payload

Set this option to on to save the response from the completed operation as a payload for subsequent processing.

Save response in payload

Set this option to on to save the response from the completed operation IN the payload for subsequent processing.

This option saves the response from your database, together with the payload that was sent in. By default, the response is saved in a field named response; however, when the save response in payload option is toggled on, you can specify your preferred field name.


Configuring FTP connections

Introduction

The Patchworks FTP connector is used to work with data via files on FTP servers in process flows. You might work purely in the FTP environment (for example, copying/moving files between locations), or you might sync data from FTP files into other connectors, or you might use a combination of both! For example, a process flow might be designed to:

  1. Pull files from an FTP server

  2. Use the data in those files as the payload for subsequent processing (e.g. sync to Shopify)

  3. Move files to a different FTP server location

This guide explains the basics of configuring a connection shape with an FTP connector.

About the Patchworks FTP connector

When you add a connection shape and select an FTP connector, you will see that two endpoints are available:

Here:

  • FTP GET is used to retrieve files from the given server (i.e. to receive data)

  • FTP PUT is used to add/update files on the given server (i.e. to send data)

Configuring FTP endpoints

Having selected either of the two FTP endpoints, configuration options are displayed. The same options are used for both endpoints but in a different sequence, reflecting usage:

For information about these fields please see our Configuring SFTP connections page - details are the same.

get_and_move (FTP command)

Overview

The get_and_move command retrieves the file content, loads it into the flow as a payload, and then moves the file to the specified directory on the remote server.

Need to know

  • You can use the get_and_move command to target specific files in one step. In this scenario, the file content is loaded into the flow as a payload.

  • You can use the get_and_move command to get & move multiple files (using regex in a filter) or all files in a directory, but this requires multiple steps rather than a single get_and_move operation. In this scenario, content from files is NOT loaded into the flow; instead, payloads will contain the filename.

  • Regular expressions are supported when targeting single files to get_and_move.

Connection settings

When configuring an SFTP connector, three fields should be updated:

FTP command
The get_and_move command, followed by the target directory - i.e. where should the file(s) be moved?

Root
The common root to source and target directories.

Path
The source directory and file(s).

When getting and moving all files from a directory, subfolders are not included.

Examples

Two worked examples follow:

  • Getting & moving a specific file

  • Getting & moving multiple files (using regex in a filter shape)

In the second example we move selected files from one folder to another. If we needed to move ALL files, we could simply remove the filter shape.

Getting & moving a specific file

Scenario

    Our process flow is configured as follows:


In this flow, we need to retrieve content from a file named orders.json which is located in /myfiles/folderA on the remote server:

Then we want to move this file to /myfiles/folderB and keep the same filename.

Connector settings

Our SFTP shape is configured as follows:

  • Start looking for the file to get from the root, which is defined as: /myfiles/

  • Check the path for the file to get, which is defined as: folderA/orders.json

  • Having loaded content from this file, move it to the path (from the root) which is specified immediately after the ftp command. This is defined as: get_and_move:folderB/{{current_filename}}

Payload & SFTP server

When the process flow is run, the payload for the SFTP shape shows the content retrieved from /myfiles/folderA/orders.json:

On our remote server, the file is gone from /myfiles/folderA/:

And now it can be found in /myfiles/folderB/:

Getting & moving multiple files

Scenario

Our process flow is configured as follows:

In this flow, we need to get and move all files in /myfiles/folderF which start with 'old', on the remote server. We are moving these files to /myfiles/folderG, which is currently empty. We are keeping the same filenames.

Connector settings for SFTP shape 1

Our first SFTP connector step is configured with a list command, as follows:

  • Use the SFTP GET user pass endpoint.

  • Use list as the FTP command.

  • Look for files in the root, which is defined as: /myfiles/folderF

  • Since there's no specific file to target, we leave the path empty.

When the flow runs, the SFTP shape outputs a single payload which contains all file names found in /myfiles/folderF, as an array:

Flow control settings

We use a flow control step to extract each file name into its own payload:

Here we create batches of 1, so we get one payload per file name. When the flow runs, this shape outputs multiple payloads, each containing a single filename. For example:

Filter shape settings

We use a filter shape to extract only the file names that we want to process. To achieve this, we define a single filter rule, as below:

Here we define the field name as 0 - i.e. the first value in our payload (bearing in mind we only have one field in each payload).

We set the filter type to string and the operator to regex, then provide our regular expression. Our regular expression is set to /^old.*/i, so only files starting with 'old' will be extracted for onward processing. When the flow runs, this shape outputs three payloads:

Connector settings for SFTP shape 2

Our final SFTP connector step is configured to get and move files with the names received from the previous filter shape:

  • Since we are updating the remote server (as opposed to retrieving files), the SFTP PUT user pass endpoint is selected.

  • The root is defined as /myfiles/, which is the common root for both source (folderF) and target (folderG) directories.

  • The path defines which files are retrieved and moved. It's set as folderF/[[payload.0]], which means: look in folderF (from the root) for a filename resolved from the first value in the incoming payload. Keep in mind that this step repeats for each incoming payload from the previous flow control step - i.e. for each file.

  • The ftp command is get_and_move, immediately followed by our target directory: get_and_move:folderG/{{current_filename}}

When the flow runs, this shape outputs a payload for each processed file, each one containing the file name. For example:

On our remote server, all processed files are removed from /myfiles/folderF/:

And now they can be found in /myfiles/folderG/:
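The filter rule used in this example can be sketched with Python's re module (the filenames below are made-up examples, not the scenario's actual files):

```python
import re

# /^old.*/i - case-insensitive match for names starting with 'old'
pattern = re.compile(r"^old.*", re.IGNORECASE)

filenames = ["old_orders.json", "OLD_stock.csv", "new_orders.json"]
matched = [name for name in filenames if pattern.match(name)]
print(matched)  # ['old_orders.json', 'OLD_stock.csv']
```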

    exists (FTP command)

    Content for this page is coming soon! Please check back later.

    pluck (FTP command)

    Overview

    The pluck command retrieves a file, loads content into the process flow as a payload, and then deletes the file from the remote server.

    Need to know

    • You can pluck a single file or multiple files.

    • Regular expressions are supported when defining files to retrieve with the get command.

Connection settings

When configuring an SFTP connector, the following fields should be updated:

FTP command
The pluck command only: pluck

Root
The root to the file(s) you want to pluck. Since the root only applies to one directory, you can enter the full path here.

Path
The filename(s) to be plucked.

Example

Scenario

    Our process flow is configured as follows:


    In this flow, we need to pluck all files in /myfiles/folderB that start with orders - so orders.json, orders2.json, orders3.json:


Connector settings

Our SFTP shape is configured as follows:

  • Use the SFTP GET user pass endpoint.

  • Start looking for the file(s) to pluck from the root, which is defined as: /myfiles/folderB/

  • Check the path for the file(s) to pluck, which is defined as: /^orders.*/i. This regular expression resolves to 'all files starting with orders', irrespective of upper/lower case.


    Payload & SFTP server

    When the process flow is run, the payload for the SFTP shape shows the content retrieved from matched files. We have three payloads:

    On our remote server, all 'orders' files have been removed from /myfiles/folderB/ and this directory is now empty:

    list (FTP command)

    Overview

    The list command lists files in a given directory. The listing is loaded into the flow as a payload (where each filename is listed in an array).

Need to know

Regular expressions are not supported for the list command - it should only be used to list all files in a given directory. If you need to work with a specific group of files, you may find the get command useful.

    Connection settings

    When configuring an SFTP connector, two fields should be updated:

FTP command
The list command only: list

Root
The root to the directory you want to list. Since there is no specific file to consider, the full path to the directory will be the root.

Path
Not required.

Example

Scenario

    Our process flow is configured as follows:

    In this flow, we need to list all files in /myfiles/folderB on the remote server:

    Connector settings

    Our SFTP shape is configured as follows:

    • List all files in the root, which is defined as: /myfiles/folderB/

When the process flow is run, all files in /myfiles/folderB/ on our remote server are loaded into the flow as a payload, with the filenames in an array:


    rename (FTP command)

    Overview

    The rename command changes the name of specified files. You can rename single/multiple files and leave them in the same directory, or you can rename files into a different directory.

    Need to know

    • No content is loaded into the flow. A boolean value is returned to indicate a successful/failed file operation.

    • Regular expressions are supported in filenames.

    Connection settings

When configuring an SFTP connector, three fields should be updated:

    FTP command
    Root
    Path

    Examples

    Renaming but not moving a file

    Scenario

    The steps

    Renaming & moving a file

    Scenario

    The steps

    Configuring SFTP connections

    Introduction

    The Patchworks SFTP connector is used to work with data via files on SFTP servers in process flows. You might work purely in the SFTP environment (for example, copying/moving files between locations), or you might sync data from SFTP files into other connectors, or you might use a combination of both! For example, a process flow might be designed to:

1. Pull files from an SFTP server

2. Use the data in those files as the payload for subsequent processing (e.g. sync to Shopify)

3. Move files to a different SFTP server location

This guide explains the basics of configuring a connection shape with an SFTP connector.

Guidance on this page is written for SFTP connections; however, it also applies to FTP.

    About the Patchworks SFTP connector

    Authentication

    When you install the Patchworks SFTP connector from the Patchworks marketplace and then add an instance, you'll find that two authentication methods are available:


    User pass

    The instance is authenticated by providing a username and password for the SFTP server.

    Key pass

    The instance is authenticated by providing a private key (RSA .pem format) for the SFTP server.

    Further information on these authentication methods can be found on our SFTP (prebuilt connector) page.

    Endpoints

    When you add a connection shape and select an SFTP connector, you will see that two endpoints are available:

    Here:

    • SFTP GET UserPass is used to retrieve files from the given server (i.e. to receive data)

    • SFTP PUT UserPass is used to add/update files on the given server (i.e. to send data)

You may notice that the PUT UserPass endpoint has a GET HTTP method - that's because it's not actually used for SFTP. All we're actually doing here is retrieving host information from the connector instance - you'll set the FTP action later in the endpoint configuration, via the ftp command setting.

    Configuring SFTP endpoints

    Having selected either of the two SFTP endpoints, configuration options are displayed. The same options are used for both endpoints but in a different sequence, reflecting usage:

    These fields are summarised below:


    FTP command

    A valid FTP command is expected at the start of this field (e.g. get, put, rename, etc.). If required, qualifying path/filename information can follow a given command.

    Root

This field is only needed if you are specifying a regular expression in the subsequent path field.

If you are NOT defining the path field as a regular expression, the root field isn't important - you can leave it set to /.

If you ARE defining the path field as a regular expression, enter a root path that reflects the expected file location as closely as possible - this will optimise performance for expression matching. For example, suppose the files that we want to process are in the following SFTP folder: /orders/store/year/pending, and that our specified path contains a regular expression to retrieve all files for store 1 for the current day in 2023. In this case our root would be defined as: orders/store1/2023/pending

    Path

    If the name of the file that you want to target is static and known, enter the full path to it here - for example:

    store1/2023/pending/20230728.json

    When specifying a path to a given folder in this way, you don't need a / at the start or at the end.

    If the name is variable and therefore unknown, you can specify a regular expression as the path. In this case, you enter the required regular expression here, and ensure that the root field contains a path to the relevant folder (see above).

    Original filename

This field is not currently used. For information about working with original filenames please see the Using an {{original_filename}} variable section below.

    Original path

This field is not currently used. For information about working with original paths please see the Using an {{original_path}} variable section below.

    Valid FTP commands

    Please refer to the Valid FTP commands page.

    Using an {{original_filename}} variable

    If you're processing files between SFTP server locations, the {{original_filename}} variable is used to reference filenames from a previous SFTP connection step. It's most typically used with the SFTP PUT UserPass endpoint. This handles cases where you're taking action with files/data processed by a previous connection shape which is configured to use the SFTP GET UserPass endpoint and retrieve files matching a regular expression path.

    In this scenario, we can't know the literal name of the file(s) that the SFTP PUT UserPass endpoint will receive. So, by setting the path field to {{original_filename}}, we can refer back to the filename(s) from the previous SFTP connection step.

    Example

The sample process flow below shows two connection shapes that are configured with SFTP endpoints - the first is to get files and the second is to put files.

If we look at settings for the first SFTP connection, we can see that it's configured to get files matching a regular expression, in a pending folder. The regular expression is explained below:

    • the / at the start of the string denotes the start of the regular expression

    • the ^ means that the first part of the line (i.e. filename) must start with the literal string orders

    • [0-9]* matches any sequence of digits

    • \. is an escaped full stop, so a literal full stop is used instead of the special meaning that a full stop has in regular expressions

    • json is a literal string to be matched

    • the / at the end of the string denotes the end of the regular expression

It will match the following files:

    • orders20230717.json

    • orders20230718.json

    • orders20230719.json
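The pattern described above can be verified locally with any PCRE-compatible tool. The PHP sketch below is illustrative only (it is not part of the platform); the filenames are the ones from the example:

```php
<?php

// Illustrative check of the pattern explained above: filenames must start
// with the literal string "orders", followed by any run of digits, then ".json".
$pattern = '/^orders[0-9]*\.json/';

$files = [
    'orders20230717.json',   // matches
    'orders20230718.json',   // matches
    'invoices20230717.json', // no match: doesn't start with "orders"
];

foreach ($files as $file) {
    echo $file . ' => ' . (preg_match($pattern, $file) ? 'match' : 'no match') . PHP_EOL;
}
```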

    Now if we check settings for the second SFTP connection, we can see that it's configured to put files into a ready folder. By specifying {{original_filename}} in the path field, we are referencing filenames from the first SFTP connection.

    So, the three files retrieved from the first SFTP connection step (from the pending folder) are copied to the ready folder in the second SFTP connection step.

    Using an {{original_path}} variable

    The {{original_path}} variable is used to replicate the path from a previous SFTP connection step. It's most typically used with the SFTP PUT UserPass endpoint.

    This handles cases where you're taking action with files/data processed by a previous connection shape which is configured to use the SFTP GET UserPass endpoint to retrieve files matching a regular expression path and you want to replicate the source path in the target location.

    Example

The sample process flow below shows two connection shapes that are configured with SFTP endpoints - the first is to get files and the second is to put files. Our aim is to copy files retrieved from an FTP location in the first connection step, to a second FTP location, using the same folder structure as the source.

If we look at settings for the first SFTP connection, we can see that it's configured to get files matching a regular expression, in a store1 folder. The path is added as a regular expression, explained below:

    • the / at the start of the string denotes the start of the regular expression

    • orders\/2023\/pending\/ is a literal string for the folder structure, with escaped forward slashes

    • orders is a literal string

It will match the following files:

    • orders20230717.json

    • orders20230718.json

    • orders20230719.json

    Now if we check settings for the second SFTP connection, we can see that it's configured to put files into a path specified as {{original_path}}:

    So, the three files retrieved from the first SFTP connection step are copied to the ready folder in the second SFTP connection step and the source folder structure is replicated:

    Using a {{current_filename}} variable

The {{current_filename}} variable is used to reference the filename within the current SFTP connection step. It's particularly useful when moving files - for example:

    get_and_move:store1/completed_orders/{{current_filename}}

    Creating SFTP folders dynamically based on timestamps

    A fairly common requirement is to create folders on an SFTP server which are named according to the current date. This can be achieved using a custom script, as summarised below.

    Script code

    The following four lines of code should be added to your script:

    Our example is PHP - you should change as needed for your preferred language.

    SFTP connection shape path

    The path in your SFTP connection shape should be set to:

    How it works

    The data object in the script shape contains three items: payload, meta, and variables.

Our script code creates a timestamp, puts it into the meta, and then puts the meta back into the data. The SFTP shape always checks whether there is an original_filename key in the meta and, if one exists, it is used.
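Assembled into a complete custom script, the steps above can be sketched as below. This assumes the standard handle() template used for custom scripts; FIXED_TEXT is a placeholder for whatever filename prefix you need:

```php
<?php

// Sketch of the full custom script, assuming the standard handle() template.
function handle($data)
{
    // Build a timestamp (hundredths of a second since the epoch)
    $timestamp = round(microtime(true) * 100);

    // Put it into the meta as original_filename...
    $meta = $data['meta'];
    $meta['original_filename'] = 'FIXED_TEXT' . '_' . $timestamp . '.xml';

    // ...and put the meta back into the data. The SFTP shape checks for an
    // original_filename key in the meta and, if present, uses it.
    $data['meta'] = $meta;

    return $data;
}
```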

    Syncing SFTP data to another target connector

    Much of the information above focuses on scenarios where you are working with files between different SFTP locations. However, another approach is to take the data in files from an SFTP server and sync that data into another Patchworks connector.

    When a process flow includes a source connection for an SFTP server (using the SFTP GET UserPass endpoint) and a non-SFTP target connector (for example, Shopify), data in the retrieved file(s) is used as the incoming payload for the target connector.

    If multiple files are retrieved from the SFTP server (because the required path in settings for the SFTP connector is defined as a regular expression which matches more than one file), then each matched file is put through subsequent steps in the process flow one at a time, in turn. So, if you retrieve five files from the source SFTP connection, the process flow will run five times.

    More information

    For information about working with regular expressions, please see the link below:

    FTP command followed by the required directory and name for the file when it's renamed.

    The common root to source and target directories.

    The source directory and file(s) to be renamed.

    configuring an SFTP connector
    Rename but don't move a file
    Rename & move a file
    Scenario
    The steps
    Scenario
    The steps

    Our process flow is configured as below:


    In this flow, we need to rename a file named hello.txt in /myfiles/folderC to goodbye.txt:

    We want the renamed file to end up in the same directory.


    Connector settings

    Our SFTP shape is configured as follows:

    • Use the SFTP GET user pass endpoint.

    • Start looking for the file to rename from the root, which is defined as: /myfiles/


    Payload & SFTP server

    When the process flow is run, the payload for our SFTP shape shows just a 1:

    A 1 indicates a successful response from your remote server. A 0 indicates an unsuccessful response.

    On our remote server, hello.txt is now named goodbye.txt (still in /folderC):


    Our process flow is configured as follows:


In this flow, we need to rename /myfiles/folderE/hello.txt. The file will be renamed as /myfiles/folderF/goodbye.txt - so we are both moving and renaming the file:


    Connector settings for SFTP shape

    Our first SFTP connector step is configured as follows:

    • Use the SFTP GET user pass endpoint.

    • Start looking for the file to rename from the root, which is defined as: /myfiles/


Payload for SFTP shape

When the process flow is run, the payload for our SFTP shape shows just a 1:

    A 1 indicates a successful response from your remote server. A 0 indicates an unsuccessful response.


    Remote server outcome

    On our remote server, hello.txt is gone from /myfiles/folderE:

    But now we have a file named, goodbye.txt in /myfiles/folderF:

    append (FTP command)

    Overview

    The append command takes the incoming payload (either from another file on your remote server or some other source) and appends this content to a specified file on the remote server.

    $timestamp = round(microtime(true) * 100);
    $meta = $data['meta'];
    $meta['original_filename'] = 'FIXED_TEXT' . '_' . $timestamp . '.xml';
    $data['meta'] = $meta;
    {{original_filename}}

    Check the path for the file to rename, which is defined as: folderC/hello.txt

  • Rename the file to the directory/name provided immediately after the ftp command. This is defined as: rename:folderC/goodbye.txt

  • Check the path for the file to rename, which is defined as: folderE/hello.txt

  • Rename the file to the directory/name provided immediately after the ftp command. This is defined as: rename:folderF/goodbye.txt

  • [0-9]* matches any sequence of digits

  • \. is an escaped full stop, so a literal full stop is used instead of the special meaning that a full stop has in regular expressions

  • json is a literal string to be matched

  • the / at the end of the string denotes the end of the regular expression

  • In this way, any regular expression to match for the path will start in the relevant (2023) folder - rather than checking folders and subfolders for all stores and all years.
    Retrieve files from root with a common filename prefix

    Suppose our SFTP folder is: store1/orders/2023/pending ...and that it contains a number of files which all start with 'orders' followed by a date:

    Our path and root fields would be defined as below:

    Here, the root field is set to the folder containing our required files and the path field contains a regular expression where:

    • the / at the start of the string denotes the start of the regular expression

    • the ^ means that the first part of the line (i.e. filename) must start with the literal string orders

    • [0-9]* matches any sequence of digits

    • \. is an escaped full stop, so a literal full stop is used instead of the special meaning that a full stop has in regular expressions

    • json is a literal string to be matched

    • the / at the end of the string denotes the end of the regular expression

    Retrieve all files of a type from the root

Suppose you have multiple .csv files to be retrieved from the root of your server. In this case, the root should be left blank and the path should be set as: /.*\.csv/i - for example:

    Here, the root field is empty and the path field contains a regular expression where:

    • the / at the start of the string denotes the start of the regular expression.

    • .* matches any character (.) zero or more times (*). The * ensures that any characters can appear before the .csv element of the filename.

    • \.csv matches the literal string .csv. The backslash (\) is used to escape the dot (.), making it match a literal dot, so the csv is matched exactly.

    • / at the end of the string denotes the end of the regular expression.

    • i is a flag that means 'case-insensitive'. This ensures that the regular expression will match both upper and lower case variations of the letters in the pattern. For example, it would match both .csv and .CSV.
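The behaviour of the case-insensitive flag can be confirmed with a quick PCRE check. The PHP sketch below is illustrative only; the filenames are made up:

```php
<?php

// Illustrative check of the case-insensitive pattern described above.
$pattern = '/.*\.csv/i';

foreach (['report.csv', 'REPORT.CSV', 'report.json'] as $file) {
    echo $file . ' => ' . (preg_match($pattern, $file) ? 'match' : 'no match') . PHP_EOL;
}
```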

    Using an {{original_filename}} variable
    Using an {{original_path}} variable
    Connection settings

    When configuring an SFTP connector, three fields should be updated:

FTP command
The FTP command only: append

Root
The root to the directory where the source file is located. You could enter just / and then a full path in the path field, or you can provide the full path as the root.

Path
The source file (i.e. the file which contains content you want to append to the target file).

    Example

    • Scenario

    • The steps

    Scenario

    The steps

    delete (FTP command)

    Overview

    The delete command deletes a file from a given location on the remote server. No content is loaded into the flow.

    Need to know

    Regular expressions are supported when defining files to be removed with the delete command.

    Connection settings

    When configuring an SFTP connector, three fields should be updated:

FTP command
The FTP command only: delete

Root
The root to the directory where the required file is located. You could enter just / and then a full path in the path field, or you can provide the full path to the file. If you are deleting multiple files using a regular expression, enter the entire path here, so the path field only contains your regular expression.

Path
The filename, or a regular expression (if you specified the full path as the root), or the full path to the file.

    Example

    • Scenario

    • The steps

    Scenario

    The steps

    Our process flow is configured as follows:

    In this flow, we have three SFTP shapes, each with a different task:

SFTP shape 1 Retrieve a list of fruit from a summer.txt file in /myfiles/folderA

SFTP shape 2 Append this content to another list of fruit in a file named winter.txt, which is also stored in /myfiles/folderA:

    SFTP shape 3 Get the full list of fruit from winter.txt and load it into the flow.

    Connector settings for SFTP shape 1

    Our first SFTP shape is configured to retrieve content to append, from /myfiles/folderA/summer.txt . Connector settings are defined using a get command, as follows:

    Payload from SFTP shape 1

    When the process flow is run, the payload for our first SFTP shape shows the content in /myfiles/folderA/summer.txt :

    Connector settings for SFTP shape 2

    Our second SFTP shape is configured to append the current payload to our target file on the remote server - /myfiles/folderA/winter.txt . Connector settings are defined as follows:

    Payload from SFTP shape 2

    When the process flow is run, the payload for our second SFTP shape shows just a 1:

    A 1 indicates a successful response from your remote server. A 0 indicates an unsuccessful response from your remote server.

    Connector settings for SFTP shape 3

    Our final SFTP shape is configured to retrieve our target file from the remote server - /myfiles/folderA/winter.txt . Connector settings are defined using a get command, as follows:

    Payload from SFTP shape 3

    When the process flow is run, the payload for our final SFTP shape shows the content in /myfiles/folderA/winter.txt :

Here we can see that content from /myfiles/folderA/summer.txt has been appended to the original content in /myfiles/folderA/winter.txt.
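Conceptually, the append command concatenates the incoming payload onto the target file's existing content. A local PHP equivalent is sketched below; the fruit lists are illustrative stand-ins for the file contents:

```php
<?php

// Local equivalent of the append behaviour shown above: the incoming payload
// (content retrieved from summer.txt) is appended to the existing content of
// the target file (winter.txt). List contents are illustrative.
$winter = "apple\npear\n";   // existing content of winter.txt
$summer = "peach\nmelon\n";  // incoming payload, retrieved from summer.txt

$merged = $winter . $summer; // what winter.txt contains after the append

echo $merged;
```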


    Our process flow is configured as follows:

    In this flow, we need to delete all files that start with 'old' in /myfiles/folderB on the remote server:

    Connector settings

    Our SFTP shape is configured as follows:

    • Look for files from the root, which is defined as: /myfiles/folderB

    • Check the path for the file(s) to remove, defined as: /^old.*/i

    • Delete files with the ftp command. This is defined as: delete

    On our remote server, all files starting with 'old' have been removed from /myfiles/folderB/:

W3Schools.com (www.w3schools.com)

    copy (FTP command)

    Overview

    The copy command copies a file from one location on the remote server to another location on the remote server. The copied file remains in the source directory and no content is loaded into the flow.

    Need to know

    • The copy command can be used to target specific files in one step. You can copy all files in a directory, or selected files (using regex), but this requires multiple steps rather than a single copy operation.

    • Regular expressions are supported when defining files to be copied with the copy command.

    • When copying all files from one directory to another, subfolders are not included.

    Connection settings

When configuring an SFTP connector, three fields should be updated:

    FTP command
    Root
    Path

    Examples

    Copying a specific file from one directory to another

    Scenario

    The steps

    Copying all files from one directory to another

    Scenario

    The steps

In this example we copied ALL files from one folder to another. If we needed to copy a selection of files, we could add a filter shape after the flow control shape - for example:

Here we define the field name as 0, so the first value in our payload is targeted (bearing in mind we only have one field in each payload).

We set the filter type to string and the operator to regex.

    Using connector shape response scripts

    Introduction

    Typically, a process flow run is triggered and a request for data is made via a connector shape - if the request is successful, data is retrieved and the flow continues.

    However, there may be scenarios where you need to control whether the connector shape or process flow run should fail or continue based on information returned from the connection request. To achieve this, you can apply a response script to your connector shape.

    The steps detailed here show how to apply a response script to connector shapes, so this script only runs when the associated connector shape is used.

If you want to apply a response script wherever and whenever a given endpoint is called in your process flows, it's more efficient to add this to the endpoint as a post-request script.

    In the context of connector endpoints, response scripts and post-request scripts are the same thing.

    How it works

    When a response script is applied to a connector shape, the script runs every time a connection is attempted. The script receives the response code, headers, and body from the request and - utilising response_code actions - returns a value determining whether the connector shape/flow run continues or stops.

    Response scripts are just like any other custom script, except they receive additional information from the request - see lines 11 to 14 in the example below:

When a response script is applied, the existing schema/data path defined for the associated endpoint is bypassed. If data is modified by the script, it is returned in its modified state. If the script does not modify data, the data is returned in its original format. You should consider this in any subsequent shapes where the schema is used - for example: map, filter, flow control, route.

    If you use a response script on an endpoint that modifies data and you are reliant on that data to resolve variables (e.g. for pagination) you should ensure that such dependencies are not compromised by your modifications.

    Implementing a response script

    To implement a response script, you should:

    Step 1: Write & deploy your response script

Response scripts are written and deployed in the usual way, via the custom scripts option. However, two additional options can be used for scripts that you intend to apply via connector shapes: response_code and message.

These options are only valid when the script is applied to a connector step as a response script.

    Response code

    The response_code determines how the process flow behaves if a connection request fails. Supported response_code values are:

    Value
    Notes

    Message

    The message is optional. If supplied, it is output in the run logs.

    Example

    Step 2: Apply the response script

    To apply your response script, access settings for the required connector shape and select your script from the response script dropdown field.

    Examples

    Scenario 1

    Here we handle the scenario where a connection response appears OK because the status code received is 200, but in fact the response body includes a string (Invalid session) which contradicts this. So, when this string is found in the response body, we want to retry the process flow.

In this case, we return a response_code of 2 with a message of Invalid session:

    Scenario 2

    Here we show how the payload received from a connection request is checked for an order number and an order status - retrying the process flow if a particular order status is found:

    Related pages

0: Continue.

1: Fail the connector step and retry. The connector step is marked as failed and the queue will attempt it again.

2: Fail the process flow and queue it to retry. The process flow is marked as failed and queued for a retry.

3: Fail the process flow and do not retry.

4: Force the connector to re-authenticate and retry the request.

    post-request script
    map
    filter
    flow control
    route
    Write and deploy the required script
    Apply the script to your connector shape
    custom scripts
    apply via connector shapes
    response_code
    message
    connector step
    Custom scripts
    <?php
    
    /**
     * Handler function.
     *
     * @param array $data [
     *      'payload'   => (string|null) the payload as a string|null
     *      'variables' => (array[string]string) any variables as key/value
     *      'meta'      => (array[string]string) any meta as key/value
     *      'flow'      => (array[mixed]) current flow data, including variables
     *      'response'  => [
     *           'headers' => ['Content-Type' => 'application/json', .......],
     *           'body' => '....',
     *           'status' => 200
     *      ]
     *    ]
     *
     * @return array $data Structure as above, plus 'logs' => (array[string]) Logs to be written to flow run log after script execution
     */
    
    function handle($data)
    {
        return $data;
    }
    <?php
    
    function handle($data)
    {
        // Stops the flow with a message
        $data['response_code'] = 3;
        $data['message'] = 'Flow stopped by response script';
    
        return $data;
    }
<?php
/**
 * Handler function.
 *
 * @param array $data [
 *      'payload'   => (string|null) the payload as a string|null
 *      'variables' => (array[string]string) any variables as key/value
 *      'meta'      => (array[string]string) any meta as key/value
 *      'response'  => [
 *           'headers' => ['Content-Type' => 'application/json', .......],
 *           'body' => '....',
 *           'status' => 200
 *      ]
 *    ]
 */
function handle($data)
{
    // Handle the invalid session error in PVX: the status is 200,
    // but the response body contradicts it.
    // Sample response values for testing:
    // $data['response']['status'] = 200;
    // $data['response']['body'] = 'Invalid session';
    if (str_contains($data['response']['body'], 'Invalid session')) {
        return [
            'response_code' => 2, // retry flow
            'message' => 'Invalid session'
        ];
    }

    return $data;
}
    
<?php
/**
 * Handler function.
 *
 * @param array $data [
 *      'payload'   => (string|null) the payload as a string|null
 *      'variables' => (array[string]string) any variables as key/value
 *      'meta'      => (array[string]string) any meta as key/value
 *      'response'  => [
 *           'headers' => ['Content-Type' => 'application/json', .......],
 *           'body' => '....',
 *           'status' => 200
 *      ]
 *    ]
 */
function handle($data)
{
    // Sample payload for testing:
    // $data['payload'] = [
    //     'order_id' => 1,
    //     'status' => 'Pending',
    // ];

    // Check if the order is ready to be processed yet
    if ($data['payload']['status'] === 'Pending') {
        return [
            'response_code' => 2, // retry flow
            'message' => 'Order not ready for processing, adding flow back to queue'
        ];
    }

    return $data;
}
We then provide our regular expression. In this example (/^old.*/i), only files starting with 'old' will be copied.

    FTP command followed by the target directory - i.e. where should the file(s) be copied?

    The common root to source and target directories.

    The source directory and files

    target specific files
    copy all files in a directory, or selected files (using regex)
    all files from one directory to another
    configuring an SFTP connector
    Copy a specific file from one directory to another
    Copy all files from one directory to another
    Scenario
    The steps
    Scenario
    The steps
    filter shape
    flow control

    Our process flow is configured as follows:


In this flow, we need to copy a file named orders.json from /myfiles/folderB on the remote server, to /myfiles/folderA, keeping the same filename:


    Connector settings

    Our SFTP shape is configured as follows:

    • Start looking for the file to get, from the root, which is defined as: /myfiles/

    • Check the path for the file to copy, which is defined as: folderB/orders.json


    On our remote server, the file remains in /myfiles/folderB/:

    And it can also be found in /myfiles/folderA/:


    Our process flow is configured as below:


In this flow, we need to copy all files in /myfiles/folderC on the remote server to /myfiles/folderD, which is currently empty. We are keeping the same filenames.


    Connector settings for SFTP shape 1

    Our first SFTP connector step is configured as follows:

    • Use the SFTP GET user pass endpoint.

    • Use list as the FTP command.


    Flow control settings

    We use a flow control step to extract each file name into its own payload:

    Here we create batches of 1, so we get one payload per file name. When the flow runs, this shape outputs multiple payloads, each containing a single filename. For example:


    Connector settings for SFTP shape 2

    Our final SFTP connector step is configured as follows:

    • Since we are updating the remote server (as opposed to retrieving files) the SFTP PUT user pass endpoint is selected.

    • The root is defined as /myfiles/


    On our remote server, all files remain in /myfiles/folderC/:

    And they can also be found in /myfiles/folderD/:


    Copy this file to the path (from the root) which is specified immediately after the ftp command. This is defined as: copy:folderA/{{current_filename}}

    Look for files in the root, which is defined as: /myfiles/folderC

  • Since there's no specific file to target, we leave the path empty

  • When the flow runs, the SFTP shape outputs a single payload which contains all file names found in /myfiles/folderC, as an array:

    /myfiles/ is the common root for both source (folderC) and target (folderD) directories.
  • The path defines which files are copied. It's set as folderC/[[payload.0]] which means: look in folderC (from the root) for a filename resolved from the first value in the incoming payload. Keep in mind that this step repeats for each incoming payload from the previous flow control step - i.e. for each file.

  • The ftp command includes the copy command, immediately followed by our target directory: copy:folderD/{{current_filename}} .

get (FTP command)

    Overview

    The get command retrieves file content and loads it into the flow as a payload. If multiple files are returned, multiple payloads are generated.

    Need to know

    Regular expressions are supported when defining files to retrieve with the get command.

    Connection settings

When configuring an SFTP connector, two fields should be updated:

FTP command
The FTP command only: get

Root
The root to the directory you want to list. Since there is no specific file to consider, the full path to the directory will be the root.

Path
Not required.

    configuring an SFTP connector
    Scenario
    The steps

    Our process flow is configured as follows:


    In this flow, we need to list all files in /myfiles/folderB on the remote server:


    Connector settings

    Our SFTP shape is configured as follows:

    • List all files in the root, which is defined as: /myfiles/folderB/


    On our remote server, all files in /myfiles/folderB/ are loaded into the flow as a payload, with files in an array: