Go-live best practice

The aim of this guide is to document best practices for creating a stream in the current Control Panel setup. The Stream Wizard is not yet available and validation across the Control Panel sections is still being investigated, so in the meantime this guide describes the best way to create a stream with the environment we currently have.

The purpose is to reduce the risk of hiccups like those we have seen recently when onboarding and launching new clients, and to maintain existing clients with as little interruption to service as possible, so that they can develop trust in the system.

File name formatting - FTP destination file

Description: This section exists to avoid the risk of overwriting a destination file when two files with the same name format are processed at the same time.

Best Practice: When creating a file name, always use a contextual prefix and include ##EVENTID## to ensure the naming convention is unique. This can also be combined with PPK values.


filenameformat = "Order_##EVENTID##.xml"
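If you want the PPK value in the file name as well, it can be combined with the prefix and ##EVENTID##. Note that the ##ORDERID## placeholder below is hypothetical; check which placeholders the platform actually exposes for the tracked PPK before using this:

filenameformat = "Order_##EVENTID##_##ORDERID##.xml"

This keeps the name unique per event while also making the file easy to find by its order identifier.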

Track entity in the core

Description: Every time we sync an entity we should store it in the core. This is how we identify whether an entity already exists, and it is how the system decides whether an event should be aborted. The purpose is to avoid duplicate orders.

Best Practice: Always make sure there is a tracked PPK. For examples of how to set up tracking, have a look at creating new transformations.

To avoid duplicates in the Control Panel, also check whether the entity (e.g. an order) already exists. This can be done using the settings below:

    "--abort_check": {
      "*ppk*": "id",
      "*post_format*": [
          "key_lookup": {
            "*match*": "s_id",
            "*pluck*": "d_id",
            "*on_match*": "abort",
            "*on_fail*": "ppk"

Source Filters

Description: When retrieving entities from Shopify, filters should be added to source settings to ensure we are getting the correct entities.

Best Practice: When creating a stream that will fetch orders from Shopify, add the correct filters to get the right data. Important: the schedule delta should never be zero (i.e. the time filter window should never exactly match the schedule interval), because making the pickup window too short risks missing orders. Conversely, if the window is too long (for example over 3 hours, or even days), a busy store will generate an unnecessary number of aborted events.

filter_created_at_min = "CURRENT-2 hour"
filter_created_at_max = "CURRENT"
filter_financial_status = "paid, partially_refunded"
filter_fulfillment_status = "unfulfilled"
filter_status = "open"

Archive files after processing

Description: We need to ensure that the client and HiCo/ET always have visibility of files after they have been removed from the FTP. The reason for this is that if there is an issue with a file, we need to be able to access it to figure out what went wrong.

Best Practice: Ensure the file is backed up by moving it to an archive folder. To do this, add the following setting:

move_remote_file = "archive"

It is important to make sure the 'archive' folder exists on the client's FTP. At the moment our system does not have the capability to create a new directory, and even if it did, we might not always have the correct permissions to create a new folder. Therefore this will need to be discussed with the client.

Manual peer-to-peer validation

At the moment we have neither automatic validation nor the capability to create drafts when adding settings to streams, sources and destinations. As best practice, therefore, when you have created or updated a stream, please have someone else look at it and sign off on it to double-check that there are no errors. Checklist for reviewing a stream:

  • Stream, source and destination settings (checking filters and ensuring there are no spelling mistakes)

  • Looking over the transformation file to make sure there are no obvious errors
