
Go-live best practice


Last updated 3 years ago


The aim of this guide is to set out best practices for making a stream in the current Control Panel set-up. The Stream Wizard is not yet available and validation across all sections of the Control Panel is still being investigated, so in the meantime this guide describes the best way to create a stream with the environment we currently have.

The purpose is to reduce the risk of the hiccups we have seen recently when onboarding and launching new clients, and to maintain current clients with as little interruption to service as possible, so that they can develop trust in the system.

File name formatting - FTP destination file

Description: This practice avoids the risk of overwriting a destination file when two files with the same name format are processed at the same time.

Best Practice: When creating a file name, always use a contextual prefix together with ##EVENTID## to ensure the naming convention is unique; this can also be combined with PPK values.

Example:

filenameformat = "Order_##EVENTID##.xml"
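To illustrate the effect of the setting, here is a minimal sketch of how a ##EVENTID## placeholder could be expanded (the substitution is performed internally by the platform; the `build_filename` helper below is hypothetical, not platform code):

```python
# Hypothetical sketch of ##EVENTID## placeholder substitution.
# The real Control Panel performs this replacement internally.
def build_filename(filename_format: str, event_id: str) -> str:
    """Replace the ##EVENTID## token with the event's unique ID."""
    return filename_format.replace("##EVENTID##", event_id)

# Two files processed at the same moment still get distinct names.
print(build_filename("Order_##EVENTID##.xml", "a1b2c3"))  # Order_a1b2c3.xml
print(build_filename("Order_##EVENTID##.xml", "d4e5f6"))  # Order_d4e5f6.xml
```

Because every event ID is unique, simultaneous files can never collide on the destination, which a date-only or order-number-only name cannot guarantee.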

Track entity in the core

Description: Every time we sync an entity we should store it in the core. This is how we identify whether an entity already exists, and how the system decides whether an event should be aborted. The purpose is to avoid duplicated orders.

Best Practice: Always make sure there is a tracked PPK; for examples of how to set up tracking, have a look at .

To avoid duplications in the Control Panel, also check whether the entity (e.g. an order) already exists. This can be done using the settings below:

    "--abort_check": {
      "*ppk*": "id",
      "*post_format*": [
        {
          "key_lookup": {
            "*match*": "s_id",
            "*pluck*": "d_id",
            "*on_match*": "abort",
            "*on_fail*": "ppk"
          }
        }
      ]
    },
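The intent of that block can be sketched in plain Python (the `s_id`/`d_id` field names follow the settings above; the `abort_check` function and the `synced` dictionary are illustrative stand-ins, not platform code):

```python
# Illustrative sketch of the --abort_check logic above: look up the
# incoming entity's source ID among previously synced entities; on a
# match, abort the event; otherwise track the new entity under its PPK.
def abort_check(source_id: str, synced: dict) -> str:
    """Return 'abort' on a match, or 'ppk' to track a new entity."""
    if source_id in synced:          # *match* on s_id
        return "abort"               # *on_match*: duplicate, skip it
    synced[source_id] = source_id    # *on_fail*: store under the PPK
    return "ppk"

synced = {"1001": "D-77"}            # s_id -> d_id of already-synced orders
print(abort_check("1001", synced))   # abort (order already exists)
print(abort_check("1002", synced))   # ppk (new order, now tracked)
```

A second event carrying the same order would now hit the tracked entry and be aborted, which is exactly the duplicate-order protection the setting provides.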

Source Filters

Description: When retrieving entities from Shopify, filters should be added to the source settings to ensure only the correct entities are retrieved.

Best Practice: When creating a stream that retrieves orders from Shopify, add the correct filters to get the right data. Important: the schedule delta should never be zero (i.e. the time filter window should never exactly match the schedule), because making the pick-up window too short risks missing orders. Conversely, if the window is too long (for example over 3 hours, or even days), a busy store will generate an unnecessary number of aborted events.

filter_created_at_min = "CURRENT-2 hour"
filter_created_at_max = "CURRENT"
filter_financial_status = "paid, partially_refunded"
filter_fulfillment_status = "unfulfilled"
filter_status = "open"
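The relationship between the window and the schedule can be sketched as follows (resolving `CURRENT` into a timestamp is done by the platform; the `order_window` helper is a hypothetical illustration):

```python
# Sketch of how a "CURRENT-2 hour" window relates to the run schedule.
from datetime import datetime, timedelta, timezone

def order_window(now: datetime, delta_hours: int):
    """Return (created_at_min, created_at_max) for the source filters."""
    return now - timedelta(hours=delta_hours), now

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
window_min, window_max = order_window(now, 2)

# With an hourly schedule, successive 2-hour windows overlap by an
# hour, so an order created near a run boundary is always covered by
# the next run. A window equal to the schedule interval leaves no
# overlap, which is why the delta should never be zero.
next_min, _ = order_window(now + timedelta(hours=1), 2)
assert next_min < window_max  # windows overlap -> no gap between runs
```

The overlap is what the abort check above absorbs: an order picked up twice is simply aborted on the second attempt, whereas an order that falls into a gap is lost.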

Archive files after processing

Description: We need to ensure that the client and HiCo/ET always have visibility of files after they have been removed from the FTP, so that if there is an issue with a file we can access it to work out what went wrong.

Best Practice: Ensure the file is backed up by moving it to an archive folder. To do this, add the following setting:

move_remote_file = "archive"

It is important to make sure the 'archive' folder exists on the client's FTP. At the moment our system cannot create a new directory, and even if it could, we might not always have the correct permissions to create one. This therefore needs to be discussed with the client.
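The effect of the setting amounts to an FTP rename into the archive directory, which is why the folder must already exist. A hedged sketch (the `archive_file` and `archive_target` helpers are hypothetical, not platform code):

```python
# Hypothetical sketch (not platform code) of the effect of
# move_remote_file = "archive": after processing, the source file is
# renamed into an existing "archive" directory on the client's FTP.
from ftplib import FTP, error_perm

def archive_target(filename: str, archive_dir: str = "archive") -> str:
    """Build the destination path for an archived file."""
    return f"{archive_dir}/{filename}"

def archive_file(ftp: FTP, filename: str, archive_dir: str = "archive"):
    """Move a processed file into the archive folder via FTP rename.

    The server raises error_perm if the folder is missing or our user
    lacks permission -- which is why the folder must be agreed with
    the client up front rather than created on the fly.
    """
    ftp.rename(filename, archive_target(filename, archive_dir))
```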

Manual peer-to-peer validation

At the moment we have neither automatic validation nor the ability to create drafts when adding settings to streams, sources and destinations. As best practice, therefore, when you have created or updated a stream, have someone else review it and sign off on it to double-check there are no errors. Checklist for reviewing a stream:

  • Stream, source and destination settings (check the filters and ensure there are no spelling mistakes)

  • Looking over the transformation file to make sure there are no obvious errors
