This document provides accsyn job JSON (JavaScript Object Notation) payload examples and describes best practices for using the accsyn API or CLI.
accsyn jobs are internally submitted in JSON format, from the desktop app, web and CLI.
When submitting a job through the API, or using a file with the CLI, the correct job JSON must be provided, as specified throughout the rest of this document.
Hint: the job JSON payload can be inspected from within the accsyn Desktop app, by clicking the "JSON" button beneath the green submit button in NC mode:
The general structure of job JSON:
{
.. job attributes ..,
"tasks": .. list or JSON ..,
"settings": .. JSON ..
"metadata": .. JSON ..
}
Job attributes; (optional) Typically the name of the job. If not given, the name is auto-generated from the filename of the first task.
Tasks: List of tasks - files and directories to transfer, see party and path notation below.
Settings; (optional) Additional job settings, see below for a complete listing.
Metadata; (optional) Additional job metadata.
Note: tasks can be omitted if just a single file, and be replaced with job attributes "source" and "destination".
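As a sketch of the general structure above (the attribute values are illustrative, not a real job), the payload can be assembled and serialized with Python's json module before submission:

```python
import json

# Assemble a job payload following the general structure described above.
# Everything besides "tasks" (or the "source"/"destination" shorthand)
# is optional.
job = {
    "name": "example job",          # job attribute (optional)
    "tasks": [                      # list form of tasks
        {
            "source": "share=projects/reference.tif",
            "destination": "john@user.com:D:/work/reference.tif",
        }
    ],
    "settings": {},                 # optional additional job settings
    "metadata": {},                 # optional job metadata
}

payload = json.dumps(job, indent=4)  # the serialized JSON payload
print(payload)
```

Serializing through json.dumps also acts as a cheap sanity check that the payload is valid JSON before it reaches accsyn.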
The accsyn job JSON requires a "tasks" entry containing one or more source & destination definitions. accsyn has adopted the rsync notation for defining a party and a path:
<party>:<path>
Where party can be:
<workspace>; Main premises/hq servers, for example domain "mycompany" with accsyn URL https://mycompany.accsyn.com.
<user/email>; A user, for example "john@user.com".
<site=id|code>; A remote site server, for example "berlin".
And path can be:
C:\Users\john\Desktop\image.png; A local path at user end.
/Volumes/projects/reference.tif; An absolute path on a root share.
share=projects/reference.tif ; Same notation, but references the share "projects", leaving accsyn to complete the path with the configured prefix for the server platform. This is also called the accsyn path notation.
share=john@user.com/UPLOAD/test.abc; File is located at home user share john@user.com , in subfolder "UPLOAD".
myproject/reference.jpg ; File paths are assumed relative to the default root share; equivalent to share=projects/myproject/reference.jpg.
racing2019_grade/test.abc; Used in conjunction with a user as target, delivers the file into the relative folder "racing2019_grade" at the user end.
Notes:
A folder cannot be given as destination unless a trailing "/" (or "\" on Windows) is added. For example, downloading a file "x.jpeg" to destination "/Volumes/nas/TEMP" will store the file as "TEMP", not inside the folder TEMP. Correct destination notation in this case is "/Volumes/nas/TEMP/", or even better "/Volumes/nas/TEMP/x.jpeg".
Destination paths can be left out if the other party is a site or a user's locally mapped share; this is called "path mirroring" and is suitable for keeping servers and/or workstations in sync when it comes to file structure.
If party is omitted, accsyn interprets this as the workspace party - file is to be sent to or from hq.
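The party and path notation above can be illustrated with a small helper. This is a hypothetical sketch, not part of accsyn; it only shows how the pieces combine:

```python
def accsyn_target(path, party=None):
    """Compose an accsyn <party>:<path> target string.

    If party is None the workspace party is implied and the bare
    path is returned, matching the shorthand described above.
    """
    return path if party is None else f"{party}:{path}"

# A user party with a local absolute path:
print(accsyn_target("C:\\Users\\john\\Desktop\\image.png", "john@user.com"))
# The accsyn share notation, workspace party implied:
print(accsyn_target("share=projects/reference.tif"))
# A site party:
print(accsyn_target("share=projects/thefilm/010", "site=berlin"))
```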
When submitting a job, accsyn tries to resolve a client-server combo based on the party and path given. The rules vary depending on the type of party:
Workspace; Here accsyn tries to resolve the server that serves the volume pointed out by the path, or indirectly by paths on a shared folder or collection.
User; accsyn tries to find a file transfer client belonging to the user, this might be the default client created when using the desktop app or a user server. For more information, see Hosts.
Site; accsyn tries to find a site server, that serves the volume pointed out by the path at the remote site, or indirectly by paths on a shared folder or collection.
If no server or client can be resolved, an error with appropriate feedback is given. Once resolved, the mapping remains static: if a new server endpoint is deployed or the user launches another client, the mapping is NOT updated - a new transfer job must be submitted.
Note: The accsyn Python API does not provide a built-in p2p ASC client - it can only be used to control transfers.
A user is only allowed to access files and folders on an accsyn site (workspace/main hq premises or a remote site) if given explicit access through ACLs.
When downloading or uploading a file, only the user owning the API/CLI session is allowed to specify local absolute paths (e.g. C:\Users\John\Downloads). To send a file to a user you must create a Delivery, where the user then chooses where to download it (or what to upload if it is an upload request). For more information, see Delivery.
There is an exception to this: if the user has mapped a share locally and has given explicit write permissions to it, an elevated user is allowed to push files to the user's client using the mirror path option or explicit path notation (see examples below). The same goes for uploads - an elevated user can pull files from a remote user's computer and upload them to a workspace volume, if the user has given explicit read access to the corresponding locally mapped share. For more information on how to set up locally mapped shares, see Hosts.
When transferring files between sites, the user has to have the admin role, or be an employee with full access to the involved volume(s).
Following is a collection of sample job JSON snippets.
Download the file "lfm/114/290/lfm_114_290.jpg" from default volume to employee "emelie@mycompany.com", storing locally. Short simplified notation, single task:
{
"source":"lfm/114/290/lfm_114_290.jpg",
"destination":"D:/work/lfm/114/290/lfm_114_290.jpg"
}
Notes:
The workspace party is omitted here; this is allowed since the other, user party is clearly stated and no ambiguity exists as to the source party.
Nor is the source volume given, just a relative path. When no source volume or share is given, the default volume is assumed to be the source.
The employee emelie@mycompany.com needs to have download (read) access to the default volume.
Corresponding full expanded syntax for reference, assuming the default volume code is "projects":
{
"source":"mycompany:volume=projects/lfm/114/290/lfm_114_290.jpg",
"destination":"emelie@mycompany.com:D:/work/lfm/114/290/lfm_114_290.jpg"
}
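A payload like the one above can be written to a file and then referenced when submitting via the CLI (the exact CLI invocation is not covered here; consult the CLI help). A minimal Python sketch:

```python
import json
import os
import tempfile

# The expanded single-task payload from the example above.
job = {
    "source": "mycompany:volume=projects/lfm/114/290/lfm_114_290.jpg",
    "destination": "emelie@mycompany.com:D:/work/lfm/114/290/lfm_114_290.jpg",
}

# Write the payload to a JSON file that can be handed to the accsyn CLI.
path = os.path.join(tempfile.gettempdir(), "accsyn_job.json")
with open(path, "w") as f:
    json.dump(job, f, indent=4)
print("Wrote job JSON to", path)
```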
Upload folders "/Users/john/Desktop/delivery" & "references" to user john@acmefilm.co.uk's home share, and have the job reside in queue "low_prio". Here we use the relaxed task list syntax:
{
"tasks":[
{
"source":"/Users/john/Desktop/delivery",
"destination":"share=john@acmefilm.co.uk"
},
{
"source":"/Users/john/Desktop/references",
"destination":"share=john@acmefilm.co.uk"
}
],
"queue":"low_prio"
}
Standard user downloads two files from home share to local folder "C:\Users\John\Download\accsyn", using the full URI(key) tasks syntax:
{
"tasks":{
"0":{
"source":"20180413/script_v001.pdf",
"destination":"C:\\Users\\John\\Download\\accsyn\\"
},
"1":{
"source":"20180413/offline_v001.mp4",
"destination":"C:\\Users\\John\\Download\\accsyn\\"
}
}
}
Notes:
When no party or share is given, the home share is assumed to be the target, allowing the source path in this example to be specified as a relative path.
As mentioned above, leaving out the trailing slash would copy the file to a file named "accsyn" on the receiving side - not inside the folder as expected.
"0" and "1" are called a task's "uri" and are used to identify nested tasks, see Compute job specifications below.
Upload folder "/Users/john/Desktop/delivery" to shared folder "thefilm" into subfolder "from_john/20180413":
{"source":"/Users/john/Desktop/new_scans", "destination":"share=thefilm/from_john/20180413/"}
Note: Shares are identified either by their unique ID, or by their unique API "code" identifier.
Sync a folder on volume or share "projects" @ the main site (default: "hq") to site "berlin", paths mirrored (the default when sites are involved):
{ "source":"share=projects/thefilm/010", "destination":"site=berlin" }
Note: Sites can have explicit permissions that control which file operations are allowed.
Upload a folder from site "cloud" to the main site (default: "hq"):
{"source":"site=cloud:share=projects/got/sc01/sh01/render/got_sc01_sh01_comp_v012","destination":"mycompany"}
Sync a file and a folder from site "cloud" to site "berlin":
{"source":"site=cloud:share=projects/got/sc01/sh01/render/got_sc01_sh01_comp_v012",
"destination":"site=berlin"}
Upload a folder on share "projects" from site "berlin" back to hq, with metadata that can be picked up by hooks:
{"tasks":[{
"source":"site=berlin:projects/racing2019_grade/davinci_files",
"destination":"myorg",
"metadata":{"app":"davinci_resolve"}
}],
"metadata":{"artist":"malcolm"}
}
Upload file "final_export.mov", at share "projects" from user to share "racing2019_grade", but not overwriting it if exists and size or modification date differ:
{"tasks":[{
"source":"E:\racing2019_grade\davinci_files\final_export.mov",
"destination":"mycompany:share=racing2019_grade/FROM_EDIT/final_export.mov",
"settings":{"transfer_ignore_existing":"file"}
}]}
Upload a large folder, excluding all files ending with "tmp" and all files whose names are only numbers:
{"tasks":[{
"source":"F:\BIGGIE",
"destination":"mycompany:share=projects\__UPLOADS\BIGGIE",
"settings":{"transfer_exclude":"*tmp\,re('[0-9]')"}
}]}
Note: multiple exclude statements are separated by an escaped comma (\,). This means that an escaped comma cannot be used inside an individual exclude expression.
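Building the exclude string programmatically avoids escaping mistakes. A sketch (the helper is illustrative, not part of accsyn):

```python
def join_excludes(expressions):
    """Join exclude expressions with an escaped comma (\\,),
    as required by the transfer_exclude setting."""
    return "\\,".join(expressions)

settings = {"transfer_exclude": join_excludes(["*tmp", "re('[0-9]')"])}
print(settings["transfer_exclude"])
```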
Tasks (files) can have different priorities, enabling pre-delivery of some important files. Here is an example of downloading two files with the PDF prioritised:
{"tasks":[{
"source":"share=bidding/LFM/brief_v001.pdf",
"destination":"lisa@"mail.com:/Volumes/media/_TO_BID,
"priority":999
},{
"source":"share=bidding/LFM/material.rar",
"destination":"lisa@"mail.com:/Volumes/media/_TO_BID,
}]}
In this case, the file brief_v001.pdf will be sent first, then material.rar. accsyn priorities range from 1000 (highest) to 1 (lowest).
accsyn supports sending a subset of a numbered file sequence; this is handy when you do not want to send an entire directory containing a huge number of files, but instead want to make a selection.
Transfer part of a file sequence from one site to another, paths mirrored (requires a direct VPN connection, or both servers set up with NAT port forwarding of the accsyn protocol ports):
{"tasks":[{
"source":"site=berlin:myproj/images/movie.%04d.jpg[100-167]",
"destination":"site=london",
}]}
Notes:
The prefix "share=" can be omitted if the party is not a user, i.e. sending to/from the workspace hq/site.
Source/destination can be specified without tasks, for single file/directory transfers. Tasks can be a list instead of a JSON, each transfer task while then be numbered by the order they appear in list.
The following scenarios can also be submitted directly from the commandline, for example: "accsyn job create mycompany:/projects/lfm/114/290/lfm_114_290.jpg john@user.com:projects/lfm/114/290/lfm_114_290.jpg".
If the destination exists and is a directory, accsyn behaves like unix "rsync" - the file ends up beneath the folder.
If a file exists where accsyn was supposed to save a folder, the transfer will fail.
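For illustration, the numbered-sequence notation used in the site-to-site example above (a printf-style pattern plus an inclusive [first-last] range) expands to individual file names like this. This is a hypothetical expander written for this document, not accsyn code:

```python
import re

def expand_sequence(expr):
    """Expand e.g. 'movie.%04d.jpg[100-167]' into concrete file names."""
    m = re.fullmatch(r"(.*)\[(\d+)-(\d+)\]", expr)
    if not m:
        return [expr]  # not a sequence expression, pass through as-is
    pattern, first, last = m.group(1), int(m.group(2)), int(m.group(3))
    # The range is inclusive on both ends.
    return [pattern % n for n in range(first, last + 1)]

names = expand_sequence("movie.%04d.jpg[100-167]")
print(names[0], names[-1])  # movie.0100.jpg movie.0167.jpg
print(len(names))           # 68
```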
In some situations, a job needs to be created beforehand, so that tasks can be added shortly after. To achieve this, a placeholder job can be submitted:
{"tasks":[{
"source":"site=cloud:share=projects/nofile",
"destination":"site=hq",
"status":"excluded"
}]}
The job will immediately be set to done, with no files actually transferred.
Note: This example relies on the main site name being the default ("hq"); the expression can also be replaced with your workspace code/domain ("mycompany" in these examples).
accsyn does not only support rendering (for example, a single Houdini scene with the frame range split up into buckets over a pool of render servers); nested jobs with dependencies are also supported. Here is an example of a pipeline job that runs tasks through a custom "pipeline" engine:
{
"name": "Mocap shoot pipeline job - 260217",
"engine": "pipeline",
"settings": {
"task_bucketsize": 1
},
"filters": "",
"description": "Daily shoot post processing pipeline job.",
"tasks": {
"EP000_SC0110_SL02_PS01_TK01": {
"tasks": {
"pickup": {
"compute": {
"parameters": {
"action": "shoot-pickup"
}
},
"tasks": {
"0": {
"compute": {
"parameters": {
"file": "mocap.tak"
}
},
"description": "Pickup and name Mocap take."
},
"1": {
"compute": {
"parameters": {
"file": "ref-camera.mov"
}
},
"description": "Pickup and name floor reference camera."
},
"2": {
"compute": {
"parameters": {
"file": "sound.wav"
}
},
"description": "Pickup and name studio recorded audio."
}
},
"description": "Pickup and name files for take: EP000_SC0110_SL02_PS01_TK01"
},
"notify-pickup-done": {
"description": "Notify someone that we are done.",
"deps": [
"EP000_SC0110_SL02_PS01_TK01/pickup"
]
}
},
"metadata": {
"take_name": "EP000_SC0110_SL02_PS01_TK01",
"take_metadata_path": "Z:\\HFSUR\\06_Harvest\\260217\\EP000_SC0110_SL02_PS01_TK01.json"
},
"description": "Post process take: EP000_SC0110_SL02_PS01_TK01"
}
},
"metadata": {
"daily_path": "Z:\\HFSUR\\06_Harvest\\260217"
}
}
Explanation of the job JSON:
Top level engine attribute; tells accsyn to run all tasks using the engine "pipeline" (API code identifier).
Task EP000_SC0110_SL02_PS01_TK01; The main parent task to execute, in this example it relates to a take in a motion capture studio pipeline.
EP000_SC0110_SL02_PS01_TK01 "tasks" attribute; Tells accsyn that this task has sub tasks (nested), that will be executed instead of the task itself.
EP000_SC0110_SL02_PS01_TK01/pickup; Sub-task of EP000_SC0110_SL02_PS01_TK01, its compute parameters will be aggregated and made available to all subsequent tasks.
EP000_SC0110_SL02_PS01_TK01/pickup/0; Leaf task, will be executed first (bucket size = 1)
EP000_SC0110_SL02_PS01_TK01/pickup/1 & 2; Subsequent sub tasks.
EP000_SC0110_SL02_PS01_TK01/notify-pickup-done; Has a dependency on the "pickup" task, and will not execute until the pickup task (and all its sub-tasks) has executed successfully.
Metadata; Aggregated upstream and supplied upon execution, the same way compute data is.
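The dependency behaviour described above can be illustrated with a small readiness check. This is a sketch, not the engine's actual scheduler; tasks listed in a "deps" array only become runnable once those dependencies are done:

```python
def runnable(tasks, done):
    """Return task names whose dependencies are all satisfied.

    tasks: {name: {"deps": [...]}} where deps lists other task names.
    done: set of task names that have finished successfully.
    """
    return [
        name for name, spec in tasks.items()
        if name not in done and all(d in done for d in spec.get("deps", []))
    ]

tasks = {
    "EP000_SC0110_SL02_PS01_TK01/pickup": {},
    "EP000_SC0110_SL02_PS01_TK01/notify-pickup-done": {
        "deps": ["EP000_SC0110_SL02_PS01_TK01/pickup"]
    },
}
# Only "pickup" is runnable until it has finished.
print(runnable(tasks, done=set()))
```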
accsyn is a p2p file transfer protocol which means that each transfer job can only have one unique source and one unique destination party.
A source or destination party can be:
The workspace (organization); on-prem servers at the main site (hq), running the accsyn daemon app in server mode.
A user; Identified by their email, running the accsyn desktop app or the daemon app in user server mode.
A site; Identified by a unique name, running the accsyn daemon app in server mode.
Web browser; For downloads and uploads using web browser.
For the workspace party, source or destination files can reside on multiple volumes within the same job:
{
"code":"My download",
"tasks":[
{
"source":"share=share1/file.001",
"destination":"john@user.com:X:/download/"
},{
"source":"share=share2/file.002",
"destination":"john@user.com:X:/download/"
}
]
}
Example: Transfer one file from share1 and another from share2, that can reside on different servers.
Although no limits on the number of jobs are enforced by accsyn, when the number reaches 500+ a substantial performance degradation occurs and the UI becomes sluggish.
The recommended approach is to reduce the number of jobs and instead have multiple tasks within each job.
Consider an example where, in an automated workflow, the API submits a sync job for each updated file, every day. If thousands of files are updated, accsyn will soon reach its limits and the job listing in the desktop app/web app will be overflowed.
A better approach would be to let the API reuse a daily sync job for each source-destination pair (creating it if it does not exist) and add tasks to that job instead. This reduces the number of jobs drastically, and makes the job listing much more readable when it comes to finding other important jobs that would otherwise drown.
accsyn transfers files using the same algorithm as *NIX rsync, which means that a list of files with sizes and modification dates is sent to the receiving end in order to determine which files need to be sent.
For very large transfers containing a lot of smaller files in deep folder structures, accsyn might run out of RAM during file transfer init; in those cases it is recommended to split the job into multiple tasks.
For example, when doing a project backup with accsyn, instead of sending the entire root share or directory, send each project directory as an individual task and set the "task_bucketsize" setting to "1":
{
"code":"Daily backup",
"source":"share=raid01/projects",
"destination":"site=backup"
}
=>
{
"code":"Daily backup",
"tasks":[
{
"source":"share=raid01/projects/PROJ001",
"destination":"site=backup"
},{
"source":"share=raid01/projects/PROJ002",
"destination":"site=backup"
}
],
"settings":{"task_bucketsize":"1"}
}
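The split shown above can be generated from a list of project directories. A sketch (share and site names are taken from the example; the helper itself is illustrative):

```python
def backup_job(projects, share="raid01", destination="site=backup"):
    """Build a backup job with one task per project directory,
    processed one at a time (task_bucketsize = 1)."""
    return {
        "code": "Daily backup",
        "tasks": [
            {"source": f"share={share}/projects/{project}",
             "destination": destination}
            for project in projects
        ],
        "settings": {"task_bucketsize": "1"},
    }

job = backup_job(["PROJ001", "PROJ002"])
print(job["tasks"][0]["source"])  # share=raid01/projects/PROJ001
```

In a real workflow the project list would come from listing the projects directory, so newly added projects are picked up automatically.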
The following job settings can be provided on submit: