Get Started with Manager+Agents REST API

The Manager+Agents native REST API allows you to automate transfer jobs, manage users, and control storage settings for your transfer environments by sending API calls to your Manager.

For information about the Manager+Agents SOAP and FIMS-compliant APIs, see their respective development guides.

Architecture

Architecture flowchart

Making API Requests

All API requests to your Manager must be sent to valid endpoints, which can be accessed via:

https://your.manager.url/signiant/spring/admin/v1.0/

When making requests to the API, you must include headers with a username, a password, and Content-Type: application/json.

Authenticating and Authorizing Users

If you have created an additional user, you must enable their access to the administration interface.

To authorize a user:

  1. In your Manager, navigate to Administration > Users > List.
  2. Under Roles, select Administration Interface Login.

Note: For improved performance when API calls are being sent to the Manager, enable Improve web service API performance with reduced security auditing on the General tab. However, this setting prevents successful and failed authentication attempts from being recorded in the Manager logs.

Making A Sample Request

To confirm that you can authenticate with your Manager, perform a simple test that returns a response. Using a terminal, make a curl request to the /listusers endpoint:

curl -X GET -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/listusers

You should receive a JSON response that includes your Manager’s users:


[
  {
    "user": {
      "id": 12345,
      "userName": "Administrator",
      "state": {
        "status": "Activated",
        "lastLoginAttempt": "2018-01-20T09:03:19",
        "lastSuccessfulLogin": "2018-01-20T09:03:19",
        "lastLoginResult": "Success",
        "guest": false
      },
      "fields": {
        "firstName": "First Name",
        "lastName": "Last Name",
        "organization": "Your Organization Name",
        "mediaExchange": null
      }
    }
  }
]

Job Parameters

Job parameters are string key/value pairs contained within the request body that define the overall settings for a job. Job parameters accept any valid string syntax, including URLs, local network storage paths, CIFS paths (Windows shares), Windows drive letters, and NFS paths.
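For example, the same source data parameter (taken from the MediaReplicator example later in this guide) can point at a local drive, a CIFS share, or an NFS mount. The paths below are illustrative placeholders:

"MediaReplicator.Source.SourceData": "c:\\media\\incoming"
"MediaReplicator.Source.SourceData": "\\\\fileserver\\share\\incoming"
"MediaReplicator.Source.SourceData": "/mnt/nfs/incoming"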

To view a complete list of available job parameters:

  1. In your Manager, navigate to Jobs > Templates.
  2. Select the appropriate job template library and click Open Canvas.
  3. Right-click on the first component in the appropriate job template and select Show Web Services Info.

Note: Job parameters are case sensitive.

Creating a Job

Every job is based on a specified job template. Manager+Agents provides predefined job templates which allow file transfers between supported storage types:

  • Media Mover provides job templates used for transfers in file system storage.
  • Object Mover provides job templates used for object storage file transfers.

Each job is identified by its name and job group.

Note: Job names must start with a letter and can only contain letters and numbers.

After a job is created via the API, it is visible in the Manager and can be used like any other job.

Note: It is recommended to create a new job when transferring files using the API. Creating a unique job provides more specific logging if a transfer fails.

Configuring Media Mover Jobs

The Media_Mover_Workflows Job Template Library includes the following default job templates:

Job Template       Type   Description of Transfer
MediaReplicator    Push   One-to-one, one-to-many - optimized for many small files
MediaDistributor   Push   One-to-one, one-to-many
MediaDropBox       Push   One-to-one, one-to-many
MediaAggregator    Pull   One-to-one, many-to-one

You can also use the API to call any custom templates you have created. To create a job using the API, you must send an API request containing a JSON body that includes the required parameters for the job template.

You can include more than one job in each request, up to a limit that depends on your Manager configuration.

Media Mover Job Example

In this example, MediaReplicator transfers files from a Windows Agent to a Linux Agent.

Request Body

[
  {
    "job": {
      "jobName": "MyJobName",
      "fields": {
        "jobGroupName": "MyJobGroupName",
        "jobTemplateLibraryName": "Media_Mover_Workflows",
        "jobTemplateName": "MediaReplicator",
        "jobArgs": {
          "MediaReplicator.Source.SourceAgent": "your.source.agent",
          "MediaReplicator.Source.SourceData": "c:\\path\\to\\source\\data",
          "MediaReplicator.Target.TargetAgents": "your.target.agent",
          "MediaReplicator.Target.TargetDirectory": "/target/directory",
          "MediaReplicator.Schedule._sp_frequency": "once"
        }
      }
    }
  }
]

Note: Both forward slashes and backslashes are supported on all operating systems. Each backslash must be escaped by an additional backslash.
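To submit the request body, POST it to the Manager with the same authentication headers used earlier, plus the Content-Type header. The command below is a sketch that assumes the job creation endpoint is /jobs and that the request body is saved as job.json; confirm the exact endpoint against your Manager's API reference:

curl -X POST -H "username:<userName>" -H "password:<password>" -H "Content-Type: application/json" -d @job.json https://manager.url/signiant/spring/admin/v1.0/jobs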

Once the Manager receives a valid request, it responds with the names and IDs of the jobs created via the API request.

Response

{
  "creator": "UserName",
  "jobs": [
    {
      "id": 12345678,
      "jobName": "MyJobName"
    }
  ]
}

Configuring Object Mover Jobs

Object Mover allows transfers to and from S3-compatible, Amazon S3, and Microsoft Azure cloud object storage.

Object Mover offers the following job templates:

Job Template       Type   Source           Destination
ObjectUploader     Pull   File System      Object Storage
ObjectDownloader   Push   Object Storage   File System
ObjectReplicator   Push   Object Storage   Object Storage
ObjectDropBox      Push   File System      Object Storage

Note: An Object Agent can be configured to transfer files using S3 compatible object storage or using Amazon S3 or Microsoft Azure storage, but not both.

You can also use the API to call any custom templates you have created. To create a job using the API, you must send an API request containing a JSON body that includes the required parameters for the job template.

You can include more than one job in each request, up to a limit that depends on your Manager configuration.

Using Object Storage Profiles

Object Mover job parameters require an object storage profile. Object Storage Profiles allow you to use the same set of credentials for object storage across multiple jobs without needing to re-enter credentials.

Note: More information on Object Storage Profiles is available via Signiant Help.

Using an object storage profile when creating a job requires a JSON key/value pair within the SourceData or TargetData properties. Quotes around the storage profile name must be escaped:

...
"jobArgs":{
                    "ObjectUploader.Source.SourceAgent":     "example-s3-storage",
                    "ObjectUploader.Source.SourceData":      "{\"name\":\"profileName\"}"

...                    

If desired, you can specify a custom sub-folder for the job transfer, which also requires escaped quotes:

...
"jobArgs":{
                    "ObjectUploader.Source.SourceAgent":     "example-s3-storage",
                    "ObjectUploader.Source.SourceData":      "{\"name\":\"profileName\",\"subfolder\":\"source_folder/\"}"

...

Object Mover Job Example

In this example, ObjectUploader transfers files from a Linux Agent to Object Storage.

Request Body

[
     {
        "job":{
            "jobName":                                        "MyJobName",
            "fields":{
                "jobGroupName":                               "MyJobGroupName",
                "jobTemplateLibraryName":                     "Object_Mover_Workflows",
                "jobTemplateName":                            "ObjectUploader",
                "jobArgs":{
                    "ObjectUploader.Source.SourceAgent":     "source-agent-url",
                    "ObjectUploader.Source.SourceData":      "/source-directory",
                    "ObjectUploader.Target.TargetAgents":    "target-agent-url",
                    "ObjectUploader.Target.TargetDirectory": "{\"name\":\"profileName\"}",
                    "ObjectUploader.Schedule._sp_frequency": "once"
                }
            }
        }
    }
]

Once the Manager receives a valid request, it responds with the names and IDs of the jobs created via the API request.

Response

{
  "creator": "UserName",
  "jobs": [
    {
      "id": 12345678,
      "jobName": "MyJobName"
    }
  ]
}

Deleting Jobs

Once a job is complete, you have the option to delete it.

There are two options for deleting a job via the API:

  • Soft Delete: Marks the job as deleted in the Manager, but the database records are kept for reporting purposes until they are deleted by a maintenance job.

  • Hard Delete: Immediately deletes the job and all associated database records. This choice prevents database growth that can impact performance.

To soft delete a job, make a DELETE request to /jobs/MyJobName/MyJobGroupName.
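For example, using the same authentication headers as the earlier requests (substitute your own job name, job group, and Manager URL):

curl -X DELETE -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/jobs/MyJobName/MyJobGroupName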

To hard delete a job, send a DELETE request to the jobs endpoint containing a request body with the jobGroupName and jobName:

[
    {
        "job":{
            "fields":{
               "jobGroupName":"MyJobGroupName"
            },
            "jobName": "MyJobName"
        }
    }
]
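A matching curl sketch, assuming the request body above is saved as delete.json:

curl -X DELETE -H "username:<userName>" -H "password:<password>" -H "Content-Type: application/json" -d @delete.json https://manager.url/signiant/spring/admin/v1.0/jobs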

Configuring API Integration For Agent Groups

To improve efficiency when transferring files directly to or from specific Agents, you can configure Agent groups.

Agent groups allow you to define a set of Agents with a specific group name. Agents can also be swapped in and out of an Agent group without needing to reconfigure your integration.

Agent groups can be flagged as load-balanced, to provide redundancy and scalability when a job runs between Agents in the group.

For more information, see Configuring Agent Groups.

Agent groups are always associated with a specific organization. When configuring a transfer to run to or from a given Agent group, the organization’s API Identifier must be included in the request body.

You can find an organization’s API Identifier on the Administration > Manager > Organizations menu.

To use an organization’s Agent group, use the format <AgentGroupName>!<OrgId>, for example, agentGroupName!4321.

Transferring Data From An Agent Group

Example Request Body

[
    {
        "job":{
           "jobName":                                           "MyJobName",
           "fields":{
                "jobGroupName":                                 "MyJobGroupName",
                "jobTemplateLibraryName":                       "Media_Mover_Workflows",
                "jobTemplateName":                              "MediaReplicator",
                "jobArgs":{
                    "MediaReplicator.Source.SourceAgent":       "agentGroupName!4321",
                    "MediaReplicator.Source.SourceData":        "c:/source/-directory",
                    "MediaReplicator.Target.TargetAgents":      "target-agent-url",
                    "MediaReplicator.Target.TargetDirectory":   "c:/target-directory",
                    "MediaReplicator.Schedule._sp_frequency":   "once"
                }
           }
        }
    }
]

Checking Job Status

The job status is maintained across two fields:

  • activeState: What the job is currently doing
  • scheduledState: What the job is scheduled to do in the future

For example, a job set to run once that shows an activeState of RUNNING will display a scheduledState of DORMANT, as no other runs are scheduled for that job.

Returned statistics show started workflow components. For standard job templates, the first component to run is the file transfer component. This is the only component which takes measurable execution time.

Note: Statistics are processed by the Manager every 15 seconds. The order of the response containing job statistics can vary.

To request job status, make a GET request to /jobs/MyJobName/MyJobGroupName.
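For example, using the same header scheme as the earlier requests:

curl -X GET -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/jobs/MyJobName/MyJobGroupName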

Example Response

[
    {
      "job": {
        "id":                        12345678,
        "jobName":                   "MyJobName",
        "fields": {
          "jobGroupName":            "MyJobGroupName",
          "jobTemplateLibraryName":  "Media_Mover_Workflows",
          "jobTemplateName":         "MediaReplicator",
          "activeState":             "IDLE",
          "scheduledState":          "DORMANT",
          "lastExitCode":            0,
          "percentComplete":         "100%",
          "lastActiveStatusMessage": "",
          "activeFilename":          "",
          "jobArgs": {
            ...
          }
        }
      }
   }
]

Job Commands

Once you have created a job, you can send commands through the API to control the job.

Valid job commands are:

  • force - Start the job immediately.
  • kill - Cancel the running job.
  • suspend - Prevent future scheduled runs of the job.
  • resume - Resume a suspended job.
  • delete - Remove the job from the Manager.
  • setbwlimits - Set the bandwidth limits for a job.
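For example, to start a job immediately, send the force command using the same /jobs/command/ URL pattern shown in the bandwidth limit examples below (a sketch; substitute your own job name, job group ID, and Manager URL):

curl -X GET -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/jobs/command/<jobname>/<jobGroupId>/force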

Bandwidth Limits

The setbwlimits command allows you to set job-specific resource controls for the following bandwidth settings:

  • Maximum Speed
  • Bandwidth Floor
  • Bandwidth Ceiling

Example Requests

curl -X GET -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/jobs/command/<jobname>/<jobGroupId>/setbwlimits_<maximumSpeed>:<bandwidthFloor>:<bandwidthCeiling>

To remove a bandwidth limit, pass the command again with a 0 for any limits you want to remove:

curl -X GET -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/jobs/command/<jobname>/<jobGroupId>/setbwlimits_0:0:0

If you do not need to set all three available limits, leave that parameter blank. For example, to exclude the Bandwidth Floor parameter, use the following syntax:

curl -X GET -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/jobs/command/<jobname>/<jobGroupId>/setbwlimits_MaximumSpeed>::<bandwidthCeiling>

Troubleshooting Job Failures

When a job fails, troubleshooting details are available in the Manager’s job logs.

To view a job log:

  1. In your Manager, navigate to Jobs > Groups.
  2. Select the Job Group and click View Jobs.
  3. Select the failed job and click Details.
  4. Under Job Logs, select Job Log and click View. Log entries can be sorted according to error severity.

A REST API endpoint is available to retrieve the job log programmatically for display in a third-party application.

Implementing Custom Workflows

You can create custom workflows to fit your business requirements. Exporting and importing workflows via the Manager or the SOAP API allows you to distribute a custom workflow across different Manager installations.

Custom workflow component integration is generally managed at the API level where the files within the workflow are directed to a MAM, DAM, or other application. The Signiant job monitors the progress and status of the third-party processing, and reports results to the Manager.

The default components included with every Agent installation are written in Perl, but workflow components can be written in Ruby, Node.js, or other languages.