Get Started with Manager+Agents REST API

The Manager+Agents native REST API allows you to automate transfer jobs, manage users, and control storage settings for your transfer environments by sending API calls to your Manager.

Read the Manager+Agents REST API documentation

Manager+Agents also provides legacy SOAP and FIMS-compliant APIs.


Manager+Agents Job Architecture Diagram

Making API Requests

All API requests to your Manager must be sent to valid endpoints under the base URL:

https://manager.url/signiant/spring/admin/v1.0/

When making requests to the API, you must include headers containing a username, a password, and Content-Type: application/json.

API Call Limit

Rate limits are designed to let the Manager reliably process API requests for transfer control and administration over time.

The Manager uses a token bucket algorithm to allow many jobs to be processed at once without overwhelming the application.

  • All Signiant Managers have a bucket size of 500 tokens
  • Creating a job via API requires 5 tokens
  • All other requests require 1 token

Tokens are replenished to the bucket once a call completes. If the request limit is exceeded, an HTTP 429 Too Many Requests error is returned and the call is not completed. To avoid having to repeat unprocessed requests, ensure that your integration limits the number of outstanding requests to the token bucket size.
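To stay under the limit client-side, an integration can mirror the token bucket: take tokens before each call and return them when the call completes. A minimal sketch using the bucket size and per-call costs listed above (the error handling and threading model here are client-side assumptions, not part of the Manager API):

```python
import threading

class TokenBucket:
    """Client-side guard mirroring the Manager's 500-token bucket.

    Tokens are taken before a call (5 for job creation, 1 otherwise)
    and returned when the call completes, matching the replenishment
    behavior described above. Raises RuntimeError rather than letting
    the Manager answer 429 Too Many Requests.
    """
    def __init__(self, capacity=500):
        self.capacity = capacity
        self.available = capacity
        self.lock = threading.Lock()

    def acquire(self, cost=1):
        with self.lock:
            if self.available < cost:
                raise RuntimeError("would exceed rate limit; retry later")
            self.available -= cost

    def release(self, cost=1):
        with self.lock:
            self.available = min(self.capacity, self.available + cost)

bucket = TokenBucket()
bucket.acquire(cost=5)   # creating a job costs 5 tokens
# ... make the job-creation request here ...
bucket.release(cost=5)   # tokens return once the call completes
```

A production integration would typically block and retry instead of raising, but the accounting is the same.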

Authenticating and Authorizing Users

If you have created an additional user, you must enable their access to the administration interface.

To authorize a user:

  1. In your Manager, navigate to Administration > Users > List.
  2. Under Roles, select Administration Interface Login.

Note: For improved performance when API calls are being sent to the Manager, enable Improve web service API performance with reduced security auditing on the General tab. However, this setting will prevent successful or failed authentications for a user from being recorded in the Manager logs.

Making A Sample Request

To make sure that you can authenticate with your Manager, perform a simple test that returns a response. Using a terminal, make a curl request to the /listusers endpoint:

curl -X GET -k -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/listusers
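The same test can be run from Python using only the standard library. This sketch builds the request without sending it; manager.url and the credential placeholders mirror the curl example above:

```python
import json
import urllib.request

MANAGER = "https://manager.url"  # placeholder for your Manager address

# Build the same GET request as the curl example; credentials are
# passed as custom headers, not HTTP basic auth.
req = urllib.request.Request(
    f"{MANAGER}/signiant/spring/admin/v1.0/listusers",
    headers={
        "username": "<userName>",
        "password": "<password>",
        "Content-Type": "application/json",
    },
    method="GET",
)

# Sending it requires network access to the Manager:
# with urllib.request.urlopen(req) as resp:
#     users = json.loads(resp.read())
```

Note that urllib normalizes header capitalization; HTTP header names are case-insensitive, so this matches the lowercase names used by curl.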

You should receive a JSON response that includes the Manager user list:

    "user": {
      "id": 12345,
      "userName": "Administrator",
      "state": {
        "status": "Activated",
        "lastLoginAttempt": "2018-01-20T09:03:19",
        "lastSuccessfulLogin": "2018-01-20T09:03:19",
        "lastLoginResult": "Success",
        "guest": false
      "fields": {
        "firstName": "First Name",
        "lastName": "Last Name",
        "organization": "Your Organization Name",
        "mediaExchange": null

Job Parameters

Job parameters are string key/value pairs contained within the request body that define the overall settings for a job. Job parameters accept any valid string syntax, including URLs, local network storage paths, CIFS paths (Windows shares), Windows drive letters, and NFS paths.

To view a complete list of available job parameters:

  1. In your Manager, navigate to Jobs > Templates.
  2. Select the appropriate job template library and click Open Canvas.
  3. Right-click on the first component in the appropriate job template and select Show Web Services Info.

Note: Job parameters are case sensitive.

Creating a Job

Every job is based on a specified job template. Manager+Agents provides predefined job templates which allow file transfers between supported storage types:

  • Media Mover provides job templates used for transfers in file system storage.
  • Object Mover provides job templates used for object storage file transfers.

Each job is identified by its name and job group.

Note: Job names must start with a letter and can only contain letters and numbers.

After a job is created via the API, it is visible in the Manager and can be used like any other job.

Note: It is recommended to create a new job when transferring files using the API. Creating a unique job provides more specific logging if a transfer fails.

Configuring Media Mover Jobs

The Media_Mover_Workflows Job Template Library includes four default job templates:

  Job Template       Type   Description of Transfer
  MediaReplicator    Push   One-to-one, one-to-many - optimized for many small files
  MediaDistributor   Push   One-to-one, one-to-many
  MediaDropBox       Push   One-to-one, one-to-many
  MediaAggregator    Pull   One-to-one, many-to-one

You can also use the API to call any custom templates you have created. To create a job using the API, you must send an API request containing a JSON body that includes the required parameters for the job template.

You can include more than one job in each request, up to a limit that depends on your Manager configuration.

Configuring Object Mover Jobs

Object Mover allows transfers involving S3-compatible, Amazon S3, or Microsoft Azure cloud object storage.

Object Mover offers four job templates:

  Job Template       Type   Source           Destination
  ObjectUploader     Pull   File System      Object Storage
  ObjectDownloader   Push   Object Storage   File System
  ObjectReplicator   Push   Object Storage   Object Storage
  ObjectDropBox      Push   File System      Object Storage

Note: An Object Agent can be configured to transfer files using S3 compatible object storage or using Amazon S3 or Microsoft Azure storage, but not both.

You can also use the API to call any custom templates you have created. To create a job using the API, you must send an API request containing a JSON body that includes the required parameters for the job template.

You can include more than one job in each request, up to a limit that depends on your Manager configuration.

Using Object Storage Profiles

Object Mover job parameters require an object storage profile. Object Storage Profiles allow you to use the same set of credentials for object storage across multiple jobs without needing to re-enter credentials.

Note: More information on Object Storage Profiles is available via Signiant Help.

Using an object storage profile when creating a job requires a JSON key/value pair within the SourceData or TargetData properties. Quotes around the storage profile name must be escaped:

                    "ObjectUploader.Source.SourceAgent":     "example-s3-storage",
                    "ObjectUploader.Source.SourceData":      "{\"name\":\"Profile Name\"}"


If desired, you can specify a custom sub-folder for the job transfer, which also requires escaped quotes:

                    "ObjectUploader.Source.SourceAgent":     "example-s3-storage",
                    "ObjectUploader.Source.SourceData":      "{\"name\":\"Profile Name\",\"subfolder\":\"source_folder/\"}"


Multiple Agents and Data Sources

The SourceAgents and TargetAgents JSON keys accept more than one Agent per job, depending on the job template and user permissions.

For example, a MediaAggregator job that transfers files from two Agents lists both Agents in the job JSON, separated by a space:

"MediaAggregator.Source.SourceAgents": " a.linux.agent b.linux.agent",

Jobs that use more than one data source use the siglist XML elements to list the files or folders included in a job.

The <siglist> tag accepts a type attribute that is dependent on the source. Transfers from file systems use filedir to locate files using a path to a file or directory. Transfers from object storage use multilineval to locate files by name.

<el> elements within siglist accept the following attributes:

  • v - The path to the file
  • t - Transfers using filedir must set a data type:
    • d - Directory
    • f - File

The following example shows XML markup for transfers from a file system:

<siglist type="filedir">
  <el v="c:\path\to\transfer\source" t="d"></el>
  <el v="/path/to/source" t="d"></el>
  <el v="/path/to/file.mp4" t="f"></el>

The following example shows XML markup for transfers from object storage:

<siglist type="multilineval">
  <el v="/sourcefile_1"></el>
  <el v="/sourcefile_2"></el>

When used as a JSON value in an API call, all quotes (") and backslashes (\) must be escaped in the relevant key's value:

    "job": {
      "jobName": "MediaAggregatorAPI",
      "fields": {
        "jobGroupName": "MediaAggregator",
        "jobArgs": {
          "MediaAggregator.Source.SourceAgents": " b.linux.agent",
          "MediaAggregator.Source.SourceData": "<siglist type=\"filedir\"><el v=\"c:\\path\\to\\transfer\\source\" t=\"d\"></el><el v=\"/path/to/source\" t=\"d\"></el></siglist>",

Note: Transfer jobs automatically detect which Agent to use as a source for a file or directory by determining whether the path can be accessed on the job’s related Agents. If the same path exists on more than one Agent included in the API call, data from all relevant Agents will transfer in the job.
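The escaping can likewise be delegated to the serializer: write the siglist as an ordinary string and dump the whole body, and the quotes and backslashes are escaped automatically. A sketch using the MediaAggregator example above:

```python
import json

# The siglist is a plain string; json.dumps handles the escaping.
siglist = (
    '<siglist type="filedir">'
    '<el v="c:\\path\\to\\transfer\\source" t="d"></el>'
    '<el v="/path/to/source" t="d"></el>'
    '</siglist>'
)

body = {
    "job": {
        "jobName": "MediaAggregatorAPI",
        "fields": {
            "jobGroupName": "MediaAggregator",
            "jobArgs": {
                "MediaAggregator.Source.SourceAgents": " b.linux.agent",
                "MediaAggregator.Source.SourceData": siglist,
            },
        },
    },
}

print(json.dumps(body, indent=2))
```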

Deleting Jobs

Once a job is complete, you have the option to delete it.

There are two options for deleting a job via the API:

  • Soft Delete: Marks the job as deleted in the Manager, but the database records are kept for reporting purposes until they are deleted by a maintenance job.

  • Hard Delete: Immediately deletes the job and all associated database records. This choice prevents database growth that can impact performance.

To soft delete a job, make a DELETE request to /jobs/MyJobName/MyJobGroupName.

To hard delete a job, send a DELETE request to /jobs containing a request body with the jobGroupName and jobName:

    "job": {
      "fields": {
        "jobGroupName": "MyJobGroupName"
      "jobName": "MyJobName"

Configuring API Integration For Agent Groups

To improve efficiency when transferring files directly to or from specific Agents, you can configure Agent groups.

Agent groups allow you to define a set of Agents with a specific group name. Agents can also be swapped in and out of an Agent group without needing to reconfigure your integration.

Agent groups can be flagged as load-balanced, to provide redundancy and scalability when a job runs between Agents in the group.

For more information, see Configuring Agent Groups.

Agent groups are always associated with a specific organization. When configuring a transfer to run to or from a given Agent group, the organization’s API Identifier must be included in the request body.

You can find an organization’s API Identifier on the Administration > Manager > Organizations menu.

To use an organization's Agent group, use the format <AgentGroupName>!<OrgId>, e.g. agentGroupName!4321
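A small helper keeps this format consistent across an integration. A sketch (the group name and organization ID are the examples above):

```python
def agent_group_ref(group_name: str, org_id: int) -> str:
    """Format an Agent group reference as <AgentGroupName>!<OrgId>."""
    return f"{group_name}!{org_id}"

agent_group_ref("agentGroupName", 4321)  # → "agentGroupName!4321"
```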

Transferring Data From An Agent Group

Example Request Body

    "job": {
      "jobName": "MyJobName",
      "fields": {
        "jobGroupName": "MyJobGroupName",
        "jobTemplateLibraryName": "Media_Mover_Workflows",
        "jobTemplateName": "MediaReplicator",
        "jobArgs": {
          "MediaReplicator.Source.SourceAgent": "agentGroupName!4321",
          "MediaReplicator.Source.SourceData": "c:/source/-directory",
          "MediaReplicator.Target.TargetAgents": "target-agent-url",
          "MediaReplicator.Target.TargetDirectory": "c:/target-directory",
          "MediaReplicator.Schedule._sp_frequency": "once"

Checking Job Status

The job status is maintained across two fields:

  • activeState: What the job is currently doing
  • scheduledState: What the job is scheduled to do in the future

For example, a job set to run once that shows an activeState of RUNNING will display a scheduledState of DORMANT, as no further runs are scheduled for that job.

Returned statistics show started workflow components. For standard job templates, the first component to run is the file transfer component. This is the only component which takes measurable execution time.

Note: Statistics are processed by the Manager every 15 seconds. The order of fields in the response containing job statistics can vary.

To request job status, make a GET request to /jobs/MyJobName/MyJobGroupName.

Example Response

      "job": {
        "id":                        12345678,
        "jobName":                   "MyJobName",
        "fields": {
          "jobGroupName":            "MyJobGroupName",
          "jobTemplateLibraryName":  "Media_Mover_Workflows",
          "jobTemplateName":         "MediaReplicator",
          "activeState":             "IDLE",
          "scheduledState":          "DORMANT",
          "lastExitCode":            0,
          "percentComplete":         "100%",
          "lastActiveStatusMessage": "",
          "activeFilename":          "",
          "jobArgs": {

Job Commands

Once you have created a job, you can send commands through the API to control the job.

Valid job commands are:

  • force - Start the job immediately.
  • kill - Cancel the running job.
  • suspend - Prevent the job from running automatically at future scheduled times.
  • resume - Resume a suspended job.
  • delete - Remove the job from the Manager.
  • setbwlimits - Set bandwidth limits for the job.

Bandwidth Limits

The setbwlimits command allows you to set a job-specific resource control with the following bandwidth limits:

  • Maximum Speed
  • Bandwidth Floor
  • Bandwidth Ceiling

Example Requests

curl -X GET -k -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/jobs/command/<jobname>/<jobGroupId>/setbwlimits_<maximumSpeed>:<bandwidthFloor>:<bandwidthCeiling>

To remove a bandwidth limit, pass the command again with a 0 for any limits you want to remove:

curl -X GET -k -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/jobs/command/<jobname>/<jobGroupId>/setbwlimits_0:0:0

If you do not need to set all three available limits, leave the unused parameter blank. For example, to exclude the Bandwidth Floor parameter, use the following syntax:

curl -X GET -k -H "username:<userName>" -H "password:<password>" https://manager.url/signiant/spring/admin/v1.0/jobs/command/<jobname>/<jobGroupId>/setbwlimits_<maximumSpeed>::<bandwidthCeiling>
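The setbwlimits command path can be assembled programmatically so that unset limits are left blank. A sketch (limit values are illustrative; the endpoint segments follow the curl examples above):

```python
def setbwlimits_path(job_name, job_group_id, maximum_speed=None,
                     bandwidth_floor=None, bandwidth_ceiling=None):
    """Build the setbwlimits command path; unset limits become empty fields."""
    limits = ":".join(
        "" if v is None else str(v)
        for v in (maximum_speed, bandwidth_floor, bandwidth_ceiling)
    )
    return f"/jobs/command/{job_name}/{job_group_id}/setbwlimits_{limits}"

setbwlimits_path("MyJob", 42, maximum_speed=10000, bandwidth_ceiling=50000)
# → "/jobs/command/MyJob/42/setbwlimits_10000::50000"
```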

Troubleshooting Job Failures

When a job fails, troubleshooting details are available in the Manager’s job logs.

To view a job log:

  1. In your Manager, navigate to Jobs > Groups.
  2. Select the Job Group and click View Jobs.
  3. Select the failed job and click Details.
  4. Under Job Logs, select Job Log and View. Log entries can be sorted according to the severity of error.

A REST API endpoint is available to retrieve the job log programmatically for display in a third-party application.

Implementing Custom Workflows

You can create custom workflows to fit your business requirements. Exporting and importing workflows via the Manager or the SOAP API allows you to distribute a custom workflow across different Manager installations.

Custom workflow component integration is generally managed at the API level where the files within the workflow are directed to a MAM, DAM, or other application. The Signiant job monitors the progress and status of the third-party processing, and reports results to the Manager.

The default components included with every Agent installation are written in Perl, but workflow components can be written in Ruby, Node.js, or other languages.