Tags: autodesk-forge, power-automate, autodesk-data-management, power-automate-custom-connector

Implementing Autodesk APS in Microsoft Power Automate Custom Connector


I'm working on creating my own custom connector in an MS Power Automate flow to call Autodesk's [Data Management API], and I can't get step 5 of their documentation (direct upload to a signed S3 URL) to work in my flow.

I tested the rest of the process in Postman and it works there. I also tested step 5 with a curl command in a .sh script (just like their example), and can confirm that the signed URL is properly generated and the upload completes just fine.
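In other words, the working step-5 request is just a plain PUT with the file bytes as the body and no Authorization header, since the credentials are embedded in the URL's query string. A minimal Python sketch of that request (the signed URL below is a made-up placeholder):

```python
import urllib.request

def build_signed_put(signed_url: str, data: bytes) -> urllib.request.Request:
    """Build a plain PUT against the signed S3 URL. No Authorization
    header is needed: the credentials are embedded in the URL's query
    string, so the URL must be used verbatim."""
    return urllib.request.Request(
        url=signed_url,
        data=data,
        method="PUT",
        headers={"Content-Type": "application/octet-stream"},
    )

# Made-up signed URL for illustration only; the real one comes from the
# Data Management API's signed-upload step.
req = build_signed_put(
    "https://com-autodesk-oss-direct-upload.s3-accelerate.amazonaws.com"
    "/file.rvt?X-Amz-Signature=abc123",
    b"file bytes here",
)
# Actually sending it would be: urllib.request.urlopen(req)
```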

There are two ways I've tried executing the upload in Power Automate:

  1. Using an HTTP action that takes the signed URL as the URI, "PUT" as the method, and my file content in the body. Here's the code view of this action:

    {
      "type": "Http",
      "inputs": {
        "uri": "[REDACTED SIGNED URL]",
        "method": "PUT",
        "body": "@body('Get_file_content')"
      },
      "runAfter": {
        "Upload_file_to_S3_signed_URL": [
          "Succeeded"
        ]
      },
      "runtimeConfiguration": {
        "contentTransfer": {
          "transferMode": "Chunked"
        }
      }
    }
    

    The error I get in this case is as follows (no actual output, just the error):

    InvalidProtocolResponse

    The response to partial content upload initiating request is not valid. The response to initiating partial content upload request must contain a valid location header.

  2. I built a dedicated custom connector (because the host name for the S3 upload is different from the other Autodesk APIs). Here's the Swagger definition:

    swagger: '2.0'
    info:
      title: APS S3 Signed URL Uploader
      description: Direct upload to S3 signed URL generated by the Data Management API.
      version: '1.0'
    host: com-autodesk-oss-direct-upload.s3-accelerate.amazonaws.com
    basePath: /
    schemes:
      - https
    consumes:
      - application/octet-stream
    produces:
      - application/json
    paths:
      /upload:
        put:
          summary: Upload file to S3 signed URL
          operationId: UploadToURL
          parameters:
            - name: signedURL
              in: query
              required: true
              type: string
              description: Signed URL generated by the Data Management API.
            - name: fileContent
              in: body
              required: true
              schema:
                type: string
                format: binary
                description: Binary file content to upload.
          responses:
            '200':
              description: Upload successful
              schema:
                type: object
                properties:
                  message:
                    type: string
                    example: File uploaded successfully.
            '400':
              description: Bad Request
            '500':
              description: Internal Server Error
    securityDefinitions: {}
    
    

    Running it this way, I get a 403 error with the following output body:

    <?xml version="1.0" encoding="UTF-8"?>
    <Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>V48YVK8JA62E8GX3</RequestId><HostId>z6CEun8FVfqQEB1ostuWqeWJUeEMtd9L0c4X5NsS70akh7sMSL6S3uC/I53IikjDD6waL5MjM/0=</HostId></Error>
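For the first approach (the plain HTTP action), one likely culprit is `"transferMode": "Chunked"`: Power Automate's chunked upload uses Microsoft's own handshake, in which the initiating request must be answered with a `Location` header (exactly what the `InvalidProtocolResponse` error complains about), and S3 signed URLs don't implement that protocol. A sketch of the same action with the chunking setting removed, assuming the file fits within the HTTP action's non-chunked message size limit:

```json
{
  "type": "Http",
  "inputs": {
    "uri": "[REDACTED SIGNED URL]",
    "method": "PUT",
    "body": "@body('Get_file_content')"
  },
  "runAfter": {
    "Upload_file_to_S3_signed_URL": [
      "Succeeded"
    ]
  }
}
```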
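My best guess at why the second approach fails: the signature in an S3 signed URL covers the exact host, path, and query string, so re-composing the request as `/upload?signedURL=...` against a fixed host can no longer match what was signed. A quick sketch of the mismatch (the signed URL is made up for illustration):

```python
from urllib.parse import urlsplit

# Made-up signed URL for illustration only.
signed_url = (
    "https://com-autodesk-oss-direct-upload.s3-accelerate.amazonaws.com"
    "/bucket/object.rvt?X-Amz-Credential=AKID&X-Amz-Signature=abc123"
)

parts = urlsplit(signed_url)

# S3 validates "PUT <path>?<query>" against <host>; all three must be
# exactly what the Data Management API returned.
request_target = f"{parts.path}?{parts.query}"

# What the custom connector sends instead (simplified, before URL-encoding):
connector_target = "/upload?signedURL=" + signed_url

assert request_target != connector_target  # mismatch -> AccessDenied
```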
    

I am not a developer by trade, so please bear with me if I completely missed something obvious! Any direction is much appreciated.


Solution

  • If you want to keep things simple, you could use this endpoint instead:
    https://aps.autodesk.com/en/docs/data/v2/reference/http/buckets-:bucketKey-objects-:objectKey-signed-POST/
    You could create a custom connector just for this endpoint, and then use an HTTP action to upload to the URL it provides.

    The direct S3 upload APIs are useful if you want to upload big files with the chunks uploaded in parallel, or if you can use our SDKs (which use those endpoints to upload files).
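    The suggested flow boils down to two requests, which can be sketched in Python as follows (the bucket name, object key, and token are placeholders; the `signedUrl` response field is as documented for the signed endpoint):

```python
import json
import urllib.request

BASE = "https://developer.api.autodesk.com/oss/v2"

def build_create_signed(bucket: str, obj: str, token: str) -> urllib.request.Request:
    """POST buckets/:bucketKey/objects/:objectKey/signed; the JSON
    response carries a 'signedUrl' to upload the file to."""
    return urllib.request.Request(
        url=f"{BASE}/buckets/{bucket}/objects/{obj}/signed",
        data=b"{}",
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

def build_upload(signed_url: str, data: bytes) -> urllib.request.Request:
    """PUT the raw file bytes to the signedUrl; no auth header needed."""
    return urllib.request.Request(url=signed_url, data=data, method="PUT")

# Placeholders: substitute a real bucket, object key, and access token.
create_req = build_create_signed("my-bucket", "model.rvt", "ACCESS_TOKEN")
# resp = json.load(urllib.request.urlopen(create_req))
# put_req = build_upload(resp["signedUrl"], open("model.rvt", "rb").read())
# urllib.request.urlopen(put_req)
```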