So far, the API calls that appear relevant to my end goal of uploading and viewing files and folders via the API are as follows:
POST https://demo.pydio.com/a/tree/admin/list
POST https://demo.pydio.com/a/workspace
GET https://demo.pydio.com/a/config/datasource
GET https://demo.pydio.com/a/config/virtualnodes/
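To sanity-check these endpoints outside of Postman, here is a minimal Python sketch of one of the GET calls above. I am assuming the OpenID Connect "id_token" can be sent as a Bearer token on the REST endpoints (that part is my assumption, not something I have confirmed in the docs), and the token placeholder is obviously not a real value.

```python
import requests

BASE = "https://demo.pydio.com"
ID_TOKEN = "<id_token from the OpenID Connect response>"  # assumption: obtained beforehand

# List the configured datasources (one of the GET endpoints above).
resp = requests.get(
    f"{BASE}/a/config/datasource",
    headers={"Authorization": f"Bearer {ID_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```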
It turns out my original assumption that the Pydio Cells s3 buckets require an AWS account was wrong. Pydio Cells uses the same request syntax (as far as I can tell) that is used when working with AWS buckets. The file system can be accessed as an s3 bucket through the Pydio endpoint https://demo.pydio.com/io, where io is the s3 bucket.
To test, I am using Postman to first place a file named 'Query.sql' (with content) into the 'Personal Files' workspace.
Authorization: AWS Signature
AccessKey: the token returned when using OpenID Connect (the "id_token" contained in the body).
SecretKey: the demo uses the key 'gatewaysecret'.
Advanced Options:
AWS Region: the default is 'us-east-1'. I didn't have to enter anything here, but it still worked when I set it to 'us-west-1'.
Service Name: 's3' (I found that this is required).
Session Token: I left this blank.
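For reference, here is a minimal boto3 sketch of the same signing setup, mirroring the Postman settings above (AccessKey = id_token, SecretKey = 'gatewaysecret', region 'us-east-1', service 's3'). The id_token placeholder is a stand-in, and I am assuming the gateway expects AWS Signature Version 4, which is what Postman's "AWS Signature" option produces.

```python
import boto3
from botocore.config import Config

ID_TOKEN = "<id_token from OpenID Connect>"  # used in place of the AWS AccessKey

# The Pydio gateway speaks the S3 protocol: https://demo.pydio.com is the
# endpoint and 'io' is the bucket (see above).
s3 = boto3.client(
    "s3",
    endpoint_url="https://demo.pydio.com",
    aws_access_key_id=ID_TOKEN,
    aws_secret_access_key="gatewaysecret",  # demo secret key from above
    region_name="us-east-1",                # default region; 'us-west-1' also worked
    config=Config(signature_version="s3v4"),
)
```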
Create files using PUT. Download files using GET.
The example below shows how to first create a file and then pull its content (download the file). For the GET example, I manually placed a file named Query.sql into the Personal Files workspace on demo.pydio.com; the request shows how to access its data and/or download it.
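Here is a minimal sketch of those two calls using the boto3 client configured above. The 'personal-files/Query.sql' key assumes the Personal Files workspace is addressed by its slug under the io bucket, which is my reading of the demo rather than confirmed documentation, and the file body is sample content.

```python
# Create (upload) Query.sql in the Personal Files workspace (PUT).
s3.put_object(
    Bucket="io",
    Key="personal-files/Query.sql",  # assumption: workspace slug + file name
    Body=b"SELECT 1;",               # sample file content
)

# Download the same file and read its content back (GET).
obj = s3.get_object(Bucket="io", Key="personal-files/Query.sql")
print(obj["Body"].read().decode("utf-8"))
```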