Tags: json, bash, loops, jq, artifactory-query-lang

Add to existing JSON file using jq


I have an Artifactory AQL Spec file in JSON format. The spec file is as follows:

{
  "files": [
    {
      "aql": {
        "items.find": {
          "repo": "release-repo",
          "modified": { "$before": "30d" },
          "type": { "$eq": "folder" },
          "depth": "2"
        }
      }
    }
  ]
}

Let's say I run a GitLab API query to acquire a list of SHAs that I want to iterate through and add to this JSON spec file. The list of SHAs is assigned to a variable:

"a991fef6bb9e9759d513fd4b277fe3674b44e4f4"
"5a562d34bb1d4ab4264acc2c61327651218524ad"
"d4e296c35644743e58aed35d1afb87e34d6c8823"

I would like to iterate through all these commit IDs and add them one by one to the JSON so that the spec ends up in this format:

{
  "files": [
    {
      "aql": {
        "items.find": {
          "repo": "release-repo",
          "modified": { "$before": "30d" },
          "type": { "$eq": "folder" },
          "$or": [
            {
              "$and": [
                {
                  "name": {
                    "$nmatch": "*a991fef6bb9e9759d513fd4b277fe3674b44e4f4*"
                  }
                }
              ]
            },
            {
              "$and": [
                {
                  "name": {
                    "$nmatch": "*5a562d34bb1d4ab4264acc2c61327651218524ad*"
                  }
                }
              ]
            },
            {
              "$and": [
                {
                  "name": {
                    "$nmatch": "*d4e296c35644743e58aed35d1afb87e34d6c8823*"
                  }
                }
              ]
            }
          ],
          "depth": "2"
        }
      }
    }
  ]
}

The list of SHAs returned from the GitLab API query will be different every time, which is why I'd like this entry to be built or updated dynamically. The number of returned SHAs will also vary: it could return 10 one day and 50 on another.


Solution

  • #!/usr/bin/env bash
    
    template='{
      "files": [
        {
          "aql": {
            "items.find": {
              "repo": "release-repo",
              "modified": { "$before": "30d" },
              "type": { "$eq": "folder" },
              "$or": [],
              "depth": "2"
            }
          }
        }
      ]
    }'
    
    # SHAs returned from the GitLab API query (example values)
    shas=(
      "a991fef6bb9e9759d513fd4b277fe3674b44e4f4"
      "5a562d34bb1d4ab4264acc2c61327651218524ad"
      "d4e296c35644743e58aed35d1afb87e34d6c8823"
    )
    
    # Fold each SHA into the "$or" array of the template, one "$and"/"$nmatch" clause per SHA
    jq -n \
      --argjson template "$template" \
      --arg shas_str "${shas[*]}" \
      '
      reduce ($shas_str | split(" ") | .[]) as $sha ($template;
        .files[0].aql["items.find"]["$or"] += [{
          "$and": [{"name": {"$nmatch": ("*" + $sha + "*")}}]
        }]
      )
      '
    

    Running this script emits the following output:

    {
      "files": [
        {
          "aql": {
            "items.find": {
              "repo": "release-repo",
              "modified": {
                "$before": "30d"
              },
              "type": {
                "$eq": "folder"
              },
              "$or": [
                {
                  "$and": [
                    {
                      "name": {
                        "$nmatch": "*a991fef6bb9e9759d513fd4b277fe3674b44e4f4*"
                      }
                    }
                  ]
                },
                {
                  "$and": [
                    {
                      "name": {
                        "$nmatch": "*5a562d34bb1d4ab4264acc2c61327651218524ad*"
                      }
                    }
                  ]
                },
                {
                  "$and": [
                    {
                      "name": {
                        "$nmatch": "*d4e296c35644743e58aed35d1afb87e34d6c8823*"
                      }
                    }
                  ]
                }
              ],
              "depth": "2"
            }
          }
        }
      ]
    }
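
    If the spec already lives in a file on disk (call it spec.json, a placeholder name), the same reduce can be applied to that file directly, and the SHAs can be passed as individual positional arguments instead of a space-joined string. This is a sketch assuming jq 1.6 or later (for $ARGS.positional); the "$or" key is created if the file doesn't already have one:

    # Sketch: update an existing spec file, passing each SHA as its own positional argument.
    # With --args, everything after the program becomes $ARGS.positional, so the spec is read from stdin.
    jq --args '
      reduce $ARGS.positional[] as $sha (.;
        .files[0].aql["items.find"]["$or"] += [{
          "$and": [{"name": {"$nmatch": ("*" + $sha + "*")}}]
        }]
      )
    ' "${shas[@]}" < spec.json > spec.json.tmp && mv spec.json.tmp spec.json

    This avoids relying on whitespace splitting of "${shas[*]}" and works the same whether the API returns 10 SHAs or 50.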