I'm using the Jenkins Job DSL plugin to generate Jenkins jobs for all branches of a project. When a branch is deleted, the DSL plugin also deletes the respective Jenkins jobs.
The problem, however, is that the workspaces are not deleted together with the jobs, so they eventually clutter up my disk. One solution I've found is to periodically list all workspaces and check whether a Jenkins job with the same name still exists.
I was wondering whether there is a more elegant way to automatically remove obsolete workspaces for Jenkins jobs that have just been deleted by the DSL plugin.
My solution was to add another job that runs a Groovy system script which purges the workspaces of all jobs that no longer exist or that have been disabled, and to have that job triggered after the DSL seed job.
I use the following script, based on the one from this answer:
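For reference, the wiring between the two jobs can be sketched in Job DSL itself. This is a hypothetical sketch: the job names and the script file name are placeholders, and it assumes the Groovy plugin's system Groovy build step is available to Job DSL:

```groovy
// Hypothetical sketch: a cleanup job triggered after the seed job.
// 'seed-job' and 'cleanup-workspaces.groovy' are placeholder names.
job('workspace-cleanup') {
    triggers {
        // Run after the DSL seed job completes successfully
        upstream('seed-job', 'SUCCESS')
    }
    steps {
        // Requires the Groovy plugin; runs the script with Jenkins API access
        systemGroovyScriptFile('cleanup-workspaces.groovy')
    }
}
```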
import hudson.FilePath
import hudson.model.Job
import jenkins.model.Jenkins

def deleteUnusedWorkspace(FilePath root, String path) {
    root.list().sort { child -> child.name }.each { child ->
        String fullName = path + child.name
        def item = Jenkins.instance.getItemByFullName(fullName)
        if (item == null) {
            println "Deleting (no such job): '$fullName'"
            child.deleteRecursive()
        } else if (item.class.canonicalName == 'com.cloudbees.hudson.plugins.folder.Folder') {
            // Recurse into folder workspaces
            deleteUnusedWorkspace(root.child(child.name), "$fullName/")
        } else if (item instanceof Job && !item.isBuildable()) {
            println "Deleting (job disabled): '$fullName'"
            child.deleteRecursive()
        } else {
            println "Leaving: '$fullName'"
        }
    }
}
// Note: Jenkins.instance.nodes lists agents only; if jobs also run on the
// built-in node, its workspace directory needs to be cleaned separately.
for (node in Jenkins.instance.nodes) {
    println "Processing $node.displayName"
    def workspaceRoot = node.rootPath.child("workspace")
    deleteUnusedWorkspace(workspaceRoot, "")
}
This does assume that you don't use custom workspaces.