Hi all, a Gradle question for the experts. I'm fairly certain I'm just not googling the right terms, so I figured I'd describe it here.
We have a set of extra files we want to copy from the local repo over to the roboRIO on every deploy. Build avoidance is great, but I'd be fine if we just blindly copied the files over each time. Gradle, however, reports the files as "up to date" on every deploy attempt and doesn't copy them, even when they've changed relative to what is on the RIO.
For now we're just ssh'ing into the roboRIO and removing the folder on the remote target whenever we know the files need to update, but that's not optimal for build season.
Here's the snippet showing how we're currently attempting it:
artifacts {
    // Main roboRIO Java .jar artifact
    artifact('frcJava', edu.wpi.first.gradlerio.frc.FRCJavaArtifact) {
        targets << "roborio"
        // Debug can be overridden on the command line, for use with VSCode
        debug = getDebugOrDefault(false)
    }
    // Casserole web server support files deploy
    fileTreeArtifact('CasseroleWebServerFileDeploy') {
        targets << "roborio"                  // Web server files should deploy to the RIO
        files = fileTree(dir: './resources')  // Dev PC location for the files
        directory = '/home/lvuser/resources/' // roboRIO location to deploy to
    }
    // Build info file deploy
    fileArtifact('BuildInfoDeploy') {
        targets << "roborio"                  // Build info should deploy to the RIO
        file = file(BUILD_INFO_FILE)          // Dev PC location for the file
        directory = '/home/lvuser/resources/' // roboRIO location to deploy to
    }
}
So, the question: is there an easy way to get the fileTreeArtifact() to "always copy", i.e. mark the files as always dirty or never up-to-date? Or is there something more fundamental we're doing incorrectly?
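For reference, the kind of thing we were imagining is the stock Gradle trick of forcing a task's outputs to never be up to date. This is only a sketch: it assumes GradleRIO generates deploy tasks whose names contain the artifact names, which we haven't confirmed, and the names in the matching closure are just our artifact names from above.

// Sketch only: force any deploy task associated with our file artifacts to
// never be considered up to date. Assumes (unverified) that GradleRIO names
// its generated deploy tasks after the artifacts.
tasks.matching { task ->
    task.name.contains('CasseroleWebServerFileDeploy') ||
    task.name.contains('BuildInfoDeploy')
}.all {
    // upToDateWhen { false } makes Gradle rerun the task on every build
    outputs.upToDateWhen { false }
}

If there's a more idiomatic GradleRIO way to express "always copy these files", that would be even better.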