I have a python script running on a raspberry pi.
This script is part of a git repo.
I intend for the rpi and this script to be running all the time, because the application has to respond fairly quickly to MQTT messages and control devices. A small amount of downtime (<5 min) during an update is acceptable.
I am developing the script on my desktop machine and want to deploy to the raspberry pi.
I am trying to find a way to (on the rpi) check the git repo for new commits, pull them down, stop the running script, and restart it with the new code, not necessarily in that order.
I have been doing the above procedure manually, but I would like the rpi to handle it so that I only have to change the script and push the changes to git.
I have looked into a shell script called from a cron job, which seems like a viable path. I have also looked at using a container, which seems like overkill.
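For reference, the cron side of that is just a crontab entry that runs the updater script on a schedule. A minimal sketch, assuming a hypothetical script path and a 5-minute interval (cron runs with a minimal environment, so absolute paths are safest):

# Example crontab entry (edit with `crontab -e`); the paths below are placeholders.
*/5 * * * * /home/pi/check_and_update.sh >> /home/pi/check_and_update.log 2>&1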
OK, the basics of the answer are below: a shell script checks for updates; if there is new code in the repo, it pulls it down, then kills the running script and reruns it.
#!/bin/bash
# Set the path to the repository and the script
REPO_DIR="/path/to/script"
SCRIPT="Main.py"
# Change directory to the repository (bail out if the path is wrong)
cd "$REPO_DIR" || exit 1
# Function to check for updates
check_updates() {
    git fetch
    # Check if there are any new commits (@ is the local HEAD, @{u} is its upstream)
    LOCAL=$(git rev-parse @)
    REMOTE=$(git rev-parse "@{u}")
    if [ "$LOCAL" != "$REMOTE" ]; then
        echo "New updates found. Updating..."
        return 0
    else
        echo "No updates found."
        return 1
    fi
}
# Function to update and restart the script
update_and_restart() {
    echo "pulling updates"
    # Pull the latest changes
    git pull
    echo "stopping script"
    # Kill the running Main.py process (pkill sends SIGTERM by default)
    pkill -f "$SCRIPT"
    # Start the new version of the script
    echo "running script"
    nohup python3 "$REPO_DIR/$SCRIPT" &
}
if check_updates; then
    update_and_restart
fi
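One caveat with git pull: if the checkout on the rpi ever picks up local edits, the pull can fail and the script keeps running old code. Below is a sketch of a more forceful variant of update_and_restart; it assumes the rpi copy is disposable (nothing local worth keeping), that check_updates has already run git fetch, and the log path is just an example.

update_and_restart() {
    echo "resetting to upstream"
    # Discard any local changes and move to whatever the remote branch has.
    git reset --hard "@{u}"
    echo "stopping script"
    pkill -f "$SCRIPT"
    # Give the old process a moment to exit and release its resources.
    sleep 2
    echo "running script"
    # Redirect output to a log file instead of nohup.out (path is a placeholder).
    nohup python3 "$REPO_DIR/$SCRIPT" >> "$REPO_DIR/script.log" 2>&1 &
}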