I loved this idea, and thought it would be no problem to get a Python script up and running periodically on my Raspberry Pi home server using cron. However, I ran into various issues along the way (some of which were not so easy to resolve), so I’m collating all the configuration changes I made in the hopes that it will be useful to someone one day. You can find the full repo for this project here, and I have also included my Dockerfile, docker-compose.yml and crontab at the end of this TIL.
A lot of problems with cron come down to user privileges. Each user has their own crontab, and then there is the system-wide root crontab. The first issue I ran into with creating a cron job inside a container was that Docker created the crontab as a non-root user. This issue presented itself when I tried to run the following command to list the current cron jobs in the Docker container:
docker-compose exec container-name crontab -l
This returned the following output:
no crontab for root
Now, it is not necessarily a problem to have non-root cron jobs, but make absolutely certain that you are creating the jobs as the user you expect. In my case, I wanted to run as root, so I added the following line to my docker-compose.yml:
user: root
Now the root user will be used when building your Docker image, and the created crontab will be where you expect.
When cron calls your Python script, you may run into issues with ModuleNotFoundError or ImportError, where Python cannot find your installed packages. This is because cron does not have access to your system environment variables, including the Python path. You can resolve most of these import errors by adding the PYTHONPATH environment variable to your crontab. This should be the path to your site-packages folder, something like this:
PYTHONPATH=/usr/local/lib/python3.11/site-packages
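If you are not sure where your site-packages folder actually lives, the interpreter can tell you. A quick sketch (run it inside the container, since the path differs between images and Python versions):

```python
import site
import sys

# Print the directories where pip installs packages for this interpreter.
# The site-packages (or dist-packages, on Debian system Pythons) entry is
# the value PYTHONPATH should point to in the crontab.
print(site.getsitepackages())
print(sys.prefix)  # the interpreter's installation prefix
```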
You may also need to add a shebang (#!) line to your Python script to direct cron to the correct interpreter. You can find the Python location with one of the following commands:
which python
which py
which python3
NOTE: These commands must be run inside your Docker container while it is up and running. In docker-compose syntax this would be the following (with the name of your container instead of container-name):
docker-compose exec container-name which python3
You can then add this to the top of your Python script, as follows:
#!/usr/local/bin/python3
Some modules will still run into errors even when the PYTHONPATH variable has been set. In particular, I ran into problems with reportlab and Pillow/PIL:
ImportError: cannot import name '_imaging' from 'PIL'
This was solved by adding the system PATH to the crontab as well. The system PATH is included in the default crontab that is created when you first run crontab -e:
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
Therefore, it is a good idea to include it if you are making a new crontab from scratch, to make sure cron can find everything it needs.
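To see why the PATH line matters: anything your job runs is looked up against the PATH defined in the crontab itself, not your login shell's PATH. A small illustration, using shutil.which as a stand-in for the lookup cron performs:

```python
import shutil

# cron resolves the command in your crontab entry against the crontab's
# own PATH. shutil.which() does the same kind of lookup, so we can see
# the effect of a good vs. empty PATH directly.
cron_path = "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
print(shutil.which("sh", path=cron_path))       # found on any normal Linux system
print(shutil.which("sh", path="/nonexistent"))  # None: commands "disappear"
```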
By default, cron runs from the default root path. Therefore, both the call to Python in your crontab and the file paths within your Python script should either be relative to root (i.e. /main.py rather than main.py) or use full paths.
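One way to make the script itself immune to cron's working directory is to anchor every file path to the script's own location. A minimal sketch (the data.json filename is just a hypothetical example):

```python
from pathlib import Path

# cron typically starts jobs from / (or $HOME), so a bare relative path
# like open("data.json") resolves somewhere unexpected. Anchoring paths
# to the script's own directory works no matter where cron starts us.
BASE_DIR = Path(__file__).resolve().parent
data_file = BASE_DIR / "data.json"  # hypothetical file next to the script
print(data_file)
```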
This error is related to Python inside a Docker container rather than cron, but someone might still find it useful. When you install your requirements.txt, you may encounter errors such as:
legacy-install-failure
error: command '/usr/bin/gcc' failed with exit code 1
fatal error: Python.h: No such file or directory
I was able to resolve these by adding python3-dev, wheel and Cmake to my requirements.txt. These are sometimes required when packages include other binaries or need to compile other code when installed.
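These build-time errors usually mean a C compiler or the Python headers are missing from the image. An alternative to the pip packages above is installing the system build dependencies in the Dockerfile itself; a sketch, assuming a Debian-based image such as python:3:

```dockerfile
# gcc and the Python C headers are needed to compile packages that ship
# C extensions (reportlab and Pillow can, depending on the platform).
RUN apt-get update \
 && apt-get install -y --no-install-recommends build-essential python3-dev \
 && rm -rf /var/lib/apt/lists/*
```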
I hope this helped you resolve some errors! I’ve included my Dockerfile, docker-compose.yml and crontab below if you want to set up a similar project or adjust your own files. The full repo is also available here.
Dockerfile:
FROM python:3
COPY . .
RUN python3 -m pip install --no-cache-dir -r requirements.txt
RUN touch /var/log/cron.log
RUN apt-get update \
&& apt-get install cron -y
RUN chmod +x main.py
RUN crontab crontab
CMD cron -f
docker-compose.yml:
version: "2.4"
services:
  watchman:
    platform: "linux/arm64/v8"
    image: watchman:latest
    container_name: watchman
    restart: always
    user: root
    build:
      context: build
      dockerfile: Dockerfile
crontab:
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
PYTHONPATH=/usr/local/lib/python3.11/site-packages
15 7 * * * python3 /main.py >> /var/log/cron.log 2>&1
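For reference, the five leading fields of the job line are minute, hour, day-of-month, month and day-of-week, so 15 7 * * * fires at 07:15 every day. A tiny sketch that splits an entry into its parts:

```python
# Split a crontab entry into its five schedule fields plus the command.
# "15 7 * * *" means minute 15, hour 7, every day of every month.
entry = "15 7 * * * python3 /main.py >> /var/log/cron.log 2>&1"
minute, hour, day_of_month, month, day_of_week, *command = entry.split()
print(minute, hour)       # 15 7
print(" ".join(command))  # python3 /main.py >> /var/log/cron.log 2>&1
```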
I was trying to use the following syntax to call occ and scan the files:
sudo -u www-data php /path/to/nextcloud/occ files:scan --all
but I kept running into a PHP error. Specifically this error:
Doctrine\DBAL\Exception: Failed to connect to the database: An exception occurred in the driver: could not find driver in /path/to/nextcloud/lib/private/DB/Connection.php:139
followed by a long, verbose stack trace.
It took me a decent amount of time to diagnose the exact issue, but eventually I found this list of required PHP modules in the Nextcloud admin manual.
Running php -m will print the list of currently installed PHP modules. I noticed I was missing quite a few of the required modules, but the one causing my issue was the missing pdo_mysql module.
This can be installed by running:
sudo apt-get install php7.4-mysql
Note: This command will change based on your OS, PHP version and database type
This resolved the error! However (as is always the case), this only meant I got a shiny new error instead:
Doctrine\DBAL\Exception: Failed to connect to the database: An exception occurred in the driver: SQLSTATE[HY000] [2002] php_network_getaddresses: getaddrinfo failed: Name or service not known in /path/to/nextcloud/lib/private/DB/Connection.php:139
At first glance, this looks like a problem with DNS name resolution. This sent me a long way down the wrong path, changing a whole bunch of things in my docker-compose.yml file.
Eventually, however, after a long and perilous journey over the high seas of the Nextcloud forums and StackOverflow, I found this example of running php occ in a docker-compose configuration.
This led me to running this command:
docker-compose exec -u www-data nextcloud-app php occ files:scan --all
Note: replace nextcloud-app with the name of your Nextcloud container. Also, this command must be run from the directory containing your Nextcloud docker-compose.yml.
….aaaaaand, voila! The command runs, the files are scanned and everything is up to date.