How to run a Jupyter notebook with Python code automatically on a daily basis?


I have some Python code in a Jupyter notebook and I need to run it automatically every day, so I would like to know if there is a way to set this up. I really appreciate any advice on this.

Update
Recently I came across papermill, which is a tool for executing and parameterizing notebooks.

https://github.com/nteract/papermill

papermill local/input.ipynb s3://bkt/output.ipynb -p alpha 0.6 -p l1_ratio 0.1

This seems better than nbconvert because you can use parameters. You still have to trigger this command with a scheduler. Below is an example with cron on Ubuntu.
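For instance, a cron entry that runs the papermill command every morning might look like the following sketch (the environment path, notebook paths, bucket name, and schedule are placeholders):

```shell
# m h dom mon dow  command — run the parameterized notebook daily at 05:10
10 5 * * * /opt/anaconda/envs/yourenv/bin/papermill local/input.ipynb s3://bkt/output.ipynb -p alpha 0.6 -p l1_ratio 0.1
```

Using the full path to the papermill binary avoids relying on cron's minimal PATH.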


Old Answer

nbconvert --execute

can execute a Jupyter notebook; embedded in a cron job, this will do what you want.

Example setup on Ubuntu:

Create yourscript.sh with the following content:

/opt/anaconda/envs/yourenv/bin/jupyter nbconvert \
                      --execute \
                      --to notebook /path/to/yournotebook.ipynb \
                      --output /path/to/yournotebook-output.ipynb

There are other output formats besides --to notebook. I like this option because afterwards you have a fully executed notebook that serves as a log file.

I recommend running your notebook in a virtual environment, so that future updates do not break your script. Do not forget to install nbconvert into that environment.
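If you prefer to launch nbconvert from Python rather than a shell script, here is a minimal sketch using only the standard library; the interpreter and notebook paths are placeholders matching the example above:

```python
import subprocess

def build_nbconvert_cmd(jupyter_bin, notebook, output):
    """Build the same nbconvert invocation used in yourscript.sh."""
    return [
        jupyter_bin, "nbconvert",
        "--execute",
        "--to", "notebook", notebook,
        "--output", output,
    ]

cmd = build_nbconvert_cmd(
    "/opt/anaconda/envs/yourenv/bin/jupyter",
    "/path/to/yournotebook.ipynb",
    "/path/to/yournotebook-output.ipynb",
)
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually execute the notebook
```

Building the command as a list (rather than one shell string) avoids quoting issues when paths contain spaces.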

Now create a cron job that runs every day, e.g. at 5:10 AM: type crontab -e in your terminal and add this line:

10 5 * * * /path/to/yourscript.sh

Try the SeekWell Chrome Extension. It lets you schedule notebooks to run weekly, daily, hourly or every 5 minutes, right from Jupyter Notebooks. You can also send DataFrames directly to Sheets or Slack if you like.

Here’s a demo video, and there is more info in the Chrome Web Store link above as well.

Disclosure: I’m a SeekWell co-founder.

It’s better to combine this with Airflow if you want higher quality.
I packaged them in a Docker image: https://github.com/michaelchanwahyan/datalab.

It works by modifying the open-source package nbparameterize and passing in arguments such as execution_date. Graphs can be generated on the fly, and the output can be updated and saved inside the notebook.

When it is executed:

  • the notebook is read and the parameters are injected
  • the notebook is executed and the output overwrites the original path

The image also installs and configures common tools such as Spark, Keras, TensorFlow, etc.

You can add a Jupyter notebook command to a cron job:

0 * * * * /home/ec2-user/anaconda3/bin/python /home/ec2-user/anaconda3/bin/jupyter-notebook

Replace /home/ec2-user/anaconda3 with your Anaconda install location, and adjust the schedule in cron to your requirements.

Executing Jupyter notebooks with parameters is conveniently done with Papermill. I also find it convenient to share/version-control the notebook either as a Markdown file or a Python script with Jupytext. Then I convert the notebook to an HTML file with nbconvert. Typically my workflow looks like this:

cat world_facts.md \
| jupytext --from md --to ipynb --set-kernel - \
| papermill -p year 2017 \
| jupyter nbconvert --no-input --stdin --output world_facts_2017_report.html
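The pipeline above is fixed to the year 2017; a small Python helper can generate the same command for any year, so it can be driven from a scheduler. This is just a sketch mirroring the shell pipeline, and the file names are placeholders:

```python
def report_command(year: int, source: str = "world_facts.md") -> str:
    """Build the jupytext | papermill | nbconvert pipeline for a given year."""
    return (
        f"cat {source} "
        "| jupytext --from md --to ipynb --set-kernel - "
        f"| papermill -p year {year} "
        f"| jupyter nbconvert --no-input --stdin --output world_facts_{year}_report.html"
    )

print(report_command(2017))
```

The resulting string can then be passed to e.g. subprocess.run(..., shell=True) or written into a cron entry.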

To learn more about the above, including how to specify the Python environment in which the notebook should run and how to use continuous integration on notebooks, have a look at my article Automated reports with Jupyter Notebooks (using Jupytext and Papermill), which you can read on Medium, GitHub, or Binder. Use the Binder link if you want to interactively test the outcome of the commands in the article.

As others have mentioned, papermill is the way to go. Papermill is just nbconvert with a few extra features.

If you want to handle a workflow of multiple notebooks that depend on one another, you can try Airflow’s integration with papermill. If you are looking for something simpler that does not need a scheduler to run, you can try ploomber which also integrates with papermill (Disclaimer: I’m the author).

To run your notebook manually:

jupyter nbconvert --to notebook --execute /home/username/scripts/mynotebook.ipynb

Create a simple shell script at /home/username/scripts/mynotebook.sh and paste the command above into it.

Make the file executable

chmod +x /home/username/scripts/mynotebook.sh

To schedule your notebook, use cron or Airflow, depending on your needs and the complexity involved. If you want to use cron, simply run crontab -e and add an entry:

00 11 * * * /home/username/scripts/mynotebook.sh

You can download the notebook as a .py file and then create a batch file to execute the script. Then schedule the batch file in the Task Scheduler.
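A hedged sketch of that approach on Windows (the paths and the task name are placeholders): convert the notebook to a plain script with nbconvert, then register a daily task with schtasks.

```shell
:: Convert the notebook to a plain .py script
jupyter nbconvert --to script mynotebook.ipynb

:: Register a daily task at 05:10 that runs the batch file
schtasks /create /tn "RunMyNotebook" /tr "C:\scripts\run_notebook.bat" /sc daily /st 05:10
```

Alternatively, the task can be created interactively in the Task Scheduler GUI.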

Creating a BAT file and running it through Task Scheduler worked for me. Below is the code.

call C:\Users\...user...\Anaconda3\condabin\conda activate
python notebook_file.py
pause
call conda deactivate

You may want to use the Google AI Platform Notebooks Scheduler service, currently in EAP (early access program).


The answers/resolutions are collected from Stack Overflow and are licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.