I am using IPython and want to run functions from one notebook from another (without cutting and pasting them between different notebooks). Is this possible and reasonably easy to do?

Starting your notebook server with:

ipython notebook --script

will save the notebooks (.ipynb) as Python scripts (.py) as well, and you will be able to import them.
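For example, if a notebook was saved as analysis_helpers.py, any other notebook can import from it. A minimal sketch (the file and function names are made up, and the .py file is written by hand here to stand in for the --script output):

```python
# Simulate the .py file that "ipython notebook --script" would save
# alongside a notebook named analysis_helpers.ipynb (contents invented).
with open("analysis_helpers.py", "w") as f:
    f.write("def clean(text):\n    return text.strip().lower()\n")

# Any other notebook (or script) in the same directory can now import it:
from analysis_helpers import clean
print(clean("  Hello World  "))  # -> hello world
```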

Or have a look at: http://nbviewer.ipython.org/5491090/ which contains two notebooks, one executing the other.

In IPython 2.0 you can simply %run 'my_shared_code.ipynb' to share code between notebooks. See for example http://nbviewer.ipython.org/gist/edrex/9044756.

IPython's %run magic lets you execute Python files and IPython scripts in a notebook. I sometimes use the -i option so the script runs in the notebook's namespace. Execute a cell containing %run? for more info.

You can use ipython notebook --script to also save notebooks as .py files on each save, or uncomment the line c.NotebookManager.save_script=True in your ipython_notebook_config.py file for the same effect (use ipython profile create to set that up; on Ubuntu the config files live in ~/.config/ipython/).

Edit: The following is true, but unnecessary – you can %run a .ipynb file directly. Thanks Eric.

If you use ipython magics in the notebook you want to import, I found that you can rename the .py file to .ipy (an ipython script), but I had to remove the first line (which contained the file encoding declaration) for it to work. There is probably a better way! This approach will likely confuse cell magics too (they’d all get applied at once).

There is also a “write and execute” extension, which will let you write the content of a cell to a file (and replace old content -> update code), which can then be imported in another notebook.


In one notebook (two cells)

%reload_ext writeandexecute
%%writeandexecute -i some_unique_string functions.py
def do_something(txt):
    print(txt)

And then in the other notebook:

from functions import do_something
do_something("hello world")

You can connect with a qtconsole to the same kernel. Just supply this at startup:

ipython qtconsole --existing kernel-0300435c-3d07-4bb6-abda-8952e663ddb7.json

Look at the output after starting the notebook for the long string.

I do call notebooks from other notebooks. You can even pass “parameters” to other notebooks using the following trick:

Place a params dictionary in the first cell of “report_template.ipynb”:

params = dict(platform='iOS', start_date='2016-06-10')

and use it in the following cells:

df = get_data(params)
do_analysis(params)

And in another (higher logical level) notebook, execute it using this function:

import io
import nbformat

def run_notebook(nbfile, **kwargs):

    def read_notebook(nbfile):
        if not nbfile.endswith('.ipynb'):
            nbfile += '.ipynb'
        with io.open(nbfile) as f:
            nb = nbformat.read(f, as_version=4)
        return nb

    ip = get_ipython()
    gl = ip.ns_table['user_global']
    gl['params'] = None
    arguments_in_original_state = True

    for cell in read_notebook(nbfile).cells:
        if cell.cell_type != 'code':
            continue
        ip.run_cell(cell.source)
        # Once the first cell has defined params, override its keys with
        # the keyword arguments passed to run_notebook (done only once).
        if arguments_in_original_state and type(gl['params']) == dict:
            gl['params'].update(kwargs)
            arguments_in_original_state = False
run_notebook("report_template.ipynb", start_date="2016-09-01")

This call executes each cell of the “report_template” notebook and overrides the relevant keys of its params dictionary before the second cell runs.

I use the following function in the notebook from which I want to load functions or actions from a source notebook:

import io
import nbformat

def execute_notebook(nbfile):
    with io.open(nbfile, encoding="utf8") as f:
        nb = nbformat.read(f, as_version=4)

    ip = get_ipython()

    for cell in nb.cells:
        if cell.cell_type != 'code':
            continue
        ip.run_cell(cell.source)
Use it like:

execute_notebook("source_notebook.ipynb")
Yes, you can “run functions from one notebook from another (without cutting and pasting them between different notebooks)” — and, yes, it’s easy to do!

tl;dr: put the code in python files (*.py) in the file system & let multiple notebooks use the same code. (It’s that simple.)

(Why put so much code in notebooks, when we have perfectly good code editors & IDEs that are so much better for writing & reading code? Not to mention the need for proper version control! What are we trying to achieve, and at what expense? </rant>)


  • Put your code in normal python files, eg my_code/foo.py, adding a (probably empty) my_code/__init__.py
  • Take advantage of having the code under proper version control (eg git); have you noticed how hard it is to diff .ipynb JSON files?
  • Put the notebooks under version control too. Raw git logs will be hard to read, but the commit comments can be useful. (GitHub/GitLab renders IPython notebooks, btw.)
  • Limit the Python source in the .ipynb notebook to small amounts of “driver” code, plus output and documentation.
  • See also: https://ipython.org/ipython-doc/stable/config/extensions/autoreload.html
  • If you want to “inline” the external python files, just use the %cat magic (for example, %cat my_code/foo.py)
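Concretely, the first bullet might look like this (the file contents are invented for illustration; in practice you would create the files in your editor, not from Python):

```python
import os

# Create the package layout described above (normally done in your editor).
os.makedirs("my_code", exist_ok=True)
with open("my_code/__init__.py", "w") as f:
    f.write("")  # (probably empty) package marker
with open("my_code/foo.py", "w") as f:
    f.write("def add(a, b):\n    return a + b\n")

# The "driver" code in any notebook then reduces to:
from my_code.foo import add
print(add(2, 3))  # -> 5
```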

…If you want something fancier for displaying that source inline, you can (optionally, from an external, reusable source file) use something like:

import IPython
from pygments import highlight
from pygments.formatters import HtmlFormatter
from pygments.lexers import get_lexer_for_filename

filename = "my_code/foo.py"  # the source file to display
with open(filename) as f:
    code = f.read()
formatter = HtmlFormatter(linenos="inline")
IPython.display.HTML('<style type="text/css">{}</style>{}'.format(
    formatter.get_style_defs('.highlight'),
    highlight(code, get_lexer_for_filename(filename), formatter)))

Your favorite code editors & IDEs thank you for your support.

So @MikeMuller’s good idea will work for a local notebook, but not a remote one (right?). I don’t think there is a way for you to remotely invoke individual cell blocks or functions of ipynb code on a remote server and be able to get results back into your calling routine programmatically, unless that code does something fairly extraordinary to communicate results.

I was in the process of writing when @Matt submitted the same idea.

The *.ipynb file is a JSON container, not an actual Python script. You can get IPython to export a *.py with:

ipython <URI_to_Notebook> --script

If the target *.ipynb is on a remote machine you don’t control, you’ll probably need to pull the file so that you can write the output to a local path. (Haven’t looked into whether you can invoke this on a remote resource to create a local output.) Once this is created you should be able to import and run the *.py or individual functions within it.

A question for @Matt on that neat example of running another *.ipynb file wholesale with io.open(nbfile) is whether the nbfile can be remote? Seems like a long shot, but would be great…

Here are two additional tips:

  1. You can also run the %qtconsole magic directly from the notebook, and it will automatically connect to the notebook kernel.

  2. Check out https://github.com/atiasnir/ipnb

    You can use it to import notebook files as if they were standard Python modules (I'm the author :-)). The main limitation is that it discards magic cells (because it does not use IPython at all), but otherwise it should work fine.

A possible solution could be the combination of Jupyter notebooks and Visual Studio Code (VSC) or any other Python IDE.

In practice, VSC is used to write the core functions or classes, which can then be reused across the different notebooks.

To make this strategy work, you need two precautions:

  • transform the classes or functions into a package;
  • tell Jupyter to reload the package while running each cell.

To turn classes or functions into a package that other Python scripts can import, create an empty file named __init__.py in the same directory as the classes and functions. Then, if all the classes are contained in a folder located alongside the Jupyter notebooks, you can simply import the package in a Jupyter cell as follows:

from file_directory.file_name import class_name

Then, you can use it as you usually do for the other Python packages.
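As a concrete sketch, using the generic names file_directory, file_name and class_name from the import line above (the class body is invented, and in practice you would write these files in VSC, not from Python):

```python
import os

# Lay out the package exactly as the import above expects.
os.makedirs("file_directory", exist_ok=True)
with open("file_directory/__init__.py", "w") as f:
    f.write("")  # empty file that turns the folder into a package
with open("file_directory/file_name.py", "w") as f:
    f.write(
        "class class_name:\n"
        "    def greet(self):\n"
        "        return 'hello from the package'\n"
    )

from file_directory.file_name import class_name
print(class_name().greet())  # -> hello from the package
```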

Secondly, you should tell Jupyter to reload your package when running each cell.
This can be done by putting the following magics in the first Jupyter cell (the extension must be loaded before %autoreload can be used):

%load_ext autoreload
%autoreload 2

This strategy lets you keep your code up to date and well organized.

You can read a working example at this link.