Passing data between separately running Python scripts


If I have a python script running (with full Tkinter GUI and everything) and I want to pass the live data it is gathering (stored internally in arrays and such) to another python script, what would be the best way of doing that?

I cannot simply import script A into script B as it will create a new instance of script A, rather than accessing any variables in the already running script A.

The only way I can think of doing it is by having script A write to a file and then having script B read the data from that file. This is less than ideal, however, as something bad might happen if script B tries to read a file that script A is still writing to. I am also looking for much faster communication between the two programs.
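(For reference, one way to make the file approach safer is to write to a temporary file and atomically replace the target, so a reader never sees a half-written file. A minimal sketch, with shared_data.json as a placeholder name:)

import json
import os
import tempfile

def write_snapshot(data, path="shared_data.json"):
    # Write to a temp file in the same directory, then atomically
    # replace the target so a reader never sees a partial file.
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(data, f)
    os.replace(tmp_path, path)  # atomic rename on POSIX and Windows

def read_snapshot(path="shared_data.json"):
    with open(path) as f:
        return json.load(f)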

EDIT:
Here are the examples as requested. I am aware of why this doesn't work, but it is the basic premise of what needs to be achieved. My source code is very long and unfortunately confidential, so it is not going to help here. In summary, script A is running Tkinter and gathering data, while script B is views.py as part of Django, but I'm hoping this can be achieved in plain Python.

Script A

import time

i = 0

def return_data():
    return i

if __name__ == "__main__":
    while True:
        i = i + 1
        print(i)
        time.sleep(.01)

Script B

import time
from scriptA import return_data

if __name__ == '__main__':
    while True:
        print(return_data())  # from script A
        time.sleep(1)

You can use the multiprocessing module to implement a Pipe between the two modules. You then start one of the modules as a Process and use the Pipe to communicate with it. The best part about using pipes is that you can also pass Python objects like dicts and lists through them.

Ex:
mp2.py:

from multiprocessing import Process, Pipe
from mp1 import f

if __name__ == '__main__':
    parent_conn,child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print(parent_conn.recv())   # prints "Hello"

mp1.py:

from multiprocessing import Process,Pipe

def f(child_conn):
    msg = "Hello"
    child_conn.send(msg)
    child_conn.close()
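Applied to the question's scenario, the data-gathering loop could keep sending its latest values (dicts and lists work too) through the pipe. A rough sketch along those lines, with gather_data as an illustrative stand-in for script A's loop (not part of the original answer):

# scriptA.py -- hypothetical adaptation of the Pipe idea to the question
import time
from multiprocessing import Process, Pipe

def gather_data(child_conn):
    # stand-in for the Tkinter script's data-gathering loop
    i = 0
    while i < 5:
        i += 1
        child_conn.send({'i': i, 'timestamp': time.time()})  # dicts/lists work too
        time.sleep(0.01)
    child_conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=gather_data, args=(child_conn,))
    p.start()
    # the "script B" side reads updates as they arrive
    for _ in range(5):
        print(parent_conn.recv())
    p.join()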

If you want to read and modify shared data between two scripts that run separately, a good solution is to take advantage of the Python multiprocessing module and use a Pipe() or a Queue() (see the differences here). This way, you get to synchronize the scripts and avoid problems with concurrency and global variables (like what happens if both scripts want to modify a variable at the same time).

As Akshay Apte said in his answer, the best part about using pipes/queues is that you can pass Python objects through them.

Also, there are methods to avoid waiting for data if none has been passed yet (queue.empty() and pipeConn.poll()).

See an example using Queue() below:

    # main.py
    from multiprocessing import Process, Queue
    from stage1 import Stage1
    from stage2 import Stage2


    s1= Stage1()
    s2= Stage2()

    # S1 to S2 communication
    queueS1 = Queue()  # s1.stage1() writes to queueS1

    # S2 to S1 communication
    queueS2 = Queue()  # s2.stage2() writes to queueS2

    # start s2 as another process
    s2 = Process(target=s2.stage2, args=(queueS1, queueS2))
    s2.daemon = True
    s2.start()     # Launch the stage2 process

    s1.stage1(queueS1, queueS2) # start sending stuff from s1 to s2 
    s2.join() # wait till s2 daemon finishes

    # stage1.py
    import time
    import random

    class Stage1:

      def stage1(self, queueS1, queueS2):
        print("stage1")
        lala = []
        lis = [1, 2, 3, 4, 5]
        for i in range(len(lis)):
          # to avoid unnecessary waiting
          if not queueS2.empty():
            msg = queueS2.get()    # get msg from s2
            print("! ! ! stage1 RECEIVED from s2:", msg)
            lala = [6, 7, 8] # now that a msg was received, further msgs will be different
          time.sleep(1) # work
          random.shuffle(lis)
          queueS1.put(lis + lala)             
        queueS1.put('s1 is DONE')

    # stage2.py
    import time

    class Stage2:

      def stage2(self, queueS1, queueS2):
        print("stage2")
        while True:
            msg = queueS1.get()    # wait till there is a msg from s1
            print("- - - stage2 RECEIVED from s1:", msg)
            if msg == 's1 is DONE':
                break # ends loop
            time.sleep(1) # work
            queueS2.put("update lists")             

EDIT: I just found that you can use queue.get(False) to avoid blocking when receiving data. This way there's no need to check first whether the queue is empty. This is not possible if you use pipes.
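For reference, a non-blocking queue.get(False) (equivalent to get_nowait()) raises the queue.Empty exception when nothing is waiting, so the call has to be wrapped in try/except. A minimal sketch, reusing queueS2 from the example above:

import queue  # only needed for the Empty exception

try:
    msg = queueS2.get(False)   # same as queueS2.get_nowait()
    print("stage1 RECEIVED from s2:", msg)
except queue.Empty:
    pass  # nothing has arrived yet; keep doing other work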

You could use the pickle module to pass data between two Python programs.

import pickle 

def storeData(): 
    # initializing data to be stored in db 
    employee1 = {'key': 'Engineer', 'name': 'Harrison',
                 'age': 21, 'pay': 40000}
    employee2 = {'key': 'LeadDeveloper', 'name': 'Jack',
                 'age': 50, 'pay': 50000}

    # database 
    db = {} 
    db['employee1'] = employee1 
    db['employee2'] = employee2 

    # It's important to use binary mode
    dbfile = open('examplePickle', 'ab') 

    # source, destination 
    pickle.dump(db, dbfile)                   
    dbfile.close() 

def loadData(): 
    # binary mode is also important for reading
    dbfile = open('examplePickle', 'rb')      
    db = pickle.load(dbfile) 
    for keys in db: 
        print(keys, '=>', db[keys]) 
    dbfile.close() 
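In the question's setup, the gathering script would call storeData() whenever its arrays change, and the other script would call loadData() to read them back. Note that this shares the concurrent read/write caveat raised in the question. A minimal usage sketch, where pickle_example is a hypothetical module name for the code above:

# writer script
from pickle_example import storeData   # 'pickle_example' is a hypothetical module name
storeData()

# reader script
from pickle_example import loadData
loadData()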

This will pass data to and from two running scripts using a TCP socket, via ZeroMQ: https://zeromq.org/languages/python/. The required module is pyzmq (pip install pyzmq).

This is called client-server communication. The server waits for the client to send a request, and the client will not get anywhere if the server is not running. This also lets you send a request from one device (the client) to another (the server), as long as both are on the same network: the server binds to all local interfaces (marked with *), and on the client you replace localhost with the server's actual IP address. (To find it, go into the server device's network settings and look at the adapter's properties for its IP address; note this may differ from the public IP a web search reports.)

QUESTION to OP: does script B have to be running all the time, or could script B be imported as a module into script A? If so, look up how to write Python modules.
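The answer above doesn't include code, but a minimal ZeroMQ request/reply pair might look like the following. This is a sketch only; the port number and message strings are placeholders.

server.py:

# server.py -- run this alongside (or inside) the data-gathering script
import zmq

context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://*:5555")          # '*' = all local interfaces; 5555 is an arbitrary port

while True:
    request = socket.recv_string()       # block until a client asks
    print("received request:", request)
    socket.send_string("latest data")    # reply with whatever live data you have

client.py:

# client.py -- the other script requests the data whenever it needs it
import zmq

context = zmq.Context()
socket = context.socket(zmq.REQ)
socket.connect("tcp://localhost:5555")   # replace localhost with the server's IP on another device

socket.send_string("give me the data")
print(socket.recv_string())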


The answers/resolutions are collected from Stack Overflow and are licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.