Control a separate process in Python using the multiprocessing module


I am wondering about the easiest way to control a process started from the main function in Python.

For example, in the main loop I call a function in a separate process whose purpose is to collect data into a buffer. Whenever I want, I indicate that the process should stop collecting data and store it in a text file. Then, when it completes writing the file, it should wait for a signal (coming from main) before it starts the same loop again, that is, collecting new data into the buffer. The process should repeat indefinitely, though it would be awesome if I had the ability to stop the process until I want new data. I tried using multiprocessing.Event(), but for some reason when I call event.set() or event.clear() the message isn't received in time and the data formatting gets screwed up.

Example:

def separateprocess():
    datbuffer = []
    while True:
        datbuffer.append(collectdata(sample))
        if signal.recv == 'timetowritetofile':
            # Write the data buffer to a file.
            while True:
                if signal.recv == 'newdata':
                    # The signal to begin recording new data has been received.
                    datbuffer = []  # Clear the buffer for new data.
                    break
        else:
            # Continue recording data.
            pass

def main():
    # This code does stuff regarding the experiment.
    p = mp.Process(target=separateprocess)
    p.start()

    # Based on a particular event, send the signal when needed.
    if experiment == 'success':
        sendtoproc('timetowritetofile')  # Theoretical signal to the other process.
        sleep(10)  # Wait X seconds before recording new data.
        sendtoproc('newdata')

I can provide a code sample of my failed attempt at creating such a script if needed. I wish to know a method to achieve what I described above; it would be awesome if the method worked using global variables as signals, but I know it can't, since a new process does not share global state...

That's all.
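For reference, a minimal sketch of the flush/resume cycle described above, using two multiprocessing.Event flags rather than string signals. Here collect_data() is a hypothetical stand-in for the real sampling call, and the flushed buffer is sent back over a Queue instead of being written to a text file; the names are illustrative, not from the original code.

```python
import multiprocessing

def collect_data():
    return 1  # Placeholder for one real sample.

def separate_process(flush_evt, resume_evt, stop_evt, out_q):
    buffer = []
    while not stop_evt.is_set():
        buffer.append(collect_data())
        if flush_evt.is_set():
            flush_evt.clear()
            out_q.put(buffer)   # Stand-in for writing the buffer to a file.
            buffer = []
            resume_evt.wait()   # Block until main signals "new data".
            resume_evt.clear()

def run_once():
    # The fork start method (POSIX-only) keeps this sketch simple.
    ctx = multiprocessing.get_context("fork")
    flush_evt, resume_evt, stop_evt = ctx.Event(), ctx.Event(), ctx.Event()
    out_q = ctx.Queue()
    p = ctx.Process(target=separate_process,
                    args=(flush_evt, resume_evt, stop_evt, out_q))
    p.start()
    flush_evt.set()        # "Time to write to file."
    data = out_q.get()     # Receive the flushed buffer.
    resume_evt.set()       # "New data": the worker resumes collecting.
    stop_evt.set()         # Shut the worker down.
    p.join()
    return data

if __name__ == "__main__":
    print(len(run_once()))
```

The key difference from the failed attempt is that the worker, not the parent, clears each flag after reacting to it, so a set() can never be missed between iterations.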

Your code looks pretty good. I suggest creating a Queue in the parent process and sending it to the worker to feed it data. When the parent process wants the worker to die, it sends None.

Source

import multiprocessing

def myproc(arg):
    return arg * 2

def worker(inqueue):
    for num in iter(inqueue.get, None):
        print(myproc(num))

inq = multiprocessing.Queue()
# Prefill three jobs.
for num in range(3):
    inq.put(num)
# Signal the end of jobs.
inq.put(None)

worker_p = multiprocessing.Process(
    target=worker, args=(inq,),
)
worker_p.start()
worker_p.join()

Output

0
2
4
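The same sentinel pattern can be adapted to the original question: the parent drives the worker with commands over a control queue, and None shuts it down. This is only a sketch; collect_data() is a hypothetical stand-in for the real sampling call, and the "flush" branch sends the buffer back over a second queue instead of writing a file.

```python
import multiprocessing

def collect_data():
    return 1  # Placeholder for one real sample.

def worker(cmd_q, out_q):
    buffer = []
    for cmd in iter(cmd_q.get, None):   # None is the shutdown sentinel.
        if cmd == "collect":
            buffer.append(collect_data())
        elif cmd == "flush":
            out_q.put(buffer)           # Stand-in for writing the buffer to a file.
            buffer = []

def run_demo():
    # The fork start method (POSIX-only) keeps this sketch simple.
    ctx = multiprocessing.get_context("fork")
    cmd_q, out_q = ctx.Queue(), ctx.Queue()
    p = ctx.Process(target=worker, args=(cmd_q, out_q))
    p.start()
    for _ in range(3):
        cmd_q.put("collect")
    cmd_q.put("flush")
    result = out_q.get()
    cmd_q.put(None)                     # Shut the worker down.
    p.join()
    return result

if __name__ == "__main__":
    print(run_demo())   # -> [1, 1, 1]
```

Because every interaction goes through the queue in FIFO order, the timing problems the question describes with event.set()/event.clear() cannot occur: each command is processed exactly once, in order.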
