Problem description
I'm having the following problem in Python.
I need to do some calculations in parallel, and their results need to be written to a file in order. So I created a function that receives a multiprocessing.Queue and a file handle, does the calculation, and prints the result to the file:
```python
import multiprocessing
from multiprocessing import Process, Queue
from mySimulation import doCalculation

# doCalculation(pars) is a function I must run for many different
# sets of parameters and collect the results in a file

def work(queue, fh):
    while True:
        try:
            parameter = queue.get(block = False)
            result = doCalculation(parameter)
            print >>fh, result
        except:
            break

if __name__ == "__main__":
    nthreads = multiprocessing.cpu_count()
    fh = open("foo", "w")
    workQueue = Queue()
    parList = [ ... ]  # list of conditions for which I want to run doCalculation()
    for x in parList:
        workQueue.put(x)
    processes = [Process(target = work, args = (workQueue, fh))
                 for i in range(nthreads)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
    fh.close()
```
But the file ends up empty after the script runs. I tried changing the work() function to:
```python
def work(queue, filename):
    while True:
        try:
            fh = open(filename, "a")
            parameter = queue.get(block = False)
            result = doCalculation(parameter)
            print >>fh, result
            fh.close()
        except:
            break
```
and passing the filename as a parameter. Then it works as I intended. When I try to do the same thing sequentially, without multiprocessing, it also works normally.
Why didn't it work in the first version? I can't see the problem.
Also: can I guarantee that two processes won't try to write to the file simultaneously?
Thanks. I got it now. This is the working version:
```python
import multiprocessing
from multiprocessing import Process, Queue
from time import sleep
from random import uniform

def doCalculation(par):
    t = uniform(0, 2)
    sleep(t)
    return par * par  # just to simulate some calculation

def feed(queue, parlist):
    for par in parlist:
        queue.put(par)

def calc(queueIn, queueOut):
    while True:
        try:
            par = queueIn.get(block = False)
            print "dealing with ", par, ""
            res = doCalculation(par)
            queueOut.put((par, res))
        except:
            break

def write(queue, fname):
    fhandle = open(fname, "w")
    while True:
        try:
            par, res = queue.get(block = False)
            print >>fhandle, par, res
        except:
            break
    fhandle.close()

if __name__ == "__main__":
    nthreads = multiprocessing.cpu_count()
    fname = "foo"
    workerQueue = Queue()
    writerQueue = Queue()
    parlist = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    feedProc = Process(target = feed, args = (workerQueue, parlist))
    calcProc = [Process(target = calc, args = (workerQueue, writerQueue))
                for i in range(nthreads)]
    writProc = Process(target = write, args = (writerQueue, fname))

    feedProc.start()
    for p in calcProc:
        p.start()
    writProc.start()

    feedProc.join()
    for p in calcProc:
        p.join()
    writProc.join()
```
Answer
You really should use two queues and three separate kinds of processing.
Put stuff into Queue #1.
Get stuff out of Queue #1 and do calculations, putting stuff in Queue #2. You can have many of these, since they get from one queue and put into another queue safely.
Get stuff out of Queue #2 and write it to a file. You must have exactly 1 of these and no more. It "owns" the file, guarantees atomic access, and absolutely assures that the file is written cleanly and consistently.