tail -f does not seem to work in the shell when the file is being populated through file.write()

https://www.devze.com 2023-02-03 03:30 (source: web)
I am trying to daemonize a python script that currently runs in the foreground. However, I still need to be able to see its output which it currently dumps to stdout.

So I am using the following piece of code, which generates a uniquely named file in /tmp and then assigns sys.stdout to that new file. All subsequent print calls are then redirected to this log file.

import sys
import uuid

# Create a uniquely named log file in /tmp and redirect stdout to it
outfile = open('/tmp/outfile-' + str(uuid.uuid4()), 'w')
outfile.write("Log file for daemon script...\n")
sys.stdout = outfile

# Rest of script uses print statements to dump information into the /tmp file
# ...

The problem I am facing is that when I tail -f the file created in /tmp, I don't see any output. Only after I kill the daemon process does the output appear in the /tmp logfile, because Python flushes its buffers when the process exits.

I want to monitor the /tmp log file in real time, so it would be great if the output could somehow be made visible as it is written.
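For reference, one common way to get output to appear as it is written is line buffering. This is a minimal sketch of the redirection described above, assuming Python 3, where `buffering=1` flushes the file on every completed line:

```python
import sys
import uuid

# Sketch of a line-buffered fix (Python 3): buffering=1 makes each
# completed line reach the file immediately, so `tail -f` can see it.
path = '/tmp/outfile-' + str(uuid.uuid4())   # same naming scheme as above
outfile = open(path, 'w', buffering=1)       # line-buffered text mode
sys.stdout = outfile

print("Log file for daemon script...")       # written out at the newline
sys.stdout.flush()                           # an explicit flush also works
```

Calling `sys.stdout.flush()` after each print achieves the same effect without changing how the file is opened.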

One solution that I have tried was using unbuffered I/O, but that didn't help either.


Try harder to use unbuffered I/O. The problem is almost certainly that your output is buffered.

Opening the file like this (Python 2) should work:

outfile = open(name, 'w', 0)
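Note that the `open(name, 'w', 0)` form is Python 2 only; in Python 3, `buffering=0` is rejected in text mode. A sketch of the two Python 3 equivalents, using a hypothetical log path:

```python
import uuid

path = '/tmp/outfile-' + str(uuid.uuid4())   # hypothetical log path

# Fully unbuffered output requires binary mode in Python 3
raw = open(path, 'wb', buffering=0)
raw.write(b"visible immediately\n")
raw.close()

# In text mode, line buffering is the closest substitute
log = open(path, 'w', buffering=1)
print("visible at each newline", file=log)
log.close()
```

Either variant makes the output show up under `tail -f` without waiting for the process to exit.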
