Quite often when you are running a program, you would like to capture its output into a file for later analysis. I want to do this for a real-time clock accuracy monitoring project that I am in the process of running. Of course you can write to the file directly from your program, but there is a neat trick that lets you write to the standard output, capture it to a file, and monitor it on screen all at the same time.
First, run your program with the output redirected into a file:
sudo python -u myprog.py > proglog.txt
The > is the redirect, and Python's -u switch turns off output buffering. Without it you will not see the log file change for a while, because the OS buffers a significant amount of the output: when you write directly to the terminal the buffer is flushed on every line, but not when you redirect to a file. In the Debian Wheezy release used on the RPi the buffer is 4 KB, so if you redirect to a file without the -u switch, nothing appears until that much output has accumulated. This is a common problem and makes for a confusing time as the log file sits at 0 bytes while the buffer fills.
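As a quick way to see this, here is a minimal sketch using python3 -c as a stand-in for myprog.py (the script and the file name proglog.txt are just placeholders):

```shell
# With -u, each line reaches proglog.txt as soon as it is printed;
# without it, the lines could sit in the OS buffer until the program exits.
python3 -u -c '
import time
for i in range(3):
    print("sample", i)
    time.sleep(0.1)
' > proglog.txt
```

In Python 3 you can get the same effect from inside the program with print(..., flush=True), which saves remembering the switch on the command line.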
Then in another terminal window use:
tail -f proglog.txt
You will now see the last 10 lines of the file as it is written. As the file grows, 'tail' will 'follow' it continuously, so you get the benefit of both the console output and the log file. You can increase the number of lines shown initially with the -n switch followed by the number of lines you require.
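For example, with a sample log file (generated here with seq purely for illustration):

```shell
seq 1 100 > proglog.txt   # stand-in log file with 100 numbered lines
tail -n 5 proglog.txt     # print the last 5 lines instead of the default 10
# add -f to keep following the file: tail -n 50 -f proglog.txt (Ctrl-C to stop)
```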
Another way to do this is to 'tee' the output, so that the file is written while the output still appears in the same terminal.
sudo python -u myprog.py | tee proglog.txt
sudo python -u myprog.py 2>&1 | tee proglog.txt
The second version adds 2>&1, which redirects standard error into standard output, so that tee captures both streams to the file.
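A quick sketch of the difference, again using python3 -c as a stand-in program:

```shell
# 2>&1 sends stderr into the same pipe as stdout, so tee sees both;
# without it, the stderr line would reach the screen but not the file.
python3 -u -c 'import sys; print("to stdout"); print("to stderr", file=sys.stderr)' 2>&1 | tee proglog.txt
```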
These files are great for post-processing the data, for example by importing them into Excel. If you want to append lines to the existing file rather than recreate it each time, use >> instead of >.
sudo python -u myprog.py >> proglog.txt
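tee has an equivalent append switch, -a, if you are using the tee form. A small sketch of both (the echo lines stand in for program output):

```shell
echo "run 1" >  proglog.txt        # > truncates (or creates) the file
echo "run 2" >> proglog.txt        # >> appends to it
echo "run 3" | tee -a proglog.txt  # tee -a appends too, while echoing to screen
```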