How do you share log files under Windows?

I have several different processes and I would like them all to log to the same file. These processes run on Windows 7. Some are Python scripts and others are cmd batch files.

Under Unix, you can just have every process open the file in append mode and write away. As long as each process writes less than PIPE_BUF bytes in a single message, each write call is guaranteed not to interleave with any other.
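The Unix approach amounts to an append-mode open plus one write() per message; a minimal Python sketch (the file name and message are illustrative):

```python
import os

# Open (or create) the shared log in append mode. With O_APPEND the kernel
# repositions to end-of-file atomically on every write, so short messages
# from separate processes do not overwrite one another.
fd = os.open("shared.log", os.O_WRONLY | os.O_CREAT | os.O_APPEND, 0o644)

# Emit each message with a single write() call, and keep it short
# (under PIPE_BUF bytes, per the guarantee described above).
os.write(fd, b"proc A: hello\n")
os.close(fd)
```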

Is there a way to do this under Windows? The naive Unix-like approach fails because, by default, Windows does not like multiple processes having the same file open for writing at once.

Answers

It is possible to have multiple batch processes safely write to a single log file. I know nothing about Python, but I imagine the concepts in this answer could be integrated with Python.

Windows allows at most one process to have a particular file open for write access at any point in time. This can be used to implement a file-based locking mechanism that guarantees events are serialized across multiple processes. See https://stackoverflow.com/a/9048097/1012053 and http://www.dostips.com/forum/viewtopic.php?p=12454 for some examples.
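The same file-based-lock idea can be sketched in Python with a separate lock file: os.O_EXCL makes creation fail if the file already exists, so whichever process manages to create it holds the lock. The file names and timeout below are illustrative, not part of the original answer:

```python
import os
import time

LOCK_PATH = "myLog.lock"  # illustrative lock-file name

def locked_append(log_path, message, timeout=10.0):
    # Spin until we create the lock file; O_CREAT | O_EXCL raises
    # FileExistsError while another process holds the lock.
    deadline = time.monotonic() + timeout
    while True:
        try:
            lock_fd = os.open(LOCK_PATH, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            break
        except FileExistsError:
            if time.monotonic() > deadline:
                raise TimeoutError("could not acquire the log lock")
            time.sleep(0.01)
    try:
        with open(log_path, "a") as log:
            log.write(message + "\n")
    finally:
        os.close(lock_fd)
        os.unlink(LOCK_PATH)  # release the lock
```

One caveat with this sketch: a process that crashes while holding the lock leaves the lock file behind, so production code would want a staleness check on top.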

Since all you are trying to do is write to a log, you can use the log file itself as the lock. The log operation is encapsulated in a subroutine that attempts to open the log file in append mode. If the open fails, the routine loops back and tries again. Once the open succeeds, the log is written and then closed, and the routine returns to the caller. The routine executes whatever command is passed to it, and anything written to stdout within the routine is redirected to the log.
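That retry-until-the-open-succeeds pattern translates to Python roughly as follows. A hedge: this assumes the contended open surfaces as PermissionError, which is what Python typically raises on Windows when another process holds the file; Python's own open() shares files more liberally than cmd's >> redirection does, so whether the error ever occurs depends on how the other writers opened the file.

```python
import time

def append_with_retry(write_once, attempts=500, delay=0.002):
    """Keep calling write_once() until it stops raising PermissionError,
    mirroring the batch :log routine that loops back when redirection fails."""
    for _ in range(attempts):
        try:
            return write_once()
        except PermissionError:
            time.sleep(delay)  # brief back-off before retrying
    raise OSError("log file never became available")

# Example writer: append one message to the shared log.
def write_hello():
    with open("myLog.log", "a") as f:
        f.write("hello\n")
```

Usage would be `append_with_retry(write_hello)` at every log call site.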

Here is a test batch script that creates 5 child processes, each of which writes to the log file 20 times. The writes are safely interleaved.

    @echo off
    setlocal
    if "%~1" neq "" goto :test

    :: Initialize
    set log="myLog.log"
    2>nul del %log%
    2>nul del "test*.marker"
    set procCount=5
    set testCount=10

    :: Launch %procCount% processes that write to the same log
    for /l %%n in (1 1 %procCount%) do start "" /b "%~f0" %%n

    :wait for child processes to finish
    2>nul dir /b "test*.marker" | find /c "test" | >nul findstr /x "%procCount%" || goto :wait

    :: Verify log results
    for /l %%n in (1 1 %procCount%) do (
      <nul set /p "=Proc %%n log count = "
      find /c "Proc %%n: " <%log%
    )

    :: Cleanup
    del "test*.marker"
    exit /b

    ==============================================================================
    :: code below is the process that writes to the log file

    :test
    set instance=%1
    for /l %%n in (1 1 %testCount%) do (
      call :log echo Proc %instance% says hello!
      call :log dir "%~f0"
    )
    echo done >"test%1.marker"
    exit

    :log command args...
    2>nul (
      >>%log% (
        echo ***********************************************************
        echo Proc %instance%: %date% %time% %*
        (call )  %= This odd syntax guarantees the inner block ends with success  =%
                 %= We only want to loop back and try again if redirection failed =%
      )
    ) || goto :log
    exit /b

Here is the output, demonstrating that all 20 writes succeeded for each process:

    Proc 1 log count = 20
    Proc 2 log count = 20
    Proc 3 log count = 20
    Proc 4 log count = 20
    Proc 5 log count = 20

You can open the resulting "myLog.log" file to see how the writes have been safely interleaved, but the output is too large to post here.

It is easy to demonstrate that simultaneous writes from multiple processes can fail, by modifying the :log routine so that it does not retry upon failure:

    :log command args...
    >>%log% (
      echo ***********************************************************
      echo Proc %instance%: %date% %time% %*
    )
    exit /b

Here are some sample results after "breaking" the :log routine:

    The process cannot access the file because it is being used by another process.
    The process cannot access the file because it is being used by another process.
    The process cannot access the file because it is being used by another process.
    ... (the same error repeated many more times) ...
    Proc 1 log count = 12
    Proc 2 log count = 16
    Proc 3 log count = 13
    Proc 4 log count = 18
    Proc 5 log count = 14

You can try this Python module: http://pypi.python.org/pypi/ConcurrentLogHandler

It provides a drop-in replacement for RotatingFileHandler which allows multiple processes to concurrently log to a single file without dropping or clobbering log events.

I haven't used it, but I found out about it while reading up on a related bug in Python (issue 4749).

If you implement your own code to do this instead of using that module, make sure you read up on the bug!

You can use output redirection on Windows just like you do in Bash. Pipe the batch file's output to a Python script that logs through ConcurrentLogHandler.
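A small forwarding script could look like the sketch below. The script name and handler choice are illustrative: it is shown with the stdlib RotatingFileHandler so it runs as-is; with ConcurrentLogHandler installed, you would import ConcurrentRotatingFileHandler from its cloghandler module instead.

```python
# log_pipe.py -- forward lines from a stream into a rotating log file.
# Command-line use would be:  mybatch.cmd | python log_pipe.py
# which calls forward_stream_to_log(sys.stdin).
import logging
from logging.handlers import RotatingFileHandler

def forward_stream_to_log(stream, logfile="myLog.log"):
    # Swap in ConcurrentRotatingFileHandler here for multi-process safety.
    handler = RotatingFileHandler(logfile, maxBytes=1_000_000, backupCount=3)
    logger = logging.getLogger("batchpipe")
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    try:
        for line in stream:
            logger.info(line.rstrip("\n"))  # one log record per input line
    finally:
        logger.removeHandler(handler)
        handler.close()
```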