Get Apache logs as CSV files


Is there a way to save all Apache logs as CSV files?

access.log->access_log.csv
error.log->error_log.csv
3 Answers

3 votes

You can define a custom log format so that Apache writes its logs as comma-separated values directly.

You may have to fiddle with it for a while to get it right. For example, you may want to wrap each field in " or ' quotes so that commas inside field values don't break the CSV.
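As a minimal sketch, assuming a Debian/Ubuntu-style Apache layout (a2enconf, /etc/apache2/conf-available, ${APACHE_LOG_DIR}); the config name csv-log.conf and the chosen fields are only examples, not anything Apache requires:

# Write a comma-separated copy of the access log alongside the regular one.
# Paths and tool names assume Debian/Ubuntu; adjust for other distributions.
cat <<'EOF' | sudo tee /etc/apache2/conf-available/csv-log.conf
LogFormat "\"%h\",\"%l\",\"%u\",\"%t\",\"%r\",\"%>s\",\"%b\",\"%{Referer}i\",\"%{User-Agent}i\"" csv
CustomLog ${APACHE_LOG_DIR}/access_log.csv csv
EOF
sudo a2enconf csv-log && sudo systemctl reload apache2

For error.log the situation is different: Apache 2.4 has a separate ErrorLogFormat directive with its own format specifiers, so the error log usually needs its own format or post-processing.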


3 votes

If you come to this question because you want to look at log files that were written in the past, or at log files from an Apache server whose configuration files you cannot access, or because for some other reason you don't want to change the log file format:

I wrote a Linux shell sed script that converts the default Apache log files into a format that LibreOffice Calc can read:

#!/bin/bash

# Reformat Apache's access logs so that they can be interpreted as CSV files,
# with space as the column delimiter and double quotes binding together values
# that contain spaces but represent a single column.

# 1)  Add a double quote at the beginning of the line. The first column is the IP address.
#     IP addresses that have 3 digits in every group but the first could otherwise be
#     interpreted as numbers with the dots marking groups of thousands.

# 2a) End the IP address with a closing quote.
# 2b) Surround the second column (unknown to me, always just "-") and the
#     third column, which is the username, with quotes.
# 2c) Reformat the date from "[09/Jul/2012:11:17:47" to "09.Jul 2012 11:17:47".

# 3)  Remove the string "+0200]" (replace it with a double quote to end the date column).

# 4)  The string that contains the command (5th column) sometimes contains a string representation
#     of binary rubbish. That's no problem as long as it does not contain a double quote, which
#     would mess up the column zoning. According to my web searches, CSV fields are allowed to
#     contain double quotes if they are escaped with a backslash. Although this is the case with
#     these problematic strings, LibreOffice does not accept it that way. Therefore we escape every
#     double quote with another double quote, which is the other valid option according to the CSV
#     specification, and LibreOffice does accept that one. More technically: we replace every double
#     quote that has neither a space nor another double quote before it, nor after it, with two
#     double quotes.

sed \
-e 's/^/"/' \
-e 's/ \([^ ]\{1,\}\) \([^ ]\{1,\}\) \[\([0-9]\{1,2\}\)\/\([a-zA-Z]\{1,3\}\)\/\([0-9]\{1,4\}\):/" "\1" "\2" "\3.\4 \5 /' \
-e 's/ +0200\] /" /' \
-e 's/\([^" ]\)"\([^" ]\)/\1""\2/g'
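
Since the script is just a sed pipeline, it reads from standard input; assuming you save it as, say, apache2csv.sh (the file name is only an example), usage would look roughly like this:

chmod +x apache2csv.sh
./apache2csv.sh < access.log > access_log.csv

Note that step 3 hard-codes the +0200 timezone offset; if your logs were written with a different offset, adjust that sed expression accordingly.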

3 votes

This is really just a modification of @kaefert's answer. I'm sure there is a cleaner way to do this, but it works well.

alias aplogcsv="sed -e 's/^/\"/' \
                -e 's/:\([0-9]\{1,3\}\.\)\([0-9]\{1,3\}\.\)\([0-9]\{1,3\}\.\)\([0-9]\{1,3\}\)/\",\"\1\2\3\4/' \
                -e 's/ \([^ ]\{1,\}\) \([^ ]\{1,\}\) \[\([0-9]\{1,2\}\)\/\([a-zA-Z]\{1,3\}\)\/\([0-9]\{1,4\}\):/\",\"\1\" \"\2\" \"\3 \4 \5\",\" /' \
                -e 's/ \([0-9]\{1,2\}\):\([0-9]\{1,2\}\):\([0-9]\{1,2\}\)/\1:\2:\3/' \
                -e 's/ $(date +%z)\] /\",/' \
                -e 's/\"GET /\"GET\",\"/g' \
                -e 's/\"POST /\"POST\",\"/g' \
                -e 's/ HTTP\/1.1\" \([0-9]\{1,3\}\) \([0-9]\{1,4\}\) /\",\"HTTP\/1.1\",\1,\2,/' \
                -e 's/\"-\" //g'"

Then I use it like this:

aplogcsv access.log > ~/access.log.csv

But it is just as easy to use it like this:

grep "25/Jan/2019" access.log | aplogcsv > ~/20190125.access.log.csv