Thanks a lot for this series of videos, it was very helpful! I'm looking for a way of classifying my logs, but I don't know if using pipelines and grok extractors that way is the best practice... Could you help me decide? I want to have two groups: one with the well-known logs (I can identify them by their syntax, using regex for example), and a second with the unknown logs that need to be examined manually every day. First, I thought of classifying them by source, using a stream with a rule for each source, and then using grok or regular expressions in pipeline rules to set a boolean field that indicates whether the log is known or unknown. But creating a grok pattern and then a pipeline rule doesn't seem to be the most straightforward way. Do you have any other ideas? Thank you in advance!
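Something like this is what I have in mind as the pipeline rule (just a rough sketch; the regex and the log_known field name are placeholders I made up, not anything built in):

    rule "flag known log lines"
    when
      // regex() returns a match result; .matches is true when the pattern hits.
      // The pattern here is only an illustration, substitute a real syntax check.
      regex("^<\\d+>", to_string($message.message)).matches == true
    then
      // "log_known" is a made-up field name for the known/unknown flag
      set_field("log_known", true);
    end

Messages where no rule fires would keep log_known unset, and those would be the ones to review manually.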
Felipe Silveira Most probably you want to parse the unknown logs. The way I was doing it was to keep all the logs in one stream, e.g. firewall logs, and then, to see the logs that weren't parsed, I would search for the messages that are missing the destination IP field over the last 24 hours. I hope this answers your question.
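For example, running something like this over the firewall stream with the time range set to the last 24 hours (destination_ip stands for whatever field your parsing rule extracts; adjust the name):

    NOT _exists_:destination_ip

Everything that comes back is a message the parser didn't handle.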
@BitsByteHard Thank you, that sort of answers my question, but I still feel like this isn't the ideal solution, more like a workaround... It's neither elegant nor accurate to search for a group of logs based on the presence or absence of a parsed field. Do you know what I mean? Thanks for responding so fast! It's great, because I need to finish this Graylog task by the end of the week :)
Also, I'm struggling to search logs with the regular expression syntax. I write the regex /between slashes/, but I can only compose one term; every time there's a space or any other character between words, it doesn't work... Do you plan on making a video about searching with regex in Graylog 3?
For search syntax please refer to these two links:
docs.graylog.org/en/3.3/pages/searching/query_language.html
www.elastic.co/guide/en/elasticsearch/guide/2.x/_wildcard_and_regexp_queries.html
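The short answer to why spaces break it: regexp queries run against the individual terms in the index, and an analyzed field like message is split into one term per word, so a pattern can never span a space. For example (illustrative patterns only):

    message:/refus(ed|ing)/      works, "refused" is indexed as a single term
    message:/connection failed/  fails, "connection" and "failed" are stored as
                                 two separate terms and the regex is tested
                                 against one term at a time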
Hello, are there any examples of how to apply pipelines and extractors for Postfix? Thank you.
Try looking on the Graylog Marketplace.
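Until you find one there, a grok-based pipeline rule for Postfix could look roughly like this (a sketch only; the pattern and the postfix_* field names are illustrative, and it assumes the stock grok patterns such as WORD and GREEDYDATA are loaded):

    rule "parse postfix line"
    when
      // only attempt the parse when the line looks like postfix
      contains(to_string($message.message), "postfix/")
    then
      // illustrative pattern for a line such as:
      //   postfix/smtpd[1234]: connect from unknown[203.0.113.7]
      let parsed = grok(
        pattern: "postfix/%{WORD:postfix_component}\\[%{POSINT:postfix_pid}\\]: %{GREEDYDATA:postfix_detail}",
        value: to_string($message.message),
        only_named_captures: true
      );
      set_fields(parsed);
    end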