Using external lookup in Logstash


I'm working on two Logstash projects: one monitoring IIS logs, and one for a firewall.

The IIS logs come from high-usage servers generating 25 GB of logs each month, and there are several of these servers. The issue here is that I don't want to enable reverse DNS lookup, neither on the servers nor in Logstash; I'd rather use an external service that can cache lookups outside of the dns{} lookup function in Logstash.

The other problem I want to solve is in the firewall project and relates to lookup of standard and non-standard ports. Our firewall logs the destination port number, which I'd like to translate into a service name to make our Kibana dashboards more readable. The firewall handles around 10 Gb/s of traffic and generates a lot of syslog traffic.

We run 8-16 workers on our Logstash server. Is there an easy(?) way to make an API call from Logstash, and is it worth considering performance-wise?

Another option I'm considering is "offline" batch processing, i.e. running batch jobs directly against Elasticsearch, which would likely mean having a separate instance of Elasticsearch or Redis in front of the frontend.

The best option would likely be translation in the Kibana interface via a scripted field, but as far as I understand that won't work for these use cases?

The dns{} filter uses the local machine's resolution, so you couldn't integrate a non-DNS cache without writing a new filter or dropping down to ruby{}.
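That said, the dns{} filter does have built-in hit/fail caching, which may already cover part of the caching concern without an external service. A minimal sketch, assuming the event has a src_ip field (the field name and cache sizes are assumptions):

    filter {
      dns {
        reverse => [ "src_ip" ]       # field(s) to reverse-resolve; assumed name
        action  => "replace"          # overwrite the IP with the resolved hostname
        hit_cache_size    => 10000    # cache successful lookups in-process
        hit_cache_ttl     => 300
        failed_cache_size => 1000     # cache failures too, to avoid repeated misses
        failed_cache_ttl  => 60
      }
    }

This still resolves via the machine's configured resolvers, so it doesn't avoid DNS entirely; it only reduces how often Logstash hits them.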

Depending on the number of values you have, you could publish them to a file and use translate{}; I'd recommend this for private-network lookups.
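For example, a minimal sketch using a YAML dictionary file (the path and field names are assumptions, and option names vary between plugin versions; newer releases of the translate filter use source/target instead of field/destination):

    filter {
      translate {
        field           => "src_ip"                      # lookup key; assumed field name
        destination     => "src_hostname"                # where the translated value goes
        dictionary_path => "/etc/logstash/hostnames.yml" # hypothetical path; ip -> hostname map
        fallback        => "unknown"                     # value used when no match is found
      }
    }

The dictionary file would just be flat YAML key/value pairs, e.g. "10.0.0.5": "web01.example.local", regenerated by whatever job exports your private-network data.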

If the DNS data is already in Elasticsearch, you can query it during filtering and add fields to your events that way.
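A sketch using the elasticsearch filter plugin, assuming a hypothetical dns-cache index whose documents have ip and hostname fields (index and field names are assumptions):

    filter {
      elasticsearch {
        hosts  => ["localhost:9200"]
        index  => "dns-cache"                      # hypothetical index of ip -> hostname docs
        query  => "ip:%{[src_ip]}"                 # look up the event's source IP
        fields => { "hostname" => "src_hostname" } # copy hostname from the hit into the event
      }
    }

Keep in mind this issues one query per event, so at your log volumes you'd want to benchmark it against the file-based translate{} approach.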

For the firewall port problem, you didn't give an example of the original and desired values, but again, check out translate{} or drop to ruby{}.
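As a sketch, assuming the event has a dest_port field and you want a human-readable service name (field names and the port mappings are assumptions, not from the question):

    filter {
      translate {
        field       => "dest_port"
        destination => "dest_service"
        dictionary  => {
          "80"   => "http"
          "443"  => "https"
          "3389" => "rdp"
          "8443" => "https-alt"        # non-standard ports map the same way
        }
        fallback => "port-%{dest_port}" # keep the raw port when there is no mapping
      }
    }

Since the mapping is static and small, an inline dictionary like this avoids per-event lookups entirely, which matters at 10 Gb/s of firewall traffic.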

