
I configured a log source with the wrong normalizer, so the usual field extractions from the LogPoint taxonomy weren’t available in the search interface. I have since fixed the normalizer configuration, but I don’t know how to apply the LogPoint taxonomy to older logs. Can someone help?

Changes to the normalization will not be reflected in older logs. If you look at the Logpoint architecture, you’ll see that normalization happens before logs move on to the storage and enrichment layers. Once logs have been stored, they are not sent back through the normalization layer again.

I thought I’d point out that in some cases the “norm on” one-to-one command and/or dynamic enrichment might come to the rescue. It probably won’t help if the log wasn’t normalised at all, but I’ve had cases where a field had been normalised incorrectly in the past, and I was able to use “norm on” to strip out some extra content retrospectively.

For example, I had a virus scanner writing values like Event::VirusFound into its event_type field, and a norm on event_type Event::<event_type:string> took care of that, even though the value had been stored incorrectly. Of course, for that to work, the data needs to be in the field to begin with.
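To sketch what that looks like end to end (field name and value taken from my example above; this is a search-time re-extraction only, the stored log is not changed, and the exact pattern syntax may vary by Logpoint version):

```
Stored (incorrect) value:    event_type = Event::VirusFound

Search-time fix, appended to your query:
    ... | norm on event_type Event::<event_type:string>

Result in the search interface: event_type = VirusFound
```

The <event_type:string> capture simply overwrites the field with whatever follows the literal Event:: prefix, so anything after the prefix is kept and the prefix itself is stripped.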

