Re-parse an event for normalization (JSON-event)

  • 1 September 2023
  • 6 replies

Hi!

Just an interesting question. I know that other SIEM vendors have problems with this. Maybe LogPoint has a good function for it.


So I received a JSON event that didn't normalise because no normalization package was enabled. I enabled one after I received the event.

So, to my question: is it possible to parse this event afterwards so that it gets normalized? Or do I have to wait for another event from the same log source to see if it gets normalized?


Userlevel 4
Badge +7

Logpoint normalises during ingestion - so once an event has been ingested and not normalised, it will stay that way. There are in-line process commands you can use during a search (such as norm, norm on, regex etc.) to process logs “on-the-fly” after the fact if need be, but that’s not reapplying the normaliser.

One good approach could be to use the universal normaliser - it can process JSON events “out of the box”, but can then further process/rename etc. the field names that JSON provides. There is some GUI functionality to copy/paste an example message to see how it gets processed (just like we have for the traditional regex-based normalisers, which are pretty useless for structured formats like JSON), and that might get you closer to a working normalisation before the next message arrives - but that is ultimately the test.

For something super complicated we have our internal “Logfaker” plugin that could be used to “inject” messages into a device from a simple text file with example data, in which case you wouldn’t need to wait for that exact event to occur again before testing the new normalisation - Support could probably make that available if need be. But hopefully that won’t be necessary.

Hi again

Thanks for the fast reply. I see.

I’ve heard about the Universal Normalizer before. For the moment I’m only using the compiled and normalization packages that are built in.

For these JSON logs I only use the JSONCompiled Normalizer. Would you recommend using the Universal Normalizer instead? Then I’d have to download that .pak file and so on.






The Universal Normaliser should ship with 7.2.1 onwards, and that is the minimum required version. If you are working with JSON data then it can be a useful tool, specifically in cases where you’re not happy with the actual JSON data - for example, JSON encodes the field names, and the regular JSON normaliser just takes them as-is - so if there is a JSON field called sender_ip, then that’s what the field name will be in Logpoint. Of course you might want that called source_address to keep with the Logpoint taxonomy, and the Universal Normaliser can do just that - process the JSON and then transform things as required. It can also further parse the fields - for example, if you get an access_path field, you could split that into a drive and a directory field etc.
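To illustrate what that transformation does: the Universal Normaliser is configured in the GUI, so this is just a minimal Python sketch of the kind of processing described above - the sender_ip rename and the access_path split. The drive/directory field names and the backslash-separated path format are assumptions for the example:

```python
import json

# Rename map: JSON field name -> Logpoint taxonomy name (example from the text)
RENAMES = {"sender_ip": "source_address"}

def normalise(raw: str) -> dict:
    event = json.loads(raw)  # the plain JSON normaliser stops here: fields as-is
    fields = {RENAMES.get(k, k): v for k, v in event.items()}
    # Further parsing: split a hypothetical access_path like "C:\logs\app"
    # into a drive field and a directory field.
    if "access_path" in fields:
        drive, _, directory = fields.pop("access_path").partition("\\")
        fields["drive"] = drive
        fields["directory"] = directory
    return fields

print(normalise('{"sender_ip": "10.0.0.1", "access_path": "C:\\\\logs\\\\app"}'))
# -> {'source_address': '10.0.0.1', 'drive': 'C:', 'directory': 'logs\\app'}
```

The first line of the function is all the regular JSON normaliser gives you; the rename and the split are the extra steps the Universal Normaliser adds on top.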

Of course that all depends on the JSON data to begin with - it still needs to be valid JSON to work, and if you’re not particularly concerned with the structure of the data then the regular JSON normaliser should work too. You should theoretically never have a “JSON event that did not normalise” once you have a normalisation policy with either the regular JSON normaliser or with the universal normaliser. If it’s valid JSON, they should both normalise it. The difference is just in the how and to what.
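That “valid JSON” requirement is easy to check ahead of time. This is not a Logpoint feature, just a standard-library sketch that mirrors what both normalisers need from the message:

```python
import json

def is_valid_json(raw: str) -> bool:
    """Return True if the message would parse as JSON at all."""
    try:
        json.loads(raw)
        return True
    except json.JSONDecodeError:
        return False

print(is_valid_json('{"sender_ip": "10.0.0.1"}'))  # -> True
print(is_valid_json('{sender_ip: 10.0.0.1}'))      # -> False (unquoted keys)
```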

All right! Not running that version. But I will look into the “Universal Normalizer”, it would be good. I’m gonna work with some JSON logs this autumn.


I was actually impressed and got a good result with the JSONCompiled Normalizer - all the relevant data was normalised in a good way.

The JSON logs from this particular source are quite simple and only contain a few attributes.

What are your thoughts on normalization packages? What’s best practice? Is it fine to combine a log source that only holds JSON logs into a Windows normalization package that holds all the different Windows-related compiled normalizers and normalization packages?

Or should I create a separate normalization package that only holds the JSONCompiled Normalizer for the log source that sends JSON events?




If you’re working with JSON logs, all you can do is change the order of the compiled normalisers in the normalisation policy. Custom Normalisation PACKAGES can only be created as non-compiled normalisers, i.e. Regex based. You might mean combine them into a normalisation POLICY instead of PACKAGE - that’s fine, just make sure that you put the normalisers that will be most commonly used at the top. That means that in 90% of cases Logpoint will never have to try the others, and will only try them for what slips through.
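The ordering logic can be sketched as a simple first-match loop. This is an illustration of the principle, not Logpoint internals, and the two normalisers here are toy placeholders:

```python
import json

def json_norm(raw):
    """Toy stand-in for the compiled JSON normaliser: takes fields as-is."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return None

def windows_norm(raw):
    """Toy stand-in for a Windows normaliser (placeholder match rule)."""
    return {"msg": raw} if raw.startswith("EventID") else None

def apply_policy(normalisers, raw):
    """The first normaliser that returns fields wins; later ones never run."""
    for norm in normalisers:
        fields = norm(raw)
        if fields is not None:
            return fields
    return None  # nothing matched: the event stays unnormalised

# Most common source first, so in most cases the rest are skipped.
policy = [json_norm, windows_norm]
```

With the common normaliser at the top of the list, the rarer ones only run for whatever slips through - which is exactly the ordering advice above.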

But if your device will only ever send your custom logs, then it’s probably best to just have a different normalisation policy just for that.

Thanks Nils for your answers.