Thanks for the update Kai :)
Hi,
For future reference: you can have a Normalizer extract the timestamp from the log and name it log_ts. That way the logs are indexed by the timestamp inherent in the log itself, rather than by the time of ingestion.
To accomplish your task at search time, you would need to use a process command, such as:
| process eval("searchtime_ts=strptime('20210905|231304|', 'yyyyMMdd|HHmmss')")
This will produce a new field 'searchtime_ts' for each log entry. However, you first need to extract your timestamp into a field of its own with the 'norm' command, and then pass that field as input to the 'eval' function above.
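If you want to sanity-check the format outside LogPoint first, here is a rough Python equivalent of that strptime call (this assumes LogPoint's strptime behaves like the usual strptime; '%Y%m%d|%H%M%S' mirrors 'yyyyMMdd|HHmmss', and I strip the trailing '|' field separator before parsing):

from datetime import datetime

raw = '20210905|231304|'
# Strip the trailing field separator, then parse date and time
ts = datetime.strptime(raw.rstrip('|'), '%Y%m%d|%H%M%S')
print(ts)  # 2021-09-05 23:13:04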
Untested, but in principle:
norm <mylogts:string> | process eval("searchtime_ts=strptime(mylogts, 'yyyyMMdd|HHmmss')")
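Conceptually that pipeline does two steps; here is a small Python sketch of the same idea (the raw log line and the regex are made up for illustration, since I do not know your exact log format):

import re
from datetime import datetime

# Hypothetical raw log line containing the pipe-delimited timestamp
raw_log = 'hostA|20210905|231304|user logged in'

# Step 1 - what 'norm' does: pull the timestamp out into its own field
mylogts = re.search(r'\d{8}\|\d{6}', raw_log).group(0)   # '20210905|231304'

# Step 2 - what 'process eval' with strptime does: turn it into a real timestamp
searchtime_ts = datetime.strptime(mylogts, '%Y%m%d|%H%M%S')
print(searchtime_ts)  # 2021-09-05 23:13:04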
For further reading:
https://docs.logpoint.com/docs/evaluation-process-plugin/en/latest/DateTime%20functions.html#strptime
Also, your supplied timestamp does not contain any time zone information, so LogPoint assumes UTC, which is then converted to YOUR time zone at presentation. I am on CEST, so it will currently add 2 hours to your original timestamp.
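To make that shift concrete, here is a small Python illustration of the conversion (using the standard zoneinfo module; 'Europe/Copenhagen' is just one example of a CEST zone):

from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Parse the naive timestamp and treat it as UTC, since no zone is supplied
utc_ts = datetime.strptime('20210905|231304', '%Y%m%d|%H%M%S').replace(tzinfo=timezone.utc)

# Presenting it in a CEST zone adds 2 hours (summer time offset is UTC+2)
local_ts = utc_ts.astimezone(ZoneInfo('Europe/Copenhagen'))
print(utc_ts)    # 2021-09-05 23:13:04+00:00
print(local_ts)  # 2021-09-06 01:13:04+02:00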