Question:

I'm a Splunk newbie and I'm trying to write some queries for our logs using 'transaction'. Our logs have multiple events for the same timestamp, as follows (I have simplified the logs, removing the fields that are unrelated to this query):

Timestamp : (thread_name) : message

09:25:01 : (2314) : Completed calling function fetchTask
09:25:01 : (2314) : Calling function fetchWkflowVarPersistValues
09:25:01 : (2314) : Completed calling function fetchWkflowVarPersistValues
09:25:01 : (2314) : Completed calling function nextSeq
09:25:01 : (1800) : Completed calling function lockObject
09:25:01 : (2314) : Calling function startWflowLog
09:25:01 : (2314) : Completed calling function startWflowLog
09:25:01 : (2314) : Calling function endTaskInstLog
09:25:01 : (2314) : Completed calling function endTaskInstLog
09:25:01 : (2314) : Calling function startTaskInstLog
09:25:01 : (2314) : Completed calling function startTaskInstLog
09:25:01 : (4644) : Calling function fetchWorkflowRunBreaks
09:25:01 : (4644) : Completed calling function fetchWorkflowRunBreaks
09:25:02 : (4127) : Calling function fetchWorkflowRunBreaks
09:25:02 : (4127) : Completed calling function fetchWorkflowRunBreaks
09:25:02 : (5677) : Calling function fetchWorkflowRunBreaks
09:25:02 : (5677) : Completed calling function fetchWorkflowRunBreaks
09:25:02 : (3624) : Calling function fetchWorkflowRunBreaks
09:25:02 : (4386) : Calling function fetchWorkflowRunBreaks
09:25:02 : (3624) : Completed calling function fetchWorkflowRunBreaks
09:25:02 : (4386) : Completed calling function fetchWorkflowRunBreaks
09:25:02 : (2702) : Completed calling function init

I am trying to write a search query to identify the functions that run for more than 'x' seconds, using the transaction command as follows (i.e. extract the function name and use thread_name and function as the unique IDs):

source="."
| rex field=message "Calling function (?<rep_function>.+)"
| rex field=message "Completed calling function (?<rep_function>.+)"
| transaction thread_name,rep_function startsWith=(message="Calling function*") endsWith=(message="Completed calling function*")
| search duration>100

Answer:

First, identify what the various panels are supposed to show, then determine what fields need to exist (be extracted) at what granularity level (stats by) in order to show all of them. The efficient way may be with one base search, or three, or none.

Second, while it may seem convenient at times, transaction is a very inefficient verb. I always recommend avoiding it if possible. You can almost always build a more efficient search with streamstats followed by stats.

Third, the best argument for using a base search is if you are going to have filters that run after the base search to change the presentation. With this strategy, the base search runs, then provides the results that get filtered for presentation. Changing the filters doesn't have to rerun the base search, just the post-searches.

The efficiency of all of the above is highly data- and use-case-dependent. When you have the actual details, then you can make better decisions. Feel free to get on the Splunk Slack channel and ask for help in the dashboard sub-channel. There are several good resources who hang out down there and would be glad to help you optimize what you are doing.
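The streamstats-followed-by-stats approach recommended in the answer boils down to pairing each "Calling" event with its matching "Completed calling" event per (thread_name, function) and keeping pairs whose elapsed time exceeds the threshold. This is a minimal Python sketch of that pairing logic, not Splunk code: the log format and the 100-second threshold come from the question, while the function name `slow_functions` and the sample input lines (including the deliberately slow 09:23:00 call) are illustrative.

```python
import re
from datetime import datetime, timedelta

# Log format from the question: "HH:MM:SS : (thread_name) : message"
LINE_RE = re.compile(
    r"^(?P<ts>\d{2}:\d{2}:\d{2}) : \((?P<thread>\d+)\) : "
    r"(?P<action>Calling|Completed calling) function (?P<func>\S+)$"
)

def slow_functions(lines, threshold=timedelta(seconds=100)):
    """Pair Calling/Completed events per (thread, function) and
    return the calls whose duration exceeds the threshold."""
    starts = {}  # (thread, func) -> start time of the currently open call
    slow = []
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        key = (m["thread"], m["func"])
        ts = datetime.strptime(m["ts"], "%H:%M:%S")
        if m["action"] == "Calling":
            starts[key] = ts              # open the "transaction"
        elif key in starts:               # Completed with a matching start
            duration = ts - starts.pop(key)
            if duration > threshold:
                slow.append((m["thread"], m["func"], duration.total_seconds()))
    return slow

# Illustrative input (not the question's exact sample): one fast call,
# one call that takes 121 seconds.
logs = [
    "09:25:01 : (2314) : Calling function startWflowLog",
    "09:25:01 : (2314) : Completed calling function startWflowLog",
    "09:23:00 : (4644) : Calling function fetchWorkflowRunBreaks",
    "09:25:01 : (4644) : Completed calling function fetchWorkflowRunBreaks",
]
print(slow_functions(logs))
# [('4644', 'fetchWorkflowRunBreaks', 121.0)]
```

Like the answer's streamstats/stats pattern, this makes a single pass over the events and keeps only one open start per (thread, function) key, instead of materializing whole transactions the way the transaction command does. "Completed" events with no matching "Calling" event (such as the fetchTask line in the sample) are simply skipped.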