
WRITE_META

Hi guys!!

We know that while indexing data, Splunk software parses the data stream into a series of events, and we can perform different actions on those events. For parsing and filtering we use two configuration files, props.conf and transforms.conf, on the heavy forwarder.

But what if you want to store some specific pattern on the basis of metadata? In that case Splunk will match the regex pattern that we specify and store the matched value in another field called extracted_sourcetype (it can be any metadata, and the field name will be extracted_<metadata>).

Now I will show you how to do this.

Below is the sample data on which I am going to perform the parsing.

Hii guys
Today I am going to show you how to perform parsing.
and secondly I will use here two configuration
files that are  props.conf and
the transforms.conf ,both
the files are configured in
Heavy Forwarder and there is one another configuration file indexes.conf
which we will use later.
byee. Have a nice day.

Now follow the below steps:

STEP 1:

Go to the location where you want to save the sample data and create a file. Here I have created a file named host.txt in the /tmp location. You can use any other location or any existing file for storing your data.

STEP 2:

Now, after creating the file, put the sample data in it and then save and exit (in vi, press “Esc” and type “:wq”).
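For example, a minimal way to do Steps 1 and 2 from the command line, assuming you are using vi (any editor will do):

vi /tmp/host.txt
(paste the sample data shown above, then press Esc and type :wq to save and quit)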

STEP 3:

We will configure inputs.conf. You can find inputs.conf at the below path:

$SPLUNK_HOME/etc/system/local/

In inputs.conf we will mention the absolute path of the sample data file which we want to monitor. We will also mention the index, host, and sourcetype (you can give any metadata values according to your wish).
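As a minimal sketch, the monitor stanza in inputs.conf could look like the one below. The index, host, and sourcetype values here (test_index, test_host, test_st) are only assumptions for illustration; substitute your own:

[monitor:///tmp/host.txt]
index = test_index
host = test_host
sourcetype = test_st
disabled = false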

STEP 4:

Now we will configure props.conf on the HF. You can find props.conf at the below path:

$SPLUNK_HOME/etc/system/local/

Here you have to give the sourcetype name in the stanza header. I have used SHOULD_LINEMERGE=false so that the lines of my sample data will not be merged into a single event.

The second attribute is TRANSFORMS-soo=do (the general format is TRANSFORMS-<class name>=<transformation name>). You can give any string as the “class name”, as I have given “soo”. The “transformation name” is the name of the stanza that we will define in transforms.conf, as shown in the next step.
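A minimal sketch of the props.conf stanza, assuming the sourcetype set in inputs.conf was test_st (a hypothetical name; use whatever sourcetype you configured):

[test_st]
SHOULD_LINEMERGE = false
TRANSFORMS-soo = do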

STEP 5:

Now we will configure transforms.conf on the HF. You can find transforms.conf at the same path as props.conf.

Give the “transformation name” in the stanza header, as I have done, i.e. [do]. In REGEX, give the regular expression for the string that you want to capture into the field extracted_<metadata> (i.e. host, source, or sourcetype); here we have taken the sourcetype (you can take any metadata). I have used FORMAT=sourcetype::$1, where you give the metadata name (or field name) followed by the capture group whose matched value you want to write. I have also used WRITE_META=true: in whichever event the REGEX pattern matches, the matched value will go to the automatically created field named extracted_<metadata> (here the field “extracted_sourcetype”). By default, WRITE_META is set to false.
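A minimal sketch of the transforms.conf stanza is given below. The article does not fix a particular regular expression, so the REGEX here, which captures any *.conf file name appearing in the sample data, is only an illustrative assumption:

[do]
REGEX = (\w+\.conf)
FORMAT = sourcetype::$1
WRITE_META = true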

STEP 6:

After configuring the configuration files, you should always restart Splunk on both the HF and the UF, so that all the changes take effect.
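For example, you can restart from the command line on each instance:

$SPLUNK_HOME/bin/splunk restart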

STEP 7:

After restarting Splunk, go to that text file, i.e. host.txt, and add some more data to it.
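For example, you can append a line from the command line (the exact text is up to you):

echo "Today I have also configured the indexes.conf file." >> /tmp/host.txt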

STEP 8:

Now you can see that, for the events which match the pattern defined in REGEX, the matched values are stored in the automatically created field named extracted_sourcetype (i.e. extracted_<metadata>).
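To verify, you can search on this indexed field in the Search & Reporting app; a small sketch, assuming the hypothetical index name test_index used earlier:

index=test_index extracted_sourcetype=*
| table _raw, extracted_sourcetype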

Hope this has helped you in achieving the WRITE_META requirement without fail.

You can also read about: Splunk Licensing: Enforcement Vs No-Enforcement

Happy Splunking  !!

