
DSC 101 Intro to Data Science – Splunk Power User Flashcards (Latest 2021)


1. eval command
• Calculations, convert values, round values, format values, conditional statements
• eval field_name=expression
• The results of eval are written to either a new or an existing field that you specify
• If the destination field exists, the values of the field are replaced by the results of eval
• Indexed data is not modified, and no new data is written to the index
• Field values are treated in a case-sensitive manner

eval round function
• round(field/number, decimals) sets the value of a field to the number of decimals you specify
• If the number of decimals is unspecified, the result is a whole number

eval tostring function
• tostring(field, "option")
• "commas" – applies commas; if the number includes decimals, it rounds to two decimal places
• "duration" – formats the number as "hh:mm:ss"
• "hex" – formats the number in hexadecimal

eval if function
• if(boolean, "true_result", "false_result")
• Non-numeric values must be enclosed in "double quotes"
• Field values are treated in a case-sensitive manner

eval lower function
• lower(field_name)

eval concatenate function
• eval full_name = first_name.last_name

Multiple eval functions
• eval full_name = first_name.last_name, low_name = lower(full_name)

Multiple eval commands
• Each subsequent command references the results of the previous commands

eval command – case function
• case(X1,Y1,X2,Y2,...)
• The first argument, X1, is a Boolean expression
• If it evaluates to TRUE, the result evaluates to Y1
• If it evaluates to FALSE, the next Boolean expression, X2, is evaluated, and so on
• If you want an "otherwise" clause, test for a condition you know is true at the end (e.g., 0=0)

eval function with count
• To count the number of events that contain a specific field value, use the count and eval functions
– Used within a transforming command, such as stats
– Requires an AS clause
– Double quotes are required for character field values
– Field values are case-sensitive
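A small illustrative search that combines several of the eval functions above, using the bytes and status fields normally present in access_combined events (the thresholds are arbitrary choices, not part of the original notes):

  index=main sourcetype=access_combined
  | eval kb = round(bytes/1024, 2),
         kb_label = tostring(kb, "commas"),
         outcome = case(status>=500, "server error", status>=400, "client error", 0=0, "ok"),
         is_error = if(status>=400, "yes", "no")
  | stats count by outcome, is_error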
–search May be easier if you’re familiar with basic search syntax Treats field values in a case-insensitive manner Allows searching on keyword –where Can compare values from two different fields Functions are available, such as isnotnull () Field values are case-sensitive search Command • To filter results, use search at any point in the search pipeline • Behaves exactly like search strings before the first pipe –Uses the "*" wildcard –Treats field values in a case-insensitive manner where command where eval – expression • Uses same expression syntax as eval command • Uses boolean expressions to filter search results and only keeps results that are True • Double quoted strings are interpreted as field values –Treats field values in a case-sensitive manner • Unquoted or single-quoted strings are treated as fields where Command -Example • Filters search results using eval expressions • Used to compare two different fields where Command With like Operator • Can do wildcard searches with where command • Use (_) for one characte r and (%) for multiple characters • Must use the like operator with wildcards 3 Splunk Power User Flash Cards fillnull command • Use fillnull to replace null values in fields • Use value=string to specify string you want displayed instead Example: fillnull value=NULL • If no value=clause, default replacement value is 0 • Optionally, restrict which field(s) fillnull applies to by listing them at end of command fillnull value=”Unknown” City, Country replace “” with “Unknown” in City, Country 3. transaction command • A transaction is any group of related events that span time • Events can come from multiple applications or hosts • transaction field-list • field-list can be one field name or a list of field names • Events are grouped into transactions based on the values of these fields • –If multiple fields are specified and a relationship exists between those fields, events with related field values will be grouped into a single transaction • Common constraints: • maxspan maxpause startswith endswith • –search uses the "*" wildcard and treats field values in a case-insensitive manner • –status=404 finds the errors • –highlight highlights the terms you specify index=main sourcetype=access_combined | transaction JSESSIONID | search status=404 | highlight JSESSIONID, 404 transaction command – specific fields • duration – the difference between the timestamps for the first and last event in the transaction • eventcount – the number of events in the transaction • maxspan – Maximum total time between the earliest and latest events • ▸If not specified, default is -1 (or no limit) maxspan=10m • maxpause – Maximum total time between events • ▸If not specified, default is -1 (or no limit) maxpause=1m • startswith/endswith – To form transactions based on terms, field values, or evaluations, use startswith and endswith options 4 Splunk Power User Flash Cards startswith=”” , endswith=”” • Transactions can be useful when a single event does not provide enough information • You can use statistics and reporting commands with transactions • Transactions spanning more than 10 minutes with the same client IP are considered Unrelated transaction vs. 
transaction vs. stats
• Use transaction when you:
– Need to see events correlated together
– Must define event grouping based on start/end values, or chunk on time
– Have fewer than 1,000 events per correlated transaction
  (By default, transaction displays a maximum event count of 1,000; admins can configure max_events_per_bucket in limits.conf)
• Use stats when you:
– Want to see the results of a calculation
– Can group events based on a field value (e.g. "by src_ip")
– Have more than 1,000 events per grouped set of events
• When you have a choice, always use stats, as it is faster and more efficient, especially in large Splunk environments

• __________ should be used when you want to see the results of a calculation, or you need to group events on a field value. ◦ stats
• __________ should be used when you need to see events correlated together, or when events need to be grouped on start and end values. ◦ transaction
• This stats function will return unique values for a given field. ◦ values
• When using a .csv file for lookups, the first row in the file represents this. ◦ field names

4. Knowledge Objects
Knowledge objects are tools you use to discover and analyze various aspects of your data:
– Data interpretation – fields and field extractions
– Data classification – event types
– Data enrichment – lookups and workflow actions
– Normalization – tags and field aliases
– Datasets – data models
• Shareable – can be shared between users
• Reusable – persistent objects that can be used by multiple people or apps, such as macros and reports
• Searchable – since the objects are persistent, they can be used in a search

What is a Knowledge Manager?
• Oversees knowledge object creation and usage for a group or deployment
• Normalizes event data
• Creates data models for Pivot users

Defining Naming Conventions
Group → Object Type → Description
Example: SEG_Alert_WinEventLogFailure

Permissions
• Private – the person who created the object
– Create: User, Power, Admin
– Read/Write: the person who created it, Admin
• This App Only – the object persists in a specific app
– Create: Power, Admin
– Read/Write: User*, Power*, Admin
• All Apps – available globally in all apps
– Create: Admin
– Read/Write: User*, Power*, Admin
* Read and/or write only if the creator gives permission to that role

• When an object is created, its permissions are set to Keep Private by default
• When an object's permissions are set to This app only or All apps, all roles are given read permission; write permission is reserved for admin and the object creator unless the creator edits permissions
• Only the admin role can promote an object to All apps

Managing Knowledge Objects
• Knowledge objects are centrally managed from Settings > Knowledge
• Your role and permissions determine your ability to modify an object's settings
• By default, objects of all owners are listed

Using the Splunk Common Information Model (CIM)
• A methodology for normalizing data
• Easily correlate data from different sources and source types
• Leverage the CIM when creating the objects discussed in this course: field extractions, field aliases, event types, and tags
• More details are discussed in Module 13
5. Field Extractor Methods
1. Regex (regular expression)
2. Delimiters

The regular expression method works best with unstructured event data. You select a sample event and highlight one or more fields to extract from that event, and the field extractor generates a regular expression that matches similar events in your dataset and extracts the fields from them. The regular expression method provides several tools for testing and refining the accuracy of the regular expression. It also allows you to manually edit the regular expression.

The delimiters method is designed for structured event data: data from files with headers, where all of the fields in the events are separated by a common delimiter, such as a comma or space. You select a sample event, identify the delimiter, and then rename the fields that the field extractor finds. In short, it is for data that resides in a file that has headers and fields separated by specific characters.

Regular expression extraction workflow: Select Sample → Select Method → Select Fields → Validate → Save
Delimiter extraction workflow: Select Sample → Select Method → Rename Fields → Save

How to open the Field Extractor:
1. Settings → Fields
2. Fields sidebar
3. Event Actions menu

Field Auto-Extraction
• Splunk automatically discovers many fields based on source type and key/value pairs found in the data
• Prior to search time, some fields are already stored with the event in the index:
– Meta fields, such as host, source, and sourcetype
– Internal fields, such as _time and _raw
• At search time, field discovery finds fields directly related to the search's results
• Splunk may also extract other fields from the raw event data that are not directly related to the search

Performing Field Extractions
• In addition to the many fields Splunk auto-extracts, you can also extract your own fields with the Field Extractor (FX)
• Use FX to extract fields that are static and that you use often in searches
– Graphical UI
– Extracts fields from events using regex or delimiters
– Extracted fields persist as knowledge objects
– Can be shared and re-used in multiple searches
• Access FX via Settings, the Fields sidebar, or the Event Actions menu

Field Extraction Methods
• Regex
– Use this option when your event contains unstructured data, like a system log file
– FX attempts to extract fields using a regular expression that matches similar events
• Delimiter
– Use this option when your event contains structured data, like a .csv file
– The data does not have headers, and the fields must be separated by delimiters (spaces, commas, pipes, tabs, or other characters)

Regex Field Extractions from Settings
Settings > Fields > Field extractions > Open Field Extractor
1. Select the data type: sourcetype or source
2. Select the source type (e.g. access_combined)
Select Sample
3. Select a sample event by clicking on it
4. Click Next >
Select Method
5. Select Regular Expression
6. Click Next >
Select Values
7. Select the value(s) you want to extract (in this example, two fields are being extracted)
8. Provide a field name
9. Click Add Extraction
   Require option: only events with the highlighted string will be included in the extraction
Preview
10. Preview the sample events
11. Click Next
Validate
12. Validate that the proper field values are extracted
13. Click Next
Save
14. Review the names of the newly extracted fields and set permissions
15. Click Finish
Note: An extraction name is provided by default; this name can be changed.
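For comparison, an inline sketch of the kind of extraction the Field Extractor generates, written with the rex command; the pattern and the gethome field name are illustrative rather than the exact regex FX would produce:

  index=main sourcetype=access_combined
  | rex field=_raw "GET\s(?<gethome>\S+)\sHTTP"
  | table clientip gethome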
Using the Extracted Fields
index=main sourcetype=access_combined | table clientip gethome

Editing Regex for Field Extractions
1. From Select Method, click Regular Expression
2. Click Next >
Select Field
3. Select the field to extract
4. Provide a field name
5. Click Add Extraction
Show Regex
6. Click Show Regular Expression >
7. Click Edit the Regular Expression
Modify Regex
8. Update the regular expression
9. Click Save
   Warning: Once you edit the regular expression, you cannot go back to the Field Extractor UI.
Save
10. Review the extraction name and set permissions
11. Click Finish

Delimited Field Extractions
• Use delimited field extractions when the event log does not have a header and fields are separated by spaces, commas, or other characters
• In this example, the fields are separated by commas

Field Extraction Workflows – Delimiters
Delimited Field Extractions from Settings
Settings > Fields > Field extractions > Open Field Extractor
Select Sample
1. Select the data type: sourcetype or source
2. Select the source type
Select Event
3. Select a sample event
4. Click Next >
Select Method
5. Select Delimiters
6. Click Next >
Select Delimiter
7. Select the delimiter used in your event
Rename Field
8. Click the icon next to the default field name
9. Enter a new field name
10. Click Rename Field
11. Repeat these steps for all fields
12. After all the fields are renamed, click Next >
Save
13. Review the name of your extraction and click Finish >

rex command
• Use the rex command to either extract fields using regular expression named groups, or replace or substitute characters in a field using sed expressions

6. Field Aliases
• A way to normalize data over any default field (host, source, or sourcetype)
• Multiple aliases can be applied to one field
• Applied after field extractions, before lookups
• Field aliases can be applied to lookups

Creating a Field Alias
Settings > Fields > Field Aliases > New
• A new field alias is required for each sourcetype

Testing the Field Alias
After the field alias is created, perform a search using the new field alias, e.g. field_alias=field_value

Field Alias and Original Fields
• When you create a field alias, the original field is not affected
• Both fields appear in the All Fields and Interesting Fields lists if they appear in at least 20% of events

Field Aliases and Lookups
After you have defined your field aliases, you can reference them in a lookup table.
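A hedged illustration of why aliases matter: suppose two source types carry the same information under different field names (the source types and field names below are hypothetical) and both are aliased to a common field called user; one search then covers both:

  index=security (sourcetype=linux_secure OR sourcetype=win_auth) user=admin*
  | stats count by sourcetype, user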
What is a Calculated Field?
• A shortcut for performing repetitive, long, or complex transformations using the eval command
• Must be based on an extracted field
– Output fields from a lookup table, or fields/columns generated from within a search string, are not supported
• Calculated fields are fields added to events at search time that perform calculations with the values of two or more fields already present in those events. The eval command enables you to write an expression that uses extracted fields and creates a new field that takes the value that results from that expression's evaluation.
• Calculated fields enable you to define fields with eval expressions. When writing a search, you can leave out the eval expression and reference the field like any other extracted field. The fields are extracted at search time and added to events that include the fields used in the eval expressions.
• In other words, a calculated field performs a calculation using one or more existing fields of the events, so it can be reused instead of writing the eval command each time. If you run a complex or long eval expression on a regular basis, it is tedious to write that expression every day; a calculated field saves you from doing so. You can create a new calculated field from the existing fields of events, or override an existing field with a calculated field. In a search, just write the name of the calculated field.

Creating a Calculated Field
Settings > Fields > Calculated Fields > New
1. Select the app that will use the calculated field
2. Select host, source, or sourcetype to apply to the calculated field and specify the related name
3. Name the calculated field
4. Define the eval expression

7. Describing Tags
• Tags are like nicknames that you create for related field/value pairs
• Tags make your data more understandable and less ambiguous
• You can create one or more tags for any field/value combination
• Tags are case-sensitive: in tag=value, both the tag name and the value are case-sensitive

Viewing Tags
When tagged field/value pairs are selected, the tags appear:
A. In the results as tags
B. In parentheses next to the associated field/value pairs

Using Tags
To use tags in a search, use the syntax: tag=<tag name>

Searching for Tags
• To search for a tag associated with a value: tag=<tagname>
• To search for a tag associated with a value on a specific field: tag::<field>=<tagname>
• To search for a tag using a partial field value: use the (*) wildcard

Adding/Changing the Tag Name
Click List by field value pair to add another tag or change the name of the tag

Adding/Changing the Field/Value Pair
Click List by tag name to add or edit the field/value pair for the tag

8. Describing Event Types
• A method of categorizing events based on a search
• A useful method for capturing and sharing institutional knowledge
• Can be tagged to group similar types of events

Creating an Event Type from the Search Page
1. Run a search and verify that all results meet your event type criteria
2. From the Save As menu, select Event Type
3. Provide a name for your event type (the name should not contain spaces)

Using the Event Type Builder
1. From the event details, select Event Actions > Build Event Type
2. Refine the criteria for your event type, such as:
• Search string
• Field values
• Tags
3. Verify your selections and click Save

Using Event Types
• To verify the event type, search for eventtype=web_error
• eventtype displays in the Fields sidebar and can be added as a selected field
• Splunk evaluates the events and applies the appropriate event types at search time
• Using the Fields sidebar, you can easily view the individual event types, the number of events, and the percentage
• Event types allow you to categorize events based on search terms
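Event types and tags can then be combined directly in a search; web_error comes from the card above, while the productview tag name is purely hypothetical:

  eventtype=web_error tag=productview
  | timechart count by host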
Tagging Event Types
You can tag event types in two ways:
1. Settings > Event Types
2. Event details > Actions

Event Types vs. Saved Reports
• Event types
– Categorize events based on a search string
– Tag event types to organize data into categories
– The eventtype field can be included in a search string
– Do not include a time range
• Saved reports
– Search criteria will not change
– Include a time range and formatting of the results
– Can be shared with Splunk users and added to dashboards

9. Macros
Overview
• Useful when you frequently run searches or reports with similar search syntax
• The time range is selected at search time
• A macro can be a full search string, or a portion of a search that can be reused in multiple places
• Allows you to define one or more arguments within the search segment
– Parameter values are passed to the macro at execution time
– The macro uses the values to resolve the search string
• Settings → Advanced search → Search macros
• Test your search string before saving your macro. You can check the contents of a macro with a keyboard shortcut (Command-Shift-E on macOS, Ctrl-Shift-E on Linux or Windows) from the search bar on the Search page

Creating a Basic Macro
Settings > Advanced search > Search macros
1. Click New
2. Select the destination app
3. Enter a name
4. Type the search string
5. Save

Using a Basic Macro
• Type the macro name into the search bar
• Surround the macro name with the backtick (grave accent) character
– `macroname` != 'macroname'
– Do not confuse the backtick with the single-quote character (')
• Pipe to more commands, or precede with a search string

Adding Arguments
• Include the number of arguments in parentheses after the macro name, e.g. monthly_sales(3)
• Within the search definition, use $arg$, e.g. currency=$currency$, symbol=$symbol$, rate=$rate$
• In the Arguments field, enter the name(s) of the argument(s)
• Provide one or more values for the macro at search time

Using Arguments
• When using a macro with arguments, include the argument(s) in parentheses following the macro name
• Be sure to pass in the arguments in the same order as you defined them
• Argument names may only contain alphanumeric characters, '_' and '-'

Validating Macros
• You can validate the argument values in your macro
– Validation expression: an eval or Boolean expression you enter for each argument
– Validation error message: the message that appears when validation fails at run time
Note: Don't create macros with leading pipes – someone may put a pipe in front of the macro when using it in the actual search string.
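A sketch of how the monthly_sales(3) macro mentioned above might be defined and used; the sales index/sourcetype and the eval body are assumptions for illustration only:

Definition (Settings > Advanced search > Search macros), name monthly_sales(3), arguments currency, symbol, rate:
  index=sales sourcetype=vendor_sales currency=$currency$
  | eval local_price = price * $rate$, label = "$symbol$".tostring(round(local_price, 2), "commas")

Usage in the search bar, passing the arguments in the order they were defined:
  `monthly_sales(euro, £, 0.79)` | timechart sum(local_price)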
10. Creating and Using Workflow Actions
What are Workflow Actions?
• Execute workflow actions from an event in your search results to interact with external resources or run another search
– GET – pass information to an external web resource
– POST – send field values to an external resource
– Search – use field values to perform a secondary search
• Workflow action names must not contain spaces or special characters
• Specify a comma-separated list of fields that must be present in an event for the workflow action to apply to it
• When fields are specified, the workflow action only appears in the field menus for those fields; otherwise it appears in all field menus

Creating a GET Workflow Action
Settings > Fields > Workflow actions > New
1. Select the app
2. Name the workflow action (no spaces or special characters)
3. Define the label, which will appear in the Event Actions menu
4. Determine whether your workflow action applies to a field or an event type
5. From the Show action in drop-down list, select Event menu
6. Select link as the Action type
7. Enter the URI the user will be directed to
8. Specify whether the link should open in a new window or the current window
9. Select the Link method of get
10. Save

Creating a POST Workflow Action
Settings > Fields > Workflow actions > New
Complete steps 1–6 as described in the previous example, Creating a GET Workflow Action.
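As an illustration of step 7 above, the URI of a GET workflow action usually embeds a field value between dollar signs; at click time Splunk substitutes the value from the selected event into each $fieldname$ token. The destination and the clientip field below are assumptions for the sketch, not part of the original notes:

  Label: Search the web for $clientip$
  URI:   https://www.google.com/search?q=$clientip$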
11. Creating Data Models
Reviewing Pivot
• Used for creating reports and dashboards

Overview of Data Models
• Hierarchically structured datasets that generate searches and drive Pivot
– Pivot reports are created based on datasets
– Each event, search, or transaction is saved as a separate dataset

Data Model Dataset Types
• Data models consist of three types of datasets:
1. Events
2. Searches
3. Transactions

Data Model Events
• Event datasets contain constraints and fields
• Constraints are essentially the search broken down into a hierarchy
• Fields are properties associated with the events

Dataset Fields
• Select the fields you want to include in the dataset
• Like constraints, fields are inherited from parent objects
• Auto-Extracted – can be default fields or manually extracted fields
• Eval Expression – a new field based on an expression that you define
• Lookup – leverage an existing lookup table
• Regular Expression – extract a new field based on regex
• Geo IP – add geographical fields such as latitude/longitude, country, etc.

Data Model Search Datasets
• Arbitrary searches that include transforming commands to define the dataset they represent
• Search datasets can also have fields, which are added via the Add Field button

Data Model Transaction Datasets
• Enable the creation of datasets that represent transactions
• Use fields that have already been added to the model using event or search datasets

Search and Transaction Dataset Considerations
• There must be at least one event or search dataset before adding a transaction dataset
• Search and transaction datasets cannot benefit from persistent data model acceleration
– Acceleration is discussed later in the module
• Think carefully about the reports your users will run
– Can the same report be achieved with event datasets?
• As you learn to create data models, consider the types of reports your users will run
– Will they need raw events or transactional data?

Adding a Root Event
• The inherited attributes are default fields
• Use Add Field > Auto-Extracted to add more fields

Adding Fields – Auto-Extracted
Fields that already exist for the constraint can be added as attributes to the data model.

Field Types
• String: field values are recognized as alphanumeric
• Number: field values are recognized as numeric
• Boolean: field values are recognized as true/false or 1/0
• IPV4: field values are recognized as IP addresses
– This is an important field type: at least one IPV4 attribute must be present in the data model in order to add a Geo IP attribute

Field Flags
• Optional: the field doesn't have to appear in every event
• Required: only events that contain this field are returned in Pivot
• Hidden: the field is not displayed to Pivot users when they select the dataset in Pivot
– Use for fields that are only being used to define another field, such as an eval expression
• Hidden & Required: only events that contain this field are returned, and the field is hidden from use in Pivot

Adding Fields – Eval Expressions
• You can define a new field using an eval expression
– In this example, you create a field named Error Reason that evaluates the value of the status field

Adding Fields – Lookups
• Leverage an existing lookup definition to add fields to your event dataset
• Configure the lookup attribute in the same way as an automatic lookup
• Use Preview to test your lookup settings
• Use the Events and Values tabs to verify your results

Adding Fields – Regular Expression
You can define a new field using a regular expression.

Adding Fields – Geo IP
• Map visualizations require latitude/longitude fields
• To use the Geo IP lookup, at least one IP field must be configured as an IPV4 type
• While the map function isn't available in Pivot, the data model can be called using the | pivot command and a <map> element in a dashboard population search
– Select the field that contains the mapping to lat/lon
– Identify the lat/lon and geo fields in the data

Adding Child Datasets
• When you create a new child dataset, you give it one or more additional constraints
• Child datasets inherit all fields from the parent events
– You can add more fields to child datasets

Adding a Transaction
• You can add a transaction to the data model
• The transaction dataset below would equate to the search: sourcetype=access_* | transaction clientip maxpause=10s
• You can then add an eval expression or any other field to your transaction to further define the results

Testing the Data Model
• Click Pivot to access the Select a Dataset window
• Choose an object from the selected data model to begin building the report

Using the Data Model in Pivot
The New Pivot window automatically populates with a count of events for the selected dataset.

Pivot – Using Fields
• The fields associated with each dataset are available as splits for rows or columns
• Fields can also be used to filter events in the Pivot interface

Set Permissions
• When a data model is created, the owner can determine access based on the following permissions:
– Who can see the data model: Owner, App, or All Apps
– Which users can perform which actions (Read/Write): Everyone, Power, Admin-defined roles, if applicable

Download and Upload Data Models
• Use the Splunk Web interface to download or upload data models:
– Back up important data models
– Collaborate with other Splunk users to create, modify, and test data models
– Move data models from a test environment to a production instance

Data Model Acceleration
• Uses automatically created summaries to speed completion times for pivots
• Takes the form of inverted time-series index files (tsidx) that have been optimized for speed
• Discussed in more detail in Advanced Searching and Reporting
Accelerating a Data Model
• With persistent data model acceleration, all fields in the model become "indexed" fields
• You must have administrative permissions or the accelerate_datamodel capability to accelerate a data model
• Private data models cannot be accelerated
• Accelerated data models cannot be edited

12. Common Information Model (CIM) Add-on
What is the Common Information Model (CIM)?
• The Splunk Common Information Model provides a methodology to normalize data
• Leverage the CIM when creating field extractions, field aliases, event types, and tags to ensure:
– Multiple apps can co-exist on a single Splunk deployment
– Object permissions can be set to global for the use of multiple apps
– Easier and more efficient correlation of data from different sources and source types

How the Splunk CIM Works
• Normalize field names – email data
• Normalize field names – network traffic
• Normalize field names – web data

Splunk CIM Add-on
• A set of 22 pre-configured data models
– Fields and event category tags
– The least common denominator of a domain of interest
• Leverage the CIM so that knowledge objects in multiple apps can co-exist on a single Splunk deployment

Using the CIM Add-on
1. Examine your data
– Go to Settings > Data Models
– Identify a data model relevant to your dataset
2. Create event types and tags
– Identify the CIM datasets relevant to your events
– Observe which tags are required for that dataset or any parent datasets
– Apply those tags to your events using event types
3. Create field aliases
– Determine whether any existing fields in your data have different names than the names expected by the data models
– Define field aliases to capture the differently named field in your original data and map it to the field name that the CIM expects
4. Add missing fields
– Create field extractions
– Write lookups to add fields and normalize field values
5. Validate against the data model
– Use the datamodel command
– Use Pivot in Splunk Web

datamodel Command
• Searches against a specified data model object
• Returns a description of all data models, or of a specified data model and its objects
• Is a generating command and should be the first command in the pipeline

datamodel Command – Example (labeled parts of the search)
A) the command
B) the data model name
C) the data model dataset name
D) the keyword
E) find field names with the Web prefix

Field Extraction Commands
• erex command
– Use when you do not know the regular expression to use
– You have example values in your retrieved events
• rex command
– No UI; you must write the regex
– Only persists for the duration of the search
– Does not persist as a knowledge object
– Good for rarely used fields

erex Command
• Instead of using regex, the erex command allows you to extract a field at search time by providing examples
• examples is a comma-separated list of example values for the information to be extracted and saved into a new field
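A brief erex sketch; the from_domain field name and the example values are hypothetical and simply mirror the provide-examples-instead-of-regex idea described above:

  sourcetype=cisco_esa
  | erex from_domain examples="gmail.com, yahoo.com"
  | top from_domain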
Types of Search
• Dense – a large percentage of the data matches the search
• Sparse – a small percentage of the data matches the search
• Super-sparse – returns a small number of results from each index bucket matching the search
• Rare – the indexer checks all buckets to find results, but Bloom filters eliminate those buckets that don't include search results

Search Job Inspector – Execution Costs
• Provides details on the cost to retrieve results, such as:
– command.search.index – time to search the index for the locations to read in the rawdata files
– command.search.filter – time to filter out events that do not match
– command.search.rawdata – time to read events from the rawdata files

Viewing Results as a Visualization
• Not all searches can be visually represented
• A data series is a sequence of related data points that are plotted in a visualization
• Searches that generate statistical results can be rendered as data series in a visualization

Trellis Layout
• Displays multiple charts based on one result set
• Allows visual comparison between different categories
• Data is only fetched once

timechart Command – Overview
• The timechart command performs statistical aggregations against time
• Plots and trends data over time
• _time is always the x-axis
• You can optionally split the data using a by clause on one other field
– Each distinct value of the split-by field is a separate series in the chart
• Timecharts are best represented as line or area charts
• Functions and arguments used with stats and chart can also be used with timechart
• Using timechart, you can split by a maximum of one field, because _time is the implied first field

iplocation Command
• Use iplocation to look up and add location information to an event
– This information includes city, country, metro code, region, timezone, latitude, and longitude
• Not all of the information is available for all IP address ranges
• Automatically defines the default lat and lon fields required by geostats

geostats Command
• Use geostats to compute statistical functions and render a cluster map
  geostats [latfield=<string>] [longfield=<string>] <stats-agg-term>... [by-clause]
• Data must include latitude and longitude values
• Define latfield and longfield only if they differ from the default lat and lon fields
• To control the column count:
– On a global level, use the globallimit argument
– On a local level, depending on where your focus is (i.e., where you've zoomed in), use the locallimit argument
• geostats generates statistics for rendering maps, e.g. "count of occurrences in a geographical area"; it works much like the stats command

Choropleth Map
• Shows regions with color density
• Uses shading to show relative metrics, such as sales, network intruders, population, or election results, for predefined geographic regions
• To define regional boundaries, you must have either a:
– KML (Keyhole Markup Language) file
– KMZ (compressed Keyhole Markup Language) file
• Splunk ships with:
– geo_us_states – United States
– geo_countries – countries of the world
  ... | geom [featureCollection] [featureIdField=<string>]
• Other map types show locations as dots; choropleth maps show regions with color density. To achieve this, the geom command uses a built-in geographic lookup that takes the identity of an area (state, town, or country) and generates a polygon.
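A small geostats sketch that builds on the iplocation card above; the action split field is an assumption about the web data:

  index=main sourcetype=access_combined
  | iplocation clientip
  | geostats globallimit=4 count by action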
Mapping Types
Splunk currently has two built-in mapping types:
• Choropleth – uses shading to show relative metrics, such as population or election results, for predefined geographic regions
• Cluster – plots geographic coordinates as interactive markers; the markers can be configured to represent a metric, such as a pie chart with details about the location (count of occurrences in a geographical area)

Example:
index=main sourcetype=access_combined | iplocation clientip | fillnull value="Unknown" City, Country | replace "" with "Unknown" in City, Country | stats count by Country | geom geo_countries featureIdField=Country

13. timechart Command – review questions
• The timechart command buckets data in time intervals depending on: ◦ the selected time range
• Which of these search strings is NOT valid?
– index=web status=50* | chart count by host, status
– index=web status=50* | chart count over host by status
– index=web status=50* | chart count over host, status
• The y-axis should always be numeric.
• Which type of visualization allows you to show a third dimension of data? ◦ bubble chart
• This command will compute the sum of numeric fields within events and place the result in a new field: ◦ addtotals
• The gauge command: ◦ allows you to set colored ranges for a single-value visualization
• If you want to format values without changing their characteristics, which would you use? ◦ the fieldformat command
• The transaction command allows you to _________ events across multiple sources. ◦ correlate
• What will you learn from the results of the following search? sourcetype=cisco_esa | transaction mid, dcid, icid | timechart avg(duration) ◦ the average time elapsed during each transaction, for all transactions
• The maxpause definition:
– finds groups of events where the first and last events are separated by a span of time that does not exceed a certain amount
– finds groups of events where the span of time between included events does not exceed a specific value
– finds groups of related events where the total number of events does not exceed a specific number
• Which function should you use with the transaction command to set the maximum total time between the earliest and latest events returned? maxspan / endswith / maxduration / maxpause
• You can create transactions based on multiple fields.
• It is suggested that you name your knowledge objects using _______ segmented keys. ◦ six
• Knowledge objects can be used to normalize data.
• During the validation step of the Field Extractor workflow:
– you can validate where the data originated from
– you cannot modify the field extraction
– you can remove values that aren't a match for the field you want to define
• Fields extracted with the Field Extractor: are persistent / require you to use regex in your search strings / are specific to a host, source, or source type
• After editing your regular expression from the Field Extractor utility, you will be returned to the utility. ◦ False
• In the Field Extractor utility, this button will display events that do not contain extracted fields. selected-fields / non-matches / matches / non-extractions
• Once a field is created using the regex method, you cannot modify the underlying regular expression. ◦ False
• How many ways are there to access the Field Extractor utility? ◦ 3
• The Field Extractor utility allows you to extract fields using which two methods? ◦ regex and delimiter
• Calculated fields are based on underlying: ◦ eval expressions
• Field aliases are used to _____ data. ◦ normalize
• These allow you to categorize events based on search terms. ◦ event types
• Event types do not show up in the Fields List. ◦ False
• The search expansion tool: ◦ allows you to see what a macro will expand to before you run a search
• This workflow action type directs users to a specified URI. ◦ GET
• A workflow action can: execute a secondary search / direct users to a specified URI / send field values to external sources
• This workflow action type sends field values to external resources. ◦ POST
• To use field value data from an event in a workflow action, we need to: ◦ wrap the field in dollar signs
• Workflow actions can only be applied to a single field. ◦ False
• When using a field value variable with a workflow action, which punctuation mark will escape the data? * / ^ / # / ! ◦ ! (exclamation point)
• Hidden fields in a data model: ◦ will not be displayed to a Pivot user, but can be used to define other fields
• Required fields in a data model: ◦ constrain the dataset to only return events that include that field
• Which of these are NOT data model dataset types? lookups / transactions / events / searches
• _____ datasets can be added to a root dataset to narrow down the search. ◦ Child
• The only way to access and use a dataset is from the Pivot interface. ◦ False
• Fields used in data models must already be extracted before creating the datasets. ◦ False
• By default, data models in the CIM Add-on will search across all indexes. ◦ True
• The data models in the CIM Add-on are accelerated by default. ◦ False
• You can normalize data for CIM use: at index time / only after adding the CIM Add-on / using knowledge objects
• The Splunk CIM Add-on includes data models in a __________ format. ◦ JSON
• This role is required to install the CIM Add-on. ◦ Admin
• The CIM Add-on indexes extra data and will affect license usage. ◦ False
• The CIM schema should be used when creating field extractions, aliases, event types, and tags. ◦ True
• When extracting fields, we may choose to use our own regular expression. ◦ True
• Who can create data models? ◦ Administrators
• Why is there a warm and a cold bucket? ◦ Cold buckets are generally searched less frequently and stored in a different location, so they can be kept on cheaper media with slower I/O speeds
• Data interpretation is made up of: ◦ fields and field extraction
• Data classification is made up of: ◦ event types
• Data enrichment is made up of: ◦ lookups and workflow actions
• Normalization is made up of: ◦ tags and field aliases
• Datasets are made up of: ◦ data models
• What happens to surplus resulting values of the chart and timechart commands? ◦ They are grouped into OTHER
• What is a trait of scatter charts? ◦ Can only show two dimensions; shows trends in the relationship between discrete data values
• What is the argument for adjusting the sampling interval of timechart? ◦ span
• What is the data requirement for the geostats command? ◦ Data must include latitude and longitude values
• These arguments are used to control column counts when using the geostats command. ◦ globallimit and locallimit
• What command can be used to show relative metrics for predefined geographic regions? ◦ geom
• (True/False) A sparkline is an inline chart that can be added to timechart and is designed to display time-based trends.
• What is a trend in the trendline command? ◦ Displays the direction in which values are moving
• What does the trendline command do? ◦ Allows you to overlay a computed moving average on a chart
• (True/False) Unquoted or single-quoted strings are treated as fields.
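To make the timechart and trendline cards above concrete, a small sketch; the 1-hour span and the 5-point moving-average window are arbitrary choices:

  index=web sourcetype=access_combined
  | timechart span=1h count
  | trendline sma5(count) AS count_sma5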
• To be able to do wildcard searches with the where command, this operator must be used. ◦ like
• (True/False) The transaction command creates a single event from a group of events.
• This field is produced by running the transaction command. ◦ duration – the difference between the timestamps of the first and last events in the transaction
• How can admins change the limit on the number of events per transaction? ◦ By configuring max_events_per_bucket in limits.conf
• What are knowledge objects? ◦ Knowledge objects are tools you use to discover and analyze various aspects of your data
• What is the Splunk Common Information Model (CIM)? ◦ A methodology for normalizing data, so you can easily correlate data from different sources and source types
• Which meta fields are already stored in the index prior to search time? ◦ host, source, sourcetype
• Which internal fields are stored in the index prior to search time? ◦ _time and _raw
• At this time, field discovery discovers fields directly related to the search's results. ◦ Search time
• These knowledge objects provide a way of normalizing data over any default field. ◦ field aliases
• (True/False) Field aliases are applied after field extraction, before lookups.
• (True/False) It is not possible to apply field aliases to lookups.
• How does a tag appear after being selected? ◦ In the results as tags, and in parentheses next to the associated field/value pairs
• The syntax for searching for a tag associated with a value is? ◦ tag=<tagname>
• The syntax for searching for a tag associated with a value on a specific field is? ◦ tag::<field>=<tagname>
• Where can tag settings (including permissions) be edited? ◦ Settings > Tags > List by field value pair
• This knowledge object can be used to group similar types of events. ◦ event type
• (True/False) Event types do not include a time range, while a saved report does.
• These knowledge objects are useful when you frequently run searches or reports with similar search syntax. ◦ macros
• What happens if an event fits multiple event types? ◦ Priority decides which event type takes precedence in the display order
• At what time are parameter values passed to a macro? ◦ Execution time
• When setting up arguments for macros in the macro definition, what character must the argument(s) be surrounded by? ◦ $
• (T/F) This is a valid search: | 'monthly_sales(euro, £, 0.79)'
• (T/F) This is a valid search: | `monthly_sales(euro, £, 0.79)`
• What is the validation expression for macros? ◦ An expression can be defined for each argument of the macro, with a corresponding error message, to ensure that the macro is being used correctly
• (T/F) A workflow action can be applied to both fields and event types.
• How can workflow actions be tested? ◦ By clicking Event Actions on an event in the search results and selecting the name of the created workflow action
• What is a pivot? ◦ It's essentially a subset of data based on a data model
• What is a data model? ◦ Hierarchically structured datasets that generate searches and drive pivots
• What three types of datasets can a data model consist of? ◦ Events, searches, and transactions
• How are datasets saved in pivots? ◦ Each event, search, or transaction is saved as a separate dataset
• What are constraints when it comes to data model events? ◦ Constraints are essentially the search broken down into a hierarchy
• (True/False) Data models are hierarchical structures where child datasets inherit constraints and fields from their parent dataset(s).
• What is the benefit of using root events compared to root transactions and root searches when creating data models? ◦ Root events can be accelerated, while the others cannot
• What are constraints for a root event? ◦ Essentially search terms
• What methods can be used for adding fields to a data model? ◦ Auto-extraction, eval expression, lookup, regular expression, Geo IP
• What are the different field types for a data model? ◦ String, Number, Boolean, IPV4
• What are field flags for data models? ◦ Field flags set options for how a field should be used, and whether the field must be present in the events returned by the pivot driven by the data model
• What different field flags exist? ◦ Optional, Required, Hidden, Hidden & Required
• What is the use of hidden fields? ◦ They can be used for fields that are only being used to define another field, such as an eval expression
• What does the Required field flag imply? ◦ Only events that contain this field are returned in Pivot
• What is the data model name in this search? | pivot Buttercup_Games_Site_Activity failed_request count(failed_request) AS "Count of Failed Requests" ◦ Buttercup_Games_Site_Activity
• What is the dataset (object) name in this search? | pivot Buttercup_Games_Site_Activity failed_request count(failed_request) AS "Count of Failed Requests" ◦ failed_request
• What is the split-row field in this search? | pivot Buttercup_Games_Site_Activity failed_request count(failed_request) AS "Count of Failed Requests" ◦ count(failed_request)
• What are data model search datasets? ◦ Arbitrary searches that include transforming commands to define the dataset they represent
• What fields are available for use by a transaction dataset? ◦ Fields that have already been added to the model using event or search datasets
• What is required for a transaction dataset? ◦ At least one event or search dataset must already exist in the data model
• How can permissions be set for data models? ◦ By who can see the data model (owner, app, or all apps) and which roles can read/write
• How can data models be moved from a test environment to a production environment? ◦ By uploading/downloading them via the Splunk Web interface
• How are data models accelerated? ◦ By creating summaries stored in time-series index files (tsidx) that have been optimized for speed
• What happens to fields when accelerating a data model? ◦ All fields in the model become "indexed" fields, i.e. they are available in the index files
• (True/False) A private data model can be accelerated. ◦ False
• (True/False) Accelerated data models can be edited. ◦ False
• (True/False) Only root events can be accelerated. ◦ True
• What is the Common Information Model (CIM)? ◦ The Splunk Common Information Model provides a methodology to normalize data
• What are some of the benefits of using the Common Information Model? ◦ Easier and more efficient correlation of data from different sources and source types; multiple apps can co-exist on a single Splunk deployment
• When should the CIM be leveraged? ◦ When creating field extractions, field aliases, event types, and tags
• What is included in the CIM Add-on? ◦ A set of 22 pre-configured data models, with fields and event category tags – the least common denominator of a domain of interest
• (True/False) The data models included in the CIM Add-on are configured with data model acceleration turned off. ◦ True
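For reference, a sketch of the datamodel and from commands that the cards below refer to; it assumes the CIM Web data model is installed and uses its root dataset, which is also named Web. The first form prefixes field names with the data model name; the second returns the dataset directly:

  | datamodel Web Web search | fields Web.status Web.clientip
  | from datamodel:"Web.Web"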
• How can you use the CIM Add-on? ◦ Go to Settings > Data Models and identify a data model relevant to your dataset
• What can the CIM Add-on be used for? ◦ Creating new event types, tags, field aliases, and field extractions
• How can you validate that a field extraction is correctly set up according to the CIM? ◦ By using the datamodel command to search against the specified data model object and confirm that the extracted field exists in the data model
• Data model names and dataset names are case-sensitive. ◦ True
• What is the dataset name in the following search? | datamodel Web Email search | fields Web* ◦ Email
• What does the from command do? ◦ Retrieves data from a data model or named dataset
• How does the from command differ from the datamodel command? ◦ datamodel returns all fields prepended with the data model name; from returns the specified fields only
• The from command can also retrieve data from saved searches, reports, or lookup files. ◦ True
• What is true about event types? ◦ Event types cannot include pipes or subsearches
• What does the duration field for a transaction mean? ◦ The difference between the timestamps of the first and last events in the transaction
• Data models from the CIM search across all indexes by default. ◦ True
• Which is the correct order to use when creating a lookup? ◦ 1. Define a lookup table  2. Define a lookup  3. Create an automatic lookup
• Finish this search command so that it displays data from the http_status.csv lookup file: | __________ http_status.csv ◦ inputlookup
• Finish this search so that it uses the http_status.csv lookup to return events: sourcetype=access_c* NOT status=200 | _________ http_status code AS status ◦ lookup
• The easiest way to extract a field is from ____________, allowing you to skip a few steps. ◦ the Event Actions menu
• To escape the "fieldname" value, which character would you use? $_________fieldname$ ◦ ! (i.e., $!fieldname$)
• Validating macro arguments can be done with which type of expression? ◦ Boolean or eval expressions
• After creating your data model, the next step is to ___________. ◦ add a root object
• You can add additional child objects to either existing objects or the root object. ◦ True
• After you configure a lookup, its fields can be found in the Fields sidebar and you can use them in a search. ◦ True
• Choropleth maps are used to show relative metrics for predefined geographic regions.
• To define regional boundaries, you must have either a KML (Keyhole Markup Language) or KMZ (compressed Keyhole Markup Language) file.

Key topics reviewed: transaction command and its arguments; where command with the like operator; Field Extractor validation step; field extraction; macro validation; CIM; data models; geom.
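A completed version of the two lookup fill-ins above; the description output field is an assumption about the columns in http_status.csv:

  | inputlookup http_status.csv

  sourcetype=access_c* NOT status=200
  | lookup http_status code AS status OUTPUT description
  | stats count by status, description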
