Search Your Logs

Your connected event sources and environment systems produce data in the form of raw logs. Log Search collects this raw log data and automatically sorts it into Log Sets for you. Once you configure Core Event Sources, go to the Log Search page from the InsightIDR homepage to search your logs, build queries, visualize your data, and create alerts.

Step 1: Select your logs

On the Log Search page, go to the Log Set panel and select the logs or log sets you want to search.

If you are unsure about which log sets to select, read the Log Set Guidance.

To send data into Log Search in a format that is not currently supported by the Platform, you can set up custom logs.

To parse logs in a format that is unknown to InsightIDR, you can create custom parsing rules.

Step 2: Build a query

To search for specific log data, you must build a query using the Log Entry Query Language (LEQL).

LEQL is the search language that allows you to build simple or advanced analytical queries.

To start searching your log data, you can try recreating one of our example queries.
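As a starting point, a minimal LEQL query pairs a where clause with a key-value pair. For example, assuming your authentication logs contain a result field (as in the examples later on this page), the following query returns only log entries for successful events:

where(result=SUCCESS)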

Step 3: View the resulting log entries

After your query runs, the resulting log entries are displayed in the Entries or Table tab.

Tip for using the context menu

As you are viewing log data on either the Entries or Table tab, you can click and drag to highlight a full or partial value and open the Log Search context menu. This menu displays options for creating an alert, viewing a user's or asset's details, or using that value as input in Quick Actions. By highlighting a specific value and using this menu, you can dive deeper into the data you want to investigate.

To present the resulting data in charts on the Visualize tab, you can group your logs based on various criteria by using the Groupby function:

Visualize your logs using the Groupby function

Groupby is a function that allows you to visualize your data by grouping it by fields in your log data. For example, the following function groups log entries by the unique values found for the destination_user field in the log data generated from your Asset Authentication logs: groupby(destination_user) calculate(count).

This example returns a list of all the unique usernames found in the destination_user field, with a count of the number of times each username was found. The most common username is listed first, with the remaining usernames sorted in descending order of count.
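Because LEQL sorts Groupby results in descending order by default, this example is equivalent to the same query with an explicit sort keyword:

groupby(destination_user) calculate(count) sort(desc)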

Depending on the size of your data set, this function calculates results in one of two ways: statistical approximation or literal count.

Statistical approximation

If more than 10,000 unique groups are found, then the results will be a statistical approximation, rather than a literal count. It’s also possible that no groups will be displayed, due to the distribution of the data.

To get an exact result, narrow your search criteria by selecting fewer logs, choosing a shorter time frame, or adding more search filters.
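For example, assuming your authentication logs contain a result field, adding a where clause like the following restricts the Groupby to failed logins only, which reduces the number of unique groups counted:

where(result=FAILED_BAD_LOGIN) groupby(destination_user) calculate(count)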

Reduce the number of groups returned when using the Groupby function

You can use the having clause to reduce the set of groups returned by the groupby function. Currently, only a single predicate based on the count value, having(count <op> <number>), is supported; support for multiple predicates and additional analytical functions is planned.

Example Groupby query:
where(result=FAILED_BAD_LOGIN) groupby(destination_user)


This query includes all users who have failed to log in within the specified time range, including those users who only failed once.

Example Groupby query with having clause:
where(result=FAILED_BAD_LOGIN) groupby(destination_user) having (count>200)


This query includes only users who failed to log in more than 200 times. Users who failed to log in 200 times or fewer are excluded from the results and could be interpreted as benign.

The having clause is not supported in InsightIDR custom alerts.

Group your log data by more than one field

The Log Entry Query Language (LEQL) groupby function allows you to group by multiple fields in your log data. Run a single query to get an overall view of your log data, and drill down into that data. To use this feature, add up to 5 fields to a groupby query. You can do this by typing additional keys in Advanced query builder mode, or by using the button provided in Simple mode.


Example query: groupby(destination_user, result, service, source_asset_address) calculate(count)

Visualize results in a stacked bar chart

When you run the query, the results are visualized in a stacked bar chart showing the first 2 fields. If you added more than 2 fields to the query, click a bar to drill down into the next 2 fields, filtered by the bar you clicked.


View results in a table

Results will also be displayed in a table format, allowing you to drill down on groups by clicking the arrows to display subsequent fields.


Modify Groupby results

Increase Groupby Limit

You can increase the number of groups returned by your Groupby query by adding the limit keyword, limit(n), at the end of your query, where n represents the number of groups. Refer to the following table for the maximum number of groups per number of group keys. Any value greater than the maximum number of groups defaults to the maximum.

Number of group keys | Maximum groups
groupby(x0) | 10,000
groupby(x0, x1) | 20,000
groupby(x0, x1, x2) | 30,000
groupby(x0, x1, x2, x3) | 40,000
groupby(x0, x1, x2, x3, x4) | 50,000

The following query sets a limit of 350: groupby(source_asset_address) calculate(count) sort(desc) limit(350)

If you are grouping by multiple fields, you can pass in additional values to limit the number of rows returned for each individual group. When grouping by multiple fields, the limit applies across groups according to the examples in the following table:

Query | Groupby limits applied
groupby(x, y) limit(5) | groupby(x, y) limit(5, 5)
groupby(x, y, z) limit(20, 12) | groupby(x, y, z) limit(20, 12, 12)
groupby(x, y) | groupby(x, y) limit(40, 40)

The following query sets a limit of 100 groups for the first field in the Groupby function, and 20 for the second field.

groupby(source_asset_address, service) calculate(count) sort(desc) limit(100, 20)

By default, LEQL limits each group to 40 results if you do not use a limit keyword in your query.

Sorting

In Advanced mode, you can sort returned results in ascending or descending order using a query similar to the following: where(result=SUCCESS) groupby(destination_user) calculate(count) sort(desc)

You can use desc or descending as keywords to sort in descending order, and asc or ascending to sort in ascending order. To sort by the name of the group instead of the value, use asc#key or desc#key.
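For example, the following query (using the destination_user field from the earlier examples) sorts the groups alphabetically by username rather than by count:

groupby(destination_user) calculate(count) sort(asc#key)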

If you are grouping by multiple fields, then you can pass in additional sorting criteria.

The following query sorts the results first by the count of the first group in ascending order, and then by the name of the second group in descending order.

groupby(destination_user, result) calculate(count) sort(asc, desc#key)

By default, LEQL sorts results in a descending order if you do not use a sort keyword in your query.

Literal Count

If fewer than 10,000 unique groups are found in the data set, Groupby uses Literal Count. Literal Count fetches every log line in the data set, groups identical events together, and counts each occurrence of the unique elements.

Use the following query for a count of unique elements: calculate(unique:source_address), where source_address is the field you plan to group by. This returns the number of unique values for the source_address field, visualized in a time-based chart.

When you group values by unique identifiers, port numbers, or IP addresses, the number of unique values typically increases if you increase the time window.

Step 4: Create dashboards and visualizations (optional)

View your data in Visual mode and Add Cards to build visualizations. You can Create Dashboards and Reports from your queries, or you can Export log data.

Step 5: Create a Custom Alert (optional)

Create an Alert from specific log indicators, such as invalid logins.