AWS AppFabric

AWS AppFabric quickly connects SaaS applications across your organization, so IT and security teams can easily manage and secure applications. One particularly powerful capability of AppFabric is that it can collect and normalize audit logs from every SaaS application it manages.

The AWS AppFabric event source in InsightIDR allows you to configure AppFabric to send all of the logs it collects from your SaaS applications to InsightIDR. In other words, you can send logs from more than a dozen of the most popular SaaS business applications to InsightIDR using a single event source. This can simplify both initial setup and ongoing maintenance of your InsightIDR event sources.

To start using the AWS AppFabric integration, you’ll need to:

  1. Create AWS resources.
  2. Configure AWS AppFabric.
  3. Configure InsightIDR to collect data from the event source.
  4. Test the configuration.

Create AWS Resources

Task 1: Create an EC2 Collector

If you haven’t already, you will need to create an InsightIDR Collector that runs inside your AWS account on an EC2 instance. For instructions, visit Access AWS Resources with EC2 IAM Roles in our documentation.

Task 2: Create an S3 bucket and supporting resources

AppFabric will send the logs from your SaaS applications to an S3 bucket. The S3 bucket will need to have event notifications enabled so that InsightIDR knows when new logs are available. The AWS IAM Role being used by the InsightIDR Collector will also need to be given permission to access both the bucket and the event notifications. To make it as easy as possible, we have provided a CloudFormation template that will automatically perform all of these actions.
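If you want to see what deploying the template looks like before you run it, here is a minimal sketch using the AWS CLI. The template file name and stack name are placeholders; use the template file and any parameters that Rapid7 provides.

# Deploy the provided CloudFormation template from a local file (names are placeholders).
# The --capabilities flag is required because the template creates IAM resources.
aws cloudformation deploy \
  --template-file rapid7-appfabric-resources.yaml \
  --stack-name rapid7-appfabric-resources \
  --capabilities CAPABILITY_NAMED_IAM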

Manually create resources (Alternative)

If you can’t take advantage of the CloudFormation template, follow these steps to configure your AWS environment. With this option, you can either use an existing S3 bucket or create a new one.

Tip: Reduce AWS storage costs with lifecycle rules

If you only plan to use this S3 bucket to temporarily store logs until they are ingested by InsightIDR, you should consider implementing a lifecycle rule that deletes objects in the S3 bucket after 2 days. This will reduce your AWS storage costs. To learn more about lifecycle rules, read the AWS documentation at: https://docs.aws.amazon.com/AmazonS3/latest/userguide/how-to-set-lifecycle-configuration-intro.html.
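For example, assuming the AWS CLI and a placeholder bucket name, a rule that expires every object two days after creation could be saved in a file named lifecycle.json:

{
  "Rules": [
    {
      "ID": "expire-appfabric-logs",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Expiration": { "Days": 2 }
    }
  ]
}

Apply it to the bucket with:

aws s3api put-bucket-lifecycle-configuration \
  --bucket my-appfabric-logs-bucket \
  --lifecycle-configuration file://lifecycle.json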

To manually create resources:

  1. Do one of the following:
    • Use an existing S3 bucket.
    • Create a new S3 bucket in the AWS console.
  2. Whether you use an existing S3 bucket or create a new one, copy the name and ARN of the bucket as you’ll need to reference them later in the setup.
  3. Create a new SQS queue by following the AWS documentation at: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/step-create-queue.html
    • You must create a standard queue.
    • After you name the queue, you can leave all remaining settings at their default values. Save the queue.
  4. Reopen the SQS queue you just created. Copy the queue’s URL and ARN as you’ll need them later in the setup process.
  5. Go to the Access policy tab and click the button to edit the Access Policy. Replace the policy in the text editor with the following, being sure to update the placeholder values:
{
  "Version": "2012-10-17",
  "Id": "R7AppFabricS3QueuePolicy",
  "Statement": [
    {
      "Sid": "Allow-Owner-Access",
      "Effect": "Allow",
      "Principal": {
        "AWS": "INSERT AWS ACCOUNT ID"
      },
      "Action": "SQS:*",
      "Resource": "INSERT ARN OF SQS QUEUE"
    },
    {
      "Sid": "Allow-S3-SendMessage",
      "Effect": "Allow",
      "Principal": {
        "Service": "s3.amazonaws.com"
      },
      "Action": "SQS:SendMessage",
      "Resource": "INSERT ARN OF SQS QUEUE",
      "Condition": {
        "StringEquals": {
          "aws:SourceAccount": "INSERT AWS ACCOUNT ID"
        },
        "ArnLike": {
          "aws:SourceArn": "INSERT ARN OF S3 BUCKET"
        }
      }
    }
  ]
}
  6. Set up a new event notification for the S3 bucket you selected or created in step 1 by following the AWS documentation at: https://docs.aws.amazon.com/AmazonS3/latest/userguide/enable-event-notifications.html
    • The name for the event can be anything you want.
    • For “Event types”, choose All object create events.
    • For “Destination” select SQS and then specify the SQS queue you just created.
    • All other fields can be left alone.
  7. Update the role attached to the EC2 instance running the InsightIDR Collector. You will need to add a policy with the following permissions (a CLI sketch covering steps 3 through 7 follows this list):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:Get*",
        "s3:List*"
      ],
      "Resource": [
        "arn:aws:s3:::S3BUCKETNAMEGOESHERE",
        "arn:aws:s3:::S3BUCKETNAMEGOESHERE/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "sqs:ReceiveMessage",
        "sqs:DeleteMessage"
      ],
      "Resource": [
        "INSERT ARN OF SQS QUEUE"
      ]
    }
  ]
}
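If you prefer to script steps 3 through 7, the following sketch shows equivalent AWS CLI calls. The bucket, role, and file names are placeholders, and the two policy files refer to the JSON documents shown above with their placeholder values filled in.

# Steps 3-4: create a standard queue and capture its URL and ARN.
QUEUE_URL=$(aws sqs create-queue --queue-name rapid7-appfabric-queue \
  --query QueueUrl --output text)
QUEUE_ARN=$(aws sqs get-queue-attributes --queue-url "$QUEUE_URL" \
  --attribute-names QueueArn --query Attributes.QueueArn --output text)

# Step 5: the queue access policy must be in place before S3 will accept the
# event notification. The simplest option is to paste the policy shown earlier
# into the Access policy editor in the SQS console.

# Step 6: send all object-create events from the bucket to the queue.
aws s3api put-bucket-notification-configuration \
  --bucket my-appfabric-logs-bucket \
  --notification-configuration '{
    "QueueConfigurations": [
      {
        "Id": "rapid7-appfabric-new-logs",
        "QueueArn": "'"$QUEUE_ARN"'",
        "Events": ["s3:ObjectCreated:*"]
      }
    ]
  }'

# Step 7: attach the Collector permissions (saved as collector-policy.json)
# to the role used by the EC2 instance running the Collector.
aws iam put-role-policy \
  --role-name InsightIDR-Collector-Role \
  --policy-name R7AppFabricCollectorAccess \
  --policy-document file://collector-policy.json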

Configure AWS AppFabric

Task 1: Enable AppFabric

Enable AppFabric by following the AWS documentation at: https://docs.aws.amazon.com/appfabric/latest/adminguide/getting-started.html. You will need to create an app bundle and create at least one app authorization before you can proceed to the next step.

Task 2: Create audit log ingestions in AppFabric

Authorizing an application in AppFabric means that AppFabric can access the logs for that SaaS application. However, for AppFabric to send the logs it collects to InsightIDR, you also need to set up an audit log ingestion in AppFabric for the application. Follow the AWS instructions to create a new audit log ingestion: https://docs.aws.amazon.com/appfabric/latest/adminguide/getting-started.html#getting-started-3-set-up-ingestion

When configuring the ingestion, indicate that the destination is an existing S3 bucket and then select the S3 bucket you created earlier. For the Schema & Format, be sure to select OCSF - JSON.

Repeat the process to create an ingestion for every application whose logs you want to send to InsightIDR.

You can only have one audit log ingestion per application

In AppFabric, you can only create one ingestion per application, but each ingestion can send its logs to up to 5 S3 buckets. If you try to set up an ingestion and get an error such as “An ingestion with the same parameter exists”, it means that an ingestion has already been configured for that application. If this happens, locate the existing ingestion for the application, open it, and add a new destination. Then follow the instructions above for configuring the destination.

Configure InsightIDR to collect data from the event source

After you complete the prerequisite steps and configure AppFabric to send data, you must add the event source in InsightIDR.

To configure the new event source in InsightIDR:

  1. From the left menu, go to Data Collection and click Setup Event Source > Add Event Source.
  2. Do one of the following:
    • Search for AWS AppFabric in the event sources search bar.
    • In the Product Type filter, select Cloud Service.
  3. Select the AWS AppFabric event source tile.
  4. (Optional) Name your event source.
  5. Choose the Collector running in your AWS environment.
  6. Check the Send unparsed data box. Although this is typically an optional setting, we strongly encourage you to enable it for this event source as it might take some time for Rapid7 to add parsing when AppFabric adds support for a new SaaS application.
  7. Select the appropriate options for LDAP account attribution and Active Directory domain to ensure that users mentioned in your application logs are correctly attributed in InsightIDR.
  8. For Collection Method, choose SQS Messages.
  9. Under AWS Authentication, select EC2 Instance Profile Credential.
  10. In the SQS Queue URL field, enter the SQS queue URL that you retrieved when creating AWS resources.
  11. Click Save.

Test the Configuration

To verify that logs are flowing from AppFabric to InsightIDR:

  1. In the list of event sources in InsightIDR, locate the new event source that you created for AppFabric. Click the View Raw Log button.
    • If you see log messages in the box, that means that logs are flowing to the Collector. Wait approximately 7 minutes, then continue to step 2 to verify that data is flowing into Log Search.
    • If you don’t see log messages in the Raw Log view, open the S3 bucket that AppFabric is sending logs to and confirm that new logs have been added since you configured the InsightIDR event source. Note that InsightIDR will only start picking up new logs that are generated after the event source is configured.
  2. Verify that log entries are appearing in Log Search.
    • From the left menu, go to Log Search.
    • Select the applicable log set and log. Logs collected from the AppFabric event source will typically appear under the Cloud Service Activity log set. Depending on the application, logs might also appear under the Ingress Activity, SSO Authentication, or Third Party Alert log sets.
  3. If you don’t see any logs in the expected log sets, select the Unparsed Data log set. Logs displayed here indicate one of the following things:
    • In AppFabric, you have configured the destination for your audit log ingestions to send data in a schema and format other than OCSF - JSON.
    • We are not yet parsing logs for that particular application. If you have confirmed you are sending logs in OCSF - JSON format, submit a support ticket to request that we add support for the application that is not being parsed.

Logs take at least 7 minutes to appear in Log Search after you set up the event source.

Each SaaS application generates logs at different cadences

When validating that your SaaS app logs are appearing in InsightIDR, be aware that each SaaS app generates and exports logs at different intervals. For example, Zoom generates logs only once every 24 hours. If you are not seeing logs from the AppFabric event source in InsightIDR, confirm that log files are showing up in the S3 bucket before troubleshooting anything in InsightIDR.
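One quick way to check, assuming you have the AWS CLI available and substituting your own bucket name, is to list the most recently modified objects in the bucket:

# Show the five most recently modified objects in the AppFabric destination bucket.
aws s3api list-objects-v2 \
  --bucket my-appfabric-logs-bucket \
  --query 'sort_by(Contents, &LastModified)[-5:].[LastModified, Key]' \
  --output table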

Sample logs

In Log Search, the log that is generated uses the name of your event source by default. Logs collected from the AppFabric event source will typically appear under the Cloud Service Activity log set. Depending on the application, logs might also appear under the Ingress Activity, SSO Authentication, or Third Party Alert log sets.

Here is a typical log entry that is created by the event source:

{
  "activity_id": 1,
  "activity_name": "Logon",
  "actor": {
    "session": {
      "uid": "3e30440a-9ed2-492a-9320-7dc5ba894b7b"
    },
    "user": {
      "email_addr": "harry.potter@fabric-stack6.gebarke.people.aws.dev",
      "name": "harry.potter@fabric-stack6.gebarke.people.aws.dev",
      "type": "User",
      "type_id": 1,
      "uid": "52ba495d-fea0-4ec6-937e-7effa42246bf"
    }
  },
  "category_name": "Identity & Access Management",
  "category_uid": 3,
  "class_name": "Authentication",
  "class_uid": 3002,
  "device": {
    "ip": "72.21.196.64",
    "os": {
      "name": "MacOs",
      "type": "macOS",
      "type_id": 300
    },
    "type": "Chrome",
    "type_id": 99
  },
  "http_request": {},
  "metadata": {
    "event_code": "UserLoggedIn",
    "log_provider": "AWS AppFabric",
    "log_version": "2023-06-27",
    "product": {
      "name": "M365",
      "uid": "m365",
      "vendor_name": "M365"
    },
    "profiles": [
      "host"
    ],
    "uid": "6daea036-33b0-4583-8d7e-035ec0121302",
    "version": "v1.0.0-rc.3"
  },
  "raw_data": "{\"CreationTime\":\"2023-06-08T14:57:32\",\"Id\":\"6daea036-33b0-4583-8d7e-035ec0121302\",\"Operation\":\"UserLoggedIn\",\"OrganizationId\":\"2927531b-3c02-4bbb-b31c-4696a639c718\",\"RecordType\":15,\"ResultStatus\":\"Success\",\"UserKey\":\"52ba495d-fea0-4ec6-937e-7effa42246bf\",\"UserType\":0,\"Version\":1,\"Workload\":\"AzureActiveDirectory\",\"ClientIP\":\"72.21.196.64\",\"ObjectId\":\"00000003-0000-0000-c000-000000000000\",\"UserId\":\"harry.potter@fabric-stack6.gebarke.people.aws.dev\",\"AzureActiveDirectoryEventType\":1,\"ExtendedProperties\":[{\"Name\":\"ResultStatusDetail\",\"Value\":\"Redirect\"},{\"Name\":\"UserAgent\",\"Value\":\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36\"},{\"Name\":\"RequestType\",\"Value\":\"OAuth2:Authorize\"}],\"ModifiedProperties\":[],\"Actor\":[{\"ID\":\"52ba495d-fea0-4ec6-937e-7effa42246bf\",\"Type\":0},{\"ID\":\"harry.potter@fabric-stack6.gebarke.people.aws.dev\",\"Type\":5}],\"ActorContextId\":\"2927531b-3c02-4bbb-b31c-4696a639c718\",\"ActorIpAddress\":\"72.21.196.64\",\"InterSystemsId\":\"c6832e41-5226-42a8-a79f-cee55f1c7e17\",\"IntraSystemId\":\"6daea036-33b0-4583-8d7e-035ec0121302\",\"SupportTicketId\":\"\",\"Target\":[{\"ID\":\"00000003-0000-0000-c000-000000000000\",\"Type\":0}],\"TargetContextId\":\"2927531b-3c02-4bbb-b31c-4696a639c718\",\"ApplicationId\":\"bfe99510-020c-4c18-8c4b-207f1c6529e6\",\"DeviceProperties\":[{\"Name\":\"OS\",\"Value\":\"MacOs\"},{\"Name\":\"BrowserType\",\"Value\":\"Chrome\"},{\"Name\":\"IsCompliantAndManaged\",\"Value\":\"False\"},{\"Name\":\"SessionId\",\"Value\":\"3e30440a-9ed2-492a-9320-7dc5ba894b7b\"}],\"ErrorNumber\":\"0\"}",
  "severity_id": 0,
  "status": "Unknown",
  "status_id": 0,
  "time": "2023-06-08T14:57:32.000Z",
  "type_name": "Authentication: Logon",
  "type_uid": 300201,
  "unmapped": {
    "other_resources": [
      {
        "data": {},
        "name": "Claim",
        "type": "Unknown",
        "uid": "00000003-0000-0000-c000-000000000000"
      }
    ]
  },
  "user": {
    "type": "Unknown",
    "type_id": 0
  }
}