Reporting, Alerts & SIEM

Keeper's Advanced Reporting and Alerts Module (ARAM) provides advanced event logging to meet compliance requirements.

Overview

Keeper's Advanced Reporting & Alerts Module ("ARAM") is a critical component of the Keeper Security platform which provides Keeper Administrators and Compliance teams tools for monitoring overall usage and adherence to policies.

Key Capabilities

  • Reporting Engine: Run custom, time-based reports across 200+ event types, broken down by category (e.g. Security Events, Administrative Actions, General Usage, KeeperPAM, KSM). Filter on User, Event Type or Attribute (e.g. Record UID, Shared Folder UID, Geolocation).

  • Alerts: Set alert triggers that send email, SMS or webhook notifications based on specific event types (for example, notify Admins of any policy changes).

  • External Logging: Integrate with any existing SIEM solution such as Splunk, Sumo Logic or LogRhythm.

  • BreachWatch Monitoring: Get notified of and track BreachWatch events (e.g. user notified of a high-risk password, high-risk password resolved).

  • Commander CLI / SDK Integration: Keeper Commander can perform customized reporting and automation.

  • Compliance Auditing: Generate reports specifically to address SOX, ISO and SOC compliance auditing requirements.

Reporting Interface

Reporting & Alerts

The Reporting & Alerts dashboard provides an overview of the top 5 events, two built-in reports and your custom reports. The "Recent Activity" report is a built-in report that provides basic event tracking for the last 1,000 events across 16 event types. Customers can upgrade to the Advanced Reporting and Alerts module to track over 100 event types and generate custom reports and alert notifications.

The "Recent Activity" and "All Security Events" reports are provided in all Keeper Business and Enterprise subscriptions. Custom reporting and alerts is a feature of the Advanced Reporting and Alerts Module (ARAM). To take advantage of this capability, please contact your Keeper Security account manager or upgrade your subscription through the Secure Add Ons interface of the Admin Console.

Additionally, a user status report is available via the dashboard. See the Dashboard section in this guide.

Admins can also create custom reports by clicking Add Custom Report.

Preview the results by clicking Apply, and if you want to use the report in the future, click Save. You can export the events as a file in JSON, CSV or Syslog format.

New events generated by Keeper vault devices can take up to 15 minutes to appear in the reporting module.

Geolocation based on IP address

The accuracy of geolocation based on IP address varies depending on the database used to identify the user's location. The precision of geolocation data depends on several factors, most importantly how well registries validate the data they receive. If the information associated with an IP address is incorrect, its usefulness is reduced. Geolocation is especially challenging for mobile phone usage, where IP addresses change frequently and mobile carriers use centralized gateways through which users reach the internet. Additionally, if users are behind proxies or VPNs, the location data will invariably be incorrect.

Keeper subscribes to one of the industry's most reliable providers, which performs quality assurance by regularly validating data quality against known IP addresses sourced from the public.

Timeline Chart

The Timeline Chart displays events over a 24-hour, 7-day or 30-day period. Clicking on any event row will open a report containing all events from the selected time period.

Timeline Chart

Alerts

The Alerts module allows you to create event-based triggers that generate either email or SMS alerts.

Alerts

New alerts are created similarly to new reports: click Add Alert and specify a name and filter criteria. You can add one or more recipients using an email address, a phone number (for SMS) or both. Recipients don't have to be part of your enterprise; any email address or phone number can be provided. The first recipient is predefined as the user who generated the event. This is "off" by default, and you will need to toggle it "on" to send alerts (email only) to the originator.

Specifying a broad event and attribute filter could generate a lot of alerts. Adjust alert frequency and set narrow event types and filters to reduce alert noise.

To prevent recipients from receiving too many emails or SMS messages, alerts can be throttled. One way to throttle is to specify the Alert Frequency. For example, if you set the frequency to "Once Per Time Period" with a period of 1 hour, all events matching the alert filter will still count as alert "occurrences", but a message will be sent only if 1 hour has passed since the previous message. Another way to throttle an alert is to pause it using the toggle switch. A paused alert will also accumulate "occurrences" without sending messages. When resumed, the very next matching event will trigger a message containing the number of events that occurred while the alert was paused.
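For illustration only, the snippet below models the "Once Per Time Period" behavior described above. This is a simplified sketch of the throttling concept, not Keeper's actual implementation:

from datetime import datetime, timedelta

# Simplified model of "Once Per Time Period" throttling: every matching event counts as
# an "occurrence", but a message is only sent if the configured period has elapsed since
# the previous message. Illustrative only; not Keeper's implementation.
class ThrottledAlert:
    def __init__(self, period: timedelta):
        self.period = period
        self.pending_occurrences = 0
        self.last_sent = None

    def on_matching_event(self, now: datetime):
        self.pending_occurrences += 1
        if self.last_sent is None or now - self.last_sent >= self.period:
            print(f"Send alert covering {self.pending_occurrences} occurrence(s)")
            self.pending_occurrences = 0
            self.last_sent = now

alert = ThrottledAlert(period=timedelta(hours=1))
alert.on_matching_event(datetime(2024, 1, 1, 9, 0))    # first match: message sent
alert.on_matching_event(datetime(2024, 1, 1, 9, 30))   # within the hour: counted, not sent
alert.on_matching_event(datetime(2024, 1, 1, 10, 5))   # over an hour later: sent, covers 2 occurrences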

Below is an example of an email alert:

You can view the alert history in the Alerts Sent tab, with the ability to drill down to see the individual events:

External SIEM Logging

If you are utilizing a 3rd party SIEM solution, the Keeper Admin Console can be configured to automatically feed live event data into external SIEM products. Currently supported systems include:

  • AWS S3 Bucket

  • Azure Monitor

  • Crowdstrike NG SIEM

  • Datadog

  • Devo

  • Elastic

  • Exabeam (LogRhythm)

  • Google Security Operations (Chronicle)

  • Logz.io

  • IBM QRadar

  • Splunk

  • Sumo Logic

  • Syslog

Event data is transmitted from Keeper's servers to the destination SIEM collector. Only one external logging method can be active at a time.

External Logging

Click Setup to activate the external logging solution. Setup is easy on each logging platform and typically only requires a few attributes to integrate.

Event Types

Within the Admin Console, the default "Recent Activity" report contains 16 event types. Keeper's Advanced Reporting and Alerts module supports approximately 100 event types.

The events captured by Keeper Enterprise are visible in the drop-down menus for report and alert configuration.

Event Type Filter

Enabling BreachWatch Events

By default, BreachWatch events from end-user devices are not collected and transmitted to the Advanced Reporting & Alerts module. These events are managed by the Role policy. To activate this feature, go to Role > Enforcement Policies > Vault Features and toggle "Send BreachWatch events to Reporting & Alerts and connected external logging systems" on.

Enable BreachWatch Events

Event Descriptions

A list of all available events captured by the Keeper Advanced Reporting and Alerts Module is provided in the chart below. The Event Code is utilized in the user interface and within the Keeper Commander CLI command parameters. The "Message" field is utilized by the Alerting module.

Within each event, there may be additional attributes such as Record UID, Shared Folder UID, Team UID and Username. These attributes appear within the event description and are also provided to the 3rd party SIEM provider in the format specified by the destination.

Raw Event Data Examples

Below are examples of two events in JSON format. Note that the Record UID is provided with the "record_update" event since it relates to a specific record.

{
  "record_uid" : "Uk6qLnfWVxWL9OQlsGdOUw",
  "audit_event" : "record_update",
  "remote_address" : "155.65.556.130",
  "client_version" : "Browser Extensions.12.3.0",
  "timestamp" : "2019-02-14T22:41:12.027Z",
  "username" : "testing@keepersecurity.com",
  "enterprise_id" : 12345
}

{
  "audit_event" : "login",
  "remote_address" : "168.123.45.130",
  "client_version" : "Web App.14.2.4",
  "timestamp" : "2019-02-14T22:40:08.655Z",
  "username" : "demo@keepersecurity.com",
  "client_version_new" : true,
  "enterprise_id" : 12345
}

Below is an example of a Syslog-format event that can be exported via Keeper Commander or into the 3rd party SIEM solution:

<110>1 2019-02-14T21:34:47Z 46.45.253.15 Keeper - 1132431639 [Keeper@Commander geo_location="Chicago, IL, US" keeper_version_category="MOBILE" audit_event_type="login_failure" keeper_version="iPhone 14.2.0" result_code="auth_failed" username="testing@keepersecurity.com" node_id="47377784242178"] User testing@keepersecurity.com login failed with code auth_failed

Note that "enterprise_id" is useful for distinguishing different Keeper Enterprise tenants within the same SIEM collector.

Locating the Record UID and Other Identifiers

The event data references several types of UID values such as Record UID, Shared Folder UID and Team UID. The Record UID and Shared Folder UID can be found either through the Keeper Commander CLI or through the Web Vault user interface.

Commander CLI

The Keeper Commander CLI provides command-line and SDK integration into Keeper's reporting system for more advanced use cases. The event data can be used for generating actionable reports.

Please see the following reporting related commands for more information:

  • audit-log

  • audit-report

  • user-report

  • security-audit-report

  • share-report

  • shared-records-report

  • msp-license-report

  • aging-report

  • action-report

  • compliance-report

Event Descriptions

Details on what triggers each event

A list of all available events captured by the Keeper Advanced Reporting and Alerts Module is provided in the chart below. The Event Code is utilized in the user interface and within the Keeper Commander CLI command parameters. The "Message" field is utilized by the Alerting module.

Within each event, there may be additional attributes such as Record UID, Shared Folder UID, Team UID and Username. These attributes appear within the event description and are also provided to the 3rd party SIEM provider in the format specified by the destination.

AWS S3 Bucket

Integrating Keeper SIEM push to an Amazon S3 bucket endpoint

Overview

Keeper supports event streaming into an Amazon S3 bucket. Setup instructions are below.

S3 Bucket Configuration

(1) In AWS, create an S3 bucket and ensure that all permissions are locked down.

(2) Create a user account without console access and attach a minimal policy that only allows putting objects into the bucket. Example below.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::name_of_bucket/*"
            ]
        }
    ]
}

(3) Generate an Access Key and Secret Key and provide them to the Admin Console user interface along with the Bucket Name. You can select different time intervals for the file uploads. You can also select the file format, which includes:

  • JSON

  • Syslog

  • CSV

Amazon S3 Integration Settings

For the Bucket Name, provide a full ARN that includes the region. For example: arn:aws:s3:us-west-2::my-keeper-events

Files are posted only when events occur during the interval. In the example below, JSON files are posted every hour when there is activity in the system.

If you set the time frame to "day", all events accumulate until the day has ended (UTC) and then a new file containing the full day's events is added to your S3 bucket.
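To confirm that files are being delivered on the chosen interval, you can list the most recent objects in the bucket. Below is a minimal sketch using boto3; the bucket name is a placeholder for the bucket you configured, and it assumes credentials with s3:ListBucket on that bucket (the minimal upload policy above only grants s3:PutObject, so use separate admin credentials for this check):

import boto3

bucket = "my-keeper-events"   # placeholder: the bucket name configured in the Admin Console

s3 = boto3.client("s3")
response = s3.list_objects_v2(Bucket=bucket)
# Print the five most recently written log files
for obj in sorted(response.get("Contents", []), key=lambda o: o["LastModified"], reverse=True)[:5]:
    print(obj["LastModified"], obj["Key"], obj["Size"])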

Log File Examples

Syslog File

<165>1 2023-10-30T02:18:43.776Z keepersecurity.jp keeper - - - {"audit_event":"device_user_approval_requested","device_name":"iPhone","remote_address":"12.34.56.78","category":"security","client_version":"iPhone.16.9.4","username":"craig@keeperdemo.io","enterprise_id":50,"client_version_new":true}^M

<165>1 2023-10-30T02:19:19.587Z keepersecurity.jp keeper - - - {"audit_event":"device_approved","device_name":"iPhone","remote_address":"12.34.56.78","category":"security","client_version":"iPhone.16.9.4","username":"craig@keeperdemo.io","enterprise_id":50}^M

<165>1 2023-10-30T02:19:51.774Z keepersecurity.jp keeper - - - {"audit_event":"login","channel":"PASS","remote_address":"12.34.56.78","category":"login","client_version":"iPhone.16.9.4","username":"craig@keeperdemo.io","enterprise_id":50}^M

JSON File

[{"audit_event":"login","remote_address":"12.34.56.78","client_version":"iPhone.16.9.3","timestamp":"2023-09-20T21:33:17.545Z","username":"craig@keeperdemo.io","enterprise_id":67241},{"audit_event":"login","remote_address":"12.34.56.78","client_version":"iPhone.16.9.3","timestamp":"2023-09-20T21:33:27.200Z","username":"craig@keeperdemo.io","enterprise_id":67241},{"audit_event":"login","remote_address":"12.34.56.78","client_version":"iPhone.16.9.3","timestamp":"2023-09-20T21:33:22.740Z","username":"craig@keeperdemo.io","enterprise_id":67241},{"record_uid":"ac3QeHmeGz6Jyb7wnuHnfQ","audit_event":"open_record","remote_address":"12.34.56.78","client_version":"iPhone.16.9.3","timestamp":"2023-09-20T21:33:56.634Z","username":"craig@keeperdemo.io","enterprise_id":67241},{"record_uid":"ac3QeHmeGz6Jyb7wnuHnfQ","audit_event":"fast_fill","remote_address":"12.34.56.78","client_version":"iPhone.16.9.3","timestamp":"2023-09-20T21:33:56.634Z","username":"craig@keeperdemo.io","enterprise_id":67241}]

CSV File

audit_event,name,remote_address,category,client_version,timestamp,username,enterprise_id
audit_sync_setup,s3,12.34.56.78,policy,EMConsole.16.15.3,1698759022585,craig@keeperdemo.io,50
role_created,,12.34.56.78,policy,EMConsole.16.15.3,1698759049640,craig@keeperdemo.io,50
role_enforcement_changed,,12.34.56.78,policy,EMConsole.16.15.3,1698759049876,craig@keeperdemo.io,50
added_to_role,,12.34.56.78,security,EMConsole.16.15.3,1698759136968,craig@keeperdemo.io,50
added_to_role,,12.34.56.78,security,EMConsole.16.15.3,1698759136979,craig@keeperdemo.io,50
lock_user,,12.34.56.78,security,EMConsole.16.15.3,1698759169004,craig@keeperdemo.io,50
added_to_role,,12.34.56.78,security,EMConsole.16.15.3,1698759134936,craig@keeperdemo.io,50
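The "timestamp" values in the CSV export are Unix epoch times in milliseconds. Below is a minimal sketch of reading a downloaded file and converting them to readable UTC datetimes; the filename keeper_events.csv is a local placeholder:

import csv
from datetime import datetime, timezone

with open("keeper_events.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Convert epoch milliseconds to an ISO 8601 UTC timestamp
        ts = datetime.fromtimestamp(int(row["timestamp"]) / 1000, tz=timezone.utc)
        print(ts.isoformat(), row["audit_event"], row["username"])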

Azure Monitor for Microsoft Sentinel

Integration of Keeper ARAM events with Azure Monitor

Overview

Azure Monitor and Microsoft Sentinel are closely related: Sentinel leverages Azure Monitor's infrastructure for log management and data collection. Azure Monitor provides the foundational platform, including Log Analytics and the Azure Monitor Agent, on which Sentinel is built. Sentinel then uses this data for security information and event management (SIEM) capabilities.

Keeper supports event streaming directly into Azure Monitor Log Analytics Workspace tables using the Azure Logs Ingestion API. As of January 2025, this is the preferred method and API used for streaming event data into Azure logs.

Setup Instructions

Go to the Azure Portal to begin the setup.

Step 1. Create an App Registration

The Azure App Registration is used to authenticate API requests to the Logs Ingestion API.

  • Navigate to App registrations > New Registration.

Fill out the form:

  • Name: KeeperLogging

  • Supported Account Types: Use the default option (Single tenant).

  • Leave Redirect URI blank for now.

  • Click Register.

After registering:

  • Click on "Expose an API"

  • Click "Set" for the Application ID URI

  • Accept the default suggested URI (it should be something like api://[client-id])

Step 2. Create Client Secret

From the App Registrations section of Azure, go to Manage > Certificates & Secrets > New Client Secret.

  • Add a description and expiration period.

  • Copy the generated "Value" and store it in your Keeper vault.

  • Save this value for the last step ("Client Secret Value").

On the "Overview" screen, also note the Tenant ID and Display Name.

Save the following entries for later:

  • Application (client) ID

  • Client Secret ID

  • Client Secret Value

  • Directory (tenant) ID (found on the App registrations page)

Step 3. Create Log Analytics Workspace

A Log Analytics Workspace is the core resource where Azure Monitor collects and stores log data. If you already have one, you can skip this step.

  • From Azure, go to Log Analytics Workspaces

  • Click Create and configure:

    • Subscription: Choose your Azure subscription.

    • Resource Group: Create a new resource group or select an existing one.

    • Name: Give your workspace a meaningful name (e.g., KeeperLogsWorkspace).

    • Region: Choose a region

    • Click Review + Create and then Create.

Step 4. Assign Role to App Registration

You need to assign the "Log Analytics Contributor" role on the Log Analytics Workspace to the KeeperLogging application. From the Log Analytics Workspace:

  • Click on the Workspace (e.g. KeeperDemo1)

  • Select Role assignments

  • Click Add > Add role assignment

  • Type "Log Analytics Contributor" and select that role

  • Click "+Select members" and select the KeeperLogging application from the list

  • Assign it to the "KeeperLogging" application

Step 5. Create a Data Collection Endpoint (DCE)

The Data Collection Endpoint is required before you can create a Data Collection Rule.

  • From Azure, open Data Collection Endpoint (DCE)

  • Search for "Data Collection Endpoints" and click Create.

Configure the following:

  • Subscription: Select your Azure subscription.

  • Resource Group: Use the same resource group you plan to use for the DCR.

  • Region: Choose a region

  • Name: Give it a meaningful name (e.g., KeeperLogsEndpoint).

Note the "Logs Ingestion URL" which is used later.

Example: keeperlogsendpoint-mcag.eastus-1.ingest.monitor.azure.com

Step 6. Create a Table and DCR

From the Log Analytics workspaces, open the Keeper workspace and select "Tables" and Create a new table.

  • Select "New custom log (DCR-based)".

  • In this example, we are calling it "KeeperLogs".

  • Create a new Data Collection Rule

  • Save the below JSON as a file on your computer

  • When prompted, upload the below JSON file as a Data Sample:

[
  {
    "TimeGenerated": "2025-01-23T01:31:11.123Z",
    "audit_event": "some_event",
    "remote_address": "10.15.12.192",
    "category": "some_category_id",
    "client_version": "EMConsole.17.0.0",
    "username": "email@company.com",
    "enterprise_id": 1234,
    "timestamp": "2025-01-23T01:31:11.123Z",
    "data": {
      "node_id": "abc12345",
      "record_uid": "B881237126",
      "folder_uid": "BCASD12345",
      "some_flag": true
    }
  },
  {
    "TimeGenerated": "2025-01-23T01:31:11.124Z",
    "audit_event": "some_event",
    "remote_address": "10.15.12.192",
    "category": "some_category_id",
    "client_version": "EMConsole.17.0.0",
    "username": "email@company.com",
    "enterprise_id": 1234,
    "timestamp": "2025-01-23T01:31:11.123Z",
    "data": {
      "node_id": "abc12345",
      "record_uid": "B881237126",
      "folder_uid": "BCASD12345",
      "some_flag": true
    }
  },
  {
    "TimeGenerated": "2025-01-23T01:31:11.125Z",
    "audit_event": "some_event",
    "remote_address": "10.15.12.192",
    "category": "some_category_id",
    "client_version": "EMConsole.17.0.0",
    "username": "email@company.com",
    "enterprise_id": 1234,
    "timestamp": "2025-01-23T01:31:11.123Z",
    "data": {
      "node_id": "abc12345",
      "record_uid": "B881237126",
      "folder_uid": "BCASD12345",
      "some_flag": true
    }
  }
]

Review the change and submit the request to create the table.

In this example, it shows up as KeeperLogs_CL (Azure appends the _CL).

Step 7. Assign App Permissions to DCR

From the Data collection rules (DCR) area of Azure:

  • Click on the DCR (e.g. KeeperDCR)

  • Select Role assignments

  • Click Add > Add role assignment

  • Type "Monitoring Metrics Publisher" and select that role

  • Click "+Select members" and select the KeeperLogging application from the list

  • Assign it to the "KeeperLogging" application

Repeat this process and add "Monitoring Contributor" and "Monitoring Reader".

Step 8. Assign App Permissions to DCE

From the Data collection endpoints (DCE) area of Azure:

  • Click on the DCE (e.g. KeeperLogsEndpoint)

  • Select Role assignments

  • Click Add > Add role assignment

  • Type "Monitoring Metrics Publisher" and select that role

  • Click "+Select members" and select the "KeeperLogging" application from the list

  • Assign it to the "KeeperLogging" application

Repeat this process and add "Monitoring Contributor".

At this point, everything is configured on the Azure side. Next, set up the Admin Console.

Step 9. Update Admin Console

In the Keeper Admin Console, log in as the Keeper Administrator. Then go to Reporting & Alerts and select "Azure Monitor Logs".

Provide the following information from Step 2 above into the Admin Console:

  • Azure Tenant ID: You can find this from Azure's "Subscriptions" area.

  • Application (client) ID: This is located in the App registration (KeeperLogging) overview screen

  • Client Secret Value: This is the Client Secret Value from the app registration secrets.

  • Endpoint URL: This is a URL that is created in the following specific format: https://<collection_url>/dataCollectionRules/<dcr_id>/streams/<table>?api-version=2023-01-01

To assemble the Endpoint URL:

  • <Collection URL> This comes from Step (5) above

  • <DCR_ID> From the Data Collector Rule, copy the "Immutable Id" value, e.g. dcr-xxxxxxx

  • <TABLE> This is the table name created by Azure, e.g. Custom-KeeperLogs_CL

https://<Collection_URL>/dataCollectionRules/<DCR_ID>/streams/<TABLE>?api-version=2023-01-01

Setup Complete!

When SIEM logs are sent from Keeper to Azure Monitor, the data will begin to populate in the Custom Logs table in a few minutes.


Troubleshooting

For testing purposes, you can generate a Bearer Token and send an API request to the Azure Monitor API to understand how the process works.

Get a Bearer Token

Replace the following:

  • <Tenant_ID>: Your Tenant ID from Step 9 above

  • <Application_ID>: The Application (client) ID from Step 9 above

  • <Client_Secret_Value>: The Client Secret Value from Step 9 above

curl -X POST 'https://login.microsoftonline.com/<Tenant_ID>/oauth2/v2.0/token' \
-H 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'grant_type=client_credentials' \
--data-urlencode 'client_id=<Application_ID>' \
--data-urlencode 'client_secret=<Client_Secret_Value>' \
--data-urlencode 'scope=https://monitor.azure.com/.default'

The scope must change based on the environment:

  • Azure public cloud: https://monitor.azure.com

  • Azure US Government cloud: https://monitor.azure.us

Executing this curl request will produce a token:

{"token_type":"Bearer","expires_in":3599,"ext_expires_in":3599,"access_token":"xxxxx"}

Use the token and send a Curl request for a Keeper event log in the next step.

Send SIEM Events

Send a curl request as shown below, replacing the following:

  • <ENDPOINT_URL>: The constructed URL from Step 9 above

  • <TOKEN>: The Bearer token from above

curl -X POST "<ENDPOINT_URL>" \
-H "Authorization: Bearer <TOKEN>" \
-H "Content-Type: application/json" \
-d '[
    {
      "TimeGenerated": "2025-01-23T01:31:11.123Z",
      "audit_event": "event_one",
      "remote_address": "10.15.12.192",
      "category": "msp",
      "client_version": "EMConsole.17.0.0",
      "username": "email@company.com",
      "enterprise_id": 1234,
      "timestamp": "2025-01-23T01:31:11.123Z",
      "data": {
        "node_id": "abc12345",
        "record_uid": "B881237126",
        "folder_uid": "BCASD12345",
        "some_flag": true
      }
    },
    {
      "TimeGenerated": "2025-01-23T01:31:11.124Z",
      "audit_event": "event_two",
      "remote_address": "10.15.12.192",
      "category": "general",
      "client_version": "EMConsole.17.0.0",
      "username": "email@company.com",
      "enterprise_id": 1234,
      "timestamp": "2025-01-23T01:31:11.123Z",
      "data": {
        "node_id": "abc12345",
        "record_uid": "B881237126",
        "folder_uid": "BCASD12345",
        "some_flag": true
      }
    },
    {
      "TimeGenerated": "2025-01-23T01:31:11.125Z",
      "audit_event": "event_three",
      "remote_address": "10.15.12.192",
      "category": "security",
      "client_version": "EMConsole.17.0.0",
      "username": "email@company.com",
      "enterprise_id": 1234,
      "timestamp": "2025-01-23T01:31:11.123Z",
      "data": {
        "node_id": "abc12345",
        "record_uid": "B881237126",
        "folder_uid": "BCASD12345",
        "some_flag": true
      }
    }
  ]'

Note: The bearer token will expire after 1 hour.

The events will show up in Log Analytics Workspace after a few minutes.
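The same troubleshooting flow can be scripted. Below is a minimal Python sketch that mirrors the two curl commands above (request a token with the client credentials, then POST a small event batch to the Logs Ingestion endpoint); the placeholder values are the same ones gathered in Step 9:

import requests

tenant_id = "<Tenant_ID>"
client_id = "<Application_ID>"
client_secret = "<Client_Secret_Value>"
endpoint_url = "<ENDPOINT_URL>"   # https://<Collection_URL>/dataCollectionRules/<DCR_ID>/streams/<TABLE>?api-version=2023-01-01

# Step 1: obtain a bearer token (valid for about 1 hour)
token_resp = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://monitor.azure.com/.default",   # use https://monitor.azure.us/.default for US Government cloud
    },
)
token = token_resp.json()["access_token"]

# Step 2: send a sample event batch to the DCR stream
events = [{
    "TimeGenerated": "2025-01-23T01:31:11.123Z",
    "audit_event": "some_event",
    "remote_address": "10.15.12.192",
    "category": "security",
    "client_version": "EMConsole.17.0.0",
    "username": "email@company.com",
    "enterprise_id": 1234,
    "timestamp": "2025-01-23T01:31:11.123Z",
}]
resp = requests.post(
    endpoint_url,
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json=events,
)
print(resp.status_code, resp.text)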

Azure Sentinel (Deprecated)

Integrating Keeper SIEM event pushes to Azure Sentinel and Log Analytics

Microsoft has deprecated this logging API. Please see the Azure Monitor setup.

Overview

Keeper supports event streaming into Azure Sentinel / Log Analytics environments. This document describes the legacy method of streaming logs, which is being deprecated in 2025. Use the Azure Monitor method instead.

To proceed with this method... in Azure, go to Log Analytics workspaces > Select Workspace > Classic "Agents Management". From here you can retrieve a Workspace ID and Key. Provide these two fields to Keeper to start streaming logs to your selected workspace.

Workspace ID and Key
Azure Sentinel Integration Settings

Keeper will immediately start sending event data to the designated Azure Log Analytics workspace, under a custom table named Keeper_CL.

To view the logs, open the Log Analytics Workspace > Logs > select the Keeper_CL table.

Log Analytics Workspace Logs

Troubleshooting

If you need to troubleshoot the event log APIs, the below Python script will simulate the Keeper backend system sending event logs to your Azure environment. Replace the Workspace ID and Workspace Key before testing it.

import base64
import datetime
import hmac
import hashlib
import requests
import json

# Configuration
workspace_id = 'xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxx'
workspace_key = 'xxxxxx'
log_type = 'Keeper'

# Sample body
body = [
{
  "audit_event": "role_created",
  "remote_address": "11.22.33.44",
  "category": "policy",
  "client_version": "EMConsole.17.0.0",
  "username": "user@company.com",
  "enterprise_id": 6557,
  "timestamp": "2025-01-12T00:03:44.743Z",
  "role_id": "28162100560074"
},
{
  "audit_event": "role_enforcement_changed",
  "remote_address": "11.22.33.55",
  "category": "policy",
  "client_version": "EMConsole.17.0.0",
  "timestamp": "2025-01-13T00:03:44.743Z",
  "username": "user@company.com",
  "enterprise_id": 6557,
  "role_id": "28162100560074",
  "enforcement": "RESEND_ENTERPRISE_INVITE_IN_X_DAYS",
  "value": "7"
},
{
  "audit_event": "role_enforcement_changed",
  "remote_address": "11.22.33.66",
  "category": "policy",
  "client_version": "EMConsole.17.0.0",
  "timestamp": "2025-01-14T00:03:44.776Z",
  "username": "user@company.com",
  "enterprise_id": 6557,
  "role_id": "28162100560074",
  "enforcement": "SEND_BREACH_WATCH_EVENTS",
  "value": "ON"
},
{
  "audit_event": "role_enforcement_changed",
  "remote_address": "11.22.33.77",
  "category": "policy",
  "client_version": "EMConsole.17.0.0",
  "timestamp": "2025-01-15T00:03:44.835Z",
  "username": "user@company.com",
  "enterprise_id": 6557,
  "role_id": "28162100560074",
  "enforcement": "GENERATED_PASSWORD_COMPLEXITY",
  "value": "[{\"domains\":[\"_default_\"],\"length\":20,\"lower-use\":false,\"lower-min\":5}]"
},
{
  "audit_event": "audit_alert_sent",
  "category": "usage",
  "client_version": "Keeper Service.1.2.0",
  "username": "ALERT",
  "enterprise_id": 6557,
  "timestamp": "2025-01-16T01:31:11.123Z",
  "origin": "admin_permission_added",
  "name": "XXX123",
  "recipient": "user@company.com,+19165551212",
  "username_new": true,
  "client_version_new": true
}]

body_json = json.dumps(body)
method = 'POST'
content_type = 'application/json'
resource = '/api/logs'
rfc1123date = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
content_length = len(body_json)

signature_string = f"{method}\n{content_length}\n{content_type}\nx-ms-date:{rfc1123date}\n{resource}"
decoded_key = base64.b64decode(workspace_key)
signature = base64.b64encode(hmac.new(decoded_key, signature_string.encode('utf-8'), hashlib.sha256).digest()).decode('utf-8')

headers = {
    'Content-Type': content_type,
    'Authorization': f'SharedKey {workspace_id}:{signature}',
    'Log-Type': log_type,
    'x-ms-date': rfc1123date
}

uri = f'https://{workspace_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01'

response = requests.post(uri, data=body_json, headers=headers)
print(f"Response code: {response.status_code}")
print(f"Response message: {response.text}")

Crowdstrike NG SIEM

Integrating Keeper SIEM push to Crowdstrike NG SIEM

Overview

Keeper supports event streaming into Crowdstrike NG SIEM. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

1

Add the Data Connector

  • From the Crowdstrike dashboard, visit the Data onboarding > Data Connectors screen.

  • Select "+ Add connection" and search for Keeper

  • Click "Configure", assign a name, and then "Create connection".

Data connectors
2

Create the API Key

  • From the Data Connector screen, in the Keeper row click the overflow menu and then "Generate API Key".

  • Save the API Key and API URL for the next step.

Create API Key
Copy the API Key and API URL
3

Activate the Integration

  • From the Keeper Admin Console, go to Reporting & Alerts > External Logging

  • Select Crowdstrike Falcon Next-Gen

  • Provide the API Key and API URL from Step 2.

  • Click Test and then Save.

Setup Complete!

When SIEM logs are sent from Keeper to Crowdstrike, the data will begin to populate in the "Third Party" source within a few minutes.

Event Logs in Crowdstrike

Datadog

Integrating Keeper SIEM push to Datadog

Overview

Keeper supports event streaming into Datadog deployments. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

Datadog Integration Settings

The Datadog integration requires two fields:

  • URL (For example: datadoghq.com or datadoghq.eu)

  • API Key

To retrieve an API Key, please follow the instructions below:

  • In the Datadog interface, go to Organization Settings > API Keys

  • Create a new API key

Ensure that your API Key matches up with the destination server where your Datadog environment is hosted.
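Before configuring Keeper, you can sanity-check the key with Datadog's API key validation endpoint. Below is a minimal sketch; the site and key values are placeholders, and it assumes the standard api.<site>/api/v1/validate endpoint for your Datadog site:

import requests

site = "datadoghq.com"              # must match the site where your Datadog org is hosted (e.g. datadoghq.eu)
api_key = "<your_datadog_api_key>"  # placeholder

resp = requests.get(
    f"https://api.{site}/api/v1/validate",
    headers={"DD-API-KEY": api_key},
)
print(resp.status_code, resp.json())   # a "valid" response indicates the key works for this site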

Devo

Integrating Keeper SIEM push to Devo

Overview

Keeper supports event streaming into Devo deployments. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

Devo Integration Settings

Devo uses a standard "Syslog" push capability over TCP.

Ports: TCP Ports 514 and 6514 (TLS)

Fields Exported: "audit_event", "username", "client_version", "remote_address", "channel", "result_code", "email", "to_username", "client_version_new", "username_new", "file_format", "record_uid", "folder_uid", "folder_type", "shared_folder_uid", "attachment_id", "team_uid", "role_id"

Payload Format: Pipe-delimited, e.g. "audit_event=login|username=bob@foo.com|..."

Important: Ensure that the endpoint is using a valid signed SSL certificate. Keeper's systems will refuse to connect to an invalid or self-signed endpoint. Also, ensure that your Devo server allows traffic from Keeper servers. See Firewall Configuration page.
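As an illustration of the payload format above, the sketch below splits a pipe-delimited event into a dictionary (for example, for local testing of a custom Devo parser). It assumes values do not themselves contain the "|" delimiter; the sample line is assembled from the exported fields listed above:

sample = "audit_event=login|username=bob@foo.com|client_version=Browser Extensions.16.4.7|remote_address=12.34.56.78"

def parse_keeper_payload(payload: str) -> dict:
    # Each field is "key=value"; split on "|" and then on the first "=" of each part
    fields = {}
    for part in payload.split("|"):
        key, _, value = part.partition("=")
        fields[key] = value
    return fields

print(parse_keeper_payload(sample))
# {'audit_event': 'login', 'username': 'bob@foo.com', ...}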

Elastic

Integrating Keeper SIEM push to Elastic

Overview

Keeper supports event streaming into Elastic deployments. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

Elastic Integration Settings

Elastic integration uses a TCP push to the destination endpoint. The fields required are:

  • Host (e.g. mycompany.gcp.cloud.us.io:9243)

  • Search Index (e.g. keeper)

  • API Key

Please refer to the Elastic documentation for generating an API key:

https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-api-key.html

Important: Ensure that the endpoint is using a valid signed SSL certificate that has a domain matching the subject name in the certificate. The certificate must also include the full certificate chain from your CA. Keeper's systems will refuse to connect to a self-signed certificate. Also, ensure that your Elastic server allows traffic from Keeper servers. See Firewall Configuration page.

Troubleshooting

If Keeper is unable to connect to your Elastic instance, please check the following:

  • In the host field, do not type http or https

  • Make sure to include the port

  • If you are using a "Space", add the space name to the end of the Host field after the port. For example: example-elastic01.us-east.found.io:9243/s/spacename

  • Make sure any firewall in front of Elastic is configured per this page
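If the checklist above doesn't resolve the issue, a quick connectivity test from a machine with outbound internet access can help isolate the problem. The sketch below assumes the API key is supplied in the base64-encoded "ApiKey" form that Elasticsearch expects in the Authorization header; replace the host (and any Space path) with the value you entered in the Keeper console:

import requests

host = "example-elastic01.us-east.found.io:9243"   # placeholder host:port
api_key = "<base64_api_key>"                        # placeholder

resp = requests.get(
    f"https://{host}/",                             # cluster info endpoint; exercises DNS, TLS and authentication
    headers={"Authorization": f"ApiKey {api_key}"},
    timeout=10,
)
print(resp.status_code)
print(resp.json())                                  # cluster name and version if the key and certificate are valid

Because requests verifies the server certificate by default, a certificate error here points to the same SSL issue that would cause Keeper's connection to fail.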

Exabeam (LogRhythm)

Integrating Keeper SIEM push to Exabeam

Overview

Keeper supports event streaming into Exabeam (formerly LogRhythm) deployments. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

Exabeam uses a standard "Syslog" push capability over TCP.

Ports: TCP Ports 514 and 6514 (TLS)

Fields Exported: "audit_event", "username", "client_version", "remote_address", "channel", "result_code", "email", "to_username", "client_version_new", "username_new", "file_format", "record_uid", "folder_uid", "folder_type", "shared_folder_uid", "attachment_id", "team_uid", "role_id"

Payload Format: Pipe-delimited, e.g. "audit_event=login|username=bob@foo.com|..."

Important: Ensure that the endpoint is using a valid signed SSL certificate that has a domain matching the subject name in the certificate. The certificate must also include the full certificate chain from your CA. Keeper's systems will refuse to connect to a self-signed certificate. Also, ensure that your Exabeam server allows traffic from Keeper servers. See Firewall Configuration page.

Google Security Operations (Chronicle)

Integrating Keeper SIEM push to Google Security Operations (formerly Chronicle)

Overview

Keeper supports event streaming into Google Security Operations, formerly known as Google Chronicle. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

1

Create an API Key

  • Go to the Google Cloud console and select the project associated to your Google Security Operations (Chronicle) environment.

  • Select APIs & Services > Credentials and create a new Credential > API Key.

  • After creating the API key, edit the key and apply restrictions.

  • Ensure that the API key is restricted to "Chronicle API" capabilities only.

  • Save this API key for step 3 below.

API Key
2

Create a Feed

From your Google Security Operations tenant:

  • Go to Settings > Feeds > Add Feed

  • Select Source Type of "Webhook" and then select Log Type of "Keeper Enterprise Security"

  • Select Next and then Submit.

  • When prompted, generate the Secret Key and save it for step 3.

  • Also, copy the Feed Endpoint and save it for step 3.

Feed Secret Key
Endpoint Information
3

Activate Integration

  • From the Keeper Admin Console, go to Reporting & Alerts > External Logging

  • Select Google Security Operations

  • Provide the API Key from step 1, and the Feed Endpoint and Feed Secret Key from step 2.

  • Click Test and then Save.

Admin Console Settings

Setup Complete!

When SIEM logs are sent from Keeper to Google, the data will begin to populate within 15 minutes.

Logz.io

Integrating Keeper SIEM push to Logz.io

Overview

Keeper supports event streaming into Logz.io deployments. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

Logz.io Integration Settings

Logz.io uses their HTTPS listener method.

The connection to Logz.io requires two fields:

  • Host (e.g. mycompany.logz.io)

  • Token

Please refer to your Logz.io documentation for generating a security token.

Important: Ensure that the endpoint is using a valid signed SSL certificate that has a domain matching the subject name in the certificate. The certificate must also include the full certificate chain from your CA. Keeper's systems will refuse to connect to a self-signed certificate. Also, ensure that your Logz.io server allows traffic from Keeper servers. See Firewall Configuration page.

QRadar

Integrating Keeper SIEM event pushes to IBM QRadar

Overview

Keeper supports event streaming into IBM QRadar deployments. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

QRadar Push Integration Settings

QRadar uses a standard "Syslog" push capability over TCP.

Ports: TCP Ports 514 and 6514 (TLS)

Fields Exported: "audit_event", "username", "client_version", "remote_address", "channel", "result_code", "email", "to_username", "client_version_new", "username_new", "file_format", "record_uid", "folder_uid", "folder_type", "shared_folder_uid", "attachment_id", "team_uid", "role_id"

Example Payload

<165>1 2022-10-13T21:05:51.996Z yourLogSourceID keeper - - - {"record_uid":"XXX","audit_event":"fast_fill","remote_address":"12.34.56.78","category":"usage","client_version":"Browser Extensions.16.4.7","username":"user@company.com","enterprise_id":123456}

Important: Ensure that the endpoint is using a valid signed SSL certificate that has a domain matching the subject name in the certificate. The certificate must also include the full certificate chain from your CA. Keeper's systems will refuse to connect to a self-signed certificate. Also, ensure that your QRadar server allows traffic from Keeper servers. See Firewall Configuration page.

Splunk

Integrating Keeper SIEM push to Splunk Enterprise

Overview

Keeper supports event streaming into Splunk Cloud and Splunk Enterprise deployments. External logging is real-time, and new events will appear almost immediately.

An example configuration is displayed below. Note that the Host field should only contain the domain portion of the collector URL.

Splunk Integration Settings

Splunk Cloud (Self-Service)

Keeper supports the HTTP Event Collector (HEC) feature of Splunk Cloud deployments.

The standard form for the HEC URL in self-service Splunk Cloud is as follows:

<host>:<port>/<endpoint>

In Keeper, you only need to supply the domain portion of the URL. For example:

Host: input-prd-p-2dm85a8f6db.cloud.splunk.com
Port: 8088
Token: HEC token generated in Splunk

Splunk Managed Cloud

Keeper supports the HTTP Event Collector (HEC) feature of Splunk Managed Cloud deployments. The standard form for the HEC URL in managed Splunk Cloud is as follows:

http-inputs-<host>:<port>/<endpoint>

In Keeper, you only need to supply the domain portion of the URL. For example:

Host: http-inputs-prd-p-2dm85a8f6db.splunkcloud.com
Port: 443
Token: HEC token generated in Splunk

Ensure that your endpoint has the "Indexer Acknowledgement" feature disabled.

Splunk Enterprise

Keeper supports the HTTP Event Collector (HEC) feature of Splunk Enterprise and Splunk Cloud deployments. To configure Keeper with Splunk, a few things to note:

  • Instructions on creating a HEC for Keeper can be found on Splunk's documentation here: https://docs.splunk.com/Documentation/Splunk/8.1.1/Data/UsetheHTTPEventCollector

  • Keeper requires that the collector endpoint uses SSL with a valid certificate signed by a certificate authority. If the collector is not using SSL, Keeper will reject the connection.

  • The collector endpoint URI needs to be accessible from Keeper's servers. See the AllowList section below for a list of IP addresses.

Step 1. Create Collector

On the Splunk interface, create a new HEC or select an existing collector.

  • Generate a token and store it for Step 2.

  • In the Global Settings, ensure that "Enable SSL" is selected and ensure that the collector is configured to use SSL.

Enable SSL on HEC

Step 2. Activate Integration

On Keeper, provide the endpoint Host, Port and Token from the HEC. In Keeper, you only need to supply the domain portion of the URL.

Splunk Settings
  • Click on "Test Connection" to ensure that the connection is successful. If it's successful, the "Save" button will become active. If there is a communications error, nothing will happen or you will receive an error message.

  • Click "Save" to activate the collector. Keeper will then show the active status.

Active Sync Status

If the status shows "Paused", it could mean there was a communication error when transmitting events to the Splunk server. A common reason is that the HEC is not using SSL with a valid certificate signed by a certificate authority (CA).

Setup Complete!

When SIEM logs are sent from Keeper to Splunk, the data will begin to populate within 15 minutes.

Troubleshooting

As stated above, the HEC in Splunk Enterprise must be secured with SSL having a certificate that is signed by a certificate authority. As a way to check this from a Mac or Linux command line, type the following (replacing your endpoint URI and Token):

$ curl https://splunk.acme-demo.com:8088/services/collector -H "Authorization: Splunk b56ashdd-8b97-443b-1234-abcabcabcabc" -d '{"event": "hello world"}'

If you receive an error about the SSL certificate like below, then it's not configured correctly.

curl: (60) SSL certificate problem: self signed certificate in certificate chain
More details here: https://curl.haxx.se/docs/sslcerts.html

curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.

If you add a "-k" to the curl request to ignore the certificate, you may receive a successful response. This is a good indicator that the HEC certificate is not valid.

To configure Splunk Enterprise for SSL on the collector, refer to the documentation. The local/server.conf file should be modified to include the [sslConfig] section that enables SSL on the splunkd service with a bundled certificate file chain.

[sslConfig]
enableSplunkdSSL = true
serverCert = $SPLUNK_HOME/etc/auth/mycompany/my_bundle.pem

The certificate file chain (my_bundle.pem) can be created by concatenating the certificate, private key and CA certs such as below:

cat my_server.crt my_server.key ca_certs.crt >> my_bundle.pem

For additional details, see the Splunk Enterprise documentation related to securing Splunk with SSL: https://docs.splunk.com/Documentation/Splunk/8.1.1/Security/AboutsecuringyourSplunkconfigurationwithSSL https://docs.splunk.com/Documentation/Splunk/8.1.0/Security/Securingyourdeploymentserverandclients

Event Display

Once activated, the event logs will stream automatically from Keeper's backend servers to the Splunk HEC. As seen in the screenshot below, the event logs will contain the event type, client application version, IP address, timestamp and username of the Keeper user.

Network Routing

Ensure that your Firewall allows traffic from Keeper servers. See Firewall Configuration page.

Sumo Logic

Integrating Keeper SIEM push to Sumo Logic

Overview

Keeper supports event streaming into Sumo Logic deployments. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

Enter your Sumo Logic HTTP Source Address

The Sumo Logic integration requires a single sync URL.

Configure an HTTP Logs and Metrics Source

To configure an HTTP Logs and Metrics Source:

  1. In Sumo Logic, select Manage Data > Collection > Collection.

  2. In the Collectors page, click Add Source next to a Hosted Collector.

  3. Select HTTP Logs & Metrics.

  4. Enter a Name to display for the Source in the Sumo web application. Description is optional.

  5. (Optional) For Source Host and Source Category, enter any string to tag the output collected from the source. (Category metadata is stored in a searchable field called _sourceCategory.)

  6. SIEM Processing. This option is present if Cloud SIEM Enterprise (CSE) is enabled. Click the checkbox to send the logs collected by the source to CSE.

  7. Fields. Click the +Add Field link to define the fields you want to associate; each field needs a name (key) and value.

    • A green circle with a check mark is shown when the field exists in the Fields table schema.

    • An orange triangle with an exclamation point is shown when the field doesn't exist in the Fields table schema. In this case, an option to automatically add the nonexistent fields to the Fields table schema is provided. If a field is sent to Sumo that does not exist in the Fields schema, it is ignored ("dropped").

  8. When the URL associated with the source is displayed, copy the URL so you can use it to upload data.

  9. When you are finished configuring the Source, click Save.

  10. Processing Rules. Configure any desired filters, such as allowlist, denylist, hash, or mask, as described in Create a Processing Rule. Processing rules are applied to log data, but not to metric data.

HTTP Source Address to Use in your Keeper SIEM Setup
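Before configuring Keeper, you can confirm the HTTP Source URL is reachable by posting a test message directly to it. Sumo Logic HTTP Sources accept a raw POST body with no separate authentication (the unique URL itself is the secret); the URL below is a placeholder for the source address copied in step 8:

import requests

source_url = "https://collectors.sumologic.com/receiver/v1/http/<unique_source_token>"   # placeholder

resp = requests.post(source_url, data='{"audit_event": "test", "username": "test@company.com"}')
print(resp.status_code)   # 200 indicates the source accepted the message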

Syslog

Integrating Keeper SIEM push to standard Syslog endpoints

Overview

Keeper supports event streaming into standard TCP Syslog collectors. External logging is real-time, and new events will appear almost immediately. Setup instructions are below.

Syslog Push Integration Settings

Keeper supports a standard "Syslog" push capability over TCP.

Ports: TCP Ports 514 and 6514 (TLS)

Fields Exported: "audit_event", "username", "client_version", "remote_address", "channel", "result_code", "email", "to_username", "client_version_new", "username_new", "file_format", "record_uid", "folder_uid", "folder_type", "shared_folder_uid", "attachment_id", "team_uid", "role_id"

Example Payload

<165>1 2022-10-13T21:05:51.996Z keepersecurity.com keeper - - - {"record_uid":"XXX","audit_event":"fast_fill","remote_address":"12.34.56.78","category":"usage","client_version":"Browser Extensions.16.4.7","username":"user@company.com","enterprise_id":123456}
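As a sketch of how a custom collector might split an incoming line like the example above into its RFC 5424 header and JSON event body:

import json

line = '<165>1 2022-10-13T21:05:51.996Z keepersecurity.com keeper - - - {"record_uid":"XXX","audit_event":"fast_fill","remote_address":"12.34.56.78","category":"usage","client_version":"Browser Extensions.16.4.7","username":"user@company.com","enterprise_id":123456}'

header, _, body = line.partition("{")   # everything from the first "{" onward is the JSON event
event = json.loads("{" + body)
print(header.strip())                   # syslog priority, version, timestamp, host and app name
print(event["audit_event"], event["username"])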

Important: Ensure that the endpoint is using a valid signed SSL certificate that has a domain matching the subject name in the certificate. The certificate must also include the full certificate chain from your CA. Keeper's systems will refuse to connect to a self-signed certificate.

Also, ensure that your syslog server allows traffic from Keeper servers. See Firewall Configuration page.

Firewall Configuration

Ingress Requirements for direct SIEM push

SIEM Events and Automator Device Approvals

If your environment receives inbound SIEM events and Automator device approval requests from the Keeper production environment, you can lock down traffic to the IP addresses below.

US / Global

  • 34.194.242.137/32

  • 18.235.39.229/32

  • 54.208.20.102/32

  • 34.203.159.189/32

EU / Dublin

  • 54.246.149.209/32

  • 34.250.37.43/32

  • 52.210.163.45/32

  • 54.246.185.95/32

AU / Sydney

  • 54.206.253.126/32

  • 52.64.85.78/32

  • 3.106.40.41/32

  • 54.206.208.132/32

US / GovCloud

  • 18.253.101.55/32

  • 18.253.102.58/32

  • 18.252.135.74/32

  • 18.253.212.59/32

CA / Canada Hosted Customers

  • 35.182.155.224/32

  • 35.182.216.11/32

  • 15.223.136.134/32

JP / Tokyo Hosted Customers

  • 35.74.131.237/32

  • 54.150.11.204/32

  • 52.68.53.105/32

After external logging is established, it may be automatically paused if the external system becomes unavailable and the number of events in the queue reaches a threshold of 50. If this happens, you will have to manually resume external logging after correcting the issue. We recommend setting up an alert for the "Paused Audit log Sync" event so you are notified if external logging is paused.

On-site Commander Push

SIEM event push to local or on-prem endpoints using Keeper Commander

Command-Line SDK and Reporting API

In addition to using the user interface for generating custom reports, Keeper supports a command-line interface (CLI) and Python SDK to programmatically generate reports. Keeper Commander is an open source tool that provides command-line access and automation / integration capabilities.

Learn about Keeper Commander here: https://docs.keeper.io/secrets-manager/commander-cli

For example, below is a screenshot of the "audit-report" command usage which can be used to generate custom reports through the CLI:

Keeper Commander also integrates into 3rd party SIEM solutions that operate on-premise. For a comprehensive look at how Keeper Commander can be utilized in your environment, please visit the Documentation Portal for Keeper Commander SDK. If you require assistance with Keeper Commander, please contact commander@keepersecurity.com.