
Exporting Connections

Exporting Connection Data from Keeper Connection Manager

Overview

This document describes one method of exporting connection data, along with connection parameters, to a JSON file using a supplied Python script. The resulting file can then be used to import connections into another Keeper Connection Manager instance (as documented here) or to migrate to the Keeper Connection Manager (Cloud) version.

The example provided in this document is based on the standard Auto Docker Install method of Keeper Connection Manager, using a MySQL database.

Step 1: Open MySQL port to local machine

In order for the Python script to query the database from within the Docker container, edit the /etc/kcm-setup/docker-compose.yml file and add a "ports" section to the "db" container as shown below:

db:
        image: keeper/guacamole-db-mysql:2
        restart: unless-stopped
        environment:
            ACCEPT_EULA: "Y"
            GUACAMOLE_DATABASE: "guacamole_db"
            GUACAMOLE_USERNAME: "guacamole_user"
            GUACAMOLE_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXX"
            GUACAMOLE_ADMIN_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXX"
            MYSQL_ROOT_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
        ports:
            - "3306:3306"

To apply the Docker Compose update, follow these steps:

  1. If using the Auto Docker install, navigate to the location where the kcm-setup.run script is stored and execute:

    sudo ./kcm-setup.run apply
  2. For Docker Compose installation, navigate to the folder /etc/kcm-setup and run:

    docker compose up -d
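After applying the change, you can confirm that MySQL is now reachable from the host before moving on. The snippet below is a minimal check, not part of the supplied export script, and uses only the Python standard library:

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 3306 is the port published in the "ports" section above
    print("MySQL reachable:", port_open("127.0.0.1", 3306))
```

If this prints `False`, re-check the "ports" section and confirm the "db" container restarted after the compose update.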

Step 2: Install Python modules

Assuming your instance already has Python 3, install the necessary Python modules:

pip3 install mysql-connector-python
pip3 install pyyaml

To install python3-pip on your system, use the appropriate command for your operating system:

CentOS/RHEL:

sudo yum install python3-pip

Ubuntu:

sudo apt install python3-pip

Step 3: Create the export script

Copy and paste the code below into a file named export.py and place it on the Keeper Connection Manager instance, for example in the same location as your kcm-setup.run file. This Python script performs the following:

  • Locates the docker-compose.yml file in the standard /etc/kcm-setup/ folder

  • Pulls the MySQL credentials and connects to the database

  • Exports the connection information and creates a file called export.json in the same folder

Depending on your environment, you may need to edit the file.

export.py
import mysql.connector
import json
import yaml

# Path to the docker-compose.yml file
docker_compose_file = '/etc/kcm-setup/docker-compose.yml'

# Function to extract database credentials from docker-compose.yml
def get_db_config_from_compose():
    with open(docker_compose_file, 'r') as file:
        # Load the docker-compose YAML file
        compose_data = yaml.safe_load(file)
        
        # Extracting the necessary information from the 'db' service
        db_service = compose_data['services']['db']
        environment = db_service['environment']
        
        db_name = environment.get('GUACAMOLE_DATABASE', 'guacamole_db')
        db_user = environment.get('GUACAMOLE_USERNAME', 'guacamole_user')
        db_password = environment.get('GUACAMOLE_PASSWORD', 'password')  # Default in case it's not present
        
        return {
            'host': 'localhost',  # Assuming the database is local since it's inside Docker
            'user': db_user,
            'password': db_password,
            'database': db_name,
            'port': 3306  # Default MySQL port
        }

def build_connection_group_paths(cursor):
    """
    Build a dictionary mapping group IDs to their full paths by resolving parent-child relationships.
    """
    cursor.execute("SELECT connection_group_id, parent_id, connection_group_name FROM guacamole_connection_group")
    groups = cursor.fetchall()

    group_paths = {}

    def resolve_path(group_id):
        if group_id is None:
            return "ROOT"
        if group_id in group_paths:
            return group_paths[group_id]
        # Find the group details
        group = next(g for g in groups if g['connection_group_id'] == group_id)
        parent_path = resolve_path(group['parent_id'])
        full_path = f"{parent_path}/{group['connection_group_name']}"
        group_paths[group_id] = full_path
        return full_path

    # Resolve paths for all groups
    for group in groups:
        resolve_path(group['connection_group_id'])

    return group_paths

# SQL query to retrieve all connections, users, groups, and attributes
query = """
SELECT
    c.connection_id,
    c.connection_name AS name,
    c.protocol,
    cp.parameter_name,
    cp.parameter_value,
    e.name AS entity_name,
    e.type AS entity_type,
    g.connection_group_id,
    g.parent_id,
    g.connection_group_name AS group_name,
    ca.attribute_name,
    ca.attribute_value
FROM
    guacamole_connection c
LEFT JOIN
    guacamole_connection_parameter cp ON c.connection_id = cp.connection_id
LEFT JOIN
    guacamole_connection_attribute ca ON c.connection_id = ca.connection_id
LEFT JOIN
    guacamole_connection_group g ON c.parent_id = g.connection_group_id
LEFT JOIN
    guacamole_connection_permission p ON c.connection_id = p.connection_id
LEFT JOIN
    guacamole_entity e ON p.entity_id = e.entity_id;
"""

def export_to_json(db_config):
    conn = None
    cursor = None
    try:
        # Connect to the database
        conn = mysql.connector.connect(**db_config)
        cursor = conn.cursor(dictionary=True)  # Dictionary cursor returns each row as a dict

        # Build connection group paths
        connection_group_paths = build_connection_group_paths(cursor) 

        # Execute the query
        cursor.execute(query)
        rows = cursor.fetchall()

        # Organize the data into the expected format
        connections = {}
        for row in rows:
            conn_id = row['connection_id']
            if conn_id not in connections:
                # Resolve connection group path
                group_path = connection_group_paths.get(row['connection_group_id'], "ROOT")
                connections[conn_id] = {
                    'name': row['name'],
                    'protocol': row['protocol'],
                    'parameters': {},
                    'users': [],
                    'groups': [],  # User groups go here
                    'group': group_path,  # Connection group path 
                    'attributes': {}
                }
            # Handle parameters
            if row['parameter_name']:
                connections[conn_id]['parameters'][row['parameter_name']] = row['parameter_value']
            # Handle users
            if row['entity_type'] == 'USER' and row['entity_name'] not in connections[conn_id]['users']:
                connections[conn_id]['users'].append(row['entity_name'])
            # Handle user groups
            if row['entity_type'] == 'USER_GROUP' and row['entity_name'] not in connections[conn_id]['groups']:
                connections[conn_id]['groups'].append(row['entity_name'])
            # Handle attributes
            if row['attribute_name']:
                connections[conn_id]['attributes'][row['attribute_name']] = row['attribute_value']

        # Convert to list format
        connection_list = list(connections.values())

        # Output the data to a JSON file
        with open('export.json', 'w') as json_file:
            json.dump(connection_list, json_file, indent=4)

        print("Export successful! Data written to export.json")

    except mysql.connector.Error as err:
        print(f"Error: {err}")
    finally:
        # Close the cursor and the connection
        if cursor:
            cursor.close()
        if conn:
            conn.close()

if __name__ == '__main__':
    # Get the database configuration from docker-compose.yml
    db_config = get_db_config_from_compose()
    export_to_json(db_config)

To execute the script, type:

sudo python3 export.py

This will produce a file called export.json in the local folder.
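Each entry in export.json follows the structure assembled by the script above; for example (all values here are illustrative):

```json
[
    {
        "name": "web-server-01",
        "protocol": "ssh",
        "parameters": {
            "hostname": "10.0.0.15",
            "port": "22",
            "username": "admin"
        },
        "users": ["alice"],
        "groups": ["DevOps"],
        "group": "ROOT/Production",
        "attributes": {}
    }
]
```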

This export.json file will likely contain connection secrets, depending on how the connections were created. Protect this file by saving it to your Keeper vault.

Notes

  • The "groups" object refers to User Groups

  • The "group" object refers to the Connection Group location

  • When importing this file into another KCM instance, connections will only import successfully if the corresponding Connection Group already exists in the target instance
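Because the import depends on the Connection Groups existing in the target, it can help to list the unique group paths referenced by an export so you can create them first. A minimal sketch (the sample data below is hypothetical; in practice you would load export.json):

```python
import json

def unique_group_paths(connections):
    """Return the sorted set of Connection Group paths referenced by an export."""
    return sorted({c["group"] for c in connections})

if __name__ == "__main__":
    # Normally: connections = json.load(open("export.json"))
    connections = [
        {"name": "a", "group": "ROOT/Production"},
        {"name": "b", "group": "ROOT/Production"},
        {"name": "c", "group": "ROOT/Staging"},
    ]
    print(unique_group_paths(connections))  # ['ROOT/Production', 'ROOT/Staging']
```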
