Exporting Connections

Exporting Connection Data from Keeper Connection Manager

Overview

This document describes one method of exporting connection data, along with connection parameters, to a JSON file using a supplied Python script. The resulting file can then be used to import connections into another Keeper Connection Manager instance (as documented here) or to migrate to the Keeper Connection Manager (Cloud) version.

The example provided in this document is based on the standard Auto Docker Install method of Keeper Connection Manager, using a MySQL database.

Step 1: Open MySQL port to local machine

In order for the Python script to query the database running inside the Docker container, edit the /etc/kcm-setup/docker-compose.yml file and add a "ports" section to the "db" service, as shown below:

db:
        image: keeper/guacamole-db-mysql:2
        restart: unless-stopped
        environment:
            ...
        ports:
            - "3306:3306"  # Expose MySQL port 3306 to the local machine            

Step 2: Install Python modules

Assuming your instance already has Python 3 installed, install the necessary Python modules:

pip3 install mysql-connector-python
pip3 install pyyaml
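To confirm both modules are importable before running the export, a quick check such as the following can be used (purely illustrative):

python3 -c "import mysql.connector, yaml; print('mysql-connector-python and PyYAML are available')"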

Step 3: Create the export script

Copy and paste the code below into a file called export.py and place it on the Keeper Connection Manager instance, perhaps in the same location as your kcm-setup.run file. This Python script performs the following:

  • Locates the docker-compose.yml file in the standard /etc/kcm-setup/ folder

  • Pulls the MySQL credentials and connects to the database

  • Exports the connection information and creates a file called export.json in the directory from which the script is run

Depending on your environment, you may need to edit the file.

export.py
import mysql.connector
import json
import yaml

# Path to the docker-compose.yml file
docker_compose_file = '/etc/kcm-setup/docker-compose.yml'

# Function to extract database credentials from docker-compose.yml
def get_db_config_from_compose():
    with open(docker_compose_file, 'r') as file:
        # Load the docker-compose YAML file
        compose_data = yaml.safe_load(file)
        
        # Extracting the necessary information from the 'db' service
        db_service = compose_data['services']['db']
        environment = db_service['environment']
        
        db_name = environment.get('GUACAMOLE_DATABASE', 'guacamole_db')
        db_user = environment.get('GUACAMOLE_USERNAME', 'guacamole_user')
        db_password = environment.get('GUACAMOLE_PASSWORD', 'password')  # Default in case it's not present
        
        return {
            'host': 'localhost',  # Assuming the database is local since it's inside Docker
            'user': db_user,
            'password': db_password,
            'database': db_name,
            'port': 3306  # Default MySQL port
        }

# SQL query to retrieve all connections, users, groups, and attributes
query = """
SELECT 
    c.connection_id,
    c.connection_name AS name,
    c.protocol,
    cp.parameter_name,
    cp.parameter_value,
    e.name AS entity_name,
    e.type AS entity_type,
    g.connection_group_name AS group_name,
    ca.attribute_name,
    ca.attribute_value
FROM
    guacamole_connection c
LEFT JOIN
    guacamole_connection_parameter cp ON c.connection_id = cp.connection_id
LEFT JOIN
    guacamole_connection_attribute ca ON c.connection_id = ca.connection_id
LEFT JOIN
    guacamole_connection_group g ON c.parent_id = g.connection_group_id
LEFT JOIN
    guacamole_connection_permission p ON c.connection_id = p.connection_id
LEFT JOIN
    guacamole_entity e ON p.entity_id = e.entity_id;
"""

def export_to_json(db_config):
    # Initialize so the cleanup in the finally block is safe even if the connection fails
    conn = None
    cursor = None
    try:
        # Connect to the database
        conn = mysql.connector.connect(**db_config)
        cursor = conn.cursor(dictionary=True)  # Dictionary cursor returns each row as a dict

        # Execute the query
        cursor.execute(query)
        rows = cursor.fetchall()

        # Organize the data into the expected format
        connections = {}
        for row in rows:
            conn_id = row['connection_id']
            if conn_id not in connections:
                connections[conn_id] = {
                    'name': row['name'],
                    'protocol': row['protocol'],
                    'parameters': {},
                    'users': [],
                    'groups': [],
                    'attributes': {}
                }
            # Handle parameters
            if row['parameter_name']:
                connections[conn_id]['parameters'][row['parameter_name']] = row['parameter_value']
            # Handle users
            if row['entity_type'] == 'USER' and row['entity_name'] not in connections[conn_id]['users']:
                connections[conn_id]['users'].append(row['entity_name'])
            # Handle groups
            if row['entity_type'] == 'USER_GROUP' and row['entity_name'] not in connections[conn_id]['groups']:
                connections[conn_id]['groups'].append(row['entity_name'])
            # Handle connection groups (organizational)
            if row['group_name'] and row['group_name'] not in connections[conn_id]['groups']:
                connections[conn_id]['groups'].append(row['group_name'])
            # Handle attributes
            if row['attribute_name']:
                connections[conn_id]['attributes'][row['attribute_name']] = row['attribute_value']

        # Convert to list format
        connection_list = list(connections.values())

        # Output the data to a JSON file
        with open('export.json', 'w') as json_file:
            json.dump(connection_list, json_file, indent=4)

        print("Export successful! Data written to export.json")

    except mysql.connector.Error as err:
        print(f"Error: {err}")
    finally:
        # Close the cursor and the connection
        if cursor:
            cursor.close()
        if conn:
            conn.close()

if __name__ == '__main__':
    # Get the database configuration from docker-compose.yml
    db_config = get_db_config_from_compose()
    export_to_json(db_config)

To execute the script, type:

sudo python3 export.py

This will produce a file called export.json in the directory from which the script was run.
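For reference, the exported file is a JSON array with one object per connection, containing the name, protocol, parameters, users, groups, and attributes gathered by the script. The entry below is a hypothetical example; the hostnames, usernames, and group names are placeholders:

[
    {
        "name": "Example RDP Connection",
        "protocol": "rdp",
        "parameters": {
            "hostname": "192.0.2.10",
            "port": "3389",
            "username": "demo-user"
        },
        "users": [
            "alice"
        ],
        "groups": [
            "Example Group"
        ],
        "attributes": {
            "max-connections": "2"
        }
    }
]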

This export.json file will likely contain connection secrets, depending on how the connections were created. Protect this file by saving it to your Keeper vault.
