# Exporting Connections

## Overview

This document describes one method of exporting connection data, along with connection parameters, to a JSON file using a supplied Python script. The resulting file can then be used to import connections into another Keeper Connection Manager instance (as [documented here](https://docs.keeper.io/en/keeper-connection-manager/using-keeper-connection-manager/creating-connections/batch-import-and-api#importing-connections-with-csv-json-or-yaml)) or to migrate to the Keeper Connection Manager (Cloud) version.

The example provided in this document is based on the standard [Auto Docker Install](https://docs.keeper.io/en/keeper-connection-manager/installation/auto-docker-install) method of installing Keeper Connection Manager, using a MySQL database.

### Step 1: Open MySQL port to local machine

So that the Python script can query the database running inside the Docker container, edit the `/etc/kcm-setup/docker-compose.yml` file and add a `ports` section to the `db` container as shown below:

```yaml
db:
        image: keeper/guacamole-db-mysql:2
        restart: unless-stopped
        environment:
            ACCEPT_EULA: "Y"
            GUACAMOLE_DATABASE: "guacamole_db"
            GUACAMOLE_USERNAME: "guacamole_user"
            GUACAMOLE_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXX"
            GUACAMOLE_ADMIN_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXX"
            MYSQL_ROOT_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
        ports:
            - "3306:3306"
```

To apply the Docker Compose update, follow these steps:

1. If using the Auto Docker install, navigate to the location where the `kcm-setup.run` script is stored and execute:

   ```bash
   sudo ./kcm-setup.run apply
   ```
2. If using a manual Docker Compose installation, navigate to the `/etc/kcm-setup` folder and run:

   ```bash
   docker compose up -d
   ```
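
After the update is applied, you can optionally confirm that the published database port is reachable from the host before continuing. The following is a minimal sketch using only the Python standard library; the `localhost` host and port `3306` are assumptions based on the mapping shown above (a PostgreSQL container would typically publish `5432` instead).

```python
# Minimal reachability check for the published database port.
# Assumes the "3306:3306" mapping shown above; use 5432 for PostgreSQL.
import socket

with socket.create_connection(("localhost", 3306), timeout=5):
    print("Database port is reachable from the host")
```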

### Step 2: Install Python modules

Assuming your instance already has Python 3, install the necessary Python modules:

{% tabs %}
{% tab title="MYSQL" %}

```bash
pip3 install mysql-connector
pip3 install pyYAML
```

{% endtab %}

{% tab title="POSTGRES" %}

```bash
pip3 install psycopg2-binary
pip3 install pyYAML
```

{% endtab %}
{% endtabs %}

If `pip3` is not already available, install `python3-pip` using the appropriate command for your operating system:

**CentOS/RHEL 8 and newer:**

```bash
sudo dnf install python3-pip
```

**CentOS/RHEL 7 and earlier:**

```bash
sudo yum install python3-pip
```

**Ubuntu:**

```bash
sudo apt install python3-pip
```
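
With the modules installed and the database port published in Step 1, you can optionally run a quick connectivity test before creating the export script. This is a minimal sketch for the MySQL variant, assuming the default `guacamole_db`/`guacamole_user` values from the compose file; replace the password placeholder with your `GUACAMOLE_PASSWORD` (for PostgreSQL, use `psycopg2.connect(...)` with port `5432`).

```python
# Quick connectivity test (MySQL variant). Assumes the port mapping from Step 1
# and the GUACAMOLE_* credentials from /etc/kcm-setup/docker-compose.yml.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost",
    port=3306,
    user="guacamole_user",
    password="XXXXXXXXXXXXXXXXXXXXXXXXX",  # replace with your GUACAMOLE_PASSWORD
    database="guacamole_db",
)
cursor = conn.cursor()
cursor.execute("SELECT COUNT(*) FROM guacamole_connection")
print("Connections found:", cursor.fetchone()[0])
cursor.close()
conn.close()
```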

### Step 3: Create the export script

Copy and paste the code below into a file called `export.py` on the Keeper Connection Manager instance, for example in the same location as your `kcm-setup.run` file. The Python script performs the following:

* Locates the `docker-compose.yml` file in the standard `/etc/kcm-setup/` folder
* Pulls the MySQL/PostgreSQL credentials and connects to the database
* Exports the connection information to a file called `export.json` in the same folder

Depending on your environment, you may need to edit the file (for example, the `docker_compose_file` path, or the database host and port).

{% tabs %}
{% tab title="MYSQL EXPORT" %}
{% code title="export.py" lineNumbers="true" %}

```python
import mysql.connector
import json
import yaml

# Path to the docker-compose.yml file
docker_compose_file = '/etc/kcm-setup/docker-compose.yml'

# Function to extract database credentials from docker-compose.yml
def get_db_config_from_compose():
    with open(docker_compose_file, 'r') as file:
        # Load the docker-compose YAML file
        compose_data = yaml.safe_load(file)
        
        # Extracting the necessary information from the 'db' service
        db_service = compose_data['services']['db']
        environment = db_service['environment']
        
        db_name = environment.get('GUACAMOLE_DATABASE', 'guacamole_db')
        db_user = environment.get('GUACAMOLE_USERNAME', 'guacamole_user')
        db_password = environment.get('GUACAMOLE_PASSWORD', 'password')  # Default in case it's not present
        
        return {
            'host': 'localhost',  # Assuming the database is local since it's inside Docker
            'user': db_user,
            'password': db_password,
            'database': db_name,
            'port': 3306  # Default MySQL port
        }

def build_connection_group_paths(cursor):
    """
    Build a dictionary mapping group IDs to their full paths by resolving parent-child relationships.
    """
    cursor.execute("SELECT connection_group_id, parent_id, connection_group_name FROM guacamole_connection_group")
    groups = cursor.fetchall()

    group_paths = {}

    def resolve_path(group_id):
        if group_id is None:
            return "ROOT"
        if group_id in group_paths:
            return group_paths[group_id]
        # Find the group details
        group = next(g for g in groups if g['connection_group_id'] == group_id)
        parent_path = resolve_path(group['parent_id'])
        full_path = f"{parent_path}/{group['connection_group_name']}"
        group_paths[group_id] = full_path
        return full_path

    # Resolve paths for all groups
    for group in groups:
        resolve_path(group['connection_group_id'])

    return group_paths

# SQL query to retrieve all connections, users, groups, and attributes
query = """
SELECT
    c.connection_id,
    c.connection_name AS name,
    c.protocol,
    cp.parameter_name,
    cp.parameter_value,
    e.name AS entity_name,
    e.type AS entity_type,
    g.connection_group_id,
    g.parent_id,
    g.connection_group_name AS group_name,
    ca.attribute_name,
    ca.attribute_value
FROM
    guacamole_connection c
LEFT JOIN
    guacamole_connection_parameter cp ON c.connection_id = cp.connection_id
LEFT JOIN
    guacamole_connection_attribute ca ON c.connection_id = ca.connection_id
LEFT JOIN
    guacamole_connection_group g ON c.parent_id = g.connection_group_id
LEFT JOIN
    guacamole_connection_permission p ON c.connection_id = p.connection_id
LEFT JOIN
    guacamole_entity e ON p.entity_id = e.entity_id;
"""

def export_to_json(db_config):
    conn = None
    cursor = None
    try:
        # Connect to the database
        conn = mysql.connector.connect(**db_config)
        cursor = conn.cursor(dictionary=True)  # Dictionary cursor so rows are returned as dicts

        # Build connection group paths
        connection_group_paths = build_connection_group_paths(cursor) 

        # Execute the query
        cursor.execute(query)
        rows = cursor.fetchall()

        # Organize the data into the expected format
        connections = {}
        for row in rows:
            conn_id = row['connection_id']
            if conn_id not in connections:
                # Resolve connection group path
                group_path = connection_group_paths.get(row['connection_group_id'], "ROOT")
                connections[conn_id] = {
                    'name': row['name'],
                    'protocol': row['protocol'],
                    'parameters': {},
                    'users': [],
                    'groups': [],  # User groups go here
                    'group': group_path,  # Connection group path 
                    'attributes': {}
                }
            # Handle parameters
            if row['parameter_name']:
                connections[conn_id]['parameters'][row['parameter_name']] = row['parameter_value']
            # Handle users
            if row['entity_type'] == 'USER' and row['entity_name'] not in connections[conn_id]['users']:
                connections[conn_id]['users'].append(row['entity_name'])
            # Handle user groups
            if row['entity_type'] == 'USER_GROUP' and row['entity_name'] not in connections[conn_id]['groups']:
                connections[conn_id]['groups'].append(row['entity_name'])
            # Handle attributes
            if row['attribute_name']:
                connections[conn_id]['attributes'][row['attribute_name']] = row['attribute_value']

        # Convert to list format
        connection_list = list(connections.values())

        # Output the data to a JSON file
        with open('export.json', 'w') as json_file:
            json.dump(connection_list, json_file, indent=4)

        print("Export successful! Data written to export.json")

    except mysql.connector.Error as err:
        print(f"Error: {err}")
    finally:
        # Close the cursor and the connection
        if cursor:
            cursor.close()
        if conn:
            conn.close()

if __name__ == '__main__':
    # Get the database configuration from docker-compose.yml
    db_config = get_db_config_from_compose()
    export_to_json(db_config)

```

{% endcode %}
{% endtab %}

{% tab title="POSTGRES EXPORT" %}

```python
import psycopg2
import psycopg2.extras
import json
import yaml

# Path to the docker-compose.yml file
docker_compose_file = '/etc/kcm-setup/docker-compose.yml'

# Function to extract database credentials from docker-compose.yml
def get_db_config_from_compose():
    with open(docker_compose_file, 'r') as file:
        compose_data = yaml.safe_load(file)

        db_service = compose_data['services']['db']
        env = db_service['environment']

        return {
            'host': 'localhost',
            'user': env.get('GUACAMOLE_USERNAME', 'guacamole_user'),
            'password': env.get('GUACAMOLE_PASSWORD', 'password'),
            'database': env.get('GUACAMOLE_DATABASE', 'guacamole_db'),
            'port': 5432  # Default PostgreSQL port
        }

def build_connection_group_paths(cursor):
    cursor.execute("SELECT connection_group_id, parent_id, connection_group_name FROM guacamole_connection_group")
    groups = cursor.fetchall()

    group_paths = {}

    def resolve_path(group_id):
        if group_id is None:
            return "ROOT"
        if group_id in group_paths:
            return group_paths[group_id]
        group = next(g for g in groups if g['connection_group_id'] == group_id)
        parent_path = resolve_path(group['parent_id'])
        full_path = f"{parent_path}/{group['connection_group_name']}"
        group_paths[group_id] = full_path
        return full_path

    for group in groups:
        resolve_path(group['connection_group_id'])

    return group_paths

# SQL query remains the same (PostgreSQL-compatible syntax)
query = """
SELECT
    c.connection_id,
    c.connection_name AS name,
    c.protocol,
    cp.parameter_name,
    cp.parameter_value,
    e.name AS entity_name,
    e.type AS entity_type,
    g.connection_group_id,
    g.parent_id,
    g.connection_group_name AS group_name,
    ca.attribute_name,
    ca.attribute_value
FROM
    guacamole_connection c
LEFT JOIN
    guacamole_connection_parameter cp ON c.connection_id = cp.connection_id
LEFT JOIN
    guacamole_connection_attribute ca ON c.connection_id = ca.connection_id
LEFT JOIN
    guacamole_connection_group g ON c.parent_id = g.connection_group_id
LEFT JOIN
    guacamole_connection_permission p ON c.connection_id = p.connection_id
LEFT JOIN
    guacamole_entity e ON p.entity_id = e.entity_id;
"""

def export_to_json(db_config):
    conn = None
    cursor = None
    try:
        conn = psycopg2.connect(
            host=db_config['host'],
            user=db_config['user'],
            password=db_config['password'],
            dbname=db_config['database'],
            port=db_config['port']
        )
        cursor = conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor)

        connection_group_paths = build_connection_group_paths(cursor)

        cursor.execute(query)
        real_dict_rows = cursor.fetchall()
        # Convert RealDictRow objects to native dicts
        rows = [dict(row) for row in real_dict_rows]

        # Organize the data into the expected format
        connections = {}
        for row in rows:
            conn_id = row['connection_id']
            if conn_id not in connections:
                group_path = connection_group_paths.get(row['connection_group_id'], "ROOT")
                connections[conn_id] = {
                    'name': row['name'],
                    'protocol': row['protocol'],
                    'parameters': {},
                    'users': [],
                    'groups': [],
                    'group': group_path,
                    'attributes': {}
                }
            if row['parameter_name']:
                connections[conn_id]['parameters'][row['parameter_name']] = row['parameter_value']
            if row['entity_type'] == 'USER' and row['entity_name'] not in connections[conn_id]['users']:
                connections[conn_id]['users'].append(row['entity_name'])
            if row['entity_type'] == 'USER_GROUP' and row['entity_name'] not in connections[conn_id]['groups']:
                connections[conn_id]['groups'].append(row['entity_name'])
            if row['attribute_name']:
                connections[conn_id]['attributes'][row['attribute_name']] = row['attribute_value']

        with open('export.json', 'w') as json_file:
            json.dump(list(connections.values()), json_file, indent=4)

        print("Export successful! Data written to export.json")

    except psycopg2.Error as err:
        print(f"Database Error: {err}")
    finally:
        if cursor:
            cursor.close()
        if conn:
            conn.close()

if __name__ == '__main__':
    db_config = get_db_config_from_compose()
    export_to_json(db_config)

```

{% endcode %}
{% endtab %}
{% endtabs %}

To execute the script, type:

```bash
sudo python3 export.py
```

This will produce a file called `export.json` in the local folder.
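
The structure of the file mirrors the fields the script builds (`name`, `protocol`, `parameters`, `users`, `groups`, `group`, `attributes`); the example below is illustrative only, with hypothetical names and values:

```json
[
    {
        "name": "Example RDP Server",
        "protocol": "rdp",
        "parameters": {
            "hostname": "192.168.1.10",
            "port": "3389"
        },
        "users": ["alice"],
        "groups": ["Administrators"],
        "group": "ROOT/Example Group",
        "attributes": {
            "max-connections": "2"
        }
    }
]
```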

{% hint style="warning" %}
This `export.json` file will likely contain connection secrets, depending on how the connections were created. Protect this file by saving it to your Keeper vault.
{% endhint %}

### Notes

* The "groups" object refers to User Groups
* The "group" object refers to the Connection Group location
* When importing this into another KCM instance, the connections will only import successfully if the Connection Group exists in the target
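
Before importing, it can help to list the distinct Connection Group paths referenced in the export so they can be verified (or created) in the target instance first. A minimal sketch, assuming `export.json` is in the current folder:

```python
# List the distinct Connection Group paths referenced in export.json so they can
# be verified or created in the target KCM instance before importing.
import json

with open("export.json") as f:
    connections = json.load(f)

for path in sorted({c.get("group", "ROOT") for c in connections}):
    print(path)
```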
