# Import Connections from KCM

## Conversion Process

This process uses a Python script to automatically connect to a KCM database, export connections and connection groups, and convert the data into a valid JSON template for the PAM Project Extend command.

1. The Python script can be found [here](https://github.com/Keeper-Security/Commander/tree/master/examples/pam-kcm-import) - download both `kcm_export.py` and `KCM_mappings.json` files.
2. The `kcm_export.py` script will connect to the KCM database (locally or remotely), and pull connection and connection group data.
3. The data will be converted into parameters compliant with the PAM Project JSON template, using a dictionary of mappings in `KCM_mappings.json`.
4. Connection groups will be mapped as shared folders or nested folders - see [Connection Group Modelling](#connection-group-modelling) for more information.
5. After completing the conversion, shared folders can be added to a Gateway Application in Keeper, before running the PAM Project Extend command.\
   The final JSON file will contain a list of shared folders to create, as well as any connection names that have been found to include Dynamic Tokens.

### Allowing Connections to the KCM Database

Your KCM database must have an open port so that the Python script can connect to it.

### On a KCM Auto-Docker Install

With the Auto-Docker installation, your `docker-compose.yml` file is found in `/etc/kcm-setup`. You can edit this file as follows:

```
sudo nano /etc/kcm-setup/docker-compose.yml
```

{% tabs %}
{% tab title="MySQL" %}

```yaml
db:
    image: keeper/guacamole-db-mysql:2
    restart: unless-stopped
    environment:
        ACCEPT_EULA: "Y"
        GUACAMOLE_DATABASE: "guacamole_db"
        GUACAMOLE_USERNAME: "guacamole_user"
        GUACAMOLE_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXX"
        GUACAMOLE_ADMIN_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXX"
        MYSQL_ROOT_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
    ports:
        - "3306:3306"
```

{% endtab %}

{% tab title="Postgresql" %}

```yaml
db:
    image: keeper/guacamole-db-postgres:2
    restart: unless-stopped
    environment:
        ACCEPT_EULA: "Y"
        GUACAMOLE_DATABASE: "guacamole_db"
        GUACAMOLE_USERNAME: "guacamole_user"
        GUACAMOLE_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXX"
        GUACAMOLE_ADMIN_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXX"
        POSTGRES_PASSWORD: "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
    ports:
        - "5432:5432"
```

{% endtab %}
{% endtabs %}

Next, apply the changes:

```
sudo ./kcm-setup.run apply
```

Optionally, if you're running the Python script as a user other than root, you will need to grant read permissions on the `docker-compose.yml` file so that Python can access it:

```
sudo chmod 644 /etc/kcm-setup/docker-compose.yml
```

### On a Docker-Compose Install

With a Docker Compose installation, you can make the same update to your `docker-compose.yml` file as shown above, then restart the Docker containers:

```
docker compose up -d
```

#### For an External PostgreSQL Database

If your KCM host is connected to a remote database, the required ports should already be open. With PostgreSQL, however, the database only accepts connections allowed in the `pg_hba.conf` file, so make sure the location the Python script runs from is allowed.
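For example, to allow password-authenticated connections from the machine running the script, you could add a line like the following to `pg_hba.conf` and reload PostgreSQL (the IP address below is illustrative; substitute your own):

```
# TYPE  DATABASE      USER            ADDRESS           METHOD
host    guacamole_db  guacamole_user  203.0.113.10/32   scram-sha-256
```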

## Installing Modules and Running the Script

From the same directory as the `kcm_export.py` and `KCM_mappings.json` files, create and activate a Python virtual environment:

{% tabs %}
{% tab title="Linux" %}

```
sudo apt install python3-venv
python3 -m venv venv
source venv/bin/activate
```

{% endtab %}

{% tab title="Windows" %}

```
python -m venv venv
venv\Scripts\activate
```

{% endtab %}
{% endtabs %}

Next, install Python modules necessary for the script:

{% tabs %}
{% tab title="MySQL" %}

```
pip install pyYAML
pip install mysql-connector

pip install rich
```

{% endtab %}

{% tab title="PostgreSQL" %}

```
pip install pyYAML
pip install psycopg2-binary

pip install rich
```

{% endtab %}
{% endtabs %}

* The `pyYAML` module will be used to read the `docker-compose.yml` file.
* The `mysql-connector` / `psycopg2-binary` modules will be used to connect to the database.
* The `rich` module is *only used for console styling and readability*. It is optional; if you wish to exclude it, delete its import statements from the `kcm_export.py` file.
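As a rough illustration of how these modules fit together (a simplified sketch, not the actual `kcm_export.py` code; the function name is made up), `PyYAML` can pull the database credentials straight out of the Auto-Docker compose file:

```python
import yaml  # PyYAML

COMPOSE_PATH = "/etc/kcm-setup/docker-compose.yml"  # Auto-Docker default location

def read_db_settings(path=COMPOSE_PATH):
    """Read the KCM database credentials from a docker-compose.yml file."""
    with open(path) as f:
        compose = yaml.safe_load(f)
    env = compose["db"]["environment"]
    return {
        "database": env["GUACAMOLE_DATABASE"],
        "user": env["GUACAMOLE_USERNAME"],
        "password": env["GUACAMOLE_PASSWORD"],
    }
```

The resulting dictionary can then be passed to `mysql.connector.connect()` or `psycopg2.connect()` to open the database session.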

Finally, run the script:

```
python3 kcm_export.py
```

## Script CLI Process

The script will take you through a series of CLI prompts:

* On the first prompt, enter your target database type:

<figure><img src="https://762006384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MJXOXEifAmpyvNVL1to%2Fuploads%2F0Pl6ob1A5Wt8WJ9sNtYh%2Fimage.png?alt=media&#x26;token=cdfcccab-a2a5-4249-8936-95f9129466ba" alt="PAM KCM import - database prompt" width="563"><figcaption></figcaption></figure>

* On the second prompt, select `1` if your `docker-compose.yml` file is stored locally - which will then prompt for the file location.\
  If your database is remote and the `docker-compose.yml` file isn't on the local device, you can hardcode the connection details in the `kcm_export.py` file before running the script and select `2`.

<figure><img src="https://762006384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MJXOXEifAmpyvNVL1to%2Fuploads%2F6s6vntOyGjNQh8HZAfF0%2Fimage.png?alt=media&#x26;token=a0fc5a93-11d9-44f1-a21e-3802edc2939f" alt="PAM KCM Import - Database connection prompt" width="563"><figcaption></figcaption></figure>

* Next, you will be prompted to choose a Connection group structure - see [Connection Group Modelling](#connection-group-modelling) for more details.

<figure><img src="https://762006384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MJXOXEifAmpyvNVL1to%2Fuploads%2FR3stlShMhi8s6DglVGb7%2Fimage.png?alt=media&#x26;token=bd9cc9bf-de4b-44d7-9555-47dfaf211d46" alt="PAM KCM Import - Connection group structure prompt" width="563"><figcaption></figcaption></figure>

* (Optional) Finally, you will be prompted to choose whether you want to use a template JSON file along with your conversion. This file is only relevant if you want to set blanket parameters on all your resources / users.

<figure><img src="https://762006384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MJXOXEifAmpyvNVL1to%2Fuploads%2FiRibIM4ETAX8SJTBcL6R%2Fimage.png?alt=media&#x26;token=1b85e47a-ec35-4eca-89f4-6f035418346f" alt="PAM KCM Import - template prompt" width="563"><figcaption></figcaption></figure>

This template file contains:

* A pamDirectory resource and its user(s) - these are hardcoded in the template.
* A pamMachine resource and its user - these will be used as the template resource and user for the conversion script. Any field added to those objects that isn't directly overridden by KCM parameters will be added to the converted object.

### Importing the File with Commander

After this last prompt, a `pam_import.json` file should be created in your directory. This file can be imported using the PAM Project Extend command in Commander (see [guide](https://docs.keeper.io/en/keeperpam/privileged-access-manager/references/importing-pam-resources/adding-pam-resources-to-an-existing-model)):

```
pam project extend -c <pam_configuration_uid> -f path/to/pam_import.json
```

{% hint style="info" %}
Don't forget to add the necessary shared folders generated from your connection groups.

A list of them is printed in the CLI, and added to the `pam_import.json` file.
{% endhint %}

If any KCM connection includes Dynamic Tokens, they will be included in the file's `logged_records`. See [KCM Mappings and Logged Records](#kcm-mappings-and-logged-records) for more details.

## Connection Group Modelling

### Folder Path Generation

Connection groups will be modelled as folder paths on the JSON template.

In KCM, connections include the user credentials, but in PAM these are typically kept in separate shared folders. The script therefore splits resources and users into separate folder paths, prefixing the root folder of each path with `KCM Resources -` and `KCM Users -` respectively.

For instance, if you have the following folder path for a connection in KCM:

* Infra / Site 01 / Windows RDP

The script will assign the following folder paths for the resource and user records respectively:

* KCM Resources - Infra / Site 01 / KCM Resource - Windows RDP
* KCM Users - Infra / Site 01 / KCM User - Windows RDP

The root folder and the titles of the resource and user records are prefixed, however any nested folder names are preserved.
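The path transformation described above can be sketched as a small helper (illustrative only; the function name is made up and the real script's logic may differ):

```python
def pam_folder_paths(kcm_path):
    """Map a KCM connection path like 'Infra / Site 01 / Windows RDP'
    to the resource and user folder paths the script generates."""
    parts = [p.strip() for p in kcm_path.split("/")]
    root, middle, name = parts[0], parts[1:-1], parts[-1]

    def build(root_prefix, leaf_prefix):
        # Prefix the root folder and the leaf title; keep nested folders as-is.
        return " / ".join([f"{root_prefix} - {root}", *middle, f"{leaf_prefix} - {name}"])

    return build("KCM Resources", "KCM Resource"), build("KCM Users", "KCM User")
```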

### Folder Structure

As shown in the multiple-choice prompt above, the script offers three ways of parsing KCM connection groups into folder paths:

* **KSM Configuration Based**

The folder paths will largely follow the structure of KCM's connection groups, with the root folder created as a shared folder and nested folders as personal folders. However, if a connection group includes a KSM service configuration, it will be set as a root folder.

This structure assumes that connection groups with KSM service configurations should be segmented, and therefore models them as root shared folders to allow for granular sharing.

* **Keep Exact Nesting**

The folder paths will follow the structure of KCM's connection groups exactly.

* **Flat**

The nesting is discarded - every connection group from KCM will be modelled as a root shared folder.

## KCM Mappings and Logged Records

The output JSON file generated by the script may include a list of `logged_records`. Connections found to include Dynamic Tokens will appear in this list. However, you can also log other records based on specific parameters.

The script runs alongside a `KCM_mappings.json` file, a dictionary of mappings between KCM parameters and PAM Project parameters. Unsupported parameters are mapped as `null`, while parameters that don't apply to PAM are mapped as `ignore`. By mapping a parameter as `log`, any connection found with that parameter will be added to your `logged_records` output.

This could be useful if you need to track specific parameters for rework after the import.
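For instance, a fragment of the mappings dictionary might take a shape like this (the parameter names and mapped values below are purely illustrative, not the actual contents of `KCM_mappings.json`):

```json
{
    "hostname": "pamHostname.hostName",
    "port": "pamHostname.port",
    "recording-path": "ignore",
    "wol-send-packet": null,
    "sftp-hostname": "log"
}
```

Changing a parameter's value to `"log"` before running the script would flag every connection using that parameter in your `logged_records` output.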

## SFTP Handling

In KCM, SFTP parameters include the host and credentials. In PAM, this is handled differently: the connection instead points to a pamMachine / pamUser record.

If a connection is found to include SFTP parameters, the script will automatically create a new pamMachine and pamUser with port 22, and store them in a folder labelled SFTP one level down from the connection's folder.

## Debugging

You can enable debugging by changing the `DEBUG` constant to `True` at the head of the `kcm_export.py` file.
