Mirror of https://github.com/pgadmin-org/pgadmin4.git (synced 2025-02-25 18:55:31 -06:00)
Added support to import/export server groups and servers from GUI. Fixes #4803
This commit is contained in:
parent c1ad7d81f4
commit 9dd957a2aa

Changed files:
  docs/en_US/images/
    import_export_servers_refresh_tree.png, import_export_servers_step1.png,
    import_export_servers_step2.png, import_export_servers_step3.png
  docs/en_US/
    import_export_servers.rst, release_notes_6_4.rst, tree_control.rst
  web/
    package.json, pgadmin/, regression/javascript/schema_ui_files/,
    setup.py, webpack.config.js, webpack.shim.js, yarn.lock
BIN  docs/en_US/images/import_export_servers_refresh_tree.png (new file, 42 KiB)
BIN  docs/en_US/images/import_export_servers_step1.png (new file, 79 KiB)
BIN  docs/en_US/images/import_export_servers_step2.png (new file, 95 KiB)
BIN  docs/en_US/images/import_export_servers_step3.png (new file, 80 KiB)
docs/en_US/import_export_servers.rst

@@ -1,4 +1,4 @@
-.. _export_import_servers:
+.. _import_export_servers:
 
 ******************************
 `Import/Export Servers`:index:

@@ -6,14 +6,62 @@
 Server definitions (and their groups) can be exported to a JSON file and
 re-imported to the same or a different system to enable easy pre-configuration
-of pgAdmin. The ``setup.py`` script is used for this purpose.
+of pgAdmin.
 
-.. note:: To export or import servers, you must use the Python interpreter that
-    is normally used to run pgAdmin to ensure that the required Python
-    packages are available. In most packages, this can be found in the
-    Python Virtual Environment that can be found in the installation
-    directory. When using platform-native packages, the system installation
-    of Python may be the one used by pgAdmin.
+Using pgAdmin 4 GUI
+###################
+
+To launch the *Import/Export Servers...* tool, navigate to *Tools* on the
+menu bar and click the *Import/Export Servers* option.
+
+.. image:: images/import_export_servers_step1.png
+    :alt: Import/Export Servers step one page
+    :align: center
+
+* Use the *Import/Export* field to select whether Server Groups/Servers are
+  to be imported or exported.
+
+* Use the *Filename* field to select the JSON file from which servers will be
+  imported or, in the case of an export, to specify the new file to which the
+  selected servers will be written in JSON format.
+
+* Use the *Replace existing servers?* field to specify whether the existing
+  servers should be replaced. This field applies only when importing servers.
+
+Click the *Next* button to continue, or the *X* button to close the wizard.
+
+.. image:: images/import_export_servers_step2.png
+    :alt: Import/Export Servers step two page
+    :align: center
+
+* Select the Server Groups/Servers to be imported or exported.
+
+Click the *Next* button to continue, or the *X* button to close the wizard.
+
+.. image:: images/import_export_servers_step3.png
+    :alt: Import/Export Servers step three page
+    :align: center
+
+Check the summary of the imported/exported servers on the Summary page.
+
+Click the *Finish* button to close the wizard.
+
+.. image:: images/import_export_servers_refresh_tree.png
+    :alt: Import/Export Servers Tree Refresh
+    :align: center
+
+When servers are imported, the confirmation dialog shown above is displayed,
+asking whether to refresh the browser tree now or later.
+
+
+Using 'setup.py' command line script
+####################################
+
+.. note:: To export or import servers using the ``setup.py`` script, you must
+    use the Python interpreter that is normally used to run pgAdmin, so that
+    the required Python packages are available. For most packages, this is
+    the Python virtual environment in the installation directory. When using
+    platform-native packages, the system installation of Python may be the
+    one used by pgAdmin.
 
 Exporting Servers
 *****************
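For reference, a sketch of the kind of JSON file the export produces and the import consumes. The attribute names match those handled by the import/export code in this commit (for example Name, Group, Host, Port, Username, SSLMode, MaintenanceDB); only attributes that have a value are written, and the names and values below are placeholders:

    {
        "Servers": {
            "1": {
                "Name": "Local PostgreSQL",
                "Group": "Servers",
                "Host": "localhost",
                "Port": 5432,
                "MaintenanceDB": "postgres",
                "Username": "postgres",
                "SSLMode": "prefer",
                "Comment": "Example server definition"
            }
        }
    }

From the command line, the same file can be produced or loaded with the ``setup.py`` script; the flag names below are inferred from the script's dump_servers/load_servers handlers and may differ between releases:

    python setup.py --dump-servers servers.json --user user@example.com
    python setup.py --load-servers servers.json --user user@example.com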
docs/en_US/release_notes_6_4.rst

@@ -9,6 +9,7 @@ This release contains a number of bug fixes and new features since the release o
 New features
 ************
 
+| `Issue #4803 <https://redmine.postgresql.org/issues/4803>`_ - Added support to import/export server groups and servers from GUI.
 
 Housekeeping
 ************

@@ -21,7 +22,7 @@ Bug fixes
 | `Issue #6745 <https://redmine.postgresql.org/issues/6745>`_ - Fixed an issue where Tablespace is created though an error is shown on the dialog.
 | `Issue #7003 <https://redmine.postgresql.org/issues/7003>`_ - Fixed an issue where Explain Analyze shows negative exclusive time.
 | `Issue #7034 <https://redmine.postgresql.org/issues/7034>`_ - Fixed an issue where Columns with default value not showing when adding a new row.
 | `Issue #7075 <https://redmine.postgresql.org/issues/7075>`_ - Ensure that help should be visible properly for Procedures.
 | `Issue #7077 <https://redmine.postgresql.org/issues/7077>`_ - Fixed an issue where the Owner is not displayed in the reverse engineering SQL for Procedures.
 | `Issue #7078 <https://redmine.postgresql.org/issues/7078>`_ - Fixed an issue where an operation error message pop up showing the database object's name incorrectly.
 | `Issue #7081 <https://redmine.postgresql.org/issues/7081>`_ - Fixed an issue in SQL generation for PostgreSQL-14 functions.
docs/en_US/tree_control.rst

@@ -79,7 +79,7 @@ The context-sensitive menus associated with *Tables* and nested *Table* nodes pr
 +-------------------------+------------------------------------------------------------------------------------------------------------------------------+
 | Option                  | Action                                                                                                                         |
 +=========================+================================================================================================================================+
-| *Import/Export...*      | Click open the :ref:`Import/Export... <import_export_data>` dialog to import data to or export data from the selected table.  |
+| *Import/Export Data...* | Click open the :ref:`Import/Export... <import_export_data>` dialog to import data to or export data from the selected table.  |
 +-------------------------+------------------------------------------------------------------------------------------------------------------------------+
 | *Reset Statistics*      | Click to reset statistics for the selected table.                                                                              |
 +-------------------------+------------------------------------------------------------------------------------------------------------------------------+
web/package.json

@@ -143,6 +143,7 @@
     "raf": "^3.4.1",
     "react": "^17.0.1",
     "react-aspen": "^1.1.0",
+    "react-checkbox-tree": "^1.7.2",
     "react-dom": "^17.0.1",
     "react-draggable": "^4.4.4",
     "react-select": "^4.2.1",
@@ -25,3 +25,4 @@
 @import 'node_modules/xterm/css/xterm.css';
 
 @import 'node_modules/jsoneditor/dist/jsoneditor.min.css';
+@import 'node_modules/react-checkbox-tree/lib/react-checkbox-tree.css';
web/pgadmin/static/js/components/CheckBoxTree.jsx (new file, 73 lines)

@@ -0,0 +1,73 @@
import React from 'react';
import { makeStyles } from '@material-ui/core/styles';
import CheckboxTree from 'react-checkbox-tree';
import CheckBoxIcon from '@material-ui/icons/CheckBox';
import CheckBoxOutlineBlankIcon from '@material-ui/icons/CheckBoxOutlineBlank';
import IndeterminateCheckBoxIcon from '@material-ui/icons/IndeterminateCheckBox';
import ExpandMoreIcon from '@material-ui/icons/ExpandMore';
import ChevronRightIcon from '@material-ui/icons/ChevronRight';
import PropTypes from 'prop-types';

const useStyles = makeStyles((theme) =>
  ({
    treeRoot: {
      '& .rct-collapse, .rct-checkbox': {
        padding: 0
      },
      '& .rct-node-leaf': {
        padding: '0 0 0 10px'
      },
      '& .react-checkbox-tree': {
        height: '97%',
        fontSize: '0.815rem',
        overflow: 'auto',
        ...theme.mixins.panelBorder
      },
      height: '100%'
    },
    unchecked: {
      fill: theme.otherVars.borderColor
    },
    checked: {
      fill: theme.palette.primary.main
    }
  })
);

export default function CheckBoxTree({treeData, ...props}) {
  const [checked, setChecked] = React.useState([]);
  const [expanded, setExpanded] = React.useState([]);

  const classes = useStyles();

  React.useEffect(() => {
    if (props.getSelectedServers) {
      props.getSelectedServers(checked);
    }
  }, [checked]);

  return (
    <div className={classes.treeRoot}>
      <CheckboxTree
        nodes={treeData}
        checked={checked}
        expanded={expanded}
        onCheck={checked => setChecked(checked)}
        onExpand={expanded => setExpanded(expanded)}
        showNodeIcon={false}
        icons={{
          check: <CheckBoxIcon className={classes.checked}/>,
          uncheck: <CheckBoxOutlineBlankIcon className={classes.unchecked}/>,
          halfCheck: <IndeterminateCheckBoxIcon className={classes.checked}/>,
          expandClose: <ChevronRightIcon />,
          expandOpen: <ExpandMoreIcon />,
          leaf: <ChevronRightIcon />
        }}
      />
    </div>
  );
}

CheckBoxTree.propTypes = {
  treeData: PropTypes.array,
  getSelectedServers: PropTypes.func
};
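A sketch of the treeData shape this component expects: the value/label/children structure returned by the tool's get_servers and load_servers endpoints elsewhere in this commit (the group and server names below are placeholders):

    [
        {
            "value": "Servers",
            "label": "Servers",
            "children": [
                {"value": 1, "label": "Local PostgreSQL"},
                {"value": 2, "label": "Staging"}
            ]
        }
    ]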
web/pgadmin/tools/import_export_servers/__init__.py (new file, 201 lines)

@@ -0,0 +1,201 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2021, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################

"""A blueprint module implementing the import and export servers
functionality"""

import json
import os
import random

from flask import url_for, Response, render_template, request
from flask_babel import gettext as _
from flask_security import login_required, current_user
from pgadmin.utils import PgAdminModule
from pgadmin.utils.ajax import bad_request
from pgadmin.utils.constants import MIMETYPE_APP_JS
from pgadmin.utils.ajax import make_json_response, internal_server_error
from pgadmin.model import ServerGroup, Server
from pgadmin.utils import clear_database_servers, dump_database_servers,\
    load_database_servers

MODULE_NAME = 'import_export_servers'


class ImportExportServersModule(PgAdminModule):
    """
    class ImportExportServersModule(PgAdminModule)

    A module class for import/export servers which is derived from
    PgAdminModule.

    Methods:
    -------
    * get_own_javascripts(self)
      - Method is used to load the required javascript files for this module
    """

    LABEL = _('Import/Export Servers')

    def get_own_javascripts(self):
        scripts = list()
        for name, script in [
            ['pgadmin.tools.import_export_servers', 'js/import_export_servers']
        ]:
            scripts.append({
                'name': name,
                'path': url_for('import_export_servers.index') + script,
                'when': None
            })

        return scripts

    def get_exposed_url_endpoints(self):
        """
        Returns:
            list: URL endpoints for the import/export servers module
        """
        return ['import_export_servers.get_servers',
                'import_export_servers.load_servers',
                'import_export_servers.save']


blueprint = ImportExportServersModule(MODULE_NAME, __name__)


@blueprint.route("/")
@login_required
def index():
    return bad_request(errormsg=_("This URL cannot be called directly."))


@blueprint.route("/js/import_export_servers.js")
@login_required
def script():
    """render the import/export servers javascript file"""
    return Response(
        response=render_template(
            "import_export_servers/js/import_export_servers.js", _=_),
        status=200,
        mimetype=MIMETYPE_APP_JS
    )


@blueprint.route('/get_servers', methods=['GET'], endpoint='get_servers')
@login_required
def get_servers():
    """
    This function is used to get the servers with their server groups.
    """
    all_servers = []

    groups = ServerGroup.query.filter_by(
        user_id=current_user.id
    ).order_by("id")

    # Loop through all the server groups
    for idx, group in enumerate(groups):
        children = []
        # Loop through all the servers of the specific server group
        servers = Server.query.filter(
            Server.user_id == current_user.id,
            Server.servergroup_id == group.id)
        for server in servers:
            children.append({'value': server.id, 'label': server.name})

        all_servers.append(
            {'value': group.name, 'label': group.name, 'children': children})

    return make_json_response(success=1, data=all_servers)


@blueprint.route('/load_servers', methods=['POST'], endpoint='load_servers')
@login_required
def load_servers():
    """
    This function is used to load the servers from the JSON file.
    """
    filename = None
    groups = {}
    all_servers = []

    data = request.form if request.form else json.loads(request.data.decode())
    if 'filename' in data:
        filename = data['filename']

    if filename is not None and os.path.exists(filename):
        try:
            with open(filename, 'r') as j:
                data = json.loads(j.read())
                if 'Servers' in data:
                    for server in data["Servers"]:
                        obj = data["Servers"][server]
                        # Append a random number to create a unique tree item
                        server_id = server + '_' + str(random.randint(1, 9999))

                        if obj['Group'] in groups:
                            groups[obj['Group']]['children'].append(
                                {'value': server_id,
                                 'label': obj['Name']})
                        else:
                            groups[obj['Group']] = \
                                {'value': obj['Group'], 'label': obj['Group'],
                                 'children': [{
                                     'value': server_id,
                                     'label': obj['Name']}]}
                else:
                    return internal_server_error(
                        _('The specified file is not in the correct format.'))

            for item in groups:
                all_servers.append(groups[item])
        except Exception:
            return internal_server_error(
                _('Unable to load the specified file.'))
    else:
        return internal_server_error(_('The specified file does not exist.'))

    return make_json_response(success=1, data=all_servers)


@blueprint.route('/save', methods=['POST'], endpoint='save')
@login_required
def save():
    """
    This function is used to import or export servers based on the data.
    """
    required_args = [
        'type', 'filename'
    ]

    data = request.form if request.form else json.loads(request.data.decode())
    for arg in required_args:
        if arg not in data:
            return make_json_response(
                status=410,
                success=0,
                errormsg=_(
                    "Could not find the required parameter ({})."
                ).format(arg)
            )

    status = False
    errmsg = None
    summary_data = []
    if data['type'] == 'export':
        status, errmsg, summary_data = \
            dump_database_servers(data['filename'], data['selected_sever_ids'])
    elif data['type'] == 'import':
        # Clear all the existing servers first, if requested
        if 'replace_servers' in data and data['replace_servers']:
            clear_database_servers()
        status, errmsg, summary_data = \
            load_database_servers(data['filename'], data['selected_sever_ids'])

    if not status:
        return internal_server_error(errmsg)

    return make_json_response(success=1, data=summary_data)
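For illustration, a sketch of the request body the save endpoint expects from the wizard; the field names, including the selected_sever_ids spelling, follow the handler above, and the values are placeholders:

    {
        "type": "export",
        "filename": "/path/to/servers.json",
        "selected_sever_ids": [1, 2]
    }

For an import, "type" is "import", selected_sever_ids carries the keys read from the file, and "replace_servers" may additionally be set to true to clear the existing configuration first.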
@ -0,0 +1,259 @@
|
|||||||
|
/////////////////////////////////////////////////////////////
|
||||||
|
//
|
||||||
|
// pgAdmin 4 - PostgreSQL Tools
|
||||||
|
//
|
||||||
|
// Copyright (C) 2013 - 2021, The pgAdmin Development Team
|
||||||
|
// This software is released under the PostgreSQL Licence
|
||||||
|
//
|
||||||
|
//////////////////////////////////////////////////////////////
|
||||||
|
|
||||||
|
import gettext from 'sources/gettext';
|
||||||
|
import _ from 'lodash';
|
||||||
|
import url_for from 'sources/url_for';
|
||||||
|
import React from 'react';
|
||||||
|
import { Box, Paper} from '@material-ui/core';
|
||||||
|
import { makeStyles } from '@material-ui/core/styles';
|
||||||
|
import Wizard from '../../../../static/js/helpers/wizard/Wizard';
|
||||||
|
import WizardStep from '../../../../static/js/helpers/wizard/WizardStep';
|
||||||
|
import { FormFooterMessage, MESSAGE_TYPE } from '../../../../static/js/components/FormComponents';
|
||||||
|
import SchemaView from '../../../../static/js/SchemaView';
|
||||||
|
import Loader from 'sources/components/Loader';
|
||||||
|
import ImportExportSelectionSchema from './import_export_selection.ui';
|
||||||
|
import CheckBoxTree from '../../../../static/js/components/CheckBoxTree';
|
||||||
|
import getApiInstance from '../../../../static/js/api_instance';
|
||||||
|
import Alertify from 'pgadmin.alertifyjs';
|
||||||
|
import { commonTableStyles } from '../../../../static/js/Theme';
|
||||||
|
import clsx from 'clsx';
|
||||||
|
import Notify from '../../../../static/js/helpers/Notifier';
|
||||||
|
import pgAdmin from 'sources/pgadmin';
|
||||||
|
|
||||||
|
const useStyles = makeStyles(() =>
|
||||||
|
({
|
||||||
|
root: {
|
||||||
|
height: '100%'
|
||||||
|
},
|
||||||
|
treeContainer: {
|
||||||
|
flexGrow: 1,
|
||||||
|
minHeight: 0,
|
||||||
|
},
|
||||||
|
boxText: {
|
||||||
|
paddingBottom: '5px'
|
||||||
|
},
|
||||||
|
noOverflow: {
|
||||||
|
overflow: 'hidden'
|
||||||
|
},
|
||||||
|
summaryContainer: {
|
||||||
|
flexGrow: 1,
|
||||||
|
minHeight: 0,
|
||||||
|
overflow: 'auto',
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
|
||||||
|
export default function ImportExportServers() {
|
||||||
|
const classes = useStyles();
|
||||||
|
const tableClasses = commonTableStyles();
|
||||||
|
|
||||||
|
var steps = ['Import/Export', 'Database Servers', 'Summary'];
|
||||||
|
const [loaderText, setLoaderText] = React.useState('');
|
||||||
|
const [errMsg, setErrMsg] = React.useState('');
|
||||||
|
const [selectionFormData, setSelectionFormData] = React.useState({});
|
||||||
|
const [serverData, setServerData] = React.useState([]);
|
||||||
|
const [selectedServers, setSelectedServers] = React.useState([]);
|
||||||
|
const [summaryData, setSummaryData] = React.useState([]);
|
||||||
|
const [summaryText, setSummaryText] = React.useState('');
|
||||||
|
const api = getApiInstance();
|
||||||
|
|
||||||
|
const onSave = () => {
|
||||||
|
if (selectionFormData.imp_exp == 'i') {
|
||||||
|
Notify.confirm(
|
||||||
|
gettext('Browser tree refresh required'),
|
||||||
|
gettext('A browser tree refresh is required. Do you wish to refresh the tree?'),
|
||||||
|
function() {
|
||||||
|
pgAdmin.Browser.tree.destroy({
|
||||||
|
success: function() {
|
||||||
|
pgAdmin.Browser.initializeBrowserTree(pgAdmin.Browser);
|
||||||
|
return true;
|
||||||
|
},
|
||||||
|
});
|
||||||
|
},
|
||||||
|
function() {
|
||||||
|
return true;
|
||||||
|
},
|
||||||
|
gettext('Refresh'),
|
||||||
|
gettext('Later')
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
Alertify.importExportWizardDialog().close();
|
||||||
|
};
|
||||||
|
|
||||||
|
const disableNextCheck = (stepId) => {
|
||||||
|
if (stepId == 0) {
|
||||||
|
return _.isEmpty(selectionFormData.filename);
|
||||||
|
} else if (stepId == 1) {
|
||||||
|
return selectedServers.length < 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
return false;
|
||||||
|
};
|
||||||
|
|
||||||
|
const onDialogHelp= () => {
|
||||||
|
window.open(url_for('help.static', { 'filename': 'import_export_servers.html' }), 'pgadmin_help');
|
||||||
|
};
|
||||||
|
|
||||||
|
const onErrClose = React.useCallback(()=>{
|
||||||
|
setErrMsg('');
|
||||||
|
});
|
||||||
|
|
||||||
|
const wizardStepChange= (data) => {
|
||||||
|
switch (data.currentStep) {
|
||||||
|
case 2: {
|
||||||
|
let post_data = {'filename': selectionFormData.filename},
|
||||||
|
save_url = url_for('import_export_servers.save');
|
||||||
|
if (selectionFormData.imp_exp == 'e') {
|
||||||
|
setLoaderText('Exporting Server Groups/Servers ...');
|
||||||
|
setSummaryText('Exported following Server Groups/Servers:');
|
||||||
|
|
||||||
|
post_data['type'] = 'export';
|
||||||
|
post_data['selected_sever_ids'] = selectedServers;
|
||||||
|
api.post(save_url, post_data)
|
||||||
|
.then(res => {
|
||||||
|
setLoaderText('');
|
||||||
|
setSummaryData(res.data.data);
|
||||||
|
})
|
||||||
|
.catch((err) => {
|
||||||
|
setLoaderText('');
|
||||||
|
setErrMsg(err.response.data.errormsg);
|
||||||
|
});
|
||||||
|
} else if (selectionFormData.imp_exp == 'i') {
|
||||||
|
setLoaderText('Importing Server Groups/Servers ...');
|
||||||
|
setSummaryText('Imported following Server Groups/Servers:');
|
||||||
|
// Remove the random number added to create unique tree item,
|
||||||
|
let selected_sever_ids = [];
|
||||||
|
selectedServers.forEach((id) => {
|
||||||
|
selected_sever_ids.push(id.split('_')[0]);
|
||||||
|
});
|
||||||
|
|
||||||
|
post_data['type'] = 'import';
|
||||||
|
post_data['selected_sever_ids'] = selected_sever_ids;
|
||||||
|
post_data['replace_servers'] = selectionFormData.replace_servers;
|
||||||
|
|
||||||
|
api.post(save_url, post_data)
|
||||||
|
.then(res => {
|
||||||
|
setLoaderText('');
|
||||||
|
setSummaryData(res.data.data);
|
||||||
|
})
|
||||||
|
.catch((err) => {
|
||||||
|
setLoaderText('');
|
||||||
|
setErrMsg(err.response.data.errormsg);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
default:
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const onBeforeNext = (activeStep)=>{
|
||||||
|
return new Promise((resolve, reject)=>{
|
||||||
|
if(activeStep == 0) {
|
||||||
|
setLoaderText('Loading Servers/Server Groups ...');
|
||||||
|
if (selectionFormData.imp_exp == 'e') {
|
||||||
|
var get_servers_url = url_for('import_export_servers.get_servers');
|
||||||
|
api.get(get_servers_url)
|
||||||
|
.then(res => {
|
||||||
|
setLoaderText('');
|
||||||
|
setServerData(res.data.data);
|
||||||
|
resolve();
|
||||||
|
})
|
||||||
|
.catch(() => {
|
||||||
|
setLoaderText('');
|
||||||
|
setErrMsg(gettext('Error while fetching Server Groups and Servers.'));
|
||||||
|
reject();
|
||||||
|
});
|
||||||
|
} else if (selectionFormData.imp_exp == 'i') {
|
||||||
|
var load_servers_url = url_for('import_export_servers.load_servers');
|
||||||
|
const post_data = {
|
||||||
|
filename: selectionFormData.filename
|
||||||
|
};
|
||||||
|
api.post(load_servers_url, post_data)
|
||||||
|
.then(res => {
|
||||||
|
setLoaderText('');
|
||||||
|
setServerData(res.data.data);
|
||||||
|
resolve();
|
||||||
|
})
|
||||||
|
.catch((err) => {
|
||||||
|
setLoaderText('');
|
||||||
|
setErrMsg(err.response.data.errormsg);
|
||||||
|
reject();
|
||||||
|
});
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
resolve();
|
||||||
|
}
|
||||||
|
});
|
||||||
|
};
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Box className={classes.root}>
|
||||||
|
<Loader message={loaderText} />
|
||||||
|
<Wizard
|
||||||
|
title={gettext('Import/Export Servers')}
|
||||||
|
stepList={steps}
|
||||||
|
disableNextStep={disableNextCheck}
|
||||||
|
onStepChange={wizardStepChange}
|
||||||
|
onSave={onSave}
|
||||||
|
onHelp={onDialogHelp}
|
||||||
|
beforeNext={onBeforeNext}
|
||||||
|
>
|
||||||
|
<WizardStep stepId={0}>
|
||||||
|
<SchemaView
|
||||||
|
formType={'dialog'}
|
||||||
|
getInitData={() => { }}
|
||||||
|
viewHelperProps={{ mode: 'create' }}
|
||||||
|
schema={new ImportExportSelectionSchema()}
|
||||||
|
showFooter={false}
|
||||||
|
isTabView={false}
|
||||||
|
onDataChange={(isChanged, changedData) => {
|
||||||
|
setSelectionFormData(changedData);
|
||||||
|
}}
|
||||||
|
/>
|
||||||
|
<FormFooterMessage type={MESSAGE_TYPE.ERROR} message={errMsg} onClose={onErrClose} />
|
||||||
|
</WizardStep>
|
||||||
|
<WizardStep stepId={1} className={classes.noOverflow}>
|
||||||
|
<Box className={classes.boxText}>{gettext('Select the Server Groups/Servers to import/export:')}</Box>
|
||||||
|
<Box className={classes.treeContainer}>
|
||||||
|
<CheckBoxTree treeData={serverData} getSelectedServers={(selectedServers) => {
|
||||||
|
setSelectedServers(selectedServers);
|
||||||
|
}}/>
|
||||||
|
</Box>
|
||||||
|
</WizardStep>
|
||||||
|
<WizardStep stepId={2} className={classes.noOverflow}>
|
||||||
|
<Box className={classes.boxText}>{gettext(summaryText)}</Box>
|
||||||
|
<Paper variant="outlined" elevation={0} className={classes.summaryContainer}>
|
||||||
|
<table className={clsx(tableClasses.table)}>
|
||||||
|
<thead>
|
||||||
|
<tr>
|
||||||
|
<th>Server Group</th>
|
||||||
|
<th>Server</th>
|
||||||
|
</tr>
|
||||||
|
</thead>
|
||||||
|
<tbody>
|
||||||
|
{summaryData.map((row) => (
|
||||||
|
<tr key={row.srno}>
|
||||||
|
<td>
|
||||||
|
{row.server_group}
|
||||||
|
</td>
|
||||||
|
<td>{row.server}</td>
|
||||||
|
</tr>
|
||||||
|
))}
|
||||||
|
</tbody>
|
||||||
|
</table>
|
||||||
|
</Paper>
|
||||||
|
</WizardStep>
|
||||||
|
</Wizard>
|
||||||
|
</Box>
|
||||||
|
);
|
||||||
|
}
|
@ -0,0 +1,98 @@
|
|||||||
|
/////////////////////////////////////////////////////////////
|
||||||
|
//
|
||||||
|
// pgAdmin 4 - PostgreSQL Tools
|
||||||
|
//
|
||||||
|
// Copyright (C) 2013 - 2021, The pgAdmin Development Team
|
||||||
|
// This software is released under the PostgreSQL Licence
|
||||||
|
//
|
||||||
|
//////////////////////////////////////////////////////////////
|
||||||
|
import gettext from 'sources/gettext';
|
||||||
|
import BaseUISchema from 'sources/SchemaView/base_schema.ui';
|
||||||
|
import { isEmptyString } from 'sources/validators';
|
||||||
|
|
||||||
|
export default class ImportExportSelectionSchema extends BaseUISchema {
|
||||||
|
constructor(initData = {}) {
|
||||||
|
super({
|
||||||
|
imp_exp: 'i',
|
||||||
|
filename: undefined,
|
||||||
|
replace_servers: false,
|
||||||
|
...initData
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
get idAttribute() {
|
||||||
|
return 'id';
|
||||||
|
}
|
||||||
|
|
||||||
|
get baseFields() {
|
||||||
|
return [{
|
||||||
|
id: 'imp_exp',
|
||||||
|
label: gettext('Import/Export'),
|
||||||
|
type: 'toggle',
|
||||||
|
options: [
|
||||||
|
{'label': gettext('Import'), 'value': 'i'},
|
||||||
|
{'label': gettext('Export'), 'value': 'e'},
|
||||||
|
]
|
||||||
|
}, {
|
||||||
|
id: 'filename',
|
||||||
|
label: gettext('Filename'),
|
||||||
|
type: (state)=>{
|
||||||
|
if (state.imp_exp == 'e') {
|
||||||
|
return {
|
||||||
|
type: 'file',
|
||||||
|
controlProps: {
|
||||||
|
dialogType: 'create_file',
|
||||||
|
supportedTypes: ['json'],
|
||||||
|
dialogTitle: 'Create file',
|
||||||
|
},
|
||||||
|
};
|
||||||
|
}
|
||||||
|
return {
|
||||||
|
type: 'file',
|
||||||
|
controlProps: {
|
||||||
|
dialogType: 'select_file',
|
||||||
|
supportedTypes: ['json'],
|
||||||
|
dialogTitle: 'Select file',
|
||||||
|
},
|
||||||
|
};
|
||||||
|
},
|
||||||
|
deps: ['imp_exp'],
|
||||||
|
depChange: (state, source, topState, actionObj)=> {
|
||||||
|
if (state.imp_exp != actionObj.oldState.imp_exp) {
|
||||||
|
state.filename = undefined;
|
||||||
|
}
|
||||||
|
},
|
||||||
|
helpMessage: gettext('Supports only JSON format.')
|
||||||
|
}, {
|
||||||
|
id: 'replace_servers',
|
||||||
|
label: gettext('Replace existing servers?'),
|
||||||
|
type: 'switch', deps: ['imp_exp'],
|
||||||
|
depChange: (state)=> {
|
||||||
|
if (state.imp_exp == 'e') {
|
||||||
|
state.replace_servers = false;
|
||||||
|
}
|
||||||
|
},
|
||||||
|
disabled: function (state) {
|
||||||
|
if (state.imp_exp == 'e') {
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
}];
|
||||||
|
}
|
||||||
|
|
||||||
|
validate(state, setError) {
|
||||||
|
if (isEmptyString(state.service)) {
|
||||||
|
let errmsg = null;
|
||||||
|
/* events validation*/
|
||||||
|
if (!state.filename) {
|
||||||
|
errmsg = gettext('Please provide a filename.');
|
||||||
|
setError('filename', errmsg);
|
||||||
|
return true;
|
||||||
|
} else {
|
||||||
|
errmsg = null;
|
||||||
|
setError('filename', errmsg);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
@ -0,0 +1,124 @@
|
|||||||
|
/////////////////////////////////////////////////////////////
|
||||||
|
//
|
||||||
|
// pgAdmin 4 - PostgreSQL Tools
|
||||||
|
//
|
||||||
|
// Copyright (C) 2013 - 2021, The pgAdmin Development Team
|
||||||
|
// This software is released under the PostgreSQL Licence
|
||||||
|
//
|
||||||
|
//////////////////////////////////////////////////////////////
|
||||||
|
|
||||||
|
import React from 'react';
|
||||||
|
import ReactDOM from 'react-dom';
|
||||||
|
import 'pgadmin.file_manager';
|
||||||
|
import gettext from 'sources/gettext';
|
||||||
|
import Alertify from 'pgadmin.alertifyjs';
|
||||||
|
import Theme from 'sources/Theme';
|
||||||
|
import ImportExportServers from './ImportExportServers';
|
||||||
|
import $ from 'jquery';
|
||||||
|
|
||||||
|
export default class ImportExportServersModule {
|
||||||
|
static instance;
|
||||||
|
|
||||||
|
static getInstance(...args) {
|
||||||
|
if(!ImportExportServersModule.instance) {
|
||||||
|
ImportExportServersModule.instance = new ImportExportServersModule(...args);
|
||||||
|
}
|
||||||
|
return ImportExportServersModule.instance;
|
||||||
|
}
|
||||||
|
|
||||||
|
constructor(pgAdmin, pgBrowser) {
|
||||||
|
this.pgAdmin = pgAdmin;
|
||||||
|
this.pgBrowser = pgBrowser;
|
||||||
|
}
|
||||||
|
|
||||||
|
init() {
|
||||||
|
if (this.initialized)
|
||||||
|
return;
|
||||||
|
this.initialized = true;
|
||||||
|
|
||||||
|
// Define the nodes on which the menus to be appear
|
||||||
|
var menus = [{
|
||||||
|
name: 'import_export_servers',
|
||||||
|
module: this,
|
||||||
|
applies: ['tools'],
|
||||||
|
callback: 'showImportExportServers',
|
||||||
|
enable: true,
|
||||||
|
priority: 3,
|
||||||
|
label: gettext('Import/Export Servers...'),
|
||||||
|
icon: 'fa fa-shopping-cart',
|
||||||
|
}];
|
||||||
|
|
||||||
|
this.pgBrowser.add_menus(menus);
|
||||||
|
}
|
||||||
|
|
||||||
|
// This is a callback function to show import/export servers when user click on menu item.
|
||||||
|
showImportExportServers() {
|
||||||
|
// Declare Wizard dialog
|
||||||
|
if (!Alertify.importExportWizardDialog) {
|
||||||
|
Alertify.dialog('importExportWizardDialog', function factory() {
|
||||||
|
|
||||||
|
// Generate wizard main container
|
||||||
|
var $container = $('<div class=\'wizard_dlg\' id=\'importExportServersDlg\'></div>');
|
||||||
|
return {
|
||||||
|
main: function () {
|
||||||
|
},
|
||||||
|
setup: function () {
|
||||||
|
return {
|
||||||
|
// Set options for dialog
|
||||||
|
options: {
|
||||||
|
frameless: true,
|
||||||
|
resizable: true,
|
||||||
|
autoReset: false,
|
||||||
|
maximizable: true,
|
||||||
|
closable: true,
|
||||||
|
closableByDimmer: false,
|
||||||
|
modal: true,
|
||||||
|
pinnable: false,
|
||||||
|
},
|
||||||
|
};
|
||||||
|
},
|
||||||
|
build: function () {
|
||||||
|
this.elements.content.appendChild($container.get(0));
|
||||||
|
Alertify.pgDialogBuild.apply(this);
|
||||||
|
|
||||||
|
setTimeout(function () {
|
||||||
|
if (document.getElementById('importExportServersDlg')) {
|
||||||
|
ReactDOM.render(
|
||||||
|
<Theme>
|
||||||
|
<ImportExportServers />
|
||||||
|
</Theme>,
|
||||||
|
document.getElementById('importExportServersDlg'));
|
||||||
|
Alertify.importExportWizardDialog().elements.modal.style.maxHeight=0;
|
||||||
|
Alertify.importExportWizardDialog().elements.modal.style.maxWidth='none';
|
||||||
|
Alertify.importExportWizardDialog().elements.modal.style.overflow='visible';
|
||||||
|
Alertify.importExportWizardDialog().elements.dimmer.style.display='none';
|
||||||
|
}
|
||||||
|
}, 500);
|
||||||
|
|
||||||
|
},
|
||||||
|
prepare: function () {
|
||||||
|
$container.empty().append('<div class=\'import_export_servers_container\'></div>');
|
||||||
|
},
|
||||||
|
hooks: {
|
||||||
|
// Triggered when the dialog is closed
|
||||||
|
onclose: function () {
|
||||||
|
// Clear the view and remove the react component.
|
||||||
|
return setTimeout((function () {
|
||||||
|
ReactDOM.unmountComponentAtNode(document.getElementById('importExportServersDlg'));
|
||||||
|
return Alertify.importExportWizardDialog().destroy();
|
||||||
|
}), 500);
|
||||||
|
},
|
||||||
|
}
|
||||||
|
};
|
||||||
|
});
|
||||||
|
}
|
||||||
|
Alertify.importExportWizardDialog('').set({
|
||||||
|
onmaximize:function(){
|
||||||
|
Alertify.importExportWizardDialog().elements.modal.style.maxHeight='initial';
|
||||||
|
},
|
||||||
|
onrestore:function(){
|
||||||
|
Alertify.importExportWizardDialog().elements.modal.style.maxHeight=0;
|
||||||
|
},
|
||||||
|
}).resizeTo(880, 550);
|
||||||
|
}
|
||||||
|
}
|
web/pgadmin/tools/import_export_servers/static/js/index.js (new file, 20 lines)

@@ -0,0 +1,20 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2021, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import pgAdmin from 'sources/pgadmin';
import pgBrowser from 'top/browser/static/js/browser';
import ImportExportServersModule from './import_export_servers';

if(!pgAdmin.Tools) {
  pgAdmin.Tools = {};
}
pgAdmin.Tools.ImportExportServersModule = ImportExportServersModule.getInstance(pgAdmin, pgBrowser);

module.exports = {
  ImportExportServersModule: ImportExportServersModule,
};
web/pgadmin/utils/__init__.py

@@ -9,6 +9,7 @@
 import os
 import sys
+import json
 import subprocess
 from collections import defaultdict
 from operator import attrgetter

@@ -20,8 +21,11 @@ from threading import Lock
 from .paths import get_storage_directory
 from .preferences import Preferences
-from pgadmin.model import Server
-from pgadmin.utils.constants import UTILITIES_ARRAY
+from pgadmin.utils.constants import UTILITIES_ARRAY, USER_NOT_FOUND
+from pgadmin.model import db, User, Version, ServerGroup, Server, \
+    SCHEMA_VERSION as CURRENT_SCHEMA_VERSION
+
+ADD_SERVERS_MSG = "Added %d Server Group(s) and %d Server(s)."
 
 
 class PgAdminModule(Blueprint):
@ -370,6 +374,368 @@ def replace_binary_path(binary_path):
|
|||||||
return binary_path
|
return binary_path
|
||||||
|
|
||||||
|
|
||||||
|
def add_value(attr_dict, key, value):
|
||||||
|
"""Add a value to the attribute dict if non-empty.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
attr_dict (dict): The dictionary to add the values to
|
||||||
|
key (str): The key for the new value
|
||||||
|
value (str): The value to add
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
The updated attribute dictionary
|
||||||
|
"""
|
||||||
|
if value != "" and value is not None:
|
||||||
|
attr_dict[key] = value
|
||||||
|
|
||||||
|
return attr_dict
|
||||||
|
|
||||||
|
|
||||||
|
def dump_database_servers(output_file, selected_servers,
|
||||||
|
dump_user=current_user, from_setup=False):
|
||||||
|
"""Dump the server groups and servers.
|
||||||
|
"""
|
||||||
|
user = _does_user_exist(dump_user, from_setup)
|
||||||
|
if user is None:
|
||||||
|
return False, USER_NOT_FOUND % dump_user, []
|
||||||
|
|
||||||
|
user_id = user.id
|
||||||
|
# Dict to collect the output
|
||||||
|
object_dict = {}
|
||||||
|
# Counters
|
||||||
|
servers_dumped = 0
|
||||||
|
|
||||||
|
# Dump servers
|
||||||
|
servers = Server.query.filter_by(user_id=user_id).all()
|
||||||
|
server_dict = {}
|
||||||
|
dump_servers = []
|
||||||
|
for server in servers:
|
||||||
|
if selected_servers is None or str(server.id) in selected_servers:
|
||||||
|
# Get the group name
|
||||||
|
group_name = ServerGroup.query.filter_by(
|
||||||
|
user_id=user_id, id=server.servergroup_id).first().name
|
||||||
|
|
||||||
|
attr_dict = {}
|
||||||
|
add_value(attr_dict, "Name", server.name)
|
||||||
|
add_value(attr_dict, "Group", group_name)
|
||||||
|
add_value(attr_dict, "Host", server.host)
|
||||||
|
add_value(attr_dict, "HostAddr", server.hostaddr)
|
||||||
|
add_value(attr_dict, "Port", server.port)
|
||||||
|
add_value(attr_dict, "MaintenanceDB", server.maintenance_db)
|
||||||
|
add_value(attr_dict, "Username", server.username)
|
||||||
|
add_value(attr_dict, "Role", server.role)
|
||||||
|
add_value(attr_dict, "SSLMode", server.ssl_mode)
|
||||||
|
add_value(attr_dict, "Comment", server.comment)
|
||||||
|
add_value(attr_dict, "Shared", server.shared)
|
||||||
|
add_value(attr_dict, "DBRestriction", server.db_res)
|
||||||
|
add_value(attr_dict, "PassFile", server.passfile)
|
||||||
|
add_value(attr_dict, "SSLCert", server.sslcert)
|
||||||
|
add_value(attr_dict, "SSLKey", server.sslkey)
|
||||||
|
add_value(attr_dict, "SSLRootCert", server.sslrootcert)
|
||||||
|
add_value(attr_dict, "SSLCrl", server.sslcrl)
|
||||||
|
add_value(attr_dict, "SSLCompression", server.sslcompression)
|
||||||
|
add_value(attr_dict, "BGColor", server.bgcolor)
|
||||||
|
add_value(attr_dict, "FGColor", server.fgcolor)
|
||||||
|
add_value(attr_dict, "Service", server.service)
|
||||||
|
add_value(attr_dict, "Timeout", server.connect_timeout)
|
||||||
|
add_value(attr_dict, "UseSSHTunnel", server.use_ssh_tunnel)
|
||||||
|
add_value(attr_dict, "TunnelHost", server.tunnel_host)
|
||||||
|
add_value(attr_dict, "TunnelPort", server.tunnel_port)
|
||||||
|
add_value(attr_dict, "TunnelUsername", server.tunnel_username)
|
||||||
|
add_value(attr_dict, "TunnelAuthentication",
|
||||||
|
server.tunnel_authentication)
|
||||||
|
|
||||||
|
servers_dumped = servers_dumped + 1
|
||||||
|
dump_servers.append({'srno': servers_dumped,
|
||||||
|
'server_group': group_name,
|
||||||
|
'server': server.name})
|
||||||
|
|
||||||
|
server_dict[servers_dumped] = attr_dict
|
||||||
|
|
||||||
|
object_dict["Servers"] = server_dict
|
||||||
|
|
||||||
|
f = None
|
||||||
|
try:
|
||||||
|
f = open(output_file, "w")
|
||||||
|
except Exception as e:
|
||||||
|
return _handle_error("Error opening output file %s: [%d] %s" %
|
||||||
|
(output_file, e.errno, e.strerror), from_setup)
|
||||||
|
|
||||||
|
try:
|
||||||
|
f.write(json.dumps(object_dict, indent=4))
|
||||||
|
except Exception as e:
|
||||||
|
return _handle_error("Error writing output file %s: [%d] %s" %
|
||||||
|
(output_file, e.errno, e.strerror), from_setup)
|
||||||
|
|
||||||
|
f.close()
|
||||||
|
|
||||||
|
msg = "Configuration for %s servers dumped to %s." % \
|
||||||
|
(servers_dumped, output_file)
|
||||||
|
print(msg)
|
||||||
|
|
||||||
|
return True, msg, dump_servers
|
||||||
|
|
||||||
|
|
||||||
|
def _validate_servers_data(data, is_admin):
|
||||||
|
"""
|
||||||
|
Used internally by load_servers to validate servers data.
|
||||||
|
:param data: servers data
|
||||||
|
:return: error message if any
|
||||||
|
"""
|
||||||
|
skip_servers = []
|
||||||
|
# Loop through the servers...
|
||||||
|
if "Servers" not in data:
|
||||||
|
return "'Servers' attribute not found in the specified file."
|
||||||
|
|
||||||
|
for server in data["Servers"]:
|
||||||
|
obj = data["Servers"][server]
|
||||||
|
|
||||||
|
# Check if server is shared.Won't import if user is non-admin
|
||||||
|
if obj.get('Shared', None) and not is_admin:
|
||||||
|
print("Won't import the server '%s' as it is shared " %
|
||||||
|
obj["Name"])
|
||||||
|
skip_servers.append(server)
|
||||||
|
continue
|
||||||
|
|
||||||
|
def check_attrib(attrib):
|
||||||
|
if attrib not in obj:
|
||||||
|
return ("'%s' attribute not found for server '%s'" %
|
||||||
|
(attrib, server))
|
||||||
|
return None
|
||||||
|
|
||||||
|
for attrib in ("Group", "Name"):
|
||||||
|
errmsg = check_attrib(attrib)
|
||||||
|
if errmsg:
|
||||||
|
return errmsg
|
||||||
|
|
||||||
|
is_service_attrib_available = obj.get("Service", None) is not None
|
||||||
|
|
||||||
|
if not is_service_attrib_available:
|
||||||
|
for attrib in ("Port", "Username"):
|
||||||
|
errmsg = check_attrib(attrib)
|
||||||
|
if errmsg:
|
||||||
|
return errmsg
|
||||||
|
|
||||||
|
for attrib in ("SSLMode", "MaintenanceDB"):
|
||||||
|
errmsg = check_attrib(attrib)
|
||||||
|
if errmsg:
|
||||||
|
return errmsg
|
||||||
|
|
||||||
|
if "Host" not in obj and "HostAddr" not in obj and not \
|
||||||
|
is_service_attrib_available:
|
||||||
|
return ("'Host', 'HostAddr' or 'Service' attribute "
|
||||||
|
"not found for server '%s'" % server)
|
||||||
|
|
||||||
|
for server in skip_servers:
|
||||||
|
del data["Servers"][server]
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
def load_database_servers(input_file, selected_servers,
|
||||||
|
load_user=current_user, from_setup=False):
|
||||||
|
"""Load server groups and servers.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
with open(input_file) as f:
|
||||||
|
data = json.load(f)
|
||||||
|
except json.decoder.JSONDecodeError as e:
|
||||||
|
return _handle_error("Error parsing input file %s: %s" %
|
||||||
|
(input_file, e), from_setup)
|
||||||
|
except Exception as e:
|
||||||
|
return _handle_error("Error reading input file %s: [%d] %s" %
|
||||||
|
(input_file, e.errno, e.strerror), from_setup)
|
||||||
|
|
||||||
|
f.close()
|
||||||
|
|
||||||
|
user = _does_user_exist(load_user, from_setup)
|
||||||
|
if user is None:
|
||||||
|
return False, USER_NOT_FOUND % load_user, []
|
||||||
|
|
||||||
|
user_id = user.id
|
||||||
|
# Counters
|
||||||
|
groups_added = 0
|
||||||
|
servers_added = 0
|
||||||
|
|
||||||
|
# Get the server groups
|
||||||
|
groups = ServerGroup.query.filter_by(user_id=user_id)
|
||||||
|
|
||||||
|
# Validate server data
|
||||||
|
error_msg = _validate_servers_data(data, user.has_role("Administrator"))
|
||||||
|
if error_msg is not None and from_setup:
|
||||||
|
print(ADD_SERVERS_MSG % (groups_added, servers_added))
|
||||||
|
return _handle_error(error_msg, from_setup)
|
||||||
|
|
||||||
|
load_servers = []
|
||||||
|
for server in data["Servers"]:
|
||||||
|
if selected_servers is None or str(server) in selected_servers:
|
||||||
|
obj = data["Servers"][server]
|
||||||
|
|
||||||
|
# Get the group. Create if necessary
|
||||||
|
group_id = next(
|
||||||
|
(g.id for g in groups if g.name == obj["Group"]), -1)
|
||||||
|
|
||||||
|
if group_id == -1:
|
||||||
|
new_group = ServerGroup()
|
||||||
|
new_group.name = obj["Group"]
|
||||||
|
new_group.user_id = user_id
|
||||||
|
db.session.add(new_group)
|
||||||
|
|
||||||
|
try:
|
||||||
|
db.session.commit()
|
||||||
|
except Exception as e:
|
||||||
|
if from_setup:
|
||||||
|
print(ADD_SERVERS_MSG % (groups_added, servers_added))
|
||||||
|
return _handle_error(
|
||||||
|
"Error creating server group '%s': %s" %
|
||||||
|
(new_group.name, e), from_setup)
|
||||||
|
|
||||||
|
group_id = new_group.id
|
||||||
|
groups_added = groups_added + 1
|
||||||
|
groups = ServerGroup.query.filter_by(user_id=user_id)
|
||||||
|
|
||||||
|
# Create the server
|
||||||
|
new_server = Server()
|
||||||
|
new_server.name = obj["Name"]
|
||||||
|
new_server.servergroup_id = group_id
|
||||||
|
new_server.user_id = user_id
|
||||||
|
new_server.ssl_mode = obj["SSLMode"]
|
||||||
|
new_server.maintenance_db = obj["MaintenanceDB"]
|
||||||
|
|
||||||
|
new_server.host = obj.get("Host", None)
|
||||||
|
|
||||||
|
new_server.hostaddr = obj.get("HostAddr", None)
|
||||||
|
|
||||||
|
new_server.port = obj.get("Port", None)
|
||||||
|
|
||||||
|
new_server.username = obj.get("Username", None)
|
||||||
|
|
||||||
|
new_server.role = obj.get("Role", None)
|
||||||
|
|
||||||
|
new_server.ssl_mode = obj["SSLMode"]
|
||||||
|
|
||||||
|
new_server.comment = obj.get("Comment", None)
|
||||||
|
|
||||||
|
new_server.db_res = obj.get("DBRestriction", None)
|
||||||
|
|
||||||
|
new_server.passfile = obj.get("PassFile", None)
|
||||||
|
|
||||||
|
new_server.sslcert = obj.get("SSLCert", None)
|
||||||
|
|
||||||
|
new_server.sslkey = obj.get("SSLKey", None)
|
||||||
|
|
||||||
|
new_server.sslrootcert = obj.get("SSLRootCert", None)
|
||||||
|
|
||||||
|
new_server.sslcrl = obj.get("SSLCrl", None)
|
||||||
|
|
||||||
|
new_server.sslcompression = obj.get("SSLCompression", None)
|
||||||
|
|
||||||
|
new_server.bgcolor = obj.get("BGColor", None)
|
||||||
|
|
||||||
|
new_server.fgcolor = obj.get("FGColor", None)
|
||||||
|
|
||||||
|
new_server.service = obj.get("Service", None)
|
||||||
|
|
||||||
|
new_server.connect_timeout = obj.get("Timeout", None)
|
||||||
|
|
||||||
|
new_server.use_ssh_tunnel = obj.get("UseSSHTunnel", None)
|
||||||
|
|
||||||
|
new_server.tunnel_host = obj.get("TunnelHost", None)
|
||||||
|
|
||||||
|
new_server.tunnel_port = obj.get("TunnelPort", None)
|
||||||
|
|
||||||
|
new_server.tunnel_username = obj.get("TunnelUsername", None)
|
||||||
|
|
||||||
|
new_server.tunnel_authentication = \
|
||||||
|
obj.get("TunnelAuthentication", None)
|
||||||
|
|
||||||
|
new_server.shared = \
|
||||||
|
obj.get("Shared", None)
|
||||||
|
|
||||||
|
db.session.add(new_server)
|
||||||
|
|
||||||
|
try:
|
||||||
|
db.session.commit()
|
||||||
|
except Exception as e:
|
||||||
|
if from_setup:
|
||||||
|
print(ADD_SERVERS_MSG % (groups_added, servers_added))
|
||||||
|
return _handle_error("Error creating server '%s': %s" %
|
||||||
|
(new_server.name, e), from_setup)
|
||||||
|
|
||||||
|
servers_added = servers_added + 1
|
||||||
|
load_servers.append({'srno': servers_added,
|
||||||
|
'server_group': obj["Group"],
|
||||||
|
'server': obj["Name"]})
|
||||||
|
|
||||||
|
msg = ADD_SERVERS_MSG % (groups_added, servers_added)
|
||||||
|
print(msg)
|
||||||
|
|
||||||
|
return True, msg, load_servers
|
||||||
|
|
||||||
|
|
||||||
|
def clear_database_servers(load_user=current_user, from_setup=False):
|
||||||
|
"""Clear groups and servers configurations.
|
||||||
|
"""
|
||||||
|
user = _does_user_exist(load_user, from_setup)
|
||||||
|
if user is None:
|
||||||
|
return False
|
||||||
|
|
||||||
|
user_id = user.id
|
||||||
|
|
||||||
|
# Remove all servers
|
||||||
|
servers = Server.query.filter_by(user_id=user_id)
|
||||||
|
for server in servers:
|
||||||
|
db.session.delete(server)
|
||||||
|
|
||||||
|
# Remove all groups
|
||||||
|
groups = ServerGroup.query.filter_by(user_id=user_id)
|
||||||
|
for group in groups:
|
||||||
|
db.session.delete(group)
|
||||||
|
servers = Server.query.filter_by(user_id=user_id)
|
||||||
|
|
||||||
|
for server in servers:
|
||||||
|
db.session.delete(server)
|
||||||
|
|
||||||
|
try:
|
||||||
|
db.session.commit()
|
||||||
|
except Exception as e:
|
||||||
|
error_msg = "Error clearing server configuration with error (%s)" % \
|
||||||
|
str(e)
|
||||||
|
if from_setup:
|
||||||
|
print(error_msg)
|
||||||
|
sys.exit(1)
|
||||||
|
|
||||||
|
return False, error_msg
|
||||||
|
|
||||||
|
|
||||||
|
def _does_user_exist(user, from_setup):
|
||||||
|
"""
|
||||||
|
This function will check user is exist or not. If exist then return
|
||||||
|
"""
|
||||||
|
if isinstance(user, User):
|
||||||
|
user = user.email
|
||||||
|
|
||||||
|
user = User.query.filter_by(email=user).first()
|
||||||
|
|
||||||
|
if user is None:
|
||||||
|
print(USER_NOT_FOUND % user)
|
||||||
|
if from_setup:
|
||||||
|
sys.exit(1)
|
||||||
|
|
||||||
|
return user
|
||||||
|
|
||||||
|
|
||||||
|
def _handle_error(error_msg, from_setup):
|
||||||
|
"""
|
||||||
|
This function is used to print the error msg and exit from app if
|
||||||
|
called from setup.py
|
||||||
|
"""
|
||||||
|
if from_setup:
|
||||||
|
print(error_msg)
|
||||||
|
sys.exit(1)
|
||||||
|
|
||||||
|
return False, error_msg, []
|
||||||
|
|
||||||
|
|
||||||
# Shortcut configuration for Accesskey
|
# Shortcut configuration for Accesskey
|
||||||
ACCESSKEY_FIELDS = [
|
ACCESSKEY_FIELDS = [
|
||||||
{
|
{
|
||||||
|
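As a usage sketch (not part of the diff), the helpers added to pgadmin.utils above are what both the new /save endpoint and setup.py drive. Roughly, within a Flask application context; the app name, email address and file path below are placeholders, and the positional arguments follow the signatures shown in this hunk:

    # Minimal sketch, assuming a configured pgAdmin installation.
    from pgadmin import create_app
    from pgadmin.utils import (clear_database_servers, dump_database_servers,
                               load_database_servers)

    app = create_app('pgadmin-cli')
    with app.app_context():
        # Export every server owned by the user (selected_servers=None means all).
        status, msg, dumped = dump_database_servers(
            '/tmp/servers.json', None, 'user@example.com', True)

        # Optionally clear the existing configuration, then re-import.
        clear_database_servers('user@example.com', True)
        status, msg, loaded = load_database_servers(
            '/tmp/servers.json', None, 'user@example.com', True)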
web/pgadmin/utils/constants.py

@@ -110,3 +110,4 @@ BINARY_PATHS = {
 UTILITIES_ARRAY = ['pg_dump', 'pg_dumpall', 'pg_restore', 'psql']
 
 ENTER_EMAIL_ADDRESS = "Email address: "
+USER_NOT_FOUND = gettext("The specified user ID (%s) could not be found.")
@ -0,0 +1,68 @@
|
|||||||
|
/////////////////////////////////////////////////////////////
|
||||||
|
//
|
||||||
|
// pgAdmin 4 - PostgreSQL Tools
|
||||||
|
//
|
||||||
|
// Copyright (C) 2013 - 2021, The pgAdmin Development Team
|
||||||
|
// This software is released under the PostgreSQL Licence
|
||||||
|
//
|
||||||
|
//////////////////////////////////////////////////////////////
|
||||||
|
|
||||||
|
import jasmineEnzyme from 'jasmine-enzyme';
|
||||||
|
import React from 'react';
|
||||||
|
import '../helper/enzyme.helper';
|
||||||
|
import { createMount } from '@material-ui/core/test-utils';
|
||||||
|
import pgAdmin from 'sources/pgadmin';
|
||||||
|
import { messages } from '../fake_messages';
|
||||||
|
import SchemaView from '../../../pgadmin/static/js/SchemaView';
|
||||||
|
import ImportExportSelectionSchema from '../../../pgadmin/tools/import_export_servers/static/js/import_export_selection.ui';
|
||||||
|
|
||||||
|
describe('ImportExportServers', () => {
|
||||||
|
let mount;
|
||||||
|
let schemaObj = new ImportExportSelectionSchema();
|
||||||
|
|
||||||
|
/* Use createMount so that material ui components gets the required context */
|
||||||
|
/* https://material-ui.com/guides/testing/#api */
|
||||||
|
beforeAll(() => {
|
||||||
|
mount = createMount();
|
||||||
|
});
|
||||||
|
|
||||||
|
afterAll(() => {
|
||||||
|
mount.cleanUp();
|
||||||
|
});
|
||||||
|
|
||||||
|
beforeEach(() => {
|
||||||
|
jasmineEnzyme();
|
||||||
|
/* messages used by validators */
|
||||||
|
pgAdmin.Browser = pgAdmin.Browser || {};
|
||||||
|
pgAdmin.Browser.messages = pgAdmin.Browser.messages || messages;
|
||||||
|
pgAdmin.Browser.utils = pgAdmin.Browser.utils || {};
|
||||||
|
});
|
||||||
|
|
||||||
|
it('import', () => {
|
||||||
|
mount(<SchemaView
|
||||||
|
formType='dialog'
|
||||||
|
schema={schemaObj}
|
||||||
|
viewHelperProps={{
|
||||||
|
mode: 'create',
|
||||||
|
}}
|
||||||
|
onDataChange={() => { }}
|
||||||
|
showFooter={false}
|
||||||
|
isTabView={false}
|
||||||
|
/>);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('export', () => {
|
||||||
|
let schemaObj = new ImportExportSelectionSchema(
|
||||||
|
{imp_exp: 'e', filename: 'test.json'});
|
||||||
|
mount(<SchemaView
|
||||||
|
formType='dialog'
|
||||||
|
schema={schemaObj}
|
||||||
|
viewHelperProps={{
|
||||||
|
mode: 'create',
|
||||||
|
}}
|
||||||
|
onDataChange={() => { }}
|
||||||
|
showFooter={false}
|
||||||
|
isTabView={false}
|
||||||
|
/>);
|
||||||
|
});
|
||||||
|
});
|
322
web/setup.py
322
web/setup.py
@@ -16,8 +16,6 @@ import os
import sys
import builtins

-USER_NOT_FOUND = "The specified user ID (%s) could not be found."
-
# Grab the SERVER_MODE if it's been set by the runtime
if 'SERVER_MODE' in globals():
    builtins.SERVER_MODE = globals()['SERVER_MODE']
@@ -30,26 +28,10 @@ root = os.path.dirname(os.path.realpath(__file__))
if sys.path[0] != root:
    sys.path.insert(0, root)

-from pgadmin.model import db, User, Version, ServerGroup, Server, \
-    SCHEMA_VERSION as CURRENT_SCHEMA_VERSION
+from pgadmin.model import db, Version, SCHEMA_VERSION as CURRENT_SCHEMA_VERSION
from pgadmin import create_app
+from pgadmin.utils import clear_database_servers, dump_database_servers,\
+    load_database_servers
-
-
-def add_value(attr_dict, key, value):
-    """Add a value to the attribute dict if non-empty.
-
-    Args:
-        attr_dict (dict): The dictionary to add the values to
-        key (str): The key for the new value
-        value (str): The value to add
-
-    Returns:
-        The updated attribute dictionary
-    """
-    if value != "" and value is not None:
-        attr_dict[key] = value
-
-    return attr_dict


def dump_servers(args):
@@ -77,139 +59,7 @@ def dump_servers(args):

    app = create_app(config.APP_NAME + '-cli')
    with app.app_context():
-        user = User.query.filter_by(email=dump_user).first()
+        dump_database_servers(args.dump_servers, args.servers, dump_user, True)
-
-        if user is None:
-            print(USER_NOT_FOUND % dump_user)
-            sys.exit(1)
-
-        user_id = user.id
-
-        # Dict to collect the output
-        object_dict = {}
-
-        # Counters
-        servers_dumped = 0
-
-        # Dump servers
-        servers = Server.query.filter_by(user_id=user_id).all()
-        server_dict = {}
-        for server in servers:
-            if args.servers is None or str(server.id) in args.servers:
-                # Get the group name
-                group_name = ServerGroup.query.filter_by(
-                    user_id=user_id, id=server.servergroup_id).first().name
-
-                attr_dict = {}
-                add_value(attr_dict, "Name", server.name)
-                add_value(attr_dict, "Group", group_name)
-                add_value(attr_dict, "Host", server.host)
-                add_value(attr_dict, "HostAddr", server.hostaddr)
-                add_value(attr_dict, "Port", server.port)
-                add_value(attr_dict, "MaintenanceDB", server.maintenance_db)
-                add_value(attr_dict, "Username", server.username)
-                add_value(attr_dict, "Role", server.role)
-                add_value(attr_dict, "SSLMode", server.ssl_mode)
-                add_value(attr_dict, "Comment", server.comment)
-                add_value(attr_dict, "Shared", server.shared)
-                add_value(attr_dict, "DBRestriction", server.db_res)
-                add_value(attr_dict, "PassFile", server.passfile)
-                add_value(attr_dict, "SSLCert", server.sslcert)
-                add_value(attr_dict, "SSLKey", server.sslkey)
-                add_value(attr_dict, "SSLRootCert", server.sslrootcert)
-                add_value(attr_dict, "SSLCrl", server.sslcrl)
-                add_value(attr_dict, "SSLCompression", server.sslcompression)
-                add_value(attr_dict, "BGColor", server.bgcolor)
-                add_value(attr_dict, "FGColor", server.fgcolor)
-                add_value(attr_dict, "Service", server.service)
-                add_value(attr_dict, "Timeout", server.connect_timeout)
-                add_value(attr_dict, "UseSSHTunnel", server.use_ssh_tunnel)
-                add_value(attr_dict, "TunnelHost", server.tunnel_host)
-                add_value(attr_dict, "TunnelPort", server.tunnel_port)
-                add_value(attr_dict, "TunnelUsername", server.tunnel_username)
-                add_value(attr_dict, "TunnelAuthentication",
-                          server.tunnel_authentication)
-
-                servers_dumped = servers_dumped + 1
-
-                server_dict[servers_dumped] = attr_dict
-
-        object_dict["Servers"] = server_dict
-
-        try:
-            f = open(args.dump_servers, "w")
-        except Exception as e:
-            print("Error opening output file %s: [%d] %s" %
-                  (args.dump_servers, e.errno, e.strerror))
-            sys.exit(1)
-
-        try:
-            f.write(json.dumps(object_dict, indent=4))
-        except Exception as e:
-            print("Error writing output file %s: [%d] %s" %
-                  (args.dump_servers, e.errno, e.strerror))
-            sys.exit(1)
-
-        f.close()
-
-        print("Configuration for %s servers dumped to %s." %
-              (servers_dumped, args.dump_servers))
-
-
-def _validate_servers_data(data, is_admin):
-    """
-    Used internally by load_servers to validate servers data.
-    :param data: servers data
-    :return: error message if any
-    """
-    skip_servers = []
-    # Loop through the servers...
-    if "Servers" not in data:
-        return ("'Servers' attribute not found in file '%s'" %
-                args.load_servers)
-
-    for server in data["Servers"]:
-        obj = data["Servers"][server]
-
-        # Check if server is shared.Won't import if user is non-admin
-        if obj.get('Shared', None) and not is_admin:
-            print("Won't import the server '%s' as it is shared " %
-                  obj["Name"])
-            skip_servers.append(server)
-            continue
-
-        def check_attrib(attrib):
-            if attrib not in obj:
-                return ("'%s' attribute not found for server '%s'" %
-                        (attrib, server))
-            return None
-
-        for attrib in ("Group", "Name"):
-            errmsg = check_attrib(attrib)
-            if errmsg:
-                return errmsg
-
-        is_service_attrib_available = obj.get("Service", None) is not None
-
-        if not is_service_attrib_available:
-            for attrib in ("Port", "Username"):
-                errmsg = check_attrib(attrib)
-                if errmsg:
-                    return errmsg
-
-        for attrib in ("SSLMode", "MaintenanceDB"):
-            errmsg = check_attrib(attrib)
-            if errmsg:
-                return errmsg
-
-        if "Host" not in obj and "HostAddr" not in obj and not \
-                is_service_attrib_available:
-            return ("'Host', 'HostAddr' or 'Service' attribute "
-                    "not found for server '%s'" % server)
-
-    for server in skip_servers:
-        del data["Servers"][server]
-    return None


def load_servers(args):
@@ -232,143 +82,9 @@ def load_servers(args):
    print('SQLite pgAdmin config:', config.SQLITE_PATH)
    print('----------')

-    try:
-        with open(args.load_servers) as f:
-            data = json.load(f)
-    except json.decoder.JSONDecodeError as e:
-        print("Error parsing input file %s: %s" %
-              (args.load_servers, e))
-        sys.exit(1)
-    except Exception as e:
-        print("Error reading input file %s: [%d] %s" %
-              (args.load_servers, e.errno, e.strerror))
-        sys.exit(1)
-
-    f.close()
-
    app = create_app(config.APP_NAME + '-cli')
    with app.app_context():
-        user = User.query.filter_by(email=load_user).first()
+        load_database_servers(args.load_servers, None, load_user, True)
-
-        if user is None:
-            print(USER_NOT_FOUND % load_user)
-            sys.exit(1)
-
-        user_id = user.id
-
-        # Counters
-        groups_added = 0
-        servers_added = 0
-
-        # Get the server groups
-        groups = ServerGroup.query.filter_by(user_id=user_id)
-
-        def print_summary():
-            print("Added %d Server Group(s) and %d Server(s)." %
-                  (groups_added, servers_added))
-
-        err_msg = _validate_servers_data(data, user.has_role("Administrator"))
-        if err_msg is not None:
-            print(err_msg)
-            print_summary()
-            sys.exit(1)
-
-        for server in data["Servers"]:
-            obj = data["Servers"][server]
-
-            # Get the group. Create if necessary
-            group_id = next(
-                (g.id for g in groups if g.name == obj["Group"]), -1)
-
-            if group_id == -1:
-                new_group = ServerGroup()
-                new_group.name = obj["Group"]
-                new_group.user_id = user_id
-                db.session.add(new_group)
-
-                try:
-                    db.session.commit()
-                except Exception as e:
-                    print("Error creating server group '%s': %s" %
-                          (new_group.name, e))
-                    print_summary()
-                    sys.exit(1)
-
-                group_id = new_group.id
-                groups_added = groups_added + 1
-                groups = ServerGroup.query.filter_by(user_id=user_id)
-
-            # Create the server
-            new_server = Server()
-            new_server.name = obj["Name"]
-            new_server.servergroup_id = group_id
-            new_server.user_id = user_id
-            new_server.ssl_mode = obj["SSLMode"]
-            new_server.maintenance_db = obj["MaintenanceDB"]
-
-            new_server.host = obj.get("Host", None)
-
-            new_server.hostaddr = obj.get("HostAddr", None)
-
-            new_server.port = obj.get("Port", None)
-
-            new_server.username = obj.get("Username", None)
-
-            new_server.role = obj.get("Role", None)
-
-            new_server.ssl_mode = obj["SSLMode"]
-
-            new_server.comment = obj.get("Comment", None)
-
-            new_server.db_res = obj.get("DBRestriction", None)
-
-            new_server.passfile = obj.get("PassFile", None)
-
-            new_server.sslcert = obj.get("SSLCert", None)
-
-            new_server.sslkey = obj.get("SSLKey", None)
-
-            new_server.sslrootcert = obj.get("SSLRootCert", None)
-
-            new_server.sslcrl = obj.get("SSLCrl", None)
-
-            new_server.sslcompression = obj.get("SSLCompression", None)
-
-            new_server.bgcolor = obj.get("BGColor", None)
-
-            new_server.fgcolor = obj.get("FGColor", None)
-
-            new_server.service = obj.get("Service", None)
-
-            new_server.connect_timeout = obj.get("Timeout", None)
-
-            new_server.use_ssh_tunnel = obj.get("UseSSHTunnel", None)
-
-            new_server.tunnel_host = obj.get("TunnelHost", None)
-
-            new_server.tunnel_port = obj.get("TunnelPort", None)
-
-            new_server.tunnel_username = obj.get("TunnelUsername", None)
-
-            new_server.tunnel_authentication = \
-                obj.get("TunnelAuthentication", None)
-
-            new_server.shared = \
-                obj.get("Shared", None)
-
-            db.session.add(new_server)
-
-            try:
-                db.session.commit()
-            except Exception as e:
-                print("Error creating server '%s': %s" %
-                      (new_server.name, e))
-                print_summary()
-                sys.exit(1)
-
-            servers_added = servers_added + 1
-
-        print_summary()


def setup_db():
@@ -421,33 +137,7 @@ def clear_servers():

    app = create_app(config.APP_NAME + '-cli')
    with app.app_context():
-        user = User.query.filter_by(email=load_user).first()
+        clear_database_servers(load_user, True)
-
-        if user is None:
-            print(USER_NOT_FOUND % load_user)
-            sys.exit(1)
-
-        user_id = user.id
-
-        # Remove all servers
-        servers = Server.query.filter_by(user_id=user_id)
-        for server in servers:
-            db.session.delete(server)
-
-        # Remove all groups
-        groups = ServerGroup.query.filter_by(user_id=user_id)
-        for group in groups:
-            db.session.delete(group)
-            servers = Server.query.filter_by(user_id=user_id)
-
-            for server in servers:
-                db.session.delete(server)
-
-        try:
-            db.session.commit()
-        except Exception as e:
-            print("Error clearing server configuration with error (%s)" %
-                  str(e))


if __name__ == '__main__':
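With dump_servers(), load_servers() and clear_servers() reduced to one-line wrappers, the JSON handling, validation and ORM work now live in the shared pgadmin.utils helpers, which is what lets the new GUI wizard and the CLI share a single code path. A minimal sketch of driving those helpers directly, assuming only the call signatures visible in the hunks above (file path, optional server list, user email, trailing flag passed as True by setup.py); 'servers.json' is a placeholder filename:

# Sketch, not the shipped CLI: exercise the shared import/export helpers
# inside an application context, mirroring the calls shown in setup.py above.
import config
from pgadmin import create_app
from pgadmin.utils import (clear_database_servers, dump_database_servers,
                           load_database_servers)

app = create_app(config.APP_NAME + '-cli')
with app.app_context():
    # Export every server owned by the desktop-mode user to a JSON file.
    dump_database_servers('servers.json', None, config.DESKTOP_USER, True)

    # Optionally clear the existing definitions, then re-import the file.
    clear_database_servers(config.DESKTOP_USER, True)
    load_database_servers('servers.json', None, config.DESKTOP_USER, True)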
@@ -543,6 +543,7 @@ module.exports = [{
    'pure|pgadmin.tools.grant_wizard',
    'pure|pgadmin.tools.maintenance',
    'pure|pgadmin.tools.import_export',
+    'pure|pgadmin.tools.import_export_servers',
    'pure|pgadmin.tools.debugger.controller',
    'pure|pgadmin.tools.debugger.direct',
    'pure|pgadmin.node.pga_job',
@@ -295,6 +295,7 @@ var webpackShimConfig = {
  'pgadmin.tools.debugger.utils': path.join(__dirname, './pgadmin/tools/debugger/static/js/debugger_utils'),
  'pgadmin.tools.grant_wizard': path.join(__dirname, './pgadmin/tools/grant_wizard/static/js/grant_wizard'),
  'pgadmin.tools.import_export': path.join(__dirname, './pgadmin/tools/import_export/static/js/import_export'),
+  'pgadmin.tools.import_export_servers': path.join(__dirname, './pgadmin/tools/import_export_servers/static/js/'),
  'pgadmin.tools.maintenance': path.join(__dirname, './pgadmin/tools/maintenance/static/js/maintenance'),
  'pgadmin.tools.restore': path.join(__dirname, './pgadmin/tools/restore/static/js/restore'),
  'pgadmin.tools.schema_diff': path.join(__dirname, './pgadmin/tools/schema_diff/static/js/schema_diff'),
@@ -2865,7 +2865,7 @@ circular-json-es6@^2.0.1:
  resolved "https://registry.yarnpkg.com/circular-json-es6/-/circular-json-es6-2.0.2.tgz#e4f4a093e49fb4b6aba1157365746112a78bd344"
  integrity sha512-ODYONMMNb3p658Zv+Pp+/XPa5s6q7afhz3Tzyvo+VRh9WIrJ64J76ZC4GQxnlye/NesTn09jvOiuE8+xxfpwhQ==

-classnames@*, classnames@^2.2.6:
+classnames@*, classnames@^2.2.5, classnames@^2.2.6:
  version "2.3.1"
  resolved "https://registry.yarnpkg.com/classnames/-/classnames-2.3.1.tgz#dfcfa3891e306ec1dad105d0e88f4417b8535e8e"
  integrity sha512-OlQdbZ7gLfGarSqxesMesDa5uz7KFbID8Kpq/SxIoNGDqY8lSYs0D+hhtBXhcdB3rcbXArFr7vlHheLk1voeNA==
@@ -6646,6 +6646,11 @@ nanocolors@^0.1.12:
  resolved "https://registry.yarnpkg.com/nanocolors/-/nanocolors-0.1.12.tgz#8577482c58cbd7b5bb1681db4cf48f11a87fd5f6"
  integrity sha512-2nMHqg1x5PU+unxX7PGY7AuYxl2qDx7PSrTRjizr8sxdd3l/3hBuWWaki62qmtYm2U5i4Z5E7GbjlyDFhs9/EQ==

+nanoid@^3.0.0:
+  version "3.1.30"
+  resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.30.tgz#63f93cc548d2a113dc5dfbc63bfa09e2b9b64362"
+  integrity sha512-zJpuPDwOv8D2zq2WRoMe1HsfZthVewpel9CAvTfc/2mBD1uUT/agc5f7GHGWXlYkFvi1mVxe4IjvP2HNrop7nQ==
+
nanoid@^3.1.23:
  version "3.1.23"
  resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.23.tgz#f744086ce7c2bc47ee0a8472574d5c78e4183a81"
@@ -7763,6 +7768,16 @@ react-aspen@^1.1.0, react-aspen@^1.1.1:
    path-fx "^2.1.1"
    react-window "^1.3.1"

+react-checkbox-tree@^1.7.2:
+  version "1.7.2"
+  resolved "https://registry.yarnpkg.com/react-checkbox-tree/-/react-checkbox-tree-1.7.2.tgz#71cb5d22add293a92eb718de8425c8430b2db263"
+  integrity sha512-T0Y3Us2ds5QppOgIM/cSbtdrEBcCGkiz03o2p4elTireAIw0i5k5xPoaTxbjWTFmzgXajUrJzQMlBujEQhOUsQ==
+  dependencies:
+    classnames "^2.2.5"
+    lodash "^4.17.10"
+    nanoid "^3.0.0"
+    prop-types "^15.5.8"
+
react-dom@^16.6.3:
  version "16.14.0"
  resolved "https://registry.yarnpkg.com/react-dom/-/react-dom-16.14.0.tgz#7ad838ec29a777fb3c75c3a190f661cf92ab8b89"