1) Refactored the code of the columns node. Fixes #4938.

2) Fix an issue where length and precision are not removed from the table/column dialog. Fixes #4964.
3) Fix an issue where the Interval data type is not displayed in the properties dialog of a table/column. Fixes #4965.
4) Fix an issue where the wrong type is displayed when changing the data type from timestamp with time zone to timestamp without time zone. Fixes #4761.
5) Change the label from 'Length and Precision' to 'Length/Precision and Scale' for columns.
6) The maximum length for data types such as interval, timestamp with time zone, and time with time zone is 6, so set the maximum length to 6 instead of 10.
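The refactor in (1) replaces the view's stateful `_cltype_formatter` helper, which stashed the array flag on `self.hasSqrBracket`, with a shared utility that returns both values. A minimal sketch of the tuple-returning form, assuming (as the diff below suggests) the helper lives in `columns/utils.py`:

```python
def type_formatter(type_name):
    """Strip a trailing '[]' array specifier from a data type name.

    Returns the bare type plus a flag telling the SQL templates to
    re-append the square brackets after any length/precision part,
    e.g. to render 'numeric(15,6)[]'.
    """
    if '[]' in type_name:
        return type_name.replace('[]', ''), True
    return type_name, False
```

Returning a `(type, has_brackets)` tuple lets callers write `data['cltype'], data['hasSqrBracket'] = column_utils.type_formatter(data['cltype'])` instead of reading a flag mutated on the view instance.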
Akshay Joshi 2019-11-28 18:48:53 +05:30
parent 4ed2d74d9c
commit 4fa1bdba5a
32 changed files with 571 additions and 602 deletions


@@ -34,7 +34,7 @@ are disabled if inapplicable.)
   column. For more information on the data types that are supported by
   PostgreSQL, refer to Chapter 8 of the Postgres core documentation. This field
   is required.
-* Use the *Length* and *Precision* fields to specify the maximum number of
+* Use the *Length/Precision* and *Scale* fields to specify the maximum number of
   significant digits in a numeric value, or the maximum number of characters in
   a text value.
 * Use the drop-down listbox next to *Collation* to apply a collation setting to
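The distinction the relabeled fields draw is the one PostgreSQL's `numeric` type makes: precision is the total count of significant digits, scale the count after the decimal point. A quick illustration using Python's `decimal` module as a stand-in for the scale part:

```python
from decimal import Decimal

# numeric(15,6): up to 15 significant digits, 6 after the decimal point.
# quantize() mimics the scale: round the value to 6 fractional digits.
value = Decimal('12345.6789012345')
scaled = value.quantize(Decimal('0.000001'))
assert str(scaled) == '12345.678901'
```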

Binary image file changed (not shown): 43 KiB → 50 KiB.

docs/en_US/images/table_columns.png (Executable file → Normal file): binary image changed (not shown), 44 KiB → 63 KiB.


@@ -15,6 +15,7 @@ Housekeeping
 | `Issue #4696 <https://redmine.postgresql.org/issues/4696>`_ - Add Reverse Engineered and Modified SQL tests for Materialized Views.
 | `Issue #4807 <https://redmine.postgresql.org/issues/4807>`_ - Refactored code of table and it's child nodes.
+| `Issue #4938 <https://redmine.postgresql.org/issues/4938>`_ - Refactored code of columns node.

 Bug fixes
 *********
@@ -22,6 +23,7 @@ Bug fixes
 | `Issue #3538 <https://redmine.postgresql.org/issues/3538>`_ - Fix issue where the Reset button does not get enabled till all the mandatory fields are provided in the dialog.
 | `Issue #4659 <https://redmine.postgresql.org/issues/4659>`_ - Updated documentation for default privileges to clarify more on the grantor.
 | `Issue #4724 <https://redmine.postgresql.org/issues/4724>`_ - Fix network disconnect issue while establishing the connection via SSH Tunnel and it impossible to expand the Servers node.
+| `Issue #4761 <https://redmine.postgresql.org/issues/4761>`_ - Fix an issue where the wrong type is displayed when changing the datatype from timestamp with time zone to timestamp without time zone.
 | `Issue #4792 <https://redmine.postgresql.org/issues/4792>`_ - Ensure that the superuser should be able to create database, as the superuser overrides all the access restrictions.
 | `Issue #4818 <https://redmine.postgresql.org/issues/4818>`_ - Fix server connection drops out issue in query tool.
 | `Issue #4836 <https://redmine.postgresql.org/issues/4836>`_ - Updated the json file name from 'servers.json' to 'pgadmin4/servers.json' in the container deployment section of the documentation.
@@ -32,3 +34,5 @@ Bug fixes
 | `Issue #4933 <https://redmine.postgresql.org/issues/4933>`_ - Ensure that the Servers collection node should expand independently of server connections.
 | `Issue #4934 <https://redmine.postgresql.org/issues/4934>`_ - Fix the help button link on the User Management dialog.
 | `Issue #4935 <https://redmine.postgresql.org/issues/4935>`_ - Fix accessibility issues.
+| `Issue #4964 <https://redmine.postgresql.org/issues/4964>`_ - Fix an issue where length and precision are not removed from table/column dialog.
+| `Issue #4965 <https://redmine.postgresql.org/issues/4965>`_ - Fix an issue where the Interval data type is not displayed in the properties dialog of table/column.


@@ -50,9 +50,9 @@ the *Columns* table:
   the column. This can include array specifiers. For more information on the
   data types supported by PostgreSQL, refer to Chapter 8 of the core
   documentation.
-* If enabled, use the *Length* and *Precision* fields to specify the maximum
-  number of significant digits in a numeric value, or the maximum number of
-  characters in a text value.
+* If enabled, use the *Length/Precision* and *Scale* fields to specify the
+  maximum number of significant digits in a numeric value, or the maximum
+  number of characters in a text value.
 * Move the *Not NULL?* switch to the *Yes* position to require a value in the
   column field.
 * Move the *Primary key?* switch to the *Yes* position to specify the column is


@@ -24,6 +24,8 @@ from .utils import BaseTableView
 from pgadmin.utils.preferences import Preferences
 from pgadmin.browser.server_groups.servers.databases.schemas.tables.\
     constraints.foreign_key import utils as fkey_utils
+from pgadmin.browser.server_groups.servers.databases.schemas.tables.\
+    columns import utils as column_utils


 class TableModule(SchemaChildModule):
@@ -210,10 +212,6 @@ class TableView(BaseTableView, DataTypeReader, VacuumSettings):
     * get_toast_table_vacuum(gid, sid, did, scid=None, tid=None)
       - Fetch the default values for toast table auto-vacuum

-    * _parse_format_columns(self, data, mode=None):
-      - This function will parse and return formatted list of columns
-        added by user
-
     * get_index_constraint_sql(self, did, tid, data):
       - This function will generate modified sql for index constraints
         (Primary Key & Unique)
@@ -886,75 +884,6 @@ class TableView(BaseTableView, DataTypeReader, VacuumSettings):
         except Exception as e:
             return internal_server_error(errormsg=str(e))

-    def _parse_format_columns(self, data, mode=None):
-        """
-        data:
-            Data coming from client side
-
-        Returns:
-            This function will parse and return formatted list of columns
-            added by user
-        """
-        columns = data['columns']
-        # 'EDIT' mode
-        if mode is not None:
-            for action in ['added', 'changed']:
-                if action in columns:
-                    final_columns = []
-                    for c in columns[action]:
-                        if 'inheritedfrom' not in c:
-                            final_columns.append(c)
-
-                    for c in final_columns:
-                        if 'attacl' in c:
-                            if 'added' in c['attacl']:
-                                c['attacl']['added'] = parse_priv_to_db(
-                                    c['attacl']['added'], self.column_acl
-                                )
-                            elif 'changed' in c['attacl']:
-                                c['attacl']['changed'] = parse_priv_to_db(
-                                    c['attacl']['changed'], self.column_acl
-                                )
-                            elif 'deleted' in c['attacl']:
-                                c['attacl']['deleted'] = parse_priv_to_db(
-                                    c['attacl']['deleted'], self.column_acl
-                                )
-
-                        if 'cltype' in c:
-                            # check type for '[]' in it
-                            c['cltype'], c['hasSqrBracket'] = \
-                                self._cltype_formatter(c['cltype'])
-
-                        c = TableView.convert_length_precision_to_string(c)
-
-                    data['columns'][action] = final_columns
-        else:
-            # We need to exclude all the columns which are inherited from other
-            # tables 'CREATE' mode
-            final_columns = []
-
-            for c in columns:
-                if 'inheritedfrom' not in c:
-                    final_columns.append(c)
-
-            # Now we have all lis of columns which we need
-            # to include in our create definition, Let's format them
-            for c in final_columns:
-                if 'attacl' in c:
-                    c['attacl'] = parse_priv_to_db(
-                        c['attacl'], self.column_acl
-                    )
-
-                if 'cltype' in c:
-                    # check type for '[]' in it
-                    c['cltype'], c['hasSqrBracket'] = \
-                        self._cltype_formatter(c['cltype'])
-
-                c = TableView.convert_length_precision_to_string(c)
-
-            data['columns'] = final_columns
-
-        return data
-
     @BaseTableView.check_precondition
     def create(self, gid, sid, did, scid):
         """
@@ -1000,7 +929,7 @@ class TableView(BaseTableView, DataTypeReader, VacuumSettings):
             data['relacl'] = parse_priv_to_db(data['relacl'], self.acl)

         # Parse & format columns
-        data = self._parse_format_columns(data)
+        data = column_utils.parse_format_columns(data)
         data = TableView.check_and_convert_name_to_string(data)

         # 'coll_inherits' is Array but it comes as string from browser
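The column-parsing logic moved out of `TableView` above hinges on one filtering rule: columns inherited from parent tables must not be repeated in the generated DDL. A minimal sketch of that step (the helper name here is illustrative, not the actual utility):

```python
def filter_inherited_columns(columns):
    """Keep only the columns added directly by the user.

    Columns carrying an 'inheritedfrom' key come from a parent table,
    so they must be excluded from the CREATE/ALTER definition.
    """
    return [c for c in columns if 'inheritedfrom' not in c]
```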


@@ -23,6 +23,8 @@ from pgadmin.browser.server_groups.servers.utils import parse_priv_from_db, \
 from pgadmin.browser.utils import PGChildNodeView
 from pgadmin.utils.ajax import make_json_response, internal_server_error, \
     make_response as ajax_response, gone
+from pgadmin.browser.server_groups.servers.databases.schemas.tables.\
+    columns import utils as column_utils
 from pgadmin.utils.driver import get_driver
 from config import PG_DEFAULT_DRIVER
 from pgadmin.utils import IS_PY2
@@ -215,16 +217,9 @@ class ColumnsView(PGChildNodeView, DataTypeReader):
             self.acl = ['a', 'r', 'w', 'x']

             # We need parent's name eg table name and schema name
-            SQL = render_template("/".join([self.template_path,
-                                            'get_parent.sql']),
-                                  tid=kwargs['tid'])
-            status, rset = self.conn.execute_2darray(SQL)
-            if not status:
-                return internal_server_error(errormsg=rset)
-
-            for row in rset['rows']:
-                self.schema = row['schema']
-                self.table = row['table']
+            schema, table = column_utils.get_parent(self.conn, kwargs['tid'])
+            self.schema = schema
+            self.table = table

             return f(*args, **kwargs)
@@ -319,138 +314,6 @@ class ColumnsView(PGChildNodeView, DataTypeReader):
             status=200
         )

-    def _formatter(self, scid, tid, clid, data):
-        """
-        Args:
-            scid: schema oid
-            tid: table oid
-            clid: position of column in table
-            data: dict of query result
-
-        Returns:
-            It will return formatted output of collections
-        """
-        # To check if column is primary key
-        if 'attnum' in data and 'indkey' in data:
-            # Current column
-            attnum = str(data['attnum'])
-
-            # Single/List of primary key column(s)
-            indkey = str(data['indkey'])
-
-            # We will check if column is in primary column(s)
-            if attnum in indkey.split(" "):
-                data['is_pk'] = True
-            else:
-                data['is_pk'] = False
-
-        # Find length & precision of column data type
-        fulltype = self.get_full_type(
-            data['typnspname'], data['typname'],
-            data['isdup'], data['attndims'], data['atttypmod']
-        )
-
-        length = False
-        precision = False
-        if 'elemoid' in data:
-            length, precision, typeval = \
-                self.get_length_precision(data['elemoid'])
-
-        # Set length and precision to None
-        data['attlen'] = None
-        data['attprecision'] = None
-
-        import re
-
-        # If we have length & precision both
-        if length and precision:
-            matchObj = re.search(r'(\d+),(\d+)', fulltype)
-            if matchObj:
-                data['attlen'] = matchObj.group(1)
-                data['attprecision'] = matchObj.group(2)
-        elif length:
-            # If we have length only
-            matchObj = re.search(r'(\d+)', fulltype)
-            if matchObj:
-                data['attlen'] = matchObj.group(1)
-                data['attprecision'] = None
-
-        # We need to fetch inherited tables for each table
-        SQL = render_template("/".join([self.template_path,
-                                        'get_inherited_tables.sql']),
-                              tid=tid)
-        status, inh_res = self.conn.execute_dict(SQL)
-        if not status:
-            return internal_server_error(errormsg=inh_res)
-
-        for row in inh_res['rows']:
-            if row['attrname'] == data['name']:
-                data['is_inherited'] = True
-                data['tbls_inherited'] = row['inhrelname']
-
-        # We need to format variables according to client js collection
-        if 'attoptions' in data and data['attoptions'] is not None:
-            spcoptions = []
-            for spcoption in data['attoptions']:
-                k, v = spcoption.split('=')
-                spcoptions.append({'name': k, 'value': v})
-
-            data['attoptions'] = spcoptions
-
-        # Need to format security labels according to client js collection
-        if 'seclabels' in data and data['seclabels'] is not None:
-            seclabels = []
-            for seclbls in data['seclabels']:
-                k, v = seclbls.split('=')
-                seclabels.append({'provider': k, 'label': v})
-
-            data['seclabels'] = seclabels
-
-        # We need to parse & convert ACL coming from database to json format
-        SQL = render_template("/".join([self.template_path, 'acl.sql']),
-                              tid=tid, clid=clid)
-        status, acl = self.conn.execute_dict(SQL)
-        if not status:
-            return internal_server_error(errormsg=acl)
-
-        # We will set get privileges from acl sql so we don't need
-        # it from properties sql
-        data['attacl'] = []
-
-        for row in acl['rows']:
-            priv = parse_priv_from_db(row)
-            data.setdefault(row['deftype'], []).append(priv)
-
-        # we are receiving request when in edit mode
-        # we will send filtered types related to current type
-        type_id = data['atttypid']
-
-        SQL = render_template("/".join([self.template_path,
-                                        'is_referenced.sql']),
-                              tid=tid, clid=clid)
-        status, is_reference = self.conn.execute_scalar(SQL)
-
-        edit_types_list = list()
-        # We will need present type in edit mode
-        edit_types_list.append(data['cltype'])
-
-        if int(is_reference) == 0:
-            SQL = render_template(
-                "/".join([self.template_path, 'edit_mode_types.sql']),
-                type_id=type_id
-            )
-            status, rset = self.conn.execute_2darray(SQL)
-
-            for row in rset['rows']:
-                edit_types_list.append(row['typname'])
-
-        data['edit_types'] = edit_types_list
-
-        data['cltype'] = DataTypeReader.parse_type_name(data['cltype'])
-
-        return data
-
     @check_precondition
     def properties(self, gid, sid, did, scid, tid, clid):
         """
@@ -485,61 +348,13 @@ class ColumnsView(PGChildNodeView, DataTypeReader):
         # Making copy of output for future use
         data = dict(res['rows'][0])

-        data = self._formatter(scid, tid, clid, data)
+        data = column_utils.column_formatter(self.conn, tid, clid, data)

         return ajax_response(
             response=data,
             status=200
         )

-    def _cltype_formatter(self, type):
-        """
-        Args:
-            data: Type string
-
-        Returns:
-            We need to remove [] from type and append it
-            after length/precision so we will set flag for
-            sql template
-        """
-        if '[]' in type:
-            type = type.replace('[]', '')
-            self.hasSqrBracket = True
-        else:
-            self.hasSqrBracket = False
-
-        return type
-
-    @staticmethod
-    def convert_length_precision_to_string(data):
-        """
-        This function is used to convert length & precision to string
-        to handle case like when user gives 0 as length
-
-        Args:
-            data: Data from client
-
-        Returns:
-            Converted data
-        """
-        # We need to handle the below case because jquery has changed
-        # undefined/null values to empty strings
-        # https://github.com/jquery/jquery/commit/36d2d9ae937f626d98319ed850905e8d1cbfd078
-        if 'attlen' in data and data['attlen'] == '':
-            data['attlen'] = None
-        elif 'attlen' in data and data['attlen'] is not None:
-            data['attlen'] = str(data['attlen'])
-
-        if 'attprecision' in data and data['attprecision'] == '':
-            data['attprecision'] = None
-        elif 'attprecision' in data and data['attprecision'] is not None:
-            data['attprecision'] = str(data['attprecision'])
-
-        return data
-
     @check_precondition
     def create(self, gid, sid, did, scid, tid):
         """
@@ -590,9 +405,9 @@ class ColumnsView(PGChildNodeView, DataTypeReader):
         data['table'] = self.table

         # check type for '[]' in it
-        data['cltype'] = self._cltype_formatter(data['cltype'])
-        data['hasSqrBracket'] = self.hasSqrBracket
-        data = self.convert_length_precision_to_string(data)
+        data['cltype'], data['hasSqrBracket'] = \
+            column_utils.type_formatter(data['cltype'])
+        data = column_utils.convert_length_precision_to_string(data)

         SQL = render_template("/".join([self.template_path,
                                         'create.sql']),
@@ -711,8 +526,8 @@ class ColumnsView(PGChildNodeView, DataTypeReader):
         # check type for '[]' in it
         if 'cltype' in data:
-            data['cltype'] = self._cltype_formatter(data['cltype'])
-            data['hasSqrBracket'] = self.hasSqrBracket
+            data['cltype'], data['hasSqrBracket'] = \
+                column_utils.type_formatter(data['cltype'])

         SQL, name = self.get_sql(scid, tid, clid, data)
         if not isinstance(SQL, (str, unicode)):
@@ -754,8 +569,8 @@ class ColumnsView(PGChildNodeView, DataTypeReader):
         # check type for '[]' in it
         if 'cltype' in data:
-            data['cltype'] = self._cltype_formatter(data['cltype'])
-            data['hasSqrBracket'] = self.hasSqrBracket
+            data['cltype'], data['hasSqrBracket'] = \
+                column_utils.type_formatter(data['cltype'])

         try:
             SQL, name = self.get_sql(scid, tid, clid, data)
@@ -776,7 +591,7 @@ class ColumnsView(PGChildNodeView, DataTypeReader):
         """
         This function will genrate sql from model data
        """
-        data = self.convert_length_precision_to_string(data)
+        data = column_utils.convert_length_precision_to_string(data)

         if clid is not None:
             SQL = render_template(
@@ -794,12 +609,13 @@ class ColumnsView(PGChildNodeView, DataTypeReader):
             )

             old_data = dict(res['rows'][0])
             # We will add table & schema as well
-            old_data = self._formatter(scid, tid, clid, old_data)
+            old_data = column_utils.column_formatter(
+                self.conn, tid, clid, old_data)

             # check type for '[]' in it
             if 'cltype' in old_data:
-                old_data['cltype'] = self._cltype_formatter(old_data['cltype'])
-                old_data['hasSqrBracket'] = self.hasSqrBracket
+                old_data['cltype'], old_data['hasSqrBracket'] = \
+                    column_utils.type_formatter(old_data['cltype'])

             if 'cltype' in data and data['cltype'] != old_data['cltype']:
                 length, precision, typeval = \
@@ -897,11 +713,11 @@ class ColumnsView(PGChildNodeView, DataTypeReader):
             data['table'] = self.table
             # check type for '[]' in it
             if 'cltype' in data:
-                data['cltype'] = self._cltype_formatter(data['cltype'])
-                data['hasSqrBracket'] = self.hasSqrBracket
+                data['cltype'], data['hasSqrBracket'] = \
+                    column_utils.type_formatter(data['cltype'])

             # We will add table & schema as well
-            data = self._formatter(scid, tid, clid, data)
+            data = column_utils.column_formatter(self.conn, tid, clid, data)

             SQL, name = self.get_sql(scid, tid, None, data, is_sql=True)
             if not isinstance(SQL, (str, unicode)):


@@ -176,8 +176,10 @@ define('pgadmin.node.column', [
        attacl: undefined,
        description: undefined,
        parent_tbl: undefined,
-       min_val: undefined,
-       max_val: undefined,
+       min_val_attlen: undefined,
+       min_val_attprecision: undefined,
+       max_val_attlen: undefined,
+       max_val_attprecision: undefined,
        edit_types: undefined,
        is_primary_key: false,
        inheritedfrom: undefined,
@@ -348,7 +350,7 @@ define('pgadmin.node.column', [
          return _.isUndefined(m.top.node_info['table'] || m.top.node_info['view'] || m.top.node_info['mview']);
        },
      },{
-       id: 'attlen', label: gettext('Length'), cell: IntegerDepCell,
+       id: 'attlen', label: gettext('Length/Precision'), cell: IntegerDepCell,
        deps: ['cltype'], type: 'int', group: gettext('Definition'), cellHeaderClasses:'width_percent_20',
        disabled: function(m) {
          var of_type = m.get('cltype'),
@@ -357,8 +359,8 @@ define('pgadmin.node.column', [
            if ( of_type == o.value ) {
              if(o.length)
              {
-               m.set('min_val', o.min_val, {silent: true});
-               m.set('max_val', o.max_val, {silent: true});
+               m.set('min_val_attlen', o.min_val, {silent: true});
+               m.set('max_val_attlen', o.max_val, {silent: true});
                flag = false;
              }
            }
@@ -388,8 +390,8 @@ define('pgadmin.node.column', [
          _.each(m.datatypes, function(o) {
            if ( of_type == o.value ) {
              if(o.length) {
-               m.set('min_val', o.min_val, {silent: true});
-               m.set('max_val', o.max_val, {silent: true});
+               m.set('min_val_attlen', o.min_val, {silent: true});
+               m.set('max_val_attlen', o.max_val, {silent: true});
                flag = true;
              }
            }
@@ -404,7 +406,7 @@ define('pgadmin.node.column', [
          return flag;
        },
      },{
-       id: 'attprecision', label: gettext('Precision'), cell: IntegerDepCell,
+       id: 'attprecision', label: gettext('Scale'), cell: IntegerDepCell,
        deps: ['cltype'], type: 'int', group: gettext('Definition'), cellHeaderClasses:'width_percent_20',
        disabled: function(m) {
          var of_type = m.get('cltype'),
@@ -412,8 +414,8 @@ define('pgadmin.node.column', [
          _.each(m.datatypes, function(o) {
            if ( of_type == o.value ) {
              if(o.precision) {
-               m.set('min_val', 0, {silent: true});
-               m.set('max_val', o.max_val, {silent: true});
+               m.set('min_val_attprecision', 0, {silent: true});
+               m.set('max_val_attprecision', o.max_val, {silent: true});
                flag = false;
              }
            }
@@ -442,8 +444,8 @@ define('pgadmin.node.column', [
          _.each(m.datatypes, function(o) {
            if ( of_type == o.value ) {
              if(o.precision) {
-               m.set('min_val', 0, {silent: true});
-               m.set('max_val', o.max_val, {silent: true});
+               m.set('min_val_attprecision', 0, {silent: true});
+               m.set('max_val_attprecision', o.max_val, {silent: true});
                flag = true;
              }
            }
@@ -722,10 +724,10 @@ define('pgadmin.node.column', [
          && !_.isNull(this.get('attlen'))
          && this.get('attlen') !== '') {
          // Validation for Length field
-         if (this.get('attlen') < this.get('min_val'))
-           msg = gettext('Length should not be less than: ') + this.get('min_val');
-         if (this.get('attlen') > this.get('max_val'))
-           msg = gettext('Length should not be greater than: ') + this.get('max_val');
+         if (this.get('attlen') < this.get('min_val_attlen'))
+           msg = gettext('Length/Precision should not be less than: ') + this.get('min_val_attlen');
+         if (this.get('attlen') > this.get('max_val_attlen'))
+           msg = gettext('Length/Precision should not be greater than: ') + this.get('max_val_attlen');
          // If we have any error set then throw it to user
          if(msg) {
            this.errorModel.set('attlen', msg);
@@ -738,10 +740,10 @@ define('pgadmin.node.column', [
          && !_.isNull(this.get('attprecision'))
          && this.get('attprecision') !== '') {
          // Validation for precision field
-         if (this.get('attprecision') < this.get('min_val'))
-           msg = gettext('Precision should not be less than: ') + this.get('min_val');
-         if (this.get('attprecision') > this.get('max_val'))
-           msg = gettext('Precision should not be greater than: ') + this.get('max_val');
+         if (this.get('attprecision') < this.get('min_val_attprecision'))
+           msg = gettext('Scale should not be less than: ') + this.get('min_val_attprecision');
+         if (this.get('attprecision') > this.get('max_val_attprecision'))
+           msg = gettext('Scale should not be greater than: ') + this.get('max_val_attprecision');
          // If we have any error set then throw it to user
          if(msg) {
            this.errorModel.set('attprecision', msg);
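The model change above splits the single `min_val`/`max_val` pair into per-field bounds, so the length and scale spinners no longer overwrite each other's limits when both are enabled (e.g. for `numeric`). A rough Python sketch of the resulting validation rule (the helper and its `limits` dict are illustrative; field names mirror the model attributes):

```python
def validate_length_scale(attlen, attprecision, limits):
    """Check length/precision and scale against their own bounds.

    limits holds a separate min/max pair per field, mirroring the
    min_val_attlen / max_val_attprecision model attributes, so one
    field's datatype bounds cannot clobber the other's.
    Returns a dict of error messages keyed by field name.
    """
    errors = {}
    if attlen is not None:
        if attlen < limits['min_val_attlen']:
            errors['attlen'] = ('Length/Precision should not be less than: %s'
                                % limits['min_val_attlen'])
        elif attlen > limits['max_val_attlen']:
            errors['attlen'] = ('Length/Precision should not be greater '
                                'than: %s' % limits['max_val_attlen'])
    if attprecision is not None:
        if attprecision < limits['min_val_attprecision']:
            errors['attprecision'] = ('Scale should not be less than: %s'
                                      % limits['min_val_attprecision'])
        elif attprecision > limits['max_val_attprecision']:
            errors['attprecision'] = ('Scale should not be greater than: %s'
                                      % limits['max_val_attprecision'])
    return errors
```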


@@ -3,7 +3,7 @@
 -- ALTER TABLE testschema."table_2_$%{}[]()&*^!@""'`\/#" DROP COLUMN "new_col_2_$%{}[]()&*^!@""'`\/#";

 ALTER TABLE testschema."table_2_$%{}[]()&*^!@""'`\/#"
-    ADD COLUMN "new_col_2_$%{}[]()&*^!@""'`\/#" character(50) COLLATE pg_catalog."C";
+    ADD COLUMN "new_col_2_$%{}[]()&*^!@""'`\/#" character(1) COLLATE pg_catalog."C";

 COMMENT ON COLUMN testschema."table_2_$%{}[]()&*^!@""'`\/#"."new_col_2_$%{}[]()&*^!@""'`\/#"
     IS 'Comment for alter';


@@ -3,7 +3,7 @@
 -- ALTER TABLE testschema."table_2_$%{}[]()&*^!@""'`\/#" DROP COLUMN "new_col_4_$%{}[]()&*^!@""'`\/#";

 ALTER TABLE testschema."table_2_$%{}[]()&*^!@""'`\/#"
-    ADD COLUMN "new_col_4_$%{}[]()&*^!@""'`\/#" numeric(15,0) NOT NULL;
+    ADD COLUMN "new_col_4_$%{}[]()&*^!@""'`\/#" numeric(15,6) NOT NULL;

 COMMENT ON COLUMN testschema."table_2_$%{}[]()&*^!@""'`\/#"."new_col_4_$%{}[]()&*^!@""'`\/#"
     IS 'Comment for alter';


@@ -0,0 +1,15 @@
+-- Column: testschema."table_2_$%{}[]()&*^!@""'`\/#"."new_col_4_$%{}[]()&*^!@""'`\/#"
+
+-- ALTER TABLE testschema."table_2_$%{}[]()&*^!@""'`\/#" DROP COLUMN "new_col_4_$%{}[]()&*^!@""'`\/#";
+
+ALTER TABLE testschema."table_2_$%{}[]()&*^!@""'`\/#"
+    ADD COLUMN "new_col_4_$%{}[]()&*^!@""'`\/#" numeric NOT NULL;
+
+COMMENT ON COLUMN testschema."table_2_$%{}[]()&*^!@""'`\/#"."new_col_4_$%{}[]()&*^!@""'`\/#"
+    IS 'Comment for alter';
+
+ALTER TABLE testschema."table_2_$%{}[]()&*^!@""'`\/#"
+    ALTER COLUMN "new_col_4_$%{}[]()&*^!@""'`\/#"
+    SET (n_distinct=1);
+
+GRANT ALL("new_col_4_$%{}[]()&*^!@""'`\/#") ON testschema."table_2_$%{}[]()&*^!@""'`\/#" TO PUBLIC;
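This expected file works because an empty-string `attlen` is normalized to `None` before SQL generation (jQuery serialises null/undefined values as `''` on the wire), which makes the template emit a bare `numeric` with no typmod. A sketch of that normalization, mirroring `convert_length_precision_to_string` from the diff above:

```python
def normalize_length_precision(data):
    """Normalize length/precision values coming from the client.

    '' (jQuery's stand-in for null/undefined) becomes None, so the SQL
    template drops the typmod entirely; any other non-None value is
    stringified so that a legitimate 0 survives truthiness checks.
    """
    for key in ('attlen', 'attprecision'):
        if key not in data:
            continue
        if data[key] == '':
            data[key] = None
        elif data[key] is not None:
            data[key] = str(data[key])
    return data
```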


@@ -195,12 +195,23 @@
      "name": "new_col_4_$%{}[]()&*^!@\"'`\\/#",
      "attnum": 4,
      "attlen":"15",
-     "attprecision":"0",
+     "attprecision":"6",
      "description": "Comment for alter",
      "attacl":{"added":[{"grantee":"PUBLIC","grantor":"postgres","privileges":[{"privilege_type":"a","privilege":true,"with_grant":false},{"privilege_type":"r","privilege":true,"with_grant":false},{"privilege_type":"w","privilege":true,"with_grant":false},{"privilege_type":"x","privilege":true,"with_grant":false}]}]}
     },
     "expected_sql_file": "alter_column_numeric.sql"
    },
+   {
+    "type": "alter",
+    "name": "Alter Column (Remove Length)",
+    "endpoint": "NODE-column.obj_id",
+    "sql_endpoint": "NODE-column.sql_id",
+    "data": {
+     "attnum": 3,
+     "attlen":""
+    },
+    "expected_sql_file": "alter_column_remove_length.sql"
+   },
    {
     "type": "delete",
     "name": "Drop Column (Numeric type with Length Precision & Variables)",


@@ -3,7 +3,7 @@
 -- ALTER TABLE testschema."table_3_$%{}[]()&*^!@""'`\/#" DROP COLUMN "new_col_2_$%{}[]()&*^!@""'`\/#";

 ALTER TABLE testschema."table_3_$%{}[]()&*^!@""'`\/#"
-    ADD COLUMN "new_col_2_$%{}[]()&*^!@""'`\/#" character(50) COLLATE pg_catalog."C";
+    ADD COLUMN "new_col_2_$%{}[]()&*^!@""'`\/#" character(1) COLLATE pg_catalog."C";

 COMMENT ON COLUMN testschema."table_3_$%{}[]()&*^!@""'`\/#"."new_col_2_$%{}[]()&*^!@""'`\/#"
     IS 'Comment for alter';


@@ -3,7 +3,7 @@
 -- ALTER TABLE testschema."table_3_$%{}[]()&*^!@""'`\/#" DROP COLUMN "new_col_5_$%{}[]()&*^!@""'`\/#";

 ALTER TABLE testschema."table_3_$%{}[]()&*^!@""'`\/#"
-    ADD COLUMN "new_col_5_$%{}[]()&*^!@""'`\/#" numeric(15,0) NOT NULL;
+    ADD COLUMN "new_col_5_$%{}[]()&*^!@""'`\/#" numeric(15,6) NOT NULL;

 COMMENT ON COLUMN testschema."table_3_$%{}[]()&*^!@""'`\/#"."new_col_5_$%{}[]()&*^!@""'`\/#"
     IS 'Comment for alter';


@@ -0,0 +1,15 @@
+-- Column: testschema."table_3_$%{}[]()&*^!@""'`\/#"."new_col_5_$%{}[]()&*^!@""'`\/#"
+
+-- ALTER TABLE testschema."table_3_$%{}[]()&*^!@""'`\/#" DROP COLUMN "new_col_5_$%{}[]()&*^!@""'`\/#";
+
+ALTER TABLE testschema."table_3_$%{}[]()&*^!@""'`\/#"
+    ADD COLUMN "new_col_5_$%{}[]()&*^!@""'`\/#" numeric NOT NULL;
+
+COMMENT ON COLUMN testschema."table_3_$%{}[]()&*^!@""'`\/#"."new_col_5_$%{}[]()&*^!@""'`\/#"
+    IS 'Comment for alter';
+
+ALTER TABLE testschema."table_3_$%{}[]()&*^!@""'`\/#"
+    ALTER COLUMN "new_col_5_$%{}[]()&*^!@""'`\/#"
+    SET (n_distinct=1);
+
+GRANT ALL("new_col_5_$%{}[]()&*^!@""'`\/#") ON testschema."table_3_$%{}[]()&*^!@""'`\/#" TO PUBLIC;


@@ -238,12 +238,23 @@
"name": "new_col_5_$%{}[]()&*^!@\"'`\\/#",
"attnum": 5,
"attlen":"15",
-"attprecision":"0",
+"attprecision":"6",
"description": "Comment for alter",
"attacl":{"added":[{"grantee":"PUBLIC","grantor":"postgres","privileges":[{"privilege_type":"a","privilege":true,"with_grant":false},{"privilege_type":"r","privilege":true,"with_grant":false},{"privilege_type":"w","privilege":true,"with_grant":false},{"privilege_type":"x","privilege":true,"with_grant":false}]}]}
},
"expected_sql_file": "alter_column_numeric.sql"
},
{
"type": "alter",
"name": "Alter Column (Remove Length)",
"endpoint": "NODE-column.obj_id",
"sql_endpoint": "NODE-column.sql_id",
"data": {
"attnum": 3,
"attlen":""
},
"expected_sql_file": "alter_column_remove_length.sql"
},
{
"type": "delete",
"name": "Drop Column (Numeric type with Length Precision & Variables)",


@@ -3,7 +3,7 @@
-- ALTER TABLE testschema."table_1_$%{}[]()&*^!@""'`\/#" DROP COLUMN "new_col_2_$%{}[]()&*^!@""'`\/#";
ALTER TABLE testschema."table_1_$%{}[]()&*^!@""'`\/#"
-    ADD COLUMN "new_col_2_$%{}[]()&*^!@""'`\/#" character(50) COLLATE pg_catalog."C";
+    ADD COLUMN "new_col_2_$%{}[]()&*^!@""'`\/#" character(1) COLLATE pg_catalog."C";
COMMENT ON COLUMN testschema."table_1_$%{}[]()&*^!@""'`\/#"."new_col_2_$%{}[]()&*^!@""'`\/#"
    IS 'Comment for alter';


@@ -3,7 +3,7 @@
-- ALTER TABLE testschema."table_1_$%{}[]()&*^!@""'`\/#" DROP COLUMN "new_col_3_$%{}[]()&*^!@""'`\/#";
ALTER TABLE testschema."table_1_$%{}[]()&*^!@""'`\/#"
-    ADD COLUMN "new_col_3_$%{}[]()&*^!@""'`\/#" numeric(15,0) NOT NULL;
+    ADD COLUMN "new_col_3_$%{}[]()&*^!@""'`\/#" numeric(15,6) NOT NULL;
COMMENT ON COLUMN testschema."table_1_$%{}[]()&*^!@""'`\/#"."new_col_3_$%{}[]()&*^!@""'`\/#"
    IS 'Comment for alter';


@@ -0,0 +1,15 @@
-- Column: testschema."table_1_$%{}[]()&*^!@""'`\/#"."new_col_3_$%{}[]()&*^!@""'`\/#"
-- ALTER TABLE testschema."table_1_$%{}[]()&*^!@""'`\/#" DROP COLUMN "new_col_3_$%{}[]()&*^!@""'`\/#";
ALTER TABLE testschema."table_1_$%{}[]()&*^!@""'`\/#"
ADD COLUMN "new_col_3_$%{}[]()&*^!@""'`\/#" numeric NOT NULL;
COMMENT ON COLUMN testschema."table_1_$%{}[]()&*^!@""'`\/#"."new_col_3_$%{}[]()&*^!@""'`\/#"
IS 'Comment for alter';
ALTER TABLE testschema."table_1_$%{}[]()&*^!@""'`\/#"
ALTER COLUMN "new_col_3_$%{}[]()&*^!@""'`\/#"
SET (n_distinct=1);
GRANT ALL("new_col_3_$%{}[]()&*^!@""'`\/#") ON testschema."table_1_$%{}[]()&*^!@""'`\/#" TO PUBLIC;


@@ -135,12 +135,23 @@
"name": "new_col_3_$%{}[]()&*^!@\"'`\\/#",
"attnum": 3,
"attlen":"15",
-"attprecision":"0",
+"attprecision":"6",
"description": "Comment for alter",
"attacl":{"added":[{"grantee":"PUBLIC","grantor":"postgres","privileges":[{"privilege_type":"a","privilege":true,"with_grant":false},{"privilege_type":"r","privilege":true,"with_grant":false},{"privilege_type":"w","privilege":true,"with_grant":false},{"privilege_type":"x","privilege":true,"with_grant":false}]}]}
},
"expected_sql_file": "alter_column_numeric.sql"
},
{
"type": "alter",
"name": "Alter Column (Remove Length)",
"endpoint": "NODE-column.obj_id",
"sql_endpoint": "NODE-column.sql_id",
"data": {
"attnum": 3,
"attlen":""
},
"expected_sql_file": "alter_column_remove_length.sql"
},
{
"type": "delete",
"name": "Drop Column (Numeric type with Length Precision & Variables)",


@@ -53,7 +53,7 @@ class ColumnMsqlTestCase(BaseTestGenerator):
old_len=5,
new_precision=4,
expected_res='ALTER TABLE {schema}.{table}\n ALTER COLUMN '
-             '{column} TYPE numeric ({len}, {precision})[];'
+             '{column} TYPE numeric({len}, {precision})[];'
)),
('msql column change numeric precision',
dict(
@@ -62,7 +62,7 @@
old_len=6,
new_precision=5,
expected_res='ALTER TABLE {schema}.{table}\n ALTER COLUMN '
-             '{column} TYPE numeric ({len}, {precision});'
+             '{column} TYPE numeric({len}, {precision});'
)),
('msql column change numeric array length',
dict(
@@ -71,7 +71,7 @@
new_len=8,
old_precision=3,
expected_res='ALTER TABLE {schema}.{table}\n ALTER COLUMN '
-             '{column} TYPE numeric ({len}, {precision})[];'
+             '{column} TYPE numeric({len}, {precision})[];'
)),
('msql column change numeric length',
dict(
@@ -80,7 +80,7 @@
new_len=8,
old_precision=4,
expected_res='ALTER TABLE {schema}.{table}\n ALTER COLUMN '
-             '{column} TYPE numeric ({len}, {precision});'
+             '{column} TYPE numeric({len}, {precision});'
)),
('msql column change numeric array len and precision',
dict(
@@ -89,7 +89,7 @@
new_len=15,
new_precision=8,
expected_res='ALTER TABLE {schema}.{table}\n ALTER COLUMN '
-             '{column} TYPE numeric ({len}, {precision})[];'
+             '{column} TYPE numeric({len}, {precision})[];'
)),
('msql column change numeric len and precision',
dict(
@@ -98,7 +98,7 @@
new_len=14,
new_precision=9,
expected_res='ALTER TABLE {schema}.{table}\n ALTER COLUMN '
-             '{column} TYPE numeric ({len}, {precision});'
+             '{column} TYPE numeric({len}, {precision});'
))
]


@@ -0,0 +1,360 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2019, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
""" Implements Utility class for Compound Triggers. """
from flask import render_template
from flask_babelex import gettext as _
from pgadmin.utils.ajax import internal_server_error
from pgadmin.utils.exception import ObjectGone
from pgadmin.browser.server_groups.servers.databases.schemas.utils \
import DataTypeReader
from pgadmin.browser.server_groups.servers.utils import parse_priv_from_db, \
parse_priv_to_db
from functools import wraps
def get_template_path(f):
"""
This function will behave as a decorator which will prepare
the template path based on database server version.
"""
@wraps(f)
def wrap(*args, **kwargs):
# Here args[0] will hold the connection object
conn_obj = args[0]
if 'template_path' not in kwargs:
kwargs['template_path'] = 'columns/sql/#{0}#'.format(
conn_obj.manager.version)
return f(*args, **kwargs)
return wrap
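For context, the decorator defined above can be exercised outside pgAdmin with a stub connection object. This is a minimal, self-contained sketch: `FakeConn`, `_Manager`, and the version number `110000` are made-up stand-ins for the real driver objects, not part of the pgAdmin API.

```python
from functools import wraps


def get_template_path(f):
    """Inject a version-keyed template path, mirroring the decorator
    above; the connection object is always the first positional arg."""
    @wraps(f)
    def wrap(*args, **kwargs):
        conn_obj = args[0]
        if 'template_path' not in kwargs:
            kwargs['template_path'] = 'columns/sql/#{0}#'.format(
                conn_obj.manager.version)
        return f(*args, **kwargs)
    return wrap


class _Manager:          # stand-in for the real connection manager
    version = 110000


class FakeConn:          # hypothetical connection object
    manager = _Manager()


@get_template_path
def which_template(conn, template_path=None):
    return template_path


print(which_template(FakeConn()))   # columns/sql/#110000#
```

Callers that pass an explicit `template_path` keyword bypass the version lookup, which is why the decorator checks `kwargs` first.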
@get_template_path
def get_parent(conn, tid, template_path=None):
"""
This function will return the parent of the given table.
:param conn: Connection Object
:param tid: Table oid
:param template_path: Optional template path
:return:
"""
SQL = render_template("/".join([template_path,
'get_parent.sql']), tid=tid)
status, rset = conn.execute_2darray(SQL)
if not status:
raise Exception(rset)
schema = ''
table = ''
if 'rows' in rset and len(rset['rows']) > 0:
schema = rset['rows'][0]['schema']
table = rset['rows'][0]['table']
return schema, table
@get_template_path
def column_formatter(conn, tid, clid, data, template_path=None):
"""
This function will return formatted output of query result
as per client model format for column node
:param conn: Connection Object
:param tid: Table ID
:param clid: Column ID
:param data: Data
:param template_path: Optional template path
:return:
"""
# To check if column is primary key
if 'attnum' in data and 'indkey' in data:
# Current column
attnum = str(data['attnum'])
# Single/List of primary key column(s)
indkey = str(data['indkey'])
# We will check if column is in primary column(s)
if attnum in indkey.split(" "):
data['is_pk'] = True
data['is_primary_key'] = True
else:
data['is_pk'] = False
data['is_primary_key'] = False
# Fetch length and precision
data = fetch_length_precision(data)
# We need to fetch inherited tables for each table
SQL = render_template("/".join([template_path,
'get_inherited_tables.sql']), tid=tid)
status, inh_res = conn.execute_dict(SQL)
if not status:
return internal_server_error(errormsg=inh_res)
for row in inh_res['rows']:
if row['attrname'] == data['name']:
data['is_inherited'] = True
data['tbls_inherited'] = row['inhrelname']
# We need to format variables according to client js collection
if 'attoptions' in data and data['attoptions'] is not None:
spcoptions = []
for spcoption in data['attoptions']:
k, v = spcoption.split('=')
spcoptions.append({'name': k, 'value': v})
data['attoptions'] = spcoptions
# Need to format security labels according to client js collection
if 'seclabels' in data and data['seclabels'] is not None:
seclabels = []
for seclbls in data['seclabels']:
k, v = seclbls.split('=')
seclabels.append({'provider': k, 'label': v})
data['seclabels'] = seclabels
# We need to parse & convert ACL coming from database to json format
SQL = render_template("/".join([template_path, 'acl.sql']),
tid=tid, clid=clid)
status, acl = conn.execute_dict(SQL)
if not status:
return internal_server_error(errormsg=acl)
# We will set get privileges from acl sql so we don't need
# it from properties sql
data['attacl'] = []
for row in acl['rows']:
priv = parse_priv_from_db(row)
data.setdefault(row['deftype'], []).append(priv)
# we are receiving request when in edit mode
# we will send filtered types related to current type
type_id = data['atttypid']
SQL = render_template("/".join([template_path, 'is_referenced.sql']),
tid=tid, clid=clid)
status, is_reference = conn.execute_scalar(SQL)
edit_types_list = list()
# We will need present type in edit mode
edit_types_list.append(data['cltype'])
if int(is_reference) == 0:
SQL = render_template("/".join([template_path,
'edit_mode_types.sql']),
type_id=type_id)
status, rset = conn.execute_2darray(SQL)
for row in rset['rows']:
edit_types_list.append(row['typname'])
data['edit_types'] = edit_types_list
data['cltype'] = DataTypeReader.parse_type_name(data['cltype'])
return data
@get_template_path
def get_formatted_columns(conn, tid, data, other_columns,
table_or_type, template_path=None):
"""
This function will iterate and return formatted data for all
the columns.
:param conn: Connection Object
:param tid: Table ID
:param data: Data
:param other_columns:
:param table_or_type:
:param template_path: Optional template path
:return:
"""
SQL = render_template("/".join([template_path, 'properties.sql']),
tid=tid, show_sys_objects=False)
status, res = conn.execute_dict(SQL)
if not status:
raise Exception(res)
all_columns = res['rows']
# Add inherited from details from other columns - type, table
for col in all_columns:
for other_col in other_columns:
if col['name'] == other_col['name']:
col['inheritedfrom' + table_or_type] = \
other_col['inheritedfrom']
data['columns'] = all_columns
if 'columns' in data and len(data['columns']) > 0:
for column in data['columns']:
column_formatter(conn, tid, column['attnum'], column)
return data
def parse_format_columns(data, mode=None):
"""
This function will parse and return formatted list of columns
added by user.
:param data:
:param mode:
:return:
"""
column_acl = ['a', 'r', 'w', 'x']
columns = data['columns']
# 'EDIT' mode
if mode is not None:
for action in ['added', 'changed']:
if action in columns:
final_columns = []
for c in columns[action]:
if 'inheritedfrom' not in c:
final_columns.append(c)
for c in final_columns:
if 'attacl' in c:
if 'added' in c['attacl']:
c['attacl']['added'] = parse_priv_to_db(
c['attacl']['added'], column_acl
)
elif 'changed' in c['attacl']:
c['attacl']['changed'] = parse_priv_to_db(
c['attacl']['changed'], column_acl
)
elif 'deleted' in c['attacl']:
c['attacl']['deleted'] = parse_priv_to_db(
c['attacl']['deleted'], column_acl
)
if 'cltype' in c:
# check type for '[]' in it
c['cltype'], c['hasSqrBracket'] = \
type_formatter(c['cltype'])
c = convert_length_precision_to_string(c)
data['columns'][action] = final_columns
else:
# We need to exclude all the columns which are inherited from other
# tables 'CREATE' mode
final_columns = []
for c in columns:
if 'inheritedfrom' not in c:
final_columns.append(c)
# Now we have the list of columns which we need
# to include in our create definition, let's format them
for c in final_columns:
if 'attacl' in c:
c['attacl'] = parse_priv_to_db(
c['attacl'], column_acl
)
if 'cltype' in c:
# check type for '[]' in it
c['cltype'], c['hasSqrBracket'] = type_formatter(c['cltype'])
c = convert_length_precision_to_string(c)
data['columns'] = final_columns
return data
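Both branches of `parse_format_columns` start by excluding columns inherited from parent tables. That step boils down to a one-line list filter; a standalone sketch (the helper name `filter_inherited` is hypothetical, introduced only for illustration):

```python
def filter_inherited(columns):
    """Drop columns inherited from parent tables, mirroring the
    'inheritedfrom' check in parse_format_columns: inherited columns
    must not appear in the generated column definitions."""
    return [c for c in columns if 'inheritedfrom' not in c]


cols = [
    {'name': 'id'},
    {'name': 'parent_col', 'inheritedfrom': 'base_tbl'},
]
print(filter_inherited(cols))   # [{'name': 'id'}]
```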
def convert_length_precision_to_string(data):
"""
This function is used to convert length & precision to string
to handle case like when user gives 0 as length.
:param data:
:return:
"""
# We need to handle the below case because jquery has changed
# undefined/null values to empty strings
# https://github.com/jquery/jquery/commit/36d2d9ae937f626d98319ed850905e8d1cbfd078
if 'attlen' in data and data['attlen'] == '':
data['attlen'] = None
elif 'attlen' in data and data['attlen'] is not None:
data['attlen'] = str(data['attlen'])
if 'attprecision' in data and data['attprecision'] == '':
data['attprecision'] = None
elif 'attprecision' in data and data['attprecision'] is not None:
data['attprecision'] = str(data['attprecision'])
return data
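The subtlety here is that jQuery serialises `null`/`undefined` as `''`, while a user-entered `0` is a legitimate length; coercing kept values to strings avoids later truthiness checks silently dropping that zero. A runnable copy of the function for experimentation:

```python
def convert_length_precision_to_string(data):
    """Normalise attlen/attprecision as in columns/utils.py:
    '' (jQuery's rendering of null/undefined) means 'not given',
    anything else -- including 0 -- is kept, as a string."""
    if 'attlen' in data and data['attlen'] == '':
        data['attlen'] = None
    elif 'attlen' in data and data['attlen'] is not None:
        data['attlen'] = str(data['attlen'])

    if 'attprecision' in data and data['attprecision'] == '':
        data['attprecision'] = None
    elif 'attprecision' in data and data['attprecision'] is not None:
        data['attprecision'] = str(data['attprecision'])

    return data


print(convert_length_precision_to_string({'attlen': 0, 'attprecision': ''}))
# {'attlen': '0', 'attprecision': None}
```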
def type_formatter(data_type):
"""
We need to remove [] from type and append it
after length/precision so we will set flag for
sql template.
:param data_type:
:param template_path: Optional template path
:return:
"""
if '[]' in data_type:
return data_type[:-2], True
else:
return data_type, False
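The array-type handling above exists because the brackets must come after the length/precision in the emitted SQL (`numeric(15,6)[]`, not `numeric[](15,6)`). A runnable copy of the helper showing both cases:

```python
def type_formatter(data_type):
    """Strip a trailing '[]' from an array type name and return a flag
    so the SQL template can re-append the brackets after the
    length/precision, e.g. numeric(15,6)[]."""
    if '[]' in data_type:
        return data_type[:-2], True
    return data_type, False


print(type_formatter('numeric[]'))   # ('numeric', True)
print(type_formatter('numeric'))     # ('numeric', False)
```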
def fetch_length_precision(data):
"""
This function is used to fetch the length and precision.
:param data:
:return:
"""
# Find length & precision of column data type
fulltype = DataTypeReader.get_full_type(
data['typnspname'], data['typname'],
data['isdup'], data['attndims'], data['atttypmod'])
length = False
precision = False
if 'elemoid' in data:
length, precision, typeval = \
DataTypeReader.get_length_precision(data['elemoid'])
# Set length and precision to None
data['attlen'] = None
data['attprecision'] = None
import re
# If we have length & precision both
if length and precision:
matchObj = re.search(r'(\d+),(\d+)', fulltype)
if matchObj:
data['attlen'] = matchObj.group(1)
data['attprecision'] = matchObj.group(2)
elif length:
# If we have length only
matchObj = re.search(r'(\d+)', fulltype)
if matchObj:
data['attlen'] = matchObj.group(1)
data['attprecision'] = None
return data
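The core of `fetch_length_precision` is the regex extraction from the formatted type name. A simplified standalone model (the function name `parse_len_precision` and the boolean flags are illustrative stand-ins for the `DataTypeReader.get_length_precision` lookup):

```python
import re


def parse_len_precision(fulltype, has_length, has_precision):
    """Pull length/precision out of a formatted type name such as
    'numeric(15,6)' or 'character varying(50)' -- a simplified model
    of the regex step in fetch_length_precision above."""
    attlen = attprecision = None
    if has_length and has_precision:
        m = re.search(r'(\d+),(\d+)', fulltype)
        if m:
            attlen, attprecision = m.group(1), m.group(2)
    elif has_length:
        m = re.search(r'(\d+)', fulltype)
        if m:
            attlen = m.group(1)
    return attlen, attprecision


print(parse_len_precision('numeric(15,6)', True, True))           # ('15', '6')
print(parse_len_precision('character varying(50)', True, False))  # ('50', None)
```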


@@ -7,7 +7,7 @@
#
##########################################################################
-""" Implements Utility class for Foreign Keys. """
+""" Implements Utility class for Compound Triggers. """
from flask import render_template
from flask_babelex import gettext as _


@@ -7,7 +7,7 @@
#
##########################################################################
-""" Implements Utility class for Exclusion Constraint. """
+""" Implements Utility class for Index Constraint. """
from flask import render_template
from flask_babelex import gettext as _


@@ -9,7 +9,7 @@ ALTER TABLE {{conn|qtIdent(data.schema, data.table)}}
{% endif %}
{### Alter column type and collation ###}
-{% if (data.cltype and data.cltype != o_data.cltype) or (data.attlen is defined and data.attlen is not none and data.attlen != o_data.attlen) or (data.attprecision is defined and data.attprecision != o_data.attprecision) or (data.collspcname and data.collspcname != o_data.collspcname)%}
+{% if (data.cltype and data.cltype != o_data.cltype) or (data.attlen is defined and data.attlen != o_data.attlen) or (data.attprecision is defined and data.attprecision != o_data.attprecision) or (data.collspcname and data.collspcname != o_data.collspcname)%}
ALTER TABLE {{conn|qtIdent(data.schema, data.table)}}
    ALTER COLUMN {% if data.name %}{{conn|qtTypeIdent(data.name)}}{% else %}{{conn|qtTypeIdent(o_data.name)}}{% endif %} TYPE {{ GET_TYPE.UPDATE_TYPE_SQL(conn, data, o_data) }}{% if data.collspcname and data.collspcname != o_data.collspcname %}
    COLLATE {{data.collspcname}}{% elif o_data.collspcname %} COLLATE {{o_data.collspcname}}{% endif %};


@@ -6,10 +6,9 @@
{% if data.name and data.name != o_data.name %}
ALTER TABLE {{conn|qtIdent(data.schema, data.table)}}
    RENAME {{conn|qtIdent(o_data.name)}} TO {{conn|qtIdent(data.name)}};
{% endif %}
{### Alter column type and collation ###}
-{% if (data.cltype and data.cltype != o_data.cltype) or (data.attlen is defined and data.attlen is not none and data.attlen != o_data.attlen) or (data.attprecision is defined and data.attprecision != o_data.attprecision) or (data.collspcname and data.collspcname != o_data.collspcname)%}
+{% if (data.cltype and data.cltype != o_data.cltype) or (data.attlen is defined and data.attlen != o_data.attlen) or (data.attprecision is defined and data.attprecision != o_data.attprecision) or (data.collspcname and data.collspcname != o_data.collspcname)%}
ALTER TABLE {{conn|qtIdent(data.schema, data.table)}}
    ALTER COLUMN {% if data.name %}{{conn|qtTypeIdent(data.name)}}{% else %}{{conn|qtTypeIdent(o_data.name)}}{% endif %} TYPE {{ GET_TYPE.UPDATE_TYPE_SQL(conn, data, o_data) }}{% if data.collspcname and data.collspcname != o_data.collspcname %}
    COLLATE {{data.collspcname}}{% elif o_data.collspcname %} COLLATE {{o_data.collspcname}}{% endif %};


@@ -9,7 +9,7 @@ ALTER TABLE {{conn|qtIdent(data.schema, data.table)}}
{% endif %}
{### Alter column type and collation ###}
-{% if (data.cltype and data.cltype != o_data.cltype) or (data.attlen is defined and data.attlen is not none and data.attlen != o_data.attlen) or (data.attprecision is defined and data.attprecision != o_data.attprecision) or (data.collspcname and data.collspcname != o_data.collspcname) %}
+{% if (data.cltype and data.cltype != o_data.cltype) or (data.attlen is defined and data.attlen != o_data.attlen) or (data.attprecision is defined and data.attprecision != o_data.attprecision) or (data.collspcname and data.collspcname != o_data.collspcname) %}
ALTER TABLE {{conn|qtIdent(data.schema, data.table)}}
    ALTER COLUMN {% if data.name %}{{conn|qtTypeIdent(data.name)}}{% else %}{{conn|qtTypeIdent(o_data.name)}}{% endif %} TYPE {{ GET_TYPE.UPDATE_TYPE_SQL(conn, data, o_data) }}{% if data.collspcname and data.collspcname != o_data.collspcname %}
    COLLATE {{data.collspcname}}{% elif o_data.collspcname %} COLLATE {{o_data.collspcname}}{% endif %};


@@ -7,7 +7,7 @@
#
##########################################################################
-""" Implements Utility class for Foreign Keys. """
+""" Implements Utility class for Triggers. """
from flask import render_template
from flask_babelex import gettext as _


@@ -28,6 +28,8 @@ from pgadmin.utils import IS_PY2
from pgadmin.utils.compile_template_name import compile_template_path
from pgadmin.utils.driver import get_driver
from config import PG_DEFAULT_DRIVER
+from pgadmin.browser.server_groups.servers.databases.schemas.tables.\
+    columns import utils as column_utils
from pgadmin.browser.server_groups.servers.databases.schemas.tables.\
    constraints.foreign_key import utils as fkey_utils
from pgadmin.browser.server_groups.servers.databases.schemas.tables.\
@@ -57,15 +59,6 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
      - It will return formatted output of query result
        as per client model format
-    * _columns_formatter(tid, data):
-      - It will return formatted output of query result
-        as per client model format for column node
-    * _cltype_formatter(type): (staticmethod)
-      - We need to remove [] from type and append it
-        after length/precision so we will send flag for
-        sql template.
    * get_table_dependents(self, tid):
      - This function get the dependents and return ajax response
        for the table node.
@@ -149,129 +142,6 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
        return wrap
def _columns_formatter(self, tid, data):
"""
Args:
tid: Table OID
data: dict of query result
Returns:
It will return formatted output of query result
as per client model format for column node
"""
for column in data['columns']:
# We need to format variables according to client js collection
if 'attoptions' in column and column['attoptions'] is not None:
spcoptions = []
for spcoption in column['attoptions']:
k, v = spcoption.split('=')
spcoptions.append({'name': k, 'value': v})
column['attoptions'] = spcoptions
# Need to format security labels according to client js collection
if 'seclabels' in column and column['seclabels'] is not None:
seclabels = []
for seclbls in column['seclabels']:
k, v = seclbls.split('=')
seclabels.append({'provider': k, 'label': v})
column['seclabels'] = seclabels
if 'attnum' in column and column['attnum'] is not None \
and column['attnum'] > 0:
# We need to parse & convert ACL coming from database to
# json format
SQL = render_template("/".join(
[self.column_template_path, 'acl.sql']),
tid=tid, clid=column['attnum']
)
status, acl = self.conn.execute_dict(SQL)
if not status:
return internal_server_error(errormsg=acl)
# We will set get privileges from acl sql so we don't need
# it from properties sql
column['attacl'] = []
for row in acl['rows']:
priv = parse_priv_from_db(row)
column.setdefault(row['deftype'], []).append(priv)
# we are receiving request when in edit mode
# we will send filtered types related to current type
type_id = column['atttypid']
fulltype = self.get_full_type(
column['typnspname'], column['typname'],
column['isdup'], column['attndims'], column['atttypmod']
)
length = False
precision = False
if 'elemoid' in column:
length, precision, typeval = \
self.get_length_precision(column['elemoid'])
# Set length and precision to None
column['attlen'] = None
column['attprecision'] = None
# If we have length & precision both
if length and precision:
matchObj = re.search(r'(\d+),(\d+)', fulltype)
if matchObj:
column['attlen'] = matchObj.group(1)
column['attprecision'] = matchObj.group(2)
elif length:
# If we have length only
matchObj = re.search(r'(\d+)', fulltype)
if matchObj:
column['attlen'] = matchObj.group(1)
column['attprecision'] = None
SQL = render_template("/".join([self.column_template_path,
'is_referenced.sql']),
tid=tid, clid=column['attnum'])
status, is_reference = self.conn.execute_scalar(SQL)
edit_types_list = list()
# We will need present type in edit mode
edit_types_list.append(column['cltype'])
if int(is_reference) == 0:
SQL = render_template("/".join([self.column_template_path,
'edit_mode_types.sql']),
type_id=type_id)
status, rset = self.conn.execute_2darray(SQL)
for row in rset['rows']:
edit_types_list.append(row['typname'])
column['edit_types'] = edit_types_list
column['cltype'] = DataTypeReader.parse_type_name(
column['cltype']
)
if 'indkey' in column:
# Current column
attnum = str(column['attnum'])
# Single/List of primary key column(s)
indkey = str(column['indkey'])
# We will check if column is in primary column(s)
if attnum in indkey.split(" "):
column['is_primary_key'] = True
else:
column['is_primary_key'] = False
return data
    def _formatter(self, did, scid, tid, data):
        """
        Args:
@@ -363,28 +233,9 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
        # We will fetch all the columns for the table using
        # columns properties.sql, so we need to set template path
-        SQL = render_template("/".join([self.column_template_path,
-                                        'properties.sql']),
-                              tid=tid,
-                              show_sys_objects=False
-                              )
-        status, res = self.conn.execute_dict(SQL)
-        if not status:
-            return internal_server_error(errormsg=res)
-        all_columns = res['rows']
-        # Add inheritedfrom details from other columns - type, table
-        for col in all_columns:
-            for other_col in other_columns:
-                if col['name'] == other_col['name']:
-                    col['inheritedfrom' + table_or_type] = \
-                        other_col['inheritedfrom']
-        data['columns'] = all_columns
-        if 'columns' in data and len(data['columns']) > 0:
-            data = self._columns_formatter(tid, data)
+        data = column_utils.get_formatted_columns(self.conn, tid,
+                                                  data, other_columns,
+                                                  table_or_type)
        # Here we will add constraint in our output
        index_constraints = {
@@ -420,23 +271,6 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
        return data
-    @staticmethod
-    def _cltype_formatter(data_type):
-        """
-        Args:
-            data_type: Type string
-        Returns:
-            We need to remove [] from type and append it
-            after length/precision so we will send flag for
-            sql template
-        """
-        if '[]' in data_type:
-            return data_type[:-2], True
-        else:
-            return data_type, False
    def get_table_dependents(self, tid):
        """
        This function get the dependents and return ajax response
@@ -608,7 +442,7 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
                # check type for '[]' in it
                if 'cltype' in c:
                    c['cltype'], c['hasSqrBracket'] = \
-                        self._cltype_formatter(c['cltype'])
+                        column_utils.type_formatter(c['cltype'])
        sql_header = u"-- Table: {0}\n\n-- ".format(
            self.qtIdent(self.conn, data['schema'], data['name']))
@@ -971,7 +805,7 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
        # Parse/Format columns & create sql
        if 'columns' in data:
            # Parse the data coming from client
-            data = self._parse_format_columns(data, mode='edit')
+            data = column_utils.parse_format_columns(data, mode='edit')
            columns = data['columns']
            column_sql = '\n'
@ -1010,81 +844,17 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
old_col_data['cltype'], \ old_col_data['cltype'], \
old_col_data['hasSqrBracket'] = \ old_col_data['hasSqrBracket'] = \
self._cltype_formatter(old_col_data['cltype']) column_utils.type_formatter(old_col_data['cltype'])
old_col_data = \ old_col_data = \
BaseTableView.convert_length_precision_to_string( column_utils.convert_length_precision_to_string(
old_col_data old_col_data)
) old_col_data = column_utils.fetch_length_precision(
old_col_data)
-                    fulltype = self.get_full_type(
-                        old_col_data['typnspname'],
-                        old_col_data['typname'],
-                        old_col_data['isdup'],
-                        old_col_data['attndims'],
-                        old_col_data['atttypmod']
-                    )
-
-                    def get_type_attr(key, data):
-                        """Utility function"""
-                        if key in data:
-                            return data[key]
-                        return None
-
-                    # If the column data type has not changed then fetch
-                    # old length and precision
-                    if 'elemoid' in old_col_data and 'cltype' not in c:
-                        length, precision, typeval = \
-                            self.get_length_precision(
-                                old_col_data['elemoid'])
-
-                        # If we have length & precision both
-                        if length and precision:
-                            matchObj = re.search(r'(\d+),(\d+)', fulltype)
-                            if matchObj:
-                                c['attlen'] = get_type_attr(
-                                    'attlen', c
-                                ) or matchObj.group(1)
-                                c['attprecision'] = get_type_attr(
-                                    'attprecision', c
-                                ) or matchObj.group(2)
-                        elif length:
-                            # If we have length only
-                            matchObj = re.search(r'(\d+)', fulltype)
-                            if matchObj:
-                                c['attlen'] = get_type_attr(
-                                    'attlen', c
-                                ) or matchObj.group(1)
-                                c['attprecision'] = None
-                        else:
-                            c['attlen'] = None
-                            c['attprecision'] = None
-
-                    if 'cltype' in c:
-                        typename = c['cltype']
-                        if 'hasSqrBracket' in c and c['hasSqrBracket']:
-                            typename += '[]'
-                        length, precision, typeval = \
-                            self.get_length_precision(typename)
-
-                        # if new datatype does not have length or precision
-                        # then we cannot apply length or precision of old
-                        # datatype to new one.
-                        if not length:
-                            old_col_data['attlen'] = -1
-                        if not precision:
-                            old_col_data['attprecision'] = None
                     old_col_data['cltype'] = \
                         DataTypeReader.parse_type_name(
                             old_col_data['cltype'])
-
-                    if int(old_col_data['attlen']) == -1:
-                        old_col_data['attlen'] = None
-                    if 'attprecision' not in old_col_data:
-                        old_col_data['attprecision'] = None
                     # Sql for alter column
                     if 'inheritedfrom' not in c:
                         column_sql += render_template("/".join(

@@ -1098,7 +868,7 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
                 c['schema'] = data['schema']
                 c['table'] = data['name']
-                c = BaseTableView.convert_length_precision_to_string(c)
+                c = column_utils.convert_length_precision_to_string(c)

                 if 'inheritedfrom' not in c:
                     column_sql += render_template("/".join(

@@ -1209,7 +979,7 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
             data['relacl'] = parse_priv_to_db(data['relacl'], self.acl)

             # Parse & format columns
-            data = self._parse_format_columns(data)
+            data = column_utils.parse_format_columns(data)
             data = BaseTableView.check_and_convert_name_to_string(data)

             if 'foreign_key' in data:
@@ -1659,24 +1429,6 @@ class BaseTableView(PGChildNodeView, BasePartitionTable):
         return schema_name, table_name

-    @staticmethod
-    def convert_length_precision_to_string(data):
-        """
-        This function is used to convert length & precision to string
-        to handle case like when user gives 0 as length
-
-        Args:
-            data: Data from client
-
-        Returns:
-            Converted data
-        """
-        if 'attlen' in data and data['attlen'] is not None:
-            data['attlen'] = str(data['attlen'])
-        if 'attprecision' in data and data['attprecision'] is not None:
-            data['attprecision'] = str(data['attprecision'])
-        return data
-
     def update_vacuum_settings(self, vacuum_key, old_data, data):
         """
         This function iterate the vacuum and vacuum toast table and create
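The `convert_length_precision_to_string` helper removed above is relocated into `column_utils` by this commit rather than deleted. Per its docstring, it exists to handle a user-supplied length of `0`; presumably because the integer `0` is falsy in the SQL templates' truthiness checks while the string `'0'` is not. A minimal standalone sketch of the moved helper (module-level here for illustration):

```python
def convert_length_precision_to_string(data):
    # Stringify length/precision so a user-supplied 0 survives
    # truthiness checks in the templates ('0' is truthy, 0 is not).
    if 'attlen' in data and data['attlen'] is not None:
        data['attlen'] = str(data['attlen'])
    if 'attprecision' in data and data['attprecision'] is not None:
        data['attprecision'] = str(data['attprecision'])
    return data
```

Note that `None` values are deliberately left alone, so "no length set" stays distinguishable from "length explicitly 0".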
View File
@@ -111,7 +111,7 @@ define('pgadmin.node.type', [
       // Note: There are ambiguities in the PG catalogs and docs between
       // precision and scale. In the UI, we try to follow the docs as
       // closely as possible, therefore we use Length/Precision and Scale
-      id: 'tlength', label: gettext('Length/precision'), deps: ['type'], type: 'text',
+      id: 'tlength', label: gettext('Length/Precision'), deps: ['type'], type: 'text',
       disabled: false, cell: IntegerDepCell,
       editable: function(m) {
         // We will store type from selected from combobox

@@ -214,13 +214,13 @@ define('pgadmin.node.type', [
        this.errorModel.set('type', errmsg);
        return errmsg;
      }
-      // Validation for Length/precision field (see comments above if confused about the naming!)
+      // Validation for Length/Precision field (see comments above if confused about the naming!)
      else if (this.get('is_tlength')
        && !_.isUndefined(this.get('tlength'))) {
        if (this.get('tlength') < this.get('min_val'))
-          errmsg = gettext('Length/precision should not be less than %s', this.get('min_val'));
+          errmsg = gettext('Length/Precision should not be less than %s', this.get('min_val'));
        if (this.get('tlength') > this.get('max_val') )
-          errmsg = gettext('Length/precision should not be greater than %s', this.get('max_val'));
+          errmsg = gettext('Length/Precision should not be greater than %s', this.get('max_val'));
        // If we have any error set then throw it to user
        if(errmsg) {
          this.errorModel.set('tlength', errmsg);
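The client-side range check above only relabels the messages; the rule itself is unchanged. As a hedged, language-neutral sketch of that rule (Python here, with the gettext formatting simplified):

```python
def validate_tlength(tlength, min_val, max_val):
    # Mirror of the client-side check: a length/precision value must
    # fall within [min_val, max_val] for the selected data type.
    if tlength < min_val:
        return 'Length/Precision should not be less than %s' % min_val
    if tlength > max_val:
        return 'Length/Precision should not be greater than %s' % max_val
    return None  # no error
```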
View File
@@ -22,6 +22,21 @@ time({{ type_length }}) with time zone{% endif %}{% if is_type_array %}
 {##### BELOW MACRO IS USED FOR COLUMN TYPE UPDATE #####}
 {######################################################}
 {% macro UPDATE_TYPE_SQL(conn, data, o_data) %}
+{% if data.attprecision is defined and data.attprecision is none %}
+{% set old_precision = none %}
+{% elif data.attprecision is defined and data.attprecision is not none %}
+{% set data_precision = data.attprecision %}
+{% set old_precision = o_data.attprecision %}
+{% else %}
+{% set old_precision = o_data.attprecision %}
+{% endif %}
+{% if data.attlen is defined and data.attlen is none %}
+{% set old_length = none %}
+{% set old_precision = none %}
+{% set data_precision = none %}
+{% else %}
+{% set old_length = o_data.attlen %}
+{% endif %}
 {% if data.cltype and data.cltype.startswith('time') and data.attlen %}
 {#############################################################}
 {###### Need to check against separate time types - START ######}
@@ -34,12 +49,12 @@ time({{ data.attlen }}) with time zone {% endif %}{% if data.hasSqrBracket %}[]{
 {#############################################################}
 {# if only type changes, we need to give previous length to current type#}
 {#############################################################}
-{% elif data.cltype and data.cltype.startswith('time') and o_data.attlen != 'None' %}
+{% elif data.cltype and data.cltype.startswith('time') %}
 {% if data.cltype == "timestamp without time zone" %}
-timestamp({{ o_data.attlen }}) without time zone {% elif data.cltype == "timestamp with time zone" %}
-timestamp({{ o_data.attlen }}) with time zone {% elif data.cltype == "time without time zone" %}
-time({{ o_data.attlen }}) without time zone {% elif data.cltype == "time with time zone" %}
-time({{ o_data.attlen }}) with time zone {% endif %}{% if data.hasSqrBracket %}[]{% endif %}
+timestamp{% if o_data.attlen is not none %}({{ o_data.attlen }}){% endif %} without time zone {% elif data.cltype == "timestamp with time zone" %}
+timestamp{% if o_data.attlen is not none %}({{ o_data.attlen }}){% endif %} with time zone {% elif data.cltype == "time without time zone" %}
+time{% if o_data.attlen is not none %}({{ o_data.attlen }}){% endif %} without time zone {% elif data.cltype == "time with time zone" %}
+time{% if o_data.attlen is not none %}({{ o_data.attlen }}){% endif %} with time zone {% endif %}{% if data.hasSqrBracket %}[]{% endif %}
 {#############################################################}
 {# if only length changes, we need to give previous length to current type#}
 {#############################################################}

@@ -54,10 +69,10 @@ time({{ data.attlen }}) with time zone {% endif %}{% if o_data.hasSqrBracket %}[
 {#############################################################}
 {########## We will create SQL for other types here ##########}
 {#############################################################}
-{% if data.cltype %}{{ data.cltype }} {% elif o_data.typnspname != 'pg_catalog' %}{{conn|qtTypeIdent(o_data.typnspname, o_data.cltype)}}{% else %}{{conn|qtTypeIdent(o_data.cltype)}} {% endif %}{% if (data.attlen and data.attlen != 'None') or (data.attprecision and data.attprecision != 'None') or (o_data.attlen and o_data.attlen != 'None' and o_data.attlen|int >0) or (o_data.attprecision and o_data.attprecision != 'None') %}
-{% if data.attlen and data.attlen != 'None' %}
-({{ data.attlen }}{% elif o_data.attlen and o_data.attlen != 'None' %}({{ o_data.attlen }}{% endif %}{% if data.attprecision and data.attprecision != 'None' %}
-, {{ data.attprecision }}){% elif o_data.attprecision and o_data.attprecision != 'None' %}, {{ o_data.attprecision }}){% else %}){% endif %}
+{% if data.cltype %}{{ data.cltype }}{% elif o_data.typnspname != 'pg_catalog' %}{{conn|qtTypeIdent(o_data.typnspname, o_data.cltype)}}{% else %}{{conn|qtTypeIdent(o_data.cltype)}}{% endif %}{% if (data.attlen and data.attlen is not none) or (data_precision and data_precision is not none) or (old_length and old_length is not none and old_length|int >0) or (old_precision and old_precision is not none) %}
+{% if data.attlen and data.attlen is not none %}
+({{ data.attlen }}{% elif old_length and old_length is not none %}({{ old_length }}{% endif %}{% if data_precision and data_precision is not none %}
+, {{ data_precision }}){% elif old_precision and old_precision is not none %}, {{ old_precision }}){% else %}){% endif %}
 {% endif %}{% if o_data.hasSqrBracket %}[]{% endif %}
 {% endif %}
 {% endmacro %}
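The branch rewritten above addresses #4761: when only the type changes between the time/timestamp variants, the old length is now carried over only if one is actually set, instead of emitting a literal `(None)`. A rough Python sketch of that behavior (a hypothetical helper, greatly simplified from the Jinja macro):

```python
def retype_time_sql(new_type, old_attlen):
    # Split e.g. 'timestamp with time zone' into base 'timestamp'
    # and the zone suffix 'with time zone'.
    base, _, zone = new_type.partition(' ')
    # Carry the previous length over to the new type only when set.
    length = '({})'.format(old_attlen) if old_attlen is not None else ''
    return '{}{} {}'.format(base, length, zone)
```

With `old_attlen=4` this yields `timestamp(4) without time zone`; with `old_attlen=None` the parentheses are omitted entirely, which is the fix.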
View File
@@ -144,7 +144,12 @@ class DataTypeReader:
                     # Max of integer value
                     max_val = 2147483647
                 else:
-                    max_val = 10
+                    # Max value is 6 for data types like
+                    # interval, timestamptz, etc.
+                    if typeval == 'D':
+                        max_val = 6
+                    else:
+                        max_val = 10

                 res.append({
                     'label': row['typname'], 'value': row['typname'],
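Item 6 of the commit message is implemented here: for the date/time family (which this reader appears to flag with `typeval == 'D'`, an assumption based on the surrounding code), the maximum fractional-second precision is 6 rather than the old default of 10. As a sketch:

```python
def default_max_val(typeval):
    # 'D' is assumed to mark date/time types (interval, timestamp with
    # time zone, time with time zone, ...); PostgreSQL caps their
    # fractional-second precision at 6.
    return 6 if typeval == 'D' else 10
```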
@@ -206,7 +211,8 @@ class DataTypeReader:
         return length, precision, typeval

-    def get_full_type(self, nsp, typname, isDup, numdims, typmod):
+    @staticmethod
+    def get_full_type(nsp, typname, isDup, numdims, typmod):
         """
         Returns full type name with Length and Precision.
@@ -273,6 +279,10 @@ class DataTypeReader:
             elif name == 'interval':
                 _prec = 0
                 _len = typmod & 0xffff
+                # Max length for the interval data type is 6.
+                # If the length is greater than 6, treat it as unset.
+                if _len > 6:
+                    _len = ''
                 length += str(_len)
             elif name == 'date':
                 # Clear length
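The hunk above decodes an interval column's `atttypmod`: the low 16 bits carry the fractional-second precision, and anything above PostgreSQL's cap of 6 is now treated as "no explicit length" (fix for #4965, where out-of-range values leaked into the properties dialog). A standalone sketch of just that decoding step:

```python
def interval_length(typmod):
    # The low 16 bits of an interval typmod hold the fractional-second
    # precision; values above PostgreSQL's cap of 6 mean "not set".
    _len = typmod & 0xffff
    return '' if _len > 6 else _len
```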
@@ -330,6 +340,10 @@ class DataTypeReader:
             from re import sub as sub_str
             pattern = r'(\(\d+\))'
             type_name = sub_str(pattern, '', type_name)
+        # We need special handling for interval types like
+        # interval hour to minute.
+        elif type_name.startswith("interval"):
+            type_name = 'interval'

         if is_array:
             type_name += "[]"
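The `parse_type_name` change above normalizes type names before they are fed back into SQL: an explicit length such as `(6)` is stripped, and field-restricted interval forms like `interval hour to minute` collapse to plain `interval`. A hypothetical standalone version of that logic (renamed here to avoid implying it is the method's exact body):

```python
import re

def normalize_type_name(type_name, is_array=False):
    # Strip an explicit length such as '(6)'; otherwise collapse
    # field-restricted interval forms to plain 'interval'.
    if re.search(r'(\(\d+\))', type_name):
        type_name = re.sub(r'(\(\d+\))', '', type_name)
    elif type_name.startswith('interval'):
        type_name = 'interval'
    if is_array:
        type_name += '[]'
    return type_name
```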