Added support for EDB Job Scheduler. #7098

Akshay Joshi 2024-03-18 11:53:59 +05:30 committed by GitHub
parent f351b10ed0
commit 097b630738
125 changed files with 7867 additions and 339 deletions

docs/en_US/dbms_job.rst Normal file

@ -0,0 +1,108 @@
.. _dbms_job:
*****************
`DBMS Job`:index:
*****************
Use the *DBMS Job* dialog to create a DBMS Job.
.. image:: images/dbms_job_general.png
:alt: DBMS Job dialog general tab
:align: center
Use the fields in the *General* tab to create a job:
* Use the *Name* field to add a descriptive name for the job. The name will
  be displayed in the *pgAdmin* object explorer.
* Use the *Enabled?* switch to specify whether the job should be enabled or disabled.
* Use the *Job Type* field to select the type of the job. The type can be SELF-CONTAINED or PRE-DEFINED.
  If the Job Type is Self-Contained, you need to specify the action and repeat interval in the Action and Repeat tabs respectively.
  If the Job Type is Pre-Defined, you need to specify the existing Program and Schedule names in the Pre-Defined tab (see the example after this list).
* Store notes about the job in the *Comment* field.
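For a pre-defined job, the generated command references an existing program and schedule instead of
an inline action. A sketch with illustrative object names:

.. code-block:: sql

    EXEC dbms_scheduler.CREATE_JOB(
        job_name      => 'job_pre_with_psql',
        program_name  => 'prg_with_psql',
        schedule_name => 'yearly_sch',
        enabled       => true,
        comments      => 'Job that reuses an existing program and schedule'
    );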
Click the *Action* tab to continue.
.. image:: images/dbms_job_action.png
:alt: DBMS Job dialog action tab
:align: center
Use the *Action* tab to select the action for the job. This tab is only enabled when the job type is 'SELF-CONTAINED'.
* Use the *Type* field to select the type of the action. The type can be PLSQL BLOCK or STORED PROCEDURE.
* Use the *Procedure* field to select an existing procedure that executes when the job is invoked.
* The *Number of Arguments* field is read-only and indicates the number of arguments required by the selected procedure.
Click the *Code* tab to continue.
.. image:: images/dbms_job_code.png
:alt: DBMS Job dialog code tab
:align: center
* Use the *Code* field to write the code that executes when the job is invoked.
  This tab is only enabled when the job type is 'SELF-CONTAINED' and the type of the action is set to 'PLSQL BLOCK'.
Click the *Arguments* tab to continue.
.. image:: images/dbms_job_arguments.png
:alt: DBMS Job dialog arguments tab
:align: center
* The *Arguments* tab outlines the arguments required by the procedure selected in the *Action* tab. This tab is only enabled when the job type is 'SELF-CONTAINED'.
Click the *Repeat* tab to continue.
.. image:: images/dbms_job_repeat.png
:alt: DBMS Job dialog repeat tab
:align: center
Use the *Repeat* tab to select the repeat interval for the job; the selections below are combined into a single repeat interval string (see the example after this list). This tab is only enabled when the job type is 'SELF-CONTAINED'.
* Use the calendar selector in the *Start* field to specify the starting date
and time for the job.
* Use the calendar selector in the *End* field to specify the ending date and
time for the job.
* Use the *Frequency* field to select the frequency. Frequency is one of the following:
YEARLY, MONTHLY, WEEKLY, DAILY, HOURLY, MINUTELY.
* Use the *Date* field to select the date on which the job will execute. The date format is YYYYMMDD.
* Use the *Months* field to select the months in which the job will execute.
* Use the *Week Days* field to select the days on which the job will execute.
* Use the *Month Days* field to select the numeric days on which the job will
execute.
* Use the *Hours* field to select the hour at which the job will execute.
* Use the *Minutes* field to select the minute at which the job will execute.
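For example, a job that should run weekly on Monday and Friday at 07:30 corresponds to a single
calendaring string that pgAdmin builds from these selections and passes as the repeat interval
(a sketch; the dialog generates the exact value):

.. code-block:: sql

    -- Repeat tab selections: Frequency = WEEKLY, Week Days = MONDAY and FRIDAY,
    -- Hours = 07, Minutes = 30 produce a repeat interval such as:
    --   FREQ=WEEKLY;BYDAY=MON,FRI;BYHOUR=07;BYMINUTE=30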
Click the *Pre-Defined* tab to continue.
.. image:: images/dbms_job_predefined.png
:alt: DBMS Job dialog predefined tab
:align: center
Use the *Pre-Defined* tab to select an existing program and schedule for the job.
This tab is only enabled when the job type is 'PRE-DEFINED'.
* Use the *Program Name* field to select an existing program.
* Use the *Schedule Name* field to select an existing schedule.
Click the *SQL* tab to continue.
Your entries in the *DBMS Job* dialog generate a SQL command (see an example below).
Use the *SQL* tab for review; revisit or switch tabs to make any changes to the
SQL command.
**Example**
The following is an example of the SQL command generated by user selections in
the *DBMS Job* dialog:
.. image:: images/dbms_job_sql.png
:alt: DBMS Job dialog sql tab
:align: center
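For reference, a command of the kind generated for a self-contained PLSQL block job looks similar to
the following (a sketch with illustrative values; the dialog generates the exact statement):

.. code-block:: sql

    EXEC dbms_scheduler.CREATE_JOB(
        job_name        => 'nightly_cleanup',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN PERFORM 1; END;',
        repeat_interval => 'FREQ=DAILY;BYHOUR=01;BYMINUTE=00',
        start_date      => '2024-03-18 00:00:00 +05:30',
        enabled         => true,
        comments        => 'Self-contained job that runs a PLSQL block every night'
    );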
* Click the *Info* button (i) to access online help.
* Click the *Help* button (?) to access dialog help.
* Click the *Save* button to save work.
* Click the *Close* button to exit without saving work.
* Click the *Reset* button to restore configuration parameters.


@ -0,0 +1,45 @@
.. _dbms_job_scheduler:
********************************
`Using EDB Job Scheduler`:index:
********************************
In earlier versions of EPAS, the DBMS_SCHEDULER and DBMS_JOB packages required pgAgent
to be configured and running. Maintaining pgAgent in a production environment is
cumbersome: it needs correct configuration, regular updates, and ongoing health monitoring.
EPAS 16 eliminates the need for the pgAgent component. It introduces the EDB Job Scheduler,
an extension that runs the job scheduler as a background process for the DBMS_SCHEDULER
and DBMS_JOB packages.
The EDB Job Scheduler has a scheduler process that starts when the database cluster starts.
To start the scheduler process, load the EDB Job Scheduler extension using the **shared_preload_libraries**
parameter. After you load the extension, create the extension using the CREATE EXTENSION command.
The database in which you're creating the extension must be listed in the **edb_job_scheduler.database_list**
parameter.
Instructions for configuring the EDB Job Scheduler can be found in the
`Configuring EDB Job Scheduler <https://www.enterprisedb.com/docs/pg_extensions/edb_job_scheduler/configuring/>`_ documentation.
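A minimal configuration sketch, assuming the target database is named *mydb* (adjust names and paths
for your environment). Note that pgAdmin shows the *DBMS Job Scheduler* node only when both the
**edb_job_scheduler** and **dbms_scheduler** extensions exist in the database:

.. code-block:: sql

    -- postgresql.conf (requires a server restart):
    --   shared_preload_libraries = 'edb_job_scheduler'
    --   edb_job_scheduler.database_list = 'mydb'

    -- Then, connected to mydb, create the required extensions:
    CREATE EXTENSION edb_job_scheduler;
    CREATE EXTENSION dbms_scheduler;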
.. image:: images/dbms_job_scheduler.png
:alt: DBMS Job Scheduler Object Browser
:align: center
Check the status of all the jobs
********************************
To check the running status of all the jobs, select the 'DBMS Job Scheduler' collection node in the object
explorer and open the *Properties* tab.
.. image:: images/dbms_job_details.png
:alt: DBMS Job Details
:align: center
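Behind the scenes, pgAdmin fetches these run details with a catalog query. A minimal sketch of such a
query, assuming the Oracle-compatible *dba_scheduler_job_run_details* view provided by EPAS is
available:

.. code-block:: sql

    SELECT job_name, status, actual_start_date, run_duration
    FROM sys.dba_scheduler_job_run_details
    ORDER BY actual_start_date DESC;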
.. toctree::
:maxdepth: 1
dbms_job
dbms_program
dbms_schedule


@ -0,0 +1,71 @@
.. _dbms_program:
*********************
`DBMS Program`:index:
*********************
Use the *DBMS Program* dialog to create a DBMS Program.
.. image:: images/dbms_program_general.png
:alt: DBMS Program dialog general tab
:align: center
Use the fields in the *General* tab to create a program:
* Use the *Name* field to add a descriptive name for the program. The name will
  be displayed in the *pgAdmin* object explorer.
* Use the *Enabled?* switch to specify whether the program should be enabled or disabled.
* Store notes about the program in the *Comment* field.
Click the *Action* tab to continue.
.. image:: images/dbms_program_action.png
:alt: DBMS Program dialog action tab
:align: center
Use the *Action* tab to select the action for the program:
* Use the *Type* field to select the type of the program. The type can be PLSQL BLOCK or STORED PROCEDURE.
* Use the *Procedure* field to select an existing procedure that executes when the program is invoked.
* The *Number of Arguments* field is read-only and indicates the number of arguments required by the selected procedure.
Click the *Code* tab to continue.
.. image:: images/dbms_program_code.png
:alt: DBMS Program dialog code tab
:align: center
* Use the *Code* field to write the code that executes when the program is invoked.
This tab is only enabled when the type of the program is set to 'PLSQL BLOCK'.
Click the *Arguments* tab to continue.
.. image:: images/dbms_program_arguments.png
:alt: DBMS Program dialog arguments tab
:align: center
* The *Arguments* tab is a read-only section that outlines the arguments required by the procedure selected in the *Action* tab.
Click the *SQL* tab to continue.
Your entries in the *DBMS Program* dialog generate a SQL command (see an example below).
Use the *SQL* tab for review; revisit or switch tabs to make any changes to the
SQL command.
**Example**
The following is an example of the SQL command generated by user selections in
the *DBMS Program* dialog:
.. image:: images/dbms_program_sql.png
:alt: DBMS Program dialog sql tab
:align: center
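For reference, a command of the kind generated for a PLSQL block program looks similar to the
following (a sketch; parameter names follow the DBMS_SCHEDULER.CREATE_PROGRAM procedure and the
values are illustrative):

.. code-block:: sql

    EXEC dbms_scheduler.CREATE_PROGRAM(
        program_name   => 'prg_with_psql',
        program_type   => 'PLSQL_BLOCK',
        program_action => 'BEGIN PERFORM 1; END;',
        enabled        => true,
        comments       => 'Program that runs a PLSQL block'
    );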
* Click the *Info* button (i) to access online help.
* Click the *Help* button (?) to access dialog help.
* Click the *Save* button to save work.
* Click the *Close* button to exit without saving work.
* Click the *Reset* button to restore configuration parameters.


@ -0,0 +1,61 @@
.. _dbms_schedule:
**********************
`DBMS Schedule`:index:
**********************
Use the *DBMS Schedule* dialog to create a DBMS Schedule.
.. image:: images/dbms_schedule_general.png
:alt: DBMS Schedule dialog general tab
:align: center
Use the fields in the *General* tab to create a schedule:
* Use the *Name* field to add a descriptive name for the schedule. The name will
be displayed in the *pgAdmin* object explorer.
* Store notes about the schedule in the *Comment* field.
Click the *Repeat* tab to continue.
.. image:: images/dbms_schedule_repeat.png
:alt: DBMS Schedule dialog repeat tab
:align: center
Use the *Repeat* tab to select the repeat interval for the schedule:
* Use the calendar selector in the *Start* field to specify the starting date
and time for the schedule.
* Use the calendar selector in the *End* field to specify the ending date and
time for the schedule.
* Use the *Frequency* field to select the frequency. Frequency is one of the following:
YEARLY, MONTHLY, WEEKLY, DAILY, HOURLY, MINUTELY.
* Use the *Date* field to select the date on which the schedule will execute. The date format is YYYYMMDD.
* Use the *Months* field to select the months in which the schedule will execute.
* Use the *Week Days* field to select the days on which the schedule will execute.
* Use the *Month Days* field to select the numeric days on which the schedule will
execute.
* Use the *Hours* field to select the hour at which the schedule will execute.
* Use the *Minutes* field to select the minute at which the schedule will execute.
Click the *SQL* tab to continue.
Your entries in the *DBMS Schedule* dialog generate a SQL command (see an example below).
Use the *SQL* tab for review; revisit or switch tabs to make any changes to the
SQL command.
**Example**
The following is an example of the SQL command generated by user selections in
the *DBMS Schedule* dialog:
.. image:: images/dbms_schedule_sql.png
:alt: DBMS Schedule dialog sql tab
:align: center
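For reference, a command of the kind generated for a yearly schedule looks similar to the following
(a sketch; parameter names follow the DBMS_SCHEDULER.CREATE_SCHEDULE procedure and the values are
illustrative):

.. code-block:: sql

    EXEC dbms_scheduler.CREATE_SCHEDULE(
        schedule_name   => 'yearly_sch',
        start_date      => '2024-03-18 00:00:00 +05:30',
        repeat_interval => 'FREQ=YEARLY;BYMONTH=JAN;BYHOUR=05;BYMINUTE=45',
        comments        => 'Schedule that runs every year in January'
    );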
* Click the *Info* button (i) to access online help.
* Click the *Help* button (?) to access dialog help.
* Click the *Save* button to save work.
* Click the *Close* button to exit without saving work.
* Click the *Reset* button to restore configuration parameters.

[17 binary image files added (documentation screenshots referenced above); contents not shown.]

@ -18,6 +18,7 @@ node, and select *Create Cast...*
:maxdepth: 1
cast_dialog
dbms_job_scheduler
collation_dialog
domain_dialog
domain_constraint_dialog


@ -128,6 +128,9 @@ class DatabaseModule(CollectionNodeModule):
from .subscriptions import blueprint as module
self.submodules.append(module)
from .dbms_job_scheduler import blueprint as module
self.submodules.append(module)
super().register(app, options)


@ -0,0 +1,280 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
""" Implements DBMS Job Scheduler objects Node."""
from functools import wraps
from flask import render_template
from flask_babel import gettext
from pgadmin.browser.collection import CollectionNodeModule
from pgadmin.browser.server_groups.servers import databases
from pgadmin.browser.utils import PGChildNodeView
from pgadmin.utils.ajax import (make_json_response, internal_server_error,
make_response as ajax_response)
from pgadmin.utils.driver import get_driver
from config import PG_DEFAULT_DRIVER
from pgadmin.utils.constants import DBMS_JOB_SCHEDULER_ID
class DBMSJobSchedulerModule(CollectionNodeModule):
"""
class DBMSJobSchedulerModule(CollectionNodeModule)
A module class for DBMS Job Scheduler objects node derived
from CollectionNodeModule.
Methods:
-------
* __init__(*args, **kwargs)
- Method is used to initialize the DBMS Job Scheduler objects and its
base module.
* get_nodes(gid, sid, did)
- Method is used to generate the browser collection node.
* script_load()
- Load the module script for DBMS Job Scheduler objects, when any of the
server nodes is initialized.
* backend_supported(manager, **kwargs)
- Method is used to check the pre-requisites (required extensions and
database list) for this node.
* register(self, app, options)
- Override the default register function to automagically register
sub-modules at once.
"""
_NODE_TYPE = 'dbms_job_scheduler'
_COLLECTION_LABEL = gettext("DBMS Job Scheduler")
def __init__(self, *args, **kwargs):
"""
Method is used to initialize the DBMSJobSchedulerModule, and its base
module.
Args:
*args:
**kwargs:
"""
super().__init__(*args, **kwargs)
self.min_ver = None
self.max_ver = None
@property
def node_icon(self):
"""
icon to be displayed for the browser nodes
"""
return 'icon-coll-dbms_job_scheduler'
def get_nodes(self, gid, sid, did):
"""
Generate the collection node
"""
if self.show_node:
yield self.generate_browser_node(DBMS_JOB_SCHEDULER_ID, did,
self._COLLECTION_LABEL, None)
@property
def script_load(self):
"""
Load the module script for the server, when any of the database nodes is
initialized.
"""
return databases.DatabaseModule.node_type
def backend_supported(self, manager, **kwargs):
"""
Function is used to check the pre-requisite for this node.
Args:
manager:
**kwargs:
Returns:
"""
if hasattr(self, 'show_node') and not self.show_node:
return False
# Get the connection for the respective database
conn = manager.connection(did=kwargs['did'])
# Checking whether both 'edb_job_scheduler' and 'dbms_scheduler'
# extensions are created or not.
status, res = conn.execute_scalar("""
SELECT COUNT(*) FROM pg_extension WHERE extname IN (
'edb_job_scheduler', 'dbms_scheduler') """)
if status and int(res) == 2:
# Get the list of databases specified for the edb_job_scheduler
status, res = conn.execute_scalar("""
SHOW edb_job_scheduler.database_list""")
# If the database is present in the specified list, then return True.
if status and res and conn.db in res:
return True
return False
@property
def module_use_template_javascript(self):
"""
Returns whether Jinja2 template is used for generating the javascript
module.
"""
return False
def register(self, app, options):
"""
Override the default register function to automagically register
sub-modules at once.
"""
from .dbms_jobs import blueprint as module
self.submodules.append(module)
from .dbms_programs import blueprint as module
self.submodules.append(module)
from .dbms_schedules import blueprint as module
self.submodules.append(module)
super().register(app, options)
blueprint = DBMSJobSchedulerModule(__name__)
class DBMSJobSchedulerView(PGChildNodeView):
"""
class DBMSJobSchedulerView(PGChildNodeView)
A view class for the DBMS Job Scheduler node derived from PGChildNodeView.
This class is responsible for all the stuff related to the view, like
listing the scheduler child nodes, showing the run details of all the jobs,
and showing the properties of the node.
Methods:
-------
* __init__(**kwargs)
- Method is used to initialize the DBMSJobSchedulerView, and it's
base view.
* check_precondition()
- This function will behave as a decorator which will check the
database connection before running the view. It will also attach
the manager, conn & template_path properties to self.
* nodes()
- This function is used to create all the child nodes within that
collection. Here it will create all the scheduler nodes.
* properties(gid, sid, did, jsid)
- This function will show the properties of the selected job node
"""
node_type = blueprint.node_type
BASE_TEMPLATE_PATH = 'dbms_job_scheduler/ppas/#{0}#'
parent_ids = [
{'type': 'int', 'id': 'gid'},
{'type': 'int', 'id': 'sid'},
{'type': 'int', 'id': 'did'}
]
ids = [
{'type': 'int', 'id': 'jsid'}
]
operations = dict({
'obj': [
{'get': 'properties'},
{'get': 'list'}
],
'children': [{
'get': 'children'
}],
'nodes': [{'get': 'nodes'}, {'get': 'nodes'}]
})
def __init__(self, **kwargs):
self.conn = None
self.template_path = None
self.manager = None
super().__init__(**kwargs)
def check_precondition(f):
"""
This function will behave as a decorator which will check the
database connection before running view. It will also attach
manager, conn & template_path properties to self
"""
@wraps(f)
def wrap(*args, **kwargs):
self = args[0]
self.driver = get_driver(PG_DEFAULT_DRIVER)
self.manager = self.driver.connection_manager(kwargs['sid'])
self.conn = self.manager.connection(did=kwargs['did'])
# Set the template path for the SQL scripts
self.template_path = self.BASE_TEMPLATE_PATH.format(
self.manager.version)
# Here args[0] will hold self & kwargs will hold gid,sid,did
return f(*args, **kwargs)
return wrap
@check_precondition
def nodes(self, gid, sid, did):
"""
This function is used to create all the child nodes within the
collection.
"""
return make_json_response(
data=[],
status=200
)
@check_precondition
def list(self, gid, sid, did):
"""
This function will show the run details of all the jobs.
Args:
gid: Server Group ID
sid: Server ID
did: Database ID
"""
try:
sql = render_template(
"/".join([self.template_path, 'get_job_run_details.sql']))
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
return ajax_response(
response=res['rows'],
status=200
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def properties(self, gid, sid, did, jsid):
"""
Args:
gid: Server Group ID
sid: Server ID
did: Database ID
jsid: Job Scheduler ID
"""
return make_json_response(
data=[],
status=200
)
DBMSJobSchedulerView.register_node_view(blueprint)


@ -0,0 +1,785 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
""" Implements DBMS Job objects Node."""
import json
from functools import wraps
from flask import render_template, request, jsonify
from flask_babel import gettext
from pgadmin.browser.collection import CollectionNodeModule
from pgadmin.browser.server_groups.servers import databases
from pgadmin.browser.utils import PGChildNodeView
from pgadmin.utils.ajax import make_json_response, gone, \
make_response as ajax_response, internal_server_error, success_return
from pgadmin.utils.driver import get_driver
from config import PG_DEFAULT_DRIVER
from pgadmin.utils.constants import DBMS_JOB_SCHEDULER_ID
from pgadmin.browser.server_groups.servers.databases.schemas.functions.utils \
import format_arguments_from_db
from pgadmin.browser.server_groups.servers.databases.dbms_job_scheduler.utils \
import (resolve_calendar_string, create_calendar_string,
get_formatted_program_args)
class DBMSJobModule(CollectionNodeModule):
"""
class DBMSJobModule(CollectionNodeModule)
A module class for DBMS Job objects node derived
from CollectionNodeModule.
Methods:
-------
* get_nodes(gid, sid, did)
- Method is used to generate the browser collection node.
* script_load()
- Load the module script for DBMS Job objects, when any of
the server nodes is initialized.
"""
_NODE_TYPE = 'dbms_job'
_COLLECTION_LABEL = gettext("DBMS Jobs")
@property
def collection_icon(self):
"""
icon to be displayed for the browser collection node
"""
return 'icon-coll-pga_job'
@property
def node_icon(self):
"""
icon to be displayed for the browser nodes
"""
return 'icon-pga_job'
def get_nodes(self, gid, sid, did, jsid):
"""
Generate the collection node
"""
if self.show_node:
yield self.generate_browser_collection_node(did)
@property
def node_inode(self):
"""
Override this property to make the node a leaf node.
Returns: False as this is the leaf node
"""
return False
@property
def script_load(self):
"""
Load the module script for the server, when any of the database nodes is
initialized.
"""
return databases.DatabaseModule.node_type
@property
def module_use_template_javascript(self):
"""
Returns whether Jinja2 template is used for generating the javascript
module.
"""
return False
blueprint = DBMSJobModule(__name__)
class DBMSJobView(PGChildNodeView):
"""
class DBMSJobView(PGChildNodeView)
A view class for DBMSJob node derived from PGChildNodeView.
This class is responsible for all the stuff related to view like
updating job node, showing properties, showing sql in sql pane.
Methods:
-------
* __init__(**kwargs)
- Method is used to initialize the DBMSJobView, and it's base view.
* check_precondition()
- This function will behave as a decorator which will check the
database connection before running the view. It will also attach
the manager, conn & template_path properties to self.
* list()
- This function is used to list all the job nodes within that
collection.
* nodes()
- This function is used to create all the child nodes within that
collection. Here it will create all the job nodes.
* properties(gid, sid, did, jsid, jsjobid)
- This function will show the properties of the selected job node
* create(gid, sid, did, jsid, jsjobid)
- This function will create the new job object
* msql(gid, sid, did, jsid, jsjobid)
- This function is used to return modified SQL for the
selected job node
* sql(gid, sid, did, jsid, jsjobid)
- Dummy response for sql panel
* delete(gid, sid, did, jsid, jsjobid)
- Drops job
"""
node_type = blueprint.node_type
BASE_TEMPLATE_PATH = 'dbms_jobs/ppas/#{0}#'
PROGRAM_TEMPLATE_PATH = 'dbms_programs/ppas/#{0}#'
SCHEDULE_TEMPLATE_PATH = 'dbms_schedules/ppas/#{0}#'
parent_ids = [
{'type': 'int', 'id': 'gid'},
{'type': 'int', 'id': 'sid'},
{'type': 'int', 'id': 'did'},
{'type': 'int', 'id': 'jsid'}
]
ids = [
{'type': 'int', 'id': 'jsjobid'}
]
operations = dict({
'obj': [
{'get': 'properties', 'delete': 'delete', 'put': 'update'},
{'get': 'list', 'post': 'create', 'delete': 'delete'}
],
'nodes': [{'get': 'nodes'}, {'get': 'nodes'}],
'msql': [{'get': 'msql'}, {'get': 'msql'}],
'sql': [{'get': 'sql'}],
'get_procedures': [{}, {'get': 'get_procedures'}],
'enable_disable': [{'put': 'enable_disable'}],
'get_programs': [{}, {'get': 'get_programs'}],
'get_schedules': [{}, {'get': 'get_schedules'}],
'run_job': [{'put': 'run_job'}],
})
def __init__(self, **kwargs):
self.conn = None
self.template_path = None
self.pr_template_path = None
self.sch_template_path = None
self.manager = None
super().__init__(**kwargs)
def check_precondition(f):
"""
This function will behave as a decorator which will check the
database connection before running view. It will also attach
manager, conn & template_path properties to self
"""
@wraps(f)
def wrap(*args, **kwargs):
# Here args[0] will hold self & kwargs will hold gid,sid,did
self = args[0]
self.driver = get_driver(PG_DEFAULT_DRIVER)
self.manager = self.driver.connection_manager(kwargs['sid'])
self.conn = self.manager.connection(did=kwargs['did'])
# Set the template path for the SQL scripts
self.template_path = self.BASE_TEMPLATE_PATH.format(
self.manager.version)
self.pr_template_path = self.PROGRAM_TEMPLATE_PATH.format(
self.manager.version)
self.sch_template_path = self.SCHEDULE_TEMPLATE_PATH.format(
self.manager.version)
return f(*args, **kwargs)
return wrap
@check_precondition
def list(self, gid, sid, did, jsid):
"""
This function is used to list all the job nodes within
that collection.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
"""
sql = render_template(
"/".join([self.template_path, self._PROPERTIES_SQL]))
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
return ajax_response(
response=res['rows'],
status=200
)
@check_precondition
def nodes(self, gid, sid, did, jsid, jsjobid=None):
"""
This function is used to create all the child nodes within
the collection. Here it will create all the job nodes.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
"""
res = []
try:
sql = render_template(
"/".join([self.template_path, self._NODES_SQL]))
status, result = self.conn.execute_2darray(sql)
if not status:
return internal_server_error(errormsg=result)
if jsjobid is not None:
if len(result['rows']) == 0:
return gone(
errormsg=gettext("Could not find the specified job.")
)
row = result['rows'][0]
return make_json_response(
data=self.blueprint.generate_browser_node(
row['jsjobid'],
DBMS_JOB_SCHEDULER_ID,
row['jsjobname'],
is_enabled=row['jsjobenabled'],
icon="icon-pga_job" if row['jsjobenabled'] else
"icon-pga_job-disabled",
description=row['jsjobdesc']
)
)
for row in result['rows']:
res.append(
self.blueprint.generate_browser_node(
row['jsjobid'],
DBMS_JOB_SCHEDULER_ID,
row['jsjobname'],
is_enabled=row['jsjobenabled'],
icon="icon-pga_job" if row['jsjobenabled'] else
"icon-pga_job-disabled",
description=row['jsjobdesc']
)
)
return make_json_response(
data=res,
status=200
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def properties(self, gid, sid, did, jsid, jsjobid):
"""
This function will show the properties of the selected job node.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
jsjobid: Job ID
"""
try:
status, data = self._fetch_properties(jsjobid)
if not status:
return data
return ajax_response(
response=data,
status=200
)
except Exception as e:
return internal_server_error(errormsg=str(e))
def _fetch_properties(self, jsjobid):
"""
This function is used to fetch the properties.
Args:
jsjobid:
"""
sql = render_template(
"/".join([self.template_path, self._PROPERTIES_SQL]),
jsjobid=jsjobid
)
status, res = self.conn.execute_dict(sql)
if not status:
return False, internal_server_error(errormsg=res)
if len(res['rows']) == 0:
return False, gone(
errormsg=gettext("Could not find the specified job.")
)
data = res['rows'][0]
# If 'jsjobscname' and 'jsjobprname' in data then set the jsjobtype
if ('jsjobscname' in data and data['jsjobscname'] is not None and
'jsjobprname' in data and data['jsjobprname'] is not None):
data['jsjobtype'] = 'p'
else:
data['jsjobtype'] = 's'
# Resolve the repeat interval string
if 'jsscrepeatint' in data:
(freq, by_date, by_month, by_month_day, by_weekday, by_hour,
by_minute) = resolve_calendar_string(
data['jsscrepeatint'])
data['jsscfreq'] = freq
data['jsscdate'] = by_date
data['jsscmonths'] = by_month
data['jsscmonthdays'] = by_month_day
data['jsscweekdays'] = by_weekday
data['jsschours'] = by_hour
data['jsscminutes'] = by_minute
# Get Program's arguments
sql = render_template(
"/".join([self.pr_template_path, self._PROPERTIES_SQL]),
jsprid=data['program_id']
)
status, res_program = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res_program)
# Update the data dictionary.
if len(res_program['rows']) > 0:
# Get the formatted program args
get_formatted_program_args(self.pr_template_path, self.conn,
res_program['rows'][0])
# Get the job argument value
self.get_job_args_value(self.template_path, self.conn,
data['jsjobname'],
res_program['rows'][0])
data['jsprarguments'] = res_program['rows'][0]['jsprarguments'] \
if 'jsprarguments' in res_program['rows'][0] else []
return True, data
@check_precondition
def create(self, gid, sid, did, jsid):
"""
This function will create the job node.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
"""
data = json.loads(request.data)
try:
# Get the SQL
sql, _, _ = self.get_sql(None, data)
status, res = self.conn.execute_void('BEGIN')
if not status:
return internal_server_error(errormsg=res)
status, res = self.conn.execute_scalar(sql)
if not status:
if self.conn.connected():
self.conn.execute_void('END')
return internal_server_error(errormsg=res)
self.conn.execute_void('END')
# Get the newly created job id
sql = render_template(
"/".join([self.template_path, 'get_job_id.sql']),
job_name=data['jsjobname'], conn=self.conn
)
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
if len(res['rows']) == 0:
return gone(
errormsg=gettext("Job creation failed.")
)
row = res['rows'][0]
return jsonify(
node=self.blueprint.generate_browser_node(
row['jsjobid'],
DBMS_JOB_SCHEDULER_ID,
row['jsjobname'],
is_enabled=row['jsjobenabled'],
icon="icon-pga_job" if row['jsjobenabled'] else
"icon-pga_job-disabled"
)
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def update(self, gid, sid, did, jsid, jsjobid=None):
"""
This function will update job object
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
"""
data = request.form if request.form else json.loads(
request.data
)
try:
sql, jobname, jobenabled = self.get_sql(jsjobid, data)
status, res = self.conn.execute_scalar(sql)
if not status:
return internal_server_error(errormsg=res)
return jsonify(
node=self.blueprint.generate_browser_node(
jsjobid,
DBMS_JOB_SCHEDULER_ID,
jobname,
icon="icon-pga_job" if jobenabled else
"icon-pga_job-disabled"
)
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def delete(self, gid, sid, did, jsid, jsjobid=None):
"""Delete the Job."""
if jsjobid is None:
data = request.form if request.form else json.loads(
request.data
)
else:
data = {'ids': [jsjobid]}
try:
for jsjobid in data['ids']:
status, data = self._fetch_properties(jsjobid)
if not status:
return data
jsjobname = data['jsjobname']
status, res = self.conn.execute_void(
render_template(
"/".join([self.template_path, self._DELETE_SQL]),
job_name=jsjobname, conn=self.conn
)
)
if not status:
return internal_server_error(errormsg=res)
return make_json_response(success=1)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def msql(self, gid, sid, did, jsid, jsjobid=None):
"""
This function is used to return modified SQL for the
selected job node.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
jsjobid: Job ID (optional)
"""
data = {}
for k, v in request.args.items():
try:
# comments should be taken as is because if user enters a
# json comment it is parsed by loads which should not happen
if k in ('jsjobdesc',):
data[k] = v
else:
data[k] = json.loads(v)
except ValueError:
data[k] = v
sql, _, _ = self.get_sql(jsjobid, data)
return make_json_response(
data=sql,
status=200
)
def get_sql(self, jsjobid, data):
"""
This function is used to get the SQL.
"""
sql = ''
name = ''
enabled = True
if jsjobid is None:
name = data['jsjobname']
enabled = data['jsjobenabled']
# Create calendar string for repeat interval
repeat_interval = create_calendar_string(
data['jsscfreq'], data['jsscdate'], data['jsscmonths'],
data['jsscmonthdays'], data['jsscweekdays'], data['jsschours'],
data['jsscminutes'])
sql = render_template(
"/".join([self.template_path, self._CREATE_SQL]),
job_name=data['jsjobname'],
internal_job_type=data['jsjobtype'],
job_type=data['jsprtype'],
job_action=data['jsprproc']
if data['jsprtype'] == 'STORED_PROCEDURE' else
data['jsprcode'],
enabled=data['jsjobenabled'],
comments=data['jsjobdesc'],
number_of_arguments=data['jsprnoofargs'],
start_date=data['jsscstart'],
repeat_interval=repeat_interval,
end_date=data['jsscend'],
program_name=data['jsjobprname'],
schedule_name=data['jsjobscname'],
arguments=data['jsprarguments'] if 'jsprarguments' in data
else [],
conn=self.conn
)
elif jsjobid is not None and 'jsprarguments' in data:
status, res = self._fetch_properties(jsjobid)
if not status:
return res
name = res['jsjobname']
enabled = res['jsjobenabled']
sql = render_template(
"/".join([self.template_path, self._UPDATE_SQL]),
job_name=res['jsjobname'],
changed_value=data['jsprarguments']['changed'],
conn=self.conn
)
return sql, name, enabled
@check_precondition
def sql(self, gid, sid, did, jsid, jsjobid):
"""
This function will generate sql for the sql panel
"""
try:
status, data = self._fetch_properties(jsjobid)
if not status:
return ''
SQL = render_template(
"/".join([self.template_path, self._CREATE_SQL]),
display_comments=True,
job_name=data['jsjobname'],
internal_job_type=data['jsjobtype'],
job_type=data['jsprtype'],
job_action=data['jsprproc']
if data['jsprtype'] == 'STORED_PROCEDURE' else
data['jsprcode'],
enabled=data['jsjobenabled'],
comments=data['jsjobdesc'],
number_of_arguments=data['jsprnoofargs'],
start_date=data['jsscstart'],
repeat_interval=data['jsscrepeatint'],
end_date=data['jsscend'],
program_name=data['jsjobprname'],
schedule_name=data['jsjobscname'],
arguments=data['jsprarguments'] if 'jsprarguments' in data
else [],
conn=self.conn
)
return ajax_response(response=SQL)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def get_procedures(self, gid, sid, did, jsid=None):
"""
This function will return procedure list
:param gid: group id
:param sid: server id
:param did: database id
:return:
"""
res = []
sql = render_template("/".join([self.pr_template_path,
'get_procedures.sql']),
datlastsysoid=self._DATABASE_LAST_SYSTEM_OID)
status, rset = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=rset)
for row in rset['rows']:
# Get formatted Arguments
frmtd_params, _ = format_arguments_from_db(
self.template_path, self.conn, row)
res.append({'label': row['proc_name'],
'value': row['proc_name'],
'no_of_args': row['number_of_arguments'],
'arguments': frmtd_params['arguments']
})
return make_json_response(
data=res,
status=200
)
@check_precondition
def enable_disable(self, gid, sid, did, jsid, jsjobid=None):
"""
This function is used to enable/disable job.
"""
data = request.form if request.form else json.loads(
request.data
)
status, res = self.conn.execute_void(
render_template(
"/".join([self.pr_template_path, 'enable_disable.sql']),
name=data['job_name'],
is_enable=data['is_enable_job'], conn=self.conn
)
)
if not status:
return internal_server_error(errormsg=res)
return make_json_response(
success=1,
info=gettext("Job enabled") if data['is_enable_job'] else
gettext('Job disabled'),
data={
'sid': sid,
'did': did,
'jsid': jsid,
'jsjobid': jsjobid
}
)
@check_precondition
def get_programs(self, gid, sid, did, jsid=None):
"""
This function will return the program list
:param gid: group id
:param sid: server id
:param did: database id
:return:
"""
res = []
sql = render_template("/".join([self.pr_template_path,
self._NODES_SQL]),
datlastsysoid=self._DATABASE_LAST_SYSTEM_OID)
status, rset = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=rset)
for row in rset['rows']:
res.append({'label': row['jsprname'],
'value': row['jsprname']})
return make_json_response(
data=res,
status=200
)
@check_precondition
def get_schedules(self, gid, sid, did, jsid=None):
"""
This function will return the schedule list
:param gid: group id
:param sid: server id
:param did: database id
:return:
"""
res = []
sql = render_template("/".join([self.sch_template_path,
self._NODES_SQL]))
status, rset = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=rset)
for row in rset['rows']:
res.append({'label': row['jsscname'],
'value': row['jsscname']
})
return make_json_response(
data=res,
status=200
)
@check_precondition
def run_job(self, gid, sid, did, jsid, jsjobid=None):
"""
This function is used to run the job now.
"""
data = request.form if request.form else json.loads(
request.data
)
try:
status, res = self.conn.execute_void(
render_template(
"/".join([self.template_path, 'run_job.sql']),
job_name=data['job_name'], conn=self.conn
)
)
if not status:
return internal_server_error(errormsg=res)
return success_return(
message=gettext("Started the Job execution.")
)
except Exception as e:
return internal_server_error(errormsg=str(e))
def get_job_args_value(self, template_path, conn, jobname, data):
"""
This function is used to get the job arguments value.
Args:
template_path:
conn:
jobname:
data:
Returns:
"""
if 'jsprarguments' in data and len(data['jsprarguments']) > 0:
for args in data['jsprarguments']:
sql = render_template(
"/".join([template_path, 'get_job_args_value.sql']),
job_name=jobname,
arg_name=args['argname'], conn=self.conn)
status, res = conn.execute_scalar(sql)
if not status:
return internal_server_error(errormsg=res)
args['argval'] = res
DBMSJobView.register_node_view(blueprint)


@ -0,0 +1,234 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2024, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import DBMSJobSchema from './dbms_job.ui';
import { getNodeAjaxOptions } from '../../../../../../../static/js/node_ajax';
import getApiInstance from '../../../../../../../../static/js/api_instance';
define('pgadmin.node.dbms_job', [
'sources/gettext', 'sources/url_for', 'sources/pgadmin',
'pgadmin.browser', 'pgadmin.browser.collection',
], function(gettext, url_for, pgAdmin, pgBrowser) {
if (!pgBrowser.Nodes['coll-dbms_job']) {
pgBrowser.Nodes['coll-dbms_job'] =
pgBrowser.Collection.extend({
node: 'dbms_job',
label: gettext('DBMS Jobs'),
type: 'coll-dbms_job',
columns: ['jsjobname', 'jsjobenabled', 'jsjobruncount', 'jsjobfailurecount', 'jsjobdesc'],
hasSQL: false,
hasDepends: false,
hasStatistics: false,
hasScriptTypes: [],
canDrop: true,
canDropCascade: false,
});
}
if (!pgBrowser.Nodes['dbms_job']) {
pgAdmin.Browser.Nodes['dbms_job'] = pgAdmin.Browser.Node.extend({
parent_type: 'dbms_job_scheduler',
type: 'dbms_job',
label: gettext('DBMS Job'),
node_image: 'icon-pga_job',
epasHelp: true,
epasURL: 'https://www.enterprisedb.com/docs/epas/$VERSION$/epas_compat_bip_guide/03_built-in_packages/15_dbms_scheduler/02_create_job/',
dialogHelp: url_for('help.static', {'filename': 'dbms_job.html'}),
canDrop: true,
hasSQL: true,
hasDepends: false,
hasStatistics: false,
Init: function() {
/* Avoid multiple registration of menus */
if (this.initialized)
return;
this.initialized = true;
pgBrowser.add_menus([{
name: 'create_dbms_job_on_coll', node: 'coll-dbms_job', module: this,
applies: ['object', 'context'], callback: 'show_obj_properties',
category: 'create', priority: 4, label: gettext('DBMS Job...'),
data: {action: 'create'},
},{
name: 'create_dbms_job', node: 'dbms_job', module: this,
applies: ['object', 'context'], callback: 'show_obj_properties',
category: 'create', priority: 4, label: gettext('DBMS Job...'),
data: {action: 'create'},
},{
name: 'create_dbms_job', node: 'dbms_job_scheduler', module: this,
applies: ['object', 'context'], callback: 'show_obj_properties',
category: 'create', priority: 4, label: gettext('DBMS Job...'),
data: {action: 'create'},
}, {
name: 'enable_job', node: 'dbms_job', module: this,
applies: ['object', 'context'], callback: 'enable_job',
priority: 4, label: gettext('Enable Job'),
enable : 'is_enabled',data: {
data_disabled: gettext('Job is already enabled.'),
},
}, {
name: 'disable_job', node: 'dbms_job', module: this,
applies: ['object', 'context'], callback: 'disable_job',
priority: 4, label: gettext('Disable Job'),
enable : 'is_disabled',data: {
data_disabled: gettext('Job is already disabled.'),
},
}, {
name: 'run_job', node: 'dbms_job', module: this,
applies: ['object', 'context'], callback: 'run_job',
priority: 4, label: gettext('Run Job'),
enable : 'is_disabled', data: {
data_disabled: gettext('Job is already disabled.'),
}
}
]);
},
is_enabled: function(node) {
return !node?.is_enabled;
},
is_disabled: function(node) {
return node?.is_enabled;
},
callbacks: {
enable_job: function(args, notify) {
let input = args || {},
obj = this,
t = pgBrowser.tree,
i = 'item' in input ? input.item : t.selected(),
d = i ? t.itemData(i) : undefined;
if (d) {
notify = notify || _.isUndefined(notify) || _.isNull(notify);
let enable = function() {
let data = d;
getApiInstance().put(
obj.generate_url(i, 'enable_disable', d, true),
{'job_name': data.label, 'is_enable_job': true}
).then(({data: res})=> {
if (res.success == 1) {
pgAdmin.Browser.notifier.success(res.info);
t.removeIcon(i);
data.icon = 'icon-pga_jobstep';
data.is_enabled = true;
t.addIcon(i, {icon: data.icon});
t.updateAndReselectNode(i, data);
}
}).catch(function(error) {
pgAdmin.Browser.notifier.pgRespErrorNotify(error);
t.refresh(i);
});
};
if (notify) {
pgAdmin.Browser.notifier.confirm(
gettext('Enable Job'),
gettext('Are you sure you want to enable the job %s?', d.label),
function() { enable(); },
function() { return true;},
);
} else {
enable();
}
}
return false;
},
disable_job: function(args, notify) {
let input = args || {},
obj = this,
t = pgBrowser.tree,
i = 'item' in input ? input.item : t.selected(),
d = i ? t.itemData(i) : undefined;
if (d) {
notify = notify || _.isUndefined(notify) || _.isNull(notify);
let disable = function() {
let data = d;
getApiInstance().put(
obj.generate_url(i, 'enable_disable', d, true),
{'job_name': data.label, 'is_enable_job': false}
).then(({data: res})=> {
if (res.success == 1) {
pgAdmin.Browser.notifier.success(res.info);
t.removeIcon(i);
data.icon = 'icon-pga_jobstep-disabled';
data.is_enabled = false;
t.addIcon(i, {icon: data.icon});
t.updateAndReselectNode(i, data);
}
}).catch(function(error) {
pgAdmin.Browser.notifier.pgRespErrorNotify(error);
t.refresh(i);
});
};
if (notify) {
pgAdmin.Browser.notifier.confirm(
gettext('Disable Job'),
gettext('Are you sure you want to disable the job %s?', d.label),
function() { disable(); },
function() { return true;},
);
} else {
disable();
}
}
return false;
},
run_job: function(args, notify) {
let input = args || {},
obj = this,
t = pgBrowser.tree,
i = 'item' in input ? input.item : t.selected(),
d = i ? t.itemData(i) : undefined;
if (d) {
notify = notify || _.isUndefined(notify) || _.isNull(notify);
let run = function() {
let data = d;
getApiInstance().put(
obj.generate_url(i, 'run_job', d, true),
{'job_name': data.label}
).then(({data: res})=> {
if (res.success == 1) {
pgAdmin.Browser.notifier.success(res.info);
}
}).catch(function(error) {
pgAdmin.Browser.notifier.pgRespErrorNotify(error);
});
};
if (notify) {
pgAdmin.Browser.notifier.confirm(
gettext('Run Job'),
gettext('Are you sure you want to run the job %s now?', d.label),
function() { run(); },
function() { return true;},
);
} else {
run();
}
}
return false;
}
},
getSchema: function(treeNodeInfo, itemNodeData) {
return new DBMSJobSchema(
{
procedures: ()=>getNodeAjaxOptions('get_procedures', this, treeNodeInfo, itemNodeData),
programs: ()=>getNodeAjaxOptions('get_programs', this, treeNodeInfo, itemNodeData),
schedules: ()=>getNodeAjaxOptions('get_schedules', this, treeNodeInfo, itemNodeData)
}
);
},
});
}
return pgBrowser.Nodes['dbms_job'];
});


@ -0,0 +1,198 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2024, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import gettext from 'sources/gettext';
import BaseUISchema from 'sources/SchemaView/base_schema.ui';
import { isEmptyString } from 'sources/validators';
import moment from 'moment';
import { getActionSchema, getRepeatSchema } from '../../../static/js/dbms_job_scheduler_common.ui';
export default class DBMSJobSchema extends BaseUISchema {
constructor(fieldOptions={}) {
super({
jsjobid: null,
jsjobname: '',
jsjobenabled: true,
jsjobdesc: '',
jsjobtype: 's',
jsjobruncount: 0,
jsjobfailurecount: 0,
// Program Args
jsjobprname: '',
jsprtype: 'PLSQL_BLOCK',
jsprenabled: true,
jsprnoofargs: 0,
jsprproc: null,
jsprcode: null,
jsprarguments: [],
// Schedule args
jsjobscname: '',
jsscstart: null,
jsscend: null,
jsscrepeatint: '',
jsscfreq: null,
jsscdate: null,
jsscweekdays: null,
jsscmonthdays: null,
jsscmonths: null,
jsschours: null,
jsscminutes: null,
});
this.fieldOptions = {
procedures: [],
programs: [],
schedules: [],
...fieldOptions,
};
}
get idAttribute() {
return 'jsjobid';
}
get baseFields() {
let obj = this;
return [
{
id: 'jsjobid', label: gettext('ID'), type: 'int', mode: ['properties'],
readonly: function(state) {return !obj.isNew(state); },
}, {
id: 'jsjobname', label: gettext('Name'), cell: 'text',
editable: false, type: 'text', noEmpty: true,
readonly: function(state) {return !obj.isNew(state); },
}, {
id: 'jsjobenabled', label: gettext('Enabled?'), type: 'switch', cell: 'switch',
readonly: function(state) {return !obj.isNew(state); },
}, {
id: 'jsjobtype', label: gettext('Job Type'),
type: ()=>{
let options = [
{'label': gettext('SELF-CONTAINED'), 'value': 's'},
{'label': gettext('PRE-DEFINED'), 'value': 'p'},
];
return {
type: 'toggle',
options: options,
};
},
readonly: function(state) {return !obj.isNew(state); },
helpMessage: gettext('If the Job Type is Self-Contained you need to specify the action and repeat interval in the Action and Repeat tabs respectively. If the Job Type is Pre-Defined you need to specify the existing Program and Schedule names in the Pre-Defined tab.'),
helpMessageMode: ['create'],
}, {
id: 'jsjobruncount', label: gettext('Run Count'), type: 'int',
readonly: true, mode: ['edit', 'properties']
}, {
id: 'jsjobfailurecount', label: gettext('Failure Count'), type: 'int',
readonly: true, mode: ['edit', 'properties']
}, {
id: 'jsjobdesc', label: gettext('Comment'), type: 'multiline',
readonly: function(state) {return !obj.isNew(state); },
},
// Add the Action Schema
...getActionSchema(obj, 'job'),
// Add the Repeat Schema.
...getRepeatSchema(obj, 'job'),
{
id: 'jsjobprname', label: gettext('Program Name'), type: 'select',
controlProps: { allowClear: false}, group: gettext('Pre-Defined'),
options: this.fieldOptions.programs,
readonly: function(state) {
return !obj.isNew(state) || state.jsjobtype == 's';
},
deps: ['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 's') {
return { jsjobprname: null };
}
}
}, {
id: 'jsjobscname', label: gettext('Schedule Name'), type: 'select',
controlProps: { allowClear: false}, group: gettext('Pre-Defined'),
options: this.fieldOptions.schedules,
readonly: function(state) {
return !obj.isNew(state) || state.jsjobtype == 's';
},
deps: ['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 's') {
return { jsjobscname: null };
}
}
},
];
}
validate(state, setError) {
if (state.jsjobtype == 's' ) {
if (isEmptyString(state.jsprtype)) {
setError('jsprtype', gettext('Job Type cannot be empty.'));
return true;
} else {
setError('jsprtype', null);
}
if (state.jsprtype == 'PLSQL_BLOCK' && isEmptyString(state.jsprcode)) {
setError('jsprcode', gettext('Code cannot be empty.'));
return true;
} else {
setError('jsprcode', null);
}
if (state.jsprtype == 'STORED_PROCEDURE' && isEmptyString(state.jsprproc)) {
setError('jsprproc', gettext('Procedure cannot be empty.'));
return true;
} else {
setError('jsprproc', null);
}
if (isEmptyString(state.jsscstart) && isEmptyString(state.jsscfreq) &&
isEmptyString(state.jsscmonths) && isEmptyString(state.jsscweekdays) &&
isEmptyString(state.jsscmonthdays) && isEmptyString(state.jsschours) &&
isEmptyString(state.jsscminutes) && isEmptyString(state.jsscdate)) {
setError('jsscstart', gettext('Either Start time or Repeat interval must be specified.'));
return true;
} else {
setError('jsscstart', null);
}
if (!isEmptyString(state.jsscend)) {
let start_time = state.jsscstart,
end_time = state.jsscend,
start_time_js = start_time.split(' '),
end_time_js = end_time.split(' ');
start_time_js = moment(start_time_js[0] + ' ' + start_time_js[1]);
end_time_js = moment(end_time_js[0] + ' ' + end_time_js[1]);
if(end_time_js.isBefore(start_time_js)) {
setError('jsscend', gettext('Start time must be less than end time'));
return true;
} else {
setError('jsscend', null);
}
} else {
state.jsscend = null;
}
} else if (state.jsjobtype == 'p') {
if (isEmptyString(state.jsjobprname)) {
setError('jsjobprname', gettext('Pre-Defined program name cannot be empty.'));
return true;
} else {
setError('jsjobprname', null);
}
if (isEmptyString(state.jsjobscname)) {
setError('jsjobscname', gettext('Pre-Defined schedule name cannot be empty.'));
return true;
} else {
setError('jsjobscname', null);
}
}
}
}


@ -0,0 +1,61 @@
{% if display_comments %}
-- DBMS Job: '{{ job_name }}'
-- EXEC dbms_scheduler.DROP_JOB('{{ job_name }}');
{% endif %}
{% if internal_job_type is defined and internal_job_type == 's' %}
EXEC dbms_scheduler.CREATE_JOB(
job_name => {{ job_name|qtLiteral(conn) }},
job_type => {{ job_type|qtLiteral(conn) }},
job_action => {{ job_action|qtLiteral(conn) }},
repeat_interval => {{ repeat_interval|qtLiteral(conn) }}{% if start_date or end_date or number_of_arguments or enabled or comments %},{% endif %}
{% if start_date %}
start_date => {{ start_date|qtLiteral(conn) }}{% if end_date or number_of_arguments or enabled or comments %},{% endif %}
{% endif %}
{% if end_date %}
end_date => {{ end_date|qtLiteral(conn) }}{% if number_of_arguments or enabled or comments %},{% endif %}
{% endif %}
{% if number_of_arguments %}
number_of_arguments => {{ number_of_arguments }}{% if enabled or comments %},{% endif %}
{% endif %}
{% if enabled %}
enabled => {{ enabled }}{% if comments %},{% endif %}
{% endif %}
{% if comments %}
comments => {{ comments|qtLiteral(conn) }}
{% endif %}
);
{% elif internal_job_type is defined and internal_job_type == 'p' %}
EXEC dbms_scheduler.CREATE_JOB(
job_name => {{ job_name|qtLiteral(conn) }},
program_name => {{ program_name|qtLiteral(conn) }},
schedule_name => {{ schedule_name|qtLiteral(conn) }}{% if enabled or comments %},{% endif %}
{% if enabled %}
enabled => {{ enabled }}{% if comments %},{% endif %}
{% endif %}
{% if comments %}
comments => {{ comments|qtLiteral(conn) }}
{% endif %}
);
{% endif %}
{% for args_list_item in arguments %}
EXEC dbms_scheduler.SET_JOB_ARGUMENT_VALUE(
job_name => {{ job_name|qtLiteral(conn) }},
argument_name => {{ args_list_item.argname|qtLiteral(conn) }},
{% if args_list_item.argval is defined and args_list_item.argval != '' %}
argument_value => {{ args_list_item.argval|qtLiteral(conn) }}
{% elif args_list_item.argdefval is defined %}
argument_value => {{ args_list_item.argdefval|qtLiteral(conn) }}
{% endif %}
);
{% endfor %}


@ -0,0 +1,3 @@
EXEC dbms_scheduler.DROP_JOB(
{{ job_name|qtLiteral(conn) }}
);


@ -0,0 +1,3 @@
SELECT value
FROM dba_scheduler_job_args
WHERE job_name = {{ job_name|qtLiteral(conn) }} AND argument_name = {{ arg_name|qtLiteral(conn) }}


@ -0,0 +1,5 @@
SELECT
dsj_job_id AS jsjobid, dsj_job_name AS jsjobname,
dsj_enabled AS jsjobenabled
FROM sys.scheduler_0400_job
WHERE dsj_job_name={{ job_name|qtLiteral(conn) }}


@ -0,0 +1,4 @@
SELECT
dsj_job_id as jsjobid, dsj_job_name as jsjobname,
dsj_enabled as jsjobenabled, dsj_comments as jsjobdesc
FROM sys.scheduler_0400_job;


@ -0,0 +1,14 @@
SELECT
job.dsj_job_id as jsjobid, job_name as jsjobname, program_name as jsjobprname, job_type as jsprtype,
CASE WHEN job_type = 'PLSQL_BLOCK' THEN job_action ELSE '' END AS jsprcode,
CASE WHEN job_type = 'STORED_PROCEDURE' THEN job_action ELSE '' END AS jsprproc,
job_action, number_of_arguments as jsprnoofargs, schedule_name as jsjobscname,
start_date as jsscstart, end_date as jsscend, repeat_interval as jsscrepeatint,
enabled as jsjobenabled, comments as jsjobdesc,
run_count as jsjobruncount, failure_count as jsjobfailurecount,
job.dsj_program_id as program_id, job.dsj_schedule_id as schedule_id
FROM sys.dba_scheduler_jobs jobv
LEFT JOIN sys.scheduler_0400_job job ON jobv.job_name = job.dsj_job_name
{% if jsjobid %}
WHERE job.dsj_job_id={{jsjobid}}::oid
{% endif %}


@ -0,0 +1,3 @@
EXEC dbms_scheduler.RUN_JOB(
{{ job_name|qtLiteral(conn) }}
);


@ -0,0 +1,12 @@
{% for chval in changed_value %}
EXEC dbms_scheduler.SET_JOB_ARGUMENT_VALUE(
job_name => {{ job_name|qtLiteral(conn) }},
argument_name => {{ chval.argname|qtLiteral(conn) }},
{% if chval.argval is defined and chval.argval != '' %}
argument_value => {{ chval.argval|qtLiteral(conn) }}
{% elif chval.argdefval is defined %}
argument_value => {{ chval.argdefval|qtLiteral(conn) }}
{% endif %}
);
{% endfor %}


@ -0,0 +1,429 @@
{
"dbms_create_job": [
{
"name": "Create job when type is self contained and PLSQL_BLOCK",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"test_data": {
"jsjobid": null,
"jsjobname": "job_self_with_psql",
"jsjobenabled": true,
"jsjobdesc": "This is a self contained psql job.",
"jsjobtype": "s",
"jsprtype": "PLSQL_BLOCK",
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprproc": null,
"jsprcode": "BEGIN PERFORM 1; END;",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2054-02-28 00:00:00 +05:30",
"jsscfreq": "YEARLY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"],
"jsjobprname": "",
"jsjobscname": ""
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create job when type is self contained and STORED_PROCEDURE without arguments",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"test_data": {
"jsjobid": null,
"jsjobname": "job_self_with_proc_noargs",
"jsjobenabled": true,
"jsjobdesc": "This is a self contained stored procedure job with no args.",
"jsjobtype": "s",
"jsprtype": "STORED_PROCEDURE",
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprproc": "public.test_proc_without_args",
"jsprcode": null,
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2054-02-28 00:00:00 +05:30",
"jsscfreq": "YEARLY",
"jsscdate": "20250113",
"jsscweekdays": [],
"jsscmonthdays": [],
"jsscmonths": [],
"jsschours": [],
"jsscminutes": [],
"jsjobprname": "",
"jsjobscname": ""
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create job when type is pre-defined and program is PLSQL",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"test_data": {
"jsjobid": null,
"jsjobname": "job_pre_with_psql",
"jsjobenabled": true,
"jsjobdesc": "This is a pre-defined job with PLSQL program.",
"jsjobtype": "p",
"jsjobprname": "prg_with_psql",
"jsjobscname": "yearly_sch",
"jsprtype": null,
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprproc": null,
"jsprcode": null,
"jsscstart": null,
"jsscend": null,
"jsscfreq": null,
"jsscdate": null,
"jsscweekdays": null,
"jsscmonthdays": null,
"jsscmonths": null,
"jsschours": null,
"jsscminutes": null
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create job when type is pre-defined and program is Stored Procedure without args",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"test_data": {
"jsjobid": null,
"jsjobname": "job_pre_with_proc_noargs",
"jsjobenabled": true,
"jsjobdesc": "This is a pre-defined job with Stored Procedure without args",
"jsjobtype": "p",
"jsjobprname": "prg_with_proc_noargs",
"jsjobscname": "yearly_sch",
"jsprtype": null,
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprproc": null,
"jsprcode": null,
"jsscstart": null,
"jsscend": null,
"jsscfreq": null,
"jsscdate": null,
"jsscweekdays": null,
"jsscmonthdays": null,
"jsscmonths": null,
"jsschours": null,
"jsscminutes": null
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create job when type is pre-defined and program is Stored Procedure with args",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"test_data": {
"jsjobid": null,
"jsjobname": "job_pre_with_proc_args",
"jsjobenabled": true,
"jsjobdesc": "This is a pre-defined job with program is Stored Procedure with args.",
"jsjobtype": "p",
"jsjobprname": "prg_with_proc_args",
"jsjobscname": "yearly_sch",
"jsprtype": null,
"jsprnoofargs": [],
"jsprarguments": [],
"jsprproc": null,
"jsprcode": null,
"jsscstart": null,
"jsscend": null,
"jsscfreq": null,
"jsscdate": null,
"jsscweekdays": null,
"jsscmonthdays": null,
"jsscmonths": null,
"jsschours": null,
"jsscminutes": null
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create job: while server is down",
"url": "/browser/dbms_job/obj/",
"is_positive_test": false,
"test_data": {
"jsjobid": null,
"jsjobname": "job_with_psql",
"jsjobenabled": true,
"jsjobdesc": "This is a self contained psql job.",
"jsjobtype": "s",
"jsprtype": "PLSQL_BLOCK",
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprproc": null,
"jsprcode": "BEGIN PERFORM 1; END;",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2054-02-28 00:00:00 +05:30",
"jsscfreq": "YEARLY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"],
"jsjobprname": "",
"jsjobscname": ""
},
"mocking_required": true,
"mock_data": {
"function_name": "pgadmin.utils.driver.psycopg3.connection.Connection.execute_scalar",
"return_value": "[(False,'Mocked Internal Server Error')]"
},
"expected_data": {
"status_code": 500,
"error_msg": "Mocked Internal Server Error",
"test_result_data": {}
}
}
],
"dbms_update_job": [
{
"name": "Set job argument value",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"test_data": {
"jsjobname": "job_with_update_args",
"jsprarguments": {
"changed": [{"argid":0,"argtype":"bigint","argmode":"IN","argname":"salary","argdefval":"10000","argval":"5000"}]
}
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
}
],
"dbms_delete_job": [
{
"name": "Delete job: With existing DBMS job.",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
},
{
"name": "Delete multiple jobs: With existing DBMS jobs.",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": true
}
],
"dbms_get_job": [
{
"name": "Get job: With existing DBMS job.",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
},
{
"name": "Get jobs: With multiple existing DBMS jobs.",
"url": "/browser/dbms_job/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": true
},
{
"name": "Get job: while server down.",
"url": "/browser/dbms_job/obj/",
"is_positive_test": false,
"inventory_data": {},
"test_data": {},
"mocking_required": true,
"mock_data": {
"function_name": "pgadmin.utils.driver.psycopg3.connection.Connection.execute_dict",
"return_value": "(False,'Mocked Internal Server Error')"
},
"expected_data": {
"status_code": 500,
"error_msg": "Mocked Internal Server Error",
"test_result_data": {}
},
"is_list": false
}
],
"dbms_msql_job": [
{
"name": "Get job msql: For existing PLSQL job.",
"url": "/browser/dbms_job/msql/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {
"jsjobid": null,
"jsjobname": "job_self_with_psql",
"jsjobenabled": true,
"jsjobdesc": "This is a self contained psql job.",
"jsjobtype": "s",
"jsprtype": "PLSQL_BLOCK",
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprproc": null,
"jsprcode": "BEGIN PERFORM 1; END;",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2054-02-28 00:00:00 +05:30",
"jsscfreq": "YEARLY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"],
"jsjobprname": "",
"jsjobscname": ""
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
},
{
"name": "Get job msql: For existing STORED_PROCEDURE job.",
"url": "/browser/dbms_job/msql/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {
"jsjobid": null,
"jsjobname": "job_self_with_proc_noargs",
"jsjobenabled": true,
"jsjobdesc": "This is a self contained stored procedure job with no args.",
"jsjobtype": "s",
"jsprtype": "STORED_PROCEDURE",
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprproc": "public.test_proc_without_args",
"jsprcode": null,
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2054-02-28 00:00:00 +05:30",
"jsscfreq": "YEARLY",
"jsscdate": "20250113",
"jsscweekdays": [],
"jsscmonthdays": [],
"jsscmonths": [],
"jsschours": [],
"jsscminutes": [],
"jsjobprname": "",
"jsjobscname": ""
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
}
],
"dbms_enable_job": [
{
"name": "Enable existing job",
"url": "/browser/dbms_job/enable_disable/",
"is_positive_test": true,
"test_data": {
"is_enable_job": true
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
}
],
"dbms_disable_job": [
{
"name": "Disable existing job",
"url": "/browser/dbms_job/enable_disable/",
"is_positive_test": true,
"test_data": {
"is_enable_job": false
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
}
]
}
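
Each top-level key in this fixture (dbms_create_job, dbms_update_job, dbms_delete_job, ...) is consumed by utils.generate_scenarios() in the test classes that follow, so every entry in a list becomes one generated test case whose JSON fields end up as attributes on the test instance. The snippet below is only a rough sketch of that expansion written against this fixture's shape; the real helper lives in regression/python_test_utils/test_utils.py and may differ in detail.

# Rough sketch (not pgAdmin's own helper) of how a scenario key from this
# JSON could be expanded into (name, attributes) pairs for the generated
# test cases used below.
import json

def expand_scenarios(key, test_cases):
    scenarios = []
    for case in test_cases.get(key, []):
        # Every JSON field (url, test_data, mock_data, expected_data, ...)
        # becomes an attribute of the generated test instance.
        scenarios.append((case['name'], dict(case)))
    return scenarios

with open('dbms_jobs_test_data.json') as data_file:
    cases = json.load(data_file)

for name, attrs in expand_scenarios('dbms_create_job', cases):
    print(name, attrs['expected_data']['status_code'])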

View File

@ -0,0 +1,94 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import os
import json
from unittest.mock import patch
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_jobs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSAddJobTestCase(BaseTestGenerator):
"""This class will test the add job in the DBMS Job API"""
scenarios = utils.generate_scenarios("dbms_create_job",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
# Create job schedule
job_scheduler_utils.create_dbms_schedule(self, 'yearly_sch')
job_scheduler_utils.create_dbms_program(self, 'prg_with_psql')
job_scheduler_utils.create_dbms_program(
self,'prg_with_proc_noargs', with_proc=True,
proc_name='public.test_proc_without_args()')
job_scheduler_utils.create_dbms_program(
self,'prg_with_proc_args', with_proc=True,
proc_name='public.test_proc_with_args(IN salary bigint DEFAULT '
'10000, IN name character varying)')
def runTest(self):
""" This function will add DBMS Job under test database. """
if self.is_positive_test:
response = job_scheduler_utils.api_create(self)
# Assert response
utils.assert_status_code(self, response)
# Verify in backend
response_data = json.loads(response.data)
self.jobs_id = response_data['node']['_id']
jobs_name = response_data['node']['label']
is_present = job_scheduler_utils.verify_dbms_job(
self, jobs_name)
self.assertTrue(
is_present,"DBMS job was not created successfully.")
else:
if self.mocking_required:
with patch(self.mock_data["function_name"],
side_effect=eval(self.mock_data["return_value"])):
response = job_scheduler_utils.api_create(self)
# Assert response
utils.assert_status_code(self, response)
utils.assert_error_message(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)
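
In the negative scenario above, Connection.execute_scalar is patched with side_effect=eval(self.mock_data["return_value"]). Because the evaluated string is a list, unittest.mock hands out its elements one call at a time, so the first execute_scalar call returns (False, 'Mocked Internal Server Error') and the view responds with a 500. A minimal, standalone illustration of that mechanism, using a stand-in function instead of the pgAdmin driver:

# Standalone illustration of the side_effect-as-list pattern used above;
# target() is a stand-in function, not the pgAdmin psycopg3 driver.
from unittest.mock import patch

def target():
    return (True, 'real result')

with patch(f'{__name__}.target',
           side_effect=eval("[(False,'Mocked Internal Server Error')]")):
    status, msg = target()  # the first call yields the first list element
    assert status is False and msg == 'Mocked Internal Server Error'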

View File

@ -0,0 +1,98 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_jobs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSDeleteJobTestCase(BaseTestGenerator):
"""This class will test the delete job in the DBMS job API"""
scenarios = utils.generate_scenarios("dbms_delete_job",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.job_name = "test_job_delete%s" % str(uuid.uuid4())[1:8]
self.job_id = job_scheduler_utils.create_dbms_job(
self, self.job_name)
# multiple jobs
if self.is_list:
self.job_name2 = "test_job_delete%s" % str(uuid.uuid4())[1:8]
self.job_id_2 = job_scheduler_utils.create_dbms_job(
self, self.job_name2)
def runTest(self):
"""
This function will test delete DBMS job under test database.
"""
if self.is_list:
self.data['ids'] = [self.job_id, self.job_id_2]
response = job_scheduler_utils.api_delete(self, '')
# Assert response
utils.assert_status_code(self, response)
is_present = job_scheduler_utils.verify_dbms_job(
self, self.job_name)
self.assertFalse(
is_present, "DBMS job was not deleted successfully")
is_present = job_scheduler_utils.verify_dbms_job(
self, self.job_name2)
self.assertFalse(
is_present, "DBMS job was not deleted successfully")
else:
response = job_scheduler_utils.api_delete(self)
# Assert response
utils.assert_status_code(self, response)
is_present = job_scheduler_utils.verify_dbms_job(
self, self.job_name)
self.assertFalse(
is_present, "DBMS job was not deleted successfully")
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,71 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_jobs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSDisableJobTestCase(BaseTestGenerator):
"""This class will test the add job in the DBMS job API"""
scenarios = utils.generate_scenarios("dbms_disable_job",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.job_name = "test_job_disable%s" % str(uuid.uuid4())[1:8]
self.data['job_name'] = self.job_name
self.job_id = job_scheduler_utils.create_dbms_job(
self, self.job_name)
def runTest(self):
""" This function will test DBMS job under test database."""
response = job_scheduler_utils.api_put(self, self.job_id)
# Assert response
utils.assert_status_code(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.delete_dbms_job(self, self.job_name)
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,71 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_jobs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSEnableJobTestCase(BaseTestGenerator):
"""This class will test the enable job in the DBMS job API"""
scenarios = utils.generate_scenarios("dbms_enable_job",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.job_name = "test_job_enable%s" % str(uuid.uuid4())[1:8]
self.data['job_name'] = self.job_name
self.job_id = job_scheduler_utils.create_dbms_job(
self, self.job_name, False)
def runTest(self):
""" This function will test DBMS job under test database."""
response = job_scheduler_utils.api_put(self, self.job_id)
# Assert response
utils.assert_status_code(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.delete_dbms_job(self, self.job_name)
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,92 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from unittest.mock import patch
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_jobs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSGetJobTestCase(BaseTestGenerator):
"""This class will test the add job in the DBMS job API"""
scenarios = utils.generate_scenarios("dbms_get_job",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.job_name = "test_job_get%s" % str(uuid.uuid4())[1:8]
self.job_id = job_scheduler_utils.create_dbms_job(
self, self.job_name)
# multiple jobs
if self.is_list:
self.job_name2 = "test_job_get%s" % str(uuid.uuid4())[1:8]
self.job_id_2 = job_scheduler_utils.create_dbms_job(
self, self.job_name2,)
def runTest(self):
""" This function will test DBMS job under test database."""
if self.is_positive_test:
if self.is_list:
response = job_scheduler_utils.api_get(self, '')
else:
response = job_scheduler_utils.api_get(self)
# Assert response
utils.assert_status_code(self, response)
else:
if self.mocking_required:
with patch(self.mock_data["function_name"],
side_effect=[eval(self.mock_data["return_value"])]):
response = job_scheduler_utils.api_get(self)
# Assert response
utils.assert_status_code(self, response)
utils.assert_error_message(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.delete_dbms_job(self, self.job_name)
if self.is_list:
job_scheduler_utils.delete_dbms_job(self, self.job_name2)
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,65 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_jobs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSGetMSQLJobTestCase(BaseTestGenerator):
"""This class will test the add job in the DBMS job API"""
scenarios = utils.generate_scenarios("dbms_msql_job",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
def runTest(self):
""" This function will add DBMS job under test database. """
url_encode_data = self.data
response = job_scheduler_utils.api_get_msql(self, url_encode_data)
# Assert response
utils.assert_status_code(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,73 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_jobs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSUpdateJobTestCase(BaseTestGenerator):
"""This class will test the add job in the DBMS Job API"""
scenarios = utils.generate_scenarios("dbms_update_job",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
# Create job schedule
job_scheduler_utils.create_dbms_schedule(self, 'yearly_sch')
job_scheduler_utils.create_dbms_program(
self,'prg_with_proc_args', with_proc=True,
proc_name='public.test_proc_with_args()',
define_args=True)
self.job_id = job_scheduler_utils.create_dbms_job(
self, self.data['jsjobname'], True,
'prg_with_proc_args','yearly_sch')
def runTest(self):
""" This function will update DBMS Job under test database. """
response = job_scheduler_utils.api_put(self, self.job_id)
# Assert response
utils.assert_status_code(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,574 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
""" Implements DBMS Program objects Node."""
import json
from functools import wraps
from flask import render_template, request, jsonify
from flask_babel import gettext
from pgadmin.browser.collection import CollectionNodeModule
from pgadmin.browser.server_groups.servers import databases
from pgadmin.browser.utils import PGChildNodeView
from pgadmin.utils.ajax import make_json_response, gone, \
make_response as ajax_response, internal_server_error
from pgadmin.utils.driver import get_driver
from config import PG_DEFAULT_DRIVER
from pgadmin.utils.constants import DBMS_JOB_SCHEDULER_ID
from pgadmin.browser.server_groups.servers.databases.schemas.functions.utils \
import format_arguments_from_db
from pgadmin.browser.server_groups.servers.databases.dbms_job_scheduler.utils \
import get_formatted_program_args
class DBMSProgramModule(CollectionNodeModule):
"""
class DBMSProgramModule(CollectionNodeModule)
A module class for DBMS Program objects node derived
from CollectionNodeModule.
Methods:
-------
* get_nodes(gid, sid, did)
- Method is used to generate the browser collection node.
* script_load()
- Load the module script for DBMS Program objects, when any of
the server nodes is initialized.
"""
_NODE_TYPE = 'dbms_program'
_COLLECTION_LABEL = gettext("DBMS Programs")
@property
def collection_icon(self):
"""
icon to be displayed for the browser collection node
"""
return 'icon-coll-pga_jobstep'
@property
def node_icon(self):
"""
icon to be displayed for the browser nodes
"""
return 'icon-pga_jobstep'
def get_nodes(self, gid, sid, did, jsid):
"""
Generate the collection node
"""
if self.show_node:
yield self.generate_browser_collection_node(did)
@property
def node_inode(self):
"""
Override this property to make the node a leaf node.
Returns: False as this is the leaf node
"""
return False
@property
def script_load(self):
"""
Load the module script for the server, when any of the database nodes
is initialized.
"""
return databases.DatabaseModule.node_type
@property
def module_use_template_javascript(self):
"""
Returns whether Jinja2 template is used for generating the javascript
module.
"""
return False
blueprint = DBMSProgramModule(__name__)
class DBMSProgramView(PGChildNodeView):
"""
class DBMSProgramView(PGChildNodeView)
A view class for DBMSProgram node derived from PGChildNodeView.
This class is responsible for all the stuff related to view like
updating program node, showing properties, showing sql in sql pane.
Methods:
-------
* __init__(**kwargs)
- Method is used to initialize the DBMSProgramView, and its base view.
* check_precondition()
- This function will behave as a decorator which will check the
database connection before running the view. It will also attach
the manager, conn & template_path properties to self
* list()
- This function is used to list all the program nodes within that
collection.
* nodes()
- This function is used to create all the child nodes within that
collection. Here it will create all the program nodes.
* properties(gid, sid, did, jsid, jsprid)
- This function will show the properties of the selected program node
* create(gid, sid, did, jsid, jsprid)
- This function will create the new program object
* msql(gid, sid, did, jsid, jsprid)
- This function is used to return modified SQL for the
selected program node
* sql(gid, sid, did, jsid, jsprid)
- Returns the reverse engineered SQL for the sql panel
* delete(gid, sid, did, jsid, jsprid)
- Drops job program
"""
node_type = blueprint.node_type
BASE_TEMPLATE_PATH = 'dbms_programs/ppas/#{0}#'
parent_ids = [
{'type': 'int', 'id': 'gid'},
{'type': 'int', 'id': 'sid'},
{'type': 'int', 'id': 'did'},
{'type': 'int', 'id': 'jsid'}
]
ids = [
{'type': 'int', 'id': 'jsprid'}
]
operations = dict({
'obj': [
{'get': 'properties', 'delete': 'delete'},
{'get': 'list', 'post': 'create', 'delete': 'delete'}
],
'nodes': [{'get': 'nodes'}, {'get': 'nodes'}],
'msql': [{'get': 'msql'}, {'get': 'msql'}],
'sql': [{'get': 'sql'}],
'get_procedures': [{}, {'get': 'get_procedures'}],
'enable_disable': [{'put': 'enable_disable'}],
})
def __init__(self, **kwargs):
self.conn = None
self.template_path = None
self.manager = None
super().__init__(**kwargs)
def check_precondition(f):
"""
This function will behave as a decorator which will check the
database connection before running view. It will also attach
manager, conn & template_path properties to self
"""
@wraps(f)
def wrap(*args, **kwargs):
# Here args[0] will hold self & kwargs will hold gid,sid,did
self = args[0]
self.driver = get_driver(PG_DEFAULT_DRIVER)
self.manager = self.driver.connection_manager(kwargs['sid'])
self.conn = self.manager.connection(did=kwargs['did'])
# Set the template path for the SQL scripts
self.template_path = self.BASE_TEMPLATE_PATH.format(
self.manager.version)
return f(*args, **kwargs)
return wrap
@check_precondition
def list(self, gid, sid, did, jsid):
"""
This function is used to list all the program nodes within
that collection.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
"""
sql = render_template(
"/".join([self.template_path, self._PROPERTIES_SQL]))
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
return ajax_response(
response=res['rows'],
status=200
)
@check_precondition
def nodes(self, gid, sid, did, jsid, jsprid=None):
"""
This function is used to create all the child nodes within
the collection. Here it will create all the program nodes.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
"""
res = []
try:
sql = render_template(
"/".join([self.template_path, self._NODES_SQL]))
status, result = self.conn.execute_2darray(sql)
if not status:
return internal_server_error(errormsg=result)
if jsprid is not None:
if len(result['rows']) == 0:
return gone(
errormsg=gettext("Could not find the specified "
"program."))
row = result['rows'][0]
return make_json_response(
data=self.blueprint.generate_browser_node(
row['jsprid'],
DBMS_JOB_SCHEDULER_ID,
row['jsprname'],
is_enabled=row['jsprenabled'],
icon="icon-pga_jobstep" if row['jsprenabled'] else
"icon-pga_jobstep-disabled",
description=row['jsprdesc']
)
)
for row in result['rows']:
res.append(
self.blueprint.generate_browser_node(
row['jsprid'],
DBMS_JOB_SCHEDULER_ID,
row['jsprname'],
is_enabled=row['jsprenabled'],
icon="icon-pga_jobstep" if row['jsprenabled'] else
"icon-pga_jobstep-disabled",
description=row['jsprdesc']
)
)
return make_json_response(
data=res,
status=200
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def properties(self, gid, sid, did, jsid, jsprid):
"""
This function will show the properties of the selected program node.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
jsprid: Job program ID
"""
try:
sql = render_template(
"/".join([self.template_path, self._PROPERTIES_SQL]),
jsprid=jsprid
)
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
if len(res['rows']) == 0:
return gone(
errormsg=gettext("Could not find the specified program.")
)
data = res['rows'][0]
# Get the formatted program args
get_formatted_program_args(self.template_path, self.conn, data)
return ajax_response(
response=data,
status=200
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def create(self, gid, sid, did, jsid):
"""
This function will create the new program object.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
"""
data = json.loads(request.data)
try:
sql = render_template(
"/".join([self.template_path, self._CREATE_SQL]),
program_name=data['jsprname'],
program_type=data['jsprtype'],
program_action=data['jsprproc']
if data['jsprtype'] == 'STORED_PROCEDURE' else
data['jsprcode'],
number_of_arguments=data['jsprnoofargs'],
enabled=data['jsprenabled'],
comments=data['jsprdesc'],
arguments=data['jsprarguments'],
conn=self.conn
)
status, res = self.conn.execute_void('BEGIN')
if not status:
return internal_server_error(errormsg=res)
status, res = self.conn.execute_scalar(sql)
if not status:
if self.conn.connected():
self.conn.execute_void('END')
return internal_server_error(errormsg=res)
self.conn.execute_void('END')
# Get the newly created program id
sql = render_template(
"/".join([self.template_path, 'get_program_id.sql']),
jsprname=data['jsprname'], conn=self.conn
)
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
if len(res['rows']) == 0:
return gone(
errormsg=gettext("Job program creation failed.")
)
row = res['rows'][0]
return jsonify(
node=self.blueprint.generate_browser_node(
row['jsprid'],
DBMS_JOB_SCHEDULER_ID,
row['jsprname'],
is_enabled=row['jsprenabled'],
icon="icon-pga_jobstep" if row['jsprenabled'] else
"icon-pga_jobstep-disabled"
)
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def delete(self, gid, sid, did, jsid, jsprid=None):
"""Delete the Job program."""
if jsprid is None:
data = request.form if request.form else json.loads(
request.data
)
else:
data = {'ids': [jsprid]}
try:
for jsprid in data['ids']:
sql = render_template(
"/".join([self.template_path, self._PROPERTIES_SQL]),
jsprid=jsprid
)
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
jsprname = res['rows'][0]['jsprname']
status, res = self.conn.execute_void(
render_template(
"/".join([self.template_path, self._DELETE_SQL]),
program_name=jsprname, conn=self.conn
)
)
if not status:
return internal_server_error(errormsg=res)
return make_json_response(success=1)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def msql(self, gid, sid, did, jsid, jsprid=None):
"""
This function is used to return modified SQL for the
selected program node.
Args:
gid: Server Group ID
sid: Server ID
jsid: Job Scheduler ID
jsprid: Job program ID (optional)
"""
data = {}
for k, v in request.args.items():
try:
# comments should be taken as is because if user enters a
# json comment it is parsed by loads which should not happen
if k in ('jsprdesc',):
data[k] = v
else:
data[k] = json.loads(v)
except ValueError:
data[k] = v
try:
sql = render_template(
"/".join([self.template_path, self._CREATE_SQL]),
program_name=data['jsprname'],
program_type=data['jsprtype'],
program_action=data['jsprproc']
if data['jsprtype'] == 'STORED_PROCEDURE' else
data['jsprcode'],
number_of_arguments=data['jsprnoofargs'],
enabled=data['jsprenabled'],
comments=data['jsprdesc'],
arguments=data['jsprarguments'],
conn=self.conn
)
return make_json_response(
data=sql,
status=200
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def sql(self, gid, sid, did, jsid, jsprid):
"""
This function will generate sql for the sql panel
"""
try:
SQL = render_template("/".join(
[self.template_path, self._PROPERTIES_SQL]
), jsprid=jsprid)
status, res = self.conn.execute_dict(SQL)
if not status:
return internal_server_error(errormsg=res)
if len(res['rows']) == 0:
return gone(
gettext("Could not find the DBMS Schedule.")
)
data = res['rows'][0]
# Get the formatted program args
get_formatted_program_args(self.template_path, self.conn, data)
SQL = render_template(
"/".join([self.template_path, self._CREATE_SQL]),
display_comments=True,
program_name=data['jsprname'],
program_type=data['jsprtype'],
program_action=data['jsprproc']
if data['jsprtype'] == 'STORED_PROCEDURE' else
data['jsprcode'],
number_of_arguments=data['jsprnoofargs'],
enabled=data['jsprenabled'],
comments=data['jsprdesc'],
arguments=data['jsprarguments'] if 'jsprarguments' in data
else [],
conn=self.conn
)
return ajax_response(response=SQL)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def get_procedures(self, gid, sid, did, jsid=None):
"""
This function will return procedure list
:param gid: group id
:param sid: server id
:param did: database id
:return:
"""
res = []
sql = render_template("/".join([self.template_path,
'get_procedures.sql']),
datlastsysoid=self._DATABASE_LAST_SYSTEM_OID)
status, rset = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=rset)
for row in rset['rows']:
# Get formatted Arguments
frmtd_params, _ = format_arguments_from_db(
self.template_path, self.conn, row)
res.append({'label': row['proc_name'],
'value': row['proc_name'],
'no_of_args': row['number_of_arguments'],
'arguments': frmtd_params['arguments']
})
return make_json_response(
data=res,
status=200
)
@check_precondition
def enable_disable(self, gid, sid, did, jsid, jsprid=None):
"""
This function is used to enable/disable program.
"""
data = request.form if request.form else json.loads(
request.data
)
status, res = self.conn.execute_void(
render_template(
"/".join([self.template_path, 'enable_disable.sql']),
name=data['program_name'],
is_enable=data['is_enable_program'], conn=self.conn
)
)
if not status:
return internal_server_error(errormsg=res)
return make_json_response(
success=1,
info=gettext("Program enabled") if data['is_enable_program'] else
gettext('Program disabled'),
data={
'sid': sid,
'did': did,
'jsid': jsid,
'jsprid': jsprid
}
)
DBMSProgramView.register_node_view(blueprint)
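
In the msql() handler above, every query-string value is run through json.loads so that booleans, numbers and arrays arrive with their native Python types, while jsprdesc (and any value that fails to parse) is kept as a raw string, so a comment that merely looks like JSON is never re-interpreted. A small standalone sketch of that decoding rule, with a plain dict standing in for Flask's request.args:

# Standalone sketch of the query-argument decoding used by msql() above;
# a plain dict stands in for Flask's request.args.
import json

raw_args = {
    'jsprname': 'prg_with_psql',          # plain string, json.loads fails -> kept as-is
    'jsprenabled': 'true',                # becomes the boolean True
    'jsprnoofargs': '0',                  # becomes the integer 0
    'jsprarguments': '[]',                # becomes an empty list
    'jsprdesc': '{"looks": "like json"}', # deliberately kept as a raw string
}

data = {}
for k, v in raw_args.items():
    try:
        data[k] = v if k in ('jsprdesc',) else json.loads(v)
    except ValueError:
        data[k] = v

assert data['jsprenabled'] is True
assert data['jsprarguments'] == []
assert isinstance(data['jsprdesc'], str)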

View File

@ -0,0 +1,189 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2024, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import DBMSProgramSchema from './dbms_program.ui';
import { getNodeAjaxOptions } from '../../../../../../../static/js/node_ajax';
import getApiInstance from '../../../../../../../../static/js/api_instance';
define('pgadmin.node.dbms_program', [
'sources/gettext', 'sources/url_for', 'sources/pgadmin',
'pgadmin.browser', 'pgadmin.browser.collection',
], function(gettext, url_for, pgAdmin, pgBrowser) {
if (!pgBrowser.Nodes['coll-dbms_program']) {
pgBrowser.Nodes['coll-dbms_program'] =
pgBrowser.Collection.extend({
node: 'dbms_program',
label: gettext('DBMS Programs'),
type: 'coll-dbms_program',
columns: ['jsprname', 'jsprtype', 'jsprenabled', 'jsprdesc'],
hasSQL: false,
hasDepends: false,
hasStatistics: false,
hasScriptTypes: [],
canDrop: true,
canDropCascade: false,
});
}
if (!pgBrowser.Nodes['dbms_program']) {
pgAdmin.Browser.Nodes['dbms_program'] = pgAdmin.Browser.Node.extend({
parent_type: 'dbms_job_scheduler',
type: 'dbms_program',
label: gettext('DBMS Program'),
node_image: 'icon-pga_jobstep',
epasHelp: true,
epasURL: 'https://www.enterprisedb.com/docs/epas/$VERSION$/epas_compat_bip_guide/03_built-in_packages/15_dbms_scheduler/03_create_program/',
dialogHelp: url_for('help.static', {'filename': 'dbms_program.html'}),
canDrop: true,
hasSQL: true,
hasDepends: false,
hasStatistics: false,
Init: function() {
/* Avoid multiple registration of menus */
if (this.initialized)
return;
this.initialized = true;
pgBrowser.add_menus([{
name: 'create_dbms_program_on_coll', node: 'coll-dbms_program', module: this,
applies: ['object', 'context'], callback: 'show_obj_properties',
category: 'create', priority: 4, label: gettext('DBMS Program...'),
data: {action: 'create'},
}, {
name: 'create_dbms_program', node: 'dbms_program', module: this,
applies: ['object', 'context'], callback: 'show_obj_properties',
category: 'create', priority: 4, label: gettext('DBMS Program...'),
data: {action: 'create'},
}, {
name: 'create_dbms_program', node: 'dbms_job_scheduler', module: this,
applies: ['object', 'context'], callback: 'show_obj_properties',
category: 'create', priority: 4, label: gettext('DBMS Program...'),
data: {action: 'create'},
}, {
name: 'enable_program', node: 'dbms_program', module: this,
applies: ['object', 'context'], callback: 'enable_program',
priority: 4, label: gettext('Enable Program'),
enable : 'is_enabled',data: {
data_disabled: gettext('Program is already enabled.'),
},
}, {
name: 'disable_program', node: 'dbms_program', module: this,
applies: ['object', 'context'], callback: 'disable_program',
priority: 4, label: gettext('Disable Program'),
enable : 'is_disabled',data: {
data_disabled: gettext('Program is already disabled.'),
},
}
]);
},
is_enabled: function(node) {
return !node?.is_enabled;
},
is_disabled: function(node) {
return node?.is_enabled;
},
callbacks: {
enable_program: function(args, notify) {
let input = args || {},
obj = this,
t = pgBrowser.tree,
i = 'item' in input ? input.item : t.selected(),
d = i ? t.itemData(i) : undefined;
if (d) {
notify = notify || _.isUndefined(notify) || _.isNull(notify);
let enable = function() {
let data = d;
getApiInstance().put(
obj.generate_url(i, 'enable_disable', d, true),
{'program_name': data.label, 'is_enable_program': true}
).then(({data: res})=> {
if (res.success == 1) {
pgAdmin.Browser.notifier.success(res.info);
t.removeIcon(i);
data.icon = 'icon-pga_jobstep';
data.is_enabled = true;
t.addIcon(i, {icon: data.icon});
t.updateAndReselectNode(i, data);
}
}).catch(function(error) {
pgAdmin.Browser.notifier.pgRespErrorNotify(error);
t.refresh(i);
});
};
if (notify) {
pgAdmin.Browser.notifier.confirm(
gettext('Enable Program'),
gettext('Are you sure you want to enable the program %s?', d.label),
function() { enable(); },
function() { return true;},
);
} else {
enable();
}
}
return false;
},
disable_program: function(args, notify) {
let input = args || {},
obj = this,
t = pgBrowser.tree,
i = 'item' in input ? input.item : t.selected(),
d = i ? t.itemData(i) : undefined;
if (d) {
notify = notify || _.isUndefined(notify) || _.isNull(notify);
let disable = function() {
let data = d;
getApiInstance().put(
obj.generate_url(i, 'enable_disable', d, true),
{'program_name': data.label, 'is_enable_program': false}
).then(({data: res})=> {
if (res.success == 1) {
pgAdmin.Browser.notifier.success(res.info);
t.removeIcon(i);
data.icon = 'icon-pga_jobstep-disabled';
data.is_enabled = false;
t.addIcon(i, {icon: data.icon});
t.updateAndReselectNode(i, data);
}
}).catch(function(error) {
pgAdmin.Browser.notifier.pgRespErrorNotify(error);
t.refresh(i);
});
};
if (notify) {
pgAdmin.Browser.notifier.confirm(
gettext('Disable Program'),
gettext('Are you sure you want to disable the program %s?', d.label),
function() { disable(); },
function() { return true;},
);
} else {
disable();
}
}
return false;
},
},
getSchema: function(treeNodeInfo, itemNodeData) {
return new DBMSProgramSchema(
{
procedures: ()=>getNodeAjaxOptions('get_procedures', this, treeNodeInfo, itemNodeData),
}
);
},
});
}
return pgBrowser.Nodes['dbms_program'];
});
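
The enable/disable callbacks above PUT a small JSON body ({'program_name': ..., 'is_enable_program': ...}) to the node's enable_disable endpoint, which the Python view shown earlier turns into dbms_scheduler.ENABLE/DISABLE via the enable_disable.sql template. A hypothetical example of that request made outside the browser is sketched below; the host, port and the trailing gid/sid/did/jsid/jsprid segments are placeholders, and a real call would also need an authenticated pgAdmin session cookie and CSRF token.

# Hypothetical sketch of the PUT request issued by the Enable Program menu;
# URL id segments, host and port are placeholders, and authentication
# (session cookie / CSRF token) is omitted for brevity.
import json
import urllib.request

payload = json.dumps({'program_name': 'dbms_prg_with_psql',
                      'is_enable_program': True}).encode('utf-8')
req = urllib.request.Request(
    'http://localhost:5050/browser/dbms_program/enable_disable/1/1/12345/67890/1',
    data=payload, method='PUT',
    headers={'Content-Type': 'application/json'})
# urllib.request.urlopen(req)  # would return {"success": 1, "info": "Program enabled", ...}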

View File

@ -0,0 +1,76 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2024, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import gettext from 'sources/gettext';
import BaseUISchema from 'sources/SchemaView/base_schema.ui';
import { isEmptyString } from 'sources/validators';
import { getActionSchema } from '../../../static/js/dbms_job_scheduler_common.ui';
export default class DBMSProgramSchema extends BaseUISchema {
constructor(fieldOptions={}) {
super({
jsprid: null,
jsprname: '',
jsprtype: 'PLSQL_BLOCK',
jsprenabled: true,
jsprnoofargs: 0,
jsprarguments: [],
jsprdesc: '',
jsprproc: null,
jsprcode: null,
});
this.fieldOptions = {
procedures: [],
...fieldOptions,
};
}
get idAttribute() {
return 'jsprid';
}
get baseFields() {
let obj = this;
return [
{
id: 'jsprid', label: gettext('ID'), type: 'int', mode: ['properties'],
readonly: function(state) {return !obj.isNew(state); },
}, {
id: 'jsprname', label: gettext('Name'), cell: 'text',
type: 'text', noEmpty: true,
readonly: function(state) {return !obj.isNew(state); },
}, {
id: 'jsprenabled', label: gettext('Enabled?'), type: 'switch', cell: 'switch',
readonly: function(state) {return !obj.isNew(state); },
},
// Add the Action Schema
...getActionSchema(obj, 'program'),
{
id: 'jsprdesc', label: gettext('Comment'), type: 'multiline',
readonly: function(state) {return !obj.isNew(state); },
}
];
}
validate(state, setError) {
/* code validation*/
if (state.jsprtype == 'PLSQL_BLOCK' && isEmptyString(state.jsprcode)) {
setError('jsprcode', gettext('Code cannot be empty.'));
return true;
} else {
setError('jsprcode', null);
}
if (state.jsprtype == 'STORED_PROCEDURE' && isEmptyString(state.jsprproc)) {
setError('jsprproc', gettext('Procedure cannot be empty.'));
return true;
} else {
setError('jsprproc', null);
}
}
}
}

View File

@ -0,0 +1,34 @@
{% if display_comments %}
-- DBMS Program: '{{ program_name }}'
-- EXEC dbms_scheduler.DROP_PROGRAM('{{ program_name }}');
{% endif %}
EXEC dbms_scheduler.CREATE_PROGRAM(
program_name => {{ program_name|qtLiteral(conn) }},
program_type => {{ program_type|qtLiteral(conn) }},
program_action => {{ program_action|qtLiteral(conn) }}{% if number_of_arguments or enabled or comments %},{% endif %}
{% if number_of_arguments %}
number_of_arguments => {{ number_of_arguments }}{% if enabled or comments %},{% endif %}
{% endif %}
{% if enabled %}
enabled => {{ enabled }}{% if comments %},{% endif %}
{% endif %}
{% if comments %}
comments => {{ comments|qtLiteral(conn) }}
{% endif %}
);
{% for args_list_item in arguments %}
EXEC dbms_scheduler.DEFINE_PROGRAM_ARGUMENT(
program_name => {{ program_name|qtLiteral(conn) }},
argument_position => {{ args_list_item['argid'] }},
argument_name => {{ args_list_item['argname']|qtLiteral(conn) }},
argument_type => {{ args_list_item['argtype']|qtLiteral(conn) }},
default_value => {{ args_list_item['argdefval']|qtLiteral(conn) }}
);
{% endfor %}
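
Rendering this template with the PLSQL_BLOCK test data produces output equivalent to the expected files further below (for example dbms_prg_with_psql). The snippet that follows reproduces a trimmed version of that rendering in plain jinja2, with qtLiteral stubbed by a naive quoting function for illustration only; the real pgAdmin filter performs proper, connection-aware quoting, and the real template also handles number_of_arguments and the argument loop.

# Trimmed re-creation of the CREATE_PROGRAM rendering above using plain
# jinja2; qtLiteral is stubbed with naive quoting for illustration only.
from jinja2 import Environment

env = Environment()
env.filters['qtLiteral'] = \
    lambda value, conn=None: "'%s'" % str(value).replace("'", "''")

template = env.from_string(
    "EXEC dbms_scheduler.CREATE_PROGRAM(\n"
    "    program_name => {{ program_name|qtLiteral(conn) }},\n"
    "    program_type => {{ program_type|qtLiteral(conn) }},\n"
    "    program_action => {{ program_action|qtLiteral(conn) }}"
    "{% if enabled or comments %},{% endif %}\n"
    "{% if enabled %}    enabled => {{ enabled }}"
    "{% if comments %},{% endif %}\n{% endif %}"
    "{% if comments %}    comments => {{ comments|qtLiteral(conn) }}\n{% endif %}"
    ");")

print(template.render(program_name='dbms_prg_with_psql',
                      program_type='PLSQL_BLOCK',
                      program_action='BEGIN PERFORM 1; END;',
                      enabled=True, comments='This is a PLSQL program.',
                      conn=None))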

View File

@ -0,0 +1,3 @@
EXEC dbms_scheduler.DROP_PROGRAM(
{{ program_name|qtLiteral(conn) }}
);

View File

@ -0,0 +1,5 @@
{% if is_enable %}
EXEC dbms_scheduler.ENABLE({{ name|qtLiteral(conn) }});
{% else %}
EXEC dbms_scheduler.DISABLE({{ name|qtLiteral(conn) }});
{% endif %}

View File

@ -0,0 +1,20 @@
SELECT
pg_catalog.concat(pg_catalog.quote_ident(nsp.nspname),'.',pg_catalog.quote_ident(pr.proname)) AS proc_name,
pr.pronargs AS number_of_arguments, proargnames,
pg_catalog.oidvectortypes(proargtypes) AS proargtypenames,
pg_catalog.pg_get_expr(proargdefaults, 'pg_catalog.pg_class'::regclass) AS proargdefaultvals
FROM
pg_catalog.pg_proc pr
JOIN
pg_catalog.pg_type typ ON typ.oid=prorettype
JOIN
pg_catalog.pg_namespace nsp ON nsp.oid=pr.pronamespace
WHERE
pr.prokind IN ('f', 'p')
AND typname NOT IN ('trigger', 'event_trigger')
AND (pronamespace = 2200::oid OR pronamespace > {{datlastsysoid}}::OID)
{% if without_args %}
AND pr.pronargs = 0
{% endif %}
ORDER BY
proname;

View File

@ -0,0 +1,5 @@
SELECT
dsp_program_id AS jsprid, dsp_program_name AS jsprname,
dsp_enabled AS jsprenabled
FROM sys.scheduler_0200_program
WHERE dsp_program_name={{ jsprname|qtLiteral(conn) }}

View File

@ -0,0 +1,6 @@
SELECT
dsp_program_id AS jsprid, dsp_program_name AS jsprname,
dsp_program_type AS jsprtype, dsp_enabled AS jsprenabled,
dsp_comments AS jsprdesc
FROM sys.scheduler_0200_program prt
JOIN sys.dba_scheduler_programs prv ON prt.dsp_program_name = prv.program_name

View File

@ -0,0 +1,18 @@
SELECT
dsp_program_id AS jsprid, dsp_program_name AS jsprname,
dsp_program_type AS jsprtype,
CASE WHEN dsp_program_type = 'PLSQL_BLOCK' THEN dsp_program_action ELSE '' END AS jsprcode,
CASE WHEN dsp_program_type = 'STORED_PROCEDURE' THEN dsp_program_action ELSE '' END AS jsprproc,
dsp_number_of_arguments AS jsprnoofargs, dsp_enabled AS jsprenabled,
dsp_comments AS jsprdesc, array_agg(argument_name) AS proargnames,
array_agg(argument_type) AS proargtypenames,
array_agg(default_value) AS proargdefaultvals
FROM sys.scheduler_0200_program prt
{% if not jsprid %}
JOIN sys.dba_scheduler_programs prv ON prt.dsp_program_name = prv.program_name
{% endif %}
LEFT JOIN dba_scheduler_program_args prargs ON dsp_program_name = prargs.program_name
{% if jsprid %}
WHERE dsp_program_id={{jsprid}}::oid
{% endif %}
GROUP BY dsp_program_id, dsp_program_name

View File

@ -0,0 +1,269 @@
{
"dbms_create_program": [
{
"name": "Create program when type is PLSQL_BLOCK",
"url": "/browser/dbms_program/obj/",
"is_positive_test": true,
"test_data": {
"jsprid": null,
"jsprname": "prg_with_psql",
"jsprtype": "PLSQL_BLOCK",
"jsprenabled": true,
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprdesc": "This is a PLSQL program.",
"jsprproc": null,
"jsprcode": "BEGIN PERFORM 1; END;"
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create program when type is STORED_PROCEDURE with args",
"url": "/browser/dbms_program/obj/",
"is_positive_test": true,
"test_data": {
"jsprid": null,
"jsprname": "prg_with_proc_args",
"jsprtype": "STORED_PROCEDURE",
"jsprenabled": true,
"jsprnoofargs": 2,
"jsprarguments": [{"argid":0,"argtype":"bigint","argmode":"IN","argname":"salary","argdefval":"10000"},{"argid":1,"argtype":"character varying","argmode":"IN","argname":"name","argdefval":" -"}],
"jsprdesc": "This is a STORED_PROCEDURE program.",
"jsprproc": "public.test_proc_with_args",
"jsprcode": null
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create program when type is STORED_PROCEDURE without args",
"url": "/browser/dbms_program/obj/",
"proc_name": "public.test_proc_without_args()",
"is_positive_test": true,
"test_data": {
"jsprid": null,
"jsprname": "prg_with_proc_without_args",
"jsprtype": "STORED_PROCEDURE",
"jsprenabled": false,
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprdesc": "This is a STORED_PROCEDURE program.",
"jsprproc": "public.test_proc_without_args",
"jsprcode": null
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create program: while server is down",
"url": "/browser/dbms_program/obj/",
"is_positive_test": false,
"test_data": {
"jsprid": null,
"jsprname": "prg_with_psql",
"jsprtype": "PLSQL_BLOCK",
"jsprenabled": true,
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprdesc": "This is a PLSQL program.",
"jsprproc": null,
"jsprcode": "BEGIN PERFORM 1; END;"
},
"mocking_required": true,
"mock_data": {
"function_name": "pgadmin.utils.driver.psycopg3.connection.Connection.execute_scalar",
"return_value": "[(False,'Mocked Internal Server Error')]"
},
"expected_data": {
"status_code": 500,
"error_msg": "Mocked Internal Server Error",
"test_result_data": {}
}
}
],
"dbms_delete_program": [
{
"name": "Delete program: With existing DBMS program.",
"url": "/browser/dbms_program/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
},
{
"name": "Delete multiple programs: With existing DBMS programs.",
"url": "/browser/dbms_program/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": true
}
],
"dbms_get_program": [
{
"name": "Get program: With existing DBMS program.",
"url": "/browser/dbms_program/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
},
{
"name": "Get programs: With multiple existing DBMS programs.",
"url": "/browser/dbms_program/obj/",
"proc_name": "public.test_proc_with_args(IN salary bigint DEFAULT 10000, IN name character varying)",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": true
},
{
"name": "Get program: while server down.",
"url": "/browser/dbms_program/obj/",
"is_positive_test": false,
"inventory_data": {},
"test_data": {},
"mocking_required": true,
"mock_data": {
"function_name": "pgadmin.utils.driver.psycopg3.connection.Connection.execute_dict",
"return_value": "(False,'Mocked Internal Server Error')"
},
"expected_data": {
"status_code": 500,
"error_msg": "Mocked Internal Server Error",
"test_result_data": {}
},
"is_list": false
}
],
"dbms_msql_program": [
{
"name": "Get program msql: For existing PLSQL program.",
"url": "/browser/dbms_program/msql/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {
"jsprid": null,
"jsprname": "prg_with_psql",
"jsprtype": "PLSQL_BLOCK",
"jsprenabled": true,
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprdesc": "This is a PLSQL program.",
"jsprproc": null,
"jsprcode": "BEGIN PERFORM 1; END;"
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
},
{
"name": "Get program msql: For existing STORED_PROCEDURE program.",
"url": "/browser/dbms_program/msql/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {
"jsprid": null,
"jsprname": "prg_with_proc_args",
"jsprtype": "STORED_PROCEDURE",
"jsprenabled": true,
"jsprnoofargs": 2,
"jsprarguments": "[{\"argid\":0,\"argtype\":\"bigint\",\"argmode\":\"IN\",\"argname\":\"salary\",\"argdefval\":\"10000\"},{\"argid\":1,\"argtype\":\"character varying\",\"argmode\":\"IN\",\"argname\":\"name\",\"argdefval\":\" -\"}]",
"jsprdesc": "This is a STORED_PROCEDURE program.",
"jsprproc": "public.test_proc_with_args",
"jsprcode": null
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
}
],
"dbms_enable_program": [
{
"name": "Enable existing program",
"url": "/browser/dbms_program/enable_disable/",
"is_positive_test": true,
"test_data": {
"is_enable_program": true
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
}
],
"dbms_disable_program": [
{
"name": "Disable existing program",
"url": "/browser/dbms_program/enable_disable/",
"is_positive_test": true,
"test_data": {
"is_enable_program": false
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
}
]
}

View File

@ -0,0 +1,8 @@
-- DBMS Program: 'dbms_prg_disabled'
-- EXEC dbms_scheduler.DROP_PROGRAM('dbms_prg_disabled');
EXEC dbms_scheduler.CREATE_PROGRAM(
program_name => 'dbms_prg_disabled',
program_type => 'PLSQL_BLOCK',
program_action => 'BEGIN PERFORM 1; END;');

View File

@ -0,0 +1,4 @@
EXEC dbms_scheduler.CREATE_PROGRAM(
program_name => 'dbms_prg_disabled',
program_type => 'PLSQL_BLOCK',
program_action => 'BEGIN PERFORM 1; END;');

View File

@ -0,0 +1,28 @@
-- DBMS Program: 'dbms_prg_proc_with_args'
-- EXEC dbms_scheduler.DROP_PROGRAM('dbms_prg_proc_with_args');
EXEC dbms_scheduler.CREATE_PROGRAM(
program_name => 'dbms_prg_proc_with_args',
program_type => 'STORED_PROCEDURE',
program_action => 'public.test_proc_with_args',
number_of_arguments => 2,
enabled => True,
comments => 'This is a STORED_PROCEDURE program.'
);
EXEC dbms_scheduler.DEFINE_PROGRAM_ARGUMENT(
program_name => 'dbms_prg_proc_with_args',
argument_position => 0,
argument_name => 'salary',
argument_type => 'bigint',
default_value => '10000'
);
EXEC dbms_scheduler.DEFINE_PROGRAM_ARGUMENT(
program_name => 'dbms_prg_proc_with_args',
argument_position => 1,
argument_name => 'name',
argument_type => 'character varying',
default_value => ' -'
);

View File

@ -0,0 +1,24 @@
EXEC dbms_scheduler.CREATE_PROGRAM(
program_name => 'dbms_prg_proc_with_args',
program_type => 'STORED_PROCEDURE',
program_action => 'public.test_proc_with_args',
number_of_arguments => 2,
enabled => True,
comments => 'This is a STORED_PROCEDURE program.'
);
EXEC dbms_scheduler.DEFINE_PROGRAM_ARGUMENT(
program_name => 'dbms_prg_proc_with_args',
argument_position => 0,
argument_name => 'salary',
argument_type => 'bigint',
default_value => '10000'
);
EXEC dbms_scheduler.DEFINE_PROGRAM_ARGUMENT(
program_name => 'dbms_prg_proc_with_args',
argument_position => 1,
argument_name => 'name',
argument_type => 'character varying',
default_value => ' -'
);

View File

@ -0,0 +1,10 @@
-- DBMS Program: 'dbms_prg_proc_without_args'
-- EXEC dbms_scheduler.DROP_PROGRAM('dbms_prg_proc_without_args');
EXEC dbms_scheduler.CREATE_PROGRAM(
program_name => 'dbms_prg_proc_without_args',
program_type => 'STORED_PROCEDURE',
program_action => 'public.test_proc_without_args',
comments => 'This is a STORED_PROCEDURE program.'
);

View File

@ -0,0 +1,6 @@
EXEC dbms_scheduler.CREATE_PROGRAM(
program_name => 'dbms_prg_proc_without_args',
program_type => 'STORED_PROCEDURE',
program_action => 'public.test_proc_without_args',
comments => 'This is a STORED_PROCEDURE program.'
);

View File

@ -0,0 +1,11 @@
-- DBMS Program: 'dbms_prg_with_psql'
-- EXEC dbms_scheduler.DROP_PROGRAM('dbms_prg_with_psql');
EXEC dbms_scheduler.CREATE_PROGRAM(
program_name => 'dbms_prg_with_psql',
program_type => 'PLSQL_BLOCK',
program_action => 'BEGIN PERFORM 1; END;',
enabled => True,
comments => 'This is a PLSQL program.'
);

View File

@ -0,0 +1,7 @@
EXEC dbms_scheduler.CREATE_PROGRAM(
program_name => 'dbms_prg_with_psql',
program_type => 'PLSQL_BLOCK',
program_action => 'BEGIN PERFORM 1; END;',
enabled => True,
comments => 'This is a PLSQL program.'
);

View File

@ -0,0 +1,145 @@
{
"scenarios": [
{
"type": "create",
"name": "Create extension 'edb_job_scheduler' for DBMS Program",
"endpoint": "NODE-extension.obj",
"sql_endpoint": "NODE-extension.sql_id",
"data": {
"name": "edb_job_scheduler"
},
"store_object_id": true
},
{
"type": "create",
"name": "Create extension 'dbms_scheduler' for DBMS Program",
"endpoint": "NODE-extension.obj",
"sql_endpoint": "NODE-extension.sql_id",
"data": {
"name": "dbms_scheduler"
},
"store_object_id": true
},
{
"type": "create",
"name": "Create Program with PLSQL_BLOCK",
"endpoint": "NODE-dbms_program.obj",
"sql_endpoint": "NODE-dbms_program.sql_id",
"msql_endpoint": "NODE-dbms_program.msql",
"data": {
"jsprid": null,
"jsprname": "dbms_prg_with_psql",
"jsprtype": "PLSQL_BLOCK",
"jsprenabled": true,
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprdesc": "This is a PLSQL program.",
"jsprproc": null,
"jsprcode": "BEGIN PERFORM 1; END;"
},
"expected_sql_file": "create_program_psql.sql",
"expected_msql_file": "create_program_psql_msql.sql"
},
{
"type": "delete",
"name": "Drop Program",
"endpoint": "NODE-dbms_program.obj_id",
"data": {
"name": "dbms_prg_with_psql"
}
},
{
"type": "create",
"name": "Create Program with Stored Procedure without args",
"endpoint": "NODE-dbms_program.obj",
"sql_endpoint": "NODE-dbms_program.sql_id",
"msql_endpoint": "NODE-dbms_program.msql",
"data": {
"jsprid": null,
"jsprname": "dbms_prg_proc_without_args",
"jsprtype": "STORED_PROCEDURE",
"jsprenabled": false,
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprdesc": "This is a STORED_PROCEDURE program.",
"jsprproc": "public.test_proc_without_args",
"jsprcode": null
},
"expected_sql_file": "create_program_proc_without_args.sql",
"expected_msql_file": "create_program_proc_without_args_msql.sql"
},
{
"type": "delete",
"name": "Drop Program",
"endpoint": "NODE-dbms_program.obj_id",
"data": {
"name": "dbms_prg_proc_without_args"
}
},
{
"type": "create",
"name": "Create Program with Stored Procedure with args",
"endpoint": "NODE-dbms_program.obj",
"sql_endpoint": "NODE-dbms_program.sql_id",
"msql_endpoint": "NODE-dbms_program.msql",
"data": {
"jsprid": null,
"jsprname": "dbms_prg_proc_with_args",
"jsprtype": "STORED_PROCEDURE",
"jsprenabled": true,
"jsprnoofargs": 2,
"jsprarguments": [{"argid":0,"argtype":"bigint","argmode":"IN","argname":"salary","argdefval":"10000"},{"argid":1,"argtype":"character varying","argmode":"IN","argname":"name","argdefval":" -"}],
"jsprdesc": "This is a STORED_PROCEDURE program.",
"jsprproc": "public.test_proc_with_args",
"jsprcode": null
},
"expected_sql_file": "create_program_proc_with_args.sql",
"expected_msql_file": "create_program_proc_with_args_msql.sql"
},
{
"type": "delete",
"name": "Drop Program",
"endpoint": "NODE-dbms_program.obj_id",
"data": {
"name": "dbms_prg_proc_with_args"
}
},
{
"type": "create",
"name": "Create disabled program",
"endpoint": "NODE-dbms_program.obj",
"sql_endpoint": "NODE-dbms_program.sql_id",
"msql_endpoint": "NODE-dbms_program.msql",
"data": {
"jsprid": null,
"jsprname": "dbms_prg_disabled",
"jsprtype": "PLSQL_BLOCK",
"jsprenabled": false,
"jsprnoofargs": 0,
"jsprarguments": [],
"jsprdesc": "",
"jsprproc": null,
"jsprcode": "BEGIN PERFORM 1; END;"
},
"expected_sql_file": "create_program_disabled.sql",
"expected_msql_file": "create_program_disabled_msql.sql"
},
{
"type": "delete",
"name": "Drop Program",
"endpoint": "NODE-dbms_program.obj_id",
"data": {
"name": "dbms_prg_disabled"
}
},
{
"type": "delete",
"name": "Drop Extension",
"endpoint": "NODE-extension.delete",
"data": {
"ids": ["<edb_job_scheduler>"]
},
"preprocess_data": true
}
]
}

View File

@ -0,0 +1,83 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import os
import json
from unittest.mock import patch
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_programs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSAddProgramTestCase(BaseTestGenerator):
"""This class will test the add program in the DBMS Program API"""
scenarios = utils.generate_scenarios("dbms_create_program",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
def runTest(self):
""" This function will add DBMS Program under test database. """
if self.is_positive_test:
response = job_scheduler_utils.api_create(self)
# Assert response
utils.assert_status_code(self, response)
# Verify in backend
response_data = json.loads(response.data)
self.programs_id = response_data['node']['_id']
programs_name = response_data['node']['label']
is_present = job_scheduler_utils.verify_dbms_program(
self, programs_name)
self.assertTrue(
is_present, "DBMS program was not created successfully.")
else:
if self.mocking_required:
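# mock_data["return_value"] in the fixture holds a Python literal (e.g. a
# list of (status, error-message) tuples) that is eval'd into the patched
# side_effect for the connection call.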
with patch(self.mock_data["function_name"],
side_effect=eval(self.mock_data["return_value"])):
response = job_scheduler_utils.api_create(self)
# Assert response
utils.assert_status_code(self, response)
utils.assert_error_message(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,98 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_programs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSDeleteProgramTestCase(BaseTestGenerator):
"""This class will test the add program in the DBMS program API"""
scenarios = utils.generate_scenarios("dbms_delete_program",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.prg_name = "test_program_delete%s" % str(uuid.uuid4())[1:8]
self.program_id = job_scheduler_utils.create_dbms_program(
self, self.prg_name)
# multiple programs
if self.is_list:
self.prg_name2 = "test_program_delete%s" % str(uuid.uuid4())[1:8]
self.program_id_2 = job_scheduler_utils.create_dbms_program(
self, self.prg_name2)
def runTest(self):
"""
This function will test deleting a DBMS program under the test database.
"""
if self.is_list:
self.data['ids'] = [self.program_id, self.program_id_2]
response = job_scheduler_utils.api_delete(self, '')
# Assert response
utils.assert_status_code(self, response)
is_present = job_scheduler_utils.verify_dbms_program(
self, self.prg_name)
self.assertFalse(
is_present, "DBMS program was not deleted successfully")
is_present = job_scheduler_utils.verify_dbms_program(
self, self.prg_name2)
self.assertFalse(
is_present, "DBMS program was not deleted successfully")
else:
response = job_scheduler_utils.api_delete(self)
# Assert response
utils.assert_status_code(self, response)
is_present = job_scheduler_utils.verify_dbms_program(
self, self.prg_name)
self.assertFalse(
is_present, "DBMS program was not deleted successfully")
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,71 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_programs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSDisableProgramTestCase(BaseTestGenerator):
"""This class will test the add program in the DBMS program API"""
scenarios = utils.generate_scenarios("dbms_disable_program",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.prg_name = "test_program_disable%s" % str(uuid.uuid4())[1:8]
self.data['program_name'] = self.prg_name
self.program_id = job_scheduler_utils.create_dbms_program(
self, self.prg_name)
def runTest(self):
""" This function will test DBMS program under test database."""
response = job_scheduler_utils.api_put(self, self.program_id)
# Assert response
utils.assert_status_code(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.delete_dbms_program(self, self.prg_name)
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,71 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_programs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSEnableProgramTestCase(BaseTestGenerator):
"""This class will test the enable program in the DBMS program API"""
scenarios = utils.generate_scenarios("dbms_enable_program",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.prg_name = "test_program_enable%s" % str(uuid.uuid4())[1:8]
self.data['program_name'] = self.prg_name
self.program_id = job_scheduler_utils.create_dbms_program(
self, self.prg_name, False)
def runTest(self):
""" This function will test DBMS program under test database."""
response = job_scheduler_utils.api_put(self, self.program_id)
# Assert response
utils.assert_status_code(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.delete_dbms_program(self, self.prg_name)
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,65 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_programs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSGetMSQLProgramTestCase(BaseTestGenerator):
"""This class will test the add program in the DBMS program API"""
scenarios = utils.generate_scenarios("dbms_msql_program",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
def runTest(self):
""" This function will add DBMS program under test database. """
url_encode_data = self.data
response = job_scheduler_utils.api_get_msql(self, url_encode_data)
# Assert response
utils.assert_status_code(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,92 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from unittest.mock import patch
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_programs_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSGetProgramTestCase(BaseTestGenerator):
"""This class will test the add program in the DBMS program API"""
scenarios = utils.generate_scenarios("dbms_get_program",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.prg_name = "test_program_get%s" % str(uuid.uuid4())[1:8]
self.program_id = job_scheduler_utils.create_dbms_program(
self, self.prg_name)
# multiple programs
if self.is_list:
self.prg_name2 = "test_program_get%s" % str(uuid.uuid4())[1:8]
self.program_id_2 = job_scheduler_utils.create_dbms_program(
self, self.prg_name2, True, True, self.proc_name)
def runTest(self):
""" This function will test DBMS program under test database."""
if self.is_positive_test:
if self.is_list:
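# Assumption based on the shared test utils: passing '' as the trailing
# object id targets the collection URL, so all programs are returned.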
response = job_scheduler_utils.api_get(self, '')
else:
response = job_scheduler_utils.api_get(self)
# Assert response
utils.assert_status_code(self, response)
else:
if self.mocking_required:
with patch(self.mock_data["function_name"],
side_effect=[eval(self.mock_data["return_value"])]):
response = job_scheduler_utils.api_get(self)
# Assert response
utils.assert_status_code(self, response)
utils.assert_error_message(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.delete_dbms_program(self, self.prg_name)
if self.is_list:
job_scheduler_utils.delete_dbms_program(self, self.prg_name2)
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,508 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
""" Implements DBMS Schedule objects Node."""
import json
from functools import wraps
from flask import render_template, request, jsonify
from flask_babel import gettext
from pgadmin.browser.collection import CollectionNodeModule
from pgadmin.browser.server_groups.servers import databases
from pgadmin.browser.utils import PGChildNodeView
from pgadmin.utils.ajax import make_json_response, gone, \
make_response as ajax_response, internal_server_error
from pgadmin.utils.driver import get_driver
from config import PG_DEFAULT_DRIVER
from pgadmin.utils.constants import DBMS_JOB_SCHEDULER_ID
from pgadmin.browser.server_groups.servers.databases.dbms_job_scheduler.utils \
import resolve_calendar_string, create_calendar_string
class DBMSScheduleModule(CollectionNodeModule):
"""
class DBMSScheduleModule(CollectionNodeModule)
A module class for DBMS Schedule objects node derived
from CollectionNodeModule.
Methods:
-------
* get_nodes(gid, sid, did)
- Method is used to generate the browser collection node.
* script_load()
- Load the module script for DBMS Schedule objects, when any of
the server nodes is initialized.
"""
_NODE_TYPE = 'dbms_schedule'
_COLLECTION_LABEL = gettext("DBMS Schedules")
@property
def collection_icon(self):
"""
icon to be displayed for the browser collection node
"""
return 'icon-coll-pga_schedule'
@property
def node_icon(self):
"""
icon to be displayed for the browser nodes
"""
return 'icon-pga_schedule'
def get_nodes(self, gid, sid, did, jsid):
"""
Generate the collection node
"""
if self.show_node:
yield self.generate_browser_collection_node(did)
@property
def node_inode(self):
"""
Override this property to make the node a leaf node.
Returns: False as this is the leaf node
"""
return False
@property
def script_load(self):
"""
Load the module script for the server, when any of the database nodes is
initialized.
"""
return databases.DatabaseModule.node_type
@property
def module_use_template_javascript(self):
"""
Returns whether Jinja2 template is used for generating the javascript
module.
"""
return False
blueprint = DBMSScheduleModule(__name__)
class DBMSScheduleView(PGChildNodeView):
"""
class DBMSScheduleView(PGChildNodeView)
A view class for the DBMSSchedule node derived from PGChildNodeView.
This class is responsible for all view-related operations such as
updating a schedule node, showing properties, and showing SQL in the SQL pane.
Methods:
-------
* __init__(**kwargs)
- Method is used to initialize the DBMSScheduleView and its base view.
* check_precondition()
- This function behaves as a decorator which checks the
database connection before running the view; it also attaches the
manager, conn & template_path properties to self
* list()
- This function is used to list all the schedule nodes within that
collection.
* nodes()
- This function is used to create all the child nodes within that
collection. Here it will create all the schedule nodes.
* properties(gid, sid, did, jsid, jsscid)
- This function will show the properties of the selected schedule node
* create(gid, sid, did, jsid, jsscid)
- This function will create the new schedule object
* msql(gid, sid, did, jsid, jsscid)
- This function is used to return modified SQL for the
selected schedule node
* sql(gid, sid, did, jsid, jsscid)
- Dummy response for sql panel
* delete(gid, sid, did, jsid, jsscid)
- Drops job schedule
"""
node_type = blueprint.node_type
BASE_TEMPLATE_PATH = 'dbms_schedules/ppas/#{0}#'
parent_ids = [
{'type': 'int', 'id': 'gid'},
{'type': 'int', 'id': 'sid'},
{'type': 'int', 'id': 'did'},
{'type': 'int', 'id': 'jsid'}
]
ids = [
{'type': 'int', 'id': 'jsscid'}
]
operations = dict({
'obj': [
{'get': 'properties', 'delete': 'delete'},
{'get': 'list', 'post': 'create', 'delete': 'delete'}
],
'nodes': [{'get': 'nodes'}, {'get': 'nodes'}],
'msql': [{'get': 'msql'}, {'get': 'msql'}],
'sql': [{'get': 'sql'}]
})
def __init__(self, **kwargs):
self.conn = None
self.template_path = None
self.manager = None
super().__init__(**kwargs)
def check_precondition(f):
"""
This function will behave as a decorator which will check the
database connection before running view. It will also attach
manager, conn & template_path properties to self
"""
@wraps(f)
def wrap(*args, **kwargs):
# Here args[0] will hold self & kwargs will hold gid,sid,did
self = args[0]
self.driver = get_driver(PG_DEFAULT_DRIVER)
self.manager = self.driver.connection_manager(kwargs['sid'])
self.conn = self.manager.connection(did=kwargs['did'])
# Set the template path for the SQL scripts
self.template_path = self.BASE_TEMPLATE_PATH.format(
self.manager.version)
return f(*args, **kwargs)
return wrap
@check_precondition
def list(self, gid, sid, did, jsid):
"""
This function is used to list all the schedule nodes within
that collection.
Args:
gid: Server Group ID
sid: Server ID
did: Database ID
jsid: Job Scheduler ID
"""
sql = render_template(
"/".join([self.template_path, self._PROPERTIES_SQL]))
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
return ajax_response(
response=res['rows'],
status=200
)
@check_precondition
def nodes(self, gid, sid, did, jsid, jsscid=None):
"""
This function is used to create all the child nodes within
the collection. Here it will create all the schedule nodes.
Args:
gid: Server Group ID
sid: Server ID
did: Database ID
jsid: Job Scheduler ID
"""
res = []
try:
sql = render_template(
"/".join([self.template_path, self._NODES_SQL]))
status, result = self.conn.execute_2darray(sql)
if not status:
return internal_server_error(errormsg=result)
if jsscid is not None:
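# A specific schedule id was requested; return only that browser node.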
if len(result['rows']) == 0:
return gone(
errormsg=gettext("Could not find the specified "
"schedule.")
)
row = result['rows'][0]
return make_json_response(
data=self.blueprint.generate_browser_node(
row['jsscid'],
DBMS_JOB_SCHEDULER_ID,
row['jsscname'],
icon="icon-pga_schedule",
description=row['jsscdesc']
)
)
for row in result['rows']:
res.append(
self.blueprint.generate_browser_node(
row['jsscid'],
DBMS_JOB_SCHEDULER_ID,
row['jsscname'],
icon="icon-pga_schedule",
description=row['jsscdesc']
)
)
return make_json_response(
data=res,
status=200
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def properties(self, gid, sid, did, jsid, jsscid):
"""
This function will show the properties of the selected schedule node.
Args:
gid: Server Group ID
sid: Server ID
did: Database ID
jsid: Job Scheduler ID
jsscid: Job Schedule ID
"""
try:
sql = render_template(
"/".join([self.template_path, self._PROPERTIES_SQL]),
jsscid=jsscid
)
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
if len(res['rows']) == 0:
return gone(
errormsg=gettext("Could not find the specified schedule.")
)
# Resolve the repeat interval string
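# e.g. 'FREQ=YEARLY;BYMONTH=JAN,MAY,DEC;BYMONTHDAY=2,8;BYDAY=MON;BYHOUR=05;BYMINUTE=45'
# is split back into the individual UI fields below.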
if 'jsscrepeatint' in res['rows'][0]:
(freq, by_date, by_month, by_month_day, by_weekday, by_hour,
by_minute) = resolve_calendar_string(
res['rows'][0]['jsscrepeatint'])
res['rows'][0]['jsscfreq'] = freq
res['rows'][0]['jsscdate'] = by_date
res['rows'][0]['jsscmonths'] = by_month
res['rows'][0]['jsscmonthdays'] = by_month_day
res['rows'][0]['jsscweekdays'] = by_weekday
res['rows'][0]['jsschours'] = by_hour
res['rows'][0]['jsscminutes'] = by_minute
return ajax_response(
response=res['rows'][0],
status=200
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def create(self, gid, sid, did, jsid):
"""
This function will create the schedule node.
Args:
gid: Server Group ID
sid: Server ID
did: Database ID
jsid: Job Scheduler ID
"""
data = json.loads(request.data)
try:
# Create calendar string for repeat interval
repeat_interval = create_calendar_string(
data['jsscfreq'], data['jsscdate'], data['jsscmonths'],
data['jsscmonthdays'], data['jsscweekdays'], data['jsschours'],
data['jsscminutes'])
sql = render_template(
"/".join([self.template_path, self._CREATE_SQL]),
schedule_name=data['jsscname'],
start_date=data['jsscstart'],
repeat_interval=repeat_interval,
end_date=data['jsscend'],
comments=data['jsscdesc'],
conn=self.conn
)
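# Run the EXEC inside an explicit transaction block; 'END' (an alias for
# COMMIT in PostgreSQL) closes it even when the call fails.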
status, res = self.conn.execute_void('BEGIN')
if not status:
return internal_server_error(errormsg=res)
status, res = self.conn.execute_scalar(sql)
if not status:
if self.conn.connected():
self.conn.execute_void('END')
return internal_server_error(errormsg=res)
self.conn.execute_void('END')
# Get the newly created Schedule id
sql = render_template(
"/".join([self.template_path, 'get_schedule_id.sql']),
jsscname=data['jsscname'], conn=self.conn
)
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
if len(res['rows']) == 0:
return gone(
errormsg=gettext("Job schedule creation failed.")
)
row = res['rows'][0]
return jsonify(
node=self.blueprint.generate_browser_node(
row['jsscid'],
DBMS_JOB_SCHEDULER_ID,
row['jsscname'],
icon="icon-pga_schedule"
)
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def delete(self, gid, sid, did, jsid, jsscid=None):
"""Delete the Job Schedule."""
if jsscid is None:
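# No single id in the URL: this is a bulk delete, the ids arrive in the
# request payload.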
data = request.form if request.form else json.loads(
request.data
)
else:
data = {'ids': [jsscid]}
try:
for jsscid in data['ids']:
sql = render_template(
"/".join([self.template_path, self._PROPERTIES_SQL]),
jsscid=jsscid
)
status, res = self.conn.execute_dict(sql)
if not status:
return internal_server_error(errormsg=res)
jsscname = res['rows'][0]['jsscname']
status, res = self.conn.execute_void(
render_template(
"/".join([self.template_path, self._DELETE_SQL]),
schedule_name=jsscname, force=False, conn=self.conn
)
)
if not status:
return internal_server_error(errormsg=res)
return make_json_response(success=1)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def msql(self, gid, sid, did, jsid, jsscid=None):
"""
This function is used to return modified SQL for the
selected Schedule node.
Args:
gid: Server Group ID
sid: Server ID
did: Database ID
jsid: Job Scheduler ID
jsscid: Job Schedule ID (optional)
"""
data = {}
for k, v in request.args.items():
try:
# Comments must be taken as-is: if the user enters JSON-like text in the
# comment it would otherwise be parsed by json.loads, which should not happen.
if k in ('jsscdesc',):
data[k] = v
else:
data[k] = json.loads(v)
except ValueError:
data[k] = v
try:
# Create calendar string for repeat interval
repeat_interval = create_calendar_string(
data['jsscfreq'], data['jsscdate'], data['jsscmonths'],
data['jsscmonthdays'], data['jsscweekdays'], data['jsschours'],
data['jsscminutes'])
sql = render_template(
"/".join([self.template_path, self._CREATE_SQL]),
schedule_name=data['jsscname'],
start_date=data['jsscstart'],
repeat_interval=repeat_interval,
end_date=data['jsscend'],
comments=data['jsscdesc'],
conn=self.conn
)
return make_json_response(
data=sql,
status=200
)
except Exception as e:
return internal_server_error(errormsg=str(e))
@check_precondition
def sql(self, gid, sid, did, jsid, jsscid):
"""
This function will generate sql for the sql panel
"""
try:
SQL = render_template("/".join(
[self.template_path, self._PROPERTIES_SQL]
), jsscid=jsscid)
status, res = self.conn.execute_dict(SQL)
if not status:
return internal_server_error(errormsg=res)
if len(res['rows']) == 0:
return gone(
gettext("Could not find the DBMS Schedule.")
)
data = res['rows'][0]
SQL = render_template(
"/".join([self.template_path, self._CREATE_SQL]),
display_comments=True,
schedule_name=data['jsscname'],
start_date=data['jsscstart'],
repeat_interval=data['jsscrepeatint'],
end_date=data['jsscend'],
comments=data['jsscdesc'],
conn=self.conn
)
return ajax_response(response=SQL)
except Exception as e:
return internal_server_error(errormsg=str(e))
DBMSScheduleView.register_node_view(blueprint)

View File

@ -0,0 +1,77 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2024, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import DBMSScheduleSchema from './dbms_schedule.ui';
define('pgadmin.node.dbms_schedule', [
'sources/gettext', 'sources/url_for', 'sources/pgadmin',
'pgadmin.browser', 'pgadmin.browser.collection',
], function(gettext, url_for, pgAdmin, pgBrowser) {
if (!pgBrowser.Nodes['coll-dbms_schedule']) {
pgBrowser.Nodes['coll-dbms_schedule'] =
pgBrowser.Collection.extend({
node: 'dbms_schedule',
label: gettext('DBMS Schedules'),
type: 'coll-dbms_schedule',
columns: ['jsscname', 'jsscrepeatint', 'jsscdesc'],
hasSQL: false,
hasDepends: false,
hasStatistics: false,
hasScriptTypes: [],
canDrop: true,
canDropCascade: false,
});
}
if (!pgBrowser.Nodes['dbms_schedule']) {
pgAdmin.Browser.Nodes['dbms_schedule'] = pgAdmin.Browser.Node.extend({
parent_type: 'dbms_job_scheduler',
type: 'dbms_schedule',
label: gettext('DBMS Schedule'),
node_image: 'icon-pga_schedule',
epasHelp: true,
epasURL: 'https://www.enterprisedb.com/docs/epas/$VERSION$/epas_compat_bip_guide/03_built-in_packages/15_dbms_scheduler/04_create_schedule/',
dialogHelp: url_for('help.static', {'filename': 'dbms_schedule.html'}),
canDrop: true,
hasSQL: true,
hasDepends: false,
hasStatistics: false,
Init: function() {
/* Avoid multiple registration of menus */
if (this.initialized)
return;
this.initialized = true;
pgBrowser.add_menus([{
name: 'create_dbms_schedule_on_coll', node: 'coll-dbms_schedule', module: this,
applies: ['object', 'context'], callback: 'show_obj_properties',
category: 'create', priority: 4, label: gettext('DBMS Schedule...'),
data: {action: 'create'},
},{
name: 'create_dbms_schedule', node: 'dbms_schedule', module: this,
applies: ['object', 'context'], callback: 'show_obj_properties',
category: 'create', priority: 4, label: gettext('DBMS Schedule...'),
data: {action: 'create'},
},{
name: 'create_dbms_schedule', node: 'dbms_job_scheduler', module: this,
applies: ['object', 'context'], callback: 'show_obj_properties',
category: 'create', priority: 4, label: gettext('DBMS Schedule...'),
data: {action: 'create'},
},
]);
},
getSchema: ()=>new DBMSScheduleSchema(),
});
}
return pgBrowser.Nodes['dbms_schedule'];
});

View File

@ -0,0 +1,89 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2024, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import gettext from 'sources/gettext';
import BaseUISchema from 'sources/SchemaView/base_schema.ui';
import { isEmptyString } from 'sources/validators';
import moment from 'moment';
import { getRepeatSchema } from '../../../static/js/dbms_job_scheduler_common.ui';
export default class DBMSScheduleSchema extends BaseUISchema {
constructor() {
super({
jsscid: null,
jsscname: '',
jsscstart: null,
jsscend: null,
jsscdesc: '',
jsscrepeatint: '',
jsscfreq: null,
jsscdate: null,
jsscweekdays: null,
jsscmonthdays: null,
jsscmonths: null,
jsschours: null,
jsscminutes: null,
});
}
get idAttribute() {
return 'jsscid';
}
get baseFields() {
let obj = this;
return [
{
id: 'jsscid', label: gettext('ID'), type: 'int', mode: ['properties'],
readonly: function(state) {return !obj.isNew(state); },
}, {
id: 'jsscname', label: gettext('Name'), cell: 'text',
editable: false, type: 'text', noEmpty: true,
readonly: function(state) {return !obj.isNew(state); },
},
// Add the Repeat Schema.
...getRepeatSchema(obj, 'schedule'),
{
id: 'jsscdesc', label: gettext('Comment'), type: 'multiline',
readonly: function(state) {return !obj.isNew(state); },
},
];
}
validate(state, setError) {
if (isEmptyString(state.jsscstart) && isEmptyString(state.jsscfreq) &&
isEmptyString(state.jsscmonths) && isEmptyString(state.jsscweekdays) &&
isEmptyString(state.jsscmonthdays) && isEmptyString(state.jsschours) &&
isEmptyString(state.jsscminutes) && isEmptyString(state.jsscdate)) {
setError('jsscstart', gettext('Either Start time or Repeat interval must be specified.'));
return true;
} else {
setError('jsscstart', null);
}
if (!isEmptyString(state.jsscend)) {
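// jsscstart/jsscend arrive as strings such as '2024-02-27 00:00:00 +05:30';
// only the date and time parts are compared below, the UTC offset is dropped
// before handing the values to moment (see the test fixtures for the format).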
let start_time = state.jsscstart,
end_time = state.jsscend,
start_time_js = start_time.split(' '),
end_time_js = end_time.split(' ');
start_time_js = moment(start_time_js[0] + ' ' + start_time_js[1]);
end_time_js = moment(end_time_js[0] + ' ' + end_time_js[1]);
if(end_time_js.isBefore(start_time_js)) {
setError('jsscend', gettext('Start time must be less than end time'));
return true;
} else {
setError('jsscend', null);
}
} else {
state.jsscend = null;
}
}
}

View File

@ -0,0 +1,22 @@
{% if display_comments %}
-- DBMS Schedule: '{{ schedule_name }}'
-- EXEC dbms_scheduler.DROP_SCHEDULE('{{ schedule_name }}');
{% endif %}
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => {{ schedule_name|qtLiteral(conn) }},
repeat_interval => {{ repeat_interval|qtLiteral(conn) }}{% if start_date or end_date or comments %},{% endif %}
{% if start_date %}
start_date => {{ start_date|qtLiteral(conn) }}{% if end_date or comments %},{% endif %}
{% endif %}
{% if end_date %}
end_date => {{ end_date|qtLiteral(conn) }}{% if comments %},{% endif %}
{% endif %}
{% if comments %}
comments => {{ comments|qtLiteral(conn) }}
{% endif %}
);

View File

@ -0,0 +1,6 @@
EXEC dbms_scheduler.DROP_SCHEDULE(
{{ schedule_name|qtLiteral(conn) }}{% if force %},{% endif %}
{% if force %}
{{ force }}
{% endif %}
);

View File

@ -0,0 +1,4 @@
SELECT
dss_schedule_id as jsscid, dss_schedule_name as jsscname
FROM sys.scheduler_0300_schedule
WHERE dss_schedule_name={{ jsscname|qtLiteral(conn) }}

View File

@ -0,0 +1,5 @@
SELECT
dss_schedule_id as jsscid, dss_schedule_name as jsscname,
dss_comments as jsscdesc
FROM sys.scheduler_0300_schedule sct
JOIN sys.dba_scheduler_schedules scv ON sct.dss_schedule_name = scv.schedule_name;

View File

@ -0,0 +1,12 @@
SELECT
dss_schedule_id as jsscid, dss_schedule_name as jsscname,
dss_start_date as jsscstart, dss_end_date as jsscend,
dss_repeat_interval as jsscrepeatint, dss_comments as jsscdesc
FROM sys.scheduler_0300_schedule sct
{% if not jsscid %}
JOIN sys.dba_scheduler_schedules scv ON sct.dss_schedule_name = scv.schedule_name
{% endif %}
{% if jsscid %}
WHERE dss_schedule_id={{jsscid}}::oid
{% endif %}
ORDER BY dss_schedule_name

View File

@ -0,0 +1,335 @@
{
"dbms_create_schedule": [
{
"name": "Create schedule: YEARLY",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"test_data": {
"jsscid": null,
"jsscname": "dbms_test_sch_yearly",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is yearly test schedule",
"jsscrepeatint": "",
"jsscfreq": "YEARLY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"]
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create schedule: YEARLY BY DATE",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"test_data": {
"jsscid": null,
"jsscname": "dbms_test_sch_yearly_by_date",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is yearly by date test schedule",
"jsscrepeatint": "",
"jsscfreq": "YEARLY",
"jsscdate": "20250113",
"jsscweekdays": [],
"jsscmonthdays": [],
"jsscmonths": [],
"jsschours": [],
"jsscminutes": []
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create schedule: MONTHLY",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"test_data": {
"jsscid": null,
"jsscname": "dbms_test_sch_monthly",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is monthly test schedule",
"jsscrepeatint": "",
"jsscfreq": "MONTHLY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"]
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create schedule: WEEKLY",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"test_data": {
"jsscid": null,
"jsscname": "dbms_test_sch_weekly",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is weekly test schedule",
"jsscrepeatint": "",
"jsscfreq": "WEEKLY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"]
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create schedule: DAILY",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"test_data": {
"jsscid": null,
"jsscname": "dbms_test_sch_daily",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is daily test schedule",
"jsscrepeatint": "",
"jsscfreq": "DAILY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"]
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create schedule: HOURLY",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"test_data": {
"jsscid": null,
"jsscname": "dbms_test_sch_hourly",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is hourly test schedule",
"jsscrepeatint": "",
"jsscfreq": "HOURLY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"]
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create schedule: MINUTELY",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"test_data": {
"jsscid": null,
"jsscname": "dbms_test_sch_minutely",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is minutely test schedule",
"jsscrepeatint": "",
"jsscfreq": "MINUTELY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"]
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
}
},
{
"name": "Create schedule: while server is down",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": false,
"test_data": {
"jsscid": null,
"jsscname": "dbms_test_sch_yearly",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is yearly test schedule",
"jsscrepeatint": "",
"jsscfreq": "YEARLY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"]
},
"mocking_required": true,
"mock_data": {
"function_name": "pgadmin.utils.driver.psycopg3.connection.Connection.execute_scalar",
"return_value": "[(False,'Mocked Internal Server Error')]"
},
"expected_data": {
"status_code": 500,
"error_msg": "Mocked Internal Server Error",
"test_result_data": {}
}
}
],
"dbms_delete_schedule": [
{
"name": "Delete schedule: With existing DBMS schedule.",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
},
{
"name": "Delete multiple schedules: With existing DBMS schedule.",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": true
}
],
"dbms_get_schedule": [
{
"name": "Get schedule: With existing DBMS schedule.",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
},
{
"name": "Get schedules: With multiple existing DBMS schedules.",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": true
},
{
"name": "Get schedule: while server down.",
"url": "/browser/dbms_schedule/obj/",
"is_positive_test": false,
"inventory_data": {},
"test_data": {},
"mocking_required": true,
"mock_data": {
"function_name": "pgadmin.utils.driver.psycopg3.connection.Connection.execute_dict",
"return_value": "(False,'Mocked Internal Server Error')"
},
"expected_data": {
"status_code": 500,
"error_msg": "Mocked Internal Server Error",
"test_result_data": {}
},
"is_list": false
}
],
"dbms_msql_schedule": [
{
"name": "Get schedule msql: For existing dbms schedule.",
"url": "/browser/dbms_schedule/msql/",
"is_positive_test": true,
"inventory_data": {},
"test_data": {
"jsscname": "dbms_test_sch_msql",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is daily test schedule",
"jsscrepeatint": "",
"jsscfreq": "DAILY",
"jsscdate": null,
"jsscweekdays": [],
"jsscmonthdays": [],
"jsscmonths": [],
"jsschours": [],
"jsscminutes": []
},
"mocking_required": false,
"mock_data": {},
"expected_data": {
"status_code": 200,
"error_msg": null,
"test_result_data": {}
},
"is_list": false
}
]
}

View File

@ -0,0 +1,11 @@
-- DBMS Schedule: 'dbms_test_sch_yearly'
-- EXEC dbms_scheduler.DROP_SCHEDULE('dbms_test_sch_yearly');
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_yearly',
repeat_interval => 'FREQ=YEARLY;BYMONTH=JAN,MAY,DEC;BYMONTHDAY=2,8,31,27;BYDAY=SUN,MON,TUE,WED,THU,FRI,SAT;BYHOUR=05,18,22;BYMINUTE=45,37,58',
start_date => '2024-02-27 00:00:00+05:30',
end_date => '2024-02-28 00:00:00+05:30',
comments => 'This is yearly test schedule'
);

View File

@ -0,0 +1,7 @@
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_yearly',
repeat_interval => 'FREQ=YEARLY;BYMONTH=JAN,MAY,DEC;BYMONTHDAY=2,8,31,27;BYDAY=SUN,MON,TUE,WED,THU,FRI,SAT;BYHOUR=05,18,22;BYMINUTE=45,37,58',
start_date => '2024-02-27 00:00:00 +05:30',
end_date => '2024-02-28 00:00:00 +05:30',
comments => 'This is yearly test schedule'
);

View File

@ -0,0 +1,11 @@
-- DBMS Schedule: 'dbms_test_sch_yearly_by_date'
-- EXEC dbms_scheduler.DROP_SCHEDULE('dbms_test_sch_yearly_by_date');
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_yearly_by_date',
repeat_interval => 'FREQ=YEARLY;BYDATE=20250113;',
start_date => '2024-02-27 00:00:00+05:30',
end_date => '2024-02-28 00:00:00+05:30',
comments => 'This is yearly by date test schedule'
);

View File

@ -0,0 +1,7 @@
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_yearly_by_date',
repeat_interval => 'FREQ=YEARLY;BYDATE=20250113;',
start_date => '2024-02-27 00:00:00 +05:30',
end_date => '2024-02-28 00:00:00 +05:30',
comments => 'This is yearly by date test schedule'
);

View File

@ -0,0 +1,7 @@
-- DBMS Schedule: 'dbms_test_sch_daily_freq'
-- EXEC dbms_scheduler.DROP_SCHEDULE('dbms_test_sch_daily_freq');
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_daily_freq',
repeat_interval => 'FREQ=DAILY;');

View File

@ -0,0 +1,9 @@
-- DBMS Schedule: 'dbms_test_sch_weekly_comm'
-- EXEC dbms_scheduler.DROP_SCHEDULE('dbms_test_sch_weekly_comm');
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_weekly_comm',
repeat_interval => 'FREQ=WEEKLY;',
comments => 'This is weekly test schedule'
);

View File

@ -0,0 +1,5 @@
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_weekly_comm',
repeat_interval => 'FREQ=WEEKLY;',
comments => 'This is weekly test schedule'
);

View File

@ -0,0 +1,3 @@
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_daily_freq',
repeat_interval => 'FREQ=DAILY;');

View File

@ -0,0 +1,8 @@
-- DBMS Schedule: 'dbms_test_sch_monthly_start_date'
-- EXEC dbms_scheduler.DROP_SCHEDULE('dbms_test_sch_monthly_start_date');
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_monthly_start_date',
repeat_interval => 'FREQ=MONTHLY;',
start_date => '2024-02-27 00:00:00+05:30');

View File

@ -0,0 +1,4 @@
EXEC dbms_scheduler.CREATE_SCHEDULE(
schedule_name => 'dbms_test_sch_monthly_start_date',
repeat_interval => 'FREQ=MONTHLY;',
start_date => '2024-02-27 00:00:00 +05:30');

View File

@ -0,0 +1,188 @@
{
"scenarios": [
{
"type": "create",
"name": "Create extension 'edb_job_scheduler' for DBMS Schedule",
"endpoint": "NODE-extension.obj",
"sql_endpoint": "NODE-extension.sql_id",
"data": {
"name": "edb_job_scheduler"
},
"store_object_id": true
},
{
"type": "create",
"name": "Create extension 'dbms_scheduler' for DBMS Schedule",
"endpoint": "NODE-extension.obj",
"sql_endpoint": "NODE-extension.sql_id",
"data": {
"name": "dbms_scheduler"
},
"store_object_id": true
},
{
"type": "create",
"name": "Create Schedule with all options",
"endpoint": "NODE-dbms_schedule.obj",
"sql_endpoint": "NODE-dbms_schedule.sql_id",
"msql_endpoint": "NODE-dbms_schedule.msql",
"data": {
"jsscname": "dbms_test_sch_yearly",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is yearly test schedule",
"jsscrepeatint": "",
"jsscfreq": "YEARLY",
"jsscdate": null,
"jsscweekdays": ["7", "1", "2", "3", "4", "5", "6"],
"jsscmonthdays": ["2", "8", "31", "27"],
"jsscmonths": ["1", "5", "12"],
"jsschours": ["05", "18", "22"],
"jsscminutes": ["45", "37", "58"]
},
"expected_sql_file": "create_schedule_all.sql",
"expected_msql_file": "create_schedule_all_msql.sql"
},
{
"type": "delete",
"name": "Drop Schedule",
"endpoint": "NODE-dbms_schedule.obj_id",
"data": {
"name": "dbms_test_sch_yearly"
}
},
{
"type": "create",
"name": "Create Schedule Yearly by date",
"endpoint": "NODE-dbms_schedule.obj",
"sql_endpoint": "NODE-dbms_schedule.sql_id",
"msql_endpoint": "NODE-dbms_schedule.msql",
"data": {
"jsscname": "dbms_test_sch_yearly_by_date",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "2024-02-28 00:00:00 +05:30",
"jsscdesc": "This is yearly by date test schedule",
"jsscrepeatint": "",
"jsscfreq": "YEARLY",
"jsscdate": "20250113",
"jsscweekdays": [],
"jsscmonthdays": [],
"jsscmonths": [],
"jsschours": [],
"jsscminutes": []
},
"expected_sql_file": "create_schedule_bydate.sql",
"expected_msql_file": "create_schedule_bydate_msql.sql"
},
{
"type": "delete",
"name": "Drop Schedule",
"endpoint": "NODE-dbms_schedule.obj_id",
"data": {
"name": "dbms_test_sch_yearly_by_date"
}
},
{
"type": "create",
"name": "Create Schedule only start date",
"endpoint": "NODE-dbms_schedule.obj",
"sql_endpoint": "NODE-dbms_schedule.sql_id",
"msql_endpoint": "NODE-dbms_schedule.msql",
"data": {
"jsscname": "dbms_test_sch_monthly_start_date",
"jsscstart": "2024-02-27 00:00:00 +05:30",
"jsscend": "",
"jsscdesc": "",
"jsscrepeatint": "",
"jsscfreq": "MONTHLY",
"jsscdate": null,
"jsscweekdays": [],
"jsscmonthdays": [],
"jsscmonths": [],
"jsschours": [],
"jsscminutes": []
},
"expected_sql_file": "create_schedule_start_date.sql",
"expected_msql_file": "create_schedule_start_date_msql.sql"
},
{
"type": "delete",
"name": "Drop Schedule",
"endpoint": "NODE-dbms_schedule.obj_id",
"data": {
"name": "dbms_test_sch_monthly_start_date"
}
},
{
"type": "create",
"name": "Create Schedule only frequency",
"endpoint": "NODE-dbms_schedule.obj",
"sql_endpoint": "NODE-dbms_schedule.sql_id",
"msql_endpoint": "NODE-dbms_schedule.msql",
"data": {
"jsscname": "dbms_test_sch_daily_freq",
"jsscstart": "",
"jsscend": "",
"jsscdesc": "",
"jsscrepeatint": "",
"jsscfreq": "DAILY",
"jsscdate": null,
"jsscweekdays": [],
"jsscmonthdays": [],
"jsscmonths": [],
"jsschours": [],
"jsscminutes": []
},
"expected_sql_file": "create_schedule_freq.sql",
"expected_msql_file": "create_schedule_freq_msql.sql"
},
{
"type": "delete",
"name": "Drop Schedule",
"endpoint": "NODE-dbms_schedule.obj_id",
"data": {
"name": "dbms_test_sch_daily_freq"
}
},
{
"type": "create",
"name": "Create Schedule frequency with comment",
"endpoint": "NODE-dbms_schedule.obj",
"sql_endpoint": "NODE-dbms_schedule.sql_id",
"msql_endpoint": "NODE-dbms_schedule.msql",
"data": {
"jsscname": "dbms_test_sch_weekly_comm",
"jsscstart": "",
"jsscend": "",
"jsscdesc": "This is weekly test schedule",
"jsscrepeatint": "",
"jsscfreq": "WEEKLY",
"jsscdate": null,
"jsscweekdays": [],
"jsscmonthdays": [],
"jsscmonths": [],
"jsschours": [],
"jsscminutes": []
},
"expected_sql_file": "create_schedule_freq_comm.sql",
"expected_msql_file": "create_schedule_freq_comm_msql.sql"
},
{
"type": "delete",
"name": "Drop Schedule",
"endpoint": "NODE-dbms_schedule.obj_id",
"data": {
"name": "dbms_test_sch_weekly_comm"
}
},
{
"type": "delete",
"name": "Drop Extension",
"endpoint": "NODE-extension.delete",
"data": {
"ids": ["<edb_job_scheduler>"]
},
"preprocess_data": true
}
]
}

View File

@ -0,0 +1,83 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import os
import json
from unittest.mock import patch
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_schedules_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSAddScheduleTestCase(BaseTestGenerator):
"""This class will test the add schedule in the DBMS Schedule API"""
scenarios = utils.generate_scenarios("dbms_create_schedule",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
def runTest(self):
""" This function will add DBMS Schedule under test database. """
if self.is_positive_test:
response = job_scheduler_utils.api_create(self)
# Assert response
utils.assert_status_code(self, response)
# Verify in backend
response_data = json.loads(response.data)
self.schedule_id = response_data['node']['_id']
schedule_name = response_data['node']['label']
is_present = job_scheduler_utils.verify_dbms_schedule(
self, schedule_name)
self.assertTrue(
is_present, "DBMS schedule was not created successfully.")
else:
if self.mocking_required:
with patch(self.mock_data["function_name"],
side_effect=eval(self.mock_data["return_value"])):
response = job_scheduler_utils.api_create(self)
# Assert response
utils.assert_status_code(self, response)
utils.assert_error_message(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,98 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_schedules_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSDeleteScheduleTestCase(BaseTestGenerator):
"""This class will test the add schedule in the DBMS Schedule API"""
scenarios = utils.generate_scenarios("dbms_delete_schedule",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.sch_name = "test_schedule_delete%s" % str(uuid.uuid4())[1:8]
self.schedule_id = job_scheduler_utils.create_dbms_schedule(
self, self.sch_name)
# multiple schedules
if self.is_list:
self.sch_name2 = "test_schedule_delete%s" % str(uuid.uuid4())[1:8]
self.schedule_id_2 = job_scheduler_utils.create_dbms_schedule(
self, self.sch_name2)
def runTest(self):
"""
        This function will delete DBMS Schedule(s) under the test database.
"""
if self.is_list:
self.data['ids'] = [self.schedule_id, self.schedule_id_2]
response = job_scheduler_utils.api_delete(self, '')
# Assert response
utils.assert_status_code(self, response)
is_present = job_scheduler_utils.verify_dbms_schedule(
self, self.sch_name)
self.assertFalse(
is_present, "DBMS schedule was not deleted successfully")
is_present = job_scheduler_utils.verify_dbms_schedule(
self, self.sch_name2)
self.assertFalse(
is_present, "DBMS schedule was not deleted successfully")
else:
response = job_scheduler_utils.api_delete(self)
# Assert response
utils.assert_status_code(self, response)
is_present = job_scheduler_utils.verify_dbms_schedule(
self, self.sch_name)
self.assertFalse(
is_present, "DBMS schedule was not deleted successfully")
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,65 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import os
import json
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_schedules_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSGetMSQLScheduleTestCase(BaseTestGenerator):
"""This class will test the add schedule in the DBMS Schedule API"""
scenarios = utils.generate_scenarios("dbms_msql_schedule",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
def runTest(self):
""" This function will add DBMS Schedule under test database. """
url_encode_data = self.data
response = job_scheduler_utils.api_get_msql(self, url_encode_data)
# Assert response
utils.assert_status_code(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,92 @@
##########################################################################
#
# pgAdmin 4 - PostgreSQL Tools
#
# Copyright (C) 2013 - 2024, The pgAdmin Development Team
# This software is released under the PostgreSQL Licence
#
##########################################################################
import uuid
import os
import json
from unittest.mock import patch
from pgadmin.utils.route import BaseTestGenerator
from regression.python_test_utils import test_utils as utils
from ...tests import utils as job_scheduler_utils
from pgadmin.browser.server_groups.servers.databases.tests import \
utils as database_utils
# Load test data from json file.
CURRENT_PATH = os.path.dirname(os.path.realpath(__file__))
with open(CURRENT_PATH + "/dbms_schedules_test_data.json") as data_file:
test_cases = json.load(data_file)
class DBMSGetScheduleTestCase(BaseTestGenerator):
"""This class will test the add schedule in the DBMS Schedule API"""
scenarios = utils.generate_scenarios("dbms_get_schedule",
test_cases)
def setUp(self):
super().setUp()
# Load test data
self.data = self.test_data
if not job_scheduler_utils.is_supported_version(self):
self.skipTest(job_scheduler_utils.SKIP_MSG)
# Create db
self.db_name, self.db_id = job_scheduler_utils.create_test_database(
self)
db_con = database_utils.connect_database(self,
utils.SERVER_GROUP,
self.server_id,
self.db_id)
if db_con["info"] != "Database connected.":
raise Exception("Could not connect to database.")
# Create extension required for job scheduler
job_scheduler_utils.create_job_scheduler_extensions(self)
if not job_scheduler_utils.is_dbms_job_scheduler_present(self):
self.skipTest(job_scheduler_utils.SKIP_MSG_EXTENSION)
self.sch_name = "test_schedule_get%s" % str(uuid.uuid4())[1:8]
self.schedule_id = job_scheduler_utils.create_dbms_schedule(
self, self.sch_name)
# multiple schedules
if self.is_list:
self.sch_name2 = "test_schedule_get%s" % str(uuid.uuid4())[1:8]
self.schedule_id_2 = job_scheduler_utils.create_dbms_schedule(
self, self.sch_name2)
def runTest(self):
""" This function will test DBMS Schedule under test database."""
if self.is_positive_test:
if self.is_list:
response = job_scheduler_utils.api_get(self, '')
else:
response = job_scheduler_utils.api_get(self)
# Assert response
utils.assert_status_code(self, response)
else:
if self.mocking_required:
with patch(self.mock_data["function_name"],
side_effect=[eval(self.mock_data["return_value"])]):
response = job_scheduler_utils.api_get(self)
# Assert response
utils.assert_status_code(self, response)
utils.assert_error_message(self, response)
def tearDown(self):
"""This function will do the cleanup task."""
job_scheduler_utils.delete_dbms_schedule(self, self.sch_name)
if self.is_list:
job_scheduler_utils.delete_dbms_schedule(self, self.sch_name2)
job_scheduler_utils.clean_up(self)

View File

@ -0,0 +1,16 @@
<svg width="24" height="24" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M13.9703 3.5697H3.51531C2.9934 3.5697 2.57031 3.99279 2.57031 4.5147V17.9997C2.57031 18.5216 2.9934 18.9447 3.51531 18.9447H13.9703C14.4922 18.9447 14.9153 18.5216 14.9153 17.9997V4.5147C14.9153 3.99279 14.4922 3.5697 13.9703 3.5697Z" fill="#DEF4FD"/>
<path d="M13.97 4.125C14.0695 4.125 14.1648 4.16451 14.2352 4.23484C14.3055 4.30516 14.345 4.40054 14.345 4.5V18C14.345 18.0995 14.3055 18.1948 14.2352 18.2652C14.1648 18.3355 14.0695 18.375 13.97 18.375H3.47C3.37583 18.3674 3.28799 18.3246 3.22402 18.2551C3.16006 18.1856 3.1247 18.0945 3.125 18V4.5C3.125 4.40054 3.16451 4.30516 3.23484 4.23484C3.30516 4.16451 3.40054 4.125 3.5 4.125H14H13.97ZM14 3H3.5C3.10218 3 2.72064 3.15804 2.43934 3.43934C2.15804 3.72064 2 4.10218 2 4.5V18C2 18.3978 2.15804 18.7794 2.43934 19.0607C2.72064 19.342 3.10218 19.5 3.5 19.5H14C14.3978 19.5 14.7794 19.342 15.0607 19.0607C15.342 18.7794 15.5 18.3978 15.5 18V4.5C15.5 4.10218 15.342 3.72064 15.0607 3.43934C14.7794 3.15804 14.3978 3 14 3V3Z" fill="#34495E"/>
<path d="M6.30469 3.9447V2.9997C6.30855 2.75426 6.40778 2.51995 6.58135 2.34637C6.75493 2.17279 6.98924 2.07357 7.23469 2.0697H10.2347C10.4827 2.06967 10.7209 2.16717 10.8977 2.34116C11.0745 2.51515 11.1758 2.75168 11.1797 2.9997V3.9447H6.30469Z" fill="#DEF4FD"/>
<path d="M10.2344 2.625C10.3338 2.625 10.4292 2.66451 10.4995 2.73484C10.5699 2.80516 10.6094 2.90054 10.6094 3V3.375H6.85938V3C6.85938 2.90054 6.89888 2.80516 6.96921 2.73484C7.03954 2.66451 7.13492 2.625 7.23438 2.625H10.2344ZM10.2344 1.5H7.23438C6.83655 1.5 6.45502 1.65804 6.17372 1.93934C5.89241 2.22064 5.73438 2.60218 5.73438 3V4.5H11.7344V3C11.7344 2.60218 11.5763 2.22064 11.295 1.93934C11.0137 1.65804 10.6322 1.5 10.2344 1.5V1.5Z" fill="#34495E"/>
<path d="M7.92578 8.41502V7.39502H12.7258V8.43002L7.92578 8.41502Z" fill="#2195E7"/>
<path d="M4.91016 8.60982V7.31982H6.20016V8.60982H4.91016Z" fill="#2195E7"/>
<path d="M7.92578 11.9998V10.9198H12.7258V11.9998H7.92578Z" fill="#2195E7"/>
<path d="M4.91016 12.1351V10.8601H6.20016V12.1501L4.91016 12.1351Z" fill="#2195E7"/>
<path d="M7.92578 15.4653V14.4303H12.7258V15.4653H7.92578Z" fill="#2195E7"/>
<path d="M4.91016 15.5848V14.2948H6.20016V15.5848H4.91016Z" fill="#2195E7"/>
<path d="M15.6705 21.9956C19.2854 21.9956 22.2159 19.0651 22.2159 15.4501C22.2159 11.8352 19.2854 8.90466 15.6705 8.90466C12.0555 8.90466 9.125 11.8352 9.125 15.4501C9.125 19.0651 12.0555 21.9956 15.6705 21.9956Z" fill="#E3F3F9"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M15.6719 9.35916C12.308 9.35916 9.58097 12.0862 9.58097 15.4501C9.58097 18.814 12.308 21.541 15.6719 21.541C19.0358 21.541 21.7628 18.814 21.7628 15.4501C21.7628 12.0862 19.0358 9.35916 15.6719 9.35916ZM8.67188 15.4501C8.67188 11.5841 11.8059 8.45007 15.6719 8.45007C19.5379 8.45007 22.6719 11.5841 22.6719 15.4501C22.6719 19.3161 19.5379 22.4501 15.6719 22.4501C11.8059 22.4501 8.67188 19.3161 8.67188 15.4501Z" fill="#BD4949"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M15.6725 10.4805C16.0072 10.4805 16.2785 10.7518 16.2785 11.0865V15.4502C16.2785 15.7849 16.0072 16.0562 15.6725 16.0562C15.3377 16.0562 15.0664 15.7849 15.0664 15.4502V11.0865C15.0664 10.7518 15.3377 10.4805 15.6725 10.4805Z" fill="#1E50AD"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M15.0664 15.4502C15.0664 15.1155 15.3377 14.8441 15.6725 14.8441H18.9452C19.2799 14.8441 19.5513 15.1155 19.5513 15.4502C19.5513 15.7849 19.2799 16.0562 18.9452 16.0562H15.6725C15.3377 16.0562 15.0664 15.7849 15.0664 15.4502Z" fill="#1E50AD"/>
</svg>

View File

@ -0,0 +1,47 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2024, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import DBMSJobSchedulerSchema from './dbms_jobscheduler.ui';
define('pgadmin.node.dbms_job_scheduler', [
'sources/gettext', 'sources/pgadmin',
'pgadmin.browser', 'pgadmin.browser.collection', 'pgadmin.node.dbms_job',
'pgadmin.node.dbms_program', 'pgadmin.node.dbms_schedule',
], function(gettext, pgAdmin, pgBrowser) {
if (!pgBrowser.Nodes['dbms_job_scheduler']) {
pgAdmin.Browser.Nodes['dbms_job_scheduler'] = pgAdmin.Browser.Node.extend({
parent_type: 'database',
type: 'dbms_job_scheduler',
label: gettext('DBMS Job Scheduler'),
columns: ['jobname', 'jobstatus', 'joberror', 'jobstarttime', 'jobendtime', 'jobnextrun'],
canDrop: false,
canDropCascade: false,
canSelect: false,
hasSQL: false,
hasDepends: false,
hasStatistics: false,
hasScriptTypes: [],
Init: function() {
/* Avoid multiple registration of menus */
if (this.initialized)
return;
this.initialized = true;
},
getSchema: ()=> {
return new DBMSJobSchedulerSchema();
}
});
}
return pgBrowser.Nodes['dbms_job_scheduler'];
});

View File

@ -0,0 +1,255 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2024, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import gettext from 'sources/gettext';
import BaseUISchema from 'sources/SchemaView/base_schema.ui';
import { WEEKDAYS, MONTHDAYS, MONTHS, HOURS, MINUTES } from '../../../../../../static/js/constants';
function isReadOnly(obj, state, parentName) {
// Always true in case of edit dialog
if (!obj.isNew(state))
return true;
  // Check the parent name and return the appropriate value for the given state.
if (parentName == 'job') {
return obj.isNew(state) && state.jsjobtype == 'p';
}
return false;
}
export function getRepeatSchema(obj, parentName) {
return [
{
id: 'jsscrepeatint', label: gettext('Repeat Interval'), cell: 'text',
readonly: true, type: 'multiline', mode: ['edit', 'properties'],
group: gettext('Repeat'),
}, {
id: 'jsscstart', label: gettext('Start'), type: 'datetimepicker',
cell: 'datetimepicker', group: gettext('Repeat'),
controlProps: { ampm: false,
placeholder: gettext('YYYY-MM-DD HH:mm:ss Z'), autoOk: true,
disablePast: true,
},
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsscstart: null };
}
}
}, {
id: 'jsscend', label: gettext('End'), type: 'datetimepicker',
cell: 'datetimepicker', group: gettext('Repeat'),
controlProps: { ampm: false,
placeholder: gettext('YYYY-MM-DD HH:mm:ss Z'), autoOk: true,
disablePast: true,
},
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsscend: null };
}
}
}, {
id: 'jsscfreq', label: gettext('Frequency'), type: 'select',
group: gettext('Repeat'),
controlProps: { allowClear: true,
placeholder: gettext('Select the frequency...'),
},
options: [
{'label': 'YEARLY', 'value': 'YEARLY'},
{'label': 'MONTHLY', 'value': 'MONTHLY'},
{'label': 'WEEKLY', 'value': 'WEEKLY'},
{'label': 'DAILY', 'value': 'DAILY'},
{'label': 'HOURLY', 'value': 'HOURLY'},
{'label': 'MINUTELY', 'value': 'MINUTELY'},
],
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsscfreq: null };
}
}
}, {
id: 'jsscdate', label: gettext('Date'), type: 'datetimepicker',
cell: 'datetimepicker', group: gettext('Repeat'),
controlProps: { ampm: false,
placeholder: gettext('YYYYMMDD'), autoOk: true,
disablePast: true, pickerType: 'Date', format: 'yyyyMMdd',
},
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsscdate: null };
}
}
}, {
id: 'jsscmonths', label: gettext('Months'), type: 'select',
group: gettext('Repeat'),
controlProps: { allowClear: true, multiple: true, allowSelectAll: true,
placeholder: gettext('Select the months...'),
},
options: MONTHS,
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsscmonths: null };
}
}
}, {
id: 'jsscweekdays', label: gettext('Week Days'), type: 'select',
group: gettext('Repeat'),
controlProps: { allowClear: true, multiple: true, allowSelectAll: true,
placeholder: gettext('Select the weekdays...'),
},
options: WEEKDAYS,
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsscweekdays: null };
}
}
}, {
id: 'jsscmonthdays', label: gettext('Month Days'), type: 'select',
group: gettext('Repeat'),
controlProps: { allowClear: true, multiple: true, allowSelectAll: true,
placeholder: gettext('Select the month days...'),
},
options: MONTHDAYS,
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsscmonthdays: null };
}
}
}, {
id: 'jsschours', label: gettext('Hours'), type: 'select',
group: gettext('Repeat'),
controlProps: { allowClear: true, multiple: true, allowSelectAll: true,
placeholder: gettext('Select the hours...'),
},
options: HOURS,
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsschours: null };
}
}
}, {
id: 'jsscminutes', label: gettext('Minutes'), type: 'select',
group: gettext('Repeat'),
controlProps: { allowClear: true, multiple: true, allowSelectAll: true,
placeholder: gettext('Select the minutes...'),
},
options: MINUTES,
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsscminutes: null };
}
}
}
];
}
export function getActionSchema(obj, parentName) {
return [
{
id: 'jsprtype', label: gettext('Type'), type: 'select',
controlProps: { allowClear: false}, group: gettext('Action'),
options: [
{'label': 'PLSQL BLOCK', 'value': 'PLSQL_BLOCK'},
{'label': 'STORED PROCEDURE', 'value': 'STORED_PROCEDURE'},
],
readonly: function(state) { return isReadOnly(obj, state, parentName); },
deps:['jsjobtype'],
depChange: (state) => {
if (state.jsjobtype == 'p') {
return { jsprtype: null };
}
}
}, {
id: 'jsprcode', label: gettext('Code'), type: 'sql',
group: gettext('Code'), isFullTab: true,
readonly: function(state) {
return isReadOnly(obj, state, parentName) || state.jsprtype == 'STORED_PROCEDURE';
},
deps:['jsprtype', 'jsjobtype'],
depChange: (state) => {
if (obj.isNew(state) && (state.jsprtype == 'STORED_PROCEDURE' || state.jsjobtype == 'p')) {
return { jsprcode: '' };
}
}
}, {
id: 'jsprproc', label: gettext('Procedure'), type: 'select',
controlProps: { allowClear: false}, group: gettext('Action'),
options: obj.fieldOptions.procedures,
optionsLoaded: (options) => { obj.jsprocData = options; },
readonly: function(state) {
return isReadOnly(obj, state, parentName) || state.jsprtype == 'PLSQL_BLOCK';
},
deps:['jsprtype', 'jsjobtype'],
depChange: (state) => {
if (obj.isNew(state) && (state.jsprtype == 'PLSQL_BLOCK' || state.jsjobtype == 'p')) {
return { jsprproc: null, jsprnoofargs : 0, jsprarguments: [] };
}
for(const option of obj.jsprocData) {
if (option.label == state.jsprproc) {
return { jsprnoofargs : option.no_of_args,
jsprarguments: option.arguments};
}
}
}
}, {
id: 'jsprnoofargs', label: gettext('Number of Arguments'),
type: 'int', group: gettext('Action'), deps:['jsprtype'],
readonly: true,
}, {
id: 'jsprarguments', label: gettext('Arguments'), cell: 'string',
group: gettext('Arguments'), type: 'collection',
canAdd: false, canDelete: false, canDeleteRow: false, canEdit: false,
mode: ['create', 'edit'],
columns: parentName == 'job' ? ['argname', 'argtype', 'argdefval', 'argval'] : ['argname', 'argtype', 'argdefval'],
schema : new ProgramArgumentSchema(parentName),
}
];
}
export class ProgramArgumentSchema extends BaseUISchema {
constructor(parentName) {
super();
this.parentName = parentName;
}
get baseFields() {
return[{
id: 'argname', label: gettext('Name'), type: 'text',
cell: '', readonly: true,
}, {
id: 'argtype', label: gettext('Data type'), type: 'text',
cell: '', readonly: true,
}, {
id: 'argdefval', label: gettext('Default'), type: 'text',
cell: '', readonly: true,
}, {
id: 'argval', label: gettext('Value'), type: 'text',
cell: 'text',
}];
}
}

View File

@ -0,0 +1,45 @@
/////////////////////////////////////////////////////////////
//
// pgAdmin 4 - PostgreSQL Tools
//
// Copyright (C) 2013 - 2024, The pgAdmin Development Team
// This software is released under the PostgreSQL Licence
//
//////////////////////////////////////////////////////////////
import gettext from 'sources/gettext';
import BaseUISchema from 'sources/SchemaView/base_schema.ui';
export default class DBMSJobSchedulerSchema extends BaseUISchema {
constructor() {
super({
jobid: null,
jobname: '',
jobstatus: '',
joberror: ''
});
}
get idAttribute() {
return 'jobid';
}
get baseFields() {
return [
{
id: 'jobid', label: gettext('ID'), cell: 'int', mode: ['properties']
}, {
id: 'jobname', label: gettext('Name'), cell: 'text'
}, {
id: 'jobstatus', label: gettext('Status'), cell: 'text'
}, {
id: 'joberror', label: gettext('Error'), cell: 'text'
}, {
id: 'jobstarttime', label: gettext('Start Time'), cell: 'datetimepicker'
}, {
id: 'jobendtime', label: gettext('End Time'), cell: 'datetimepicker'
}, {
id: 'jobnextrun', label: gettext('Next Run'), cell: 'datetimepicker'
}];
}
}

Some files were not shown because too many files have changed in this diff.