Compare commits

...

143 Commits

Author SHA1 Message Date
Julien Fontanet
6f9dad8999 feat(xo-server): 5.21.0 2018-06-28 14:18:47 +02:00
Julien Fontanet
089cffcca1 feat(xo-server-load-balancer): 0.3.2 2018-06-28 14:14:48 +02:00
Julien Fontanet
88df5a8337 feat(xen-api): 0.16.11 2018-06-28 14:13:35 +02:00
Julien Fontanet
794c1cf89b feat(vhd-lib): 0.2.0 2018-06-28 14:10:08 +02:00
Julien Fontanet
9a5eea6e78 chore(CHANGELOG): 5.21.0 2018-06-28 13:58:44 +02:00
badrAZ
40568cd61f feat(xo-web/backup-ng/logs): copy full log and report a failed job (#3110)
Fixes #3100
2018-06-28 10:57:35 +02:00
Pierre Donias
358e1441cc fix(xo-server/pool.mergeInto): fail when product brands are different (#3118)
Fixes #3061
2018-06-28 10:10:55 +02:00
badrAZ
be930e127e fix(xo-web/vm/snapshots): creation from snapshot (#3117) 2018-06-28 10:03:19 +02:00
badrAZ
3656e83df5 feat(backup-ng/logs): add the job's name to the modal's title (#3115)
See #2711
2018-06-27 17:43:06 +02:00
badrAZ
abbb0450f8 feat(xo-web/backup-ng/logs): remove Mode column (#3116)
See #2711
2018-06-27 17:41:03 +02:00
Pierre Donias
8e4beeb00f feat(xo-web/backup-ng/health): legacy snapshots table (#3111)
Fixes #3082
2018-06-27 17:25:52 +02:00
Julien Fontanet
05d10ef985 feat(xo-web/settings/remotes): type defaults to NFS (#3114)
Fixes #3103
2018-06-27 17:04:55 +02:00
Pierre Donias
989d27154d fix(xo-web/backup/health): missing schedule subscription (#3104)
Also, move the table from `Backup/Health` to `Backup NG/Health`.
2018-06-26 16:12:29 +02:00
Pierre Donias
ec9957bd86 feat(xo-server,xo-web/tasks): disable cancel/destroy when not relevant (#3109)
* feat(xo-server,xo-web/tasks): disable cancel/destroy when not relevant

Fixes #3076

* cancellable/destroyable → allowedOperations

* Update index.js

* Update index.js
2018-06-26 15:40:53 +02:00
Julien Fontanet
dc8a7c46e0 feat(xo-server): don't create default remote (#3107)
Related to #3105
2018-06-26 14:47:24 +02:00
Pierre Donias
9ee2d8e0c2 feat(xo-web): new backup/health view (#3102)
Fixes #3090

Move "VM snapshots related to non-existent backups" table
from dashboard/health to backup/health

### Check list

> Check items when done or if not relevant

- [x] PR references the relevant issue (e.g. `Fixes #007`)
- [x] if UI changes, a screenshot has been added to the PR
- [x] CHANGELOG updated
- [x] documentation updated

### Process

1. create a PR as soon as possible
1. mark it as `WiP:` (Work in Progress) if not ready to be merged
1. when you want a review, add a reviewer
1. if necessary, update your PR, and re-add a reviewer

### List of packages to release

> No need to mention xo-server and xo-web.

### Screenshots

![capture_2018-06-25_12 18 01](https://user-images.githubusercontent.com/10992860/41858496-56af27d0-789a-11e8-9f86-e3ce1ac28e54.png)
2018-06-25 17:59:36 +02:00
badrAZ
6c62d6840a fix(backup-ng/new): "new-schedules" don't exist anymore (#3098) 2018-06-25 14:02:10 +02:00
badrAZ
2a2135ac71 fix(xo-server-load-balancer): make the metric "memoryFree" optional (#3073) 2018-06-22 19:00:27 +02:00
badrAZ
efaad2efb2 feat(backup-ng/new): ability to enable/disable a schedule (#3094)
Fixes #3062
2018-06-22 17:25:31 +02:00
badrAZ
3b244c24d7 feat(backup-ng/new): group saved & new schedules (#3093)
See #2711
2018-06-22 16:23:12 +02:00
badrAZ
915052d5f6 feat(backup NG): ability to cancel a running job (#3053)
Fixes #3047
2018-06-22 09:10:10 +02:00
Pierre Donias
05c6c7830d feat(xo-web/backups): add blog link to deprecation message (#3092)
Fixes #3089
2018-06-21 16:48:28 +02:00
badrAZ
0217c51559 fix(backup-ng/new): only list writable SRs for CR/DR (#3085)
Fixes #3050
2018-06-21 15:41:16 +02:00
badrAZ
0c514198bb feat: ability to save cloud configs (#3054)
Fixes #2984
2018-06-21 15:02:52 +02:00
Pierre Donias
0e68834b4c chore(xo-web): multiple minor fixes (#3091) 2018-06-21 12:10:38 +02:00
Julien Fontanet
ee99ef6264 feat(vhd-lib): createContentStream (#3086)
Export the raw content of the VHD as a stream.

This feature is exposed in the CLI: `vhd-cli raw input.vhd output.raw`

Related to #3083

Perf comparison between qemu-img and our vhd-cli to convert a 10GiB VHD file to raw:

```
> time qemu-img convert -f vpc -O raw origin.vhd expected.raw
1.40user 15.19system 1:01.88elapsed 26%CPU (0avgtext+0avgdata 24264maxresident)k
20979008inputs+20971520outputs (12major+4648minor)pagefaults 0swaps
> time vhd-cli raw origin.vhd actual.raw
21.97user 16.03system 1:09.11elapsed 54%CPU (0avgtext+0avgdata 65208maxresident)k
20956008inputs+20972448outputs (1major+754101minor)pagefaults 0swaps

> md5sum *.raw
b55ec6924be750edd2423e4a7aa262c3  actual.raw
b55ec6924be750edd2423e4a7aa262c3  expected.raw
```
2018-06-20 17:53:20 +02:00
badrAZ
bebb9bf0df fix(XapiStats/iowait): don't scale XAPI values (#3079)
Fixes #2969
2018-06-19 16:16:07 +02:00
badrAZ
4830ac9623 fix(xo-server/xapi-stats): specify the wanted step when requesting the RRD (#3078)
Fixes #3026
Fixes #3075
2018-06-18 18:32:16 +02:00
badrAZ
58b1d0fba8 fix(xo-server-load-balancer): issue with a mutation of a var (#3077) 2018-06-18 16:01:26 +02:00
Julien Fontanet
cc4e69e631 feat(xo-server): 5.20.3 2018-06-16 17:05:54 +02:00
Julien Fontanet
e14fda6e8a feat(xen-api): 0.1.3 2018-06-16 17:05:27 +02:00
Julien Fontanet
ec48b77af3 feat(xen-api): add task record to task errors 2018-06-16 15:24:26 +02:00
Julien Fontanet
c7d6a19864 fix(xo-server/full backup NG): do not fork if 1 target 2018-06-16 15:24:26 +02:00
Julien Fontanet
7d714c8ce4 feat(xo-server): 5.20.2 2018-06-16 13:51:51 +02:00
Julien Fontanet
f70989c3a2 feat(delta NG): fix UUID of existing VHDs 2018-06-16 13:51:03 +02:00
Julien Fontanet
70490988b0 chore(xo-server/backupNg): add parseUuid util function 2018-06-16 13:51:03 +02:00
Julien Fontanet
d0795fdded feat(vhd-lib): make footer.{uuid,parentUuid} a buffer 2018-06-16 13:51:03 +02:00
Julien Fontanet
1c736e9910 feat(xo-web): 5.20.2 2018-06-15 15:35:26 +02:00
Julien Fontanet
62979d5c22 feat(xo-server): 5.20.1 2018-06-15 15:35:00 +02:00
Julien Fontanet
ec8a4da73c feat(xo-vmdk-to-vhd): 0.1.3 2018-06-15 15:34:33 +02:00
Julien Fontanet
dea1bfee01 feat(xo-server-backup-reports): 0.12.2 2018-06-15 15:32:02 +02:00
Julien Fontanet
c18b82504a feat(xen-api): 0.16.10 2018-06-15 15:30:59 +02:00
Julien Fontanet
ed5460273f feat(vhd-lib): 0.1.2 2018-06-15 15:27:21 +02:00
Julien Fontanet
b91f8b21b9 feat(fs): 0.1.0 2018-06-15 15:26:48 +02:00
Julien Fontanet
5cea18e577 feat(delta NG): check VDIs before export (#3069) 2018-06-15 15:22:12 +02:00
Julien Fontanet
148eaa6a72 fix(xo-server/backup NG): retention checks (#3072)
They were broken by the introduction of `copyRetention`.
2018-06-15 12:13:55 +02:00
Rajaa.BARHTAOUI
80794211af feat(xo-server,xo-web/snapshots): fast-clone to create VM from snapshot (#3030)
Fixes #2937
2018-06-14 15:01:19 +02:00
Rajaa.BARHTAOUI
75dcbae417 chore(xo-web/backup): deprecate legacy backup creation (#3035)
Fixes #2956
2018-06-14 14:18:17 +02:00
Rajaa.BARHTAOUI
b19682b3c5 fix(xo-web/New VM): networks predicate in Self Service mode (#3027)
Fixes #3011
2018-06-14 13:25:08 +02:00
badrAZ
dd3b97cae2 feat(backup/overview): add tooltip to migrate action button (#3067)
Fixes #3042
2018-06-14 11:06:28 +02:00
Julien Fontanet
79891235f3 chore(CHANGELOG): add missing entry related to prev commit 2018-06-13 15:34:11 +02:00
Julien Fontanet
1e2f72ab6b feat(backup NG): new setting copyRetention (#2976)
Fixes #2895
2018-06-13 15:24:46 +02:00
Julien Fontanet
66d02e3808 feat(xo-server/backup NG): more logs (#3066) 2018-06-13 13:11:03 +02:00
Julien Fontanet
275e1f8f4c chore(xo-server): Xapi#_assertHealthyVdiChains is sync 2018-06-13 12:01:41 +02:00
Julien Fontanet
84dbbb0fbb chore(xo-server/backups-ng): add FIXME 2018-06-13 11:33:05 +02:00
Julien Fontanet
a36ef5209c fix(backup NG): only display the concerned tasks (#3063)
Allows us to add other tasks if necessary without breaking xo-web listing and the backup reports.
2018-06-12 13:18:46 +02:00
badrAZ
3497889302 fix(backup-ng): only display the concerned tasks 2018-06-12 11:53:43 +02:00
badrAZ
0a2f6b4ce7 feat(xo-web/backupNg/new): improve backupNg feedback (#2873)
See #2711
2018-06-12 11:52:19 +02:00
Julien Fontanet
f8be44d746 chore(xo-server): remove Xapi#barrier legacy type param 2018-06-11 18:00:16 +02:00
Julien Fontanet
379253c5ae chore(xen-api): lodash.forEach is unnecessary here 2018-06-11 17:57:02 +02:00
Julien Fontanet
aed1ba474c chore(xen-api): remove unnecessary test
`eventWatchers` is defined if event watching is enabled, which is always the case here.
2018-06-11 17:55:47 +02:00
badrAZ
bc72e67442 feat(backup NG): new option to shutdown VMs before snapshot (#3060)
Fixes #3058
2018-06-11 17:25:18 +02:00
Julien Fontanet
26c965faa9 feat(xen-api/examples): handle cancelation 2018-06-11 15:48:43 +02:00
Pierre Donias
b3a3965ed2 feat(xo-web/SR): copy VDIs' UUIDs from Disks tab (#3059)
Fixes #3051
2018-06-11 14:57:37 +02:00
Julien Fontanet
7f88b46f4c chore(xen-api/examples): update deps 2018-06-08 15:36:29 +02:00
badrAZ
dd60d82d3d fix(xo-web/Backup-ng/logs): ability to retry a single failed/interrupted VM backup (#3052)
Fixes #2912
2018-06-08 10:52:38 +02:00
Julien Fontanet
4eeb995340 chore(CHANGELOG): update for next release 2018-06-08 10:32:00 +02:00
Julien Fontanet
1d29348e30 feat(xo-server-backup-reports): 0.12.1 2018-06-07 18:46:53 +02:00
Pierre Donias
a24db3f896 fix(xo-web/SortedTable): show grouped actions when all items selected (#3049)
Fixes #3048
2018-06-07 17:26:58 +02:00
Julien Fontanet
cffac27d0a feat(xo-server/jobs): implement cancelation (#3046)
Related to #3047

This is the first step toward Backup NG cancelation; the server-side stuff should be OK (but needs testing), so the next step is to expose it in the UI.
2018-06-07 17:20:06 +02:00
Julien Fontanet
b207cbdd77 feat(fs/read): read part of a file in an existing Buffer (#3036)
Easier to use and probably more efficient than `createReadStream` for this specific usage.
2018-06-07 17:19:33 +02:00
badrAZ
10baecefb9 fix(xo-server-backup-reports): don't display size and speed when there is no transfer/merge (#3038) 2018-06-07 16:34:49 +02:00
badrAZ
42620323a9 fix(xo-web/Backup-ng): add label to Edit action (#3045)
Fixes #3043
2018-06-07 16:21:31 +02:00
Julien Fontanet
4d91006994 feat(xo-server/backupNg.getAllLogs): sort tasks (#3041) 2018-06-07 14:04:46 +02:00
badrAZ
a81f0b9a93 feat(xo-web/Backup NG/logs): display whether the export is delta/full (#3023)
See #2711
2018-06-07 12:30:44 +02:00
Julien Fontanet
2cee413ae1 chore(PR template): should reference issue 2018-06-07 12:14:58 +02:00
Nicolas Raynaud
53099eacc8 chore(xo-vmdk-to-vhd): split a file and rename some consts (#2966) 2018-06-06 16:49:18 +02:00
badrAZ
b628c5c07e fix(xo-server-backup-reports): handle the case when a transfer/merge fails (#3020) 2018-06-06 15:21:46 +02:00
Julien Fontanet
12889b6a09 fix(xo-server/Xapi#importDeltaVm): correctly copy task prop (#3034) 2018-06-06 14:26:19 +02:00
badrAZ
0c23ca5b66 feat(xo-web/Backup NG/logs): details are now dynamic (#3031) 2018-06-06 14:25:31 +02:00
Pierre Donias
d732ee3ade fix(xo-web/file restore NG): restrict to Premium (#3032) 2018-06-06 13:37:28 +02:00
badrAZ
65cb0bc4cf fix(xo-server-backup-reports): correctly send status to Nagios (#3019)
Fixes #2991
2018-06-05 17:55:06 +02:00
badrAZ
1ba68a94e3 fix(xo-server): vm.xenTools.* must be numbers (#3022) 2018-06-05 14:30:30 +02:00
Julien Fontanet
084430451a feat(xo-web): 5.20.1 2018-05-31 21:00:34 +02:00
Julien Fontanet
458a4d4efe fix(xo-web/backup NG logs): don't display size/speed if size is 0 2018-05-31 20:59:47 +02:00
Julien Fontanet
62eeab2a74 feat(xo-web): 5.20.0 2018-05-31 18:33:13 +02:00
Julien Fontanet
790b43910d feat(xo-server): 5.20.0 2018-05-31 18:33:13 +02:00
Julien Fontanet
ba65461c4d feat(xo-server-usage-report): 0.5.0 2018-05-31 18:29:45 +02:00
Julien Fontanet
5bd468791f feat(xo-server-backup-reports): 0.12.0 2018-05-31 18:28:21 +02:00
Julien Fontanet
37f71bb36c feat(xo-acl-resolver): 0.2.4 2018-05-31 18:26:53 +02:00
Julien Fontanet
2ed4b7ad3f feat(vhd-lib): 0.1.1 2018-05-31 17:59:02 +02:00
Julien Fontanet
7eb970f22a feat(fs): 0.0.1 2018-05-31 17:57:13 +02:00
badrAZ
13db4a8411 feat(Backup NG): improve logs (#3013) 2018-05-31 17:54:35 +02:00
badrAZ
49a7a89bbf feat(xo-web/new-vm): ability to use template vars in the CloudConfig (#3006)
Fixes #2140
2018-05-31 17:44:52 +02:00
Rajaa.BARHTAOUI
0af8a60c1c fix(xo-web/SR/disks): show VM templates attached to VDIs (#3012)
Fixes #2974
2018-05-31 17:32:34 +02:00
Rajaa.BARHTAOUI
e1650b376c fix(xo-web/Self new VM): do not auto-select resource set's 1st SR (#3007)
New behaviour: either auto-select the template's pool's default SR if
it's in the resource set or do not auto-select any SR at all

Fixes #3001
2018-05-31 16:57:52 +02:00
Rajaa.BARHTAOUI
873b40cc70 feat(xo-server,xo-web): allow setting remote syslog host (#2958)
Fixes #2900
2018-05-31 16:22:39 +02:00
Rajaa.BARHTAOUI
d45265b180 feat(xo-web): create single-server private network (#3004)
Fixes #2944
2018-05-31 11:25:08 +02:00
badrAZ
ff50b2848e feat(xo-server,xo-web): allow pool admins to create VMs (#2995)
Fixes #2350
2018-05-31 10:42:03 +02:00
Julien Fontanet
d67fae22ab chore: initial PR template 2018-05-30 18:10:11 +02:00
badrAZ
d809002558 feat(Backup NG): ability to retry a single failed VM backup (#3009)
Fixes #2912
2018-05-30 17:04:50 +02:00
Pierre Donias
5c30559d15 feat(xo-server,xo-web): handle XCP-ng patches (#3005) 2018-05-30 16:03:43 +02:00
Rajaa.BARHTAOUI
cbb5b011e1 fix(xo-web/SR/disks): show control domain VMs attached to VDIs (#3002)
Fixes #2999
2018-05-30 15:55:44 +02:00
Julien Fontanet
f5bff408a8 feat(xo-server/exportDeltaVm): don't check chain if no snapshot
Should fix a race condition with the XenServer DB during Delta Backup NG.
2018-05-30 12:22:10 +02:00
Nicolas Raynaud
d7cfe4d3dc fix(vhd/merge): fix performance enhancement (#2980) 2018-05-30 10:26:15 +02:00
badrAZ
7be8f38c6b fix(xo-web/jobs/edit): support all types of inputs (#2997) 2018-05-30 10:20:22 +02:00
badrAZ
08a7e605ce feat(xo-web/backup logs): show # of calls for each state (#2860) 2018-05-30 09:58:48 +02:00
Julien Fontanet
4b57db5893 feat(xo-server/delta NG): validate parent VHD 2018-05-29 16:36:33 +02:00
Julien Fontanet
8b1ae3f3c9 fix(xo-server/delta NG): don't select tmp file as parent 2018-05-29 16:36:10 +02:00
Julien Fontanet
77d35a5928 chore(fs/Handler#list): prepend dir after filtering 2018-05-29 16:34:51 +02:00
Julien Fontanet
323d409e6c chore(package): require yarn >1.7
Previous releases prevented xo-web from being built due to an incorrect resolution of dependencies.
2018-05-29 16:29:14 +02:00
Pierre Donias
9f2f2b7b69 fix(xo-web/render-xo-item): show SR container name (#3003)
Fixes #3000
2018-05-29 12:01:50 +02:00
Julien Fontanet
b44fa7beca chore(xo-server): bump xo:perf threshold to 500ms 2018-05-29 10:53:14 +02:00
Julien Fontanet
6d4e310b8e chore(vhd-cli/build): migrate to Babel 7 2018-05-28 18:18:44 +02:00
Julien Fontanet
6726530229 fix(tests): run all tests with Babel 7
Temporarily disable xo-web tests, which use Babel 6.
2018-05-28 18:18:44 +02:00
Julien Fontanet
8351352541 chore(xo-remote-parser/build): migrate to Babel 7 2018-05-28 18:18:44 +02:00
Julien Fontanet
3f9e8d79ea chore(xo-collection/build): migrate to Babel 7 2018-05-28 18:18:44 +02:00
Julien Fontanet
685f2328bd chore(package): update dependencies 2018-05-28 14:54:55 +02:00
Pierre Donias
746567a8a7 fix(xo-acl-resolver,xo-web): SR & PIF ACLs inheritance (#2994) 2018-05-25 15:26:45 +02:00
Julien Fontanet
c116c41c42 chore(package): update dependencies 2018-05-25 11:42:55 +02:00
Pierre Donias
3768a7de37 fix(xo-web/select): do not auto-select disabled option (#2992) 2018-05-25 11:11:21 +02:00
Julien Fontanet
11ef0ee54f feat(rolling snapshots legacy): check VDI chains (#2986) 2018-05-24 16:39:06 +02:00
Julien Fontanet
33ae531e3a fix(package): update hashy to 0.7.1
Correctly handle hash with identifier `2b`.
2018-05-24 14:52:38 +02:00
Julien Fontanet
8cc9924751 feat(xo-server/importDeltaVm): add UUID of missing base VM 2018-05-24 00:04:52 +02:00
Julien Fontanet
c329ab863b feat(Backup NG): configurable concurrency (#2918) 2018-05-23 16:29:31 +02:00
Julien Fontanet
41820ea316 fix(xo-server/backup legacy): fix reports (#2979) 2018-05-23 15:56:40 +02:00
badrAZ
bf00f80716 fix(xo-web): update @julien-f/freactal to 0.1.1 (#2972)
This fix prevents the cursor from jumping to the end of the input when editing the backup's name.
2018-05-23 14:47:28 +02:00
Julien Fontanet
9baf0c74e4 chore(vhd-lib): move some deps to devDeps 2018-05-23 11:49:59 +02:00
Julien Fontanet
b59ccdf26f chore(package): update dependencies 2018-05-23 11:23:25 +02:00
Julien Fontanet
9cae978923 feat(xo-server): 5.19.9 2018-05-22 19:43:19 +02:00
Julien Fontanet
311d914b96 feat(xo-web/import): remove restriction for Free 2018-05-22 16:27:17 +02:00
Julien Fontanet
592cb4ef9e feat(xo-web): 5.19.8 2018-05-22 15:53:45 +02:00
Julien Fontanet
ec2db7f2d0 feat(xo-vmdk-to-vhd): 0.1.2 2018-05-22 15:53:03 +02:00
Julien Fontanet
71eab7ba9b feat(xo-web): 5.19.7 2018-05-22 15:36:54 +02:00
badrAZ
5e07171d60 fix(xo-server/backup-ng): fix incorrect condition (#2971) 2018-05-22 11:45:02 +02:00
Rajaa.BARHTAOUI
3f73e3d964 fix(xo-server,xo-web): message when no Xen tools (#2916)
Fixes #2911
2018-05-22 09:57:46 +02:00
badrAZ
0ebe78b4a2 feat(xo-server-usage-report): improve the report (#2970)
Fixes #2968
2018-05-21 16:25:40 +02:00
Julien Fontanet
61c3379298 feat(xo-server): 5.19.8 2018-05-21 11:00:19 +02:00
Julien Fontanet
44866f3316 feat(xo-web): 5.19.6 2018-05-21 10:42:37 +02:00
Julien Fontanet
4bb8ce8779 feat(vhd-lib): 0.1.0 2018-05-21 10:03:37 +02:00
Julien Fontanet
58eb6a8b5f feat(xo-server): 5.19.7 2018-05-18 18:44:34 +02:00
Julien Fontanet
52f6a79e01 fix(xo-server/backupNg/logs): include merge/transfer size (#2965) 2018-05-18 18:44:07 +02:00
Julien Fontanet
129f79d44b feat(xo-web): 5.19.5 2018-05-18 18:42:31 +02:00
132 changed files with 5663 additions and 3680 deletions


@@ -9,6 +9,7 @@
[options]
esproposal.decorators=ignore
esproposal.optional_chaining=enable
include_warnings=true
module.use_strict=true

.gitignore

@@ -10,6 +10,7 @@
/packages/vhd-cli/src/commands/index.js
/packages/xen-api/examples/node_modules/
/packages/xen-api/plot.dat
/packages/xo-server/.xo-server.*


@@ -1,40 +1,56 @@
'use strict'
const PLUGINS_RE = /^(?:@babel\/plugin-.+|babel-plugin-lodash)$/
const PLUGINS_RE = /^(?:@babel\/|babel-)plugin-.+$/
const PRESETS_RE = /^@babel\/preset-.+$/
const NODE_ENV = process.env.NODE_ENV || 'development'
const __PROD__ = NODE_ENV === 'production'
const __TEST__ = NODE_ENV === 'test'
const configs = {
'@babel/plugin-proposal-decorators': {
legacy: true,
},
'@babel/preset-env' (pkg) {
return {
debug: !__TEST__,
loose: true,
shippedProposals: true,
targets: __PROD__
? (() => {
let node = (pkg.engines || {}).node
if (node !== undefined) {
const trimChars = '^=>~'
while (trimChars.includes(node[0])) {
node = node.slice(1)
}
return { node: node }
}
})()
: { browsers: '', node: 'current' },
useBuiltIns: '@babel/polyfill' in (pkg.dependencies || {}) && 'usage',
}
},
}
const getConfig = (key, ...args) => {
const config = configs[key]
return config === undefined
? {}
: typeof config === 'function'
? config(...args)
: config
}
module.exports = function (pkg, plugins, presets) {
plugins === undefined && (plugins = {})
presets === undefined && (presets = {})
presets['@babel/preset-env'] = {
debug: !__TEST__,
loose: true,
shippedProposals: true,
targets: __PROD__
? (() => {
let node = (pkg.engines || {}).node
if (node !== undefined) {
const trimChars = '^=>~'
while (trimChars.includes(node[0])) {
node = node.slice(1)
}
return { node: node }
}
})()
: { browsers: '', node: 'current' },
useBuiltIns: '@babel/polyfill' in (pkg.dependencies || {}) && 'usage',
}
Object.keys(pkg.devDependencies || {}).forEach(name => {
if (!(name in presets) && PLUGINS_RE.test(name)) {
plugins[name] = {}
plugins[name] = getConfig(name, pkg)
} else if (!(name in presets) && PRESETS_RE.test(name)) {
presets[name] = {}
presets[name] = getConfig(name, pkg)
}
})
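The `getConfig` helper introduced in this diff dispatches on the shape of a config entry: a missing key yields an empty object, a function is called with the package manifest, and a plain object is returned as-is. A minimal standalone sketch of that pattern (the keys below are illustrative, not real plugin names):

```javascript
// A config entry may be a plain object, a function of the package
// manifest, or absent entirely.
const configs = {
  'plugin-a': { legacy: true },
  'preset-b' (pkg) {
    // derive options from the package manifest
    return { node: (pkg.engines || {}).node }
  },
}

const getConfig = (key, ...args) => {
  const config = configs[key]
  return config === undefined
    ? {}
    : typeof config === 'function'
      ? config(...args)
      : config
}

console.log(getConfig('plugin-a')) // { legacy: true }
console.log(getConfig('preset-b', { engines: { node: '>=6' } })) // { node: '>=6' }
console.log(getConfig('missing')) // {}
```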


@@ -41,10 +41,10 @@
"moment-timezone": "^0.5.14"
},
"devDependencies": {
"@babel/cli": "7.0.0-beta.44",
"@babel/core": "7.0.0-beta.44",
"@babel/preset-env": "7.0.0-beta.44",
"@babel/preset-flow": "7.0.0-beta.44",
"@babel/cli": "7.0.0-beta.49",
"@babel/core": "7.0.0-beta.49",
"@babel/preset-env": "7.0.0-beta.49",
"@babel/preset-flow": "7.0.0-beta.49",
"cross-env": "^5.1.3",
"rimraf": "^2.6.2"
},


@@ -1,6 +1,6 @@
{
"name": "@xen-orchestra/fs",
"version": "0.0.0",
"version": "0.1.0",
"license": "AGPL-3.0",
"description": "The File System for Xen Orchestra backups.",
"keywords": [],
@@ -20,10 +20,10 @@
"node": ">=6"
},
"dependencies": {
"@babel/runtime": "^7.0.0-beta.44",
"@babel/runtime": "^7.0.0-beta.49",
"@marsaud/smb2-promise": "^0.2.1",
"execa": "^0.10.0",
"fs-extra": "^5.0.0",
"fs-extra": "^6.0.1",
"get-stream": "^3.0.0",
"lodash": "^4.17.4",
"promise-toolbox": "^0.9.5",
@@ -32,12 +32,12 @@
"xo-remote-parser": "^0.3"
},
"devDependencies": {
"@babel/cli": "7.0.0-beta.44",
"@babel/core": "7.0.0-beta.44",
"@babel/plugin-proposal-function-bind": "7.0.0-beta.44",
"@babel/plugin-transform-runtime": "^7.0.0-beta.44",
"@babel/preset-env": "7.0.0-beta.44",
"@babel/preset-flow": "7.0.0-beta.44",
"@babel/cli": "7.0.0-beta.49",
"@babel/core": "7.0.0-beta.49",
"@babel/plugin-proposal-function-bind": "7.0.0-beta.49",
"@babel/plugin-transform-runtime": "^7.0.0-beta.49",
"@babel/preset-env": "7.0.0-beta.49",
"@babel/preset-flow": "7.0.0-beta.49",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"index-modules": "^0.3.0",


@@ -92,6 +92,22 @@ export default class RemoteHandlerAbstract {
await promise
}
async read (
file: File,
buffer: Buffer,
position?: number
): Promise<{| bytesRead: number, buffer: Buffer |}> {
return this._read(file, buffer, position)
}
_read (
file: File,
buffer: Buffer,
position?: number
): Promise<{| bytesRead: number, buffer: Buffer |}> {
throw new Error('Not implemented')
}
async readFile (file: string, options?: Object): Promise<Buffer> {
return this._readFile(file, options)
}
@@ -126,7 +142,10 @@ export default class RemoteHandlerAbstract {
prependDir = false,
}: { filter?: (name: string) => boolean, prependDir?: boolean } = {}
): Promise<string[]> {
const entries = await this._list(dir)
let entries = await this._list(dir)
if (filter !== undefined) {
entries = entries.filter(filter)
}
if (prependDir) {
entries.forEach((entry, i) => {
@@ -134,7 +153,7 @@ export default class RemoteHandlerAbstract {
})
}
return filter === undefined ? entries : entries.filter(filter)
return entries
}
async _list (dir: string): Promise<string[]> {


@@ -50,6 +50,24 @@ export default class LocalHandler extends RemoteHandlerAbstract {
await fs.writeFile(path, data, options)
}
async _read (file, buffer, position) {
const needsClose = typeof file === 'string'
file = needsClose ? await fs.open(this._getFilePath(file), 'r') : file.fd
try {
return await fs.read(
file,
buffer,
0,
buffer.length,
position === undefined ? null : position
)
} finally {
if (needsClose) {
await fs.close(file)
}
}
}
async _readFile (file, options) {
return fs.readFile(this._getFilePath(file), options)
}


@@ -1,6 +1,47 @@
# ChangeLog
## **5.20.0** (planned 2018-05-31)
## *next*
### Enhancements
### Bugs
## **5.21.0** (2018-06-28)
### Enhancements
- Hide legacy backup creation view [#2956](https://github.com/vatesfr/xen-orchestra/issues/2956)
- [Delta Backup NG logs] Display whether the export is a full or a delta [#2711](https://github.com/vatesfr/xen-orchestra/issues/2711)
- Copy VDIs' UUID from SR/disks view [#3051](https://github.com/vatesfr/xen-orchestra/issues/3051)
- [Backup NG] New option to shutdown VMs before snapshotting them [#3058](https://github.com/vatesfr/xen-orchestra/issues/3058#event-1673756438)
- [Backup NG form] Improve feedback [#2711](https://github.com/vatesfr/xen-orchestra/issues/2711)
- [Backup NG] Different retentions for backup and replication [#2895](https://github.com/vatesfr/xen-orchestra/issues/2895)
- Possibility to use a fast clone when creating a VM from a snapshot [#2937](https://github.com/vatesfr/xen-orchestra/issues/2937)
- Ability to customize cloud config templates [#2984](https://github.com/vatesfr/xen-orchestra/issues/2984)
- Add Backup deprecation message and link to Backup NG migration blog post [#3089](https://github.com/vatesfr/xen-orchestra/issues/3089)
- [Backup NG] Ability to cancel a running backup job [#3047](https://github.com/vatesfr/xen-orchestra/issues/3047)
- [Backup NG form] Ability to enable/disable a schedule [#3062](https://github.com/vatesfr/xen-orchestra/issues/3062)
- New backup/health view with non-existent backup snapshots table [#3090](https://github.com/vatesfr/xen-orchestra/issues/3090)
- Disable cancel/destroy tasks when not allowed [#3076](https://github.com/vatesfr/xen-orchestra/issues/3076)
- Default remote type is NFS [#3103](https://github.com/vatesfr/xen-orchestra/issues/3103) (PR [#3114](https://github.com/vatesfr/xen-orchestra/pull/3114))
- Add legacy backups snapshots to backup/health [#3082](https://github.com/vatesfr/xen-orchestra/issues/3082) (PR [#3111](https://github.com/vatesfr/xen-orchestra/pull/3111))
- [Backup NG logs] Add the job's name to the modal's title [#2711](https://github.com/vatesfr/xen-orchestra/issues/2711) (PR [#3115](https://github.com/vatesfr/xen-orchestra/pull/3115))
- Adding a XCP-ng host to a XS pool now fails fast [#3061](https://github.com/vatesfr/xen-orchestra/issues/3061) (PR [#3118](https://github.com/vatesfr/xen-orchestra/pull/3118))
- [Backup NG logs] Ability to report a failed job and copy its log to the clipboard [#3100](https://github.com/vatesfr/xen-orchestra/issues/3100) (PR [#3110](https://github.com/vatesfr/xen-orchestra/pull/3110))
### Bugs
- Update the xentools search item to return the version number of installed xentools [#3015](https://github.com/vatesfr/xen-orchestra/issues/3015)
- Fix Nagios backup reports [#2991](https://github.com/vatesfr/xen-orchestra/issues/2991)
- Fix the retry of a single failed/interrupted VM backup [#2912](https://github.com/vatesfr/xen-orchestra/issues/2912#issuecomment-395480321)
- New VM with Self: filter out networks that are not in the template's pool [#3011](https://github.com/vatesfr/xen-orchestra/issues/3011)
- [Backup NG] Auto-detect when a full export is necessary.
- Fix Load Balancer [#3075](https://github.com/vatesfr/xen-orchestra/issues/3075#event-1685469551) [#3026](https://github.com/vatesfr/xen-orchestra/issues/3026)
- [SR stats] Don't scale XAPI iowait values [#2969](https://github.com/vatesfr/xen-orchestra/issues/2969)
- [Backup NG] Don't list unusable SRs for CR/DR [#3050](https://github.com/vatesfr/xen-orchestra/issues/3050)
- Fix creating VM from snapshot (PR [#3117](https://github.com/vatesfr/xen-orchestra/pull/3117))
## **5.20.0** (2018-05-31)
### Enhancements
@@ -9,8 +50,6 @@
- [Patches] ignore XS upgrade in missing patches counter [#2866](https://github.com/vatesfr/xen-orchestra/issues/2866)
- [Health] List VM snapshots related to non-existing backup jobs/schedules [#2828](https://github.com/vatesfr/xen-orchestra/issues/2828)
### Bugs
## **5.19.0** (2018-05-01)
### Enhancements

PULL_REQUEST_TEMPLATE.md (new file)

@@ -0,0 +1,19 @@
### Check list
> Check items when done or if not relevant
- [ ] PR references the relevant issue (e.g. `Fixes #007`)
- [ ] if UI changes, a screenshot has been added to the PR
- [ ] CHANGELOG updated
- [ ] documentation updated
### Process
1. create a PR as soon as possible
1. mark it as `WiP:` (Work in Progress) if not ready to be merged
1. when you want a review, add a reviewer
1. if necessary, update your PR, and re-add a reviewer
### List of packages to release
> No need to mention xo-server and xo-web.

babel.config.js (new file)

@@ -0,0 +1,5 @@
module.exports = {
// Necessary for jest to be able to find the `.babelrc.js` closest to the file
// instead of only the one in this directory.
babelrcRoots: true,
}


@@ -0,0 +1,6 @@
declare module 'limit-concurrency-decorator' {
declare function limitConcurrencyDecorator(
concurrency: number
): <T: Function>(T) => T
declare export default typeof limitConcurrencyDecorator
}


@@ -1,4 +1,8 @@
declare module 'lodash' {
declare export function countBy<K, V>(
object: { [K]: V },
iteratee: K | ((V, K) => string)
): { [string]: number }
declare export function forEach<K, V>(
object: { [K]: V },
iteratee: (V, K) => void
@@ -20,5 +24,10 @@ declare module 'lodash' {
iteratee: (V1, K) => V2
): { [K]: V2 }
declare export function noop(...args: mixed[]): void
declare export function some<T>(
collection: T[],
iteratee: (T, number) => boolean
): boolean
declare export function sum(values: number[]): number
declare export function values<K, V>(object: { [K]: V }): V[]
}


@@ -1,4 +1,7 @@
declare module 'promise-toolbox' {
declare export class CancelToken {
static source(): { cancel: (message: any) => void, token: CancelToken };
}
declare export function cancelable(Function): Function
declare export function defer<T>(): {|
promise: Promise<T>,


@@ -1,8 +1,10 @@
{
"devDependencies": {
"@babel/register": "^7.0.0-beta.44",
"babel-7-jest": "^21.3.2",
"@babel/core": "^7.0.0-beta.49",
"@babel/register": "^7.0.0-beta.49",
"babel-core": "^7.0.0-0",
"babel-eslint": "^8.1.2",
"babel-jest": "^23.0.1",
"benchmark": "^2.1.4",
"eslint": "^4.14.0",
"eslint-config-standard": "^11.0.0-beta.0",
@@ -13,23 +15,22 @@
"eslint-plugin-react": "^7.6.1",
"eslint-plugin-standard": "^3.0.1",
"exec-promise": "^0.7.0",
"flow-bin": "^0.69.0",
"flow-bin": "^0.73.0",
"globby": "^8.0.0",
"husky": "^0.14.3",
"jest": "^22.0.4",
"jest": "^23.0.1",
"lodash": "^4.17.4",
"prettier": "^1.10.2",
"promise-toolbox": "^0.9.5",
"sorted-object": "^2.0.1"
},
"engines": {
"yarn": "^1.2.1"
"yarn": "^1.7.0"
},
"jest": {
"collectCoverage": true,
"projects": [
"<rootDir>",
"<rootDir>/packages/xo-web"
"<rootDir>"
],
"testEnvironment": "node",
"testPathIgnorePatterns": [
@@ -38,14 +39,6 @@
],
"testRegex": "\\.spec\\.js$",
"transform": {
"/@xen-orchestra/cron/.+\\.jsx?$": "babel-7-jest",
"/@xen-orchestra/fs/.+\\.jsx?$": "babel-7-jest",
"/packages/complex-matcher/.+\\.jsx?$": "babel-7-jest",
"/packages/value-matcher/.+\\.jsx?$": "babel-7-jest",
"/packages/vhd-lib/.+\\.jsx?$": "babel-7-jest",
"/packages/xo-cli/.+\\.jsx?$": "babel-7-jest",
"/packages/xo-server/.+\\.jsx?$": "babel-7-jest",
"/packages/xo-vmdk-to-vhd/.+\\.jsx?$": "babel-7-jest",
"\\.jsx?$": "babel-jest"
}
},


@@ -30,9 +30,9 @@
"lodash": "^4.17.4"
},
"devDependencies": {
"@babel/cli": "7.0.0-beta.44",
"@babel/core": "7.0.0-beta.44",
"@babel/preset-env": "7.0.0-beta.44",
"@babel/cli": "7.0.0-beta.49",
"@babel/core": "7.0.0-beta.49",
"@babel/preset-env": "7.0.0-beta.49",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.1",
"rimraf": "^2.6.2"


@@ -28,10 +28,10 @@
},
"dependencies": {},
"devDependencies": {
"@babel/cli": "7.0.0-beta.44",
"@babel/core": "7.0.0-beta.44",
"@babel/preset-env": "7.0.0-beta.44",
"@babel/preset-flow": "7.0.0-beta.44",
"@babel/cli": "7.0.0-beta.49",
"@babel/core": "7.0.0-beta.49",
"@babel/preset-env": "7.0.0-beta.49",
"@babel/preset-flow": "7.0.0-beta.49",
"cross-env": "^5.1.3",
"rimraf": "^2.6.2"
},


@@ -0,0 +1,3 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)


@@ -23,21 +23,20 @@
"dist/"
],
"engines": {
"node": ">=4"
"node": ">=6"
},
"dependencies": {
"@xen-orchestra/fs": "^0.0.0",
"babel-runtime": "^6.22.0",
"@xen-orchestra/fs": "^0.1.0",
"exec-promise": "^0.7.0",
"struct-fu": "^1.2.0",
"vhd-lib": "^0.0.0"
"vhd-lib": "^0.2.0"
},
"devDependencies": {
"babel-cli": "^6.24.1",
"@babel/cli": "^7.0.0-beta.49",
"@babel/core": "^7.0.0-beta.49",
"@babel/plugin-transform-runtime": "^7.0.0-beta.49",
"@babel/preset-env": "^7.0.0-beta.49",
"babel-plugin-lodash": "^3.3.2",
"babel-plugin-transform-runtime": "^6.23.0",
"babel-preset-env": "^1.5.2",
"babel-preset-stage-3": "^6.24.1",
"cross-env": "^5.1.3",
"execa": "^0.10.0",
"index-modules": "^0.3.0",
@@ -51,22 +50,5 @@
"prebuild": "rimraf dist/ && index-modules --cjs-lazy src/commands",
"predev": "yarn run prebuild",
"prepare": "yarn run build"
},
"babel": {
"plugins": [
"lodash",
"transform-runtime"
],
"presets": [
[
"env",
{
"targets": {
"node": 4
}
}
],
"stage-3"
]
}
}


@@ -0,0 +1,23 @@
const { createWriteStream } = require('fs')
const { PassThrough } = require('stream')
const createOutputStream = path => {
if (path !== undefined && path !== '-') {
return createWriteStream(path)
}
// introduce a through stream because stdout is not a normal stream!
const stream = new PassThrough()
stream.pipe(process.stdout)
return stream
}
export const writeStream = (input, path) => {
const output = createOutputStream(path)
return new Promise((resolve, reject) =>
input
.on('error', reject)
.pipe(output.on('error', reject).on('finish', resolve))
)
}


@@ -0,0 +1,16 @@
import { createContentStream } from 'vhd-lib'
import { getHandler } from '@xen-orchestra/fs'
import { resolve } from 'path'
import { writeStream } from '../_utils'
export default async args => {
if (args.length < 2 || args.some(_ => _ === '-h' || _ === '--help')) {
return `Usage: ${this.command} <input VHD> [<output raw>]`
}
await writeStream(
createContentStream(getHandler({ url: 'file:///' }), resolve(args[0])),
args[1]
)
}


@@ -1,6 +1,6 @@
{
"name": "vhd-lib",
"version": "0.0.0",
"version": "0.2.0",
"license": "AGPL-3.0",
"description": "Primitives for VHD file handling",
"keywords": [],
@@ -20,30 +20,30 @@
"node": ">=6"
},
"dependencies": {
"@babel/runtime": "^7.0.0-beta.44",
"@xen-orchestra/fs": "^0.0.0",
"@babel/runtime": "^7.0.0-beta.49",
"async-iterator-to-stream": "^1.0.2",
"execa": "^0.10.0",
"from2": "^2.3.0",
"fs-extra": "^5.0.0",
"get-stream": "^3.0.0",
"fs-extra": "^6.0.1",
"limit-concurrency-decorator": "^0.4.0",
"promise-toolbox": "^0.9.5",
"struct-fu": "^1.2.0",
"uuid": "^3.0.1",
"tmp": "^0.0.33"
"uuid": "^3.0.1"
},
"devDependencies": {
"@babel/cli": "7.0.0-beta.44",
"@babel/core": "7.0.0-beta.44",
"@babel/plugin-transform-runtime": "^7.0.0-beta.44",
"@babel/preset-env": "7.0.0-beta.44",
"@babel/preset-flow": "7.0.0-beta.44",
"@babel/cli": "7.0.0-beta.49",
"@babel/core": "7.0.0-beta.49",
"@babel/plugin-transform-runtime": "^7.0.0-beta.49",
"@babel/preset-env": "7.0.0-beta.49",
"@babel/preset-flow": "7.0.0-beta.49",
"@xen-orchestra/fs": "^0.1.0",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"execa": "^0.10.0",
"fs-promise": "^2.0.0",
"get-stream": "^3.0.0",
"index-modules": "^0.3.0",
"rimraf": "^2.6.2"
"rimraf": "^2.6.2",
"tmp": "^0.0.33"
},
"scripts": {
"build": "cross-env NODE_ENV=production babel --source-maps --out-dir=dist/ src/",


@@ -33,7 +33,7 @@ export function createFooter (
currentSize: size,
diskGeometry: geometry,
diskType,
uuid: generateUuid(null, []),
uuid: generateUuid(null, Buffer.allocUnsafe(16)),
})
checksumStruct(footer, fuFooter)
return footer


@@ -40,7 +40,7 @@ export const fuFooter = fu.struct([
]),
fu.uint32('diskType'), // 60 Disk type, must be equal to HARD_DISK_TYPE_DYNAMIC/HARD_DISK_TYPE_DIFFERENCING.
fu.uint32('checksum'), // 64
fu.uint8('uuid', 16), // 68
fu.byte('uuid', 16), // 68
fu.char('saved'), // 84
fu.char('hidden'), // 85 TODO: should probably be merged in reserved
fu.char('reserved', 426), // 86
@@ -55,7 +55,7 @@ export const fuHeader = fu.struct([
fu.uint32('maxTableEntries'), // Max entries in the Block Allocation Table.
fu.uint32('blockSize'), // Block size in bytes. Default (2097152 => 2MB)
fu.uint32('checksum'),
fu.uint8('parentUuid', 16),
fu.byte('parentUuid', 16),
fu.uint32('parentTimestamp'),
fu.uint32('reserved1'),
fu.char16be('parentUnicodeName', 512),


@@ -0,0 +1,31 @@
import asyncIteratorToStream from 'async-iterator-to-stream'
import Vhd from './vhd'
export default asyncIteratorToStream(async function * (handler, path) {
const fd = await handler.openFile(path, 'r')
try {
const vhd = new Vhd(handler, fd)
await vhd.readHeaderAndFooter()
await vhd.readBlockAllocationTable()
const {
footer: { currentSize },
header: { blockSize },
} = vhd
const nFullBlocks = Math.floor(currentSize / blockSize)
const nLeftoverBytes = currentSize % blockSize
const emptyBlock = Buffer.alloc(blockSize)
for (let i = 0; i < nFullBlocks; ++i) {
yield vhd.containsBlock(i) ? (await vhd._readBlock(i)).data : emptyBlock
}
if (nLeftoverBytes !== 0) {
yield (vhd.containsBlock(nFullBlocks)
? (await vhd._readBlock(nFullBlocks)).data
: emptyBlock
).slice(0, nLeftoverBytes)
}
} finally {
await handler.closeFile(fd)
}
})


@@ -28,7 +28,7 @@ function createBAT (
) {
let currentVhdPositionSector = firstBlockPosition / SECTOR_SIZE
blockAddressList.forEach(blockPosition => {
assert.strictEqual(blockPosition % 512, 0)
assert.strictEqual(blockPosition % SECTOR_SIZE, 0)
const vhdTableIndex = Math.floor(blockPosition / VHD_BLOCK_SIZE_BYTES)
if (bat.readUInt32BE(vhdTableIndex * 4) === BLOCK_UNUSED) {
bat.writeUInt32BE(currentVhdPositionSector, vhdTableIndex * 4)
@@ -57,7 +57,8 @@ export default asyncIteratorToStream(async function * (
}
const maxTableEntries = Math.ceil(diskSize / VHD_BLOCK_SIZE_BYTES) + 1
const tablePhysicalSizeBytes = Math.ceil(maxTableEntries * 4 / 512) * 512
const tablePhysicalSizeBytes =
Math.ceil(maxTableEntries * 4 / SECTOR_SIZE) * SECTOR_SIZE
const batPosition = FOOTER_SIZE + HEADER_SIZE
const firstBlockPosition = batPosition + tablePhysicalSizeBytes
@@ -101,13 +102,14 @@ export default asyncIteratorToStream(async function * (
if (currentVhdBlockIndex >= 0) {
yield * yieldAndTrack(
currentBlockWithBitmap,
bat.readUInt32BE(currentVhdBlockIndex * 4) * 512
bat.readUInt32BE(currentVhdBlockIndex * 4) * SECTOR_SIZE
)
}
currentBlockWithBitmap = Buffer.alloc(bitmapSize + VHD_BLOCK_SIZE_BYTES)
currentVhdBlockIndex = batIndex
}
const blockOffset = (next.offsetBytes / 512) % VHD_BLOCK_SIZE_SECTORS
const blockOffset =
(next.offsetBytes / SECTOR_SIZE) % VHD_BLOCK_SIZE_SECTORS
for (let bitPos = 0; bitPos < VHD_BLOCK_SIZE_SECTORS / ratio; bitPos++) {
setBitmap(currentBlockWithBitmap, blockOffset + bitPos)
}
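The refactor above replaces the literal `512` with `SECTOR_SIZE` throughout. The padding rule it touches — each BAT entry is a 4-byte sector offset, and the table on disk is padded up to a whole number of sectors — can be checked in isolation:

```javascript
const SECTOR_SIZE = 512

// Physical size of the Block Allocation Table: 4 bytes per entry,
// rounded up to a whole number of sectors, as in the code above.
const batPhysicalSize = maxTableEntries =>
  Math.ceil((maxTableEntries * 4) / SECTOR_SIZE) * SECTOR_SIZE

console.log(batPhysicalSize(1))   // 512  (4 bytes round up to one sector)
console.log(batPhysicalSize(128)) // 512  (exactly one sector)
console.log(batPhysicalSize(129)) // 1024 (spills into a second sector)
```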


@@ -1,5 +1,6 @@
export { default } from './vhd'
export { default as chainVhd } from './chain'
export { default as createContentStream } from './createContentStream'
export { default as createReadableRawStream } from './createReadableRawStream'
export {
default as createReadableSparseStream,


@@ -1,5 +1,4 @@
import assert from 'assert'
import getStream from 'get-stream'
import { fromEvent } from 'promise-toolbox'
import constantStream from './_constant-stream'
@@ -93,20 +92,14 @@ export default class Vhd {
// Read functions.
// =================================================================
_readStream (start, n) {
return this._handler.createReadStream(this._path, {
start,
end: start + n - 1, // end is inclusive
})
}
_read (start, n) {
return this._readStream(start, n)
.then(getStream.buffer)
.then(buf => {
assert.equal(buf.length, n)
return buf
})
async _read (start, n) {
const { bytesRead, buffer } = await this._handler.read(
this._path,
Buffer.alloc(n),
start
)
assert.equal(bytesRead, n)
return buffer
}
containsBlock (id) {
@@ -336,11 +329,11 @@ export default class Vhd {
`freeFirstBlockSpace: move first block ${firstSector} -> ${newFirstSector}`
)
// copy the first block at the end
const stream = await this._readStream(
const block = await this._read(
sectorsToBytes(firstSector),
fullBlockSize
)
await this._write(stream, sectorsToBytes(newFirstSector))
await this._write(block, sectorsToBytes(newFirstSector))
await this._setBatEntry(first, newFirstSector)
await this.writeFooter(true)
spaceNeededBytes -= this.fullBlockSize
@@ -476,12 +469,12 @@ export default class Vhd {
// For each sector of block data...
const { sectorsPerBlock } = child
let parentBitmap = null
for (let i = 0; i < sectorsPerBlock; i++) {
// If no changes on one sector, skip.
if (!mapTestBit(bitmap, i)) {
continue
}
let parentBitmap = null
let endSector = i + 1
// Count changed sectors.


@@ -4,7 +4,7 @@ process.env.DEBUG = '*'
const defer = require('golike-defer').default
const pump = require('pump')
const { fromCallback } = require('promise-toolbox')
const { CancelToken, fromCallback } = require('promise-toolbox')
const { createClient } = require('../')
@@ -30,8 +30,11 @@ defer(async ($defer, args) => {
await xapi.connect()
$defer(() => xapi.disconnect())
const { cancel, token } = CancelToken.source()
process.on('SIGINT', cancel)
// https://xapi-project.github.io/xen-api/snapshots.html#downloading-a-disk-or-snapshot
const exportStream = await xapi.getResource('/export_raw_vdi/', {
const exportStream = await xapi.getResource(token, '/export_raw_vdi/', {
query: {
format: raw ? 'raw' : 'vhd',
vdi: await resolveRef(xapi, 'VDI', args[1])


@@ -4,7 +4,7 @@ process.env.DEBUG = '*'
const defer = require('golike-defer').default
const pump = require('pump')
const { fromCallback } = require('promise-toolbox')
const { CancelToken, fromCallback } = require('promise-toolbox')
const { createClient } = require('../')
@@ -24,8 +24,11 @@ defer(async ($defer, args) => {
await xapi.connect()
$defer(() => xapi.disconnect())
const { cancel, token } = CancelToken.source()
process.on('SIGINT', cancel)
// https://xapi-project.github.io/xen-api/importexport.html
const exportStream = await xapi.getResource('/export/', {
const exportStream = await xapi.getResource(token, '/export/', {
query: {
ref: await resolveRef(xapi, 'VM', args[1]),
use_compression: 'true'
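The CLI scripts now create a `CancelToken` source, wire `SIGINT` to `cancel`, and pass the token as the first argument of `getResource`/`putResource`. A minimal stand-in for `CancelToken.source()` shows the pattern (the real promise-toolbox token has a richer API than the bare promise sketched here):

```javascript
// Minimal stand-in for promise-toolbox's CancelToken.source(): the token
// is just a promise that settles when cancel() is called, so long-running
// operations can race against it.
const createSource = () => {
  let cancel
  const token = new Promise(resolve => {
    cancel = reason => resolve(reason)
  })
  return { cancel, token }
}

const { cancel, token } = createSource()

// A consumer races its work against the token, the way getResource
// honors the real token.
const work = new Promise(resolve => setTimeout(resolve, 1000, 'done'))
Promise.race([work, token.then(r => `canceled: ${r}`)]).then(console.log)

// Simulate Ctrl-C after 10 ms.
setTimeout(() => cancel('SIGINT'), 10) // prints "canceled: SIGINT"
```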


@@ -3,6 +3,7 @@
process.env.DEBUG = '*'
const defer = require('golike-defer').default
const { CancelToken } = require('promise-toolbox')
const { createClient } = require('../')
@@ -28,8 +29,11 @@ defer(async ($defer, args) => {
await xapi.connect()
$defer(() => xapi.disconnect())
const { cancel, token } = CancelToken.source()
process.on('SIGINT', cancel)
// https://xapi-project.github.io/xen-api/snapshots.html#uploading-a-disk-or-snapshot
await xapi.putResource(createInputStream(args[2]), '/import_raw_vdi/', {
await xapi.putResource(token, createInputStream(args[2]), '/import_raw_vdi/', {
query: {
format: raw ? 'raw' : 'vhd',
vdi: await resolveRef(xapi, 'VDI', args[1])


@@ -3,6 +3,7 @@
process.env.DEBUG = '*'
const defer = require('golike-defer').default
const { CancelToken } = require('promise-toolbox')
const { createClient } = require('../')
@@ -22,8 +23,11 @@ defer(async ($defer, args) => {
await xapi.connect()
$defer(() => xapi.disconnect())
const { cancel, token } = CancelToken.source()
process.on('SIGINT', cancel)
// https://xapi-project.github.io/xen-api/importexport.html
await xapi.putResource(createInputStream(args[1]), '/import/', {
await xapi.putResource(token, createInputStream(args[1]), '/import/', {
query: args[2] && { sr_id: await resolveRef(xapi, 'SR', args[2]) }
})
})(process.argv.slice(2)).catch(


@@ -1,6 +1,6 @@
{
"dependencies": {
"golike-defer": "^0.1.0",
"pump": "^1.0.2"
"golike-defer": "^0.4.1",
"pump": "^3.0.0"
}
}


@@ -0,0 +1,30 @@
# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1
end-of-stream@^1.1.0:
version "1.4.1"
resolved "https://registry.yarnpkg.com/end-of-stream/-/end-of-stream-1.4.1.tgz#ed29634d19baba463b6ce6b80a37213eab71ec43"
dependencies:
once "^1.4.0"
golike-defer@^0.4.1:
version "0.4.1"
resolved "https://registry.yarnpkg.com/golike-defer/-/golike-defer-0.4.1.tgz#7a1cd435d61e461305805d980b133a0f3db4e1cc"
once@^1.3.1, once@^1.4.0:
version "1.4.0"
resolved "https://registry.yarnpkg.com/once/-/once-1.4.0.tgz#583b1aa775961d4b113ac17d9c50baef9dd76bd1"
dependencies:
wrappy "1"
pump@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/pump/-/pump-3.0.0.tgz#b4a2116815bde2f4e1ea602354e8c75565107a64"
dependencies:
end-of-stream "^1.1.0"
once "^1.3.1"
wrappy@1:
version "1.0.2"
resolved "https://registry.yarnpkg.com/wrappy/-/wrappy-1.0.2.tgz#b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"


@@ -1,6 +1,6 @@
{
"name": "xen-api",
"version": "0.16.9",
"version": "0.16.11",
"license": "ISC",
"description": "Connector to the Xen API",
"keywords": [


@@ -96,6 +96,7 @@ class XapiError extends BaseError {
// slots than can be assigned later
this.method = undefined
this.url = undefined
this.task = undefined
}
}
@@ -188,7 +189,9 @@ const getTaskResult = task => {
return Promise.reject(new Cancel('task canceled'))
}
if (status === 'failure') {
return Promise.reject(wrapError(task.error_info))
const error = wrapError(task.error_info)
error.task = task
return Promise.reject(error)
}
if (status === 'success') {
// the result might be:
@@ -595,7 +598,10 @@ export class Xapi extends EventEmitter {
if (error != null && (response = error.response) != null) {
response.req.abort()
const { headers: { location }, statusCode } = response
const {
headers: { location },
statusCode,
} = response
if (statusCode === 302 && location !== undefined) {
return doRequest(location)
}
@@ -777,15 +783,13 @@ export class Xapi extends EventEmitter {
this._pool = object
const eventWatchers = this._eventWatchers
if (eventWatchers !== undefined) {
forEach(object.other_config, (_, key) => {
const eventWatcher = eventWatchers[key]
if (eventWatcher !== undefined) {
delete eventWatchers[key]
eventWatcher(object)
}
})
}
Object.keys(object.other_config).forEach(key => {
const eventWatcher = eventWatchers[key]
if (eventWatcher !== undefined) {
delete eventWatchers[key]
eventWatcher(object)
}
})
} else if (type === 'task') {
if (prev === undefined) {
++this._nTasks
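On task failure, the change above attaches the XAPI task record to the rejection error so callers can inspect it. The same attach-context-then-reject pattern, with a simplified, hypothetical stand-in for xen-api's internal `wrapError`:

```javascript
// Hypothetical stand-in for xen-api's wrapError: turn XAPI error_info
// (code followed by parameters) into an Error with those fields.
const wrapError = info => {
  const error = new Error(info[0])
  error.code = info[0]
  error.params = info.slice(1)
  return error
}

const getTaskResult = task => {
  if (task.status === 'failure') {
    const error = wrapError(task.error_info)
    error.task = task // expose the failed task on the rejection
    return Promise.reject(error)
  }
}

getTaskResult({ status: 'failure', error_info: ['HANDLE_INVALID', 'VM'] })
  .catch(e => console.log(e.code, e.task.status)) // HANDLE_INVALID failure
```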


@@ -1,6 +1,6 @@
{
"name": "xo-acl-resolver",
"version": "0.2.3",
"version": "0.2.4",
"license": "ISC",
"description": "Xen-Orchestra internal: do ACLs resolution",
"keywords": [],


@@ -50,7 +50,9 @@ const checkAuthorizationByTypes = {
network: or(checkSelf, checkMember('$pool')),
SR: or(checkSelf, checkMember('$pool')),
PIF: checkMember('$host'),
SR: or(checkSelf, checkMember('$container')),
task: checkMember('$host'),


@@ -28,7 +28,7 @@
"node": ">=6"
},
"dependencies": {
"@babel/polyfill": "7.0.0-beta.44",
"@babel/polyfill": "7.0.0-beta.49",
"bluebird": "^3.5.1",
"chalk": "^2.2.0",
"event-to-promise": "^0.8.0",
@@ -49,10 +49,10 @@
"xo-lib": "^0.9.0"
},
"devDependencies": {
"@babel/cli": "7.0.0-beta.44",
"@babel/core": "7.0.0-beta.44",
"@babel/preset-env": "7.0.0-beta.44",
"@babel/preset-flow": "7.0.0-beta.44",
"@babel/cli": "7.0.0-beta.49",
"@babel/core": "7.0.0-beta.49",
"@babel/preset-env": "7.0.0-beta.49",
"@babel/preset-flow": "7.0.0-beta.49",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"rimraf": "^2.6.2"


@@ -0,0 +1,3 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)


@@ -25,17 +25,16 @@
"node": ">=4"
},
"dependencies": {
"babel-runtime": "^6.18.0",
"@babel/runtime": "^7.0.0-beta.49",
"kindof": "^2.0.0",
"lodash": "^4.17.2",
"make-error": "^1.0.2"
},
"devDependencies": {
"babel-cli": "^6.24.1",
"babel-plugin-lodash": "^3.3.2",
"babel-plugin-transform-runtime": "^6.23.0",
"babel-preset-env": "^1.5.2",
"babel-preset-stage-3": "^6.24.1",
"@babel/cli": "^7.0.0-beta.49",
"@babel/core": "^7.0.0-beta.49",
"@babel/plugin-transform-runtime": "^7.0.0-beta.49",
"@babel/preset-env": "^7.0.0-beta.49",
"cross-env": "^5.1.3",
"event-to-promise": "^0.8.0",
"rimraf": "^2.6.1"
@@ -46,22 +45,5 @@
"prebuild": "rimraf dist/",
"predev": "yarn run prebuild",
"prepublishOnly": "yarn run build"
},
"babel": {
"plugins": [
"lodash",
"transform-runtime"
],
"presets": [
[
"env",
{
"targets": {
"node": 4
}
}
],
"stage-3"
]
}
}


@@ -0,0 +1,3 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)


@@ -27,10 +27,10 @@
"lodash": "^4.13.1"
},
"devDependencies": {
"babel-cli": "^6.24.1",
"@babel/cli": "^7.0.0-beta.49",
"@babel/core": "^7.0.0-beta.49",
"@babel/preset-env": "^7.0.0-beta.49",
"babel-plugin-lodash": "^3.3.2",
"babel-preset-env": "^1.5.2",
"babel-preset-stage-3": "^6.24.1",
"cross-env": "^5.1.3",
"deep-freeze": "^0.0.1",
"rimraf": "^2.6.1"
@@ -41,22 +41,5 @@
"prebuild": "rimraf dist/",
"predev": "yarn run prebuild",
"prepare": "yarn run build"
},
"babel": {
"plugins": [
"lodash"
],
"presets": [
[
"env",
{
"targets": {
"browsers": "> 5%",
"node": 4
}
}
],
"stage-3"
]
}
}


@@ -1,6 +1,6 @@
{
"name": "xo-server-backup-reports",
"version": "0.11.0",
"version": "0.12.2",
"license": "AGPL-3.0",
"description": "Backup reports plugin for XO-Server",
"keywords": [


@@ -1,7 +1,6 @@
import humanFormat from 'human-format'
import moment from 'moment-timezone'
import { find, forEach, get, startCase } from 'lodash'
import { forEach, get, startCase } from 'lodash'
import pkg from '../package'
export const configurationSchema = {
@@ -37,6 +36,12 @@ const ICON_FAILURE = '🚨'
const ICON_SKIPPED = '⏩'
const ICON_SUCCESS = '✔'
const STATUS_ICON = {
skipped: ICON_SKIPPED,
success: ICON_SUCCESS,
failure: ICON_FAILURE,
}
const DATE_FORMAT = 'dddd, MMMM Do YYYY, h:mm:ss a'
const createDateFormater = timezone =>
timezone !== undefined
@@ -57,10 +62,12 @@ const formatSize = bytes =>
})
const formatSpeed = (bytes, milliseconds) =>
humanFormat(bytes * 1e3 / milliseconds, {
scale: 'binary',
unit: 'B/s',
})
milliseconds > 0
? humanFormat((bytes * 1e3) / milliseconds, {
scale: 'binary',
unit: 'B/s',
})
: 'N/A'
const logError = e => {
console.error('backup report error:', e)
@@ -95,43 +102,42 @@ class BackupReportsXoPlugin {
this._xo.removeListener('job:terminated', this._report)
}
_wrapper (status, job, schedule) {
_wrapper (status, job, schedule, runJobId) {
return new Promise(resolve =>
resolve(
job.type === 'backup'
? this._backupNgListener(status, job, schedule)
: this._listener(status, job, schedule)
? this._backupNgListener(status, job, schedule, runJobId)
: this._listener(status, job, schedule, runJobId)
)
).catch(logError)
}
async _backupNgListener (runJobId, _, { timezone }) {
async _backupNgListener (_1, _2, { timezone }, runJobId) {
const xo = this._xo
const logs = await xo.getBackupNgLogs(runJobId)
const jobLog = logs['roots'][0]
const vmsTaskLog = logs[jobLog.id]
const log = await xo.getBackupNgLogs(runJobId)
const { reportWhen, mode } = jobLog.data || {}
if (reportWhen === 'never') {
const { reportWhen, mode } = log.data || {}
if (
reportWhen === 'never' ||
(log.status === 'success' && reportWhen === 'failure')
) {
return
}
const jobName = (await xo.getJob(log.jobId, 'backup')).name
const formatDate = createDateFormater(timezone)
const jobName = (await xo.getJob(jobLog.jobId, 'backup')).name
if (jobLog.error !== undefined) {
const [globalStatus, icon] =
jobLog.error.message === NO_VMS_MATCH_THIS_PATTERN
? ['Skipped', ICON_SKIPPED]
: ['Failure', ICON_FAILURE]
if (
(log.status === 'failure' || log.status === 'skipped') &&
log.result !== undefined
) {
let markdown = [
`## Global status: ${globalStatus}`,
`## Global status: ${log.status}`,
'',
`- **mode**: ${mode}`,
`- **Start time**: ${formatDate(jobLog.start)}`,
`- **End time**: ${formatDate(jobLog.end)}`,
`- **Duration**: ${formatDuration(jobLog.duration)}`,
`- **Error**: ${jobLog.error.message}`,
`- **Start time**: ${formatDate(log.start)}`,
`- **End time**: ${formatDate(log.end)}`,
`- **Duration**: ${formatDuration(log.end - log.start)}`,
`- **Error**: ${log.result.message}`,
'---',
'',
`*${pkg.name} v${pkg.version}*`,
@@ -139,12 +145,14 @@ class BackupReportsXoPlugin {
markdown = markdown.join('\n')
return this._sendReport({
subject: `[Xen Orchestra] ${globalStatus} Backup report for ${jobName} ${icon}`,
subject: `[Xen Orchestra] ${
log.status
} Backup report for ${jobName} ${STATUS_ICON[log.status]}`,
markdown,
nagiosStatus: 2,
nagiosMarkdown: `[Xen Orchestra] [${globalStatus}] Backup report for ${jobName} - Error : ${
jobLog.error.message
}`,
nagiosMarkdown: `[Xen Orchestra] [${
log.status
}] Backup report for ${jobName} - Error : ${log.result.message}`,
})
}
@@ -157,14 +165,12 @@ class BackupReportsXoPlugin {
let globalTransferSize = 0
let nFailures = 0
let nSkipped = 0
for (const vmTaskLog of vmsTaskLog || []) {
const vmTaskStatus = vmTaskLog.status
if (vmTaskStatus === 'success' && reportWhen === 'failure') {
for (const taskLog of log.tasks) {
if (taskLog.status === 'success' && reportWhen === 'failure') {
return
}
const vmId = vmTaskLog.data.id
const vmId = taskLog.data.id
let vm
try {
vm = xo.getObject(vmId)
@@ -173,136 +179,170 @@ class BackupReportsXoPlugin {
`### ${vm !== undefined ? vm.name_label : 'VM not found'}`,
'',
`- **UUID**: ${vm !== undefined ? vm.uuid : vmId}`,
`- **Start time**: ${formatDate(vmTaskLog.start)}`,
`- **End time**: ${formatDate(vmTaskLog.end)}`,
`- **Duration**: ${formatDuration(vmTaskLog.duration)}`,
`- **Start time**: ${formatDate(taskLog.start)}`,
`- **End time**: ${formatDate(taskLog.end)}`,
`- **Duration**: ${formatDuration(taskLog.end - taskLog.start)}`,
]
const failedSubTasks = []
const operationsText = []
const snapshotText = []
const srsText = []
const remotesText = []
for (const subTaskLog of logs[vmTaskLog.taskId] || []) {
const { data, status, result, message } = subTaskLog
const icon =
subTaskLog.status === 'success' ? ICON_SUCCESS : ICON_FAILURE
const errorMessage = ` **Error**: ${get(result, 'message')}`
if (message === 'snapshot') {
operationsText.push(`- **Snapshot** ${icon}`)
if (status === 'failure') {
failedSubTasks.push('Snapshot')
operationsText.push('', errorMessage)
}
} else if (data.type === 'remote') {
const remoteId = data.id
const remote = await xo.getRemote(remoteId).catch(() => {})
remotesText.push(
`- **${
remote !== undefined ? remote.name : `Remote Not found`
}** (${remoteId}) ${icon}`
for (const subTaskLog of taskLog.tasks || []) {
if (
subTaskLog.message !== 'export' &&
subTaskLog.message !== 'snapshot'
) {
continue
}
const icon = STATUS_ICON[subTaskLog.status]
const errorMessage = ` - **Error**: ${get(
subTaskLog.result,
'message'
)}`
if (subTaskLog.message === 'snapshot') {
snapshotText.push(
`- **Snapshot** ${icon}`,
` - **Start time**: ${formatDate(subTaskLog.start)}`,
` - **End time**: ${formatDate(subTaskLog.end)}`
)
if (status === 'failure') {
failedSubTasks.push(remote !== undefined ? remote.name : remoteId)
} else if (subTaskLog.data.type === 'remote') {
const id = subTaskLog.data.id
const remote = await xo.getRemote(id).catch(() => {})
remotesText.push(
` - **${
remote !== undefined ? remote.name : `Remote Not found`
}** (${id}) ${icon}`,
` - **Start time**: ${formatDate(subTaskLog.start)}`,
` - **End time**: ${formatDate(subTaskLog.end)}`,
` - **Duration**: ${formatDuration(
subTaskLog.end - subTaskLog.start
)}`
)
if (subTaskLog.status === 'failure') {
failedSubTasks.push(remote !== undefined ? remote.name : id)
remotesText.push('', errorMessage)
}
} else {
const srId = data.id
const id = subTaskLog.data.id
let sr
try {
sr = xo.getObject(srId)
sr = xo.getObject(id)
} catch (e) {}
const [srName, srUuid] =
sr !== undefined ? [sr.name_label, sr.uuid] : [`SR Not found`, srId]
srsText.push(`- **${srName}** (${srUuid}) ${icon}`)
if (status === 'failure') {
failedSubTasks.push(sr !== undefined ? sr.name_label : srId)
sr !== undefined ? [sr.name_label, sr.uuid] : [`SR Not found`, id]
srsText.push(
` - **${srName}** (${srUuid}) ${icon}`,
` - **Start time**: ${formatDate(subTaskLog.start)}`,
` - **End time**: ${formatDate(subTaskLog.end)}`,
` - **Duration**: ${formatDuration(
subTaskLog.end - subTaskLog.start
)}`
)
if (subTaskLog.status === 'failure') {
failedSubTasks.push(sr !== undefined ? sr.name_label : id)
srsText.push('', errorMessage)
}
}
forEach(subTaskLog.tasks, operationLog => {
if (
operationLog.message !== 'merge' &&
operationLog.message !== 'transfer'
) {
return
}
const operationInfoText = []
if (operationLog.status === 'success') {
const size = operationLog.result.size
if (operationLog.message === 'merge') {
globalMergeSize += size
} else {
globalTransferSize += size
}
operationInfoText.push(
` - **Size**: ${formatSize(size)}`,
` - **Speed**: ${formatSpeed(
size,
operationLog.end - operationLog.start
)}`
)
} else {
operationInfoText.push(
` - **Error**: ${get(operationLog.result, 'message')}`
)
}
const operationText = [
` - **${operationLog.message}** ${
STATUS_ICON[operationLog.status]
}`,
` - **Start time**: ${formatDate(operationLog.start)}`,
` - **End time**: ${formatDate(operationLog.end)}`,
` - **Duration**: ${formatDuration(
operationLog.end - operationLog.start
)}`,
...operationInfoText,
].join('\n')
if (get(subTaskLog, 'data.type') === 'remote') {
remotesText.push(operationText)
remotesText.join('\n')
}
if (get(subTaskLog, 'data.type') === 'SR') {
srsText.push(operationText)
srsText.join('\n')
}
})
}
if (operationsText.length !== 0) {
operationsText.unshift(`#### Operations`, '')
}
if (srsText.length !== 0) {
srsText.unshift(`#### SRs`, '')
srsText.unshift(`- **SRs**`)
}
if (remotesText.length !== 0) {
remotesText.unshift(`#### remotes`, '')
remotesText.unshift(`- **Remotes**`)
}
const subText = [...operationsText, '', ...srsText, '', ...remotesText]
const result = vmTaskLog.result
if (vmTaskStatus === 'failure' && result !== undefined) {
const { message } = result
if (isSkippedError(result)) {
const subText = [...snapshotText, '', ...srsText, '', ...remotesText]
if (taskLog.result !== undefined) {
if (taskLog.status === 'skipped') {
++nSkipped
skippedVmsText.push(
...text,
`- **Reason**: ${
message === UNHEALTHY_VDI_CHAIN_ERROR
taskLog.result.message === UNHEALTHY_VDI_CHAIN_ERROR
? UNHEALTHY_VDI_CHAIN_MESSAGE
: message
: taskLog.result.message
}`,
''
)
nagiosText.push(
`[(Skipped) ${
vm !== undefined ? vm.name_label : 'undefined'
} : ${message} ]`
`[(Skipped) ${vm !== undefined ? vm.name_label : 'undefined'} : ${
taskLog.result.message
} ]`
)
} else {
++nFailures
failedVmsText.push(...text, `- **Error**: ${message}`, '')
failedVmsText.push(
...text,
`- **Error**: ${taskLog.result.message}`,
''
)
nagiosText.push(
`[(Failed) ${
vm !== undefined ? vm.name_label : 'undefined'
} : ${message} ]`
`[(Failed) ${vm !== undefined ? vm.name_label : 'undefined'} : ${
taskLog.result.message
} ]`
)
}
} else {
let transferSize, transferDuration, mergeSize, mergeDuration
forEach(logs[vmTaskLog.taskId], ({ taskId }) => {
if (transferSize !== undefined) {
return false
}
const transferTask = find(logs[taskId], { message: 'transfer' })
if (transferTask !== undefined) {
transferSize = transferTask.result.size
transferDuration = transferTask.end - transferTask.start
}
const mergeTask = find(logs[taskId], { message: 'merge' })
if (mergeTask !== undefined) {
mergeSize = mergeTask.result.size
mergeDuration = mergeTask.end - mergeTask.start
}
})
if (transferSize !== undefined) {
globalTransferSize += transferSize
text.push(
`- **Transfer size**: ${formatSize(transferSize)}`,
`- **Transfer speed**: ${formatSpeed(
transferSize,
transferDuration
)}`
)
}
if (mergeSize !== undefined) {
globalMergeSize += mergeSize
text.push(
`- **Merge size**: ${formatSize(mergeSize)}`,
`- **Merge speed**: ${formatSpeed(mergeSize, mergeDuration)}`
)
}
if (vmTaskStatus === 'failure') {
if (taskLog.status === 'failure') {
++nFailures
failedVmsText.push(...text, '', '', ...subText, '')
nagiosText.push(
`[(Failed) ${
`[${
vm !== undefined ? vm.name_label : 'undefined'
}: (failed)[${failedSubTasks.toString()}]]`
)
@@ -311,23 +351,16 @@ class BackupReportsXoPlugin {
}
}
}
const globalSuccess = nFailures === 0 && nSkipped === 0
if (reportWhen === 'failure' && globalSuccess) {
return
}
const nVms = vmsTaskLog.length
const nVms = log.tasks.length
const nSuccesses = nVms - nFailures - nSkipped
const globalStatus = globalSuccess
? `Success`
: nFailures !== 0 ? `Failure` : `Skipped`
let markdown = [
`## Global status: ${globalStatus}`,
`## Global status: ${log.status}`,
'',
`- **mode**: ${mode}`,
`- **Start time**: ${formatDate(jobLog.start)}`,
`- **End time**: ${formatDate(jobLog.end)}`,
`- **Duration**: ${formatDuration(jobLog.duration)}`,
`- **Start time**: ${formatDate(log.start)}`,
`- **End time**: ${formatDate(log.end)}`,
`- **Duration**: ${formatDuration(log.end - log.start)}`,
`- **Successes**: ${nSuccesses} / ${nVms}`,
]
@@ -367,17 +400,16 @@ class BackupReportsXoPlugin {
markdown = markdown.join('\n')
return this._sendReport({
markdown,
subject: `[Xen Orchestra] ${globalStatus} Backup report for ${jobName} ${
globalSuccess
? ICON_SUCCESS
: nFailures !== 0 ? ICON_FAILURE : ICON_SKIPPED
subject: `[Xen Orchestra] ${log.status} Backup report for ${jobName} ${
STATUS_ICON[log.status]
}`,
nagiosStatus: globalSuccess ? 0 : 2,
nagiosMarkdown: globalSuccess
? `[Xen Orchestra] [Success] Backup report for ${jobName}`
: `[Xen Orchestra] [${
nFailures !== 0 ? 'Failure' : 'Skipped'
}] Backup report for ${jobName} - VMs : ${nagiosText.join(' ')}`,
nagiosStatus: log.status === 'success' ? 0 : 2,
nagiosMarkdown:
log.status === 'success'
? `[Xen Orchestra] [Success] Backup report for ${jobName}`
: `[Xen Orchestra] [${
nFailures !== 0 ? 'Failure' : 'Skipped'
}] Backup report for ${jobName} - VMs : ${nagiosText.join(' ')}`,
})
}
@@ -401,7 +433,7 @@ class BackupReportsXoPlugin {
}),
xo.sendPassiveCheck !== undefined &&
xo.sendPassiveCheck({
nagiosStatus,
status: nagiosStatus,
message: nagiosMarkdown,
}),
])
@@ -567,7 +599,9 @@ class BackupReportsXoPlugin {
const nSuccesses = nCalls - nFailures - nSkipped
const globalStatus = globalSuccess
? `Success`
: nFailures !== 0 ? `Failure` : `Skipped`
: nFailures !== 0
? `Failure`
: `Skipped`
let markdown = [
`## Global status: ${globalStatus}`,
@@ -625,7 +659,9 @@ class BackupReportsXoPlugin {
subject: `[Xen Orchestra] ${globalStatus} Backup report for ${tag} ${
globalSuccess
? ICON_SUCCESS
: nFailures !== 0 ? ICON_FAILURE : ICON_SKIPPED
: nFailures !== 0
? ICON_FAILURE
: ICON_SKIPPED
}`,
nagiosStatus: globalSuccess ? 0 : 2,
nagiosMarkdown: globalSuccess
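Among the changes above, `formatSpeed` is now guarded against a zero duration. A sketch of that guard with a plain formatter in place of the `human-format` dependency:

```javascript
// Guarded speed formatter: a zero (or negative) duration yields 'N/A'
// instead of Infinity, as in the patched formatSpeed above.
const formatSpeed = (bytes, milliseconds) =>
  milliseconds > 0
    ? `${((bytes * 1e3) / milliseconds / 2 ** 20).toFixed(2)} MiB/s`
    : 'N/A'

console.log(formatSpeed(100 * 2 ** 20, 1000)) // "100.00 MiB/s"
console.log(formatSpeed(100 * 2 ** 20, 0))    // "N/A"
```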


@@ -1,6 +1,6 @@
{
"name": "xo-server-load-balancer",
"version": "0.3.1",
"version": "0.3.2",
"license": "AGPL-3.0",
"description": "Load balancer for XO-Server",
"keywords": [


@@ -7,9 +7,13 @@ import { debug } from './utils'
export default class DensityPlan extends Plan {
_checkRessourcesThresholds (objects, averages) {
const { low } = this._thresholds.memoryFree
return filter(
objects,
object => averages[object.id].memoryFree > this._thresholds.memoryFree.low
object => {
const { memory, memoryFree = memory } = averages[object.id]
return memoryFree > low
}
)
}
@@ -145,7 +149,9 @@ export default class DensityPlan extends Plan {
// Test if a VM migration on a destination (of a destinations set) is possible.
_testMigration ({ vm, destinations, hostsAverages, vmsAverages }) {
const { _thresholds: { critical: criticalThreshold } } = this
const {
_thresholds: { critical: criticalThreshold },
} = this
// Sort the destinations by available memory. (- -> +)
destinations.sort(


@@ -56,7 +56,7 @@ export default class PerformancePlan extends Plan {
}
const { averages, toOptimize } = results
let { hosts } = results
const { hosts } = results
toOptimize.sort((a, b) => {
a = averages[a.id]
@@ -69,12 +69,12 @@ export default class PerformancePlan extends Plan {
const { id } = exceededHost
debug(`Try to optimize Host (${exceededHost.id}).`)
hosts = filter(hosts, host => host.id !== id)
const availableHosts = filter(hosts, host => host.id !== id)
// Search bests combinations for the worst host.
await this._optimize({
exceededHost,
hosts,
hosts: availableHosts,
hostsAverages: averages,
})
}


@@ -1,4 +1,4 @@
import { filter, includes, map as mapToArray } from 'lodash'
import { filter, includes, map as mapToArray, size } from 'lodash'
import { EXECUTION_DELAY, debug } from './utils'
@@ -23,13 +23,18 @@ const numberOrDefault = (value, def) => (value >= 0 ? value : def)
// Averages.
// ===================================================================
function computeAverage (values, nPoints = values.length) {
function computeAverage (values, nPoints) {
if (values === undefined) {
return
}
let sum = 0
let tot = 0
const { length } = values
const start = nPoints !== undefined ? length - nPoints : 0
for (let i = length - nPoints; i < length; i++) {
for (let i = start; i < length; i++) {
const value = values[i]
sum += value || 0
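The old code computed `length - nPoints` even when `nPoints` was undefined, which yields `NaN` and an empty loop. A self-contained sketch of the fixed behavior — the tail of the loop body is truncated in the diff, so the counting and division below are assumptions:

```javascript
const computeAverage = (values, nPoints) => {
  if (values === undefined) {
    return
  }
  const { length } = values
  // fixed: fall back to averaging the whole array when nPoints is omitted
  const start = nPoints !== undefined ? length - nPoints : 0
  let sum = 0
  let n = 0 // assumed: count of defined points (not shown in the diff)
  for (let i = start; i < length; i++) {
    const value = values[i]
    if (value !== undefined) {
      sum += value
      n++
    }
  }
  return n === 0 ? undefined : sum / n
}

console.log(computeAverage([1, 2, 3]))    // 2   — whole array
console.log(computeAverage([1, 2, 3], 2)) // 2.5 — last 2 points only
```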
@@ -53,7 +58,7 @@ function computeRessourcesAverage (objects, objectsStats, nPoints) {
cpu: computeAverage(
mapToArray(stats.cpus, cpu => computeAverage(cpu, nPoints))
),
nCpus: stats.cpus.length,
nCpus: size(stats.cpus),
memoryFree: computeAverage(stats.memoryFree, nPoints),
memory: computeAverage(stats.memory, nPoints),
}
@@ -69,9 +74,13 @@ function computeRessourcesAverageWithWeight (averages1, averages2, ratio) {
const objectAverages = (averages[id] = {})
for (const averageName in averages1[id]) {
const average1 = averages1[id][averageName]
if (average1 === undefined) {
continue
}
objectAverages[averageName] =
averages1[id][averageName] * ratio +
averages2[id][averageName] * (1 - ratio)
average1 * ratio + averages2[id][averageName] * (1 - ratio)
}
}


@@ -26,10 +26,10 @@
"lodash": "^4.17.4"
},
"devDependencies": {
"@babel/cli": "7.0.0-beta.44",
"@babel/core": "7.0.0-beta.44",
"@babel/preset-env": "7.0.0-beta.44",
"@babel/preset-flow": "^7.0.0-beta.44",
"@babel/cli": "7.0.0-beta.49",
"@babel/core": "7.0.0-beta.49",
"@babel/preset-env": "7.0.0-beta.49",
"@babel/preset-flow": "^7.0.0-beta.49",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"rimraf": "^2.6.2"


@@ -1,6 +1,6 @@
{
"name": "xo-server-usage-report",
"version": "0.4.2",
"version": "0.5.0",
"license": "AGPL-3.0",
"description": "",
"keywords": [


@@ -139,8 +139,8 @@ Handlebars.registerHelper(
new Handlebars.SafeString(
isFinite(+value) && +value !== 0
? (value = round(value, 2)) > 0
? `(<b style="color: green;">▲ ${value}</b>)`
: `(<b style="color: red;">▼ ${String(value).slice(1)}</b>)`
? `(<b style="color: green;">▲ ${value}%</b>)`
: `(<b style="color: red;">▼ ${String(value).slice(1)}%</b>)`
: ''
)
)
@@ -270,12 +270,16 @@ async function getHostsStats ({ runningHosts, xo }) {
function getSrsStats (xoObjects) {
return orderBy(
map(filter(xoObjects, { type: 'SR' }), sr => {
map(filter(xoObjects, obj => obj.type === 'SR' && obj.size > 0), sr => {
const total = sr.size / gibPower
const used = sr.physical_usage / gibPower
let name = sr.name_label
if (!sr.shared) {
name += ` (${find(xoObjects, { id: sr.$container }).name_label})`
}
return {
uuid: sr.uuid,
name: sr.name_label,
name,
total,
used,
free: total - used,


@@ -1,6 +1,6 @@
{
"name": "xo-server",
"version": "5.19.6",
"version": "5.21.0",
"license": "AGPL-3.0",
"description": "Server part of Xen-Orchestra",
"keywords": [
@@ -31,15 +31,15 @@
"node": ">=6"
},
"dependencies": {
"@babel/polyfill": "7.0.0-beta.44",
"@babel/polyfill": "7.0.0-beta.49",
"@marsaud/smb2-promise": "^0.2.1",
"@xen-orchestra/cron": "^1.0.3",
"@xen-orchestra/fs": "^0.0.0",
"@xen-orchestra/fs": "^0.1.0",
"ajv": "^6.1.1",
"app-conf": "^0.5.0",
"archiver": "^2.1.0",
"async-iterator-to-stream": "^1.0.1",
"base64url": "^2.0.0",
"base64url": "^3.0.0",
"bind-property-descriptor": "^1.0.0",
"blocked": "^1.2.1",
"bluebird": "^3.5.1",
@@ -59,10 +59,10 @@
"express-session": "^1.15.6",
"fatfs": "^0.10.4",
"from2": "^2.3.0",
"fs-extra": "^5.0.0",
"fs-extra": "^6.0.1",
"get-stream": "^3.0.0",
"golike-defer": "^0.4.1",
"hashy": "^0.6.2",
"hashy": "^0.7.1",
"helmet": "^3.9.0",
"highland": "^2.11.1",
"http-proxy": "^1.16.2",
@@ -70,14 +70,14 @@
"http-server-plus": "^0.10.0",
"human-format": "^0.10.0",
"is-redirect": "^1.0.0",
"jest-worker": "^22.4.3",
"jest-worker": "^23.0.0",
"js-yaml": "^3.10.0",
"json-rpc-peer": "^0.15.3",
"json5": "^1.0.0",
"julien-f-source-map-support": "0.1.0",
"julien-f-unzip": "^0.2.1",
"kindof": "^2.0.0",
"level": "^3.0.0",
"level": "^4.0.0",
"level-party": "^3.0.4",
"level-sublevel": "^6.6.1",
"limit-concurrency-decorator": "^0.4.0",
@@ -93,16 +93,16 @@
"partial-stream": "0.0.0",
"passport": "^0.4.0",
"passport-local": "^1.0.0",
"pretty-format": "^22.0.3",
"pretty-format": "^23.0.0",
"promise-toolbox": "^0.9.5",
"proxy-agent": "^2.1.0",
"proxy-agent": "^3.0.0",
"pug": "^2.0.0-rc.4",
"pw": "^0.0.4",
"redis": "^2.8.0",
"schema-inspector": "^1.6.8",
"semver": "^5.4.1",
"serve-static": "^1.13.1",
"split-lines": "^1.1.0",
"split-lines": "^2.0.0",
"stack-chain": "^2.0.0",
"stoppable": "^1.0.5",
"struct-fu": "^1.2.0",
@@ -111,29 +111,29 @@
"tmp": "^0.0.33",
"uuid": "^3.0.1",
"value-matcher": "^0.2.0",
"vhd-lib": "^0.0.0",
"vhd-lib": "^0.2.0",
"ws": "^5.0.0",
"xen-api": "^0.16.9",
"xen-api": "^0.16.11",
"xml2js": "^0.4.19",
"xo-acl-resolver": "^0.2.3",
"xo-acl-resolver": "^0.2.4",
"xo-collection": "^0.4.1",
"xo-common": "^0.1.1",
"xo-remote-parser": "^0.3",
"xo-vmdk-to-vhd": "^0.1.1",
"xo-vmdk-to-vhd": "^0.1.3",
"yazl": "^2.4.3"
},
"devDependencies": {
"@babel/cli": "7.0.0-beta.44",
"@babel/core": "7.0.0-beta.44",
"@babel/plugin-proposal-decorators": "7.0.0-beta.44",
"@babel/plugin-proposal-export-default-from": "7.0.0-beta.44",
"@babel/plugin-proposal-export-namespace-from": "7.0.0-beta.44",
"@babel/plugin-proposal-function-bind": "7.0.0-beta.44",
"@babel/plugin-proposal-optional-chaining": "^7.0.0-beta.44",
"@babel/plugin-proposal-pipeline-operator": "^7.0.0-beta.44",
"@babel/plugin-proposal-throw-expressions": "^7.0.0-beta.44",
"@babel/preset-env": "7.0.0-beta.44",
"@babel/preset-flow": "7.0.0-beta.44",
"@babel/cli": "7.0.0-beta.49",
"@babel/core": "7.0.0-beta.49",
"@babel/plugin-proposal-decorators": "7.0.0-beta.49",
"@babel/plugin-proposal-export-default-from": "7.0.0-beta.49",
"@babel/plugin-proposal-export-namespace-from": "7.0.0-beta.49",
"@babel/plugin-proposal-function-bind": "7.0.0-beta.49",
"@babel/plugin-proposal-optional-chaining": "^7.0.0-beta.49",
"@babel/plugin-proposal-pipeline-operator": "^7.0.0-beta.49",
"@babel/plugin-proposal-throw-expressions": "^7.0.0-beta.49",
"@babel/preset-env": "7.0.0-beta.49",
"@babel/preset-flow": "7.0.0-beta.49",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"index-modules": "^0.3.0",


@@ -1,4 +1,5 @@
import { basename } from 'path'
import { isEmpty, pickBy } from 'lodash'
import { safeDateFormat } from '../utils'
@@ -117,8 +118,8 @@ getJob.params = {
},
}
export async function runJob ({ id, schedule }) {
return this.runJobSequence([id], await this.getSchedule(schedule))
export async function runJob ({ id, schedule, vm }) {
return this.runJobSequence([id], await this.getSchedule(schedule), vm)
}
runJob.permission = 'admin'
@@ -130,12 +131,17 @@ runJob.params = {
schedule: {
type: 'string',
},
vm: {
type: 'string',
optional: true,
},
}
// -----------------------------------------------------------------------------
export function getAllLogs () {
return this.getBackupNgLogs()
export async function getAllLogs (filter) {
const logs = await this.getBackupNgLogs()
return isEmpty(filter) ? logs : pickBy(logs, filter)
}
getAllLogs.permission = 'admin'


@@ -0,0 +1,41 @@
export function getAll () {
return this.getAllCloudConfigs()
}
getAll.permission = 'admin'
getAll.description = 'Gets all existing cloud configs templates'
export function create (props) {
return this.createCloudConfig(props)
}
create.permission = 'admin'
create.description = 'Creates a new cloud config template'
create.params = {
name: { type: 'string' },
template: { type: 'string' },
}
export function update (props) {
return this.updateCloudConfig(props)
}
update.permission = 'admin'
update.description = 'Modifies an existing cloud config template'
update.params = {
id: { type: 'string' },
name: { type: 'string', optional: true },
template: { type: 'string', optional: true },
}
function delete_ ({ id }) {
return this.deleteCloudConfig(id)
}
delete_.permission = 'admin'
delete_.description = 'Deletes an existing cloud config template'
delete_.params = {
id: { type: 'string' },
}
export { delete_ as delete }


@@ -76,6 +76,21 @@ export { restartAgent as restart_agent } // eslint-disable-line camelcase
// -------------------------------------------------------------------
export function setRemoteSyslogHost ({ host, syslogDestination }) {
return this.getXapi(host).setRemoteSyslogHost(host._xapiId, syslogDestination)
}
setRemoteSyslogHost.params = {
id: { type: 'string' },
syslogDestination: { type: 'string' },
}
setRemoteSyslogHost.resolve = {
host: ['id', 'host', 'administrate'],
}
// -------------------------------------------------------------------
export function start ({ host }) {
return this.getXapi(host).powerOnHost(host._xapiId)
}


@@ -1,5 +1,12 @@
// FIXME so far, no acls for jobs
export function cancel ({ runId }) {
return this.cancelJobRun(runId)
}
cancel.permission = 'admin'
cancel.description = 'Cancel a current run'
export async function getAll () {
return /* await */ this.getAllJobs('call')
}


@@ -146,8 +146,18 @@ export { uploadPatch as patch }
export async function mergeInto ({ source, target, force }) {
const sourceHost = this.getObject(source.master)
const targetHost = this.getObject(target.master)
if (sourceHost.productBrand !== targetHost.productBrand) {
throw new Error(
`a ${sourceHost.productBrand} pool cannot be merged into a ${
targetHost.productBrand
} pool`
)
}
const sourcePatches = sourceHost.patches
const targetPatches = this.getObject(target.master).patches
const targetPatches = targetHost.patches
const counterDiff = differenceBy(sourcePatches, targetPatches, 'name')
if (counterDiff.length > 0) {


@@ -25,8 +25,10 @@ function checkPermissionOnSrs (vm, permission = 'operate') {
if (vbd.is_cd_drive || !vdiId) {
return
}
return permissions.push([this.getObject(vdiId, 'VDI').$SR, permission])
return permissions.push([
this.getObject(vdiId, ['VDI', 'VDI-snapshot']).$SR,
permission,
])
})
return this.hasPermissions(this.session.get('user_id'), permissions).then(
@@ -50,11 +52,16 @@ const extract = (obj, prop) => {
export async function create (params) {
const { user } = this
const resourceSet = extract(params, 'resourceSet')
if (resourceSet === undefined && user.permission !== 'admin') {
const template = extract(params, 'template')
if (
resourceSet === undefined &&
!(await this.hasPermissions(this.user.id, [
[template.$pool, 'administrate'],
]))
) {
throw unauthorized()
}
const template = extract(params, 'template')
params.template = template._xapiId
const xapi = this.getXapi(template)
@@ -467,7 +474,7 @@ export async function migrate ({
})
}
if (!await this.hasPermissions(this.session.get('user_id'), permissions)) {
if (!(await this.hasPermissions(this.session.get('user_id'), permissions))) {
throw unauthorized()
}
@@ -656,8 +663,7 @@ clone.params = {
}
clone.resolve = {
// TODO: is it necessary for snapshots?
vm: ['id', 'VM', 'administrate'],
vm: ['id', ['VM', 'VM-snapshot'], 'administrate'],
}
// -------------------------------------------------------------------
@@ -707,9 +713,9 @@ copy.resolve = {
export async function convertToTemplate ({ vm }) {
// Converting to a template requires pool admin permission.
if (
!await this.hasPermissions(this.session.get('user_id'), [
!(await this.hasPermissions(this.session.get('user_id'), [
[vm.$pool, 'administrate'],
])
]))
) {
throw unauthorized()
}
@@ -1012,13 +1018,12 @@ export async function stop ({ vm, force }) {
// Hard shutdown
if (force) {
await xapi.call('VM.hard_shutdown', vm._xapiRef)
return
return xapi.shutdownVm(vm._xapiRef, { hard: true })
}
// Clean shutdown
try {
await xapi.call('VM.clean_shutdown', vm._xapiRef)
await xapi.shutdownVm(vm._xapiRef)
} catch (error) {
const { code } = error
if (
@@ -1269,7 +1274,9 @@ export async function createInterface ({
await this.checkResourceSetConstraints(resourceSet, this.user.id, [
network.id,
])
} else if (!await this.hasPermissions(this.user.id, [[network.id, 'view']])) {
} else if (
!(await this.hasPermissions(this.user.id, [[network.id, 'view']]))
) {
throw unauthorized()
}


@@ -507,7 +507,7 @@ const setUpConsoleProxy = (webServer, xo) => {
const { token } = parseCookies(req.headers.cookie)
const user = await xo.authenticateUser({ token })
if (!await xo.hasPermissions(user.id, [[id, 'operate']])) {
if (!(await xo.hasPermissions(user.id, [[id, 'operate']]))) {
throw invalidCredentials()
}
@@ -551,7 +551,7 @@ export default async function main (args) {
debug('blocked for %sms', ms | 0)
},
{
threshold: 50,
threshold: 500,
}
)
}


@@ -13,6 +13,10 @@ export default {
type: 'string',
description: 'identifier of this job',
},
scheduleId: {
type: 'string',
description: 'identifier of the schedule which ran the job',
},
key: {
type: 'string',
},


@@ -146,6 +146,7 @@ const TRANSFORMS = {
license_params: obj.license_params,
license_server: obj.license_server,
license_expiry: toTimestamp(obj.license_params.expiry),
logging: obj.logging,
name_description: obj.name_description,
name_label: obj.name_label,
memory: (function () {
@@ -186,9 +187,14 @@ const TRANSFORMS = {
}
}),
agentStartTime: toTimestamp(otherConfig.agent_start_time),
rebootRequired: !isEmpty(obj.updates_requiring_reboot),
rebootRequired:
softwareVersion.product_brand === 'XCP-ng'
? toTimestamp(otherConfig.boot_time) <
+otherConfig.rpm_patch_installation_time
: !isEmpty(obj.updates_requiring_reboot),
tags: obj.tags,
version: softwareVersion.product_version,
productBrand: softwareVersion.product_brand,
// TODO: dedupe.
PIFs: link(obj, 'PIFs'),
@@ -227,15 +233,20 @@ const TRANSFORMS = {
return
}
if (!guestMetrics) {
if (guestMetrics === undefined) {
return false
}
const { major, minor } = guestMetrics.PV_drivers_version
if (major === undefined || minor === undefined) {
return false
}
return {
major,
minor,
major: +major,
minor: +minor,
version: +`${major}.${minor}`,
}
})()
@@ -584,6 +595,7 @@ const TRANSFORMS = {
task (obj) {
return {
allowedOperations: obj.allowed_operations,
created: toTimestamp(obj.created),
current_operations: obj.current_operations,
finished: toTimestamp(obj.finished),


@@ -98,7 +98,9 @@ const getValuesFromDepth = (obj, targetPath) => {
const testMetric = (test, type) =>
typeof test === 'string'
? test === type
: typeof test === 'function' ? test(type) : test.exec(type)
: typeof test === 'function'
? test(type)
: test.exec(type)
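The reformatted ternary above dispatches on the type of `test`: exact string match, predicate function, or regular expression. A hedged usage sketch (the metric names below are invented, not taken from the stats tables):

```javascript
const testMetric = (test, type) =>
  typeof test === 'string'
    ? test === type
    : typeof test === 'function'
      ? test(type)
      : test.exec(type)

console.log(testMetric('memory_free_kib', 'memory_free_kib'))   // true
console.log(testMetric(type => type.startsWith('cpu'), 'cpu3')) // true
const matches = testMetric(/^iowait_(\w+)$/, 'iowait_sda')
console.log(matches[1]) // 'sda' — capture groups feed getPath(matches)
```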
const findMetric = (metrics, metricType) => {
let testResult
@@ -193,7 +195,6 @@ const STATS = {
iowait: {
test: /^iowait_(\w+)$/,
getPath: matches => ['iowait', matches[1]],
transformValue: value => value * 1e2,
},
},
vm: {
@@ -242,13 +243,14 @@ export default class XapiStats {
// Executes one HTTP request on a XenServer host to fetch stats
// Returns stats (JSON format) or throws a got exception
@limitConcurrency(3)
_getJson (xapi, host, timestamp) {
_getJson (xapi, host, timestamp, step) {
return xapi
.getResource('/rrd_updates', {
host,
query: {
cf: 'AVERAGE',
host: 'true',
interval: step,
json: 'true',
start: timestamp,
},
@@ -316,7 +318,7 @@ export default class XapiStats {
}
const timestamp = await this._getNextTimestamp(xapi, host, step)
const json = await this._getJson(xapi, host, timestamp)
const json = await this._getJson(xapi, host, timestamp, step)
if (json.meta.step !== step) {
throw new FaultyGranularity(
`Unable to get the true granularity: ${json.meta.step}`


@@ -70,7 +70,7 @@ import {
// ===================================================================
const TAG_BASE_DELTA = 'xo:base_delta'
const TAG_COPY_SRC = 'xo:copy_of'
export const TAG_COPY_SRC = 'xo:copy_of'
// ===================================================================
@@ -426,6 +426,14 @@ export default class Xapi extends XapiBase {
await this.call('host.restart_agent', this.getObject(hostId).$ref)
}
async setRemoteSyslogHost (hostId, syslogDestination) {
const host = this.getObject(hostId)
await this.call('host.set_logging', host.$ref, {
syslog_destination: syslogDestination,
})
await this.call('host.syslog_reconfigure', host.$ref)
}
async shutdownHost (hostId, force = false) {
const host = this.getObject(hostId)
@@ -656,7 +664,7 @@ export default class Xapi extends XapiBase {
}
// ensure the vm record is up-to-date
vm = await this.barrier('VM', $ref)
vm = await this.barrier($ref)
return Promise.all([
forceDeleteDefaultTemplate &&
@@ -816,12 +824,14 @@ export default class Xapi extends XapiBase {
} = {}
): Promise<DeltaVmExport> {
let vm = this.getObject(vmId)
if (!bypassVdiChainsCheck) {
this._assertHealthyVdiChains(vm)
}
// do not use the snapshot name in the delta export
const exportedNameLabel = vm.name_label
if (!vm.is_a_snapshot) {
if (!bypassVdiChainsCheck) {
this._assertHealthyVdiChains(vm)
}
vm = await this._snapshotVm($cancelToken, vm, snapshotNameLabel)
$defer.onFailure(() => this._deleteVm(vm))
}
@@ -958,7 +968,9 @@ export default class Xapi extends XapiBase {
)
if (!baseVm) {
throw new Error('could not find the base VM')
throw new Error(
`could not find the base VM (copy of ${remoteBaseVmUuid})`
)
}
}
}
@@ -1071,7 +1083,7 @@ export default class Xapi extends XapiBase {
.once('finish', () => {
transferSize += sizeStream.size
})
stream.task = sizeStream.task
sizeStream.task = stream.task
await this._importVdiContent(vdi, sizeStream, VDI_FORMAT_VHD)
}
}),
@@ -1142,7 +1154,9 @@ export default class Xapi extends XapiBase {
vdis[vdi.$ref] =
mapVdisSrs && mapVdisSrs[vdi.$id]
? hostXapi.getObject(mapVdisSrs[vdi.$id]).$ref
: sr !== undefined ? hostXapi.getObject(sr).$ref : defaultSr.$ref // Will error if there are no default SR.
: sr !== undefined
? hostXapi.getObject(sr).$ref
: defaultSr.$ref // Will throw if there is no default SR.
}
}


@@ -35,11 +35,24 @@ declare class XapiObject {
}
type Id = string | XapiObject
declare export class Vbd extends XapiObject {
type: string;
VDI: string;
}
declare export class Vdi extends XapiObject {
$snapshot_of: Vdi;
uuid: string;
}
declare export class Vm extends XapiObject {
$snapshots: Vm[];
$VBDs: Vbd[];
is_a_snapshot: boolean;
is_a_template: boolean;
name_label: string;
power_state: 'Running' | 'Halted' | 'Paused' | 'Suspended';
other_config: $Dict<string>;
snapshot_time: number;
uuid: string;
@@ -67,21 +80,24 @@ declare export class Xapi {
_snapshotVm(cancelToken: mixed, vm: Vm, nameLabel?: string): Promise<Vm>;
addTag(object: Id, tag: string): Promise<void>;
barrier(): void;
barrier(ref: string): XapiObject;
barrier(): Promise<void>;
barrier(ref: string): Promise<XapiObject>;
deleteVm(vm: Id): Promise<void>;
editVm(vm: Id, $Dict<mixed>): Promise<void>;
exportDeltaVm(
cancelToken: mixed,
snapshot: Id,
baseSnapshot ?: Id
): Promise<DeltaVmExport>;
exportVm(
cancelToken: mixed,
vm: Vm,
options ?: Object
): Promise<AugmentedReadable>;
getObject(object: Id): XapiObject;
importDeltaVm(data: DeltaVmImport, options: Object): Promise<{ vm: Vm }>;
importVm(stream: AugmentedReadable, options: Object): Promise<Vm>;
exportDeltaVm(
cancelToken: mixed,
snapshot: Id,
baseSnapshot ?: Id,
opts?: { fullVdisRequired?: string[] }
): Promise<DeltaVmExport>;
exportVm(
cancelToken: mixed,
vm: Vm,
options ?: Object
): Promise<AugmentedReadable>;
getObject(object: Id): XapiObject;
importDeltaVm(data: DeltaVmImport, options: Object): Promise<{ vm: Vm }>;
importVm(stream: AugmentedReadable, options: Object): Promise<Vm>;
shutdownVm(object: Id): Promise<void>;
startVm(object: Id): Promise<void>;
}


@@ -1,5 +1,6 @@
import deferrable from 'golike-defer'
import every from 'lodash/every'
import filter from 'lodash/filter'
import find from 'lodash/find'
import includes from 'lodash/includes'
import isObject from 'lodash/isObject'
@@ -11,6 +12,7 @@ import unzip from 'julien-f-unzip'
import { debounce } from '../../decorators'
import {
asyncMap,
ensureArray,
forEach,
mapFilter,
@@ -149,9 +151,12 @@ export default {
},
async listMissingPoolPatchesOnHost (hostId) {
const host = this.getObject(hostId)
// Returns an array to not break compatibility.
return mapToArray(
await this._listMissingPoolPatchesOnHost(this.getObject(hostId))
await (host.software_version.product_brand === 'XCP-ng'
? this._xcpListHostUpdates(host)
: this._listMissingPoolPatchesOnHost(host))
)
},
@@ -440,8 +445,14 @@ export default {
},
async installAllPoolPatchesOnHost (hostId) {
let host = this.getObject(hostId)
const host = this.getObject(hostId)
if (host.software_version.product_brand === 'XCP-ng') {
return this._xcpInstallHostUpdates(host)
}
return this._installAllPoolPatchesOnHost(host)
},
async _installAllPoolPatchesOnHost (host) {
const installableByUuid =
host.license_params.sku_type !== 'free'
? await this._listMissingPoolPatchesOnHost(host)
@@ -479,6 +490,13 @@ export default {
},
async installAllPoolPatchesOnAllHosts () {
if (this.pool.$master.software_version.product_brand === 'XCP-ng') {
return this._xcpInstallAllPoolUpdatesOnHost()
}
return this._installAllPoolPatchesOnAllHosts()
},
async _installAllPoolPatchesOnAllHosts () {
const installableByUuid = assign(
{},
...(await Promise.all(
@@ -518,4 +536,47 @@ export default {
})
}
},
// ----------------------------------
// XCP-ng dedicated zone for patching
// ----------------------------------
// list all yum updates available for an XCP-ng host
async _xcpListHostUpdates (host) {
return JSON.parse(
await this.call(
'host.call_plugin',
host.$ref,
'updater.py',
'check_update',
{}
)
)
},
// install all yum updates for an XCP-ng host
async _xcpInstallHostUpdates (host) {
const update = await this.call(
'host.call_plugin',
host.$ref,
'updater.py',
'update',
{}
)
if (JSON.parse(update).exit !== 0) {
throw new Error('Update install failed')
} else {
await this._updateObjectMapProperty(host, 'other_config', {
rpm_patch_installation_time: String(Date.now() / 1000),
})
}
},
// install all yum updates for all XCP-ng hosts in a given pool
async _xcpInstallAllPoolUpdatesOnHost () {
await asyncMap(filter(this.objects.all, { $type: 'host' }), host =>
this._xcpInstallHostUpdates(host)
)
},
}


@@ -1,6 +1,6 @@
import deferrable from 'golike-defer'
import { catchPlus as pCatch, ignoreErrors } from 'promise-toolbox'
import { find, gte, includes, isEmpty, lte } from 'lodash'
import { find, gte, includes, isEmpty, lte, noop } from 'lodash'
import { forEach, mapToArray, parseSize } from '../../utils'
@@ -204,7 +204,7 @@ export default {
if (cloudConfig != null) {
// Refresh the record.
await this.barrier('VM', vm.$ref)
await this.barrier(vm.$ref)
vm = this.getObjectByRef(vm.$ref)
// Find the SR of the first VDI.
@@ -224,7 +224,7 @@ export default {
}
// wait for the record with all the VBDs and VIFs
return this.barrier('VM', vm.$ref)
return this.barrier(vm.$ref)
},
// High level method to edit a VM.
@@ -429,4 +429,11 @@ export default {
// the force parameter is always true
return this.call('VM.resume', this.getObject(vmId).$ref, false, true)
},
shutdownVm (vmId, { hard = false } = {}) {
return this.call(
`VM.${hard ? 'hard' : 'clean'}_shutdown`,
this.getObject(vmId).$ref
).then(noop)
},
}


@@ -0,0 +1,138 @@
import { forEach } from 'lodash'
const isSkippedError = error =>
error.message === 'no disks found' ||
error.message === 'no such object' ||
error.message === 'no VMs match this pattern' ||
error.message === 'unhealthy VDI chain'
const getStatus = (
error,
status = error === undefined ? 'success' : 'failure'
) => (status === 'failure' && isSkippedError(error) ? 'skipped' : status)
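`getStatus` downgrades a failure to `skipped` when the error carries one of the benign messages listed in `isSkippedError`, while an explicitly passed status (e.g. `canceled`, as forwarded later from `data.status`) is preserved. A small check:

```javascript
const isSkippedError = error =>
  error.message === 'no disks found' ||
  error.message === 'no such object' ||
  error.message === 'no VMs match this pattern' ||
  error.message === 'unhealthy VDI chain'

const getStatus = (
  error,
  status = error === undefined ? 'success' : 'failure'
) => (status === 'failure' && isSkippedError(error) ? 'skipped' : status)

console.log(getStatus(undefined))                   // 'success'
console.log(getStatus(new Error('no disks found'))) // 'skipped'
console.log(getStatus(new Error('ECONNRESET')))     // 'failure'
console.log(getStatus(new Error('x'), 'canceled'))  // 'canceled'
```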
const computeStatusAndSortTasks = (status, tasks) => {
if (status === 'failure' || tasks === undefined) {
return status
}
for (let i = 0, n = tasks.length; i < n; ++i) {
const taskStatus = tasks[i].status
if (taskStatus === 'failure') {
return taskStatus
}
if (taskStatus === 'skipped') {
status = taskStatus
}
}
tasks.sort(taskTimeComparator)
return status
}
const taskTimeComparator = ({ start: s1, end: e1 }, { start: s2, end: e2 }) => {
if (e1 !== undefined) {
if (e2 !== undefined) {
// finished tasks are ordered by their end times
return e1 - e2
}
// finished task before unfinished tasks
return -1
} else if (e2 === undefined) {
// unfinished tasks are ordered by their start times
return s1 - s2
}
// unfinished task after finished tasks
return 1
}
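The comparator orders finished tasks first (by end time), then unfinished ones (by start time). A worked example with invented timestamps:

```javascript
const taskTimeComparator = ({ start: s1, end: e1 }, { start: s2, end: e2 }) => {
  if (e1 !== undefined) {
    if (e2 !== undefined) {
      return e1 - e2 // both finished: earlier end first
    }
    return -1 // finished before unfinished
  } else if (e2 === undefined) {
    return s1 - s2 // both unfinished: earlier start first
  }
  return 1 // unfinished after finished
}

const tasks = [
  { start: 5 },         // still running
  { start: 1, end: 9 }, // finished second
  { start: 2 },         // still running
  { start: 0, end: 3 }, // finished first
]
tasks.sort(taskTimeComparator)
console.log(tasks.map(t => (t.end !== undefined ? `end:${t.end}` : `start:${t.start}`)))
// → [ 'end:3', 'end:9', 'start:2', 'start:5' ]
```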
export default {
async getBackupNgLogs (runId?: string) {
const { runningJobs } = this
const consolidated = {}
const started = {}
forEach(await this.getLogs('jobs'), ({ data, time, message }, id) => {
const { event } = data
if (event === 'job.start') {
if (
(data.type === 'backup' || data.key === undefined) &&
(runId === undefined || runId === id)
) {
const { scheduleId, jobId } = data
consolidated[id] = started[id] = {
data: data.data,
id,
jobId,
scheduleId,
start: time,
status: runningJobs[jobId] === id ? 'pending' : 'interrupted',
}
}
} else if (event === 'job.end') {
const { runJobId } = data
const log = started[runJobId]
if (log !== undefined) {
delete started[runJobId]
log.end = time
log.status = computeStatusAndSortTasks(
getStatus((log.result = data.error)),
log.tasks
)
}
} else if (event === 'task.start') {
const parent = started[data.parentId]
if (parent !== undefined) {
;(parent.tasks || (parent.tasks = [])).push(
(started[id] = {
data: data.data,
id,
message,
start: time,
status: parent.status,
})
)
}
} else if (event === 'task.end') {
const { taskId } = data
const log = started[taskId]
if (log !== undefined) {
// TODO: merge/transfer work-around
delete started[taskId]
log.end = time
log.status = computeStatusAndSortTasks(
getStatus((log.result = data.result), data.status),
log.tasks
)
}
} else if (event === 'jobCall.start') {
const parent = started[data.runJobId]
if (parent !== undefined) {
;(parent.tasks || (parent.tasks = [])).push(
(started[id] = {
data: {
type: 'VM',
id: data.params.id,
},
id,
start: time,
status: parent.status,
})
)
}
} else if (event === 'jobCall.end') {
const { runCallId } = data
const log = started[runCallId]
if (log !== undefined) {
delete started[runCallId]
log.end = time
log.status = computeStatusAndSortTasks(
getStatus((log.result = data.error)),
log.tasks
)
}
}
})
return runId === undefined ? consolidated : consolidated[runId]
},
}


@@ -3,20 +3,27 @@
// $FlowFixMe
import type RemoteHandler from '@xen-orchestra/fs'
import defer from 'golike-defer'
import limitConcurrency from 'limit-concurrency-decorator'
import { type Pattern, createPredicate } from 'value-matcher'
import { type Readable, PassThrough } from 'stream'
import { AssertionError } from 'assert'
import { basename, dirname } from 'path'
import {
countBy,
forEach,
groupBy,
isEmpty,
last,
mapValues,
noop,
some,
sum,
values,
} from 'lodash'
import { fromEvent as pFromEvent, timeout as pTimeout } from 'promise-toolbox'
import {
fromEvent as pFromEvent,
ignoreErrors,
timeout as pTimeout,
} from 'promise-toolbox'
import Vhd, {
chainVhd,
createSyntheticStream as createVhdReadStream,
@@ -29,9 +36,12 @@ import createSizeStream from '../../size-stream'
import {
type DeltaVmExport,
type DeltaVmImport,
type Vdi,
type Vm,
type Xapi,
TAG_COPY_SRC,
} from '../../xapi'
import { getVmDisks } from '../../xapi/utils'
import {
asyncMap,
resolveRelativeFromFile,
@@ -41,12 +51,15 @@ import {
import { translateLegacyJob } from './migration'
type Mode = 'full' | 'delta'
type ReportWhen = 'always' | 'failure' | 'never'
export type Mode = 'full' | 'delta'
export type ReportWhen = 'always' | 'failure' | 'never'
type Settings = {|
concurrency?: number,
deleteFirst?: boolean,
copyRetention?: number,
exportRetention?: number,
offlineSnapshot?: boolean,
reportWhen?: ReportWhen,
snapshotRetention?: number,
vmTimeout?: number,
@@ -91,33 +104,6 @@ type MetadataFull = {|
|}
type Metadata = MetadataDelta | MetadataFull
type ConsolidatedJob = {|
duration?: number,
end?: number,
error?: Object,
id: string,
jobId: string,
mode: Mode,
start: number,
type: 'backup' | 'call',
userId: string,
|}
type ConsolidatedTask = {|
data?: Object,
duration?: number,
end?: number,
parentId: string,
message: string,
result?: Object,
start: number,
status: 'canceled' | 'failure' | 'success',
taskId: string,
|}
type ConsolidatedBackupNgLog = {
roots: Array<ConsolidatedJob>,
[parentId: string]: Array<ConsolidatedTask>,
}
const compareSnapshotTime = (a: Vm, b: Vm): number =>
a.snapshot_time < b.snapshot_time ? -1 : 1
@@ -131,20 +117,25 @@ const compareTimestamp = (a: Metadata, b: Metadata): number =>
const getOldEntries = <T>(retention: number, entries?: T[]): T[] =>
entries === undefined
? []
: --retention > 0 ? entries.slice(0, -retention) : entries
: --retention > 0
? entries.slice(0, -retention)
: entries
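Read literally, `getOldEntries` returns the entries to delete while keeping the `retention − 1` newest ones — presumably leaving room for the backup about to be created; that reading of the decrement is an inference, not stated in the diff. Sketch with the Flow annotations dropped:

```javascript
const getOldEntries = (retention, entries) =>
  entries === undefined
    ? []
    : --retention > 0
      ? entries.slice(0, -retention)
      : entries

// entries are assumed sorted oldest → newest
console.log(getOldEntries(3, ['a', 'b', 'c', 'd'])) // ['a', 'b'] — keep 2 newest
console.log(getOldEntries(1, ['a', 'b']))           // ['a', 'b'] — keep none
console.log(getOldEntries(2, undefined))            // []
```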
const defaultSettings: Settings = {
concurrency: 0,
deleteFirst: false,
exportRetention: 0,
offlineSnapshot: false,
reportWhen: 'failure',
snapshotRetention: 0,
vmTimeout: 0,
}
const getSetting = (
const getSetting = <T>(
settings: $Dict<Settings>,
name: $Keys<Settings>,
...keys: string[]
): any => {
keys: string[],
defaultValue?: T
): T | any => {
for (let i = 0, n = keys.length; i < n; ++i) {
const objectSettings = settings[keys[i]]
if (objectSettings !== undefined) {
@@ -154,12 +145,16 @@ const getSetting = (
}
}
}
if (defaultValue !== undefined) {
return defaultValue
}
return defaultSettings[name]
}
const BACKUP_DIR = 'xo-vm-backups'
const getVmBackupDir = (uuid: string) => `${BACKUP_DIR}/${uuid}`
const isHiddenFile = (filename: string) => filename[0] === '.'
const isMetadataFile = (filename: string) => filename.endsWith('.json')
const isVhd = (filename: string) => filename.endsWith('.vhd')
@@ -260,6 +255,10 @@ const importers: $Dict<
},
}
const PARSE_UUID_RE = /-/g
const parseUuid = (uuid: string) =>
Buffer.from(uuid.replace(PARSE_UUID_RE, ''), 'hex')
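`parseUuid` packs a dashed UUID string into its 16 raw bytes; the TODO near the end of this file suggests combining it with base64url to shorten backup paths. Node-only sketch (relies on the global `Buffer`):

```javascript
const PARSE_UUID_RE = /-/g
const parseUuid = uuid => Buffer.from(uuid.replace(PARSE_UUID_RE, ''), 'hex')

const buf = parseUuid('00112233-4455-6677-8899-aabbccddeeff')
console.log(buf.length)          // 16 — raw bytes instead of 36 characters
console.log(buf.toString('hex')) // '00112233445566778899aabbccddeeff'
```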
const parseVmBackupId = (id: string) => {
const i = id.indexOf('/')
return {
@@ -329,10 +328,7 @@ const wrapTask = async <T>(opts: any, task: Promise<T>): Promise<T> => {
value => {
logger.notice(message, {
event: 'task.end',
result:
result === undefined
? value
: typeof result === 'function' ? result(value) : result,
result: typeof result === 'function' ? result(value) : result,
status: 'success',
taskId,
})
@@ -368,10 +364,7 @@ const wrapTaskFn = <T>(
const value = await task.apply(this, [taskId, ...arguments])
logger.notice(message, {
event: 'task.end',
result:
result === undefined
? value
: typeof result === 'function' ? result(value) : result,
result: typeof result === 'function' ? result(value) : result,
status: 'success',
taskId,
})
@@ -433,6 +426,7 @@ export default class BackupNg {
app.on('start', () => {
const executor: Executor = async ({
cancelToken,
data: vmId,
job: job_,
logger,
runJobId,
@@ -443,18 +437,36 @@ export default class BackupNg {
}
const job: BackupJob = (job_: any)
const vms: $Dict<Vm> = app.getObjects({
filter: createPredicate({
type: 'VM',
...job.vms,
}),
})
if (isEmpty(vms)) {
throw new Error('no VMs match this pattern')
let vms: $Dict<Vm> | void
if (vmId === undefined) {
vms = app.getObjects({
filter: createPredicate({
type: 'VM',
...job.vms,
}),
})
if (isEmpty(vms)) {
throw new Error('no VMs match this pattern')
}
}
const jobId = job.id
const scheduleId = schedule.id
await asyncMap(vms, async vm => {
const srs = unboxIds(job.srs).map(id => {
const xapi = app.getXapi(id)
return {
__proto__: xapi.getObject(id),
xapi,
}
})
const remotes = await Promise.all(
unboxIds(job.remotes).map(async id => ({
id,
handler: await app.getRemoteHandler(id),
}))
)
let handleVm = async vm => {
const { name_label: name, uuid } = vm
const taskId: string = logger.notice(
`Starting backup of ${name}. (${jobId})`,
@@ -476,16 +488,14 @@ export default class BackupNg {
job,
schedule,
logger,
taskId
taskId,
srs,
remotes
)
const vmTimeout: number = getSetting(
job.settings,
'vmTimeout',
const vmTimeout: number = getSetting(job.settings, 'vmTimeout', [
uuid,
scheduleId,
logger,
taskId
)
])
if (vmTimeout !== 0) {
p = pTimeout.call(p, vmTimeout)
}
@@ -506,7 +516,19 @@ export default class BackupNg {
: serializeError(error),
})
}
})
}
if (vms === undefined) {
return handleVm(await app.getObject(vmId))
}
const concurrency: number = getSetting(job.settings, 'concurrency', [
'',
])
if (concurrency !== 0) {
handleVm = limitConcurrency(concurrency)(handleVm)
}
await asyncMap(vms, handleVm)
}
app.registerJobExecutor('backup', executor)
})
@@ -669,7 +691,7 @@ export default class BackupNg {
// - [ ] display queued VMs
// - [ ] snapshots and files of an old job should be detected and removed
// - [ ] delta import should support mapVdisSrs
// - [ ] size of the path? (base64url(Buffer.from(uuid.split('-').join(''), 'hex')))
// - [ ] size of the path? (base64url(parseUuid(uuid)))
// - [ ] what does vmTimeout mean with the new concurrency? a VM can take
//       a very long time to finish if there are other VMs before it…
// - [ ] detect and gc incomplete replications
@@ -703,7 +725,9 @@ export default class BackupNg {
job: BackupJob,
schedule: Schedule,
logger: any,
taskId: string
taskId: string,
srs: any[],
remotes: any[]
): Promise<void> {
const app = this._app
const xapi = app.getXapi(vmUuid)
@@ -712,31 +736,66 @@ export default class BackupNg {
// ensure the VM itself does not have any backup metadata which would be
// copied on manual snapshots and interfere with the backup jobs
if ('xo:backup:job' in vm.other_config) {
await xapi._updateObjectMapProperty(vm, 'other_config', {
'xo:backup:job': null,
'xo:backup:schedule': null,
'xo:backup:vm': null,
})
await wrapTask(
{
logger,
message: 'clean backup metadata on VM',
parentId: taskId,
},
xapi._updateObjectMapProperty(vm, 'other_config', {
'xo:backup:job': null,
'xo:backup:schedule': null,
'xo:backup:vm': null,
})
)
}
const { id: jobId, settings } = job
const { id: scheduleId } = schedule
const exportRetention: number = getSetting(
settings,
'exportRetention',
scheduleId
)
let exportRetention: number = getSetting(settings, 'exportRetention', [
scheduleId,
])
let copyRetention: number | void = getSetting(settings, 'copyRetention', [
scheduleId,
])
if (copyRetention === undefined) {
// if copyRetention is not defined, it falls back to exportRetention's
// value because the previous implementation did not support copyRetention
copyRetention = srs.length === 0 ? 0 : exportRetention
if (remotes.length === 0) {
exportRetention = 0
}
} else if (exportRetention !== 0 && remotes.length === 0) {
throw new Error('export retention must be 0 without remotes')
}
if (copyRetention !== 0 && srs.length === 0) {
throw new Error('copy retention must be 0 without SRs')
}
if (
remotes.length !== 0 &&
srs.length !== 0 &&
(copyRetention === 0) !== (exportRetention === 0)
) {
throw new Error('both or neither copy and export retentions must be 0')
}
const snapshotRetention: number = getSetting(
settings,
'snapshotRetention',
scheduleId
[scheduleId]
)
if (exportRetention === 0) {
if (snapshotRetention === 0) {
throw new Error('export and snapshots retentions cannot both be 0')
}
if (
copyRetention === 0 &&
exportRetention === 0 &&
snapshotRetention === 0
) {
throw new Error('copy, export and snapshot retentions cannot all be 0')
}
if (
@@ -752,13 +811,29 @@ export default class BackupNg {
.filter(_ => _.other_config['xo:backup:job'] === jobId)
.sort(compareSnapshotTime)
await xapi._assertHealthyVdiChains(vm)
xapi._assertHealthyVdiChains(vm)
const offlineSnapshot: boolean = getSetting(settings, 'offlineSnapshot', [
vmUuid,
'',
])
const startAfterSnapshot = offlineSnapshot && vm.power_state === 'Running'
if (startAfterSnapshot) {
await wrapTask(
{
logger,
message: 'shutdown VM',
parentId: taskId,
},
xapi.shutdownVm(vm)
)
}
let snapshot: Vm = (await wrapTask(
{
parentId: taskId,
logger,
message: 'snapshot',
parentId: taskId,
result: _ => _.uuid,
},
xapi._snapshotVm(
@@ -767,11 +842,23 @@ export default class BackupNg {
`[XO Backup ${job.name}] ${vm.name_label}`
)
): any)
await xapi._updateObjectMapProperty(snapshot, 'other_config', {
'xo:backup:job': jobId,
'xo:backup:schedule': scheduleId,
'xo:backup:vm': vmUuid,
})
if (startAfterSnapshot) {
ignoreErrors.call(xapi.startVm(vm))
}
await wrapTask(
{
logger,
message: 'add metadata to snapshot',
parentId: taskId,
},
xapi._updateObjectMapProperty(snapshot, 'other_config', {
'xo:backup:job': jobId,
'xo:backup:schedule': scheduleId,
'xo:backup:vm': vmUuid,
})
)
$defer(() =>
asyncMap(
@@ -785,18 +872,20 @@ export default class BackupNg {
)
)
snapshot = ((await xapi.barrier(snapshot.$ref): any): Vm)
snapshot = ((await wrapTask(
{
logger,
message: 'waiting for up-to-date snapshot record',
parentId: taskId,
},
xapi.barrier(snapshot.$ref)
): any): Vm)
if (exportRetention === 0) {
if (copyRetention === 0 && exportRetention === 0) {
return
}
const remotes = unboxIds(job.remotes)
const srs = unboxIds(job.srs)
const nTargets = remotes.length + srs.length
if (nTargets === 0) {
throw new Error('export retention must be 0 without remotes and SRs')
}
const now = Date.now()
const vmDir = getVmBackupDir(vmUuid)
@@ -812,14 +901,21 @@ export default class BackupNg {
$defer.call(xapi, 'deleteVm', snapshot)
}
let xva: any = await xapi.exportVm($cancelToken, snapshot, {
compress: job.compression === 'native',
})
let xva: any = await wrapTask(
{
logger,
message: 'start snapshot export',
parentId: taskId,
},
xapi.exportVm($cancelToken, snapshot, {
compress: job.compression === 'native',
})
)
const exportTask = xva.task
xva = xva.pipe(createSizeStream())
const forkExport =
nTargets === 0
nTargets === 1
? () => xva
: () => {
const fork = xva.pipe(new PassThrough())
@@ -847,17 +943,15 @@ export default class BackupNg {
[
...remotes.map(
wrapTaskFn(
id => ({
({ id }) => ({
data: { id, type: 'remote' },
logger,
message: 'export',
parentId: taskId,
}),
async (taskId, remoteId) => {
async (taskId, { handler, id: remoteId }) => {
const fork = forkExport()
const handler = await app.getRemoteHandler(remoteId)
const oldBackups: MetadataFull[] = (getOldEntries(
exportRetention,
await this._listVmBackups(
@@ -867,11 +961,9 @@ export default class BackupNg {
)
): any)
const deleteFirst = getSetting(
settings,
'deleteFirst',
remoteId
)
const deleteFirst = getSetting(settings, 'deleteFirst', [
remoteId,
])
if (deleteFirst) {
await this._deleteFullVmBackups(handler, oldBackups)
}
@@ -881,9 +973,7 @@ export default class BackupNg {
logger,
message: 'transfer',
parentId: taskId,
result: {
size: 0,
},
result: () => ({ size: xva.size }),
},
writeStream(fork, handler, dataFilename)
)
@@ -898,24 +988,23 @@ export default class BackupNg {
),
...srs.map(
wrapTaskFn(
id => ({
({ $id: id }) => ({
data: { id, type: 'SR' },
logger,
message: 'export',
parentId: taskId,
}),
async (taskId, srId) => {
async (taskId, sr) => {
const fork = forkExport()
const xapi = app.getXapi(srId)
const sr = xapi.getObject(srId)
const { $id: srId, xapi } = sr
const oldVms = getOldEntries(
exportRetention,
copyRetention,
listReplicatedVms(xapi, scheduleId, srId, vmUuid)
)
const deleteFirst = getSetting(settings, 'deleteFirst', srId)
const deleteFirst = getSetting(settings, 'deleteFirst', [srId])
if (deleteFirst) {
await this._deleteVms(xapi, oldVms)
}
@@ -926,9 +1015,7 @@ export default class BackupNg {
logger,
message: 'transfer',
parentId: taskId,
result: {
size: 0,
},
result: () => ({ size: xva.size }),
},
xapi._importVm($cancelToken, fork, sr, vm =>
xapi._setObjectProperties(vm, {
@@ -966,17 +1053,108 @@ export default class BackupNg {
$defer.onFailure.call(xapi, 'deleteVm', snapshot)
}
const baseSnapshot = last(snapshots)
if (baseSnapshot !== undefined) {
console.log(baseSnapshot.$id) // TODO: remove
// check current state
// await Promise.all([asyncMap(remotes, remoteId => {})])
}
// JFT: TODO: remove when enough time has passed (~2018-09)
//
// Fix VHD UUIDs (= VDI.uuid), which were not set before 2018-06-16.
await asyncMap(remotes, async ({ handler }) =>
asyncMap(
this._listVmBackups(handler, vmUuid, _ => _.mode === 'delta'),
({ _filename, vdis, vhds }) => {
const vmDir = dirname(_filename)
return asyncMap(vhds, async (vhdPath, vdiId) => {
const uuid = parseUuid(vdis[vdiId].uuid)
const deltaExport = await xapi.exportDeltaVm(
$cancelToken,
snapshot,
baseSnapshot
const vhd = new Vhd(handler, `${vmDir}/${vhdPath}`)
await vhd.readHeaderAndFooter()
if (!vhd.footer.uuid.equals(uuid)) {
vhd.footer.uuid = uuid
await vhd.readBlockAllocationTable()
await vhd.writeFooter()
}
})
}
)
)
let baseSnapshot, fullVdisRequired
await (async () => {
baseSnapshot = (last(snapshots): Vm | void)
if (baseSnapshot === undefined) {
return
}
const fullRequired = { __proto__: null }
const vdis: $Dict<Vdi> = getVmDisks(baseSnapshot)
for (const { $id: srId, xapi } of srs) {
const replicatedVm = listReplicatedVms(
xapi,
scheduleId,
srId,
vmUuid
).find(vm => vm.other_config[TAG_COPY_SRC] === baseSnapshot.uuid)
if (replicatedVm === undefined) {
baseSnapshot = undefined
return
}
const replicatedVdis = countBy(
getVmDisks(replicatedVm),
vdi => vdi.other_config[TAG_COPY_SRC]
)
forEach(vdis, vdi => {
if (!(vdi.uuid in replicatedVdis)) {
fullRequired[vdi.$snapshot_of.$id] = true
}
})
}
await asyncMap(remotes, ({ handler }) => {
return asyncMap(vdis, async vdi => {
const snapshotOf = vdi.$snapshot_of
const dir = `${vmDir}/vdis/${jobId}/${snapshotOf.uuid}`
const files = await handler
.list(dir, { filter: isVhd })
.catch(_ => [])
let full = true
await asyncMap(files, async file => {
if (file[0] !== '.') {
try {
const vhd = new Vhd(handler, `${dir}/${file}`)
await vhd.readHeaderAndFooter()
if (vhd.footer.uuid.equals(parseUuid(vdi.uuid))) {
full = false
}
return
} catch (error) {
if (!(error instanceof AssertionError)) {
throw error
}
}
}
// either a temporary file or an invalid VHD
await handler.unlink(`${dir}/${file}`)
})
if (full) {
fullRequired[snapshotOf.$id] = true
}
})
})
fullVdisRequired = Object.keys(fullRequired)
})()
const deltaExport = await wrapTask(
{
logger,
message: 'start snapshot export',
parentId: taskId,
},
xapi.exportDeltaVm($cancelToken, snapshot, baseSnapshot, {
fullVdisRequired,
})
)
const metadata: MetadataDelta = {
@@ -1031,21 +1209,23 @@ export default class BackupNg {
}
})()
const isFull = some(
deltaExport.vdis,
vdi => vdi.other_config['xo:base_delta'] === undefined
)
await waitAll(
[
...remotes.map(
wrapTaskFn(
id => ({
data: { id, type: 'remote' },
({ id }) => ({
data: { id, isFull, type: 'remote' },
logger,
message: 'export',
parentId: taskId,
}),
async (taskId, remoteId) => {
async (taskId, { handler, id: remoteId }) => {
const fork = forkExport()
const handler = await app.getRemoteHandler(remoteId)
const oldBackups: MetadataDelta[] = (getOldEntries(
exportRetention,
await this._listVmBackups(
@@ -1060,16 +1240,14 @@ export default class BackupNg {
logger,
message: 'merge',
parentId: taskId,
result: {
size: 0,
},
result: size => ({ size }),
},
this._deleteDeltaVmBackups(handler, oldBackups)
)
const deleteFirst =
exportRetention > 1 &&
getSetting(settings, 'deleteFirst', remoteId)
getSetting(settings, 'deleteFirst', [remoteId])
if (deleteFirst) {
await deleteOldBackups()
}
@@ -1079,9 +1257,7 @@ export default class BackupNg {
logger,
message: 'transfer',
parentId: taskId,
result: {
size: 0,
},
result: size => ({ size }),
},
asyncMap(
fork.vdis,
@@ -1093,13 +1269,19 @@ export default class BackupNg {
let parentPath
if (isDelta) {
const vdiDir = dirname(path)
const parent = (await handler.list(vdiDir))
.filter(isVhd)
parentPath = (await handler.list(vdiDir, {
filter: filename =>
!isHiddenFile(filename) && isVhd(filename),
prependDir: true,
}))
.sort()
.pop()
parentPath = `${vdiDir}/${parent}`
// ensure parent exists and is a valid VHD
await new Vhd(handler, parentPath).readHeaderAndFooter()
}
// FIXME: should only be renamed after the metadata file has been written
await writeStream(
fork.streams[`${id}.vhd`](),
handler,
@@ -1115,8 +1297,17 @@ export default class BackupNg {
if (isDelta) {
await chainVhd(handler, parentPath, handler, path)
}
// set the correct UUID in the VHD
const vhd = new Vhd(handler, path)
await vhd.readHeaderAndFooter()
vhd.footer.uuid = parseUuid(vdi.uuid)
await vhd.readBlockAllocationTable() // required by writeFooter()
await vhd.writeFooter()
return handler.getSize(path)
})
)
).then(sum)
)
await handler.outputFile(metadataFilename, jsonMetadata)
@@ -1128,24 +1319,23 @@ export default class BackupNg {
),
...srs.map(
wrapTaskFn(
id => ({
data: { id, type: 'SR' },
({ $id: id }) => ({
data: { id, isFull, type: 'SR' },
logger,
message: 'export',
parentId: taskId,
}),
async (taskId, srId) => {
async (taskId, sr) => {
const fork = forkExport()
const xapi = app.getXapi(srId)
const sr = xapi.getObject(srId)
const { $id: srId, xapi } = sr
const oldVms = getOldEntries(
exportRetention,
copyRetention,
listReplicatedVms(xapi, scheduleId, srId, vmUuid)
)
const deleteFirst = getSetting(settings, 'deleteFirst', srId)
const deleteFirst = getSetting(settings, 'deleteFirst', [srId])
if (deleteFirst) {
await this._deleteVms(xapi, oldVms)
}
@@ -1155,16 +1345,14 @@ export default class BackupNg {
logger,
message: 'transfer',
parentId: taskId,
result: {
size: 0,
},
result: ({ transferSize }) => ({ size: transferSize }),
},
xapi.importDeltaVm(fork, {
disableStartAfterImport: false, // we'll take care of that
name_label: `${metadata.vm.name_label} (${safeDateFormat(
metadata.timestamp
)})`,
srId: sr.$id,
srId,
})
)
@@ -1196,18 +1384,17 @@ export default class BackupNg {
async _deleteDeltaVmBackups (
handler: RemoteHandler,
backups: MetadataDelta[]
): Promise<void> {
await asyncMap(backups, async backup => {
): Promise<number> {
return asyncMap(backups, async backup => {
const filename = ((backup._filename: any): string)
return Promise.all([
handler.unlink(filename),
asyncMap(backup.vhds, _ =>
// $FlowFixMe injected $defer param
this._deleteVhd(handler, resolveRelativeFromFile(filename, _))
),
])
})
await handler.unlink(filename)
return asyncMap(backup.vhds, _ =>
// $FlowFixMe injected $defer param
this._deleteVhd(handler, resolveRelativeFromFile(filename, _))
).then(sum)
}).then(sum)
}
async _deleteFullVmBackups (
@@ -1225,7 +1412,11 @@ export default class BackupNg {
// FIXME: synchronize by job/VDI, otherwise it can cause issues with the merge
@defer
async _deleteVhd ($defer: any, handler: RemoteHandler, path: string) {
async _deleteVhd (
$defer: any,
handler: RemoteHandler,
path: string
): Promise<number> {
const vhds = await asyncMap(
await handler.list(dirname(path), { filter: isVhd, prependDir: true }),
async path => {
@@ -1250,19 +1441,21 @@ export default class BackupNg {
_ => _ !== undefined && _.header.parentUnicodeName === base
)
if (child === undefined) {
return handler.unlink(path)
await handler.unlink(path)
return 0
}
$defer.onFailure.call(handler, 'unlink', path)
const childPath = child.path
await this._app.worker.mergeVhd(
const mergedDataSize: number = await this._app.worker.mergeVhd(
handler._remote,
path,
handler._remote,
childPath
)
await handler.rename(path, childPath)
return mergedDataSize
}
async _deleteVms (xapi: Xapi, vms: Vm[]): Promise<void> {
@@ -1307,62 +1500,4 @@ export default class BackupNg {
return backups.sort(compareTimestamp)
}
async getBackupNgLogs (runId?: string): Promise<ConsolidatedBackupNgLog> {
const rawLogs = await this._app.getLogs('jobs')
const logs: $Dict<ConsolidatedJob & ConsolidatedTask> = {}
forEach(rawLogs, (log, id) => {
const { data, time, message } = log
const { event } = data
delete data.event
switch (event) {
case 'job.start':
if (data.type === 'backup' && (runId === undefined || runId === id)) {
logs[id] = {
...data,
id,
start: time,
}
}
break
case 'job.end':
const job = logs[data.runJobId]
if (job !== undefined) {
job.end = time
job.duration = time - job.start
job.error = data.error
}
break
case 'task.start':
if (logs[data.parentId] !== undefined) {
logs[id] = {
...data,
start: time,
message,
}
}
break
case 'task.end':
const task = logs[data.taskId]
if (task !== undefined) {
// work-around
if (
time === task.start &&
(message === 'merge' || message === 'tranfer')
) {
delete logs[data.taskId]
} else {
task.status = data.status
task.taskId = data.taskId
task.result = data.result
task.end = time
task.duration = time - task.start
}
}
}
})
return groupBy(logs, log => log.parentId || 'roots')
}
}
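The retention checks in this file enforce a small set of compatibility rules between copy, export and snapshot retentions and the configured targets. They can be restated as a standalone checker (an illustrative restatement, not the actual exported helper):

```javascript
// Sketch of the retention compatibility rules enforced above.
// (Assumption: standalone restatement for illustration only.)
function checkRetentions ({
  copyRetention,
  exportRetention,
  snapshotRetention,
  nSrs,
  nRemotes,
}) {
  // exporting requires at least one remote
  if (exportRetention !== 0 && nRemotes === 0) {
    throw new Error('export retention must be 0 without remotes')
  }
  // copying (replication) requires at least one SR
  if (copyRetention !== 0 && nSrs === 0) {
    throw new Error('copy retention must be 0 without SRs')
  }
  // with both target kinds configured, the two retentions must be
  // either both zero or both non-zero
  if (
    nRemotes !== 0 &&
    nSrs !== 0 &&
    (copyRetention === 0) !== (exportRetention === 0)
  ) {
    throw new Error('both or neither copy and export retentions must be 0')
  }
  // the job must keep something somewhere
  if (copyRetention === 0 && exportRetention === 0 && snapshotRetention === 0) {
    throw new Error('copy, export and snapshot retentions cannot all be 0')
  }
}
```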


@@ -141,7 +141,9 @@ const listPartitions = (() => {
valueTransform: (value, key) =>
key === 'start' || key === 'size'
? +value
: key === 'type' ? TYPES[+value] || value : value,
: key === 'type'
? TYPES[+value] || value
: value,
})
return device =>
@@ -903,6 +905,8 @@ export default class {
const xapi = this._xo.getXapi(vm)
vm = xapi.getObject(vm._xapiId)
xapi._assertHealthyVdiChains(vm)
const reg = new RegExp(
'^rollingSnapshot_[^_]+_' + escapeStringRegexp(tag) + '_'
)


@@ -0,0 +1,71 @@
// @flow
import { noSuchObject } from 'xo-common/api-errors'
import Collection from '../collection/redis'
import patch from '../patch'
type CloudConfig = {|
id: string,
name: string,
template: string,
|}
class CloudConfigs extends Collection {
get (properties) {
return super.get(properties)
}
}
export default class {
_app: any
_db: {|
add: Function,
first: Function,
get: Function,
remove: Function,
update: Function,
|}
constructor (app: any) {
this._app = app
const db = (this._db = new CloudConfigs({
connection: app._redis,
prefix: 'xo:cloudConfig',
}))
app.on('clean', () => db.rebuildIndexes())
app.on('start', () =>
app.addConfigManager(
'cloudConfigs',
() => db.get(),
cloudConfigs => db.update(cloudConfigs)
)
)
}
createCloudConfig (cloudConfig: $Diff<CloudConfig, {| id: string |}>) {
return this._db.add(cloudConfig).properties
}
async updateCloudConfig ({ id, name, template }: $Shape<CloudConfig>) {
const cloudConfig = await this.getCloudConfig(id)
patch(cloudConfig, { name, template })
return this._db.update(cloudConfig)
}
deleteCloudConfig (id: string) {
return this._db.remove(id)
}
getAllCloudConfigs (): Promise<Array<CloudConfig>> {
return this._db.get()
}
async getCloudConfig (id: string): Promise<CloudConfig> {
const cloudConfig = await this._db.first(id)
if (cloudConfig === null) {
throw noSuchObject(id, 'cloud config')
}
return cloudConfig.properties
}
}


@@ -2,7 +2,7 @@
import type { Pattern } from 'value-matcher'
import { cancelable } from 'promise-toolbox'
import { CancelToken } from 'promise-toolbox'
import { map as mapToArray } from 'lodash'
import { noSuchObject } from 'xo-common/api-errors'
@@ -60,6 +60,7 @@ export type CallJob = {|
export type Executor = ({|
app: Object,
cancelToken: any,
data: any,
job: Job,
logger: Logger,
runJobId: string,
@@ -120,7 +121,12 @@ export default class Jobs {
_executors: { __proto__: null, [string]: Executor }
_jobs: JobsDb
_logger: Logger
_runningJobs: { __proto__: null, [string]: boolean }
_runningJobs: { __proto__: null, [string]: string }
_runs: { __proto__: null, [string]: () => void }
get runningJobs () {
return this._runningJobs
}
constructor (xo: any) {
this._app = xo
@@ -132,6 +138,7 @@ export default class Jobs {
}))
this._logger = undefined
this._runningJobs = { __proto__: null }
this._runs = { __proto__: null }
executors.call = executeCall
@@ -150,6 +157,13 @@ export default class Jobs {
})
}
cancelJobRun (id: string) {
const run = this._runs[id]
if (run !== undefined) {
return run.cancel()
}
}
async getAllJobs (type?: string): Promise<Array<Job>> {
// $FlowFixMe don't know what is the problem (JFT)
const jobs = await this._jobs.get()
@@ -201,7 +215,7 @@ export default class Jobs {
return /* await */ this._jobs.remove(id)
}
async _runJob (cancelToken: any, job: Job, schedule?: Schedule) {
async _runJob (job: Job, schedule?: Schedule, data_?: any) {
const { id } = job
const runningJobs = this._runningJobs
@@ -232,6 +246,7 @@ export default class Jobs {
event: 'job.start',
userId: job.userId,
jobId: id,
scheduleId: schedule?.id,
// $FlowFixMe only defined for CallJob
key: job.key,
type,
@@ -239,15 +254,21 @@ export default class Jobs {
runningJobs[id] = runJobId
const runs = this._runs
const { cancel, token } = CancelToken.source()
runs[runJobId] = { cancel }
let session
try {
const app = this._app
session = app.createUserConnection()
session.set('user_id', job.userId)
await executor({
const status = await executor({
app,
cancelToken,
cancelToken: token,
data: data_,
job,
logger,
runJobId,
@@ -259,7 +280,7 @@ export default class Jobs {
runJobId,
})
app.emit('job:terminated', runJobId, job, schedule)
app.emit('job:terminated', status, job, schedule, runJobId)
} catch (error) {
logger.error(`The execution of ${id} has failed.`, {
event: 'job.end',
@@ -269,27 +290,24 @@ export default class Jobs {
throw error
} finally {
delete runningJobs[id]
delete runs[runJobId]
if (session !== undefined) {
session.close()
}
}
}
@cancelable
async runJobSequence (
$cancelToken: any,
idSequence: Array<string>,
schedule?: Schedule
schedule?: Schedule,
data?: any
) {
const jobs = await Promise.all(
mapToArray(idSequence, id => this.getJob(id))
)
for (const job of jobs) {
if ($cancelToken.requested) {
break
}
await this._runJob($cancelToken, job, schedule)
await this._runJob(job, schedule, data)
}
}
}
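The jobs hunks above store a `cancel` function per run (`runs[runJobId] = { cancel }`, from `CancelToken.source()`), which is what `cancelJobRun` later invokes. The wiring can be sketched with a minimal token source (an assumed stand-in for promise-toolbox's `CancelToken.source()`):

```javascript
// Sketch of the cancellation wiring: each run registers a cancel
// function under its run id; cancelJobRun looks it up and calls it.
// (Assumption: createTokenSource is a minimal stand-in for
// promise-toolbox's CancelToken.source().)
function createTokenSource () {
  let cancelled = false
  return {
    cancel: () => {
      cancelled = true
    },
    token: {
      get requested () {
        return cancelled
      },
    },
  }
}

const runs = Object.create(null)

function startRun (runJobId) {
  const { cancel, token } = createTokenSource()
  runs[runJobId] = { cancel }
  return token
}

function cancelJobRun (id) {
  const run = runs[id]
  if (run !== undefined) {
    return run.cancel()
  }
}
```

The executor receives the `token` and can poll `token.requested` (or, with the real library, subscribe to it) to abort cooperatively.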


@@ -23,7 +23,6 @@ export default class {
Promise.all(mapToArray(remotes, remote => this._remotes.save(remote)))
)
await this.initRemotes()
await this.syncAllRemotes()
})
xo.on('stop', () => this.forgetAllRemotes())
@@ -111,15 +110,4 @@ export default class {
} catch (_) {}
}
}
// TODO: Should it be private?
async initRemotes () {
const remotes = await this.getAllRemotes()
if (!remotes || !remotes.length) {
await this.createRemote({
name: 'default',
url: 'file://var/lib/xoa-backups',
})
}
}
}


@@ -1,6 +1,6 @@
{
"name": "xo-vmdk-to-vhd",
"version": "0.1.1",
"version": "0.1.3",
"license": "AGPL-3.0",
"description": "JS lib streaming a vmdk file to a vhd",
"keywords": [
@@ -23,23 +23,23 @@
"node": ">=4"
},
"dependencies": {
"@babel/runtime": "^7.0.0-beta.44",
"@babel/runtime": "^7.0.0-beta.49",
"child-process-promise": "^2.0.3",
"pipette": "^0.9.3",
"promise-toolbox": "^0.9.5",
"tmp": "^0.0.33",
"vhd-lib": "^0.0.0"
"vhd-lib": "^0.2.0"
},
"devDependencies": {
"@babel/cli": "7.0.0-beta.44",
"@babel/core": "7.0.0-beta.44",
"@babel/plugin-transform-runtime": "^7.0.0-beta.44",
"@babel/preset-env": "7.0.0-beta.44",
"@babel/cli": "7.0.0-beta.49",
"@babel/core": "7.0.0-beta.49",
"@babel/plugin-transform-runtime": "^7.0.0-beta.49",
"@babel/preset-env": "7.0.0-beta.49",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"event-to-promise": "^0.8.0",
"execa": "^0.10.0",
"fs-extra": "^5.0.0",
"fs-extra": "^6.0.1",
"get-stream": "^3.0.0",
"index-modules": "^0.3.0",
"rimraf": "^2.6.2"


@@ -1,6 +1,7 @@
import { createReadableSparseStream } from 'vhd-lib'
import { VMDKDirectParser, readVmdkGrainTable } from './vmdk-read'
import VMDKDirectParser from './vmdk-read'
import readVmdkGrainTable from './vmdk-read-table'
async function convertFromVMDK (vmdkReadStream, table) {
const parser = new VMDKDirectParser(vmdkReadStream)


@@ -0,0 +1,97 @@
const SECTOR_SIZE = 512
const HEADER_SIZE = 512
const FOOTER_POSITION = -1024
const DISK_CAPACITY_OFFSET = 12
const GRAIN_SIZE_OFFSET = 20
const NUM_GTE_PER_GT_OFFSET = 44
const GRAIN_ADDRESS_OFFSET = 56
/**
*
 * the grain table is the array of LBAs (in bytes, not in sectors) ordered by their position in the VMDK file
 * THIS CODE RUNS ON THE BROWSER
*/
export default async function readVmdkGrainTable (fileAccessor) {
const getLongLong = (buffer, offset, name) => {
if (buffer.length < offset + 8) {
throw new Error(
`buffer ${name} is too short, expecting ${offset + 8} minimum, got ${
buffer.length
}`
)
}
const dataView = new DataView(buffer)
const res = dataView.getUint32(offset, true)
const highBits = dataView.getUint32(offset + 4, true)
const MANTISSA_BITS_IN_DOUBLE = 53
if (highBits >= Math.pow(2, MANTISSA_BITS_IN_DOUBLE - 32)) {
throw new Error(
'Unsupported file, high order bits are too high in field ' + name
)
}
return res + highBits * Math.pow(2, 32)
}
let headerBuffer = await fileAccessor(0, HEADER_SIZE)
let grainAddrBuffer = headerBuffer.slice(
GRAIN_ADDRESS_OFFSET,
GRAIN_ADDRESS_OFFSET + 8
)
if (
new Int8Array(grainAddrBuffer).reduce((acc, val) => acc && val === -1, true)
) {
headerBuffer = await fileAccessor(
FOOTER_POSITION,
FOOTER_POSITION + HEADER_SIZE
)
grainAddrBuffer = headerBuffer.slice(
GRAIN_ADDRESS_OFFSET,
GRAIN_ADDRESS_OFFSET + 8
)
}
const grainDirPosBytes =
getLongLong(grainAddrBuffer, 0, 'grain directory address') * SECTOR_SIZE
const capacity =
getLongLong(headerBuffer, DISK_CAPACITY_OFFSET, 'capacity') * SECTOR_SIZE
const grainSize =
getLongLong(headerBuffer, GRAIN_SIZE_OFFSET, 'grain size') * SECTOR_SIZE
const grainCount = Math.ceil(capacity / grainSize)
const numGTEsPerGT = getLongLong(
headerBuffer,
NUM_GTE_PER_GT_OFFSET,
'num GTE per GT'
)
const grainTablePhysicalSize = numGTEsPerGT * 4
const grainDirectoryEntries = Math.ceil(grainCount / numGTEsPerGT)
const grainDirectoryPhysicalSize = grainDirectoryEntries * 4
const grainDirBuffer = await fileAccessor(
grainDirPosBytes,
grainDirPosBytes + grainDirectoryPhysicalSize
)
const grainDir = new Uint32Array(grainDirBuffer)
const cachedGrainTables = []
for (let i = 0; i < grainDirectoryEntries; i++) {
const grainTableAddr = grainDir[i] * SECTOR_SIZE
if (grainTableAddr !== 0) {
cachedGrainTables[i] = new Uint32Array(
await fileAccessor(
grainTableAddr,
grainTableAddr + grainTablePhysicalSize
)
)
}
}
const extractedGrainTable = []
for (let i = 0; i < grainCount; i++) {
const directoryEntry = Math.floor(i / numGTEsPerGT)
const grainTable = cachedGrainTables[directoryEntry]
if (grainTable !== undefined) {
const grainAddr = grainTable[i % numGTEsPerGT]
if (grainAddr !== 0) {
extractedGrainTable.push([i, grainAddr])
}
}
}
extractedGrainTable.sort(
([i1, grainAddress1], [i2, grainAddress2]) => grainAddress1 - grainAddress2
)
return extractedGrainTable.map(([index, grainAddress]) => index * grainSize)
}
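The `getLongLong` helper in the new file above reads a 64-bit little-endian field as two 32-bit halves, because a JS number only carries 53 mantissa bits; values whose high half would lose precision are rejected. The same read, extracted as a self-contained sketch (note: this sketch uses `byteLength`, since `fileAccessor` hands back `ArrayBuffer`s):

```javascript
// Sketch of the 64-bit little-endian read above: two 32-bit reads,
// with a range check on the high half so the result stays exactly
// representable in a double (53 mantissa bits).
function getLongLong (buffer, offset, name) {
  if (buffer.byteLength < offset + 8) {
    throw new Error(
      `buffer ${name} is too short, expecting ${offset + 8} minimum, got ${
        buffer.byteLength
      }`
    )
  }
  const dataView = new DataView(buffer)
  const low = dataView.getUint32(offset, true)
  const high = dataView.getUint32(offset + 4, true)
  const MANTISSA_BITS_IN_DOUBLE = 53
  if (high >= Math.pow(2, MANTISSA_BITS_IN_DOUBLE - 32)) {
    throw new Error(
      'Unsupported file, high order bits are too high in field ' + name
    )
  }
  return low + high * Math.pow(2, 32)
}
```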


@@ -6,7 +6,7 @@ import { fromCallback as pFromCallback } from 'promise-toolbox'
import rimraf from 'rimraf'
import tmp from 'tmp'
import { VMDKDirectParser } from './vmdk-read'
import VMDKDirectParser from './vmdk-read'
jest.setTimeout(10000)


@@ -4,7 +4,9 @@ import zlib from 'zlib'
import { VirtualBuffer } from './virtual-buffer'
const sectorSize = 512
const SECTOR_SIZE = 512
const HEADER_SIZE = 512
const VERSION_OFFSET = 4
const compressionDeflate = 'COMPRESSION_DEFLATE'
const compressionNone = 'COMPRESSION_NONE'
const compressionMap = [compressionNone, compressionDeflate]
@@ -119,7 +121,7 @@ function parseHeader (buffer) {
}
}
async function readGrain (offsetSectors, buffer, compressed) {
const offset = offsetSectors * sectorSize
const offset = offsetSectors * SECTOR_SIZE
const size = buffer.readUInt32LE(offset + 8)
const grainBuffer = buffer.slice(offset + 12, offset + 12 + size)
const grainContent = compressed
@@ -130,7 +132,7 @@ async function readGrain (offsetSectors, buffer, compressed) {
offsetSectors: offsetSectors,
offset,
lba,
lbaBytes: lba * sectorSize,
lbaBytes: lba * SECTOR_SIZE,
size,
buffer: grainBuffer,
grain: grainContent,
@@ -146,10 +148,10 @@ function tryToParseMarker (buffer) {
}
function alignSectors (number) {
return Math.ceil(number / sectorSize) * sectorSize
return Math.ceil(number / SECTOR_SIZE) * SECTOR_SIZE
}
export class VMDKDirectParser {
export default class VMDKDirectParser {
constructor (readStream) {
this.virtualBuffer = new VirtualBuffer(readStream)
this.header = null
@@ -177,9 +179,9 @@ export class VMDKDirectParser {
l2IsContiguous = l2IsContiguous && l1Entry - previousL1Entry === 4
} else {
l2IsContiguous =
l1Entry * sectorSize === this.virtualBuffer.position ||
l1Entry * sectorSize === this.virtualBuffer.position + 512
l2Start = l1Entry * sectorSize
l1Entry * SECTOR_SIZE === this.virtualBuffer.position ||
l1Entry * SECTOR_SIZE === this.virtualBuffer.position + SECTOR_SIZE
l2Start = l1Entry * SECTOR_SIZE
}
}
if (!l2IsContiguous) {
@@ -200,37 +202,29 @@ export class VMDKDirectParser {
l2ByteSize,
'L2 table ' + position
)
let grainsAreInAscendingOrder = true
let previousL2Entry = 0
let firstGrain = null
for (let i = 0; i < l2entries; i++) {
const l2Entry = l2Buffer.readUInt32LE(i * 4)
if (i > 0 && previousL2Entry !== 0 && l2Entry !== 0) {
grainsAreInAscendingOrder =
grainsAreInAscendingOrder && previousL2Entry < l2Entry
}
previousL2Entry = l2Entry
if (firstGrain === null) {
firstGrain = l2Entry
}
}
if (!grainsAreInAscendingOrder) {
// TODO: here we could transform the file to a sparse VHD on the fly because we have the complete table
throw new Error('Unsupported file format')
}
const freeSpace = firstGrain * sectorSize - this.virtualBuffer.position
const freeSpace = firstGrain * SECTOR_SIZE - this.virtualBuffer.position
if (freeSpace > 0) {
await this.virtualBuffer.readChunk(freeSpace, 'freeSpace after L2')
}
}
async readHeader () {
const headerBuffer = await this.virtualBuffer.readChunk(512, 'readHeader')
const headerBuffer = await this.virtualBuffer.readChunk(
HEADER_SIZE,
'readHeader'
)
const magicString = headerBuffer.slice(0, 4).toString('ascii')
if (magicString !== 'KDMV') {
throw new Error('not a VMDK file')
}
const version = headerBuffer.readUInt32LE(4)
const version = headerBuffer.readUInt32LE(VERSION_OFFSET)
if (version !== 1 && version !== 3) {
throw new Error(
'unsupported VMDK version ' +
@@ -240,7 +234,7 @@ export class VMDKDirectParser {
}
this.header = parseHeader(headerBuffer)
// I think the multiplications are OK, because the descriptor is always at the beginning of the file
const descriptorLength = this.header.descriptorSizeSectors * sectorSize
const descriptorLength = this.header.descriptorSizeSectors * SECTOR_SIZE
const descriptorBuffer = await this.virtualBuffer.readChunk(
descriptorLength,
'descriptor'
@@ -251,16 +245,16 @@ export class VMDKDirectParser {
this.header.grainDirectoryOffsetSectors !== -1 &&
this.header.grainDirectoryOffsetSectors !== 0
) {
l1PositionBytes = this.header.grainDirectoryOffsetSectors * sectorSize
l1PositionBytes = this.header.grainDirectoryOffsetSectors * SECTOR_SIZE
}
const endOfDescriptor = this.virtualBuffer.position
if (
l1PositionBytes !== null &&
(l1PositionBytes === endOfDescriptor ||
l1PositionBytes === endOfDescriptor + sectorSize)
l1PositionBytes === endOfDescriptor + SECTOR_SIZE)
) {
if (l1PositionBytes === endOfDescriptor + sectorSize) {
await this.virtualBuffer.readChunk(sectorSize, 'skipping L1 marker')
if (l1PositionBytes === endOfDescriptor + SECTOR_SIZE) {
await this.virtualBuffer.readChunk(SECTOR_SIZE, 'skipping L1 marker')
}
await this._readL1()
}
@@ -271,7 +265,7 @@ export class VMDKDirectParser {
while (!this.virtualBuffer.isDepleted) {
const position = this.virtualBuffer.position
const sector = await this.virtualBuffer.readChunk(
512,
SECTOR_SIZE,
'marker start ' + position
)
if (sector.length === 0) {
@@ -281,14 +275,14 @@ export class VMDKDirectParser {
if (marker.size === 0) {
if (marker.value !== 0) {
await this.virtualBuffer.readChunk(
marker.value * sectorSize,
marker.value * SECTOR_SIZE,
'other marker value ' + this.virtualBuffer.position
)
}
} else if (marker.size > 10) {
const grainDiskSize = marker.size + 12
const alignedGrainDiskSize = alignSectors(grainDiskSize)
const remainOfBufferSize = alignedGrainDiskSize - sectorSize
const remainOfBufferSize = alignedGrainDiskSize - SECTOR_SIZE
const remainderOfGrainBuffer = await this.virtualBuffer.readChunk(
remainOfBufferSize,
'grain remainder ' + this.virtualBuffer.position
@@ -305,60 +299,3 @@ export class VMDKDirectParser {
}
}
}
export async function readVmdkGrainTable (fileAccessor) {
let headerBuffer = await fileAccessor(0, 512)
let grainAddrBuffer = headerBuffer.slice(56, 56 + 8)
if (
new Int8Array(grainAddrBuffer).reduce((acc, val) => acc && val === -1, true)
) {
headerBuffer = await fileAccessor(-1024, -1024 + 512)
grainAddrBuffer = headerBuffer.slice(56, 56 + 8)
}
const grainDirPosBytes =
new DataView(grainAddrBuffer).getUint32(0, true) * 512
const capacity =
new DataView(headerBuffer.slice(12, 12 + 8)).getUint32(0, true) * 512
const grainSize =
new DataView(headerBuffer.slice(20, 20 + 8)).getUint32(0, true) * 512
const grainCount = Math.ceil(capacity / grainSize)
const numGTEsPerGT = new DataView(headerBuffer.slice(44, 44 + 8)).getUint32(
0,
true
)
const grainTablePhysicalSize = numGTEsPerGT * 4
const grainDirectoryEntries = Math.ceil(grainCount / numGTEsPerGT)
const grainDirectoryPhysicalSize = grainDirectoryEntries * 4
const grainDirBuffer = await fileAccessor(
grainDirPosBytes,
grainDirPosBytes + grainDirectoryPhysicalSize
)
const grainDir = new Uint32Array(grainDirBuffer)
const cachedGrainTables = []
for (let i = 0; i < grainDirectoryEntries; i++) {
const grainTableAddr = grainDir[i] * 512
if (grainTableAddr !== 0) {
cachedGrainTables[i] = new Uint32Array(
await fileAccessor(
grainTableAddr,
grainTableAddr + grainTablePhysicalSize
)
)
}
}
const extractedGrainTable = []
for (let i = 0; i < grainCount; i++) {
const directoryEntry = Math.floor(i / numGTEsPerGT)
const grainTable = cachedGrainTables[directoryEntry]
if (grainTable !== undefined) {
const grainAddr = grainTable[i % numGTEsPerGT]
if (grainAddr !== 0) {
extractedGrainTable.push([i, grainAddr])
}
}
}
extractedGrainTable.sort(
([i1, grainAddress1], [i2, grainAddress2]) => grainAddress1 - grainAddress2
)
return extractedGrainTable.map(([index, grainAddress]) => index * grainSize)
}
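`readVmdkGrainTable`'s `fileAccessor` contract is `(start, end) → Promise<ArrayBuffer>`, with negative offsets relative to the end of the file (used for the footer read above). A minimal in-memory sketch, assuming that contract:

```javascript
// Resolve possibly-negative offsets against the buffer length, as the
// footer read fileAccessor(-1024, -1024 + 512) above requires.
const sliceFile = (buffer, start, end) =>
  start < 0
    ? buffer.slice(buffer.byteLength + start, buffer.byteLength + end)
    : buffer.slice(start, end)

// Hypothetical accessor over an in-memory ArrayBuffer; a real one would
// wrap fs.read or HTTP range requests.
const makeFileAccessor = buffer => async (start, end) => sliceFile(buffer, start, end)
```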

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "xo-web",
"version": "5.19.4",
"version": "5.20.2",
"license": "AGPL-3.0",
"description": "Web interface client for Xen-Orchestra",
"keywords": [
@@ -30,7 +30,7 @@
"node": ">=6"
},
"devDependencies": {
"@julien-f/freactal": "0.1.0",
"@julien-f/freactal": "0.1.1",
"@nraynaud/novnc": "0.6.1",
"@xen-orchestra/cron": "^1.0.3",
"ansi_up": "^3.0.0",
@@ -89,8 +89,8 @@
"lodash": "^4.6.1",
"loose-envify": "^1.1.0",
"make-error": "^1.3.2",
"marked": "^0.3.9",
"modular-cssify": "^8.0.0",
"marked": "^0.4.0",
"modular-cssify": "^10.0.0",
"moment": "^2.20.1",
"moment-timezone": "^0.5.14",
"notifyjs": "^3.0.0",
@@ -120,7 +120,7 @@
"react-test-renderer": "^15.6.2",
"react-virtualized": "^9.15.0",
"readable-stream": "^2.3.3",
"redux": "^3.7.2",
"redux": "^4.0.0",
"redux-thunk": "^2.0.1",
"reselect": "^2.5.4",
"rimraf": "^2.6.2",
@@ -134,11 +134,11 @@
"watchify": "^3.7.0",
"whatwg-fetch": "^2.0.3",
"xml2js": "^0.4.19",
"xo-acl-resolver": "^0.2.3",
"xo-acl-resolver": "^0.2.4",
"xo-common": "^0.1.1",
"xo-lib": "^0.8.0",
"xo-remote-parser": "^0.3",
"xo-vmdk-to-vhd": "^0.1.1"
"xo-vmdk-to-vhd": "^0.1.3"
},
"scripts": {
"build": "NODE_ENV=production gulp build",

View File

@@ -7,26 +7,22 @@ const call = fn => fn()
// callbacks have been correctly initialized when there are circular dependencies
const addSubscriptions = subscriptions => Component =>
class SubscriptionWrapper extends React.PureComponent {
constructor () {
super()
// provide all props since the beginning (better behavior with Freactal)
const state = (this.state = {})
Object.keys(subscriptions).forEach(key => {
state[key] = undefined
})
}
_unsubscribes = null
componentWillMount () {
const state = {}
this._unsubscribes = map(
typeof subscriptions === 'function'
? subscriptions(this.props)
: subscriptions,
(subscribe, prop) =>
subscribe(value => this.setState({ [prop]: value }))
(subscribe, prop) => {
state[prop] = undefined
return subscribe(value => this.setState({ [prop]: value }))
}
)
// provide all props since the beginning (better behavior with Freactal)
this.setState(state)
}
componentWillUnmount () {

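The subscribe functions consumed above follow a simple contract: `subscribe(cb)` immediately pushes the current value and returns an unsubscribe function. A self-contained sketch of that contract (names are assumptions):

```javascript
// Minimal publisher matching the subscribe(cb) => unsubscribe contract
// that addSubscriptions relies on.
const makeSubscription = initial => {
  let value = initial
  const listeners = new Set()
  return {
    subscribe: cb => {
      cb(value) // push the current value right away
      listeners.add(cb)
      return () => listeners.delete(cb)
    },
    set: next => {
      value = next
      listeners.forEach(cb => cb(next))
    },
  }
}
```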
View File

@@ -20,7 +20,10 @@ export const Card = propTypes({
shadow: propTypes.bool,
})(({ shadow, ...props }) => {
props.className = 'card'
props.style = shadow ? CARD_STYLE_WITH_SHADOW : CARD_STYLE
props.style = {
...props.style,
...(shadow ? CARD_STYLE_WITH_SHADOW : CARD_STYLE),
}
return <div {...props} />
})

View File

@@ -0,0 +1,37 @@
import _ from 'intl'
import React from 'react'
import { map } from 'lodash'
import Icon from './icon'
import Tooltip from './tooltip'
import { alert } from './modal'
const AVAILABLE_TEMPLATE_VARS = {
'{name}': 'templateNameInfo',
'%': 'templateIndexInfo',
}
const showAvailableTemplateVars = () =>
alert(
_('availableTemplateVarsTitle'),
<ul>
{map(AVAILABLE_TEMPLATE_VARS, (value, key) => (
<li key={key}>{_.keyValue(key, _(value))}</li>
))}
</ul>
)
export const AvailableTemplateVars = () => (
<Tooltip content={_('availableTemplateVarsInfo')}>
<a
className='text-info'
style={{ cursor: 'pointer' }}
onClick={showAvailableTemplateVars}
>
<Icon icon='info' />
</a>
</Tooltip>
)
export const DEFAULT_CLOUD_CONFIG_TEMPLATE =
'#cloud-config\n#hostname: {name}%\n#ssh_authorized_keys:\n# - ssh-rsa <myKey>\n#packages:\n# - htop\n'
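A hypothetical sketch of how the two template variables could be expanded (the real substitution happens elsewhere in xo-web; `fillTemplate` is an assumed name):

```javascript
// {name} is replaced by the VM's name, % by its index (0 for a single VM).
const fillTemplate = (template, name, index) =>
  template.replace(/\{name\}/g, name).replace(/%/g, String(index))
```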

View File

@@ -142,15 +142,15 @@ export default class Select extends React.PureComponent {
simpleValue,
value,
} = props
let option
if (
autoSelectSingleOption &&
options != null &&
options.length === 1 &&
(value == null ||
(simpleValue && value === '') ||
(multi && value.length === 0))
(multi && value.length === 0)) &&
([option] = options.filter(_ => !_.disabled)).length === 1
) {
const option = options[0]
props.onChange(
simpleValue ? option[props.valueKey] : multi ? [option] : option
)
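The new guard auto-selects only when exactly one non-disabled option remains; the same logic as a standalone helper:

```javascript
// Return the option to auto-select, or undefined when there is not
// exactly one enabled option.
const pickSingleEnabled = options => {
  const enabled = options.filter(o => !o.disabled)
  return enabled.length === 1 ? enabled[0] : undefined
}
```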

View File

@@ -71,6 +71,7 @@ const messages = {
settingsAclsPage: 'ACLs',
settingsPluginsPage: 'Plugins',
settingsLogsPage: 'Logs',
settingsCloudConfigsPage: 'Cloud configs',
settingsIpsPage: 'IPs',
settingsConfigPage: 'Config',
aboutPage: 'About',
@@ -83,6 +84,10 @@ const messages = {
newServerPage: 'Server',
newImport: 'Import',
xosan: 'XOSAN',
backupDeprecatedMessage:
'Warning: Backup is deprecated, use Backup NG instead.',
backupMigrationLink: 'How to migrate to Backup NG',
backupNgNewPage: 'Create a new backup with Backup NG',
backupOverviewPage: 'Overview',
backupNewPage: 'New',
backupRemotesPage: 'Remotes',
@@ -185,9 +190,17 @@ const messages = {
sortedTableNumberOfSelectedItems: '{nSelected, number} selected',
sortedTableSelectAllItems: 'Click here to select all items',
// ----- state -----
state: 'State',
stateDisabled: 'Disabled',
stateEnabled: 'Enabled',
// ----- Forms -----
formCancel: 'Cancel',
formCreate: 'Create',
formEdit: 'Edit',
formId: 'ID',
formName: 'Name',
formReset: 'Reset',
formSave: 'Save',
add: 'Add',
@@ -222,9 +235,10 @@ const messages = {
selectIp: 'Select IP(s)…',
selectIpPool: 'Select IP pool(s)…',
selectVgpuType: 'Select VGPU type(s)…',
fillRequiredInformations: 'Fill required informations.',
fillOptionalInformations: 'Fill informations (optional)',
fillRequiredInformations: 'Fill required information.',
fillOptionalInformations: 'Fill information (optional)',
selectTableReset: 'Reset',
selectCloudConfigs: 'Select Cloud Config(s)…',
// --- Dates/Scheduler ---
@@ -259,6 +273,9 @@ const messages = {
jobCallInProgess: 'In progress',
jobTransferredDataSize: 'Transfer size:',
jobTransferredDataSpeed: 'Transfer speed:',
operationSize: 'Size',
operationSpeed: 'Speed',
exportType: 'Type',
jobMergedDataSize: 'Merge size:',
jobMergedDataSpeed: 'Merge speed:',
allJobCalls: 'All',
@@ -276,12 +293,10 @@ const messages = {
jobAction: 'Action',
jobTag: 'Tag',
jobScheduling: 'Scheduling',
jobState: 'State',
jobStateEnabled: 'Enabled',
jobStateDisabled: 'Disabled',
jobTimezone: 'Timezone',
jobServerTimezone: 'Server',
runJob: 'Run job',
cancelJob: 'Cancel job',
runJobConfirm: 'Are you sure you want to run {backupType} {id} ({tag})?',
runJobVerbose: 'One shot running started. See overview for logs.',
jobEdit: 'Edit job',
@@ -306,6 +321,7 @@ const messages = {
taskMergedDataSize: 'Merge size',
taskMergedDataSpeed: 'Merge speed',
taskError: 'Error',
taskReason: 'Reason',
saveBackupJob: 'Save',
deleteBackupSchedule: 'Remove backup job',
deleteBackupScheduleQuestion:
@@ -317,11 +333,25 @@ const messages = {
jobEditMessage:
'You are editing job {name} ({id}). Saving will override previous job state.',
scheduleEdit: 'Edit schedule',
missingBackupName: 'A name is required to create the backup job!',
missingVms: 'Missing VMs!',
missingBackupMode: 'You need to choose a backup mode!',
missingRemotes: 'Missing remotes!',
missingSrs: 'Missing SRs!',
missingSchedules: 'Missing schedules!',
missingExportRetention:
'The Backup and Delta Backup modes require an export retention higher than 0!',
missingCopyRetention:
'The CR and DR modes require a copy retention higher than 0!',
missingSnapshotRetention:
'The Rolling Snapshot mode requires a snapshot retention higher than 0!',
retentionNeeded: 'One of the retentions needs to be higher than 0!',
scheduleAdd: 'Add a schedule',
scheduleDelete: 'Delete',
scheduleRun: 'Run schedule',
deleteSelectedSchedules: 'Delete selected schedules',
noScheduledJobs: 'No scheduled jobs.',
legacySnapshotsLink: 'You can delete all your legacy backup snapshots.',
newSchedule: 'New schedule',
noJobs: 'No jobs found.',
noSchedules: 'No schedules found',
@@ -335,6 +365,7 @@ const messages = {
jobUserNotFound: "This job's creator no longer exists",
backupUserNotFound: "This backup's creator no longer exists",
redirectToMatchingVms: 'Click here to see the matching VMs',
migrateToBackupNg: 'Migrate to backup NG',
noMatchingVms: 'There are no matching VMs!',
allMatchingVms: '{icon} See the matching VMs ({nMatchingVms, number})',
backupOwner: 'Backup owner',
@@ -342,6 +373,7 @@ const messages = {
migrateBackupScheduleMessage:
'This will migrate this backup to a backup NG. This operation is not reversible. Do you want to continue?',
runBackupNgJobConfirm: 'Are you sure you want to run {name} ({id})?',
cancelJobConfirm: 'Are you sure you want to cancel {name} ({id})?',
// ------ New backup -----
newBackupAdvancedSettings: 'Advanced settings',
@@ -349,15 +381,18 @@ const messages = {
reportWhenFailure: 'Failure',
reportWhenNever: 'Never',
reportWhen: 'Report when',
concurrency: 'Concurrency',
newBackupSelection: 'Select your backup type:',
smartBackupModeSelection: 'Select backup mode:',
normalBackup: 'Normal backup',
smartBackup: 'Smart backup',
exportRetention: 'Export retention',
copyRetention: 'Copy retention',
snapshotRetention: 'Snapshot retention',
backupName: 'Name',
useDelta: 'Use delta',
useCompression: 'Use compression',
offlineSnapshot: 'Offline snapshot',
dbAndDrRequireEntreprisePlan: 'Delta Backup and DR require Enterprise plan',
crRequiresPremiumPlan: 'CR requires Premium plan',
smartBackupModeTitle: 'Smart mode',
@@ -597,11 +632,15 @@ const messages = {
vmsTabName: 'VMs',
srsTabName: 'SRs',
// ----- Pool advanced tab -----
poolEditAll: 'Edit all',
poolEditRemoteSyslog: 'Edit remote syslog for all hosts',
poolHaStatus: 'High Availability',
poolHaEnabled: 'Enabled',
poolHaDisabled: 'Disabled',
setpoolMaster: 'Master',
poolGpuGroups: 'GPU groups',
poolRemoteSyslogPlaceHolder: 'Logging host',
setpoolMaster: 'Master',
syslogRemoteHost: 'Remote syslog host',
// ----- Pool host tab -----
hostNameLabel: 'Name',
hostDescription: 'Description',
@@ -681,6 +720,7 @@ const messages = {
hostLicenseType: 'Type',
hostLicenseSocket: 'Socket',
hostLicenseExpiry: 'Expiry',
hostRemoteSyslog: 'Remote syslog',
supplementalPacks: 'Installed supplemental packs',
supplementalPackNew: 'Install new supplemental pack',
supplementalPackPoolNew: 'Install supplemental pack on every host',
@@ -735,6 +775,7 @@ const messages = {
patchNameLabel: 'Name',
patchUpdateButton: 'Install all patches',
patchDescription: 'Description',
patchVersion: 'Version',
patchApplied: 'Applied date',
patchSize: 'Size',
patchStatus: 'Status',
@@ -752,6 +793,15 @@ const messages = {
'This will install a patch only on this host. This is NOT the recommended way: please go into the Pool patch view and follow instructions there. If you are sure about this, you can continue anyway',
installPatchWarningReject: 'Go to pool',
installPatchWarningResolve: 'Install',
patchRelease: 'Release',
updatePluginNotInstalled:
'Update plugin is not installed on this host. Please run `yum install xcp-ng-updater` first.',
showChangelog: 'Show changelog',
changelog: 'Changelog',
changelogPatch: 'Patch',
changelogAuthor: 'Author',
changelogDate: 'Date',
changelogDescription: 'Description',
// ----- Pool patch tabs -----
refreshPatches: 'Refresh patches',
installPoolPatches: 'Install pool patches',
@@ -775,10 +825,6 @@ const messages = {
powerStateSuspended: 'suspended',
// ----- VM home -----
vmStatus: 'No Xen tools detected',
vmName: 'No IPv4 record',
vmDescription: 'No IP record',
vmSettings: 'Started {ago}',
vmCurrentStatus: 'Current status:',
vmNotRunning: 'Not running',
vmHaltedSince: 'Halted {ago}',
@@ -933,6 +979,7 @@ const messages = {
defaultCpuCap: 'Default ({value, number})',
pvArgsLabel: 'PV args',
xenToolsStatus: 'Xen tools version',
xenToolsNotInstalled: 'Not installed',
osName: 'OS name',
osKernel: 'OS kernel',
autoPowerOn: 'Auto power on',
@@ -1064,6 +1111,8 @@ const messages = {
vmContainer: 'Resident on',
vmSnapshotsRelatedToNonExistentBackups:
'VM snapshots related to non-existent backups',
snapshotOf: 'Snapshot of',
legacySnapshots: 'Legacy backup snapshots',
alarmMessage: 'Alarms',
noAlarms: 'No alarms',
alarmDate: 'Date',
@@ -1104,8 +1153,13 @@ const messages = {
newVmReset: 'Reset',
newVmSelectTemplate: 'Select template',
newVmSshKey: 'SSH key',
newVmConfigDrive: 'Config drive',
noConfigDrive: 'No config drive',
newVmCustomConfig: 'Custom config',
availableTemplateVarsInfo:
'Click here to see the available template variables',
availableTemplateVarsTitle: 'Available template variables',
templateNameInfo: 'the VM\'s name. It must not contain "_"',
templateIndexInfo: "the VM's index; it will be 0 for a single VM",
newVmBootAfterCreate: 'Boot VM after creation',
newVmMacPlaceholder: 'Auto-generated if empty',
newVmCpuWeightLabel: 'CPU weight',
@@ -1191,7 +1245,7 @@ const messages = {
noVmImportErrorDescription: 'No description available',
vmImportError: 'Error:',
vmImportFileType: '{type} file:',
vmImportConfigAlert: 'Please to check and/or modify the VM configuration.',
vmImportConfigAlert: 'Please check and/or modify the VM configuration.',
// ---- Tasks ---
noTasks: 'No pending tasks',
@@ -1206,12 +1260,11 @@ const messages = {
// ---- Backup views ---
backupSchedules: 'Schedules',
backupSavedSchedules: 'Saved schedules',
backupNewSchedules: 'New schedules',
scheduleCron: 'Cron pattern',
scheduleName: 'Name',
scheduleTimezone: 'Timezone',
scheduleExportRetention: 'Export ret.',
scheduleCopyRetention: 'Copy ret.',
scheduleSnapshotRetention: 'Snapshot ret.',
getRemote: 'Get remote',
listRemote: 'List remote',
@@ -1674,6 +1727,7 @@ const messages = {
logIndicationToDisable: 'Click to disable',
reportBug: 'Report a bug',
unhealthyVdiChainError: 'Job canceled to protect the VDI chain',
backupRestartVm: "Restart VM's backup",
clickForMoreInformation: 'Click for more information',
// ----- IPs ------
@@ -1720,6 +1774,16 @@ const messages = {
settingsAclsButtonTooltipSR: 'SR',
settingsAclsButtonTooltipnetwork: 'Network',
// ----- Settings/Cloud configs -----
settingsCloudConfigTemplate: 'Template',
confirmDeleteCloudConfigsTitle:
'Delete cloud config{nCloudConfigs, plural, one {} other {s}}',
confirmDeleteCloudConfigsBody:
'Are you sure you want to delete {nCloudConfigs, number} cloud config{nCloudConfigs, plural, one {} other {s}}?',
deleteCloudConfig: 'Delete cloud config',
editCloudConfig: 'Edit cloud config',
deleteSelectedCloudConfigs: 'Delete selected cloud configs',
// ----- Config -----
noConfigFile: 'No config file selected',
importTip:

View File

@@ -16,7 +16,11 @@ export default class BooleanInput extends Component {
return (
<PrimitiveInputWrapper {...props}>
<div className='checkbox form-control'>
<Toggle disabled={disabled} onChange={onChange} value={value} />
<Toggle
disabled={disabled}
onChange={onChange}
value={value || false}
/>
</div>
</PrimitiveInputWrapper>
)

View File

@@ -50,19 +50,17 @@ const SrItem = propTypes({
return (state, props) => ({
container: getContainer(state, props),
})
})(({ sr, container }) => {
let label = `${sr.name_label || sr.id}`
if (isSrWritable(sr)) {
label += ` (${formatSize(sr.size - sr.physical_usage)} free)`
}
return (
<span>
<Icon icon='sr' /> {label}
</span>
)
})
})(({ sr, container }) => (
<span>
<Icon icon='sr' /> {sr.name_label || sr.id}
{container !== undefined && (
<span className='text-muted'> - {container.name_label}</span>
)}
{isSrWritable(sr) && (
<span>{` (${formatSize(sr.size - sr.physical_usage)} free)`}</span>
)}
</span>
))
)
// VM.
@@ -96,6 +94,11 @@ const VgpuItem = connectStore(() => ({
const xoItemToRender = {
// Subscription objects.
cloudConfig: template => (
<span>
<Icon icon='template' /> {template.name}
</span>
),
group: group => (
<span>
<Icon icon='group' /> {group.name}

View File

@@ -0,0 +1,51 @@
import _ from 'intl'
import React from 'react'
import ActionButton from './action-button'
import ActionRowButton from './action-row-button'
import propTypes from './prop-types-decorator'
export const CAN_REPORT_BUG = process.env.XOA_PLAN > 1
const reportBug = ({ formatMessage, message, title }) => {
const encodedTitle = encodeURIComponent(title)
const encodedMessage = encodeURIComponent(
formatMessage !== undefined ? formatMessage(message) : message
)
window.open(
process.env.XOA_PLAN < 5
? `https://xen-orchestra.com/#!/member/support?title=${encodedTitle}&message=${encodedMessage}`
: `https://github.com/vatesfr/xen-orchestra/issues/new?title=${encodedTitle}&body=${encodedMessage}`
)
}
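The URL construction above, extracted as a standalone helper (the GitHub URL is the one used in the component; the function name is an assumption):

```javascript
// Build a prefilled GitHub issue URL; both fields must be URI-encoded.
const issueUrl = (title, body) =>
  `https://github.com/vatesfr/xen-orchestra/issues/new?title=${encodeURIComponent(
    title
  )}&body=${encodeURIComponent(body)}`
```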
const ReportBugButton = ({
formatMessage,
message,
rowButton,
title,
...props
}) => {
const Button = rowButton ? ActionRowButton : ActionButton
return (
<Button
{...props}
data-formatMessage={formatMessage}
data-message={message}
data-title={title}
handler={reportBug}
icon='bug'
tooltip={_('reportBug')}
/>
)
}
propTypes(ReportBugButton)({
formatMessage: propTypes.func,
message: propTypes.string.isRequired,
rowButton: propTypes.bool,
title: propTypes.string.isRequired,
})
export default ReportBugButton

View File

@@ -6,7 +6,6 @@ import {
filter,
flatten,
forEach,
get,
groupBy,
includes,
isArray,
@@ -36,11 +35,13 @@ import {
createGetObjectsOfType,
createGetTags,
createSelector,
createSort,
getObject,
} from './selectors'
import { addSubscriptions, connectStore, resolveResourceSets } from './utils'
import {
isSrWritable,
subscribeCloudConfigs,
subscribeCurrentUser,
subscribeGroups,
subscribeIpPools,
@@ -61,7 +62,9 @@ const ADDON_BUTTON_STYLE = { lineHeight: '1.4' }
const getIds = value =>
value == null || isString(value) || isInteger(value)
? value
: isArray(value) ? map(value, getIds) : value.id
: isArray(value)
? map(value, getIds)
: value.id
const getOption = (object, container) => ({
label: container
@@ -362,40 +365,10 @@ export const SelectSr = makeStoreSelect(
const getPools = createGetObjectsOfType('pool')
const getHosts = createGetObjectsOfType('host')
const getSrsByContainer = createSelector(
createGetObjectsOfType('SR')
.filter((_, { predicate }) => predicate || isSrWritable)
.sort(),
createSelector(getHosts, getPools, (hosts, pools) => id =>
hosts[id] || pools[id]
),
(srs, containerFinder) => {
const { length } = srs
if (length >= 2) {
let sr1, sr2
const srsToModify = {}
for (let i = 1; i < length; ++i) {
sr1 = srs[i]
for (let j = 0; j < i; ++j) {
sr2 = srs[j]
if (sr1.name_label === sr2.name_label) {
srsToModify[sr1.id] = sr1
srsToModify[sr2.id] = sr2
}
}
}
forEach(srsToModify, sr => {
sr.name_label = `(${get(
containerFinder(sr.$container),
'name_label'
)}) ${sr.name_label}`
})
}
return groupBy(srs, '$container')
}
)
const getSrsByContainer = createGetObjectsOfType('SR')
.filter((_, { predicate }) => predicate || isSrWritable)
.sort()
.groupBy('$container')
const getContainerIds = createSelector(getSrsByContainer, srsByContainer =>
keys(srsByContainer)
@@ -888,16 +861,15 @@ export class SelectResourceSetsNetwork extends React.PureComponent {
this.refs.select.value = value
}
_getNetworks = createSelector(
() => this.props.resourceSet,
({ objectsByType }) => {
const { predicate } = this.props
const networks = objectsByType['network']
return sortBy(
predicate ? filter(networks, predicate) : networks,
'name_label'
_getNetworks = createSort(
createFilter(
() => this.props.resourceSet.objectsByType.network,
createSelector(
() => this.props.predicate,
predicate => predicate || (() => true)
)
}
),
'name_label'
)
render () {
@@ -1067,3 +1039,18 @@ export const SelectIpPool = makeSubscriptionSelect(
},
{ placeholder: _('selectIpPool') }
)
// ===================================================================
export const SelectCloudConfig = makeSubscriptionSelect(
subscriber =>
subscribeCloudConfigs(cloudConfigs => {
subscriber({
xoObjects: map(sortBy(cloudConfigs, 'name'), cloudConfig => ({
...cloudConfig,
type: 'cloudConfig',
})),
})
}),
{ placeholder: _('selectCloudConfigs') }
)

View File

@@ -15,6 +15,7 @@ import {
pickBy,
size,
slice,
some,
} from 'lodash'
import invoke from './invoke'
@@ -147,7 +148,9 @@ export const createFilter = (collection, predicate) =>
_createCollectionWrapper(
(collection, predicate) =>
predicate === false
? isArrayLike(collection) ? EMPTY_ARRAY : EMPTY_OBJECT
? isArrayLike(collection)
? EMPTY_ARRAY
: EMPTY_OBJECT
: predicate
? (isArrayLike(collection) ? filter : pickBy)(collection, predicate)
: collection
@@ -541,3 +544,9 @@ export const createGetVmDisks = vmSelector =>
)
)
)
export const getIsPoolAdmin = create(
create(createGetObjectsOfType('pool'), _createCollectionWrapper(Object.keys)),
getCheckPermissions,
(poolsIds, check) => some(poolsIds, poolId => check(poolId, 'administrate'))
)
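The underlying check in `getIsPoolAdmin`, as a plain function (names are assumptions):

```javascript
// True when the user can administrate at least one pool.
const isPoolAdmin = (poolIds, check) =>
  poolIds.some(id => check(id, 'administrate'))
```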

View File

@@ -167,7 +167,10 @@ class ColumnHead extends Component {
})
class Checkbox extends Component {
componentDidUpdate () {
const { props: { indeterminate }, ref } = this
const {
props: { indeterminate },
ref,
} = this
if (ref !== null) {
ref.indeterminate = indeterminate
}
@@ -195,7 +198,7 @@ const actionsShape = propTypes.arrayOf(
disabled: propTypes.oneOfType([propTypes.bool, propTypes.func]),
handler: propTypes.func.isRequired,
icon: propTypes.string.isRequired,
label: propTypes.node.isRequired,
label: propTypes.oneOfType([propTypes.node, propTypes.func]).isRequired,
level: propTypes.oneOf(['primary', 'warning', 'danger']),
redirectOnSuccess: propTypes.oneOfType([propTypes.func, propTypes.string]),
})
@@ -487,8 +490,8 @@ export default class SortedTable extends Component {
) {
this.setState({
highlighted:
(itemIndex + visibleItems.length + 1) % visibleItems.length ||
0,
(itemIndex + visibleItems.length + 1) %
visibleItems.length || 0,
})
}
break
@@ -500,8 +503,8 @@ export default class SortedTable extends Component {
) {
this.setState({
highlighted:
(itemIndex + visibleItems.length - 1) % visibleItems.length ||
0,
(itemIndex + visibleItems.length - 1) %
visibleItems.length || 0,
})
}
break
@@ -893,7 +896,7 @@ export default class SortedTable extends Component {
</span>
)
)}
{nSelectedItems !== 0 && (
{(nSelectedItems !== 0 || all) && (
<div className='pull-right'>
<ButtonGroup>
{map(groupedActions, (props, key) => (

View File

@@ -205,7 +205,7 @@ export const formatSizeRaw = bytes =>
humanFormat.raw(bytes, { scale: 'binary', unit: 'B' })
export const formatSpeed = (bytes, milliseconds) =>
humanFormat(bytes * 1e3 / milliseconds, { scale: 'binary', unit: 'B/s' })
humanFormat((bytes * 1e3) / milliseconds, { scale: 'binary', unit: 'B/s' })
const timeScale = new humanFormat.Scale({
ns: 1e-6,
@@ -516,7 +516,7 @@ export const createFakeProgress = (() => {
const startTime = Date.now() / 1e3
return () => {
const x = Date.now() / 1e3 - startTime
return -Math.exp(x * Math.log(1 - S) / d) + 1
return -Math.exp((x * Math.log(1 - S)) / d) + 1
}
}
})()
@@ -563,3 +563,10 @@ export const isLatestXosanPackInstalled = (latestXosanPack, hosts) =>
export const getMemoryUsedMetric = ({ memory, memoryFree = memory }) =>
map(memory, (value, key) => value - memoryFree[key])
// ===================================================================
export const generateRandomId = () =>
Math.random()
.toString(36)
.slice(2)
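`generateRandomId` works because `Number#toString(36)` renders the fraction in base 36 and `slice(2)` drops the leading `"0."`. A deterministic illustration:

```javascript
// 0.5 is "0.i" in base 36 (digit i = 18, and 18/36 = 0.5),
// so slice(2) keeps just the digits after the radix point.
const digits = (0.5).toString(36).slice(2)
```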

View File

@@ -596,7 +596,11 @@ export const IopsLineChart = injectIntl(
data: propTypes.array.isRequired,
options: propTypes.object,
})(({ addSumSeries, data, options = {}, intl }) => {
const { endTimestamp, interval, stats: { iops } } = data
const {
endTimestamp,
interval,
stats: { iops },
} = data
const { length } = get(iops, 'r')
@@ -635,7 +639,11 @@ export const IoThroughputChart = injectIntl(
data: propTypes.array.isRequired,
options: propTypes.object,
})(({ addSumSeries, data, options = {}, intl }) => {
const { endTimestamp, interval, stats: { ioThroughput } } = data
const {
endTimestamp,
interval,
stats: { ioThroughput },
} = data
const { length } = get(ioThroughput, 'r') || []
@@ -674,7 +682,11 @@ export const LatencyChart = injectIntl(
data: propTypes.array.isRequired,
options: propTypes.object,
})(({ addSumSeries, data, options = {}, intl }) => {
const { endTimestamp, interval, stats: { latency } } = data
const {
endTimestamp,
interval,
stats: { latency },
} = data
const { length } = get(latency, 'r') || []
@@ -713,7 +725,11 @@ export const IowaitChart = injectIntl(
data: propTypes.array.isRequired,
options: propTypes.object,
})(({ addSumSeries, data, options = {}, intl }) => {
const { endTimestamp, interval, stats: { iowait } } = data
const {
endTimestamp,
interval,
stats: { iowait },
} = data
const { length } = iowait[Object.keys(iowait)[0]] || []
@@ -737,7 +753,7 @@ export const IowaitChart = injectIntl(
nValues: length,
endTimestamp,
interval,
valueTransform: value => `${round(value, 2)}%`,
valueTransform: value => `${round(value, 3)}%`,
}),
...options,
}}

View File

@@ -23,7 +23,7 @@ class CreateNetworkModalBody extends Component {
pool: container.$pool,
name: refs.name.value,
description: refs.description.value,
pif: refs.pif.value.id,
pif: refs.pif.value && refs.pif.value.id,
mtu: refs.mtu.value,
vlan: refs.vlan.value,
}

View File

@@ -9,6 +9,7 @@ import {
assign,
filter,
forEach,
get,
includes,
isEmpty,
isEqual,
@@ -576,6 +577,15 @@ export const editHost = (host, props) =>
export const fetchHostStats = (host, granularity) =>
_call('host.stats', { host: resolveId(host), granularity })
export const setRemoteSyslogHost = (host, syslogDestination) =>
_call('host.setRemoteSyslogHost', {
id: resolveId(host),
syslogDestination,
})
export const setRemoteSyslogHosts = (hosts, syslogDestination) =>
Promise.all(map(hosts, host => setRemoteSyslogHost(host, syslogDestination)))
export const restartHost = (host, force = false) =>
confirm({
title: _('restartHostModalTitle'),
@@ -655,14 +665,26 @@ export const enableHost = host => _call('host.enable', { id: resolveId(host) })
export const disableHost = host =>
_call('host.disable', { id: resolveId(host) })
export const getHostMissingPatches = host =>
_call('host.listMissingPatches', { host: resolveId(host) }).then(
patches =>
// Hide paid patches to XS-free users
host.license_params.sku_type !== 'free'
? patches
: filter(patches, ['paid', false])
)
const missingUpdatePluginByHost = { __proto__: null }
export const getHostMissingPatches = async host => {
const hostId = resolveId(host)
if (host.productBrand !== 'XCP-ng') {
const patches = await _call('host.listMissingPatches', { host: hostId })
// Hide paid patches from XS-free users
return host.license_params.sku_type !== 'free'
? patches
: filter(patches, { paid: false })
}
if (missingUpdatePluginByHost[hostId]) {
return null
}
try {
return await _call('host.listMissingPatches', { host: hostId })
} catch (_) {
missingUpdatePluginByHost[hostId] = true
return null
}
}
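`missingUpdatePluginByHost` is created with `{ __proto__: null }` so the cache has no prototype; host IDs can then never collide with inherited `Object.prototype` keys:

```javascript
// A prototype-less object is a safe string-keyed cache.
const cache = { __proto__: null }
cache.constructor = true // would shadow Object.prototype on a plain {}
```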
export const emergencyShutdownHost = host =>
confirm({
@@ -1066,7 +1088,7 @@ export const migrateVm = (vm, host) =>
_('migrateVmNoTargetHostMessage')
)
}
_call('vm.migrate', { vm: vm.id, ...params })
return _call('vm.migrate', { vm: vm.id, ...params })
}, noop)
import MigrateVmsModalBody from './migrate-vms-modal' // eslint-disable-line import/first
@@ -1107,7 +1129,7 @@ export const migrateVms = vms =>
export const createVm = args => _call('vm.create', args)
export const createVms = (args, nameLabels) =>
export const createVms = (args, nameLabels, cloudConfigs) =>
confirm({
title: _('newVmCreateVms'),
body: _('newVmCreateVmsConfirm', { nbVms: nameLabels.length }),
@@ -1115,8 +1137,15 @@ export const createVms = (args, nameLabels) =>
() =>
Promise.all(
map(nameLabels, (
name_label // eslint-disable-line camelcase
) => _call('vm.create', { ...args, name_label }))
name_label, // eslint-disable-line camelcase
i
) =>
_call('vm.create', {
...args,
name_label,
cloudConfig: get(cloudConfigs, i),
})
)
),
noop
)
@@ -1650,6 +1679,15 @@ export const runJob = job => {
return _call('job.runSequence', { idSequence: [resolveId(job)] })
}
export const cancelJob = ({ id, name, runId }) =>
confirm({
title: _('cancelJob'),
body: _('cancelJobConfirm', {
id: id.slice(0, 5),
name: <strong>{name}</strong>,
}),
}).then(() => _call('job.cancel', { runId }))
// Backup/Schedule ---------------------------------------------------------
export const createSchedule = (
@@ -1814,7 +1852,7 @@ export const purgePluginConfiguration = async id => {
subscribePlugins.forceRefresh()
}
export const testPlugin = async (id, data) => _call('plugin.test', { id, data })
export const testPlugin = (id, data) => _call('plugin.test', { id, data })
export const sendUsageReport = () => _call('plugin.usageReport.send')
@@ -2364,6 +2402,39 @@ export const setIpPool = (ipPool, { name, addresses, networks }) =>
networks: resolveIds(networks),
})::tap(subscribeIpPools.forceRefresh)
// Cloud configs --------------------------------------------------------------------
export const subscribeCloudConfigs = createSubscription(() =>
_call('cloudConfig.getAll')
)
export const createCloudConfig = props =>
_call('cloudConfig.create', props)::tap(subscribeCloudConfigs.forceRefresh)
export const deleteCloudConfigs = ids => {
const { length } = ids
if (length === 0) {
return
}
const vars = { nCloudConfigs: length }
return confirm({
title: _('confirmDeleteCloudConfigsTitle', vars),
body: <p>{_('confirmDeleteCloudConfigsBody', vars)}</p>,
}).then(
() =>
Promise.all(
ids.map(id => _call('cloudConfig.delete', { id: resolveId(id) }))
)::tap(subscribeCloudConfigs.forceRefresh),
noop
)
}
export const editCloudConfig = (cloudConfig, props) =>
_call('cloudConfig.update', { ...props, id: resolveId(cloudConfig) })::tap(
subscribeCloudConfigs.forceRefresh
)
// XO SAN ----------------------------------------------------------------------
export const getVolumeInfo = (xosanSr, infoType) =>

View File

@@ -2,7 +2,7 @@ import addSubscriptions from 'add-subscriptions'
import React from 'react'
import { injectState, provideState } from '@julien-f/freactal'
import { subscribeBackupNgJobs, subscribeSchedules } from 'xo'
import { find, groupBy } from 'lodash'
import { find, groupBy, keyBy } from 'lodash'
import New from './new'
@@ -18,7 +18,7 @@ export default [
computed: {
job: (_, { jobs, routeParams: { id } }) => find(jobs, { id }),
schedules: (_, { schedulesByJob, routeParams: { id } }) =>
schedulesByJob && schedulesByJob[id],
schedulesByJob && keyBy(schedulesByJob[id], 'id'),
},
}),
injectState,

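`keyBy` re-indexes the schedules array by `id`; a dependency-free equivalent of the lodash call used above:

```javascript
// [{ id: 'a' }] becomes { a: { id: 'a' } }.
const keyById = items =>
  items.reduce((acc, item) => {
    acc[item.id] = item
    return acc
  }, {})
```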
View File

@@ -206,7 +206,7 @@ export default class Restore extends Component {
render () {
return (
<Upgrade place='restoreBackup' available={2}>
<Upgrade place='restoreBackup' available={4}>
<div>
<div className='mb-1'>
<ActionButton

View File

@@ -0,0 +1,156 @@
import _ from 'intl'
import Component from 'base-component'
import Icon from 'icon'
import Link from 'link'
import NoObjects from 'no-objects'
import React from 'react'
import renderXoItem from 'render-xo-item'
import SortedTable from 'sorted-table'
import { Card, CardHeader, CardBlock } from 'card'
import { addSubscriptions, connectStore } from 'utils'
import { Container, Row, Col } from 'grid'
import { FormattedRelative, FormattedTime } from 'react-intl'
import { includes, map } from 'lodash'
import { deleteSnapshot, deleteSnapshots, subscribeSchedules } from 'xo'
import {
createSelector,
createGetObjectsOfType,
createCollectionWrapper,
} from 'selectors'
const SNAPSHOT_COLUMNS = [
{
name: _('snapshotDate'),
itemRenderer: snapshot => (
<span>
<FormattedTime
day='numeric'
hour='numeric'
minute='numeric'
month='long'
value={snapshot.snapshot_time * 1000}
year='numeric'
/>{' '}
(<FormattedRelative value={snapshot.snapshot_time * 1000} />)
</span>
),
sortCriteria: 'snapshot_time',
sortOrder: 'desc',
},
{
name: _('vmNameLabel'),
itemRenderer: renderXoItem,
sortCriteria: 'name_label',
},
{
name: _('vmNameDescription'),
itemRenderer: snapshot => snapshot.name_description,
sortCriteria: 'name_description',
},
{
name: _('snapshotOf'),
itemRenderer: (snapshot, { vms }) => {
const vm = vms[snapshot.$snapshot_of]
return vm && <Link to={`/vms/${vm.id}`}>{renderXoItem(vm)}</Link>
},
sortCriteria: (snapshot, { vms }) => {
const vm = vms[snapshot.$snapshot_of]
return vm && vm.name_label
},
},
]
const ACTIONS = [
{
label: _('deleteSnapshots'),
individualLabel: _('deleteSnapshot'),
icon: 'delete',
level: 'danger',
handler: deleteSnapshots,
individualHandler: deleteSnapshot,
},
]
@addSubscriptions({
schedules: subscribeSchedules,
})
@connectStore(() => {
const getSnapshots = createGetObjectsOfType('VM-snapshot')
return {
loneSnapshots: getSnapshots.filter(
createSelector(
createCollectionWrapper(
(_, props) =>
props.schedules !== undefined && map(props.schedules, 'id')
),
scheduleIds =>
scheduleIds
? _ => {
const scheduleId = _.other['xo:backup:schedule']
return (
scheduleId !== undefined && !includes(scheduleIds, scheduleId)
)
}
: false
)
),
legacySnapshots: getSnapshots.filter([
(() => {
const RE = /^(?:XO_DELTA_EXPORT:|XO_DELTA_BASE_VM_SNAPSHOT_|rollingSnapshot_)/
return (
{ name_label } // eslint-disable-line camelcase
) => RE.test(name_label)
})(),
]),
vms: createGetObjectsOfType('VM'),
}
})
export default class Health extends Component {
render () {
return (
<Container>
<Row className='lone-snapshots'>
<Col>
<Card>
<CardHeader>
<Icon icon='vm' /> {_('vmSnapshotsRelatedToNonExistentBackups')}
</CardHeader>
<CardBlock>
<NoObjects
actions={ACTIONS}
collection={this.props.loneSnapshots}
columns={SNAPSHOT_COLUMNS}
component={SortedTable}
data-vms={this.props.vms}
emptyMessage={_('noSnapshots')}
shortcutsTarget='.lone-snapshots'
/>
</CardBlock>
</Card>
</Col>
</Row>
<Row className='legacy-snapshots'>
<Col>
<Card>
<CardHeader>
<Icon icon='vm' /> {_('legacySnapshots')}
</CardHeader>
<CardBlock>
<NoObjects
actions={ACTIONS}
collection={this.props.legacySnapshots}
columns={SNAPSHOT_COLUMNS}
component={SortedTable}
data-vms={this.props.vms}
emptyMessage={_('noSnapshots')}
shortcutsTarget='.legacy-snapshots'
/>
</CardBlock>
</Card>
</Col>
</Row>
</Container>
)
}
}


@@ -13,6 +13,7 @@ import { Container, Row, Col } from 'grid'
import { NavLink, NavTabs } from 'nav'
import { routes } from 'utils'
import {
cancelJob,
deleteBackupNgJobs,
disableSchedule,
enableSchedule,
@@ -28,6 +29,7 @@ import Edit from './edit'
import New from './new'
import FileRestore from './file-restore'
import Restore from './restore'
import Health from './health'
const _runBackupNgJob = ({ id, name, schedule }) =>
confirm({
@@ -44,6 +46,7 @@ const SchedulePreviewBody = ({ item: job, userData: { schedulesByJob } }) => (
<th>{_('scheduleCron')}</th>
<th>{_('scheduleTimezone')}</th>
<th>{_('scheduleExportRetention')}</th>
<th>{_('scheduleCopyRetention')}</th>
<th>{_('scheduleSnapshotRetention')}</th>
<th>{_('scheduleRun')}</th>
</tr>
@@ -52,13 +55,14 @@ const SchedulePreviewBody = ({ item: job, userData: { schedulesByJob } }) => (
<td>{schedule.cron}</td>
<td>{schedule.timezone}</td>
<td>{job.settings[schedule.id].exportRetention}</td>
<td>{job.settings[schedule.id].copyRetention}</td>
<td>{job.settings[schedule.id].snapshotRetention}</td>
<td>
<StateButton
disabledLabel={_('jobStateDisabled')}
disabledLabel={_('stateDisabled')}
disabledHandler={enableSchedule}
disabledTooltip={_('logIndicationToEnable')}
enabledLabel={_('jobStateEnabled')}
enabledLabel={_('stateEnabled')}
enabledHandler={disableSchedule}
enabledTooltip={_('logIndicationToDisable')}
handlerParam={schedule.id}
@@ -66,15 +70,28 @@ const SchedulePreviewBody = ({ item: job, userData: { schedulesByJob } }) => (
/>
</td>
<td>
<ActionButton
btnStyle='primary'
data-id={job.id}
data-name={job.name}
data-schedule={schedule.id}
handler={_runBackupNgJob}
icon='run-schedule'
size='small'
/>
{job.runId !== undefined ? (
<ActionButton
btnStyle='danger'
handler={cancelJob}
handlerParam={job}
icon='cancel'
key='cancel'
size='small'
tooltip={_('formCancel')}
/>
) : (
<ActionButton
btnStyle='primary'
data-id={job.id}
data-name={job.name}
data-schedule={schedule.id}
handler={_runBackupNgJob}
icon='run-schedule'
key='run'
size='small'
/>
)}
</td>
</tr>
))}
@@ -137,7 +154,7 @@ class JobsTable extends React.Component {
},
{
handler: (job, { goTo }) => goTo(`/backup-ng/${job.id}/edit`),
label: '',
label: _('formEdit'),
icon: 'edit',
level: 'primary',
},
@@ -197,6 +214,10 @@ const HEADER = (
<Icon icon='menu-backup-file-restore' />{' '}
{_('backupFileRestorePage')}
</NavLink>
<NavLink to='/backup-ng/health'>
<Icon icon='menu-dashboard-health' />{' '}
{_('overviewHealthDashboardPage')}
</NavLink>
</NavTabs>
</Col>
</Row>
@@ -209,6 +230,7 @@ export default routes('overview', {
overview: Overview,
restore: Restore,
'file-restore': FileRestore,
health: Health,
})(({ children }) => (
<Page header={HEADER} title='backupPage' formatTitle>
{children}


@@ -6,40 +6,76 @@ import renderXoItem, { renderXoItemFromId } from 'render-xo-item'
import Select from 'form/select'
import Tooltip from 'tooltip'
import Upgrade from 'xoa-upgrade'
import { addSubscriptions, resolveId, resolveIds } from 'utils'
import { Card, CardBlock, CardHeader } from 'card'
import { Container, Col, Row } from 'grid'
import {
find,
findKey,
addSubscriptions,
generateRandomId,
resolveId,
resolveIds,
} from 'utils'
import { Card, CardBlock, CardHeader } from 'card'
import { constructSmartPattern, destructSmartPattern } from 'smart-backup'
import { Container, Col, Row } from 'grid'
import { injectState, provideState } from '@julien-f/freactal'
import { SelectRemote, SelectSr, SelectVm } from 'select-objects'
import { Toggle } from 'form'
import {
cloneDeep,
flatten,
get,
forEach,
includes,
isEmpty,
keyBy,
map,
mapValues,
some,
} from 'lodash'
import { injectState, provideState } from '@julien-f/freactal'
import { Toggle } from 'form'
import { constructSmartPattern, destructSmartPattern } from 'smart-backup'
import { SelectRemote, SelectSr, SelectVm } from 'select-objects'
import {
createBackupNgJob,
createSchedule,
deleteSchedule,
editBackupNgJob,
editSchedule,
isSrWritable,
subscribeRemotes,
} from 'xo'
import Schedules from './schedules'
import SmartBackup from './smart-backup'
import { FormGroup, getRandomId, Input, Ul, Li } from './utils'
import { FormFeedback, FormGroup, Input, Number, Ul, Li } from './utils'
// ===================================================================
const normaliseTagValues = values => resolveIds(values).map(value => [value])
const normalizeTagValues = values => resolveIds(values).map(value => [value])
const normalizeCopyRetention = settings => {
forEach(settings, schedule => {
if (schedule.copyRetention === undefined) {
schedule.copyRetention = schedule.exportRetention
}
})
}
const normalizeSettings = ({
settings,
exportMode,
copyMode,
snapshotMode,
}) => {
forEach(settings, setting => {
if (!exportMode) {
setting.exportRetention = undefined
}
if (!copyMode) {
setting.copyRetention = undefined
}
if (!snapshotMode) {
setting.snapshotRetention = undefined
}
})
return settings
}
const constructPattern = values =>
values.length === 1
@@ -65,32 +101,6 @@ const destructVmsPattern = pattern =>
vms: destructPattern(pattern),
}
const getNewSettings = schedules => {
const newSettings = {}
for (const id in schedules) {
newSettings[id] = {
exportRetention: schedules[id].exportRetention,
snapshotRetention: schedules[id].snapshotRetention,
}
}
return newSettings
}
const getNewSchedules = schedules => {
const newSchedules = {}
for (const id in schedules) {
newSchedules[id] = {
cron: schedules[id].cron,
timezone: schedules[id].timezone,
}
}
return newSchedules
}
const REPORT_WHEN_FILTER_OPTIONS = [
{
label: 'reportWhenAlways',
@@ -108,23 +118,30 @@ const REPORT_WHEN_FILTER_OPTIONS = [
const getOptionRenderer = ({ label }) => <span>{_(label)}</span>
const createDoesRetentionExist = name => {
const predicate = setting => setting[name] > 0
return ({ settings }) => some(settings, predicate)
}
const getInitialState = () => ({
$pool: {},
backupMode: false,
compression: true,
concurrency: 0,
crMode: false,
deltaMode: false,
drMode: false,
editionMode: undefined,
formId: getRandomId(),
formId: generateRandomId(),
name: '',
newSchedules: {},
offlineSnapshot: false,
paramsUpdated: false,
powerState: 'All',
remotes: [],
reportWhen: 'failure',
schedules: [],
schedules: {},
settings: {},
showErrors: false,
smartMode: false,
snapshotMode: false,
srs: [],
@@ -149,15 +166,32 @@ export default [
initialState: getInitialState,
effects: {
createJob: () => async state => {
if (state.isJobInvalid) {
return {
...state,
showErrors: true,
}
}
await createBackupNgJob({
name: state.name,
mode: state.isDelta ? 'delta' : 'full',
compression: state.compression ? 'native' : '',
schedules: getNewSchedules(state.newSchedules),
schedules: mapValues(
state.schedules,
({ id, ...schedule }) => schedule
),
settings: {
...getNewSettings(state.newSchedules),
...normalizeSettings({
settings: cloneDeep(state.settings),
exportMode: state.exportMode,
copyMode: state.copyMode,
snapshotMode: state.snapshotMode,
}),
'': {
reportWhen: state.reportWhen,
concurrency: state.concurrency,
offlineSnapshot: state.offlineSnapshot,
},
},
remotes:
@@ -174,70 +208,56 @@ export default [
})
},
editJob: () => async (state, props) => {
const newSettings = {}
if (!isEmpty(state.newSchedules)) {
await Promise.all(
map(state.newSchedules, async schedule => {
const scheduleId = (await createSchedule(props.job.id, {
cron: schedule.cron,
timezone: schedule.timezone,
})).id
newSettings[scheduleId] = {
exportRetention: schedule.exportRetention,
snapshotRetention: schedule.snapshotRetention,
}
})
)
if (state.isJobInvalid) {
return {
...state,
showErrors: true,
}
}
await Promise.all(
map(props.schedules, oldSchedule => {
const scheduleId = oldSchedule.id
const newSchedule = find(state.schedules, { id: scheduleId })
if (
newSchedule !== undefined &&
newSchedule.cron === oldSchedule.cron &&
newSchedule.timezone === oldSchedule.timezone
) {
return
}
const id = oldSchedule.id
const newSchedule = state.schedules[id]
if (newSchedule === undefined) {
return deleteSchedule(scheduleId)
return deleteSchedule(id)
}
return editSchedule({
id: scheduleId,
jobId: props.job.id,
cron: newSchedule.cron,
timezone: newSchedule.timezone,
})
if (
newSchedule.cron !== oldSchedule.cron ||
newSchedule.timezone !== oldSchedule.timezone
) {
return editSchedule({
id,
cron: newSchedule.cron,
timezone: newSchedule.timezone,
})
}
})
)
const oldSettings = props.job.settings
const settings = state.settings
if (!('' in oldSettings)) {
oldSettings[''] = {}
}
for (const id in oldSettings) {
const oldSetting = oldSettings[id]
const newSetting = settings[id]
const settings = cloneDeep(state.settings)
await Promise.all(
map(state.schedules, async schedule => {
const tmpId = schedule.id
if (props.schedules[tmpId] === undefined) {
const { id } = await createSchedule(props.job.id, {
cron: schedule.cron,
timezone: schedule.timezone,
})
if (id === '') {
oldSetting.reportWhen = state.reportWhen
} else if (!(id in settings)) {
delete oldSettings[id]
} else if (
oldSetting.snapshotRetention !== newSetting.snapshotRetention ||
oldSetting.exportRetention !== newSetting.exportRetention
) {
newSettings[id] = {
exportRetention: newSetting.exportRetention,
snapshotRetention: newSetting.snapshotRetention,
settings[id] = settings[tmpId]
delete settings[tmpId]
}
}
})
)
settings[''] = {
...props.job.settings[''],
reportWhen: state.reportWhen,
concurrency: state.concurrency,
offlineSnapshot: state.offlineSnapshot,
}
await editBackupNgJob({
@@ -245,10 +265,12 @@ export default [
name: state.name,
mode: state.isDelta ? 'delta' : 'full',
compression: state.compression ? 'native' : '',
settings: {
...oldSettings,
...newSettings,
},
settings: normalizeSettings({
settings,
exportMode: state.exportMode,
copyMode: state.copyMode,
snapshotMode: state.snapshotMode,
}),
remotes:
state.deltaMode || state.backupMode
? constructPattern(state.remotes)
@@ -266,9 +288,19 @@ export default [
...state,
[mode]: !state[mode],
}),
setCompression: (_, { target: { checked } }) => state => ({
setCheckboxValue: (_, { target: { checked, name } }) => state => ({
...state,
compression: checked,
[name]: checked,
}),
toggleScheduleState: (_, id) => state => ({
...state,
schedules: {
...state.schedules,
[id]: {
...state.schedules[id],
enabled: !state.schedules[id].enabled,
},
},
}),
toggleSmartMode: (_, smartMode) => state => ({
...state,
@@ -309,9 +341,16 @@ export default [
const remotes =
job.remotes !== undefined ? destructPattern(job.remotes) : []
const srs = job.srs !== undefined ? destructPattern(job.srs) : []
const globalSettings = job.settings['']
const settings = { ...job.settings }
const { concurrency, reportWhen, offlineSnapshot } =
job.settings[''] || {}
const settings = cloneDeep(job.settings)
delete settings['']
const drMode = job.mode === 'full' && !isEmpty(srs)
const crMode = job.mode === 'delta' && !isEmpty(srs)
if (drMode || crMode) {
normalizeCopyRetention(settings)
}
return {
...state,
@@ -325,11 +364,13 @@ export default [
),
backupMode: job.mode === 'full' && !isEmpty(remotes),
deltaMode: job.mode === 'delta' && !isEmpty(remotes),
drMode: job.mode === 'full' && !isEmpty(srs),
crMode: job.mode === 'delta' && !isEmpty(srs),
drMode,
crMode,
remotes,
srs,
reportWhen: get(globalSettings, 'reportWhen') || 'failure',
reportWhen: reportWhen || 'failure',
concurrency: concurrency || 0,
offlineSnapshot,
settings,
schedules,
...destructVmsPattern(job.vms),
@@ -344,64 +385,48 @@ export default [
tmpSchedule: {},
editionMode: undefined,
}),
editSchedule: (_, schedule) => state => {
const { snapshotRetention, exportRetention } =
state.settings[schedule.id] || {}
return {
...state,
editionMode: 'editSchedule',
tmpSchedule: {
exportRetention,
snapshotRetention,
...schedule,
},
}
},
deleteSchedule: (_, id) => async (state, props) => {
const schedules = [...state.schedules]
schedules.splice(findKey(state.schedules, { id }), 1)
return {
...state,
schedules,
}
},
editNewSchedule: (_, schedule) => state => ({
editSchedule: (_, schedule) => state => ({
...state,
editionMode: 'editNewSchedule',
editionMode: 'editSchedule',
tmpSchedule: {
...schedule,
},
}),
deleteNewSchedule: (_, id) => async (state, props) => {
const newSchedules = { ...state.newSchedules }
delete newSchedules[id]
deleteSchedule: (_, schedule) => state => {
const id = resolveId(schedule)
const schedules = { ...state.schedules }
const settings = { ...state.settings }
delete schedules[id]
delete settings[id]
return {
...state,
newSchedules,
schedules,
settings,
}
},
saveSchedule: (
_,
{ cron, timezone, exportRetention, snapshotRetention }
{ cron, timezone, exportRetention, copyRetention, snapshotRetention }
) => async (state, props) => {
if (!state.exportMode) {
exportRetention = 0
}
if (!state.snapshotMode) {
snapshotRetention = 0
}
if (state.editionMode === 'creation') {
const id = generateRandomId()
return {
...state,
editionMode: undefined,
newSchedules: {
...state.newSchedules,
[getRandomId()]: {
schedules: {
...state.schedules,
[id]: {
id,
cron,
timezone,
},
},
settings: {
...state.settings,
[id]: {
exportRetention,
copyRetention,
snapshotRetention,
},
},
@@ -409,43 +434,27 @@ export default [
}
const id = state.tmpSchedule.id
if (state.editionMode === 'editSchedule') {
const scheduleKey = findKey(state.schedules, { id })
const schedules = [...state.schedules]
schedules[scheduleKey] = {
...schedules[scheduleKey],
cron,
timezone,
}
const schedules = { ...state.schedules }
const settings = { ...state.settings }
const settings = { ...state.settings }
settings[id] = {
exportRetention,
snapshotRetention,
}
return {
...state,
editionMode: undefined,
schedules,
settings,
tmpSchedule: {},
}
schedules[id] = {
...schedules[id],
cron,
timezone,
}
settings[id] = {
...settings[id],
exportRetention,
copyRetention,
snapshotRetention,
}
return {
...state,
editionMode: undefined,
schedules,
settings,
tmpSchedule: {},
newSchedules: {
...state.newSchedules,
[id]: {
cron,
timezone,
exportRetention,
snapshotRetention,
},
},
}
},
setPowerState: (_, powerState) => state => ({
@@ -491,41 +500,55 @@ export default [
...state,
reportWhen: value,
}),
setConcurrency: (_, concurrency) => state => ({
...state,
concurrency,
}),
},
computed: {
needUpdateParams: (state, { job, schedules }) =>
job !== undefined && schedules !== undefined && !state.paramsUpdated,
isJobInvalid: state =>
state.name.trim() === '' ||
(isEmpty(state.schedules) && isEmpty(state.newSchedules)) ||
(isEmpty(state.vms) && !state.smartMode) ||
((state.backupMode || state.deltaMode) && isEmpty(state.remotes)) ||
((state.drMode || state.crMode) && isEmpty(state.srs)) ||
(state.exportMode && !state.exportRetentionExists) ||
(state.snapshotMode && !state.snapshotRetentionExists) ||
(!state.isDelta && !state.isFull && !state.snapshotMode),
showCompression: state => state.isFull && state.exportRetentionExists,
exportMode: state =>
state.backupMode || state.deltaMode || state.drMode || state.crMode,
exportRetentionExists: ({ newSchedules, settings }) =>
some(
{ ...newSchedules, ...settings },
({ exportRetention }) => exportRetention !== 0
),
snapshotRetentionExists: ({ newSchedules, settings }) =>
some(
{ ...newSchedules, ...settings },
({ snapshotRetention }) => snapshotRetention !== 0
),
state.missingName ||
state.missingVms ||
state.missingBackupMode ||
state.missingSchedules ||
state.missingRemotes ||
state.missingSrs ||
state.missingExportRetention ||
state.missingCopyRetention ||
state.missingSnapshotRetention,
missingName: state => state.name.trim() === '',
missingVms: state => isEmpty(state.vms) && !state.smartMode,
missingBackupMode: state =>
!state.isDelta && !state.isFull && !state.snapshotMode,
missingRemotes: state =>
(state.backupMode || state.deltaMode) && isEmpty(state.remotes),
missingSrs: state => (state.drMode || state.crMode) && isEmpty(state.srs),
missingSchedules: state => isEmpty(state.schedules),
missingExportRetention: state =>
state.exportMode && !state.exportRetentionExists,
missingCopyRetention: state =>
state.copyMode && !state.copyRetentionExists,
missingSnapshotRetention: state =>
state.snapshotMode && !state.snapshotRetentionExists,
showCompression: state =>
state.isFull &&
(state.exportRetentionExists || state.copyRetentionExists),
exportMode: state => state.backupMode || state.deltaMode,
copyMode: state => state.drMode || state.crMode,
exportRetentionExists: createDoesRetentionExist('exportRetention'),
copyRetentionExists: createDoesRetentionExist('copyRetention'),
snapshotRetentionExists: createDoesRetentionExist('snapshotRetention'),
isDelta: state => state.deltaMode || state.crMode,
isFull: state => state.backupMode || state.drMode,
vmsSmartPattern: ({ $pool, powerState, tags }) => ({
$pool: constructSmartPattern($pool, resolveIds),
power_state: powerState === 'All' ? undefined : powerState,
tags: constructSmartPattern(tags, normaliseTagValues),
tags: constructSmartPattern(tags, normalizeTagValues),
type: 'VM',
}),
srPredicate: ({ srs }) => ({ id }) => !includes(srs, id),
srPredicate: ({ srs }) => sr => isSrWritable(sr) && !includes(srs, sr.id),
remotePredicate: ({ remotes }) => ({ id }) => !includes(remotes, id),
},
}),
@@ -542,7 +565,7 @@ export default [
<Col mediumSize={6}>
<Card>
<CardHeader>
{_('backupName')}
{_('backupName')}*
<Tooltip content={_('smartBackupModeTitle')}>
<Toggle
className='pull-right'
@@ -557,7 +580,13 @@ export default [
<label>
<strong>{_('backupName')}</strong>
</label>
<Input onChange={effects.setName} value={state.name} />
<FormFeedback
component={Input}
message={_('missingBackupName')}
onChange={effects.setName}
error={state.showErrors ? state.missingName : undefined}
value={state.name}
/>
</FormGroup>
{state.smartMode ? (
<Upgrade place='newBackup' required={3}>
@@ -568,9 +597,12 @@ export default [
<label>
<strong>{_('vmsToBackup')}</strong>
</label>
<SelectVm
<FormFeedback
component={SelectVm}
message={_('missingVms')}
multi
onChange={effects.setVms}
error={state.showErrors ? state.missingVms : undefined}
value={state.vms}
/>
</FormGroup>
@@ -578,16 +610,21 @@ export default [
{state.showCompression && (
<label>
<input
type='checkbox'
onChange={effects.setCompression}
checked={state.compression}
name='compression'
onChange={effects.setCheckboxValue}
type='checkbox'
/>{' '}
<strong>{_('useCompression')}</strong>
</label>
)}
</CardBlock>
</Card>
<Card>
<FormFeedback
component={Card}
error={state.showErrors ? state.missingBackupMode : undefined}
message={_('missingBackupMode')}
>
<CardBlock>
<div className='text-xs-center'>
<ActionButton
@@ -655,7 +692,8 @@ export default [
)}
</div>
</CardBlock>
</Card>
</FormFeedback>
<br />
{(state.backupMode || state.deltaMode) && (
<Card>
<CardHeader>
@@ -666,9 +704,14 @@ export default [
<label>
<strong>{_('backupTargetRemotes')}</strong>
</label>
<SelectRemote
<FormFeedback
component={SelectRemote}
message={_('missingRemotes')}
onChange={effects.addRemote}
predicate={state.remotePredicate}
error={
state.showErrors ? state.missingRemotes : undefined
}
value={null}
/>
<br />
@@ -709,9 +752,12 @@ export default [
<label>
<strong>{_('backupTargetSrs')}</strong>
</label>
<SelectSr
<FormFeedback
component={SelectSr}
message={_('missingSrs')}
onChange={effects.addSr}
predicate={state.srPredicate}
error={state.showErrors ? state.missingSrs : undefined}
value={null}
/>
<br />
@@ -751,6 +797,26 @@ export default [
valueKey='value'
/>
</FormGroup>
<FormGroup>
<label>
<strong>{_('concurrency')}</strong>
</label>
<Number
onChange={effects.setConcurrency}
value={state.concurrency}
/>
</FormGroup>
<FormGroup>
<label>
<strong>{_('offlineSnapshot')}</strong>{' '}
<input
checked={state.offlineSnapshot}
name='offlineSnapshot'
onChange={effects.setCheckboxValue}
type='checkbox'
/>
</label>
</FormGroup>
</CardBlock>
</Card>
</Col>
@@ -764,11 +830,12 @@ export default [
{state.paramsUpdated ? (
<ActionButton
btnStyle='primary'
disabled={state.isJobInvalid}
form={state.formId}
handler={effects.editJob}
icon='save'
redirectOnSuccess='/backup-ng'
redirectOnSuccess={
state.isJobInvalid ? undefined : '/backup-ng'
}
size='large'
>
{_('formSave')}
@@ -776,11 +843,12 @@ export default [
) : (
<ActionButton
btnStyle='primary'
disabled={state.isJobInvalid}
form={state.formId}
handler={effects.createJob}
icon='save'
redirectOnSuccess='/backup-ng'
redirectOnSuccess={
state.isJobInvalid ? undefined : '/backup-ng'
}
size='large'
>
{_('formCreate')}


@@ -1,62 +1,34 @@
import _ from 'intl'
import ActionButton from 'action-button'
import moment from 'moment-timezone'
import PropTypes from 'prop-types'
import React from 'react'
import Scheduler, { SchedulePreview } from 'scheduling'
import { Card, CardBlock } from 'card'
import { generateRandomId } from 'utils'
import { injectState, provideState } from '@julien-f/freactal'
import { isEqual } from 'lodash'
import { FormGroup, getRandomId, Input } from './utils'
const Number = [
provideState({
effects: {
onChange: (_, { target: { value } }) => (state, props) => {
if (value === '') {
return
}
props.onChange(+value)
},
},
}),
injectState,
({ effects, state, value }) => (
<Input
type='number'
onChange={effects.onChange}
value={String(value)}
min='0'
/>
),
].reduceRight((value, decorator) => decorator(value))
Number.propTypes = {
onChange: PropTypes.func.isRequired,
value: PropTypes.number.isRequired,
}
import { FormFeedback, FormGroup, Number } from './utils'
export default [
injectState,
provideState({
initialState: ({
copyMode,
exportMode,
snapshotMode,
schedule: {
cron = '0 0 * * *',
exportRetention = 1,
snapshotRetention = 1,
exportRetention = exportMode ? 1 : undefined,
copyRetention = copyMode ? 1 : undefined,
snapshotRetention = snapshotMode ? 1 : undefined,
timezone = moment.tz.guess(),
},
}) => ({
oldSchedule: {
cron,
exportRetention,
snapshotRetention,
timezone,
},
cron,
exportRetention,
formId: getRandomId(),
copyRetention,
formId: generateRandomId(),
snapshotRetention,
timezone,
}),
@@ -65,6 +37,10 @@ export default [
...state,
exportRetention: value,
}),
setCopyRetention: (_, value) => state => ({
...state,
copyRetention: value,
}),
setSnapshotRetention: (_, value) => state => ({
...state,
snapshotRetention: value,
@@ -81,34 +57,54 @@ export default [
retentionNeeded: ({
exportMode,
exportRetention,
copyMode,
copyRetention,
snapshotMode,
snapshotRetention,
}) =>
!(
(exportMode && exportRetention !== 0) ||
(snapshotMode && snapshotRetention !== 0)
(exportMode && exportRetention > 0) ||
(copyMode && copyRetention > 0) ||
(snapshotMode && snapshotRetention > 0)
),
scheduleNotEdited: ({
cron,
editionMode,
exportRetention,
oldSchedule,
snapshotRetention,
timezone,
}) =>
editionMode !== 'creation' &&
isEqual(oldSchedule, {
scheduleNotEdited: (
{
cron,
editionMode,
exportRetention,
copyRetention,
snapshotRetention,
timezone,
}),
},
{ schedule }
) =>
editionMode !== 'creation' &&
isEqual(
{
cron: schedule.cron,
exportRetention: schedule.exportRetention,
copyRetention: schedule.copyRetention,
snapshotRetention: schedule.snapshotRetention,
timezone: schedule.timezone,
},
{
cron,
exportRetention,
copyRetention,
snapshotRetention,
timezone,
}
),
},
}),
injectState,
({ effects, state }) => (
<form id={state.formId}>
<Card>
<FormFeedback
component={Card}
error={state.retentionNeeded}
message={_('retentionNeeded')}
>
<CardBlock>
{state.exportMode && (
<FormGroup>
@@ -118,6 +114,19 @@ export default [
<Number
onChange={effects.setExportRetention}
value={state.exportRetention}
optional
/>
</FormGroup>
)}
{state.copyMode && (
<FormGroup>
<label>
<strong>{_('copyRetention')}</strong>
</label>
<Number
onChange={effects.setCopyRetention}
value={state.copyRetention}
optional
/>
</FormGroup>
)}
@@ -129,6 +138,7 @@ export default [
<Number
onChange={effects.setSnapshotRetention}
value={state.snapshotRetention}
optional
/>
</FormGroup>
)}
@@ -143,6 +153,7 @@ export default [
btnStyle='primary'
data-cron={state.cron}
data-exportRetention={state.exportRetention}
data-copyRetention={state.copyRetention}
data-snapshotRetention={state.snapshotRetention}
data-timezone={state.timezone}
disabled={state.isScheduleInvalid}
@@ -162,7 +173,7 @@ export default [
{_('formCancel')}
</ActionButton>
</CardBlock>
</Card>
</FormFeedback>
</form>
),
].reduceRight((value, decorator) => decorator(value))


@@ -2,115 +2,134 @@ import _ from 'intl'
import ActionButton from 'action-button'
import React from 'react'
import SortedTable from 'sorted-table'
import StateButton from 'state-button'
import { Card, CardBlock, CardHeader } from 'card'
import { injectState, provideState } from '@julien-f/freactal'
import { isEmpty, findKey, size } from 'lodash'
import { isEmpty, find, size } from 'lodash'
import NewSchedule from './new-schedule'
import { FormGroup } from './utils'
import { FormFeedback } from './utils'
// ===================================================================
const SCHEDULES_COLUMNS = [
{
itemRenderer: _ => _.cron,
sortCriteria: 'cron',
name: _('scheduleCron'),
},
{
itemRenderer: _ => _.timezone,
sortCriteria: 'timezone',
name: _('scheduleTimezone'),
},
{
itemRenderer: _ => _.exportRetention,
sortCriteria: _ => _.exportRetention,
name: _('scheduleExportRetention'),
},
{
itemRenderer: _ => _.snapshotRetention,
sortCriteria: _ => _.snapshotRetention,
name: _('scheduleSnapshotRetention'),
},
const FEEDBACK_ERRORS = [
'missingSchedules',
'missingCopyRetention',
'missingExportRetention',
'missingSnapshotRetention',
]
const SAVED_SCHEDULES_COLUMNS = [
{
itemRenderer: _ => _.name,
sortCriteria: 'name',
name: _('scheduleName'),
default: true,
},
...SCHEDULES_COLUMNS,
]
const rowTransform = (schedule, { settings }) => {
const { exportRetention, snapshotRetention } = settings[schedule.id] || {}
return {
...schedule,
exportRetention,
snapshotRetention,
}
}
const SAVED_SCHEDULES_INDIVIDUAL_ACTIONS = [
{
handler: (schedule, { editSchedule }) => editSchedule(schedule),
label: _('scheduleEdit'),
icon: 'edit',
disabled: (_, { disabledEdition }) => disabledEdition,
level: 'primary',
},
{
handler: (schedule, { deleteSchedule }) => deleteSchedule(schedule.id),
label: _('scheduleDelete'),
disabled: (_, { disabledDeletion }) => disabledDeletion,
icon: 'delete',
level: 'danger',
},
]
const NEW_SCHEDULES_INDIVIDUAL_ACTIONS = [
{
handler: (schedule, { editNewSchedule, newSchedules }) =>
editNewSchedule({
id: findKey(newSchedules, schedule),
...schedule,
}),
label: _('scheduleEdit'),
disabled: (_, { disabledEdition }) => disabledEdition,
icon: 'edit',
level: 'primary',
},
{
handler: (schedule, { deleteNewSchedule, newSchedules }) =>
deleteNewSchedule(findKey(newSchedules, schedule)),
label: _('scheduleDelete'),
icon: 'delete',
level: 'danger',
},
]
// ===================================================================
export default [
injectState,
provideState({
computed: {
disabledDeletion: state =>
state.schedules.length + size(state.newSchedules) <= 1,
disabledDeletion: state => size(state.schedules) <= 1,
disabledEdition: state =>
state.editionMode !== undefined ||
(!state.exportMode && !state.snapshotMode),
(!state.exportMode && !state.copyMode && !state.snapshotMode),
error: state => find(FEEDBACK_ERRORS, error => state[error]),
individualActions: (
{ disabledDeletion, disabledEdition },
{ effects: { deleteSchedule, editSchedule } }
) => [
{
disabled: disabledEdition,
handler: editSchedule,
icon: 'edit',
label: _('scheduleEdit'),
level: 'primary',
},
{
disabled: disabledDeletion,
handler: deleteSchedule,
icon: 'delete',
label: _('scheduleDelete'),
level: 'danger',
},
],
rowTransform: ({ settings }) => schedule => {
const { exportRetention, copyRetention, snapshotRetention } =
settings[schedule.id] || {}
return {
...schedule,
exportRetention,
copyRetention,
snapshotRetention,
}
},
schedulesColumns: (state, { effects: { toggleScheduleState } }) => {
const columns = [
{
itemRenderer: _ => _.name,
sortCriteria: 'name',
name: _('scheduleName'),
default: true,
},
{
itemRenderer: schedule => (
<StateButton
disabledLabel={_('stateDisabled')}
disabledTooltip={_('logIndicationToEnable')}
enabledLabel={_('stateEnabled')}
enabledTooltip={_('logIndicationToDisable')}
handler={toggleScheduleState}
handlerParam={schedule.id}
state={schedule.enabled}
/>
),
sortCriteria: 'enabled',
name: _('state'),
},
{
itemRenderer: _ => _.cron,
sortCriteria: 'cron',
name: _('scheduleCron'),
},
{
itemRenderer: _ => _.timezone,
sortCriteria: 'timezone',
name: _('scheduleTimezone'),
},
]
if (state.exportMode) {
columns.push({
itemRenderer: _ => _.exportRetention,
sortCriteria: _ => _.exportRetention,
name: _('scheduleExportRetention'),
})
}
if (state.copyMode) {
columns.push({
itemRenderer: _ => _.copyRetention,
sortCriteria: _ => _.copyRetention,
name: _('scheduleCopyRetention'),
})
}
if (state.snapshotMode) {
columns.push({
itemRenderer: _ => _.snapshotRetention,
sortCriteria: _ => _.snapshotRetention,
name: _('scheduleSnapshotRetention'),
})
}
return columns
},
},
}),
injectState,
({ effects, state }) => (
<div>
<Card>
<FormFeedback
component={Card}
error={state.showErrors ? state.error !== undefined : undefined}
message={state.error !== undefined && _(state.error)}
>
<CardHeader>
{_('backupSchedules')}
{_('backupSchedules')}*
<ActionButton
btnStyle='primary'
className='pull-right'
@@ -121,48 +140,25 @@ export default [
/>
</CardHeader>
<CardBlock>
{isEmpty(state.schedules) &&
isEmpty(state.newSchedules) && (
<p className='text-md-center'>{_('noSchedules')}</p>
)}
{!isEmpty(state.schedules) && (
<FormGroup>
<label>
<strong>{_('backupSavedSchedules')}</strong>
</label>
<SortedTable
collection={state.schedules}
columns={SAVED_SCHEDULES_COLUMNS}
data-deleteSchedule={effects.deleteSchedule}
data-disabledDeletion={state.disabledDeletion}
data-disabledEdition={state.disabledEdition}
data-editSchedule={effects.editSchedule}
data-settings={state.settings}
individualActions={SAVED_SCHEDULES_INDIVIDUAL_ACTIONS}
rowTransform={rowTransform}
/>
</FormGroup>
)}
{!isEmpty(state.newSchedules) && (
<FormGroup>
<label>
<strong>{_('backupNewSchedules')}</strong>
</label>
<SortedTable
collection={state.newSchedules}
columns={SCHEDULES_COLUMNS}
data-deleteNewSchedule={effects.deleteNewSchedule}
data-disabledEdition={state.disabledEdition}
data-editNewSchedule={effects.editNewSchedule}
data-newSchedules={state.newSchedules}
individualActions={NEW_SCHEDULES_INDIVIDUAL_ACTIONS}
/>
</FormGroup>
{isEmpty(state.schedules) ? (
<p className='text-md-center'>{_('noSchedules')}</p>
) : (
<SortedTable
collection={state.schedules}
columns={state.schedulesColumns}
individualActions={state.individualActions}
rowTransform={state.rowTransform}
/>
)}
</CardBlock>
</Card>
</FormFeedback>
{state.editionMode !== undefined && (
<NewSchedule schedule={state.tmpSchedule} />
<NewSchedule
copyMode={state.copyMode}
exportMode={state.exportMode}
schedule={state.tmpSchedule}
snapshotMode={state.snapshotMode}
/>
)}
</div>
),

Some files were not shown because too many files have changed in this diff.