Compare commits

..

1 Commit

Author SHA1 Message Date
florent
32478e470b FB proposition for react (WIP)
After a few tries, I think it can work this way:

 * everything is in one store: xoApi, session, even component state. Try to use booleans when possible to simplify usage in components
 * component state should rarely be used: let components be stateless, a pure function of their props
 * use connect on high-level components to give them access to part of the store and to the actions. A component should not have access to the full store and all actions.
 * use events to get data back from child components (if they can't dispatch actions themselves)
 * redux-thunk allows dispatching multiple actions in cascade (even async ones), like want to save -> saving -> saved (or errored); see the sketch below
 * use immutable objects; this will allow better performance in the long run
2016-01-31 22:06:12 +01:00
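To make the proposed flow concrete, here is a minimal sketch (not part of the commit) of the want to save -> saving -> saved cascade with redux-thunk and a connected stateless component. All names (`saveSettings`, `SaveButton`, the `xoApi` stub) are hypothetical.

```js
import React from 'react'
import { connect } from 'react-redux'
import { applyMiddleware, combineReducers, createStore } from 'redux'
import thunk from 'redux-thunk'

// Stub standing in for the real API client (hypothetical).
const xoApi = { call: async (method, params) => ({ method, params }) }

// One store slice tracking the save cascade with plain booleans.
const save = (state = { saving: false, saved: false, error: null }, action) => {
  switch (action.type) {
    case 'SAVE_REQUEST':
      return { saving: true, saved: false, error: null }
    case 'SAVE_SUCCESS':
      return { saving: false, saved: true, error: null }
    case 'SAVE_FAILURE':
      return { saving: false, saved: false, error: action.error }
    default:
      return state
  }
}

// redux-thunk lets a single action creator dispatch the whole cascade,
// including the async step.
const saveSettings = payload => async dispatch => {
  dispatch({ type: 'SAVE_REQUEST' })
  try {
    await xoApi.call('settings.save', payload)
    dispatch({ type: 'SAVE_SUCCESS' })
  } catch (error) {
    dispatch({ type: 'SAVE_FAILURE', error })
  }
}

// Stateless component: a pure function of its props.
const SaveButton = ({ saving, onSave }) => (
  <button disabled={saving} onClick={onSave}>
    {saving ? 'Saving...' : 'Save'}
  </button>
)

// connect() only exposes the slice of the store this component needs.
export const SaveButtonContainer = connect(
  state => ({ saving: state.save.saving }),
  dispatch => ({ onSave: () => dispatch(saveSettings({})) })
)(SaveButton)

export const store = createStore(combineReducers({ save }), applyMiddleware(thunk))
```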
972 changed files with 1295 additions and 149107 deletions

12
.babelrc Normal file
View File

@@ -0,0 +1,12 @@
{
"comments": false,
"compact": true,
"plugins": [
"transform-runtime"
],
"presets": [
"es2015",
"stage-0",
"react"
]
}

3
.bowerrc Normal file
View File

@@ -0,0 +1,3 @@
{
"directory": "dist/bower_components"
}

View File

@@ -11,7 +11,7 @@ root = true
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
trim_trailing_whitespaces = true
# CoffeeScript
#
@@ -28,12 +28,12 @@ indent_style = space
# Package.json
#
# This indentation style is the one used by npm.
[package.json]
[/package.json]
indent_size = 2
indent_style = space
# Pug (Jade)
[*.{jade,pug}]
# Jade
[*.jade]
indent_size = 2
indent_style = space
@@ -41,7 +41,7 @@ indent_style = space
#
# Two spaces seems to be the standard most common style, at least in
# Node.js (http://nodeguide.com/style.html#tabs-vs-spaces).
[*.{js,jsx,ts,tsx}]
[*.js]
indent_size = 2
indent_style = space

View File

@@ -1,27 +0,0 @@
module.exports = {
extends: ['standard', 'standard-jsx'],
globals: {
__DEV__: true,
$Dict: true,
$Diff: true,
$Exact: true,
$Keys: true,
$PropertyType: true,
$Shape: true,
},
parser: 'babel-eslint',
parserOptions: {
ecmaFeatures: {
legacyDecorators: true,
},
},
rules: {
'comma-dangle': ['error', 'always-multiline'],
indent: 'off',
'no-var': 'error',
'node/no-extraneous-import': 'error',
'node/no-extraneous-require': 'error',
'prefer-const': 'error',
'react/jsx-indent': 'off',
},
}

View File

@@ -1,16 +0,0 @@
[ignore]
<PROJECT_ROOT>/node_modules/.*
[include]
[libs]
[lints]
[options]
esproposal.decorators=ignore
esproposal.optional_chaining=enable
include_warnings=true
module.use_strict=true
[strict]

35
.gitignore vendored
View File

@@ -1,32 +1,9 @@
/_book/
/coverage/
/node_modules/
/lerna-debug.log
/lerna-debug.log.*
/@xen-orchestra/*/dist/
/@xen-orchestra/*/node_modules/
/packages/*/dist/
/packages/*/node_modules/
/packages/vhd-cli/src/commands/index.js
/packages/xen-api/examples/node_modules/
/packages/xen-api/plot.dat
/packages/xo-server/.xo-server.*
/packages/xo-server/src/api/index.js
/packages/xo-server/src/xapi/mixins/index.js
/packages/xo-server/src/xo-mixins/index.js
/packages/xo-server-auth-ldap/ldap.cache.conf
/packages/xo-web/src/common/intl/locales/index.js
/packages/xo-web/src/common/themes/index.js
/.nyc_output/
/bower_components/
/dist/
npm-debug.log
npm-debug.log.*
pnpm-debug.log
pnpm-debug.log.*
yarn-error.log
yarn-error.log.*
!node_modules/*
node_modules/*/

5
.mocha.js Normal file
View File

@@ -0,0 +1,5 @@
Error.stackTraceLimit = 100
try { require('trace') } catch (_) {}
try { require('clarify') } catch (_) {}
try { require('source-map-support/register') } catch (_) {}

1
.mocha.opts Normal file
View File

@@ -0,0 +1 @@
--require ./.mocha.js

View File

@@ -1,5 +0,0 @@
module.exports = {
semi: false,
singleQuote: true,
trailingComma: 'es5',
}

View File

@@ -1,24 +1,10 @@
language: node_js
node_js:
#- stable # disable for now due to an issue of indirect dep upath with Node 9
- 8
- 'stable'
- '4'
- '0.12'
- '0.10'
# Use containers.
# http://docs.travis-ci.com/user/workers/container-based-infrastructure/
sudo: false
addons:
apt:
packages:
- qemu-utils
- blktap-utils
- vmdk-stream-converter
before_install:
- curl -o- -L https://yarnpkg.com/install.sh | bash
- export PATH="$HOME/.yarn/bin:$PATH"
cache:
yarn: true
script:
- yarn run travis-tests

View File

@@ -1,3 +0,0 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)

View File

@@ -1,24 +0,0 @@
/benchmark/
/benchmarks/
*.bench.js
*.bench.js.map
/examples/
example.js
example.js.map
*.example.js
*.example.js.map
/fixture/
/fixtures/
*.fixture.js
*.fixture.js.map
*.fixtures.js
*.fixtures.js.map
/test/
/tests/
*.spec.js
*.spec.js.map
__snapshots__/

View File

@@ -1,49 +0,0 @@
# @xen-orchestra/async-map [![Build Status](https://travis-ci.org/vatesfr/xen-orchestra.png?branch=master)](https://travis-ci.org/vatesfr/xen-orchestra)
> ${pkg.description}
## Install
Installation of the [npm package](https://npmjs.org/package/@xen-orchestra/async-map):
```
> npm install --save @xen-orchestra/async-map
```
## Usage
**TODO**
## Development
```
# Install dependencies
> yarn
# Run the tests
> yarn test
# Continuously compile
> yarn dev
# Continuously run the tests
> yarn dev-test
# Build for production (automatically called by npm install)
> yarn build
```
## Contributions
Contributions are *very* welcomed, either on the documentation or on
the code.
You may:
- report any [issue](https://github.com/vatesfr/xen-orchestra/issues)
you've encountered;
- fork and create a pull request.
## License
ISC © [Vates SAS](https://vates.fr)

View File

@@ -1,50 +0,0 @@
{
"name": "@xen-orchestra/async-map",
"version": "0.0.0",
"license": "ISC",
"description": "",
"keywords": [],
"homepage": "https://github.com/vatesfr/xen-orchestra/tree/master/@xen-orchestra/async-map",
"bugs": "https://github.com/vatesfr/xen-orchestra/issues",
"repository": {
"type": "git",
"url": "https://github.com/vatesfr/xen-orchestra.git"
},
"author": {
"name": "Julien Fontanet",
"email": "julien.fontanet@isonoe.net"
},
"preferGlobal": false,
"main": "dist/",
"bin": {},
"files": [
"dist/"
],
"browserslist": [
">2%"
],
"engines": {
"node": ">=6"
},
"dependencies": {
"lodash": "^4.17.4"
},
"devDependencies": {
"@babel/cli": "^7.0.0",
"@babel/core": "^7.0.0",
"@babel/preset-env": "^7.0.0",
"@babel/preset-flow": "^7.0.0",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"rimraf": "^2.6.2"
},
"scripts": {
"build": "cross-env NODE_ENV=production babel --source-maps --out-dir=dist/ src/",
"clean": "rimraf dist/",
"dev": "cross-env NODE_ENV=development babel --watch --source-maps --out-dir=dist/ src/",
"prebuild": "yarn run clean",
"predev": "yarn run prebuild",
"prepare": "yarn run build",
"prepublishOnly": "yarn run build"
}
}

View File

@@ -1,43 +0,0 @@
// type MaybePromise<T> = Promise<T> | T
//
// declare export function asyncMap<T1, T2>(
// collection: MaybePromise<T1[]>,
// (T1, number) => MaybePromise<T2>
// ): Promise<T2[]>
// declare export function asyncMap<K, V1, V2>(
// collection: MaybePromise<{ [K]: V1 }>,
// (V1, K) => MaybePromise<V2>
// ): Promise<V2[]>
import map from 'lodash/map'
// Similar to map() + Promise.all() but waits for all promises to
// settle before rejecting (with the first error)
const asyncMap = (collection, iteratee) => {
let then
if (collection != null && typeof (then = collection.then) === 'function') {
return then.call(collection, collection => asyncMap(collection, iteratee))
}
let errorContainer
const onError = error => {
if (errorContainer === undefined) {
errorContainer = { error }
}
}
return Promise.all(
map(collection, (item, key, collection) =>
new Promise(resolve => {
resolve(iteratee(item, key, collection))
}).catch(onError)
)
).then(values => {
if (errorContainer !== undefined) {
throw errorContainer.error
}
return values
})
}
export { asyncMap as default }
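As an aside (not part of the diff), a minimal usage sketch of this helper, assuming the package is consumed under its published name; the input values are made up.

```js
import asyncMap from '@xen-orchestra/async-map'

// Like map() + Promise.all(), but every promise settles before a rejection
// (with the first error) is propagated.
asyncMap([1, 2, 3], async n => n * 2).then(
  values => console.log(values), // [ 2, 4, 6 ]
  error => console.error(error)
)

// Plain objects work too: the iteratee receives (value, key).
asyncMap({ a: 1, b: 2 }, (value, key) => `${key}=${value}`).then(values => {
  console.log(values) // [ 'a=1', 'b=2' ]
})
```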

View File

@@ -1,67 +0,0 @@
'use strict'
const PLUGINS_RE = /^(?:@babel\/|babel-)plugin-.+$/
const PRESETS_RE = /^@babel\/preset-.+$/
const NODE_ENV = process.env.NODE_ENV || 'development'
const __PROD__ = NODE_ENV === 'production'
const __TEST__ = NODE_ENV === 'test'
const configs = {
'@babel/plugin-proposal-decorators': {
legacy: true,
},
'@babel/plugin-proposal-pipeline-operator': {
proposal: 'minimal',
},
'@babel/preset-env' (pkg) {
return {
debug: !__TEST__,
// disabled until https://github.com/babel/babel/issues/8323 is resolved
// loose: true,
shippedProposals: true,
targets: (() => {
let node = (pkg.engines || {}).node
if (node !== undefined) {
const trimChars = '^=>~'
while (trimChars.includes(node[0])) {
node = node.slice(1)
}
}
return { browsers: pkg.browserslist, node }
})(),
useBuiltIns: '@babel/polyfill' in (pkg.dependencies || {}) && 'usage',
}
},
}
const getConfig = (key, ...args) => {
const config = configs[key]
return config === undefined
? {}
: typeof config === 'function'
? config(...args)
: config
}
module.exports = function (pkg, plugins, presets) {
plugins === undefined && (plugins = {})
presets === undefined && (presets = {})
Object.keys(pkg.devDependencies || {}).forEach(name => {
if (!(name in presets) && PLUGINS_RE.test(name)) {
plugins[name] = getConfig(name, pkg)
} else if (!(name in presets) && PRESETS_RE.test(name)) {
presets[name] = getConfig(name, pkg)
}
})
return {
comments: !__PROD__,
ignore: __TEST__ ? undefined : [/\.spec\.js$/],
plugins: Object.keys(plugins).map(plugin => [plugin, plugins[plugin]]),
presets: Object.keys(presets).map(preset => [preset, presets[preset]]),
}
}

View File

@@ -1,11 +0,0 @@
{
"private": true,
"name": "@xen-orchestra/babel-config",
"version": "0.0.0",
"homepage": "https://github.com/vatesfr/xen-orchestra/tree/master/@xen-orchestra/babel-config",
"bugs": "https://github.com/vatesfr/xen-orchestra/issues",
"repository": {
"type": "git",
"url": "https://github.com/vatesfr/xen-orchestra.git"
}
}

View File

@@ -1,117 +0,0 @@
#!/usr/bin/env node
const defer = require('golike-defer').default
const { NULL_REF, Xapi } = require('xen-api')
const pkg = require('./package.json')
Xapi.prototype.getVmDisks = async function (vm) {
const disks = { __proto__: null }
await Promise.all([
...vm.VBDs.map(async vbdRef => {
const vbd = await this.getRecord('VBD', vbdRef)
let vdiRef
if (vbd.type === 'Disk' && (vdiRef = vbd.VDI) !== NULL_REF) {
disks[vbd.userdevice] = await this.getRecord('VDI', vdiRef)
}
}),
])
return disks
}
defer(async function main ($defer, args) {
if (args.length === 0 || args.includes('-h') || args.includes('--help')) {
const cliName = Object.keys(pkg.bin)[0]
return console.error(
'%s',
`
Usage: ${cliName} <source XAPI URL> <source snapshot UUID> <target XAPI URL> <target VM UUID> <backup job id> <backup schedule id>
${cliName} v${pkg.version}
`
)
}
const [
srcXapiUrl,
srcSnapshotUuid,
tgtXapiUrl,
tgtVmUuid,
jobId,
scheduleId,
] = args
const srcXapi = new Xapi({
allowUnauthorized: true,
url: srcXapiUrl,
watchEvents: false,
})
await srcXapi.connect()
defer.call(srcXapi, 'disconnect')
const tgtXapi = new Xapi({
allowUnauthorized: true,
url: tgtXapiUrl,
watchEvents: false,
})
await tgtXapi.connect()
defer.call(tgtXapi, 'disconnect')
const [srcSnapshot, tgtVm] = await Promise.all([
srcXapi.getRecordByUuid('VM', srcSnapshotUuid),
tgtXapi.getRecordByUuid('VM', tgtVmUuid),
])
const srcVm = await srcXapi.getRecord('VM', srcSnapshot.snapshot_of)
const metadata = {
'xo:backup:datetime': srcSnapshot.snapshot_time,
'xo:backup:job': jobId,
'xo:backup:schedule': scheduleId,
'xo:backup:vm': srcVm.uuid,
}
const [srcDisks, tgtDisks] = await Promise.all([
srcXapi.getVmDisks(srcSnapshot),
tgtXapi.getVmDisks(tgtVm),
])
const userDevices = Object.keys(tgtDisks)
const tgtSr = await tgtXapi.getRecord(
'SR',
tgtDisks[Object.keys(tgtDisks)[0]].SR
)
await Promise.all([
srcXapi.setFieldEntries(srcSnapshot, 'other_config', metadata),
srcXapi.setFieldEntries(srcSnapshot, 'other_config', {
'xo:backup:exported': 'true',
}),
tgtXapi.setField(
tgtVm,
'name_label',
`${srcVm.name_label} (${srcSnapshot.snapshot_time})`
),
tgtXapi.setFieldEntries(tgtVm, 'other_config', metadata),
tgtXapi.setFieldEntries(tgtVm, 'other_config', {
'xo:backup:sr': tgtSr.uuid,
'xo:copy_of': srcSnapshotUuid,
}),
tgtXapi.setFieldEntries(tgtVm, 'blocked_operations', {
start:
'Start operation for this vm is blocked, clone it if you want to use it.',
}),
Promise.all(
userDevices.map(userDevice => {
const srcDisk = srcDisks[userDevice]
const tgtDisk = tgtDisks[userDevice]
return tgtXapi.setFieldEntry(
tgtDisk,
'other_config',
'xo:copy_of',
srcDisk.uuid
)
})
),
])
})(process.argv.slice(2)).catch(console.error.bind(console, 'Fatal error:'))

View File

@@ -1,20 +0,0 @@
{
"name": "@xen-orchestra/cr-seed-cli",
"version": "0.2.0",
"homepage": "https://github.com/vatesfr/xen-orchestra/tree/master/@xen-orchestra/cr-seed-cli",
"bugs": "https://github.com/vatesfr/xen-orchestra/issues",
"repository": {
"type": "git",
"url": "https://github.com/vatesfr/xen-orchestra.git"
},
"engines": {
"node": ">=8"
},
"bin": {
"xo-cr-seed": "./index.js"
},
"dependencies": {
"golike-defer": "^0.4.1",
"xen-api": "^0.20.0"
}
}

View File

@@ -1,3 +0,0 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)

View File

@@ -1,24 +0,0 @@
/benchmark/
/benchmarks/
*.bench.js
*.bench.js.map
/examples/
example.js
example.js.map
*.example.js
*.example.js.map
/fixture/
/fixtures/
*.fixture.js
*.fixture.js.map
*.fixtures.js
*.fixtures.js.map
/test/
/tests/
*.spec.js
*.spec.js.map
__snapshots__/

View File

@@ -1,145 +0,0 @@
# @xen-orchestra/cron [![Build Status](https://travis-ci.org/vatesfr/xen-orchestra.png?branch=master)](https://travis-ci.org/vatesfr/xen-orchestra)
> Focused, well maintained, cron parser/scheduler
## Install
Installation of the [npm package](https://npmjs.org/package/@xen-orchestra/cron):
```
> npm install --save @xen-orchestra/cron
```
### Pattern syntax
```
<minute> <hour> <day of month> <month> <day of week>
```
Each entry can be:
- a single value
- a range (`0-23` or `*/2`)
- a list of values/ranges (`1,8-12`)
A wildcard (`*`) can be used as a shortcut for the whole range
(`first-last`).
Step values can be used in conjunction with ranges. For instance,
`1-7/2` is the same as `1,3,5,7`.
| Field | Allowed values |
|------------------|----------------|
| minute | 0-59 |
| hour | 0-23 |
| day of the month | 1-31 or 3-letter names (`jan`, `feb`, …) |
| month | 0-11 |
| day of week | 0-7 (0 and 7 both mean Sunday) or 3-letter names (`mon`, `tue`, …) |
> Note: the month range is 0-11 to be compatible with
> [cron](https://github.com/kelektiv/node-cron), it does not appear to
> be very standard though.
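As an aside (not part of this file), a small sketch combining the ranges, steps and aliases described above; the pattern is made up and `createSchedule` is documented in the API section below.

```js
import { createSchedule } from '@xen-orchestra/cron'

// every 15 minutes, from 08:00 to 18:00, Monday to Friday, in the local timezone
const schedule = createSchedule('*/15 8-18 * * mon-fri', 'local')
console.log(schedule.next(3))
```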
### API
`createSchedule(pattern: string, zone: string = 'utc'): Schedule`
> Create a new schedule.
- `pattern`: the pattern to use, see [the syntax](#pattern-syntax)
- `zone`: the timezone to use, use `'local'` for the local timezone
```js
import { createSchedule } from '@xen-orchestra/cron'
const schedule = createSchedule('0 0 * * sun', 'America/New_York')
```
`Schedule#createJob(fn: Function): Job`
> Create a new job from this schedule.
- `fn`: function to execute, if it returns a promise, it will be
awaited before scheduling the next run.
```js
const job = schedule.createJob(() => {
console.log(new Date())
})
```
`Schedule#next(n: number): Array<Date>`
> Returns the next dates matching this schedule.
- `n`: number of dates to return
```js
schedule.next(2)
// [ 2018-02-11T05:00:00.000Z, 2018-02-18T05:00:00.000Z ]
```
`Schedule#startJob(fn: Function): () => void`
> Start a new job from this schedule and return a function to stop it.
- `fn`: function to execute, if it returns a promise, it will be
awaited before scheduling the next run.
```js
const stopJob = schedule.startJob(() => {
console.log(new Date())
})
stopJob()
```
`Job#start(): void`
> Start this job.
```js
job.start()
```
`Job#stop(): void`
> Stop this job.
```js
job.stop()
```
## Development
```
# Install dependencies
> yarn
# Run the tests
> yarn test
# Continuously compile
> yarn dev
# Continuously run the tests
> yarn dev-test
# Build for production (automatically called by npm install)
> yarn build
```
## Contributions
Contributions are *very* welcomed, either on the documentation or on
the code.
You may:
- report any [issue](https://github.com/vatesfr/xen-orchestra/issues)
you've encountered;
- fork and create a pull request.
## License
ISC © [Vates SAS](https://vates.fr)

View File

@@ -1,59 +0,0 @@
{
"name": "@xen-orchestra/cron",
"version": "1.0.3",
"license": "ISC",
"description": "Focused, well maintained, cron parser/scheduler",
"keywords": [
"cron",
"cronjob",
"crontab",
"job",
"parser",
"pattern",
"schedule",
"scheduling",
"task"
],
"homepage": "https://github.com/vatesfr/xen-orchestra/tree/master/@xen-orchestra/cron",
"bugs": "https://github.com/vatesfr/xen-orchestra/issues",
"repository": {
"type": "git",
"url": "https://github.com/vatesfr/xen-orchestra.git"
},
"author": {
"name": "Julien Fontanet",
"email": "julien.fontanet@isonoe.net"
},
"preferGlobal": false,
"main": "dist/",
"bin": {},
"files": [
"dist/"
],
"browserslist": [
">2%"
],
"engines": {
"node": ">=6"
},
"dependencies": {
"lodash": "^4.17.4",
"moment-timezone": "^0.5.14"
},
"devDependencies": {
"@babel/cli": "^7.0.0",
"@babel/core": "^7.0.0",
"@babel/preset-env": "^7.0.0",
"@babel/preset-flow": "^7.0.0",
"cross-env": "^5.1.3",
"rimraf": "^2.6.2"
},
"scripts": {
"build": "cross-env NODE_ENV=production babel --source-maps --out-dir=dist/ src/",
"clean": "rimraf dist/",
"dev": "cross-env NODE_ENV=development babel --watch --source-maps --out-dir=dist/ src/",
"prebuild": "yarn run clean",
"predev": "yarn run clean",
"prepublishOnly": "yarn run build"
}
}

View File

@@ -1,83 +0,0 @@
import moment from 'moment-timezone'
import next from './next'
import parse from './parse'
const MAX_DELAY = 2 ** 31 - 1
class Job {
constructor (schedule, fn) {
const wrapper = () => {
let result
try {
result = fn()
} catch (_) {
// catch any thrown value to ensure it does not break the job
}
let then
if (result != null && typeof (then = result.then) === 'function') {
then.call(result, scheduleNext, scheduleNext)
} else {
scheduleNext()
}
}
const scheduleNext = () => {
const delay = schedule._nextDelay()
this._timeout =
delay < MAX_DELAY
? setTimeout(wrapper, delay)
: setTimeout(scheduleNext, MAX_DELAY)
}
this._scheduleNext = scheduleNext
this._timeout = undefined
}
start () {
this.stop()
this._scheduleNext()
}
stop () {
clearTimeout(this._timeout)
}
}
class Schedule {
constructor (pattern, zone = 'utc') {
this._schedule = parse(pattern)
this._createDate =
zone.toLowerCase() === 'utc'
? moment.utc
: zone === 'local'
? moment
: () => moment.tz(zone)
}
createJob (fn) {
return new Job(this, fn)
}
next (n) {
const dates = new Array(n)
const schedule = this._schedule
let date = this._createDate()
for (let i = 0; i < n; ++i) {
dates[i] = (date = next(schedule, date)).toDate()
}
return dates
}
_nextDelay () {
const now = this._createDate()
return next(this._schedule, now) - now
}
startJob (fn) {
const job = this.createJob(fn)
job.start()
return job.stop.bind(job)
}
}
export const createSchedule = (...args) => new Schedule(...args)

View File

@@ -1,89 +0,0 @@
import moment from 'moment-timezone'
import sortedIndex from 'lodash/sortedIndex'
const NEXT_MAPPING = {
month: { year: 1 },
date: { month: 1 },
day: { week: 1 },
hour: { day: 1 },
minute: { hour: 1 },
}
const getFirst = values => (values !== undefined ? values[0] : 0)
const setFirstAvailable = (date, unit, values) => {
if (values === undefined) {
return
}
const curr = date.get(unit)
const next = values[sortedIndex(values, curr) % values.length]
if (curr === next) {
return
}
const timestamp = +date
date.set(unit, next)
if (timestamp > +date) {
date.add(NEXT_MAPPING[unit])
}
return true
}
// returns the next run, after the passed date
export default (schedule, fromDate) => {
let date = moment(fromDate)
.set({
second: 0,
millisecond: 0,
})
.add({ minute: 1 })
const { minute, hour, dayOfMonth, month, dayOfWeek } = schedule
setFirstAvailable(date, 'minute', minute)
if (setFirstAvailable(date, 'hour', hour)) {
date.set('minute', getFirst(minute))
}
let loop
let i = 0
do {
loop = false
if (setFirstAvailable(date, 'month', month)) {
date.set({
date: 1,
hour: getFirst(hour),
minute: getFirst(minute),
})
}
let newDate = date.clone()
if (dayOfMonth === undefined) {
if (dayOfWeek !== undefined) {
setFirstAvailable(newDate, 'day', dayOfWeek)
}
} else if (dayOfWeek === undefined) {
setFirstAvailable(newDate, 'date', dayOfMonth)
} else {
const dateDay = newDate.clone()
setFirstAvailable(dateDay, 'date', dayOfMonth)
setFirstAvailable(newDate, 'day', dayOfWeek)
newDate = moment.min(dateDay, newDate)
}
if (+date !== +newDate) {
loop = date.month() !== newDate.month()
date = newDate.set({
hour: getFirst(hour),
minute: getFirst(minute),
})
}
} while (loop && ++i < 5)
if (loop) {
throw new Error('no solutions found for this schedule')
}
return date
}

View File

@@ -1,48 +0,0 @@
/* eslint-env jest */
import mapValues from 'lodash/mapValues'
import moment from 'moment-timezone'
import next from './next'
import parse from './parse'
const N = (pattern, fromDate = '2018-04-09T06:25') => {
const iso = next(parse(pattern), moment.utc(fromDate)).toISOString()
return iso.slice(0, iso.lastIndexOf(':'))
}
describe('next()', () => {
mapValues(
{
minutely: ['* * * * *', '2018-04-09T06:26'],
hourly: ['@hourly', '2018-04-09T07:00'],
daily: ['@daily', '2018-04-10T00:00'],
monthly: ['@monthly', '2018-05-01T00:00'],
yearly: ['@yearly', '2019-01-01T00:00'],
weekly: ['@weekly', '2018-04-15T00:00'],
},
([pattern, result], title) =>
it(title, () => {
expect(N(pattern)).toBe(result)
})
)
it('select first between month-day and week-day', () => {
expect(N('* * 10 * wen')).toBe('2018-04-10T00:00')
expect(N('* * 12 * wen')).toBe('2018-04-11T00:00')
})
it('select the last available day of a month', () => {
expect(N('* * 29 feb *')).toBe('2020-02-29T00:00')
})
it('fails when no solutions has been found', () => {
expect(() => N('0 0 30 feb *')).toThrow(
'no solutions found for this schedule'
)
})
it('select the first sunday of the month', () => {
expect(N('* * * * 0', '2018-03-31T00:00')).toBe('2018-04-01T00:00')
})
})

View File

@@ -1,193 +0,0 @@
const compareNumbers = (a, b) => a - b
const createParser = ({ fields: [...fields], presets: { ...presets } }) => {
const m = fields.length
for (let j = 0; j < m; ++j) {
const field = fields[j]
let { aliases } = field
if (aliases !== undefined) {
let symbols = aliases
if (Array.isArray(aliases)) {
aliases = {}
const [start] = field.range
symbols.forEach((alias, i) => {
aliases[alias] = start + i
})
} else {
symbols = Object.keys(aliases)
}
fields[j] = {
...field,
aliases,
aliasesRegExp: new RegExp(symbols.join('|'), 'y'),
}
}
}
let field, i, n, pattern, schedule, values
const isDigit = c => c >= '0' && c <= '9'
const match = c => pattern[i] === c && (++i, true)
const consumeWhitespaces = () => {
let c
while ((c = pattern[i]) === ' ' || c === '\t') {
++i
}
}
const parseInteger = () => {
let c
const digits = []
while (isDigit((c = pattern[i]))) {
++i
digits.push(c)
}
if (digits.length === 0) {
throw new SyntaxError(`${field.name}: missing integer at character ${i}`)
}
return Number.parseInt(digits.join(''), 10)
}
const parseValue = () => {
let value
const { aliasesRegExp } = field
if (aliasesRegExp === undefined || isDigit(pattern[i])) {
value = parseInteger()
const { post } = field
if (post !== undefined) {
value = post(value)
}
} else {
aliasesRegExp.lastIndex = i
const matches = aliasesRegExp.exec(pattern)
if (matches === null) {
throw new SyntaxError(
`${field.name}: missing alias or integer at character ${i}`
)
}
const [alias] = matches
i += alias.length
value = field.aliases[alias]
}
const { range } = field
if (value < range[0] || value > range[1]) {
throw new SyntaxError(
`${field.name}: ${value} is not between ${range[0]} and ${range[1]}`
)
}
return value
}
const parseRange = () => {
let end, start, step
if (match('*')) {
if (!match('/')) {
return
}
;[start, end] = field.range
step = parseInteger()
} else {
start = parseValue()
if (!match('-')) {
values.add(start)
return
}
end = parseValue()
step = match('/') ? parseInteger() : 1
}
for (let i = start; i <= end; i += step) {
values.add(i)
}
}
const parseSequence = () => {
do {
parseRange()
} while (match(','))
}
const parse = p => {
{
const schedule = presets[p]
if (schedule !== undefined) {
return typeof schedule === 'string'
? (presets[p] = parse(schedule))
: schedule
}
}
try {
i = 0
n = p.length
pattern = p
schedule = {}
for (let j = 0; j < m; ++j) {
consumeWhitespaces()
field = fields[j]
values = new Set()
parseSequence()
if (values.size !== 0) {
schedule[field.name] = Array.from(values).sort(compareNumbers)
}
}
consumeWhitespaces()
if (i !== n) {
throw new SyntaxError(
`unexpected character at offset ${i}, expected end`
)
}
return schedule
} finally {
field = pattern = schedule = values = undefined
}
}
return parse
}
export default createParser({
fields: [
{
name: 'minute',
range: [0, 59],
},
{
name: 'hour',
range: [0, 23],
},
{
name: 'dayOfMonth',
range: [1, 31],
},
{
aliases: 'jan feb mar apr may jun jul aug sep oct nov dec'.split(' '),
name: 'month',
range: [0, 11],
},
{
aliases: 'sun mon tue wen thu fri sat'.split(' '),
name: 'dayOfWeek',
post: value => (value === 7 ? 0 : value),
range: [0, 6],
},
],
presets: {
'@annually': '0 0 1 jan *',
'@daily': '0 0 * * *',
'@hourly': '0 * * * *',
'@monthly': '0 0 1 * *',
'@weekly': '0 0 * * sun',
'@yearly': '0 0 1 jan *',
},
})

View File

@@ -1,49 +0,0 @@
/* eslint-env jest */
import parse from './parse'
describe('parse()', () => {
it('works', () => {
expect(parse('0 0-10 */10 jan,2,4-11/3 *')).toEqual({
minute: [0],
hour: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
dayOfMonth: [1, 11, 21, 31],
month: [0, 2, 4, 7, 10],
})
})
it('correctly parse months', () => {
expect(parse('* * * 0,11 *')).toEqual({
month: [0, 11],
})
expect(parse('* * * jan,dec *')).toEqual({
month: [0, 11],
})
})
it('correctly parse days', () => {
expect(parse('* * * * mon,sun')).toEqual({
dayOfWeek: [0, 1],
})
})
it('reports missing integer', () => {
expect(() => parse('*/a')).toThrow('minute: missing integer at character 2')
expect(() => parse('*')).toThrow('hour: missing integer at character 1')
})
it('reports invalid aliases', () => {
expect(() => parse('* * * jan-foo *')).toThrow(
'month: missing alias or integer at character 10'
)
})
it('dayOfWeek: 0 and 7 bind to sunday', () => {
expect(parse('* * * * 0')).toEqual({
dayOfWeek: [0],
})
expect(parse('* * * * 7')).toEqual({
dayOfWeek: [0],
})
})
})

View File

@@ -1,3 +0,0 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)

View File

@@ -1,24 +0,0 @@
/benchmark/
/benchmarks/
*.bench.js
*.bench.js.map
/examples/
example.js
example.js.map
*.example.js
*.example.js.map
/fixture/
/fixtures/
*.fixture.js
*.fixture.js.map
*.fixtures.js
*.fixtures.js.map
/test/
/tests/
*.spec.js
*.spec.js.map
__snapshots__/

View File

@@ -1,49 +0,0 @@
# ${pkg.name} [![Build Status](https://travis-ci.org/${pkg.shortGitHubPath}.png?branch=master)](https://travis-ci.org/${pkg.shortGitHubPath})
> ${pkg.description}
## Install
Installation of the [npm package](https://npmjs.org/package/${pkg.name}):
```
> npm install --save ${pkg.name}
```
## Usage
**TODO**
## Development
```
# Install dependencies
> yarn
# Run the tests
> yarn test
# Continuously compile
> yarn dev
# Continuously run the tests
> yarn dev-test
# Build for production (automatically called by npm install)
> yarn build
```
## Contributions
Contributions are *very* welcomed, either on the documentation or on
the code.
You may:
- report any [issue](${pkg.bugs})
you've encountered;
- fork and create a pull request.
## License
${pkg.license} © [${pkg.author.name}](${pkg.author.url})

View File

@@ -1,47 +0,0 @@
{
"name": "@xen-orchestra/defined",
"version": "0.0.0",
"license": "ISC",
"description": "",
"keywords": [],
"homepage": "https://github.com/vatesfr/xen-orchestra/tree/master/@xen-orchestra/defined",
"bugs": "https://github.com/vatesfr/xen-orchestra/issues",
"repository": {
"type": "git",
"url": "https://github.com/vatesfr/xen-orchestra.git"
},
"author": {
"name": "Julien Fontanet",
"email": "julien.fontanet@vates.fr"
},
"preferGlobal": false,
"main": "dist/",
"bin": {},
"files": [
"dist/"
],
"browserslist": [
">2%"
],
"engines": {
"node": ">=6"
},
"dependencies": {},
"devDependencies": {
"@babel/cli": "^7.0.0",
"@babel/core": "^7.0.0",
"@babel/preset-env": "^7.0.0",
"@babel/preset-flow": "^7.0.0",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"rimraf": "^2.6.2"
},
"scripts": {
"build": "cross-env NODE_ENV=production babel --source-maps --out-dir=dist/ src/",
"clean": "rimraf dist/",
"dev": "cross-env NODE_ENV=development babel --watch --source-maps --out-dir=dist/ src/",
"prebuild": "yarn run clean",
"predev": "yarn run prebuild",
"prepublishOnly": "yarn run build"
}
}

View File

@@ -1,64 +0,0 @@
// @flow
// Usage:
//
// ```js
// const httpProxy = defined(
// process.env.HTTP_PROXY,
// process.env.http_proxy
// )
//
// const httpProxy = defined([
// process.env.HTTP_PROXY,
// process.env.http_proxy
// ])
// ```
export default function defined () {
let args = arguments
let n = args.length
if (n === 1) {
args = arguments[0]
n = args.length
}
for (let i = 0; i < n; ++i) {
let arg = arguments[i]
if (typeof arg === 'function') {
arg = get(arg)
}
if (arg !== undefined) {
return arg
}
}
}
// Usage:
//
// ```js
// const friendName = get(() => props.user.friends[0].name)
//
// // this form can be used to avoid recreating functions:
// const getFriendName = _ => _.friends[0].name
// const friendName = get(getFriendName, props.user)
// ```
export const get = (accessor: (input: ?any) => any, arg: ?any) => {
try {
return accessor(arg)
} catch (error) {
if (!(error instanceof TypeError)) {
// avoid hiding other errors
throw error
}
}
}
// Usage:
//
// ```js
// const httpAgent = ifDef(
// process.env.HTTP_PROXY,
// _ => new ProxyAgent(_)
// )
// ```
export const ifDef = (value: ?any, thenFn: (value: any) => any) =>
value !== undefined ? thenFn(value) : value

View File

@@ -1,3 +0,0 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)

View File

@@ -1,24 +0,0 @@
/benchmark/
/benchmarks/
*.bench.js
*.bench.js.map
/examples/
example.js
example.js.map
*.example.js
*.example.js.map
/fixture/
/fixtures/
*.fixture.js
*.fixture.js.map
*.fixtures.js
*.fixtures.js.map
/test/
/tests/
*.spec.js
*.spec.js.map
__snapshots__/

View File

@@ -1,71 +0,0 @@
# @xen-orchestra/emit-async [![Build Status](https://travis-ci.org/${pkg.shortGitHubPath}.png?branch=master)](https://travis-ci.org/${pkg.shortGitHubPath})
> ${pkg.description}
## Install
Installation of the [npm package](https://npmjs.org/package/@xen-orchestra/emit-async):
```
> npm install --save @xen-orchestra/emit-async
```
## Usage
```js
import EE from 'events'
import emitAsync from '@xen-orchestra/emit-async'
const ee = new EE()
ee.emitAsync = emitAsync
ee.on('start', async function () {
// whatever
})
// similar to EventEmitter#emit() but returns a promise which resolves when all
// listeners have resolved
await ee.emitAsync('start')
// by default, it will reject as soon as one listener rejects; you can customise
// error handling though:
await ee.emitAsync({
onError (error) {
console.warn(error)
}
}, 'start')
```
## Development
```
# Install dependencies
> yarn
# Run the tests
> yarn test
# Continuously compile
> yarn dev
# Continuously run the tests
> yarn dev-test
# Build for production (automatically called by npm install)
> yarn build
```
## Contributions
Contributions are *very* welcomed, either on the documentation or on
the code.
You may:
- report any [issue](${pkg.bugs})
you've encountered;
- fork and create a pull request.
## License
${pkg.license} © [${pkg.author.name}](${pkg.author.url})

View File

@@ -1,46 +0,0 @@
{
"name": "@xen-orchestra/emit-async",
"version": "0.0.0",
"license": "ISC",
"description": "",
"keywords": [],
"homepage": "https://github.com/vatesfr/xen-orchestra/tree/master/@xen-orchestra/emit-async",
"bugs": "https://github.com/vatesfr/xen-orchestra/issues",
"repository": {
"type": "git",
"url": "https://github.com/vatesfr/xen-orchestra.git"
},
"author": {
"name": "Julien Fontanet",
"email": "julien.fontanet@vates.fr"
},
"preferGlobal": false,
"main": "dist/",
"bin": {},
"files": [
"dist/"
],
"browserslist": [
">2%"
],
"engines": {
"node": ">=6"
},
"dependencies": {},
"devDependencies": {
"@babel/cli": "^7.0.0",
"@babel/core": "^7.0.0",
"@babel/preset-env": "^7.0.0",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"rimraf": "^2.6.2"
},
"scripts": {
"build": "cross-env NODE_ENV=production babel --source-maps --out-dir=dist/ src/",
"clean": "rimraf dist/",
"dev": "cross-env NODE_ENV=development babel --watch --source-maps --out-dir=dist/ src/",
"prebuild": "yarn run clean",
"predev": "yarn run prebuild",
"prepublishOnly": "yarn run build"
}
}

View File

@@ -1,26 +0,0 @@
export default function emitAsync (event) {
let opts
let i = 1
// an option object has been passed as first param
if (typeof event !== 'string') {
opts = event
event = arguments[i++]
}
const n = arguments.length - i
const args = new Array(n)
for (let j = 0; j < n; ++j) {
args[j] = arguments[j + i]
}
const onError = opts != null && opts.onError
return Promise.all(
this.listeners(event).map(listener =>
new Promise(resolve => {
resolve(listener.apply(this, args))
}).catch(onError)
)
)
}

View File

@@ -1,3 +0,0 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)

View File

@@ -1,52 +0,0 @@
{
"name": "@xen-orchestra/fs",
"version": "0.4.1",
"license": "AGPL-3.0",
"description": "The File System for Xen Orchestra backups.",
"keywords": [],
"homepage": "https://github.com/vatesfr/xen-orchestra/tree/master/@xen-orchestra/fs",
"bugs": "https://github.com/vatesfr/xen-orchestra/issues",
"repository": {
"type": "git",
"url": "https://github.com/vatesfr/xen-orchestra.git"
},
"preferGlobal": true,
"main": "dist/",
"bin": {},
"files": [
"dist/"
],
"engines": {
"node": ">=6"
},
"dependencies": {
"@marsaud/smb2": "^0.9.0",
"execa": "^1.0.0",
"fs-extra": "^7.0.0",
"get-stream": "^4.0.0",
"lodash": "^4.17.4",
"promise-toolbox": "^0.11.0",
"through2": "^2.0.3",
"tmp": "^0.0.33",
"xo-remote-parser": "^0.5.0"
},
"devDependencies": {
"@babel/cli": "^7.0.0",
"@babel/core": "^7.0.0",
"@babel/plugin-proposal-function-bind": "^7.0.0",
"@babel/preset-env": "^7.0.0",
"@babel/preset-flow": "^7.0.0",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"index-modules": "^0.3.0",
"rimraf": "^2.6.2"
},
"scripts": {
"build": "cross-env NODE_ENV=production babel --source-maps --out-dir=dist/ src/",
"clean": "rimraf dist/",
"dev": "cross-env NODE_ENV=development babel --watch --source-maps --out-dir=dist/ src/",
"prebuild": "yarn run clean",
"predev": "yarn run clean",
"prepare": "yarn run build"
}
}

View File

@@ -1,324 +0,0 @@
// @flow
import getStream from 'get-stream'
import { randomBytes } from 'crypto'
import { fromCallback, fromEvent, ignoreErrors, timeout } from 'promise-toolbox'
import { type Readable, type Writable } from 'stream'
import { parse } from 'xo-remote-parser'
import { createChecksumStream, validChecksumOfReadStream } from './checksum'
type Data = Buffer | Readable | string
type FileDescriptor = {| fd: mixed, path: string |}
type LaxReadable = Readable & Object
type LaxWritable = Writable & Object
type File = FileDescriptor | string
const checksumFile = file => file + '.checksum'
const DEFAULT_TIMEOUT = 6e5 // 10 min
export default class RemoteHandlerAbstract {
_remote: Object
_timeout: number
constructor (remote: any, options: Object = {}) {
if (remote.url === 'test://') {
this._remote = remote
} else {
this._remote = { ...remote, ...parse(remote.url) }
if (this._remote.type !== this.type) {
throw new Error('Incorrect remote type')
}
}
;({ timeout: this._timeout = DEFAULT_TIMEOUT } = options)
}
get type (): string {
throw new Error('Not implemented')
}
/**
* Asks the handler to sync the state of the effective remote with its metadata
*/
async sync (): Promise<mixed> {
return this._sync()
}
async _sync (): Promise<mixed> {
throw new Error('Not implemented')
}
/**
* Free the resources possibly dedicated to putting the remote at work, when it is no longer needed
*/
async forget (): Promise<void> {
await this._forget()
}
async _forget (): Promise<void> {
throw new Error('Not implemented')
}
async test (): Promise<Object> {
const testFileName = `${Date.now()}.test`
const data = await fromCallback(cb => randomBytes(1024 * 1024, cb))
let step = 'write'
try {
await this.outputFile(testFileName, data)
step = 'read'
const read = await this.readFile(testFileName)
if (data.compare(read) !== 0) {
throw new Error('output and input did not match')
}
return {
success: true,
}
} catch (error) {
return {
success: false,
step,
file: testFileName,
error: error.message || String(error),
}
} finally {
ignoreErrors.call(this.unlink(testFileName))
}
}
async outputFile (file: string, data: Data, options?: Object): Promise<void> {
return this._outputFile(file, data, {
flags: 'wx',
...options,
})
}
async _outputFile (file: string, data: Data, options?: Object): Promise<void> {
const stream = await this.createOutputStream(file, options)
const promise = fromEvent(stream, 'finish')
stream.end(data)
await promise
}
async read (
file: File,
buffer: Buffer,
position?: number
): Promise<{| bytesRead: number, buffer: Buffer |}> {
return this._read(file, buffer, position)
}
_read (
file: File,
buffer: Buffer,
position?: number
): Promise<{| bytesRead: number, buffer: Buffer |}> {
throw new Error('Not implemented')
}
async readFile (file: string, options?: Object): Promise<Buffer> {
return this._readFile(file, options)
}
_readFile (file: string, options?: Object): Promise<Buffer> {
return this.createReadStream(file, options).then(getStream.buffer)
}
async rename (
oldPath: string,
newPath: string,
{ checksum = false }: Object = {}
) {
let p = timeout.call(this._rename(oldPath, newPath), this._timeout)
if (checksum) {
p = Promise.all([
p,
this._rename(checksumFile(oldPath), checksumFile(newPath)),
])
}
return p
}
async _rename (oldPath: string, newPath: string) {
throw new Error('Not implemented')
}
async list (
dir: string = '.',
{
filter,
prependDir = false,
}: { filter?: (name: string) => boolean, prependDir?: boolean } = {}
): Promise<string[]> {
let entries = await timeout.call(this._list(dir), this._timeout)
if (filter !== undefined) {
entries = entries.filter(filter)
}
if (prependDir) {
entries.forEach((entry, i) => {
entries[i] = dir + '/' + entry
})
}
return entries
}
async _list (dir: string): Promise<string[]> {
throw new Error('Not implemented')
}
createReadStream (
file: string,
{ checksum = false, ignoreMissingChecksum = false, ...options }: Object = {}
): Promise<LaxReadable> {
const path = typeof file === 'string' ? file : file.path
const streamP = timeout
.call(this._createReadStream(file, options), this._timeout)
.then(stream => {
// detect early errors
let promise = fromEvent(stream, 'readable')
// try to add the length prop if missing and not a range stream
if (
stream.length === undefined &&
options.end === undefined &&
options.start === undefined
) {
promise = Promise.all([
promise,
ignoreErrors.call(
this.getSize(file).then(size => {
stream.length = size
})
),
])
}
return promise.then(() => stream)
})
if (!checksum) {
return streamP
}
// avoid an unhandled rejection warning
ignoreErrors.call(streamP)
return this.readFile(checksumFile(path)).then(
checksum =>
streamP.then(stream => {
const { length } = stream
stream = (validChecksumOfReadStream(
stream,
String(checksum).trim()
): LaxReadable)
stream.length = length
return stream
}),
error => {
if (ignoreMissingChecksum && error && error.code === 'ENOENT') {
return streamP
}
throw error
}
)
}
async _createReadStream (
file: string,
options?: Object
): Promise<LaxReadable> {
throw new Error('Not implemented')
}
async openFile (path: string, flags?: string): Promise<FileDescriptor> {
return {
fd: await timeout.call(this._openFile(path, flags), this._timeout),
path,
}
}
async _openFile (path: string, flags?: string): Promise<mixed> {
throw new Error('Not implemented')
}
async closeFile (fd: FileDescriptor): Promise<void> {
await timeout.call(this._closeFile(fd.fd), this._timeout)
}
async _closeFile (fd: mixed): Promise<void> {
throw new Error('Not implemented')
}
async refreshChecksum (path: string): Promise<void> {
const stream = (await this.createReadStream(path)).pipe(
createChecksumStream()
)
stream.resume() // start reading the whole file
await this.outputFile(checksumFile(path), await stream.checksum)
}
async createOutputStream (
file: File,
{ checksum = false, ...options }: Object = {}
): Promise<LaxWritable> {
const path = typeof file === 'string' ? file : file.path
const streamP = timeout.call(
this._createOutputStream(file, {
flags: 'wx',
...options,
}),
this._timeout
)
if (!checksum) {
return streamP
}
const checksumStream = createChecksumStream()
const forwardError = error => {
checksumStream.emit('error', error)
}
const stream = await streamP
stream.on('error', forwardError)
checksumStream.pipe(stream)
// $FlowFixMe
checksumStream.checksumWritten = checksumStream.checksum
.then(value => this.outputFile(checksumFile(path), value))
.catch(forwardError)
return checksumStream
}
async _createOutputStream (
file: mixed,
options?: Object
): Promise<LaxWritable> {
throw new Error('Not implemented')
}
async unlink (file: string, { checksum = true }: Object = {}): Promise<void> {
if (checksum) {
ignoreErrors.call(this._unlink(checksumFile(file)))
}
await timeout.call(this._unlink(file), this._timeout)
}
async _unlink (file: mixed): Promise<void> {
throw new Error('Not implemented')
}
async getSize (file: mixed): Promise<number> {
return timeout.call(this._getSize(file), this._timeout)
}
async _getSize (file: mixed): Promise<number> {
throw new Error('Not implemented')
}
}

View File

@@ -1,113 +0,0 @@
/* eslint-env jest */
import { TimeoutError } from 'promise-toolbox'
import AbstractHandler from './abstract'
const TIMEOUT = 10e3
class TestHandler extends AbstractHandler {
constructor (impl) {
super({ url: 'test://' }, { timeout: TIMEOUT })
Object.keys(impl).forEach(method => {
this[`_${method}`] = impl[method]
})
}
}
describe('rename()', () => {
it(`throws in case of timeout`, async () => {
const testHandler = new TestHandler({
rename: () => new Promise(() => {}),
})
const promise = testHandler.rename('oldPath', 'newPath')
jest.advanceTimersByTime(TIMEOUT)
await expect(promise).rejects.toThrowError(TimeoutError)
})
})
describe('list()', () => {
it(`throws in case of timeout`, async () => {
const testHandler = new TestHandler({
list: () => new Promise(() => {}),
})
const promise = testHandler.list()
jest.advanceTimersByTime(TIMEOUT)
await expect(promise).rejects.toThrowError(TimeoutError)
})
})
describe('createReadStream()', () => {
it(`throws in case of timeout`, async () => {
const testHandler = new TestHandler({
createReadStream: () => new Promise(() => {}),
})
const promise = testHandler.createReadStream('file')
jest.advanceTimersByTime(TIMEOUT)
await expect(promise).rejects.toThrowError(TimeoutError)
})
})
describe('openFile()', () => {
it(`throws in case of timeout`, async () => {
const testHandler = new TestHandler({
openFile: () => new Promise(() => {}),
})
const promise = testHandler.openFile('path')
jest.advanceTimersByTime(TIMEOUT)
await expect(promise).rejects.toThrowError(TimeoutError)
})
})
describe('closeFile()', () => {
it(`throws in case of timeout`, async () => {
const testHandler = new TestHandler({
closeFile: () => new Promise(() => {}),
})
const promise = testHandler.closeFile({ fd: undefined, path: '' })
jest.advanceTimersByTime(TIMEOUT)
await expect(promise).rejects.toThrowError(TimeoutError)
})
})
describe('createOutputStream()', () => {
it(`throws in case of timeout`, async () => {
const testHandler = new TestHandler({
createOutputStream: () => new Promise(() => {}),
})
const promise = testHandler.createOutputStream('File')
jest.advanceTimersByTime(TIMEOUT)
await expect(promise).rejects.toThrowError(TimeoutError)
})
})
describe('unlink()', () => {
it(`throws in case of timeout`, async () => {
const testHandler = new TestHandler({
unlink: () => new Promise(() => {}),
})
const promise = testHandler.unlink('')
jest.advanceTimersByTime(TIMEOUT)
await expect(promise).rejects.toThrowError(TimeoutError)
})
})
describe('getSize()', () => {
it(`throws in case of timeout`, async () => {
const testHandler = new TestHandler({
getSize: () => new Promise(() => {}),
})
const promise = testHandler.getSize('')
jest.advanceTimersByTime(TIMEOUT)
await expect(promise).rejects.toThrowError(TimeoutError)
})
})

View File

@@ -1,100 +0,0 @@
// @flow
// $FlowFixMe
import through2 from 'through2'
import { createHash } from 'crypto'
import { defer, fromEvent } from 'promise-toolbox'
import { invert } from 'lodash'
import { type Readable, type Transform } from 'stream'
// Format: $<algorithm>$<salt>$<encrypted>
//
// http://man7.org/linux/man-pages/man3/crypt.3.html#NOTES
const ALGORITHM_TO_ID = {
md5: '1',
sha256: '5',
sha512: '6',
}
const ID_TO_ALGORITHM = invert(ALGORITHM_TO_ID)
// Create a through stream which computes the checksum of all data going
// through.
//
// The `checksum` attribute is a promise which resolves at the end of the stream
// with a string representation of the checksum.
//
// const source = ...
// const checksumStream = source.pipe(createChecksumStream())
// checksumStream.resume() // make the data flow without an output
// console.log(await checksumStream.checksum)
export const createChecksumStream = (
algorithm: string = 'md5'
): Transform & { checksum: Promise<string> } => {
const algorithmId = ALGORITHM_TO_ID[algorithm]
if (!algorithmId) {
throw new Error(`unknown algorithm: ${algorithm}`)
}
const hash = createHash(algorithm)
const { promise, resolve, reject } = defer()
const stream = through2(
(chunk, enc, callback) => {
hash.update(chunk)
callback(null, chunk)
},
callback => {
resolve(`$${algorithmId}$$${hash.digest('hex')}`)
callback()
}
).once('error', reject)
stream.checksum = promise
return stream
}
// Check if the checksum of a readable stream is equal to an expected checksum.
// The given stream is wrapped in a stream which emits an error event
// if the computed checksum is not equal to the expected checksum.
export const validChecksumOfReadStream = (
stream: Readable,
expectedChecksum: string
): Readable & { checksumVerified: Promise<void> } => {
const algorithmId = expectedChecksum.slice(
1,
expectedChecksum.indexOf('$', 1)
)
if (!algorithmId) {
throw new Error(`unknown algorithm: ${algorithmId}`)
}
const hash = createHash(ID_TO_ALGORITHM[algorithmId])
const wrapper: any = stream.pipe(
through2(
{ highWaterMark: 0 },
(chunk, enc, callback) => {
hash.update(chunk)
callback(null, chunk)
},
callback => {
const checksum = `$${algorithmId}$$${hash.digest('hex')}`
callback(
checksum !== expectedChecksum
? new Error(
`Bad checksum (${checksum}), expected: ${expectedChecksum}`
)
: null
)
}
)
)
stream.on('error', error => wrapper.emit('error', error))
wrapper.checksumVerified = fromEvent(wrapper, 'end')
return wrapper
}

View File

@@ -1,26 +0,0 @@
/* eslint-env jest */
import rimraf from 'rimraf'
import tmp from 'tmp'
import { pFromCallback } from 'promise-toolbox'
import { getHandler } from '.'
const initialDir = process.cwd()
beforeEach(async () => {
const dir = await pFromCallback(cb => tmp.dir(cb))
process.chdir(dir)
})
afterEach(async () => {
const tmpDir = process.cwd()
process.chdir(initialDir)
await pFromCallback(cb => rimraf(tmpDir, cb))
})
test("fs test doesn't crash", async () => {
const handler = getHandler({ url: 'file://' + process.cwd() })
const result = await handler.test()
expect(result.success).toBeTruthy()
})

View File

@@ -1,26 +0,0 @@
// @flow
import type RemoteHandler from './abstract'
import RemoteHandlerLocal from './local'
import RemoteHandlerNfs from './nfs'
import RemoteHandlerSmb from './smb'
export type { default as RemoteHandler } from './abstract'
export type Remote = { url: string }
const HANDLERS = {
file: RemoteHandlerLocal,
smb: RemoteHandlerSmb,
nfs: RemoteHandlerNfs,
}
export const getHandler = (remote: Remote, ...rest: any): RemoteHandler => {
// FIXME: should be done in xo-remote-parser.
const type = remote.url.split('://')[0]
const Handler = HANDLERS[type]
if (!Handler) {
throw new Error('Unhandled remote type')
}
return new Handler(remote, ...rest)
}

View File

@@ -1,124 +0,0 @@
import fs from 'fs-extra'
import { dirname, resolve } from 'path'
import { noop, startsWith } from 'lodash'
import RemoteHandlerAbstract from './abstract'
export default class LocalHandler extends RemoteHandlerAbstract {
get type () {
return 'file'
}
_getRealPath () {
return this._remote.path
}
_getFilePath (file) {
const realPath = this._getRealPath()
const parts = [realPath]
if (file) {
parts.push(file)
}
const path = resolve.apply(null, parts)
if (!startsWith(path, realPath)) {
throw new Error('Remote path is unavailable')
}
return path
}
async _sync () {
if (this._remote.enabled) {
const path = this._getRealPath()
await fs.ensureDir(path)
await fs.access(path, fs.R_OK | fs.W_OK)
}
return this._remote
}
async _forget () {
return noop()
}
async _outputFile (file, data, options) {
const path = this._getFilePath(file)
await fs.ensureDir(dirname(path))
await fs.writeFile(path, data, options)
}
async _read (file, buffer, position) {
const needsClose = typeof file === 'string'
file = needsClose ? await fs.open(this._getFilePath(file), 'r') : file.fd
try {
return await fs.read(
file,
buffer,
0,
buffer.length,
position === undefined ? null : position
)
} finally {
if (needsClose) {
await fs.close(file)
}
}
}
async _readFile (file, options) {
return fs.readFile(this._getFilePath(file), options)
}
async _rename (oldPath, newPath) {
return fs.rename(this._getFilePath(oldPath), this._getFilePath(newPath))
}
async _list (dir = '.') {
return fs.readdir(this._getFilePath(dir))
}
async _createReadStream (file, options) {
return typeof file === 'string'
? fs.createReadStream(this._getFilePath(file), options)
: fs.createReadStream('', {
autoClose: false,
...options,
fd: file.fd,
})
}
async _createOutputStream (file, options) {
if (typeof file === 'string') {
const path = this._getFilePath(file)
await fs.ensureDir(dirname(path))
return fs.createWriteStream(path, options)
}
return fs.createWriteStream('', {
autoClose: false,
...options,
fd: file.fd,
})
}
async _unlink (file) {
return fs.unlink(this._getFilePath(file)).catch(error => {
// do not throw if the file did not exist
if (error == null || error.code !== 'ENOENT') {
throw error
}
})
}
async _getSize (file) {
const stats = await fs.stat(
this._getFilePath(typeof file === 'string' ? file : file.path)
)
return stats.size
}
async _openFile (path, flags) {
return fs.open(this._getFilePath(path), flags)
}
async _closeFile (fd) {
return fs.close(fd)
}
}

View File

@@ -1,82 +0,0 @@
import execa from 'execa'
import fs from 'fs-extra'
import { join } from 'path'
import { tmpdir } from 'os'
import LocalHandler from './local'
const DEFAULT_NFS_OPTIONS = 'vers=3'
export default class NfsHandler extends LocalHandler {
constructor (
remote,
{ mountsDir = join(tmpdir(), 'xo-fs-mounts'), ...opts } = {}
) {
super(remote, opts)
this._realPath = join(mountsDir, remote.id)
}
get type () {
return 'nfs'
}
_getRealPath () {
return this._realPath
}
async _mount () {
await fs.ensureDir(this._getRealPath())
const { host, path, port, options } = this._remote
return execa(
'mount',
[
'-t',
'nfs',
'-o',
DEFAULT_NFS_OPTIONS + (options !== undefined ? `,${options}` : ''),
`${host}${port !== undefined ? ':' + port : ''}:${path}`,
this._getRealPath(),
],
{
env: {
LANG: 'C',
},
}
).catch(error => {
if (!error.stderr.includes('already mounted')) {
throw error
}
})
}
async _sync () {
if (this._remote.enabled) {
await this._mount()
} else {
await this._umount()
}
return this._remote
}
async _forget () {
try {
await this._umount(this._remote)
} catch (_) {
// We have to go on...
}
}
async _umount () {
await execa('umount', ['--force', this._getRealPath()], {
env: {
LANG: 'C',
},
}).catch(error => {
if (!error.stderr.includes('not mounted')) {
throw error
}
})
}
}

View File

@@ -1,244 +0,0 @@
import Smb2 from '@marsaud/smb2'
import { pFinally } from 'promise-toolbox'
import RemoteHandlerAbstract from './abstract'
const noop = () => {}
// Normalize the error code for file not found.
const normalizeError = error => {
const { code } = error
return code === 'STATUS_OBJECT_NAME_NOT_FOUND' ||
code === 'STATUS_OBJECT_PATH_NOT_FOUND'
? Object.create(error, {
code: {
configurable: true,
readable: true,
value: 'ENOENT',
writable: true,
},
})
: error
}
export default class SmbHandler extends RemoteHandlerAbstract {
constructor (remote, opts) {
super(remote, opts)
this._forget = noop
}
get type () {
return 'smb'
}
_getClient () {
const remote = this._remote
return new Smb2({
share: `\\\\${remote.host}`,
domain: remote.domain,
username: remote.username,
password: remote.password,
autoCloseTimeout: 0,
})
}
_getFilePath (file) {
if (file === '.') {
file = undefined
}
let path = this._remote.path !== '' ? this._remote.path : ''
// Ensure remote path is a directory.
if (path !== '' && path[path.length - 1] !== '\\') {
path += '\\'
}
if (file) {
path += file.replace(/\//g, '\\')
}
return path
}
_dirname (file) {
const parts = file.split('\\')
parts.pop()
return parts.join('\\')
}
async _sync () {
if (this._remote.enabled) {
// Check access (smb2 does not expose connect in public so far...)
await this.list()
}
return this._remote
}
async _outputFile (file, data, options = {}) {
const client = this._getClient()
const path = this._getFilePath(file)
const dir = this._dirname(path)
if (dir) {
await client.ensureDir(dir)
}
return client.writeFile(path, data, options)::pFinally(() => {
client.disconnect()
})
}
async _read (file, buffer, position) {
const needsClose = typeof file === 'string'
let client
if (needsClose) {
client = this._getClient()
file = await client.open(this._getFilePath(file))
} else {
;({ client, file } = file.fd)
}
try {
return await client.read(file, buffer, 0, buffer.length, position)
} finally {
if (needsClose) {
await client.close(file)
client.disconnect()
}
}
}
async _readFile (file, options = {}) {
const client = this._getClient()
let content
try {
content = await client
.readFile(this._getFilePath(file), options)
::pFinally(() => {
client.disconnect()
})
} catch (error) {
throw normalizeError(error)
}
return content
}
async _rename (oldPath, newPath) {
const client = this._getClient()
try {
await client
.rename(this._getFilePath(oldPath), this._getFilePath(newPath), {
replace: true,
})
::pFinally(() => {
client.disconnect()
})
} catch (error) {
throw normalizeError(error)
}
}
async _list (dir = '.') {
const client = this._getClient()
let list
try {
list = await client.readdir(this._getFilePath(dir))::pFinally(() => {
client.disconnect()
})
} catch (error) {
throw normalizeError(error)
}
return list
}
async _createReadStream (file, options = {}) {
if (typeof file !== 'string') {
file = file.path
}
const client = this._getClient()
let stream
try {
// FIXME ensure that options are properly handled by @marsaud/smb2
stream = await client.createReadStream(this._getFilePath(file), options)
stream.on('end', () => client.disconnect())
} catch (error) {
throw normalizeError(error)
}
return stream
}
async _createOutputStream (file, options = {}) {
if (typeof file !== 'string') {
file = file.path
}
const client = this._getClient()
const path = this._getFilePath(file)
const dir = this._dirname(path)
let stream
try {
if (dir) {
await client.ensureDir(dir)
}
stream = await client.createWriteStream(path, options) // FIXME ensure that options are properly handled by @marsaud/smb2
} catch (err) {
client.disconnect()
throw err
}
stream.on('finish', () => client.disconnect())
return stream
}
async _unlink (file) {
const client = this._getClient()
try {
await client.unlink(this._getFilePath(file))::pFinally(() => {
client.disconnect()
})
} catch (error) {
throw normalizeError(error)
}
}
async _getSize (file) {
const client = await this._getClient()
let size
try {
size = await client
.getSize(this._getFilePath(typeof file === 'string' ? file : file.path))
::pFinally(() => {
client.disconnect()
})
} catch (error) {
throw normalizeError(error)
}
return size
}
// TODO: add flags
async _openFile (path) {
const client = this._getClient()
return {
client,
file: await client.open(this._getFilePath(path)),
}
}
async _closeFile ({ client, file }) {
await client.close(file)
client.disconnect()
}
}

View File

@@ -1,3 +0,0 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)

View File

@@ -1,24 +0,0 @@
/benchmark/
/benchmarks/
*.bench.js
*.bench.js.map
/examples/
example.js
example.js.map
*.example.js
*.example.js.map
/fixture/
/fixtures/
*.fixture.js
*.fixture.js.map
*.fixtures.js
*.fixtures.js.map
/test/
/tests/
*.spec.js
*.spec.js.map
__snapshots__/

View File

@@ -1,160 +0,0 @@
# @xen-orchestra/log [![Build Status](https://travis-ci.org/vatesfr/xen-orchestra.png?branch=master)](https://travis-ci.org/vatesfr/xen-orchestra)
> ${pkg.description}
## Install
Installation of the [npm package](https://npmjs.org/package/@xen-orchestra/log):
```
> npm install --save @xen-orchestra/log
```
## Usage
Everywhere something should be logged:
```js
import createLogger from '@xen-orchestra/log'
const log = createLogger('my-module')
log.debug('only useful for debugging')
log.info('this information is relevant to the user')
log.warn('something went wrong but did not prevent current action')
log.error('something went wrong')
log.fatal('service/app is going down')
```
Then, at the application level, configure how the logs are handled:
```js
import { configure, catchGlobalErrors } from '@xen-orchestra/log/configure'
import transportConsole from '@xen-orchestra/log/transports/console'
import transportEmail from '@xen-orchestra/log/transports/email'
const transport = transportEmail({
service: 'gmail',
auth: {
user: 'jane.smith@gmail.com',
pass: 'H&NbECcpXF|pyXe#%ZEb'
},
from: 'jane.smith@gmail.com',
to: [
'jane.smith@gmail.com',
'sam.doe@yahoo.com'
]
})
configure([
{
// if filter is a string, then it is a pattern
// (https://github.com/visionmedia/debug#wildcards) which is
// matched against the namespace of the logs
filter: process.env.DEBUG,
transport: transportConsole()
},
{
// only levels >= warn
level: 'warn',
transport
}
])
// send all global errors (uncaught exceptions, warnings, unhandled rejections)
// to this transport
catchGlobalErrors(transport)
```
### Transports
#### Console
```js
import transportConsole from '@xen-orchestra/log/transports/console'
configure(transportConsole())
```
#### Email
Optional dependency:
```
> yarn add nodemailer pretty-format
```
Configuration:
```js
import transportEmail from '@xen-orchestra/log/transports/email'
configure(transportEmail({
service: 'gmail',
auth: {
user: 'jane.smith@gmail.com',
pass: 'H&NbECcpXF|pyXe#%ZEb'
},
from: 'jane.smith@gmail.com',
to: [
'jane.smith@gmail.com',
'sam.doe@yahoo.com'
]
}))
```
#### Syslog
Optional dependency:
```
> yarn add split-host syslog-client
```
Configuration:
```js
import transportSyslog from '@xen-orchestra/log/transports/syslog'
// By default, log to udp://localhost:514
configure(transportSyslog())
// But TCP, a different host, or a different port can be used
configure(transportSyslog('tcp://syslog.company.lan'))
```
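#### Memory
There is also an in-memory transport (`transports/memory.js`) that simply collects logs into an array, which can be handy in tests. A minimal sketch:
```js
import transportMemory from '@xen-orchestra/log/transports/memory'
const transport = transportMemory()
configure(transport)
// every log received so far is available on the transport's `logs` array
console.log(transport.logs)
```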
## Development
```
# Install dependencies
> yarn
# Run the tests
> yarn test
# Continuously compile
> yarn dev
# Continuously run the tests
> yarn dev-test
# Build for production (automatically called by npm install)
> yarn build
```
## Contributions
Contributions are *very* welcome, either on the documentation or on
the code.
You may:
- report any [issue](https://github.com/vatesfr/xo-web/issues/)
you've encountered;
- fork and create a pull request.
## License
ISC © [Vates SAS](https://vates.fr)

View File

@@ -1 +0,0 @@
module.exports = require('./dist/configure')

View File

@@ -1,52 +0,0 @@
{
"name": "@xen-orchestra/log",
"version": "0.1.4",
"license": "ISC",
"description": "",
"keywords": [],
"homepage": "https://github.com/vatesfr/xen-orchestra/tree/master/@xen-orchestra/log",
"bugs": "https://github.com/vatesfr/xen-orchestra/issues",
"repository": {
"type": "git",
"url": "https://github.com/vatesfr/xen-orchestra.git"
},
"author": {
"name": "Julien Fontanet",
"email": "julien.fontanet@vates.fr"
},
"preferGlobal": false,
"main": "dist/",
"bin": {},
"files": [
"configure.js",
"dist/",
"transports/"
],
"browserslist": [
">2%"
],
"engines": {
"node": ">=4"
},
"dependencies": {
"lodash": "^4.17.4",
"promise-toolbox": "^0.11.0"
},
"devDependencies": {
"@babel/cli": "^7.0.0",
"@babel/core": "^7.0.0",
"@babel/preset-env": "^7.0.0",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"index-modules": "^0.3.0",
"rimraf": "^2.6.2"
},
"scripts": {
"build": "cross-env NODE_ENV=production babel --source-maps --out-dir=dist/ src/",
"clean": "rimraf dist/",
"dev": "cross-env NODE_ENV=development babel --watch --source-maps --out-dir=dist/ src/",
"prebuild": "yarn run clean",
"predev": "yarn run prebuild",
"prepublishOnly": "yarn run build"
}
}

View File

@@ -1,106 +0,0 @@
import createConsoleTransport from './transports/console'
import LEVELS, { resolve } from './levels'
import { compileGlobPattern } from './utils'
// ===================================================================
const createTransport = config => {
if (typeof config === 'function') {
return config
}
if (Array.isArray(config)) {
const transports = config.map(createTransport)
const { length } = transports
return function () {
for (let i = 0; i < length; ++i) {
transports[i].apply(this, arguments)
}
}
}
let { filter, transport } = config
const level = resolve(config.level)
if (filter !== undefined) {
if (typeof filter === 'string') {
const re = compileGlobPattern(filter)
filter = log => re.test(log.namespace)
}
const orig = transport
transport = function (log) {
if ((level !== undefined && log.level >= level) || filter(log)) {
return orig.apply(this, arguments)
}
}
} else if (level !== undefined) {
const orig = transport
transport = function (log) {
if (log.level >= level) {
return orig.apply(this, arguments)
}
}
}
return transport
}
const symbol =
typeof Symbol !== 'undefined'
? Symbol.for('@xen-orchestra/log')
: '@@@xen-orchestra/log'
global[symbol] = createTransport({
// display warnings or above, and all that are enabled via DEBUG or
// NODE_DEBUG env
filter: process.env.DEBUG || process.env.NODE_DEBUG,
level: LEVELS.INFO,
transport: createConsoleTransport(),
})
export const configure = config => {
global[symbol] = createTransport(config)
}
// -------------------------------------------------------------------
export const catchGlobalErrors = logger => {
// patch process
const onUncaughtException = error => {
logger.error('uncaught exception', { error })
}
const onUnhandledRejection = error => {
logger.warn('possibly unhandled rejection', { error })
}
const onWarning = error => {
logger.warn('Node warning', { error })
}
process.on('uncaughtException', onUncaughtException)
process.on('unhandledRejection', onUnhandledRejection)
process.on('warning', onWarning)
// patch EventEmitter
const EventEmitter = require('events')
const { prototype } = EventEmitter
const { emit } = prototype
function patchedEmit (event, error) {
if (event === 'error' && this.listenerCount(event) === 0) {
logger.error('unhandled error event', { error })
return false
}
return emit.apply(this, arguments)
}
prototype.emit = patchedEmit
return () => {
process.removeListener('uncaughtException', onUncaughtException)
process.removeListener('unhandledRejection', onUnhandledRejection)
process.removeListener('warning', onWarning)
if (prototype.emit === patchedEmit) {
prototype.emit = emit
}
}
}

View File

@@ -1,76 +0,0 @@
import createTransport from './transports/console'
import LEVELS from './levels'
const symbol =
typeof Symbol !== 'undefined'
? Symbol.for('@xen-orchestra/log')
: '@@@xen-orchestra/log'
if (!(symbol in global)) {
// the default behavior, without requiring `configure` is to avoid
// logging anything unless it's a real error
const transport = createTransport()
global[symbol] = log => log.level > LEVELS.WARN && transport(log)
}
// -------------------------------------------------------------------
function Log (data, level, namespace, message, time) {
this.data = data
this.level = level
this.namespace = namespace
this.message = message
this.time = time
}
function Logger (namespace) {
this._namespace = namespace
// bind all logging methods
for (const name in LEVELS) {
const lowerCase = name.toLowerCase()
this[lowerCase] = this[lowerCase].bind(this)
}
}
const { prototype } = Logger
for (const name in LEVELS) {
const level = LEVELS[name]
prototype[name.toLowerCase()] = function (message, data) {
if (typeof message !== 'string') {
if (message instanceof Error) {
data = { error: message }
;({ message = 'an error has occurred' } = message)
} else {
return this.warn('incorrect value passed to logger', {
level,
value: message,
})
}
}
global[symbol](new Log(data, level, this._namespace, message, new Date()))
}
}
prototype.wrap = function (message, fn) {
const logger = this
const warnAndRethrow = error => {
logger.warn(message, { error })
throw error
}
return function () {
try {
const result = fn.apply(this, arguments)
const then = result != null && result.then
return typeof then === 'function'
? then.call(result, undefined, warnAndRethrow)
: result
} catch (error) {
warnAndRethrow(error)
}
}
}
const createLogger = namespace => new Logger(namespace)
export { createLogger as default }
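// Usage sketch for wrap() (hypothetical function name): the returned function
// calls the wrapped one, and on a synchronous throw or an asynchronous
// rejection it logs a warning with the given message and rethrows.
//
//   const log = createLogger('my-module')
//   const guardedRefresh = log.wrap('refresh failed', refresh)
//   guardedRefresh() // warns and rethrows/rejects if refresh() fails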

View File

@@ -1,24 +0,0 @@
const LEVELS = Object.create(null)
export { LEVELS as default }
// https://github.com/trentm/node-bunyan#levels
LEVELS.FATAL = 60 // service/app is going down
LEVELS.ERROR = 50 // fatal for current action
LEVELS.WARN = 40 // something went wrong but it's not fatal
LEVELS.INFO = 30 // detail on unusual but normal operation
LEVELS.DEBUG = 20
export const NAMES = Object.create(null)
for (const name in LEVELS) {
NAMES[LEVELS[name]] = name
}
export const resolve = level => {
if (typeof level === 'string') {
level = LEVELS[level.toUpperCase()]
}
return level
}
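// e.g. resolve('warn') === LEVELS.WARN === 40, and resolve(40) === 40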
Object.freeze(LEVELS)
Object.freeze(NAMES)

View File

@@ -1,32 +0,0 @@
/* eslint-env jest */
import { forEach, isInteger } from 'lodash'
import LEVELS, { NAMES, resolve } from './levels'
describe('LEVELS', () => {
it('maps level names to their integer values', () => {
forEach(LEVELS, (value, name) => {
expect(isInteger(value)).toBe(true)
})
})
})
describe('NAMES', () => {
it('maps level values to their names', () => {
forEach(LEVELS, (value, name) => {
expect(NAMES[value]).toBe(name)
})
})
})
describe('resolve()', () => {
it('returns level values either from values or names', () => {
forEach(LEVELS, value => {
expect(resolve(value)).toBe(value)
})
forEach(NAMES, (name, value) => {
expect(resolve(name)).toBe(+value)
})
})
})

View File

@@ -1,24 +0,0 @@
import LEVELS, { NAMES } from '../levels'
// Bind console methods (necessary for browsers)
const debugConsole = console.log.bind(console)
const infoConsole = console.info.bind(console)
const warnConsole = console.warn.bind(console)
const errorConsole = console.error.bind(console)
const { ERROR, INFO, WARN } = LEVELS
const consoleTransport = ({ data, level, namespace, message, time }) => {
const fn =
level < INFO
? debugConsole
: level < WARN
? infoConsole
: level < ERROR
? warnConsole
: errorConsole
fn('%s - %s - [%s] %s', time.toISOString(), namespace, NAMES[level], message)
data != null && fn(data)
}
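// Example output line (illustrative):
//   2016-01-31T21:06:12.000Z - my-module - [INFO] this information is relevant to the user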
export default () => consoleTransport

View File

@@ -1,70 +0,0 @@
import fromCallback from 'promise-toolbox/fromCallback'
import prettyFormat from 'pretty-format' // eslint-disable-line node/no-extraneous-import
import { createTransport } from 'nodemailer' // eslint-disable-line node/no-extraneous-import
import { evalTemplate, required } from '../utils'
import { NAMES } from '../levels'
export default ({
// transport options (https://nodemailer.com/smtp/)
auth,
authMethod,
host,
ignoreTLS,
port,
proxy,
requireTLS,
secure,
service,
tls,
// message options (https://nodemailer.com/message/)
bcc,
cc,
from = required('from'),
to = required('to'),
subject = '[{{level}} - {{namespace}}] {{time}} {{message}}',
}) => {
const transporter = createTransport(
{
auth,
authMethod,
host,
ignoreTLS,
port,
proxy,
requireTLS,
secure,
service,
tls,
disableFileAccess: true,
disableUrlAccess: true,
},
{
bcc,
cc,
from,
to,
}
)
return log =>
fromCallback(cb =>
transporter.sendMail(
{
subject: evalTemplate(
subject,
key =>
key === 'level'
? NAMES[log.level]
: key === 'time'
? log.time.toISOString()
: log[key]
),
text: prettyFormat(log.data),
},
cb
)
)
}

View File

@@ -1,7 +0,0 @@
export default () => {
const memoryLogger = log => {
logs.push(log)
}
const logs = (memoryLogger.logs = [])
return memoryLogger
}

View File

@@ -1,42 +0,0 @@
import fromCallback from 'promise-toolbox/fromCallback'
import splitHost from 'split-host' // eslint-disable-line node/no-extraneous-import, node/no-missing-import
import startsWith from 'lodash/startsWith'
import { createClient, Facility, Severity, Transport } from 'syslog-client' // eslint-disable-line node/no-extraneous-import, node/no-missing-import
import LEVELS from '../levels'
// https://github.com/paulgrove/node-syslog-client#syslogseverity
const LEVEL_TO_SEVERITY = {
[LEVELS.FATAL]: Severity.Critical,
[LEVELS.ERROR]: Severity.Error,
[LEVELS.WARN]: Severity.Warning,
[LEVELS.INFO]: Severity.Informational,
[LEVELS.DEBUG]: Severity.Debug,
}
const facility = Facility.User
export default target => {
const opts = {}
if (target !== undefined) {
if (startsWith(target, 'tcp://')) {
target = target.slice(6)
opts.transport = Transport.Tcp
} else if (startsWith(target, 'udp://')) {
target = target.slice(6)
opts.transport = Transport.Udp
}
;({ host: target, port: opts.port } = splitHost(target))
}
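// e.g. (illustrative) a target of 'tcp://syslog.company.lan:514' yields
// host 'syslog.company.lan', port 514 and Transport.Tcp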
const client = createClient(target, opts)
return log =>
fromCallback(cb =>
client.log(log.message, {
facility,
severity: LEVEL_TO_SEVERITY[log.level],
}, cb)
)
}

View File

@@ -1,62 +0,0 @@
import escapeRegExp from 'lodash/escapeRegExp'
// ===================================================================
const TPL_RE = /\{\{(.+?)\}\}/g
export const evalTemplate = (tpl, data) => {
const getData =
typeof data === 'function' ? (_, key) => data(key) : (_, key) => data[key]
return tpl.replace(TPL_RE, getData)
}
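// e.g. evalTemplate('{{count}} items', { count: 3 }) === '3 items'
// `data` may also be a function that receives each template key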
// -------------------------------------------------------------------
const compileGlobPatternFragment = pattern =>
pattern
.split('*')
.map(escapeRegExp)
.join('.*')
export const compileGlobPattern = pattern => {
const no = []
const yes = []
pattern.split(/[\s,]+/).forEach(pattern => {
if (pattern[0] === '-') {
no.push(pattern.slice(1))
} else {
yes.push(pattern)
}
})
const raw = ['^']
if (no.length !== 0) {
raw.push('(?!', no.map(compileGlobPatternFragment).join('|'), ')')
}
if (yes.length !== 0) {
raw.push('(?:', yes.map(compileGlobPatternFragment).join('|'), ')')
} else {
raw.push('.*')
}
raw.push('$')
return new RegExp(raw.join(''))
}
// -------------------------------------------------------------------
export const required = name => {
throw new Error(`missing required arg ${name}`)
}
// -------------------------------------------------------------------
export const serializeError = error => ({
...error,
message: error.message,
name: error.name,
stack: error.stack,
})

View File

@@ -1,13 +0,0 @@
/* eslint-env jest */
import { compileGlobPattern } from './utils'
describe('compileGlobPattern()', () => {
it('works', () => {
const re = compileGlobPattern('foo, ba*, -bar')
expect(re.test('foo')).toBe(true)
expect(re.test('bar')).toBe(false)
expect(re.test('baz')).toBe(true)
expect(re.test('qux')).toBe(false)
})
})

View File

@@ -1 +0,0 @@
module.exports = require('../dist/transports/console.js')

View File

@@ -1 +0,0 @@
module.exports = require('../dist/transports/email.js')

View File

@@ -1 +0,0 @@
module.exports = require('../dist/transports/memory.js')

View File

@@ -1 +0,0 @@
module.exports = require('../dist/transports/syslog.js')

View File

@@ -1,3 +0,0 @@
module.exports = require('../../@xen-orchestra/babel-config')(
require('./package.json')
)

View File

@@ -1,24 +0,0 @@
/benchmark/
/benchmarks/
*.bench.js
*.bench.js.map
/examples/
example.js
example.js.map
*.example.js
*.example.js.map
/fixture/
/fixtures/
*.fixture.js
*.fixture.js.map
*.fixtures.js
*.fixtures.js.map
/test/
/tests/
*.spec.js
*.spec.js.map
__snapshots__/

View File

@@ -1,49 +0,0 @@
# ${pkg.name} [![Build Status](https://travis-ci.org/${pkg.shortGitHubPath}.png?branch=master)](https://travis-ci.org/${pkg.shortGitHubPath})
> ${pkg.description}
## Install
Installation of the [npm package](https://npmjs.org/package/${pkg.name}):
```
> npm install --save ${pkg.name}
```
## Usage
**TODO**
## Development
```
# Install dependencies
> yarn
# Run the tests
> yarn test
# Continuously compile
> yarn dev
# Continuously run the tests
> yarn dev-test
# Build for production (automatically called by npm install)
> yarn build
```
## Contributions
Contributions are *very* welcome, either on the documentation or on
the code.
You may:
- report any [issue](${pkg.bugs})
you've encountered;
- fork and create a pull request.
## License
${pkg.license} © [${pkg.author.name}](${pkg.author.url})

View File

@@ -1,49 +0,0 @@
{
"name": "@xen-orchestra/mixin",
"version": "0.0.0",
"license": "ISC",
"description": "",
"keywords": [],
"homepage": "https://github.com/vatesfr/xen-orchestra/tree/master/@xen-orchestra/mixin",
"bugs": "https://github.com/vatesfr/xen-orchestra/issues",
"repository": {
"type": "git",
"url": "https://github.com/vatesfr/xen-orchestra.git"
},
"author": {
"name": "Julien Fontanet",
"email": "julien.fontanet@vates.fr"
},
"preferGlobal": false,
"main": "dist/",
"bin": {},
"files": [
"dist/"
],
"browserslist": [
">2%"
],
"engines": {
"node": ">=6"
},
"dependencies": {
"bind-property-descriptor": "^1.0.0"
},
"devDependencies": {
"@babel/cli": "^7.0.0",
"@babel/core": "^7.0.0",
"@babel/preset-env": "^7.0.0",
"babel-plugin-dev": "^1.0.0",
"babel-plugin-lodash": "^3.3.2",
"cross-env": "^5.1.3",
"rimraf": "^2.6.2"
},
"scripts": {
"build": "cross-env NODE_ENV=production babel --source-maps --out-dir=dist/ src/",
"clean": "rimraf dist/",
"dev": "cross-env NODE_ENV=development babel --watch --source-maps --out-dir=dist/ src/",
"prebuild": "yarn run clean",
"predev": "yarn run prebuild",
"prepublishOnly": "yarn run build"
}
}

View File

@@ -1,130 +0,0 @@
import { getBoundPropertyDescriptor } from 'bind-property-descriptor'
// ===================================================================
const { defineProperties, getOwnPropertyDescriptor } = Object
const isIgnoredProperty = name => name[0] === '_' || name === 'constructor'
const IGNORED_STATIC_PROPERTIES = {
__proto__: null,
arguments: true,
caller: true,
length: true,
name: true,
prototype: true,
}
const isIgnoredStaticProperty = name => name in IGNORED_STATIC_PROPERTIES
const ownKeys =
(typeof Reflect !== 'undefined' && Reflect.ownKeys) ||
(({ getOwnPropertyNames: names, getOwnPropertySymbols: symbols }) =>
symbols !== undefined ? obj => names(obj).concat(symbols(obj)) : names)(
Object
)
// -------------------------------------------------------------------
const mixin = Mixins => Class => {
if (__DEV__ && !Array.isArray(Mixins)) {
throw new TypeError('Mixins should be an array')
}
const { name } = Class
// Copy properties of plain object mix-ins to the prototype.
{
const allMixins = Mixins
Mixins = []
const { prototype } = Class
const descriptors = { __proto__: null }
allMixins.forEach(Mixin => {
if (typeof Mixin === 'function') {
Mixins.push(Mixin)
return
}
for (const prop of ownKeys(Mixin)) {
if (__DEV__ && prop in prototype) {
throw new Error(`${name}#${prop} is already defined`)
}
;(descriptors[prop] = getOwnPropertyDescriptor(
Mixin,
prop
)).enumerable = false // Object methods are enumerable but class methods are not.
}
})
defineProperties(prototype, descriptors)
}
const n = Mixins.length
function DecoratedClass (...args) {
const instance = new Class(...args)
for (let i = 0; i < n; ++i) {
const Mixin = Mixins[i]
const { prototype } = Mixin
const mixinInstance = new Mixin(instance, ...args)
const descriptors = { __proto__: null }
const props = ownKeys(prototype)
for (let j = 0, m = props.length; j < m; ++j) {
const prop = props[j]
if (isIgnoredProperty(prop)) {
continue
}
if (prop in instance) {
throw new Error(`${name}#${prop} is already defined`)
}
descriptors[prop] = getBoundPropertyDescriptor(
prototype,
prop,
mixinInstance
)
}
defineProperties(instance, descriptors)
}
return instance
}
// Copy original and mixed-in static properties on Decorator class.
const descriptors = { __proto__: null }
ownKeys(Class).forEach(prop => {
let descriptor
if (
!(
isIgnoredStaticProperty(prop) &&
// if they already exist...
(descriptor = getOwnPropertyDescriptor(DecoratedClass, prop)) !==
undefined &&
// and are not configurable.
!descriptor.configurable
)
) {
descriptors[prop] = getOwnPropertyDescriptor(Class, prop)
}
})
Mixins.forEach(Mixin => {
ownKeys(Mixin).forEach(prop => {
if (isIgnoredStaticProperty(prop)) {
return
}
if (__DEV__ && prop in descriptors) {
throw new Error(`${name}.${prop} is already defined`)
}
descriptors[prop] = getOwnPropertyDescriptor(Mixin, prop)
})
})
defineProperties(DecoratedClass, descriptors)
return DecoratedClass
}
export { mixin as default }
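// Usage sketch (hypothetical classes, not from the original sources):
//
//   const Serializable = { toJSON () { return { name: this.name } } }
//   class Disposable {
//     constructor (instance) { this._instance = instance }
//     dispose () { this._instance.name = null }
//   }
//   const Decorated = mixin([ Serializable, Disposable ])(
//     class { constructor (name) { this.name = name } }
//   )
//   const obj = new Decorated('foo')
//   obj.toJSON() // → { name: 'foo' }, copied from the plain object mix-in
//   obj.dispose() // bound to the Disposable instance created for `obj`
//
// With legacy decorators it can also be applied as
// `@mixin([ Serializable, Disposable ])` above a class declaration.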

File diff suppressed because it is too large

View File

@@ -1,46 +0,0 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at julien.fontanet@vates.fr. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version]
[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/

View File

@@ -1,27 +0,0 @@
<!--
Welcome to the issue section of Xen Orchestra!
Here you can:
- report an issue
- propose an enhancement
- ask a question
Please respect this template as much as possible; it helps us sort
the issues :)
-->
### Context
- **XO origin**: the sources / XO Appliance
- **Versions**:
- Node: **FILL HERE**
- xo-web: **FILL HERE**
- xo-server: **FILL HERE**
### Expected behavior
<!-- What you expect to happen -->
### Current behavior
<!-- What is actually happening -->

View File

@@ -1,17 +0,0 @@
### Check list
> Check items when done or if not relevant
- [ ] PR references the relevant issue (e.g. `Fixes #007`)
- [ ] if the UI changed, a screenshot has been added to the PR
- [ ] CHANGELOG:
- enhancement/bug fix entry added
- list of packages to release updated (`${name} v${new version}`)
- [ ] documentation updated
### Process
1. create a PR as soon as possible
1. mark it as `WiP:` (Work in Progress) if not ready to be merged
1. when you want a review, add a reviewer
1. if necessary, update your PR, and re-add a reviewer

View File

@@ -1,10 +1,68 @@
# Xen Orchestra [![Chat with us](https://storage.crisp.im/plugins/images/936925df-f37b-4ba8-bab0-70cd2edcb0be/badge.svg)](https://go.crisp.im/chat/embed/?website_id=-JzqzzwddSV7bKGtEyAQ) [![Build Status](https://travis-ci.org/vatesfr/xen-orchestra.png?branch=master)](https://travis-ci.org/vatesfr/xen-orchestra)
# Xen Orchestra Web
![](http://i.imgur.com/tRffA5y.png)
XO-Web is part of [Xen Orchestra](https://github.com/vatesfr/xo), a web interface for XenServer or XAPI enabled hosts.
It is a web client for [XO-Server](https://github.com/vatesfr/xo-server).
[![Dependency Status](https://david-dm.org/vatesfr/xo-web.svg?theme=shields.io)](https://david-dm.org/vatesfr/xo-web)
[![devDependency Status](https://david-dm.org/vatesfr/xo-web/dev-status.svg?theme=shields.io)](https://david-dm.org/vatesfr/xo-web#info=devDependencies)
___
## Installation
XOA or manual install procedure is [available here](https://xen-orchestra.com/docs/installation.html)
XOA or manual install procedure is [available here](https://github.com/vatesfr/xo/blob/master/doc/installation/README.md)
## Compilation
Production build:
```
$ npm run build
```
Development build:
```
$ npm run dev
```
## How to report a bug?
If you are certain the bug is exclusively related to XO-Web, you may use the [bugtracker of this repository](https://github.com/vatesfr/xo-web/issues).
Otherwise, please consider using the [bugtracker of the general repository](https://github.com/vatesfr/xo/issues).
## Process for new release
```bash
# Switch to the master branch.
git checkout master
# Fetch the latest changes.
git pull --ff-only
# Merge changes of the next-release branch.
git merge next-release
# Increment the version (patch, minor or major).
npm version minor
# Go back to the next-release branch.
git checkout next-release
# Bring the latest changes (the merge and the version bump) from master into
# next-release.
git merge --ff-only master
# Push the changes to the remote.
git push --follow-tags origin master next-release
# Publish this release to npm.
npm publish
```
## License

View File

@@ -1,5 +0,0 @@
module.exports = {
// Necessary for jest to be able to find the `.babelrc.js` closest to the file
// instead of only the one in this directory.
babelrcRoots: true,
}

View File

@@ -1,9 +0,0 @@
{
"gitbook": ">=3.0.0",
"root": "./docs",
"plugins": [
"anchors",
"-edit-link"
],
"pluginsConfig": {}
}

View File

@@ -1,31 +0,0 @@
# Xen Orchestra
## Introduction
Welcome to the official Xen Orchestra (XO) documentation.
XO is a web interface to visualize and administer your XenServer (or XAPI enabled) hosts. **No agent** is required for it to work.
It aims to be easy to use on any device supporting modern web technologies (HTML 5, CSS 3, JavaScript), such as your desktop computer or your smartphone.
![](https://pbs.twimg.com/profile_images/601775622675898368/xWbbafyO_400x400.png)
## XOA quick deploy
SSH to your XenServer, and execute the following:
```
bash -c "$(curl -s http://xoa.io/deploy)"
```
### XOA credentials
* Web UI: `admin@admin.net` / `admin`
* Console/SSH: `xoa` / `xoa` (first login)
## Must read
* [XOA installation](xoa.md)
* [Main features](features.md)
* [Pro Support](support.md)

View File

@@ -1,77 +0,0 @@
# Summary
* [Introduction](README.md)
* [Architecture](architecture.md)
* [xo-server](xo-server.md)
* [xo-web](xo-web.md)
* [xo-cli](xo-cli.md)
* [others](others.md)
* [Installation](installation.md)
* [XOA](xoa.md)
* [Updater](updater.md)
* [Trial activation](trial_activation.md)
* [Plugins](plugins.md)
* [Logs](logs.md)
* [Compatibility](supported-version.md)
* [Troubleshooting](troubleshooting.md)
* [From the sources](from_the_sources.md)
* [Configuration](configuration.md)
* [Features](features.md)
* [Administration](administration.md)
* [Home view](user_interface.md)
* [Search and filters](search.md)
* [VM management](vm_management.md)
* [VM creation](vm_creation.md)
* [VM import and export](vm_import_export.md)
* [XenServer Patching](patching.md)
* [Docker support](docker_support.md)
* [Backup and DR](backups.md)
* [Full backups](full_backups.md)
* [Rolling snapshots](rolling_snapshots.md)
* [Continuous Delta backups](delta_backups.md)
* [Continuous Replication](continuous_replication.md)
* [Disaster recovery](disaster_recovery.md)
* [Smart Backup](smart_backup.md)
* [File level Restore](file_level_restore.md)
* [Backup Concurrency](concurrency.md)
* [Configure backup reports](backup_reports.md)
* [Backup troubleshooting](backup_troubleshooting.md)
* [User authentication](authentication.md)
* [Built-in](built-in.md)
* [LDAP](ldap.md)
* [SAML](saml.md)
* [GitHub](github.md)
* [Google](google.md)
* [Resources delegation](resources_delegation.md)
* [ACLs](acls.md)
* [CloudInit](cloudinit.md)
* [Self Service](self_service.md)
* [Visualizations](visualizations.md)
* [Health](health.md)
* [Job manager](scheduler.md)
* [Alerts](alerts.md)
* [Load balancing](load_balancing.md)
* [Auto scalability](auto_scalability.md)
* [Forecaster](forecaster.md)
* [Recipes](recipes.md)
* [Reverse proxy](reverse_proxy.md)
* [How to contribute?](contributing.md)
* [Support](support.md)
* [Roadmap](roadmap.md)
* [Purchase](purchase.md)
* [Direct purchase](directpurchase.md)
* [Through purchase department](through_purchase_department.md)
* [Reseller](reseller.md)
* [Editions](editions.md)
* [Trial](trial.md)
* [Invoices](invoices.md)
* [Upgrade](upgrade.md)
* [XOSAN](xosan.md)
* [Requirements](xosan_requirements.md)
* [Types](xosan_types.md)
* [Replicated](xosan_replicated.md)
* [Disperse](xosan_disperse.md)
* [Creation](xosan_create.md)
* [Trial](xosan_trial.md)
* [General Troubleshooting](general-troubleshooting.md)
* [Glossary](glossary.md)

View File

@@ -1,72 +0,0 @@
# ACLs
> ACLs are permissions that apply to pre-existing objects. Only a super admin (XO administrator) can create objects.
ACLs are the permissions for your users or groups. The ACLs view can be accessed in the "Settings" panel.
1. Select the user or group you want to apply permissions on
2. Select the object on which the permission will apply
3. Choose the role for this ACL
4. Click on "Create"
![](./assets/createacl.png)
> Pro tip: you can click to add multiple objects at the same time!
Your ACL is now available in the right list:
![](./assets/acllist.png)
You can edit/remove existing ACLs here.
## Roles
There are 3 different roles for your users:
* Admin
* Operator
* Viewer
### Admin
An admin on an object can do everything with it, even destroy it. E.g., with a VM they administer:
* remove it
* migrate it (to a host with admin permission on it)
* modify the VM resources, name and description
* clone it
* copy it
* convert it into a template
* snapshot it (even revert from a snapshot)
* export it
* attach/add visible disks
* same for network cards
### Operator
An operator can perform everyday operations on assigned objects. E.g., on a VM:
* eject a CD
* insert a CD (if they can view the ISO storage repository)
* start, restart, shutdown, suspend/resume it
All other operations are forbidden.
### Viewer
A viewer can only see the VM state and its metrics. That's all!
## Inheritance
Objects have a hierarchy: a Pool contains all its hosts, which in turn contain all their VMs.
If you give a *view* permission to a user (or a group) on a pool, they will automatically see all the objects inside this pool (SRs, hosts, VMs).
## Examples
### Allow a user to install an OS
If the OS install needs an ISO, you need to give this user 2 permissions:
* *Operate* on the VM (e.g. to start it)
* *View* on the ISO Storage containing the needed ISO.

View File

@@ -1,12 +0,0 @@
# Administration
This section contains everyday XenServer administration tasks.
* [Home view](user_interface.md)
* [Search and filters](search.md)
* [VM management](vm_management.md)
* [VM creation](vm_creation.md)
* [VM import and export](vm_import_export.md)
* [XenServer Patching](patching.md)
![](./assets/xo5homevms.png)

View File

@@ -1,35 +0,0 @@
# Alerts
Alerts are a way to warn the administrator about various events. They will first be delivered by email and also displayed in a dedicated area of `xo-web`.
## Performance alerts
The administrator will configure alerts based on performance thresholds.
The configurable metrics are:
* CPU usage (VM, host)
* RAM usage (VM, host)
* network bandwidth (VM, host)
* load average (host)
* disk IO (VM)
* total IO (SR, only for XenServer Dundee and higher)
If any configured values exceed the threshold during a selected period of time, an alert will be sent.
Those alerts will also be stored and accessible in the web interface, and will later be used by the load balancing feature (helping it solve those performance problems).
## Updates alerts
When your XOA detects new packages, you'll be notified by email.
## Backup alerts
Same story for backups: if a backup fails, you'll receive an email.
You can choose to be notified only when a backup fails, or after every backup job.
Currently supported alert systems:
* Email
* XMPP

View File

@@ -1,28 +0,0 @@
# Architecture
Xen Orchestra (XO) is built as a server and several clients, such as the web client `xo-web` and a command-line client, `xo-cli`.
> XO is totally agent-less: you don't have to install any program on your hosts to get it working!
## XOA
*Xen Orchestra Virtual Appliance* (XOA) is a virtual machine with Xen Orchestra already installed, thus working out-of-the-box.
This is the easiest way to try Xen Orchestra quickly.
Your XOA connects to all your hosts, or only to the pool master if you are using XenServer pools:
![](./assets/partner2.jpg)
## Xen Orchestra (XO)
![](./assets/xo-arch.jpg)
Xen Orchestra itself is built as a modular solution. Each part has its role:
- The core is "[xo-server](https://github.com/vatesfr/xen-orchestra/tree/master/packages/xo-server/)" - a daemon dealing directly with XenServer or XAPI capable hosts. This is where users are stored, and it's the center point for talking to your whole Xen infrastructure.
- The web interface is "[xo-web](https://github.com/vatesfr/xen-orchestra/tree/master/packages/xo-web)" - it runs directly from your browser. The connection with ```xo-server``` is done via *WebSockets*.
- "[xo-cli](https://github.com/vatesfr/xen-orchestra/tree/master/packages/xo-cli)" is a module allowing you to send commands directly from the command line.
We have other modules as well (the LDAP plugin, for example). This modular architecture lets us add further parts as we need them: it's completely flexible, allowing us to adapt Xen Orchestra to every existing workflow.

Binary files not shown: 13 deleted image assets (4.4 KiB to 51 KiB).
Some files were not shown because too many files have changed in this diff.