Mirror of https://github.com/sphinx-doc/sphinx.git
Synced 2025-02-25 18:55:22 -06:00

Commit 3ad1f1c164: Merge with default

@@ -7,15 +7,15 @@
^build/
^dist/
^tests/.coverage
^tests/build/
^sphinx/pycode/Grammar.*pickle
^Sphinx.egg-info/
^doc/_build/
^TAGS
^\.tags
^\.ropeproject/
^env/
\.DS_Store$
~$
^utils/.*3\.py$
^distribute-
CHANGES
@@ -12,12 +12,14 @@ Incompatible changes

* A new node, ``sphinx.addnodes.literal_strong``, has been added, for text that
  should appear literally (i.e. no smart quotes) in strong font.  Custom writers
  will have to be adapted to handle this node.
* PR#269, #1476: replace ``<tt>`` tag by ``<code>``.  User customized stylesheets
  should be updated if the CSS contains some styles for the ``<tt>`` tag.
  Thanks to Takeshi Komiya.
* #1543: `templates_path` is automatically added to
  `exclude_patterns` to avoid reading autosummary rst templates in the
  templates directory.
* Custom domains should implement the new `Domain.resolve_any_xref`
  method to make the `any` role work properly.
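The expected shape of that method can be sketched with a stand-in class; this is an illustration of the contract only, not Sphinx's real ``Domain`` base class, and the ``objects`` storage layout and ``my:obj`` role name are assumptions:

```python
# Illustration of the resolve_any_xref contract only -- not Sphinx's real
# Domain base class.  The 'objects' layout and the 'my:obj' role are made up.
class MyDomain:
    name = 'my'

    def __init__(self):
        # target name -> document it is defined in (assumed storage layout)
        self.objects = {'greet': 'api'}

    def resolve_any_xref(self, env, fromdocname, builder, target, node, contnode):
        # Return every ('domain:role', reference node) pair that could match
        # the unqualified target; Sphinx warns when several domains respond.
        results = []
        if target in self.objects:
            results.append(('my:obj', {'refdoc': self.objects[target]}))
        return results
```

A real implementation would build actual reference nodes (e.g. via ``make_refnode``); returning an empty list means the domain has no match for the target.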

Features added
--------------
@@ -26,22 +28,31 @@ Features added

* Add support for docutils 0.12
* Added ``sphinx.ext.napoleon`` extension for NumPy and Google style docstring
  support.
* Added support for parallel reading (parsing) of source files with the
  `sphinx-build -j` option.  Third-party extensions will need to be checked for
  compatibility and may need to be adapted if they store information in the
  build environment object.  See `env-merge-info`.
* Added the `any` role that can be used to find a cross-reference of
  *any* type in *any* domain.  Custom domains should implement the new
  `Domain.resolve_any_xref` method to make this work properly.
* Exception logs now contain the last 10 messages emitted by Sphinx.
* Added support for extension versions (a string returned by ``setup()``; these
  can be shown in the traceback log files).  Version requirements for extensions
  can be specified in projects using the new `needs_extensions` config
  value.
* Changing the default role within a document with the :dudir:`default-role`
  directive is now supported.
* PR#214: Added stemming support for 14 languages, so that the built-in document
  search can now handle these.  Thanks to Shibukawa Yoshiki.
* PR#202: Allow "." and "~" prefixed references in ``:param:`` doc fields
  for Python.
* PR#184: Add `autodoc_mock_imports`, which allows mocking imports of
  external modules that need not be present when autodocumenting.
* #925: Allow list-typed config values to be provided on the command line,
  like ``-D key=val1,val2``.
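The behaviour of such an override can be illustrated with a small parser; ``parse_override`` is a hypothetical helper, not Sphinx's actual command-line handling:

```python
def parse_override(arg):
    # Split one "-D key=val1,val2" style override; a comma-separated value
    # becomes a list, mirroring list-typed config values.
    key, _, value = arg.partition('=')
    if ',' in value:
        return key, value.split(',')
    return key, value
```

For example, ``parse_override('exclude_patterns=_build,_templates')`` yields the key with a two-element list, while a value without commas stays a plain string.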
* #668: Allow line numbering of `code-block` and `literalinclude` directives
  to start at an arbitrary line number, with a new ``lineno-start`` option.
* PR#172, PR#266: The `code-block` and `literalinclude`
  directives now can have a ``caption`` option that shows a filename before the
  code in the output.  Thanks to Nasimul Haque, Takeshi Komiya.
* Prompt for the document language in sphinx-quickstart.
@@ -56,135 +67,43 @@ Features added

  for the ids defined on the node.  Thanks to Olivier Heurtier.
* PR#229: Allow registration of other translators.  Thanks to Russell Sim.
* Add app.set_translator() API to register or override a Docutils translator
  class like `html_translator_class`.
* PR#267, #1134: add 'diff' parameter to literalinclude.  Thanks to Richard Wall
  and WAKAYAMA shirou.
* PR#272: Added 'bizstyle' theme.  Thanks to Shoji KUMAGAI.
* Automatically compile ``*.mo`` files from ``*.po`` files when
  `gettext_auto_build` is True (default) and the ``*.po`` file is newer than
  the ``*.mo`` file.
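The staleness check behind that automatic compilation can be sketched as follows; ``mo_outdated`` is a hypothetical helper illustrating the idea, not Sphinx's implementation:

```python
import os

def mo_outdated(po_path, mo_path):
    # Recompile when the .mo file is missing or older than its .po source.
    if not os.path.exists(mo_path):
        return True
    return os.path.getmtime(po_path) > os.path.getmtime(mo_path)
```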
* #623: `sphinx.ext.viewcode` supports imported function/class aliases.
* PR#275: `sphinx.ext.intersphinx` supports multiple targets for the
  inventory.  Thanks to Brigitta Sipocz.
* PR#261: Added the `env-before-read-docs` event that can be connected to modify
  the order of documents before they are read by the environment.
* #1284: Program options documented with :rst:dir:`option` can now start with
  ``+``.
* PR#291: The caption of :rst:dir:`code-block` is recognised as a title of ref
  target.  Thanks to Takeshi Komiya.

Bugs fixed
----------

* #1568: Fix a crash when a "centered" directive contains a reference.
* #1563: :meth:`~sphinx.application.Sphinx.add_search_language` raises
  AssertionError for correct type of argument.  Thanks to rikoman.
* #1174: Fix smart quotes being applied inside roles like :rst:role:`program` or
  `makevar`.
* PR#235: comment db schema of websupport lacked a length of the node_id field.
  Thanks to solos.
* #1466, PR#241: Fix failure of the cpp domain parser to parse C++11
  "variadic templates" declarations.  Thanks to Victor Zverovich.
* #1459, PR#244: Fix default mathjax js path pointing to ``http://``, which
  caused mixed-content errors on HTTPS servers.  Thanks to sbrandtb and robo9k.
* PR#157: autodoc removes spurious signatures from @property decorated
  attributes.  Thanks to David Ham.
* PR#159: Add coverage targets to quickstart generated Makefile and make.bat.
  Thanks to Matthias Troffaes.
* #1251: When specifying toctree :numbered: option and :tocdepth: metadata,
  a sub-section number at a greater depth than ``:tocdepth:`` is shrunk.
* PR#260: Encode underscore in citation labels for latex export.  Thanks to
  Lennart Fricke.
* PR#264: Fix could not resolve xref for figure node with :name: option.
@@ -208,8 +127,8 @@ Bugs fixed

  qualified name.  It should be rather easy to change this behaviour and
  potentially index by namespaces/classes as well.

* PR#258, #939: Add dedent option for `code-block` and
  `literalinclude`.  Thanks to Zafar Siddiqui.
* PR#268: Fix numbering section does not work at singlehtml mode.  It is still
  an ad-hoc fix because there is an issue that section IDs conflict.
  Thanks to Takeshi Komiya.
@@ -217,20 +136,18 @@ Bugs fixed

  Takeshi Komiya.
* PR#274: Set the URL as the default title value if a URL appears in the
  toctree.  Thanks to Takeshi Komiya.
* PR#276, #1381: `rfc` and `pep` roles support custom link
  text.  Thanks to Takeshi Komiya.
* PR#277, #1513: highlights for function pointers in the argument list of
  `c:function`.  Thanks to Takeshi Komiya.
* PR#278: Fix section entries were shown twice if a toctree has been put under
  an only directive.  Thanks to Takeshi Komiya.
* #1547: pgen2 tokenizer doesn't recognize the ``...`` literal (Ellipsis for py3).
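The expected behaviour can be checked against the standard library tokenizer, which on Python 3 does treat ``...`` as a single operator token; ``operator_tokens`` is just an illustrative helper:

```python
import io
import tokenize

def operator_tokens(source):
    # Collect all OP tokens produced for the given source string; on Python 3
    # the Ellipsis literal '...' arrives as one OP token, not three dots.
    readline = io.StringIO(source).readline
    return [tok.string for tok in tokenize.generate_tokens(readline)
            if tok.type == tokenize.OP]
```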

Documentation
-------------

* Add clarification about the syntax of tags.  (:file:`doc/markup/misc.rst`)


Release 1.2.3 (released Sep 1, 2014)
@@ -239,7 +156,7 @@ Release 1.2.3 (released Sep 1, 2014)

Features added
--------------

* #1518: The ``sphinx-apidoc`` command now has a ``--version`` option to show
  version information and exit.
* New locales: Hebrew, European Portuguese, Vietnamese.
@@ -257,14 +174,14 @@ Bugs fixed

  Thanks to Jorge_C.
* #1467: Exception on Python3 if a nonexistent method is specified by automethod.
* #1441: autosummary can't handle nested classes correctly.
* #1499: With a non-callable ``setup`` in a conf.py, sphinx-build now emits
  a user-friendly error message.
* #1502: In autodoc, fix display of parameter defaults containing backslashes.
* #1226: autodoc, autosummary: importing setup.py by automodule would invoke
  the setup process and execute ``sys.exit()``.  Now Sphinx avoids the SystemExit
  exception and emits warnings without unexpected termination.
||||||
* #1503: py:function directive generate incorrectly signature when specifying
|
* #1503: py:function directive generate incorrectly signature when specifying
|
||||||
a default parameter with an empty list `[]`. Thanks to Geert Jansen.
|
a default parameter with an empty list ``[]``. Thanks to Geert Jansen.
|
||||||
* #1508: Non-ASCII filename raise exception on make singlehtml, latex, man,
|
* #1508: Non-ASCII filename raise exception on make singlehtml, latex, man,
|
||||||
texinfo and changes.
|
texinfo and changes.
|
||||||
* #1531: On Python3 environment, docutils.conf with 'source_link=true' in the
|
* #1531: On Python3 environment, docutils.conf with 'source_link=true' in the
|
||||||
@@ -274,11 +191,11 @@ Bugs fixed

* PR#281, PR#282, #1509: TODO extension not compatible with websupport.  Thanks
  to Takeshi Komiya.
* #1477: gettext does not extract nodes.line in a table or list.
* #1544: ``make text`` generates a wrong table when it has empty table cells.
* #1522: Footnotes from tables get displayed twice in LaTeX.  This problem
  appeared in Sphinx 1.2.1 because of #949.
* #508: Sphinx exits with zero every time it is invoked from a setup.py command;
  e.g. ``python setup.py build_sphinx -b doctest`` returns zero even if doctests
  failed.


Release 1.2.2 (released Mar 2, 2014)
@@ -287,7 +204,7 @@ Release 1.2.2 (released Mar 2, 2014)

Bugs fixed
----------

* PR#211: When checking for existence of the `html_logo` file, check
  the full relative path and not the basename.
* PR#212: Fix traceback with autodoc and ``__init__`` methods without docstring.
* PR#213: Fix a missing import in the setup command.
@@ -305,7 +222,7 @@ Bugs fixed

* #1370: Ignore "toctree" nodes in text writer, instead of raising.
* #1364: Fix 'make gettext' fails when the '.. todolist::' directive is present.
* #1367: Fix a change of PR#96 that broke the
  sphinx.util.docfields.Field.make_field interface/behavior for ``item``
  argument usage.

Documentation
-------------
@@ -327,7 +244,7 @@ Bugs fixed

  This was caused by a change for #1138.
* #1340: Can't search alphabetical words on the HTML quick search generated
  with language='ja'.
* #1319: Do not crash if the `html_logo` file does not exist.
* #603: Do not use the HTML-ized title for building the search index (that
  resulted in "literal" being found on every page with a literal in the
  title).
@@ -344,7 +261,7 @@ Bugs fixed

  if they contain uppercase letters.
* #923: Take the entire LaTeX document into account when caching
  pngmath-generated images.  This rebuilds them correctly when
  `pngmath_latex_preamble` changes.
* #901: Emit a warning when using docutils' new "math" markup without a Sphinx
  math extension active.
* #845: In code blocks, when the selected lexer fails, display line numbers
@@ -361,14 +278,14 @@ Bugs fixed

* #1155: Fix autodocumenting C-defined methods as attributes in Python 3.
* #1233: Allow finding both Python classes and exceptions with the "class" and
  "exc" roles in intersphinx.
* #1198: Allow "image" for the "figwidth" option of the :dudir:`figure`
  directive as documented by docutils.
* #1152: Fix pycode parsing errors of Python 3 code by including two grammar
  versions for Python 2 and 3, and loading the appropriate version for the
  running Python version.
* #1017: Be helpful and tell the user when the argument to :rst:dir:`option`
  does not match the required format.
* #1345: Fix two bugs with `nitpick_ignore`; now you don't have to
  remove the stored environment for changes to take effect.
* #1072: In the JS search, fix issues searching for upper-cased words by
  lowercasing words before stemming.
@@ -391,8 +308,8 @@ Bugs fixed

* #1300: Fix references not working in translated documents in some instances.
* #1283: Fix a bug in the detection of changed files that would try to access
  doctrees of deleted documents.
* #1330: Fix `exclude_patterns` behavior with subdirectories in the
  `html_static_path`.
* #1323: Fix emitting empty ``<ul>`` tags in the HTML writer, which is not
  valid HTML.
* #1147: Don't emit a sidebar search box in the "singlehtml" builder.
@@ -424,7 +341,7 @@ Bugs fixed

* Restore ``versionmodified`` CSS class for versionadded/changed and deprecated
  directives.

* PR#181: Fix ``html_theme_path = ['.']`` always triggering a rebuild of all
  documents (this change keeps the current "theme changes cause a rebuild"
  feature).
@@ -491,7 +408,7 @@ Features added

* Support docutils.conf 'writers' and 'html4css1 writer' sections in the HTML
  writer.  The latex, manpage and texinfo writers also support their respective
  'writers' sections.
* The new `html_extra_path` config value allows specifying directories
  with files that should be copied directly to the HTML output directory.
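In conf.py this looks like the following sketch; the file and directory names are examples, not defaults:

```python
# conf.py -- copy these files/directories verbatim into the HTML output root.
# The names below are hypothetical examples.
html_extra_path = ['robots.txt', 'extra']
```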
* Autodoc directives for module data and attributes now support an
  ``annotation`` option, so that the default display of the data/attribute
@@ -562,10 +479,10 @@ Incompatible changes

* Removed ``sphinx.util.compat.directive_dwim()`` and
  ``sphinx.roles.xfileref_role()``, which were deprecated since version 1.0.
* PR#122: the files given in `latex_additional_files` now override TeX
  files included by Sphinx, such as ``sphinx.sty``.
* PR#124: the node generated by the `versionadded`,
  `versionchanged` and `deprecated` directives now includes
  all added markup (such as "New in version X") as child nodes, and no
  additional text must be generated by writers.
* PR#99: the :rst:dir:`seealso` directive now generates admonition nodes instead
@@ -619,7 +536,7 @@ Features added
    asterisks ("*").
  - The default value for the ``paragraphindent`` has been changed from 2 to 0
    meaning that paragraphs are no longer indented by default.
  - #1110: A new configuration value `texinfo_no_detailmenu` has been
    added for controlling whether a ``@detailmenu`` is added in the "Top"
    node's menu.
  - Detailed menus are no longer created except for the "Top" node.
@@ -628,16 +545,16 @@ Features added

* LaTeX builder:

  - PR#115: Add ``'transition'`` item in `latex_elements` for
    customizing how transitions are displayed. Thanks to Jeff Klukas.
  - PR#114: The LaTeX writer now includes the "cmap" package by default. The
    ``'cmappkg'`` item in `latex_elements` can be used to control this.
    Thanks to Dmitry Shachnev.
  - The ``'fontpkg'`` item in `latex_elements` now defaults to ``''``
    when the `language` uses the Cyrillic script. Suggested by Dmitry
    Shachnev.
  - The `latex_documents`, `texinfo_documents`, and
    `man_pages` configuration values will be set to default values based
    on the :confval:`master_doc` if not explicitly set in :file:`conf.py`.
    Previously, if these values were not set, no output would be generated by
    their respective builders.
@@ -655,13 +572,13 @@ Features added
  - Added the Docutils-native XML and pseudo-XML builders. See
    :class:`XMLBuilder` and :class:`PseudoXMLBuilder`.
  - PR#45: The linkcheck builder now checks ``#anchor``\ s for existence.
  - PR#123, #1106: Add `epub_use_index` configuration value. If
    provided, it will be used instead of `html_use_index` for the epub
    builder.
  - PR#126: Add `epub_tocscope` configuration value. The setting
    controls the generation of the epub toc. The user can now also include
    hidden toc entries.
  - PR#112: Add `epub_show_urls` configuration value.

* Extensions:

@@ -729,7 +646,7 @@ Bugs fixed
* #1127: Fix traceback when autodoc tries to tokenize a non-Python file.
* #1126: Fix double-hyphen to en-dash conversion in wrong places such as
  command-line option names in LaTeX.
* #1123: Allow whitespace in filenames given to `literalinclude`.
* #1120: Improved i18n support for the built-in themes "basic", "haiku" and
  "scrolls". Thanks to Leonardo J. Caballero G.
* #1118: Updated Spanish translation. Thanks to Leonardo J. Caballero G.
@@ -737,7 +654,7 @@ Bugs fixed
* #1112: Avoid duplicate download files when referenced from documents in
  different ways (absolute/relative).
* #1111: Fix failure to find uppercase words in search when
  `html_search_language` is 'ja'. Thanks to Tomo Saito.
* #1108: The text writer now correctly numbers enumerated lists with
  non-default start values (based on patch by Ewan Edwards).
* #1102: Support multi-context "with" statements in autodoc.
@@ -802,7 +719,7 @@ Release 1.1.3 (Mar 10, 2012)
* #860: Do not crash when encountering invalid doctest examples, just
  emit a warning.

* #864: Fix crash with some settings of `modindex_common_prefix`.

* #862: Fix handling of ``-D`` and ``-A`` options on Python 3.

@@ -866,7 +783,7 @@ Release 1.1 (Oct 9, 2011)
Incompatible changes
--------------------

* The `py:module` directive doesn't output its ``platform`` option
  value anymore. (It was the only thing that the directive did output, and
  therefore quite inconsistent.)

@@ -902,7 +819,7 @@ Features added
    :rst:dir:`toctree`\'s ``numbered`` option.
  - #586: Implemented improved :rst:dir:`glossary` markup which allows
    multiple terms per definition.
  - #478: Added `py:decorator` directive to describe decorators.
  - C++ domain now supports array definitions.
  - C++ domain now supports doc fields (``:param x:`` inside directives).
  - Section headings in :rst:dir:`only` directives are now correctly
@@ -913,7 +830,7 @@ Features added
* HTML builder:

  - Added ``pyramid`` theme.
  - #559: `html_add_permalinks` is now a string giving the
    text to display in permalinks.
  - #259: HTML table rows now have even/odd CSS classes to enable
    "Zebra styling".
@@ -921,26 +838,26 @@ Features added

* Other builders:

  - #516: Added new value of the `latex_show_urls` option to
    show the URLs in footnotes.
  - #209: Added `text_newlines` and `text_sectionchars`
    config values.
  - Added `man_show_urls` config value.
  - #472: linkcheck builder: Check links in parallel, use HTTP HEAD
    requests and allow configuring the timeout. New config values:
    `linkcheck_timeout` and `linkcheck_workers`.
  - #521: Added `linkcheck_ignore` config value.
  - #28: Support row/colspans in tables in the LaTeX builder.

* Configuration and extensibility:

  - #537: Added `nitpick_ignore`.
  - #306: Added :event:`env-get-outdated` event.
  - :meth:`.Application.add_stylesheet` now accepts full URIs.

* Autodoc:

  - #564: Add `autodoc_docstring_signature`. When enabled (the
    default), autodoc retrieves the signature from the first line of the
    docstring, if it is found there.
  - #176: Provide ``private-members`` option for autodoc directives.
@@ -958,12 +875,12 @@ Features added
  - Added ``inline`` option to graphviz directives, and fixed the
    default (block-style) in LaTeX output.
  - #590: Added ``caption`` option to graphviz directives.
  - #553: Added `testcleanup` blocks in the doctest extension.
  - #594: `trim_doctest_flags` now also removes ``<BLANKLINE>``
    indicators.
  - #367: Added automatic exclusion of hidden members in inheritance
    diagrams, and an option to selectively enable it.
  - Added `pngmath_add_tooltips`.
  - The math extension displaymath directives now support ``name`` in
    addition to ``label`` for giving the equation label, for compatibility
    with Docutils.
@@ -1036,7 +953,7 @@ Release 1.0.8 (Sep 23, 2011)
* #669: Respect the ``noindex`` flag option in py:module directives.

* #675: Fix IndexErrors when including nonexistent lines with
  `literalinclude`.

* #676: Respect custom function/method parameter separator strings.

@@ -1119,7 +1036,7 @@ Release 1.0.6 (Jan 04, 2011)
* #570: Try decoding ``-D`` and ``-A`` command-line arguments with
  the locale's preferred encoding.

* #528: Observe `locale_dirs` when looking for the JS
  translations file.

* #574: Add special code for better support of Japanese documents
@@ -1292,51 +1209,51 @@ Features added

  - Added a "nitpicky" mode that emits warnings for all missing
    references. It is activated by the :option:`-n` command-line switch
    or the `nitpicky` config value.
  - Added ``latexpdf`` target in quickstart Makefile.

* Markup:

  - The `menuselection` and `guilabel` roles now
    support ampersand accelerators.
  - New more compact doc field syntax is now recognized: ``:param type
    name: description``.
  - Added ``tab-width`` option to `literalinclude` directive.
  - Added ``titlesonly`` option to :rst:dir:`toctree` directive.
  - Added the ``prepend`` and ``append`` options to the
    `literalinclude` directive.
  - #284: All docinfo metadata is now put into the document metadata, not
    just the author.
  - The `ref` role can now also reference tables by caption.
  - The :dudir:`include` directive now supports absolute paths, which
    are interpreted as relative to the source directory.
  - In the Python domain, references like ``:func:`.name``` now look for
    matching names with any prefix if no direct match is found.

* Configuration:

  - Added `rst_prolog` config value.
  - Added `html_secnumber_suffix` config value to control
    section numbering format.
  - Added `html_compact_lists` config value to control
    docutils' compact lists feature.
  - The `html_sidebars` config value can now contain patterns
    as keys, and the values can be lists that explicitly select which
    sidebar templates should be rendered. That means that the builtin
    sidebar contents can be included only selectively.
  - `html_static_path` can now contain single file entries.
  - The new universal config value `exclude_patterns` makes the
    old ``unused_docs``, ``exclude_trees`` and
    ``exclude_dirnames`` obsolete.
  - Added `html_output_encoding` config value.
  - Added the `latex_docclass` config value and made the
    "twoside" documentclass option overridable by "oneside".
  - Added the `trim_doctest_flags` config value, which is true
    by default.
  - Added `html_show_copyright` config value.
  - Added `latex_show_pagerefs` and `latex_show_urls`
    config values.
  - The behavior of `html_file_suffix` changed slightly: the
    empty string now means "no suffix" instead of "default suffix", use
    ``None`` for "default suffix".

@@ -1378,7 +1295,7 @@ Features added
* Extension API:

  - Added :event:`html-collect-pages`.
  - Added `needs_sphinx` config value and
    :meth:`~sphinx.application.Sphinx.require_sphinx` application API
    method.
  - #200: Added :meth:`~sphinx.application.Sphinx.add_stylesheet`
@@ -1390,7 +1307,7 @@ Features added
  - Added the :mod:`~sphinx.ext.extlinks` extension.
  - Added support for source ordering of members in autodoc, with
    ``autodoc_member_order = 'bysource'``.
  - Added `autodoc_default_flags` config value, which can be
    used to select default flags for all autodoc directives.
  - Added a way for intersphinx to refer to named labels in other
    projects, and to specify the project you want to link to.
@@ -1400,7 +1317,7 @@ Features added
    extension, thanks to Pauli Virtanen.
  - #309: The :mod:`~sphinx.ext.graphviz` extension can now output SVG
    instead of PNG images, controlled by the
    `graphviz_output_format` config value.
  - Added ``alt`` option to :rst:dir:`graphviz` extension directives.
  - Added ``exclude`` argument to :func:`.autodoc.between`.


Makefile | 4

@@ -48,10 +48,10 @@ reindent:
	@$(PYTHON) utils/reindent.py -r -n .
endif

test:
	@cd tests; $(PYTHON) run.py -d -m '^[tT]est' $(TEST)

covertest:
	@cd tests; $(PYTHON) run.py -d -m '^[tT]est' --with-coverage \
	--cover-package=sphinx $(TEST)


README.rst | 10

@@ -2,6 +2,9 @@
README for Sphinx
=================

This is the Sphinx documentation generator, see http://sphinx-doc.org/.


Installing
==========

@@ -17,7 +20,7 @@ Reading the docs

After installing::

   cd doc
   make html

Then, direct your browser to ``_build/html/index.html``.

@@ -35,6 +38,11 @@ If you want to use a different interpreter, e.g. ``python3``, use::

   PYTHON=python3 make test

Continuous testing runs on drone.io:

.. image:: https://drone.io/bitbucket.org/birkenfeld/sphinx/status.png
   :target: https://drone.io/bitbucket.org/birkenfeld/sphinx/


Contributing
============

doc/_templates/index.html (vendored) | 3

@@ -34,6 +34,9 @@
<li>{%trans path=pathto('extensions')%}<b>Extensions:</b> automatic testing of code snippets, inclusion of
  docstrings from Python modules (API docs), and
  <a href="{{ path }}#builtin-sphinx-extensions">more</a>{%endtrans%}</li>
<li>{%trans path=pathto('develop')%}<b>Contributed extensions:</b> more than
  50 extensions <a href="{{ path }}#extensions">contributed by users</a>
  in a second repository; most of them installable from PyPI{%endtrans%}</li>
</ul>
<p>{%trans%}
Sphinx uses <a href="http://docutils.sf.net/rst.html">reStructuredText</a>

doc/_templates/indexsidebar.html (vendored) | 2

@@ -3,7 +3,7 @@
{%trans%}project{%endtrans%}</p>

<h3>Download</h3>
{% if version.endswith('a0') %}
<p>{%trans%}This documentation is for version <b>{{ version }}</b>, which is
  not released yet.{%endtrans%}</p>
<p>{%trans%}You can use it from the

@@ -1,9 +1,9 @@
:tocdepth: 2

.. _authors:

Sphinx authors
==============

.. include:: ../AUTHORS


@@ -1,5 +1,7 @@
:tocdepth: 2

.. default-role:: any

.. _changes:

Changes in Sphinx

@@ -83,7 +83,7 @@ texinfo_documents = [

# We're not using intersphinx right now, but if we did, this would be part of
# the mapping:
intersphinx_mapping = {'python': ('http://docs.python.org/2/', None)}

# Sphinx document translation with sphinx gettext feature uses these settings:
locale_dirs = ['locale/']

@@ -707,7 +707,7 @@ that use Sphinx's HTMLWriter class.

.. confval:: html_use_opensearch

   If nonempty, an `OpenSearch <http://opensearch.org>`_ description file will be
   output, and all pages will contain a ``<link>`` tag referring to it. Since
   OpenSearch doesn't support relative URLs for its search page location, the
   value of this option must be the base URL from which these documents are

@@ -130,6 +130,11 @@ These are the basic steps needed to start developing on Sphinx.
* For bug fixes, first add a test that fails without your changes and passes
  after they are applied.

* Tests that need a sphinx-build run should be integrated in one of the
  existing test modules if possible. New tests that use ``@with_app`` and
  then ``build_all`` for a few assertions are not good since *the test suite
  should not take more than a minute to run*.

#. Please add a bullet point to :file:`CHANGES` if the fix or feature is not
   trivial (small doc updates, typo fixes). Then commit::


@@ -437,6 +437,19 @@ handlers to the events. Example:

.. versionadded:: 0.5

.. event:: env-before-read-docs (app, env, docnames)

   Emitted after the environment has determined the list of all added and
   changed files and just before it reads them. It allows extension authors to
   reorder the list of docnames (*inplace*) before processing, or add more
   docnames that Sphinx did not consider changed (but never add any docnames
   that are not in ``env.found_docs``).

   You can also remove document names; do this with caution since it will make
   Sphinx treat changed files as unchanged.

   .. versionadded:: 1.3

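A handler for this event mutates the *docnames* list in place. A minimal
sketch (the handler name and sample docnames are illustrative, not part of
Sphinx; a real extension would register it via
``app.connect('env-before-read-docs', sort_docnames)`` in its ``setup()``):

```python
def sort_docnames(app, env, docnames):
    """Process documents in a stable, sorted order."""
    docnames.sort()  # reorder in place; never add names outside env.found_docs

# Stand-in call without a real app/env, just to show the in-place effect:
docs = ['zzz_last', 'index', 'api/reference']
sort_docnames(None, None, docs)
print(docs)  # ['api/reference', 'index', 'zzz_last']
```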
.. event:: source-read (app, docname, source)

   Emitted when a source file has been read. The *source* argument is a list
@ -480,6 +493,26 @@ handlers to the events. Example:
|
|||||||
   Here is the place to replace custom nodes that don't have visitor methods in
   the writers, so that they don't cause errors when the writers encounter them.

.. event:: env-merge-info (env, docnames, other)

   This event is only emitted when parallel reading of documents is enabled. It
   is emitted once for every subprocess that has read some documents.

   You must handle this event in an extension that stores data in the
   environment in a custom location. Otherwise the environment in the main
   process will not be aware of the information stored in the subprocess.

   *other* is the environment object from the subprocess, *env* is the
   environment from the main process. *docnames* is a set of document names
   that have been read in the subprocess.

   For a sample of how to deal with this event, look at the standard
   ``sphinx.ext.todo`` extension. The implementation is often similar to that
   of :event:`env-purge-doc`, only that information is not removed, but added to
   the main environment from the other environment.

   .. versionadded:: 1.3

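A sketch of such a merge handler, modeled on the ``sphinx.ext.todo`` pattern
(the ``todo_all_todos`` attribute and the stub environments are illustrative
stand-ins, not real Sphinx objects):

```python
def merge_info(app, env, docnames, other):
    """Append the subprocess's collected items to the main environment."""
    if not hasattr(env, 'todo_all_todos'):
        env.todo_all_todos = []
    env.todo_all_todos.extend(
        t for t in getattr(other, 'todo_all_todos', [])
        if t['docname'] in docnames)  # only docs read by that subprocess

# Stand-in environments instead of real BuildEnvironment objects:
class Env:
    pass

main, sub = Env(), Env()
sub.todo_all_todos = [{'docname': 'intro'}, {'docname': 'skipped'}]
merge_info(None, main, {'intro'}, sub)
print(main.todo_all_todos)  # [{'docname': 'intro'}]
```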
.. event:: env-updated (app, env)

   Emitted when the :meth:`update` method of the build environment has

@@ -18,15 +18,32 @@ imports this module and executes its ``setup()`` function, which in turn
notifies Sphinx of everything the extension offers -- see the extension tutorial
for examples.

The configuration file itself can be treated as an extension if it contains a
``setup()`` function. All other extensions to load must be listed in the
:confval:`extensions` configuration value.

Extension metadata
------------------

.. versionadded:: 1.3

The ``setup()`` function can return a dictionary. This is treated by Sphinx
as metadata of the extension. Metadata keys currently recognized are:

* ``'version'``: a string that identifies the extension version. It is used for
  extension version requirement checking (see :confval:`needs_extensions`) and
  informational purposes. If not given, ``"unknown version"`` is substituted.
* ``'parallel_read_safe'``: a boolean that specifies if parallel reading of
  source files can be used when the extension is loaded. It defaults to
  ``False``, i.e. you have to explicitly specify your extension to be
  parallel-read-safe after checking that it is.
* ``'parallel_write_safe'``: a boolean that specifies if parallel writing of
  output files can be used when the extension is loaded. Since extensions
  usually don't negatively influence the process, this defaults to ``True``.

|
APIs used for writing extensions
|
||||||
|
--------------------------------
|
||||||
|
|
||||||
.. toctree::
|
.. toctree::
|
||||||
|
|
||||||
tutorial
|
tutorial
|
||||||
|
@@ -162,7 +162,7 @@ new Python module called :file:`todo.py` and add the setup function::
     app.connect('doctree-resolved', process_todo_nodes)
     app.connect('env-purge-doc', purge_todos)
 
-    return '0.1'  # identifies the version of our extension
+    return {'version': '0.1'}  # identifies the version of our extension
 
 The calls in this function refer to classes and functions not yet written. What
 the individual calls do is the following:
@@ -36,21 +36,29 @@ installed) and handled in a smart way:
   highlighted as Python).
 
 * The highlighting language can be changed using the ``highlight`` directive,
-  used as follows::
+  used as follows:
 
-     .. highlight:: c
+  .. rst:directive:: .. highlight:: language
 
-  This language is used until the next ``highlight`` directive is encountered.
+     Example::
+
+        .. highlight:: c
+
+     This language is used until the next ``highlight`` directive is encountered.
 
 * For documents that have to show snippets in different languages, there's also
   a :rst:dir:`code-block` directive that is given the highlighting language
-  directly::
+  directly:
 
-     .. code-block:: ruby
+  .. rst:directive:: .. code-block:: language
 
-        Some Ruby code.
+     Use it like this::
+
+        .. code-block:: ruby
+
+           Some Ruby code.
 
-  The directive's alias name :rst:dir:`sourcecode` works as well.
+     The directive's alias name :rst:dir:`sourcecode` works as well.
 
 * The valid values for the highlighting language are:
 
@@ -12,7 +12,9 @@ They are written as ``:rolename:`content```.
 
 The default role (```content```) has no special meaning by default. You are
 free to use it for anything you like, e.g. variable names; use the
-:confval:`default_role` config value to set it to a known role.
+:confval:`default_role` config value to set it to a known role -- the
+:rst:role:`any` role to find anything or the :rst:role:`py:obj` role to find
+Python objects are very useful for this.
 
 See :ref:`domains` for roles added by domains.
 
@@ -38,12 +40,57 @@ more versatile:
 
 * If you prefix the content with ``~``, the link text will only be the last
   component of the target. For example, ``:py:meth:`~Queue.Queue.get``` will
-  refer to ``Queue.Queue.get`` but only display ``get`` as the link text.
+  refer to ``Queue.Queue.get`` but only display ``get`` as the link text. This
+  does not work with all cross-reference roles, but is domain specific.
 
 In HTML output, the link's ``title`` attribute (that is e.g. shown as a
 tool-tip on mouse-hover) will always be the full target name.
 
 
+.. _any-role:
+
+Cross-referencing anything
+--------------------------
+
+.. rst:role:: any
+
+   .. versionadded:: 1.3
+
+   This convenience role tries to do its best to find a valid target for its
+   reference text.
+
+   * First, it tries standard cross-reference targets that would be referenced
+     by :rst:role:`doc`, :rst:role:`ref` or :rst:role:`option`.
+
+     Custom objects added to the standard domain by extensions (see
+     :meth:`.add_object_type`) are also searched.
+
+   * Then, it looks for objects (targets) in all loaded domains. It is up to
+     the domains how specific a match must be. For example, in the Python
+     domain a reference of ``:any:`Builder``` would match the
+     ``sphinx.builders.Builder`` class.
+
+   If none or multiple targets are found, a warning will be emitted. In the
+   case of multiple targets, you can change "any" to a specific role.
+
+   This role is a good candidate for setting :confval:`default_role`. If you
+   do, you can write cross-references without a lot of markup overhead. For
+   example, in this Python function documentation ::
+
+      .. function:: install()
+
+         This function installs a `handler` for every signal known by the
+         `signal` module. See the section `about-signals` for more information.
+
+   there could be references to a glossary term (usually ``:term:`handler```), a
+   Python module (usually ``:py:mod:`signal``` or ``:mod:`signal```) and a
+   section (usually ``:ref:`about-signals```).
+
+   The :rst:role:`any` role also works together with the
+   :mod:`~sphinx.ext.intersphinx` extension: when no local cross-reference is
+   found, all object types of intersphinx inventories are also searched.
+
+
 Cross-referencing objects
 -------------------------
 
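The documentation above notes that the ``any`` role is a good candidate for :confval:`default_role`; enabling it is a one-line ``conf.py`` change. A configuration sketch (the value is exactly what the docs describe; everything else in ``conf.py`` is unchanged):

```python
# conf.py -- make bare `name` references resolve through the :any: role
default_role = 'any'
```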
@@ -25,6 +25,7 @@ class desc(nodes.Admonition, nodes.Element):
     contains one or more ``desc_signature`` and a ``desc_content``.
     """
 
+
 class desc_signature(nodes.Part, nodes.Inline, nodes.TextElement):
     """Node for object signatures.
@@ -39,33 +40,42 @@ class desc_addname(nodes.Part, nodes.Inline, nodes.TextElement):
 # compatibility alias
 desc_classname = desc_addname
 
+
 class desc_type(nodes.Part, nodes.Inline, nodes.TextElement):
     """Node for return types or object type names."""
 
+
 class desc_returns(desc_type):
     """Node for a "returns" annotation (a la -> in Python)."""
     def astext(self):
         return ' -> ' + nodes.TextElement.astext(self)
 
+
 class desc_name(nodes.Part, nodes.Inline, nodes.TextElement):
     """Node for the main object name."""
 
+
 class desc_parameterlist(nodes.Part, nodes.Inline, nodes.TextElement):
     """Node for a general parameter list."""
     child_text_separator = ', '
 
+
 class desc_parameter(nodes.Part, nodes.Inline, nodes.TextElement):
     """Node for a single parameter."""
 
+
 class desc_optional(nodes.Part, nodes.Inline, nodes.TextElement):
     """Node for marking optional parts of the parameter list."""
     child_text_separator = ', '
 
     def astext(self):
         return '[' + nodes.TextElement.astext(self) + ']'
 
+
 class desc_annotation(nodes.Part, nodes.Inline, nodes.TextElement):
     """Node for signature annotations (not Python 3-style annotations)."""
 
+
 class desc_content(nodes.General, nodes.Element):
     """Node for object description content.
@@ -82,15 +92,18 @@ class versionmodified(nodes.Admonition, nodes.TextElement):
     directives.
     """
 
+
 class seealso(nodes.Admonition, nodes.Element):
     """Custom "see also" admonition."""
 
+
 class productionlist(nodes.Admonition, nodes.Element):
     """Node for grammar production lists.
 
     Contains ``production`` nodes.
     """
 
+
 class production(nodes.Part, nodes.Inline, nodes.TextElement):
     """Node for a single grammar production rule."""
@@ -107,26 +120,33 @@ class index(nodes.Invisible, nodes.Inline, nodes.TextElement):
     *entrytype* is one of "single", "pair", "double", "triple".
     """
 
+
 class centered(nodes.Part, nodes.TextElement):
     """Deprecated."""
 
+
 class acks(nodes.Element):
     """Special node for "acks" lists."""
 
+
 class hlist(nodes.Element):
     """Node for "horizontal lists", i.e. lists that should be compressed to
     take up less vertical space.
     """
 
+
 class hlistcol(nodes.Element):
     """Node for one column in a horizontal list."""
 
+
 class compact_paragraph(nodes.paragraph):
     """Node for a compact paragraph (which never makes a <p> node)."""
 
+
 class glossary(nodes.Element):
     """Node to insert a glossary."""
 
+
 class only(nodes.Element):
     """Node for "only" directives (conditional inclusion based on tags)."""
@@ -136,14 +156,17 @@ class only(nodes.Element):
 class start_of_file(nodes.Element):
     """Node to mark start of a new file, used in the LaTeX builder only."""
 
+
 class highlightlang(nodes.Element):
     """Inserted to set the highlight language and line number options for
     subsequent code blocks.
     """
 
+
 class tabular_col_spec(nodes.Element):
     """Node for specifying tabular columns, used for LaTeX output."""
 
+
 class meta(nodes.Special, nodes.PreBibliographic, nodes.Element):
     """Node for meta directive -- same as docutils' standard meta node,
     but pickleable.
@@ -160,22 +183,27 @@ class pending_xref(nodes.Inline, nodes.Element):
     BuildEnvironment.resolve_references.
     """
 
+
 class download_reference(nodes.reference):
     """Node for download references, similar to pending_xref."""
 
+
 class literal_emphasis(nodes.emphasis):
     """Node that behaves like `emphasis`, but further text processors are not
     applied (e.g. smartypants for HTML output).
     """
 
+
 class literal_strong(nodes.strong):
     """Node that behaves like `strong`, but further text processors are not
     applied (e.g. smartypants for HTML output).
     """
 
+
 class abbreviation(nodes.Inline, nodes.TextElement):
     """Node for abbreviations with explanations."""
 
+
 class termsep(nodes.Structural, nodes.Element):
     """Separates two terms within a <term> node."""
@@ -88,7 +88,7 @@ def create_module_file(package, module, opts):
         text = format_heading(1, '%s module' % module)
     else:
         text = ''
-    #text += format_heading(2, ':mod:`%s` Module' % module)
+    # text += format_heading(2, ':mod:`%s` Module' % module)
     text += format_directive(module, package)
     write_file(makename(package, module), text, opts)
@@ -173,7 +173,7 @@ def shall_skip(module, opts):
     # skip if it has a "private" name and this is selected
     filename = path.basename(module)
     if filename != '__init__.py' and filename.startswith('_') and \
-        not opts.includeprivate:
+            not opts.includeprivate:
         return True
     return False
@@ -218,7 +218,7 @@ def recurse_tree(rootpath, excludes, opts):
         if is_pkg:
             # we are in a package with something to document
             if subs or len(py_files) > 1 or not \
-                shall_skip(path.join(root, INITPY), opts):
+                    shall_skip(path.join(root, INITPY), opts):
                 subpackage = root[len(rootpath):].lstrip(path.sep).\
                     replace(path.sep, '.')
                 create_package_file(root, root_package, subpackage,
@@ -318,7 +318,7 @@ Note: By default this script will not overwrite already created files.""")
     (opts, args) = parser.parse_args(argv[1:])
 
     if opts.show_version:
         print('Sphinx (sphinx-apidoc) %s' % __version__)
         return 0
 
     if not args:
@@ -20,7 +20,7 @@ import traceback
 from os import path
 from collections import deque
 
-from six import iteritems, itervalues
+from six import iteritems, itervalues, text_type
 from six.moves import cStringIO
 from docutils import nodes
 from docutils.parsers.rst import convert_directive_function, \
@@ -39,7 +39,8 @@ from sphinx.environment import BuildEnvironment, SphinxStandaloneReader
 from sphinx.util import pycompat  # imported for side-effects
 from sphinx.util.tags import Tags
 from sphinx.util.osutil import ENOENT
-from sphinx.util.console import bold, lightgray, darkgray
+from sphinx.util.console import bold, lightgray, darkgray, darkgreen, \
+    term_width_line
 
 if hasattr(sys, 'intern'):
     intern = sys.intern
@@ -49,8 +50,10 @@ events = {
     'builder-inited': '',
     'env-get-outdated': 'env, added, changed, removed',
     'env-purge-doc': 'env, docname',
+    'env-before-read-docs': 'env, docnames',
     'source-read': 'docname, source text',
     'doctree-read': 'the doctree before being pickled',
+    'env-merge-info': 'env, read docnames, other env instance',
     'missing-reference': 'env, node, contnode',
     'doctree-resolved': 'doctree, docname',
     'env-updated': 'env',
@@ -72,7 +75,7 @@ class Sphinx(object):
         self.verbosity = verbosity
         self.next_listener_id = 0
         self._extensions = {}
-        self._extension_versions = {}
+        self._extension_metadata = {}
         self._listeners = {}
         self.domains = BUILTIN_DOMAINS.copy()
         self.builderclasses = BUILTIN_BUILDERS.copy()
@@ -112,6 +115,10 @@ class Sphinx(object):
         # status code for command-line application
         self.statuscode = 0
 
+        if not path.isdir(outdir):
+            self.info('making output directory...')
+            os.makedirs(outdir)
+
         # read config
         self.tags = Tags(tags)
         self.config = Config(confdir, CONFIG_FILENAME,
@@ -128,7 +135,7 @@ class Sphinx(object):
             self.setup_extension(extension)
         # the config file itself can be an extension
         if self.config.setup:
-            # py31 doesn't have 'callable' function for bellow check
+            # py31 doesn't have 'callable' function for below check
            if hasattr(self.config.setup, '__call__'):
                 self.config.setup(self)
             else:
@@ -156,7 +163,7 @@ class Sphinx(object):
                           'version requirement for extension %s, but it is '
                           'not loaded' % extname)
                 continue
-            has_ver = self._extension_versions[extname]
+            has_ver = self._extension_metadata[extname]['version']
             if has_ver == 'unknown version' or needs_ver > has_ver:
                 raise VersionRequirementError(
                     'This project needs the extension %s at least in '
@@ -200,8 +207,8 @@ class Sphinx(object):
         else:
             try:
                 self.info(bold('loading pickled environment... '), nonl=True)
-                self.env = BuildEnvironment.frompickle(self.config,
-                    path.join(self.doctreedir, ENV_PICKLE_FILENAME))
+                self.env = BuildEnvironment.frompickle(
+                    self.config, path.join(self.doctreedir, ENV_PICKLE_FILENAME))
                 self.env.domains = {}
                 for domain in self.domains.keys():
                     # this can raise if the data version doesn't fit
@@ -245,6 +252,15 @@ class Sphinx(object):
             else:
                 self.builder.compile_update_catalogs()
                 self.builder.build_update()
+
+            status = (self.statuscode == 0
+                      and 'succeeded' or 'finished with problems')
+            if self._warncount:
+                self.info(bold('build %s, %s warning%s.' %
+                               (status, self._warncount,
+                                self._warncount != 1 and 's' or '')))
+            else:
+                self.info(bold('build %s.' % status))
         except Exception as err:
             # delete the saved env to force a fresh build next time
             envfile = path.join(self.doctreedir, ENV_PICKLE_FILENAME)
@@ -291,7 +307,7 @@ class Sphinx(object):
         else:
             location = None
         warntext = location and '%s: %s%s\n' % (location, prefix, message) or \
-                   '%s%s\n' % (prefix, message)
+            '%s%s\n' % (prefix, message)
         if self.warningiserror:
             raise SphinxWarning(warntext)
         self._warncount += 1
@@ -350,6 +366,48 @@ class Sphinx(object):
             message = message % (args or kwargs)
         self._log(lightgray(message), self._status)
 
+    def _display_chunk(chunk):
+        if isinstance(chunk, (list, tuple)):
+            if len(chunk) == 1:
+                return text_type(chunk[0])
+            return '%s .. %s' % (chunk[0], chunk[-1])
+        return text_type(chunk)
+
+    def old_status_iterator(self, iterable, summary, colorfunc=darkgreen,
+                            stringify_func=_display_chunk):
+        l = 0
+        for item in iterable:
+            if l == 0:
+                self.info(bold(summary), nonl=1)
+                l = 1
+            self.info(colorfunc(stringify_func(item)) + ' ', nonl=1)
+            yield item
+        if l == 1:
+            self.info()
+
+    # new version with progress info
+    def status_iterator(self, iterable, summary, colorfunc=darkgreen, length=0,
+                        stringify_func=_display_chunk):
+        if length == 0:
+            for item in self.old_status_iterator(iterable, summary, colorfunc,
+                                                 stringify_func):
+                yield item
+            return
+        l = 0
+        summary = bold(summary)
+        for item in iterable:
+            l += 1
+            s = '%s[%3d%%] %s' % (summary, 100*l/length,
+                                  colorfunc(stringify_func(item)))
+            if self.verbosity:
+                s += '\n'
+            else:
+                s = term_width_line(s)
+            self.info(s, nonl=1)
+            yield item
+        if l > 0:
+            self.info()
+
     # ---- general extensibility interface -------------------------------------
 
     def setup_extension(self, extension):
@@ -366,20 +424,22 @@ class Sphinx(object):
         if not hasattr(mod, 'setup'):
             self.warn('extension %r has no setup() function; is it really '
                       'a Sphinx extension module?' % extension)
-            version = None
+            ext_meta = None
         else:
             try:
-                version = mod.setup(self)
+                ext_meta = mod.setup(self)
             except VersionRequirementError as err:
                 # add the extension name to the version required
                 raise VersionRequirementError(
                     'The %s extension used by this project needs at least '
                     'Sphinx v%s; it therefore cannot be built with this '
                     'version.' % (extension, err))
-        if version is None:
-            version = 'unknown version'
+        if ext_meta is None:
+            ext_meta = {}
+        if not ext_meta.get('version'):
+            ext_meta['version'] = 'unknown version'
         self._extensions[extension] = mod
-        self._extension_versions[extension] = version
+        self._extension_metadata[extension] = ext_meta
 
     def require_sphinx(self, version):
         # check the Sphinx version if requested
@@ -461,7 +521,7 @@ class Sphinx(object):
         else:
             raise ExtensionError(
                 'Builder %r already exists (in module %s)' % (
-                builder.name, self.builderclasses[builder.name].__module__))
+                    builder.name, self.builderclasses[builder.name].__module__))
         self.builderclasses[builder.name] = builder
 
     def add_config_value(self, name, default, rebuild):
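The progress line built by the new ``status_iterator`` combines the summary, a right-aligned percentage, and the stringified item. A standalone sketch of just that formatting (the helper name ``progress_line`` is hypothetical; integer division stands in for the Python 2 ``/`` used in the diff):

```python
def progress_line(summary, item, done, total):
    # "%3d" right-aligns the percentage in three columns,
    # matching the '%s[%3d%%] %s' format used by status_iterator.
    return '%s[%3d%%] %s' % (summary, 100 * done // total, item)

line = progress_line('reading sources... ', 'index', 1, 2)
# -> 'reading sources... [ 50%] index'
```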
@ -22,7 +22,9 @@ from docutils import nodes
|
|||||||
|
|
||||||
from sphinx.util import i18n, path_stabilize
|
from sphinx.util import i18n, path_stabilize
|
||||||
from sphinx.util.osutil import SEP, relative_uri, find_catalog
|
from sphinx.util.osutil import SEP, relative_uri, find_catalog
|
||||||
from sphinx.util.console import bold, purple, darkgreen, term_width_line
|
from sphinx.util.console import bold, darkgreen
|
||||||
|
from sphinx.util.parallel import ParallelTasks, SerialTasks, make_chunks, \
|
||||||
|
parallel_available
|
||||||
|
|
||||||
# side effect: registers roles and directives
|
# side effect: registers roles and directives
|
||||||
from sphinx import roles
|
from sphinx import roles
|
||||||
@ -62,10 +64,17 @@ class Builder(object):
|
|||||||
self.tags.add(self.name)
|
self.tags.add(self.name)
|
||||||
self.tags.add("format_%s" % self.format)
|
self.tags.add("format_%s" % self.format)
|
||||||
self.tags.add("builder_%s" % self.name)
|
self.tags.add("builder_%s" % self.name)
|
||||||
|
# compatibility aliases
|
||||||
|
self.status_iterator = app.status_iterator
|
||||||
|
self.old_status_iterator = app.old_status_iterator
|
||||||
|
|
||||||
# images that need to be copied over (source -> dest)
|
# images that need to be copied over (source -> dest)
|
||||||
self.images = {}
|
self.images = {}
|
||||||
|
|
||||||
|
# these get set later
|
||||||
|
self.parallel_ok = False
|
||||||
|
self.finish_tasks = None
|
||||||
|
|
||||||
# load default translator class
|
# load default translator class
|
||||||
self.translator_class = app._translators.get(self.name)
|
self.translator_class = app._translators.get(self.name)
|
||||||
|
|
||||||
@@ -113,41 +122,6 @@ class Builder(object):
         """
         raise NotImplementedError
 
-    def old_status_iterator(self, iterable, summary, colorfunc=darkgreen,
-                            stringify_func=lambda x: x):
-        l = 0
-        for item in iterable:
-            if l == 0:
-                self.info(bold(summary), nonl=1)
-                l = 1
-            self.info(colorfunc(stringify_func(item)) + ' ', nonl=1)
-            yield item
-        if l == 1:
-            self.info()
-
-    # new version with progress info
-    def status_iterator(self, iterable, summary, colorfunc=darkgreen, length=0,
-                        stringify_func=lambda x: x):
-        if length == 0:
-            for item in self.old_status_iterator(iterable, summary, colorfunc,
-                                                 stringify_func):
-                yield item
-            return
-        l = 0
-        summary = bold(summary)
-        for item in iterable:
-            l += 1
-            s = '%s[%3d%%] %s' % (summary, 100*l/length,
-                                  colorfunc(stringify_func(item)))
-            if self.app.verbosity:
-                s += '\n'
-            else:
-                s = term_width_line(s)
-            self.info(s, nonl=1)
-            yield item
-        if l > 0:
-            self.info()
-
     supported_image_types = []
 
     def post_process_images(self, doctree):
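The `status_iterator` removed above moved onto the application object rather than being deleted outright. For readers tracking the change, a minimal standalone sketch of the progress-reporting generator (plain `print` stands in for Sphinx's colored, terminal-width-aware output; the names here are illustrative, not the exact API):

```python
def status_iterator(iterable, summary, length=0):
    # Sketch of the removed generator: when the total length is known,
    # each item is prefixed with a "[ NN%]" progress figure.
    done = 0
    for item in iterable:
        done += 1
        if length:
            print('%s[%3d%%] %s' % (summary, 100 * done // length, item))
        else:
            print('%s%s ' % (summary, item))
        yield item

# the builder consumes it like any other iterator
docs = list(status_iterator(['intro', 'api'], 'reading sources... ', length=2))
```

Because it is a generator, the progress line is emitted lazily as each document is actually processed, not up front.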
@@ -179,9 +153,8 @@ class Builder(object):
     def compile_catalogs(self, catalogs, message):
         if not self.config.gettext_auto_build:
             return
-        self.info(bold('building [mo]: '), nonl=1)
-        self.info(message)
-        for catalog in self.status_iterator(
+        self.info(bold('building [mo]: ') + message)
+        for catalog in self.app.status_iterator(
                 catalogs, 'writing output... ', darkgreen, len(catalogs),
                 lambda c: c.mo_path):
             catalog.write_mo(self.config.language)
@@ -263,25 +236,17 @@ class Builder(object):
         First updates the environment, and then calls :meth:`write`.
         """
         if summary:
-            self.info(bold('building [%s]: ' % self.name), nonl=1)
-            self.info(summary)
+            self.info(bold('building [%s]' % self.name) + ': ' + summary)
 
         updated_docnames = set()
         # while reading, collect all warnings from docutils
         warnings = []
         self.env.set_warnfunc(lambda *args: warnings.append(args))
-        self.info(bold('updating environment: '), nonl=1)
-        msg, length, iterator = self.env.update(self.config, self.srcdir,
-                                                self.doctreedir, self.app)
-        self.info(msg)
-        for docname in self.status_iterator(iterator, 'reading sources... ',
-                                            purple, length):
-            updated_docnames.add(docname)
-            # nothing further to do, the environment has already
-            # done the reading
+        updated_docnames = self.env.update(self.config, self.srcdir,
+                                           self.doctreedir, self.app)
+        self.env.set_warnfunc(self.warn)
         for warning in warnings:
             self.warn(*warning)
-        self.env.set_warnfunc(self.warn)
 
         doccount = len(updated_docnames)
         self.info(bold('looking for now-outdated files... '), nonl=1)
@@ -315,20 +280,33 @@ class Builder(object):
         if docnames and docnames != ['__all__']:
             docnames = set(docnames) & self.env.found_docs
 
-        # another indirection to support builders that don't build
-        # files individually
+        # determine if we can write in parallel
+        self.parallel_ok = False
+        if parallel_available and self.app.parallel > 1 and self.allow_parallel:
+            self.parallel_ok = True
+            for extname, md in self.app._extension_metadata.items():
+                par_ok = md.get('parallel_write_safe', True)
+                if not par_ok:
+                    self.app.warn('the %s extension is not safe for parallel '
+                                  'writing, doing serial read' % extname)
+                    self.parallel_ok = False
+                    break
+
+        # create a task executor to use for misc. "finish-up" tasks
+        # if self.parallel_ok:
+        #     self.finish_tasks = ParallelTasks(self.app.parallel)
+        # else:
+        # for now, just execute them serially
+        self.finish_tasks = SerialTasks()
+
+        # write all "normal" documents (or everything for some builders)
         self.write(docnames, list(updated_docnames), method)
 
         # finish (write static files etc.)
         self.finish()
-        status = (self.app.statuscode == 0
-                  and 'succeeded' or 'finished with problems')
-        if self.app._warncount:
-            self.info(bold('build %s, %s warning%s.' %
-                           (status, self.app._warncount,
-                            self.app._warncount != 1 and 's' or '')))
-        else:
-            self.info(bold('build %s.' % status))
+
+        # wait for all tasks
+        self.finish_tasks.join()
 
     def write(self, build_docnames, updated_docnames, method='update'):
         if build_docnames is None or build_docnames == ['__all__']:
@@ -354,23 +332,17 @@ class Builder(object):
 
         warnings = []
         self.env.set_warnfunc(lambda *args: warnings.append(args))
-        # check for prerequisites to parallel build
-        # (parallel only works on POSIX, because the forking impl of
-        # multiprocessing is required)
-        if not (multiprocessing and
-                self.app.parallel > 1 and
-                self.allow_parallel and
-                os.name == 'posix'):
-            self._write_serial(sorted(docnames), warnings)
-        else:
+        if self.parallel_ok:
             # number of subprocesses is parallel-1 because the main process
             # is busy loading doctrees and doing write_doc_serialized()
             self._write_parallel(sorted(docnames), warnings,
                                  nproc=self.app.parallel - 1)
+        else:
+            self._write_serial(sorted(docnames), warnings)
         self.env.set_warnfunc(self.warn)
 
     def _write_serial(self, docnames, warnings):
-        for docname in self.status_iterator(
+        for docname in self.app.status_iterator(
                 docnames, 'writing output... ', darkgreen, len(docnames)):
             doctree = self.env.get_and_resolve_doctree(docname, self)
             self.write_doc_serialized(docname, doctree)
@@ -380,60 +352,34 @@ class Builder(object):
 
     def _write_parallel(self, docnames, warnings, nproc):
         def write_process(docs):
-            try:
-                for docname, doctree in docs:
-                    self.write_doc(docname, doctree)
-            except KeyboardInterrupt:
-                pass  # do not print a traceback on Ctrl-C
-            finally:
-                for warning in warnings:
-                    self.warn(*warning)
+            for docname, doctree in docs:
+                self.write_doc(docname, doctree)
+            return warnings
 
-        def process_thread(docs):
-            p = multiprocessing.Process(target=write_process, args=(docs,))
-            p.start()
-            p.join()
-            semaphore.release()
-
-        # allow only "nproc" worker processes at once
-        semaphore = threading.Semaphore(nproc)
-        # list of threads to join when waiting for completion
-        threads = []
+        def add_warnings(docs, wlist):
+            warnings.extend(wlist)
 
         # warm up caches/compile templates using the first document
         firstname, docnames = docnames[0], docnames[1:]
         doctree = self.env.get_and_resolve_doctree(firstname, self)
         self.write_doc_serialized(firstname, doctree)
         self.write_doc(firstname, doctree)
-        # for the rest, determine how many documents to write in one go
-        ndocs = len(docnames)
-        chunksize = min(ndocs // nproc, 10)
-        if chunksize == 0:
-            chunksize = 1
-        nchunks, rest = divmod(ndocs, chunksize)
-        if rest:
-            nchunks += 1
-        # partition documents in "chunks" that will be written by one Process
-        chunks = [docnames[i*chunksize:(i+1)*chunksize] for i in range(nchunks)]
-        for docnames in self.status_iterator(
-                chunks, 'writing output... ', darkgreen, len(chunks),
-                lambda chk: '%s .. %s' % (chk[0], chk[-1])):
-            docs = []
-            for docname in docnames:
+
+        tasks = ParallelTasks(nproc)
+        chunks = make_chunks(docnames, nproc)
+
+        for chunk in self.app.status_iterator(
+                chunks, 'writing output... ', darkgreen, len(chunks)):
+            arg = []
+            for i, docname in enumerate(chunk):
                 doctree = self.env.get_and_resolve_doctree(docname, self)
                 self.write_doc_serialized(docname, doctree)
-                docs.append((docname, doctree))
-            # start a new thread to oversee the completion of this chunk
-            semaphore.acquire()
-            t = threading.Thread(target=process_thread, args=(docs,))
-            t.setDaemon(True)
-            t.start()
-            threads.append(t)
+                arg.append((docname, doctree))
+            tasks.add_task(write_process, arg, add_warnings)
 
         # make sure all threads have finished
-        self.info(bold('waiting for workers... '))
-        for t in threads:
-            t.join()
+        self.info(bold('waiting for workers...'))
+        tasks.join()
 
     def prepare_writing(self, docnames):
         """A place where you can add logic before :meth:`write_doc` is run"""
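The rewrite above hands chunk partitioning to `make_chunks` and scheduling to `ParallelTasks`. As a rough sketch of the partitioning only (the real helper is not shown in this diff; this implementation is an assumption that mirrors the removed `chunksize = min(ndocs // nproc, 10)` logic):

```python
def make_chunks(items, nproc, maxbatch=10):
    # Assumed sketch: split items into batches of roughly
    # len(items) / nproc each, capped at maxbatch so progress
    # updates stay frequent even for large builds.
    chunksize = min(len(items) // nproc, maxbatch) or 1
    return [items[i:i + chunksize] for i in range(0, len(items), chunksize)]

chunks = make_chunks(list(range(25)), 4)
```

Capping the batch size trades a little scheduling overhead for more frequent status updates and better load balancing when document sizes vary.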
@@ -130,6 +130,9 @@ class ChangesBuilder(Builder):
                           self.env.config.source_encoding)
                 try:
                     lines = f.readlines()
+                except UnicodeDecodeError:
+                    self.warn('could not read %r for changelog creation' % docname)
+                    continue
                 finally:
                     f.close()
             targetfn = path.join(self.outdir, 'rst', os_path(docname)) + '.html'
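A subtlety in the hunk above: the new `continue` sits inside a `try` block with a `finally` clause, and `finally` still runs before the loop advances, so `f.close()` executes on the error path too. A small demonstration of that control flow (names here are illustrative):

```python
events = []
for name in ['good', 'bad']:
    try:
        if name == 'bad':
            # mimics the new except-branch: record a warning and skip
            events.append('warn:%s' % name)
            continue
        events.append('read:%s' % name)
    finally:
        # mimics f.close(): runs even when `continue` is taken
        events.append('close:%s' % name)
```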
@@ -405,8 +405,8 @@ class EpubBuilder(StandaloneHTMLBuilder):
         converting the format and resizing the image if necessary/possible.
         """
         ensuredir(path.join(self.outdir, '_images'))
-        for src in self.status_iterator(self.images, 'copying images... ',
+        for src in self.app.status_iterator(self.images, 'copying images... ',
                                             brown, len(self.images)):
             dest = self.images[src]
             try:
                 img = Image.open(path.join(self.srcdir, src))
@@ -170,8 +170,8 @@ class MessageCatalogBuilder(I18nBuilder):
 
         extract_translations = self.templates.environment.extract_translations
 
-        for template in self.status_iterator(files,
-                'reading templates... ', purple, len(files)):
+        for template in self.app.status_iterator(
+                files, 'reading templates... ', purple, len(files)):
             with open(template, 'r', encoding='utf-8') as f:
                 context = f.read()
                 for line, meth, msg in extract_translations(context):
|
|||||||
ctime = datetime.fromtimestamp(
|
ctime = datetime.fromtimestamp(
|
||||||
timestamp, ltz).strftime('%Y-%m-%d %H:%M%z'),
|
timestamp, ltz).strftime('%Y-%m-%d %H:%M%z'),
|
||||||
)
|
)
|
||||||
for textdomain, catalog in self.status_iterator(
|
for textdomain, catalog in self.app.status_iterator(
|
||||||
iteritems(self.catalogs), "writing message catalogs... ",
|
iteritems(self.catalogs), "writing message catalogs... ",
|
||||||
darkgreen, len(self.catalogs),
|
darkgreen, len(self.catalogs),
|
||||||
lambda textdomain__: textdomain__[0]):
|
lambda textdomain__: textdomain__[0]):
|
||||||
|
@@ -29,7 +29,7 @@ from docutils.readers.doctree import Reader as DoctreeReader
 from sphinx import package_dir, __version__
 from sphinx.util import jsonimpl, copy_static_entry
 from sphinx.util.osutil import SEP, os_path, relative_uri, ensuredir, \
     movefile, ustrftime, copyfile
 from sphinx.util.nodes import inline_all_toctrees
 from sphinx.util.matching import patmatch, compile_matchers
 from sphinx.locale import _
@@ -40,7 +40,7 @@ from sphinx.application import ENV_PICKLE_FILENAME
 from sphinx.highlighting import PygmentsBridge
 from sphinx.util.console import bold, darkgreen, brown
 from sphinx.writers.html import HTMLWriter, HTMLTranslator, \
     SmartyPantsHTMLTranslator
 
 #: the filename for the inventory of objects
 INVENTORY_FILENAME = 'objects.inv'
@@ -443,12 +443,19 @@ class StandaloneHTMLBuilder(Builder):
             self.index_page(docname, doctree, title)
 
     def finish(self):
-        self.info(bold('writing additional files...'), nonl=1)
+        self.finish_tasks.add_task(self.gen_indices)
+        self.finish_tasks.add_task(self.gen_additional_pages)
+        self.finish_tasks.add_task(self.copy_image_files)
+        self.finish_tasks.add_task(self.copy_download_files)
+        self.finish_tasks.add_task(self.copy_static_files)
+        self.finish_tasks.add_task(self.copy_extra_files)
+        self.finish_tasks.add_task(self.write_buildinfo)
 
-        # pages from extensions
-        for pagelist in self.app.emit('html-collect-pages'):
-            for pagename, context, template in pagelist:
-                self.handle_page(pagename, context, template)
+        # dump the search index
+        self.handle_finish()
+
+    def gen_indices(self):
+        self.info(bold('generating indices...'), nonl=1)
 
         # the global general index
         if self.get_builder_config('use_index', 'html'):
@@ -457,16 +464,27 @@ class StandaloneHTMLBuilder(Builder):
         # the global domain-specific indices
         self.write_domain_indices()
 
-        # the search page
-        if self.name != 'htmlhelp':
-            self.info(' search', nonl=1)
-            self.handle_page('search', {}, 'search.html')
+        self.info()
+
+    def gen_additional_pages(self):
+        # pages from extensions
+        for pagelist in self.app.emit('html-collect-pages'):
+            for pagename, context, template in pagelist:
+                self.handle_page(pagename, context, template)
+
+        self.info(bold('writing additional pages...'), nonl=1)
 
         # additional pages from conf.py
         for pagename, template in self.config.html_additional_pages.items():
             self.info(' '+pagename, nonl=1)
             self.handle_page(pagename, {}, template)
 
+        # the search page
+        if self.name != 'htmlhelp':
+            self.info(' search', nonl=1)
+            self.handle_page('search', {}, 'search.html')
+
+        # the opensearch xml file
         if self.config.html_use_opensearch and self.name != 'htmlhelp':
             self.info(' opensearch', nonl=1)
             fn = path.join(self.outdir, '_static', 'opensearch.xml')
@@ -474,15 +492,6 @@ class StandaloneHTMLBuilder(Builder):
 
         self.info()
 
-        self.copy_image_files()
-        self.copy_download_files()
-        self.copy_static_files()
-        self.copy_extra_files()
-        self.write_buildinfo()
-
-        # dump the search index
-        self.handle_finish()
-
     def write_genindex(self):
         # the total count of lines for each index letter, used to distribute
         # the entries into two columns
@@ -526,8 +535,8 @@ class StandaloneHTMLBuilder(Builder):
         # copy image files
         if self.images:
             ensuredir(path.join(self.outdir, '_images'))
-            for src in self.status_iterator(self.images, 'copying images... ',
+            for src in self.app.status_iterator(self.images, 'copying images... ',
                                                 brown, len(self.images)):
                 dest = self.images[src]
                 try:
                     copyfile(path.join(self.srcdir, src),
@@ -540,9 +549,9 @@ class StandaloneHTMLBuilder(Builder):
         # copy downloadable files
         if self.env.dlfiles:
             ensuredir(path.join(self.outdir, '_downloads'))
-            for src in self.status_iterator(self.env.dlfiles,
+            for src in self.app.status_iterator(self.env.dlfiles,
                                                 'copying downloadable files... ',
                                                 brown, len(self.env.dlfiles)):
                 dest = self.env.dlfiles[src][1]
                 try:
                     copyfile(path.join(self.srcdir, src),
@@ -786,8 +795,8 @@ class StandaloneHTMLBuilder(Builder):
             copyfile(self.env.doc2path(pagename), source_name)
 
     def handle_finish(self):
-        self.dump_search_index()
-        self.dump_inventory()
+        self.finish_tasks.add_task(self.dump_search_index)
+        self.finish_tasks.add_task(self.dump_inventory)
 
     def dump_inventory(self):
         self.info(bold('dumping object inventory... '), nonl=True)
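With the hunk above, `handle_finish` only queues work on the `finish_tasks` executor created in `Builder.build`. A minimal stand-in with the same `add_task`/`join` surface (the argument and callback conventions here are assumptions for illustration, not the exact executor API):

```python
class SerialTasks(object):
    """Runs each task immediately; a parallel variant would queue them
    for worker processes and make join() actually wait."""

    def add_task(self, task_func, arg=None, result_func=None):
        # execute right away and hand the result to the optional callback
        result = task_func(arg) if arg is not None else task_func()
        if result_func:
            result_func(result)

    def join(self):
        pass  # nothing to wait for in the serial case

results = []
tasks = SerialTasks()
tasks.add_task(lambda: 'index written', result_func=results.append)
tasks.join()
```

Keeping the serial and parallel executors interface-compatible is what lets the builder code stay identical in both modes.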
@@ -12,7 +12,7 @@ from __future__ import print_function
 
 import os
 import sys
-import getopt
+import optparse
 import traceback
 from os import path
 
@@ -32,89 +32,121 @@ def usage(argv, msg=None):
     if msg:
         print(msg, file=sys.stderr)
         print(file=sys.stderr)
-    print("""\
-Sphinx v%s
-Usage: %s [options] sourcedir outdir [filenames...]
-
-General options
-^^^^^^^^^^^^^^^
--b <builder>       builder to use; default is html
--a                 write all files; default is to only write new and changed files
--E                 don't use a saved environment, always read all files
--d <path>          path for the cached environment and doctree files
-                   (default: outdir/.doctrees)
--j <N>             build in parallel with N processes where possible
--M <builder>       "make" mode -- used by Makefile, like "sphinx-build -M html"
-
-Build configuration options
-^^^^^^^^^^^^^^^^^^^^^^^^^^^
--c <path>          path where configuration file (conf.py) is located
-                   (default: same as sourcedir)
--C                 use no config file at all, only -D options
--D <setting=value> override a setting in configuration file
--t <tag>           define tag: include "only" blocks with <tag>
--A <name=value>    pass a value into the templates, for HTML builder
--n                 nit-picky mode, warn about all missing references
-
-Console output options
-^^^^^^^^^^^^^^^^^^^^^^
--v                 increase verbosity (can be repeated)
--q                 no output on stdout, just warnings on stderr
--Q                 no output at all, not even warnings
--w <file>          write warnings (and errors) to given file
--W                 turn warnings into errors
--T                 show full traceback on exception
--N                 do not emit colored output
--P                 run Pdb on exception
-
-Filename arguments
-^^^^^^^^^^^^^^^^^^
-* without -a and without filenames, write new and changed files.
-* with -a, write all files.
-* with filenames, write these.
-
-Standard options
-^^^^^^^^^^^^^^^^
--h, --help         show this help and exit
---version          show version information and exit
-""" % (__version__, argv[0]), file=sys.stderr)
+
+USAGE = """\
+Sphinx v%s
+Usage: %%prog [options] sourcedir outdir [filenames...]
+
+Filename arguments:
+  without -a and without filenames, write new and changed files.
+  with -a, write all files.
+  with filenames, write these.
+""" % __version__
+
+EPILOG = """\
+For more information, visit <http://sphinx-doc.org/>.
+"""
+
+
+class MyFormatter(optparse.IndentedHelpFormatter):
+    def format_usage(self, usage):
+        return usage
+
+    def format_help(self, formatter):
+        result = []
+        if self.description:
+            result.append(self.format_description(formatter))
+        if self.option_list:
+            result.append(self.format_option_help(formatter))
+        return "\n".join(result)
 
 
 def main(argv):
     if not color_terminal():
         nocolor()
 
+    parser = optparse.OptionParser(USAGE, epilog=EPILOG, formatter=MyFormatter())
+    parser.add_option('--version', action='store_true', dest='version',
+                      help='show version information and exit')
+
+    group = parser.add_option_group('General options')
+    group.add_option('-b', metavar='BUILDER', dest='builder', default='html',
+                     help='builder to use; default is html')
+    group.add_option('-a', action='store_true', dest='force_all',
+                     help='write all files; default is to only write new and '
+                     'changed files')
+    group.add_option('-E', action='store_true', dest='freshenv',
+                     help='don\'t use a saved environment, always read '
+                     'all files')
+    group.add_option('-d', metavar='PATH', default=None, dest='doctreedir',
+                     help='path for the cached environment and doctree files '
+                     '(default: outdir/.doctrees)')
+    group.add_option('-j', metavar='N', default=1, type='int', dest='jobs',
+                     help='build in parallel with N processes where possible')
+    # this option never gets through to this point (it is intercepted earlier)
+    # group.add_option('-M', metavar='BUILDER', dest='make_mode',
+    #                  help='"make" mode -- as used by Makefile, like '
+    #                  '"sphinx-build -M html"')
+
+    group = parser.add_option_group('Build configuration options')
+    group.add_option('-c', metavar='PATH', dest='confdir',
+                     help='path where configuration file (conf.py) is located '
+                     '(default: same as sourcedir)')
+    group.add_option('-C', action='store_true', dest='noconfig',
+                     help='use no config file at all, only -D options')
+    group.add_option('-D', metavar='setting=value', action='append',
+                     dest='define', default=[],
+                     help='override a setting in configuration file')
+    group.add_option('-A', metavar='name=value', action='append',
+                     dest='htmldefine', default=[],
+                     help='pass a value into HTML templates')
+    group.add_option('-t', metavar='TAG', action='append',
+                     dest='tags', default=[],
+                     help='define tag: include "only" blocks with TAG')
+    group.add_option('-n', action='store_true', dest='nitpicky',
+                     help='nit-picky mode, warn about all missing references')
+
+    group = parser.add_option_group('Console output options')
+    group.add_option('-v', action='count', dest='verbosity', default=0,
+                     help='increase verbosity (can be repeated)')
+    group.add_option('-q', action='store_true', dest='quiet',
+                     help='no output on stdout, just warnings on stderr')
+    group.add_option('-Q', action='store_true', dest='really_quiet',
+                     help='no output at all, not even warnings')
+    group.add_option('-N', action='store_true', dest='nocolor',
+                     help='do not emit colored output')
+    group.add_option('-w', metavar='FILE', dest='warnfile',
+                     help='write warnings (and errors) to given file')
+    group.add_option('-W', action='store_true', dest='warningiserror',
+                     help='turn warnings into errors')
+    group.add_option('-T', action='store_true', dest='traceback',
+                     help='show full traceback on exception')
+    group.add_option('-P', action='store_true', dest='pdb',
+                     help='run Pdb on exception')
+
     # parse options
     try:
-        opts, args = getopt.getopt(argv[1:], 'ab:t:d:c:CD:A:nNEqQWw:PThvj:',
-                                   ['help', 'version'])
-    except getopt.error as err:
-        usage(argv, 'Error: %s' % err)
-        return 1
+        opts, args = parser.parse_args()
+    except SystemExit as err:
+        return err.code
 
     # handle basic options
-    allopts = set(opt[0] for opt in opts)
-    # help and version options
-    if '-h' in allopts or '--help' in allopts:
-        usage(argv)
-        print(file=sys.stderr)
-        print('For more information, see <http://sphinx-doc.org/>.',
-              file=sys.stderr)
-        return 0
-    if '--version' in allopts:
-        print('Sphinx (sphinx-build) %s' % __version__)
+    if opts.version:
+        print('Sphinx (sphinx-build) %s' % __version__)
         return 0
 
     # get paths (first and second positional argument)
     try:
-        srcdir = confdir = abspath(args[0])
+        srcdir = abspath(args[0])
+        confdir = abspath(opts.confdir or srcdir)
+        if opts.noconfig:
+            confdir = None
         if not path.isdir(srcdir):
             print('Error: Cannot find source directory `%s\'.' % srcdir,
                   file=sys.stderr)
             return 1
-        if not path.isfile(path.join(srcdir, 'conf.py')) and \
-           '-c' not in allopts and '-C' not in allopts:
-            print('Error: Source directory doesn\'t contain a conf.py file.',
+        if not opts.noconfig and not path.isfile(path.join(confdir, 'conf.py')):
+            print('Error: Config directory doesn\'t contain a conf.py file.',
                   file=sys.stderr)
             return 1
         outdir = abspath(args[1])
@@ -144,116 +176,77 @@ def main(argv):
     except Exception:
         likely_encoding = None
 
-    buildername = None
-    force_all = freshenv = warningiserror = use_pdb = False
-    show_traceback = False
-    verbosity = 0
-    parallel = 0
+    if opts.force_all and filenames:
+        print('Error: Cannot combine -a option and filenames.', file=sys.stderr)
+        return 1
+
+    if opts.nocolor:
+        nocolor()
+
+    doctreedir = abspath(opts.doctreedir or path.join(outdir, '.doctrees'))
+
     status = sys.stdout
     warning = sys.stderr
     error = sys.stderr
-    warnfile = None
+
+    if opts.quiet:
+        status = None
+    if opts.really_quiet:
+        status = warning = None
+    if warning and opts.warnfile:
+        try:
+            warnfp = open(opts.warnfile, 'w')
+        except Exception as exc:
+            print('Error: Cannot open warning file %r: %s' %
+                  (opts.warnfile, exc), file=sys.stderr)
+            sys.exit(1)
+        warning = Tee(warning, warnfp)
+        error = warning
 
     confoverrides = {}
-    tags = []
-    doctreedir = path.join(outdir, '.doctrees')
-    for opt, val in opts:
-        if opt == '-b':
-            buildername = val
-        elif opt == '-a':
-            if filenames:
-                usage(argv, 'Error: Cannot combine -a option and filenames.')
-                return 1
-            force_all = True
-        elif opt == '-t':
-            tags.append(val)
-        elif opt == '-d':
-            doctreedir = abspath(val)
-        elif opt == '-c':
-            confdir = abspath(val)
-            if not path.isfile(path.join(confdir, 'conf.py')):
-                print('Error: Configuration directory doesn\'t contain conf.py file.',
-                      file=sys.stderr)
-                return 1
-        elif opt == '-C':
-            confdir = None
-        elif opt == '-D':
-            try:
-                key, val = val.split('=')
-            except ValueError:
-                print('Error: -D option argument must be in the form name=value.',
-                      file=sys.stderr)
-                return 1
+    for val in opts.define:
+        try:
+            key, val = val.split('=')
+        except ValueError:
+            print('Error: -D option argument must be in the form name=value.',
+                  file=sys.stderr)
+            return 1
+        if likely_encoding and isinstance(val, binary_type):
+            try:
+                val = val.decode(likely_encoding)
+            except UnicodeError:
+                pass
+        confoverrides[key] = val
+
+    for val in opts.htmldefine:
+        try:
+            key, val = val.split('=')
+        except ValueError:
+            print('Error: -A option argument must be in the form name=value.',
+                  file=sys.stderr)
+            return 1
+        try:
+            val = int(val)
+        except ValueError:
             if likely_encoding and isinstance(val, binary_type):
                 try:
                     val = val.decode(likely_encoding)
                 except UnicodeError:
                     pass
-            confoverrides[key] = val
-        elif opt == '-A':
-            try:
-                key, val = val.split('=')
-            except ValueError:
-                print('Error: -A option argument must be in the form name=value.',
-                      file=sys.stderr)
-                return 1
-            try:
-                val = int(val)
-            except ValueError:
-                if likely_encoding and isinstance(val, binary_type):
-                    try:
-                        val = val.decode(likely_encoding)
-                    except UnicodeError:
-                        pass
-            confoverrides['html_context.%s' % key] = val
-        elif opt == '-n':
-            confoverrides['nitpicky'] = True
-        elif opt == '-N':
-            nocolor()
-        elif opt == '-E':
-            freshenv = True
-        elif opt == '-q':
-            status = None
-        elif opt == '-Q':
-            status = None
-            warning = None
-        elif opt == '-W':
-            warningiserror = True
-        elif opt == '-w':
-            warnfile = val
-        elif opt == '-P':
-            use_pdb = True
-        elif opt == '-T':
-            show_traceback = True
-        elif opt == '-v':
-            verbosity += 1
-            show_traceback = True
-        elif opt == '-j':
-            try:
-                parallel = int(val)
-            except ValueError:
-                print('Error: -j option argument must be an integer.',
-                      file=sys.stderr)
-                return 1
+        confoverrides['html_context.%s' % key] = val
 
-    if warning and warnfile:
-        warnfp = open(warnfile, 'w')
-        warning = Tee(warning, warnfp)
-        error = warning
+    if opts.nitpicky:
+        confoverrides['nitpicky'] = True
 
     if not path.isdir(outdir):
         if status:
             print('Making output directory...', file=status)
|
|
||||||
os.makedirs(outdir)
|
|
||||||
|
|
||||||
app = None
|
app = None
|
||||||
try:
|
try:
|
||||||
app = Sphinx(srcdir, confdir, outdir, doctreedir, buildername,
|
app = Sphinx(srcdir, confdir, outdir, doctreedir, opts.builder,
|
||||||
confoverrides, status, warning, freshenv,
|
confoverrides, status, warning, opts.freshenv,
|
||||||
warningiserror, tags, verbosity, parallel)
|
opts.warningiserror, opts.tags, opts.verbosity, opts.jobs)
|
||||||
app.build(force_all, filenames)
|
app.build(opts.force_all, filenames)
|
||||||
return app.statuscode
|
return app.statuscode
|
||||||
except (Exception, KeyboardInterrupt) as err:
|
except (Exception, KeyboardInterrupt) as err:
|
||||||
if use_pdb:
|
if opts.pdb:
|
||||||
import pdb
|
import pdb
|
||||||
print(red('Exception occurred while building, starting debugger:'),
|
print(red('Exception occurred while building, starting debugger:'),
|
||||||
file=error)
|
file=error)
|
||||||
@ -261,7 +254,7 @@ def main(argv):
|
|||||||
pdb.post_mortem(sys.exc_info()[2])
|
pdb.post_mortem(sys.exc_info()[2])
|
||||||
else:
|
else:
|
||||||
print(file=error)
|
print(file=error)
|
||||||
if show_traceback:
|
if opts.verbosity or opts.traceback:
|
||||||
traceback.print_exc(None, error)
|
traceback.print_exc(None, error)
|
||||||
print(file=error)
|
print(file=error)
|
||||||
if isinstance(err, KeyboardInterrupt):
|
if isinstance(err, KeyboardInterrupt):
|
||||||
@@ -11,7 +11,8 @@

 import re

-from docutils.parsers.rst import Directive, directives
+from docutils import nodes
+from docutils.parsers.rst import Directive, directives, roles

 from sphinx import addnodes
 from sphinx.util.docfields import DocFieldTransformer
@@ -162,6 +163,34 @@ class ObjectDescription(Directive):
 DescDirective = ObjectDescription


+class DefaultRole(Directive):
+    """
+    Set the default interpreted text role. Overridden from docutils.
+    """
+
+    optional_arguments = 1
+    final_argument_whitespace = False
+
+    def run(self):
+        if not self.arguments:
+            if '' in roles._roles:
+                # restore the "default" default role
+                del roles._roles['']
+            return []
+        role_name = self.arguments[0]
+        role, messages = roles.role(role_name, self.state_machine.language,
+                                    self.lineno, self.state.reporter)
+        if role is None:
+            error = self.state.reporter.error(
+                'Unknown interpreted text role "%s".' % role_name,
+                nodes.literal_block(self.block_text, self.block_text),
+                line=self.lineno)
+            return messages + [error]
+        roles._roles[''] = role
+        self.state.document.settings.env.temp_data['default_role'] = role_name
+        return messages
+
+
 class DefaultDomain(Directive):
     """
     Directive to (re-)set the default domain for this source file.
@@ -186,6 +215,7 @@ class DefaultDomain(Directive):
         return []


+directives.register_directive('default-role', DefaultRole)
 directives.register_directive('default-domain', DefaultDomain)
 directives.register_directive('describe', ObjectDescription)
 # new, more consistent, name
@@ -108,7 +108,7 @@ class CodeBlock(Directive):
             return [document.reporter.warning(str(err), line=self.lineno)]
         else:
             hl_lines = None

         if 'dedent' in self.options:
             lines = code.split('\n')
             lines = dedent_lines(lines, self.options['dedent'])
@@ -155,10 +155,13 @@ class Domain(object):
         self._role_cache = {}
         self._directive_cache = {}
         self._role2type = {}
+        self._type2role = {}
         for name, obj in iteritems(self.object_types):
             for rolename in obj.roles:
                 self._role2type.setdefault(rolename, []).append(name)
+            self._type2role[name] = obj.roles[0] if obj.roles else ''
         self.objtypes_for_role = self._role2type.get
+        self.role_for_objtype = self._type2role.get

     def role(self, name):
         """Return a role adapter function that always gives the registered
@@ -199,6 +202,14 @@ class Domain(object):
         """Remove traces of a document in the domain-specific inventories."""
         pass

+    def merge_domaindata(self, docnames, otherdata):
+        """Merge in data regarding *docnames* from a different domaindata
+        inventory (coming from a subprocess in parallel builds).
+        """
+        raise NotImplementedError('merge_domaindata must be implemented in %s '
+                                  'to be able to do parallel builds!' %
+                                  self.__class__)
+
     def process_doc(self, env, docname, document):
         """Process a document after it is read by the environment."""
         pass
@@ -220,6 +231,22 @@ class Domain(object):
         """
         pass

+    def resolve_any_xref(self, env, fromdocname, builder, target, node, contnode):
+        """Resolve the pending_xref *node* with the given *target*.
+
+        The reference comes from an "any" or similar role, which means that we
+        don't know the type.  Otherwise, the arguments are the same as for
+        :meth:`resolve_xref`.
+
+        The method must return a list (potentially empty) of tuples
+        ``('domain:role', newnode)``, where ``'domain:role'`` is the name of a
+        role that could have created the same reference, e.g. ``'py:func'``.
+        ``newnode`` is what :meth:`resolve_xref` would return.
+
+        .. versionadded:: 1.3
+        """
+        raise NotImplementedError
+
     def get_objects(self):
         """Return an iterable of "object descriptions", which are tuples with
         five items:
@@ -130,7 +130,7 @@ class CObject(ObjectDescription):
         if m:
             name = m.group(1)

-            typename = self.env.temp_data.get('c:type')
+            typename = self.env.ref_context.get('c:type')
             if self.name == 'c:member' and typename:
                 fullname = typename + '.' + name
             else:
@@ -212,12 +212,12 @@ class CObject(ObjectDescription):
         self.typename_set = False
         if self.name == 'c:type':
             if self.names:
-                self.env.temp_data['c:type'] = self.names[0]
+                self.env.ref_context['c:type'] = self.names[0]
                 self.typename_set = True

     def after_content(self):
         if self.typename_set:
-            self.env.temp_data['c:type'] = None
+            self.env.ref_context.pop('c:type', None)


 class CXRefRole(XRefRole):
@@ -269,6 +269,12 @@ class CDomain(Domain):
             if fn == docname:
                 del self.data['objects'][fullname]

+    def merge_domaindata(self, docnames, otherdata):
+        # XXX check duplicates
+        for fullname, (fn, objtype) in otherdata['objects'].items():
+            if fn in docnames:
+                self.data['objects'][fullname] = (fn, objtype)
+
     def resolve_xref(self, env, fromdocname, builder,
                      typ, target, node, contnode):
         # strip pointer asterisk
@@ -279,6 +285,17 @@ class CDomain(Domain):
         return make_refnode(builder, fromdocname, obj[0], 'c.' + target,
                             contnode, target)

+    def resolve_any_xref(self, env, fromdocname, builder, target,
+                         node, contnode):
+        # strip pointer asterisk
+        target = target.rstrip(' *')
+        if target not in self.data['objects']:
+            return []
+        obj = self.data['objects'][target]
+        return [('c:' + self.role_for_objtype(obj[1]),
+                 make_refnode(builder, fromdocname, obj[0], 'c.' + target,
+                              contnode, target))]
+
     def get_objects(self):
         for refname, (docname, type) in list(self.data['objects'].items()):
             yield (refname, refname, type, docname, 'c.' + refname, 1)
@@ -7,11 +7,11 @@

     :copyright: Copyright 2007-2014 by the Sphinx team, see AUTHORS.
     :license: BSD, see LICENSE for details.

     See http://www.nongnu.org/hcb/ for the grammar.
     See http://mentorembedded.github.io/cxx-abi/abi.html#mangling for the
     inspiration for the id generation.

     common grammar things:
         simple-declaration
             -> attribute-specifier-seq[opt] decl-specifier-seq[opt]
@@ -20,7 +20,7 @@
         # Use at most 1 init-declerator.
         -> decl-specifier-seq init-declerator
         -> decl-specifier-seq declerator initializer

     decl-specifier ->
         storage-class-specifier -> "static" (only for member_object and
         function_object)
@@ -76,8 +76,8 @@
             constant-expression
         | type-specifier-seq abstract-declerator
         | id-expression


     declerator ->
         ptr-declerator
         | noptr-declarator parameters-and-qualifiers trailing-return-type
@@ -108,11 +108,11 @@
         memberFunctionInit -> "=" "0"
         # (note: only "0" is allowed as the value, according to the standard,
         # right?)


     We additionally add the possibility for specifying the visibility as the
     first thing.

     type_object:
         goal:
             either a single type (e.g., "MyClass:Something_T" or a typedef-like
@@ -126,14 +126,14 @@
             -> decl-specifier-seq abstract-declarator[opt]
         grammar, typedef-like: no initilizer
             decl-specifier-seq declerator

     member_object:
         goal: as a type_object which must have a declerator, and optionally
         with a initializer
         grammar:
             decl-specifier-seq declerator initializer

     function_object:
         goal: a function declaration, TODO: what about templates? for now: skip
         grammar: no initializer
@@ -141,7 +141,6 @@
 """

 import re
-import traceback
 from copy import deepcopy

 from six import iteritems, text_type
@@ -222,9 +221,9 @@ _id_operator = {
     'delete[]': 'da',
     # the arguments will make the difference between unary and binary
     # '+(unary)' : 'ps',
-    #'-(unary)' : 'ng',
-    #'&(unary)' : 'ad',
-    #'*(unary)' : 'de',
+    # '-(unary)' : 'ng',
+    # '&(unary)' : 'ad',
+    # '*(unary)' : 'de',
     '~': 'co',
     '+': 'pl',
     '-': 'mi',
@@ -319,7 +318,7 @@ class ASTBase(UnicodeMixin):


 def _verify_description_mode(mode):
-    if not mode in ('lastIsName', 'noneIsName', 'markType', 'param'):
+    if mode not in ('lastIsName', 'noneIsName', 'markType', 'param'):
         raise Exception("Description mode '%s' is invalid." % mode)


@@ -328,7 +327,7 @@ class ASTOperatorBuildIn(ASTBase):
         self.op = op

     def get_id(self):
-        if not self.op in _id_operator:
+        if self.op not in _id_operator:
             raise Exception('Internal error: Build-in operator "%s" can not '
                             'be mapped to an id.' % self.op)
         return _id_operator[self.op]
@@ -434,7 +433,7 @@ class ASTNestedNameElement(ASTBase):
                 '', refdomain='cpp', reftype='type',
                 reftarget=targetText, modname=None, classname=None)
             if env:  # during testing we don't have an env, do we?
-                pnode['cpp:parent'] = env.temp_data.get('cpp:parent')
+                pnode['cpp:parent'] = env.ref_context.get('cpp:parent')
             pnode += nodes.Text(text_type(self.identifier))
             signode += pnode
         elif mode == 'lastIsName':
@@ -532,7 +531,7 @@ class ASTTrailingTypeSpecFundamental(ASTBase):
         return self.name

     def get_id(self):
-        if not self.name in _id_fundamental:
+        if self.name not in _id_fundamental:
             raise Exception(
                 'Semi-internal error: Fundamental type "%s" can not be mapped '
                 'to an id. Is it a true fundamental type? If not so, the '
@@ -866,7 +865,7 @@ class ASTDeclerator(ASTBase):
                 isinstance(self.ptrOps[-1], ASTPtrOpParamPack)):
             return False
         else:
-            return self.declId != None
+            return self.declId is not None

     def __unicode__(self):
         res = []
@@ -949,7 +948,7 @@ class ASTType(ASTBase):
         _verify_description_mode(mode)
         self.declSpecs.describe_signature(signode, 'markType', env)
         if (self.decl.require_start_space() and
                 len(text_type(self.declSpecs)) > 0):
             signode += nodes.Text(' ')
         self.decl.describe_signature(signode, mode, env)

@@ -1178,7 +1177,7 @@ class DefinitionParser(object):
         else:
             while not self.eof:
                 if (len(symbols) == 0 and
                         self.current_char in (
                             ',', '>')):
                     break
                 # TODO: actually implement nice handling
@@ -1190,8 +1189,7 @@ class DefinitionParser(object):
                     self.fail(
                         'Could not find end of constant '
                         'template argument.')
-                value = self.definition[
-                    startPos:self.pos].strip()
+                value = self.definition[startPos:self.pos].strip()
                 templateArgs.append(ASTTemplateArgConstant(value))
             self.skip_ws()
             if self.skip_string('>'):
@@ -1422,7 +1420,7 @@ class DefinitionParser(object):

     def _parse_declerator(self, named, paramMode=None, typed=True):
         if paramMode:
-            if not paramMode in ('type', 'function'):
+            if paramMode not in ('type', 'function'):
                 raise Exception(
                     "Internal error, unknown paramMode '%s'." % paramMode)
         ptrOps = []
@@ -1493,7 +1491,7 @@ class DefinitionParser(object):
         if outer == 'member':
             value = self.read_rest().strip()
             return ASTInitializer(value)
-        elif outer == None:  # function parameter
+        elif outer is None:  # function parameter
             symbols = []
             startPos = self.pos
             self.skip_ws()
@@ -1528,7 +1526,7 @@ class DefinitionParser(object):
         doesn't need to name the arguments
         """
         if outer:  # always named
-            if not outer in ('type', 'member', 'function'):
+            if outer not in ('type', 'member', 'function'):
                 raise Exception('Internal error, unknown outer "%s".' % outer)
             assert not named

@@ -1652,12 +1650,12 @@ class CPPObject(ObjectDescription):
         if theid not in self.state.document.ids:
             # the name is not unique, the first one will win
             objects = self.env.domaindata['cpp']['objects']
-            if not name in objects:
+            if name not in objects:
                 signode['names'].append(name)
             signode['ids'].append(theid)
             signode['first'] = (not self.names)
             self.state.document.note_explicit_target(signode)
-            if not name in objects:
+            if name not in objects:
                 objects.setdefault(name,
                                    (self.env.docname, ast.objectType, theid))
                 # add the uninstantiated template if it doesn't exist
@@ -1665,8 +1663,8 @@ class CPPObject(ObjectDescription):
                 if uninstantiated != name and uninstantiated not in objects:
                     signode['names'].append(uninstantiated)
                     objects.setdefault(uninstantiated, (
                         self.env.docname, ast.objectType, theid))
-            self.env.temp_data['cpp:lastname'] = ast.prefixedName
+            self.env.ref_context['cpp:lastname'] = ast.prefixedName

         indextext = self.get_index_text(name)
         if not re.compile(r'^[a-zA-Z0-9_]*$').match(theid):
@@ -1693,7 +1691,7 @@ class CPPObject(ObjectDescription):
             raise ValueError
         self.describe_signature(signode, ast)

-        parent = self.env.temp_data.get('cpp:parent')
+        parent = self.env.ref_context.get('cpp:parent')
         if parent and len(parent) > 0:
             ast = ast.clone()
             ast.prefixedName = ast.name.prefix_nested_name(parent[-1])
@@ -1741,15 +1739,15 @@ class CPPClassObject(CPPObject):
         return _('%s (C++ class)') % name

     def before_content(self):
-        lastname = self.env.temp_data['cpp:lastname']
+        lastname = self.env.ref_context['cpp:lastname']
         assert lastname
-        if 'cpp:parent' in self.env.temp_data:
-            self.env.temp_data['cpp:parent'].append(lastname)
+        if 'cpp:parent' in self.env.ref_context:
+            self.env.ref_context['cpp:parent'].append(lastname)
         else:
-            self.env.temp_data['cpp:parent'] = [lastname]
+            self.env.ref_context['cpp:parent'] = [lastname]

     def after_content(self):
-        self.env.temp_data['cpp:parent'].pop()
+        self.env.ref_context['cpp:parent'].pop()

     def parse_definition(self, parser):
         return parser.parse_class_object()
@@ -1774,7 +1772,7 @@ class CPPNamespaceObject(Directive):
     def run(self):
         env = self.state.document.settings.env
         if self.arguments[0].strip() in ('NULL', '0', 'nullptr'):
-            env.temp_data['cpp:parent'] = []
+            env.ref_context['cpp:parent'] = []
         else:
             parser = DefinitionParser(self.arguments[0])
             try:
@@ -1784,13 +1782,13 @@ class CPPNamespaceObject(Directive):
                 self.state_machine.reporter.warning(e.description,
                                                     line=self.lineno)
             else:
-                env.temp_data['cpp:parent'] = [prefix]
+                env.ref_context['cpp:parent'] = [prefix]
         return []


 class CPPXRefRole(XRefRole):
     def process_link(self, env, refnode, has_explicit_title, title, target):
-        parent = env.temp_data.get('cpp:parent')
+        parent = env.ref_context.get('cpp:parent')
         if parent:
             refnode['cpp:parent'] = parent[:]
         if not has_explicit_title:
@@ -1838,18 +1836,24 @@ class CPPDomain(Domain):
             if data[0] == docname:
                 del self.data['objects'][fullname]

-    def resolve_xref(self, env, fromdocname, builder,
-                     typ, target, node, contnode):
+    def merge_domaindata(self, docnames, otherdata):
+        # XXX check duplicates
+        for fullname, data in otherdata['objects'].items():
+            if data[0] in docnames:
+                self.data['objects'][fullname] = data
+
+    def _resolve_xref_inner(self, env, fromdocname, builder,
+                            target, node, contnode, warn=True):
         def _create_refnode(nameAst):
             name = text_type(nameAst)
             if name not in self.data['objects']:
                 # try dropping the last template
                 name = nameAst.get_name_no_last_template()
                 if name not in self.data['objects']:
-                    return None
+                    return None, None
             docname, objectType, id = self.data['objects'][name]
             return make_refnode(builder, fromdocname, docname, id, contnode,
-                                name)
+                                name), objectType

         parser = DefinitionParser(target)
         try:
@@ -1858,21 +1862,35 @@ class CPPDomain(Domain):
             if not parser.eof:
                 raise DefinitionError('')
         except DefinitionError:
-            env.warn_node('unparseable C++ definition: %r' % target, node)
-            return None
+            if warn:
+                env.warn_node('unparseable C++ definition: %r' % target, node)
+            return None, None

         # try as is the name is fully qualified
-        refNode = _create_refnode(nameAst)
-        if refNode:
-            return refNode
+        res = _create_refnode(nameAst)
+        if res[0]:
+            return res

         # try qualifying it with the parent
         parent = node.get('cpp:parent', None)
         if parent and len(parent) > 0:
             return _create_refnode(nameAst.prefix_nested_name(parent[-1]))
         else:
-            return None
+            return None, None
+
+    def resolve_xref(self, env, fromdocname, builder,
+                     typ, target, node, contnode):
+        return self._resolve_xref_inner(env, fromdocname, builder, target, node,
+                                        contnode)[0]
+
+    def resolve_any_xref(self, env, fromdocname, builder, target,
+                         node, contnode):
+        node, objtype = self._resolve_xref_inner(env, fromdocname, builder,
+                                                 target, node, contnode,
+                                                 warn=False)
+        if node:
+            return [('cpp:' + self.role_for_objtype(objtype), node)]
+        return []

     def get_objects(self):
         for refname, (docname, type, theid) in iteritems(self.data['objects']):
             yield (refname, refname, type, docname, refname, 1)
@@ -45,7 +45,7 @@ class JSObject(ObjectDescription):
             nameprefix = None
             name = prefix
 
-        objectname = self.env.temp_data.get('js:object')
+        objectname = self.env.ref_context.get('js:object')
         if nameprefix:
             if objectname:
                 # someone documenting the method of an attribute of the current
@@ -77,7 +77,7 @@ class JSObject(ObjectDescription):
 
     def add_target_and_index(self, name_obj, sig, signode):
         objectname = self.options.get(
-            'object', self.env.temp_data.get('js:object'))
+            'object', self.env.ref_context.get('js:object'))
         fullname = name_obj[0]
         if fullname not in self.state.document.ids:
             signode['names'].append(fullname)
@@ -140,7 +140,7 @@ class JSConstructor(JSCallable):
 class JSXRefRole(XRefRole):
     def process_link(self, env, refnode, has_explicit_title, title, target):
         # basically what sphinx.domains.python.PyXRefRole does
-        refnode['js:object'] = env.temp_data.get('js:object')
+        refnode['js:object'] = env.ref_context.get('js:object')
         if not has_explicit_title:
             title = title.lstrip('.')
             target = target.lstrip('~')
@@ -179,7 +179,7 @@ class JavaScriptDomain(Domain):
         'attr': JSXRefRole(),
     }
     initial_data = {
         'objects': {},  # fullname -> docname, objtype
     }
 
     def clear_doc(self, docname):
@@ -187,6 +187,12 @@ class JavaScriptDomain(Domain):
             if fn == docname:
                 del self.data['objects'][fullname]
 
+    def merge_domaindata(self, docnames, otherdata):
+        # XXX check duplicates
+        for fullname, (fn, objtype) in otherdata['objects'].items():
+            if fn in docnames:
+                self.data['objects'][fullname] = (fn, objtype)
+
     def find_obj(self, env, obj, name, typ, searchorder=0):
         if name[-2:] == '()':
             name = name[:-2]
@@ -214,6 +220,16 @@ class JavaScriptDomain(Domain):
         return make_refnode(builder, fromdocname, obj[0],
                             name.replace('$', '_S_'), contnode, name)
 
+    def resolve_any_xref(self, env, fromdocname, builder, target, node,
+                         contnode):
+        objectname = node.get('js:object')
+        name, obj = self.find_obj(env, objectname, target, None, 1)
+        if not obj:
+            return []
+        return [('js:' + self.role_for_objtype(obj[1]),
+                 make_refnode(builder, fromdocname, obj[0],
+                              name.replace('$', '_S_'), contnode, name))]
+
     def get_objects(self):
         for refname, (docname, type) in list(self.data['objects'].items()):
             yield refname, refname, type, docname, \
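The `resolve_any_xref` contract introduced for the JavaScript domain above can be sketched in isolation (hypothetical object table, not Sphinx code): the domain returns a list of `('domain:role', refnode)` pairs for a target, with call parentheses stripped the way `find_obj` does, and an empty list when it has no candidate.

```python
# Toy lookup mirroring JavaScriptDomain.resolve_any_xref: strip "()" like
# find_obj does, then report (role, node) pairs or nothing at all.

def resolve_any_xref(objects, target):
    if target.endswith('()'):        # find_obj strips call parentheses
        target = target[:-2]
    objtype = objects.get(target)
    if objtype is None:
        return []                    # no candidates from this domain
    return [('js:' + objtype, '<refnode for %s>' % target)]

objects = {'$.ajax': 'function', 'Widget': 'class'}
one_hit = resolve_any_xref(objects, '$.ajax()')
no_hit = resolve_any_xref(objects, 'missing')
```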
@@ -156,8 +156,8 @@ class PyObject(ObjectDescription):
 
         # determine module and class name (if applicable), as well as full name
         modname = self.options.get(
-            'module', self.env.temp_data.get('py:module'))
-        classname = self.env.temp_data.get('py:class')
+            'module', self.env.ref_context.get('py:module'))
+        classname = self.env.ref_context.get('py:class')
         if classname:
             add_module = False
             if name_prefix and name_prefix.startswith(classname):
@@ -194,7 +194,7 @@ class PyObject(ObjectDescription):
             # 'exceptions' module.
             elif add_module and self.env.config.add_module_names:
                 modname = self.options.get(
-                    'module', self.env.temp_data.get('py:module'))
+                    'module', self.env.ref_context.get('py:module'))
                 if modname and modname != 'exceptions':
                     nodetext = modname + '.'
                     signode += addnodes.desc_addname(nodetext, nodetext)
@@ -225,7 +225,7 @@ class PyObject(ObjectDescription):
 
     def add_target_and_index(self, name_cls, sig, signode):
         modname = self.options.get(
-            'module', self.env.temp_data.get('py:module'))
+            'module', self.env.ref_context.get('py:module'))
         fullname = (modname and modname + '.' or '') + name_cls[0]
         # note target
         if fullname not in self.state.document.ids:
@@ -254,7 +254,7 @@ class PyObject(ObjectDescription):
 
     def after_content(self):
         if self.clsname_set:
-            self.env.temp_data['py:class'] = None
+            self.env.ref_context.pop('py:class', None)
 
 
 class PyModulelevel(PyObject):
@@ -299,7 +299,7 @@ class PyClasslike(PyObject):
     def before_content(self):
         PyObject.before_content(self)
         if self.names:
-            self.env.temp_data['py:class'] = self.names[0][0]
+            self.env.ref_context['py:class'] = self.names[0][0]
             self.clsname_set = True
 
 
@@ -377,8 +377,8 @@ class PyClassmember(PyObject):
     def before_content(self):
         PyObject.before_content(self)
         lastname = self.names and self.names[-1][1]
-        if lastname and not self.env.temp_data.get('py:class'):
-            self.env.temp_data['py:class'] = lastname.strip('.')
+        if lastname and not self.env.ref_context.get('py:class'):
+            self.env.ref_context['py:class'] = lastname.strip('.')
             self.clsname_set = True
 
 
@@ -434,7 +434,7 @@ class PyModule(Directive):
         env = self.state.document.settings.env
         modname = self.arguments[0].strip()
         noindex = 'noindex' in self.options
-        env.temp_data['py:module'] = modname
+        env.ref_context['py:module'] = modname
         ret = []
         if not noindex:
             env.domaindata['py']['modules'][modname] = \
@@ -472,16 +472,16 @@ class PyCurrentModule(Directive):
         env = self.state.document.settings.env
         modname = self.arguments[0].strip()
         if modname == 'None':
-            env.temp_data['py:module'] = None
+            env.ref_context.pop('py:module', None)
         else:
-            env.temp_data['py:module'] = modname
+            env.ref_context['py:module'] = modname
         return []
 
 
 class PyXRefRole(XRefRole):
     def process_link(self, env, refnode, has_explicit_title, title, target):
-        refnode['py:module'] = env.temp_data.get('py:module')
-        refnode['py:class'] = env.temp_data.get('py:class')
+        refnode['py:module'] = env.ref_context.get('py:module')
+        refnode['py:class'] = env.ref_context.get('py:class')
         if not has_explicit_title:
             title = title.lstrip('.')    # only has a meaning for the target
             target = target.lstrip('~')  # only has a meaning for the title
@@ -627,6 +627,15 @@ class PythonDomain(Domain):
             if fn == docname:
                 del self.data['modules'][modname]
 
+    def merge_domaindata(self, docnames, otherdata):
+        # XXX check duplicates?
+        for fullname, (fn, objtype) in otherdata['objects'].items():
+            if fn in docnames:
+                self.data['objects'][fullname] = (fn, objtype)
+        for modname, data in otherdata['modules'].items():
+            if data[0] in docnames:
+                self.data['modules'][modname] = data
+
     def find_obj(self, env, modname, classname, name, type, searchmode=0):
         """Find a Python object for "name", perhaps using the given module
         and/or classname.  Returns a list of (name, object entry) tuples.
@@ -643,7 +652,10 @@ class PythonDomain(Domain):
 
         newname = None
         if searchmode == 1:
-            objtypes = self.objtypes_for_role(type)
+            if type is None:
+                objtypes = list(self.object_types)
+            else:
+                objtypes = self.objtypes_for_role(type)
             if objtypes is not None:
                 if modname and classname:
                     fullname = modname + '.' + classname + '.' + name
@@ -704,22 +716,44 @@ class PythonDomain(Domain):
             name, obj = matches[0]
 
         if obj[1] == 'module':
-            # get additional info for modules
-            docname, synopsis, platform, deprecated = self.data['modules'][name]
-            assert docname == obj[0]
-            title = name
-            if synopsis:
-                title += ': ' + synopsis
-            if deprecated:
-                title += _(' (deprecated)')
-            if platform:
-                title += ' (' + platform + ')'
-            return make_refnode(builder, fromdocname, docname,
-                                'module-' + name, contnode, title)
+            return self._make_module_refnode(builder, fromdocname, name,
+                                             contnode)
         else:
             return make_refnode(builder, fromdocname, obj[0], name,
                                 contnode, name)
 
+    def resolve_any_xref(self, env, fromdocname, builder, target,
+                         node, contnode):
+        modname = node.get('py:module')
+        clsname = node.get('py:class')
+        results = []
+
+        # always search in "refspecific" mode with the :any: role
+        matches = self.find_obj(env, modname, clsname, target, None, 1)
+        for name, obj in matches:
+            if obj[1] == 'module':
+                results.append(('py:mod',
+                                self._make_module_refnode(builder, fromdocname,
+                                                          name, contnode)))
+            else:
+                results.append(('py:' + self.role_for_objtype(obj[1]),
+                                make_refnode(builder, fromdocname, obj[0], name,
+                                             contnode, name)))
+        return results
+
+    def _make_module_refnode(self, builder, fromdocname, name, contnode):
+        # get additional info for modules
+        docname, synopsis, platform, deprecated = self.data['modules'][name]
+        title = name
+        if synopsis:
+            title += ': ' + synopsis
+        if deprecated:
+            title += _(' (deprecated)')
+        if platform:
+            title += ' (' + platform + ')'
+        return make_refnode(builder, fromdocname, docname,
+                            'module-' + name, contnode, title)
+
     def get_objects(self):
         for modname, info in iteritems(self.data['modules']):
             yield (modname, modname, 'module', info[0], 'module-' + modname, 0)
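The `merge_domaindata` hooks being added throughout these hunks all follow the same pattern, which can be sketched standalone (toy data, hypothetical docnames): after a parallel read, each worker's domain data comes back as `otherdata`, and only the entries whose docname was actually read by that worker are merged, so stale entries from other documents cannot leak in.

```python
# Sketch of the merge pattern used by PythonDomain.merge_domaindata above:
# 'docnames' is the set of documents the worker process read.

def merge_domaindata(data, docnames, otherdata):
    for fullname, (fn, objtype) in otherdata['objects'].items():
        if fn in docnames:                      # only docs this worker read
            data['objects'][fullname] = (fn, objtype)

data = {'objects': {'pkg.main': ('index', 'function')}}
other = {'objects': {'pkg.util.helper': ('util', 'function'),
                     'stale.entry': ('olddoc', 'class')}}
merge_domaindata(data, {'util'}, other)  # 'olddoc' was not read -> skipped
```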
@@ -123,6 +123,12 @@ class ReSTDomain(Domain):
             if doc == docname:
                 del self.data['objects'][typ, name]
 
+    def merge_domaindata(self, docnames, otherdata):
+        # XXX check duplicates
+        for (typ, name), doc in otherdata['objects'].items():
+            if doc in docnames:
+                self.data['objects'][typ, name] = doc
+
     def resolve_xref(self, env, fromdocname, builder, typ, target, node,
                      contnode):
         objects = self.data['objects']
@@ -134,6 +140,19 @@ class ReSTDomain(Domain):
                             objtype + '-' + target,
                             contnode, target + ' ' + objtype)
 
+    def resolve_any_xref(self, env, fromdocname, builder, target,
+                         node, contnode):
+        objects = self.data['objects']
+        results = []
+        for objtype in self.object_types:
+            if (objtype, target) in self.data['objects']:
+                results.append(('rst:' + self.role_for_objtype(objtype),
+                                make_refnode(builder, fromdocname,
+                                             objects[objtype, target],
+                                             objtype + '-' + target,
+                                             contnode, target + ' ' + objtype)))
+        return results
+
     def get_objects(self):
         for (typ, name), docname in iteritems(self.data['objects']):
             yield name, name, typ, docname, typ + '-' + name, 1
@@ -28,7 +28,9 @@ from sphinx.util.compat import Directive
 
 
 # RE for option descriptions
-option_desc_re = re.compile(r'((?:/|-|--)?[-_a-zA-Z0-9]+)(\s*.*)')
+option_desc_re = re.compile(r'((?:/|--|-|\+)?[-?@#_a-zA-Z0-9]+)(=?\s*.*)')
+# RE for grammar tokens
+token_re = re.compile('`(\w+)`', re.U)
 
 
 class GenericObject(ObjectDescription):
@@ -144,8 +146,9 @@ class Cmdoption(ObjectDescription):
                 self.env.warn(
                     self.env.docname,
                     'Malformed option description %r, should '
-                    'look like "opt", "-opt args", "--opt args" or '
-                    '"/opt args"' % potential_option, self.lineno)
+                    'look like "opt", "-opt args", "--opt args", '
+                    '"/opt args" or "+opt args"' % potential_option,
+                    self.lineno)
                 continue
             optname, args = m.groups()
             if count:
@@ -163,7 +166,7 @@ class Cmdoption(ObjectDescription):
         return firstname
 
     def add_target_and_index(self, firstname, sig, signode):
-        currprogram = self.env.temp_data.get('std:program')
+        currprogram = self.env.ref_context.get('std:program')
         for optname in signode.get('allnames', []):
             targetname = optname.replace('/', '-')
             if not targetname.startswith('-'):
@@ -198,36 +201,19 @@ class Program(Directive):
         env = self.state.document.settings.env
         program = ws_re.sub('-', self.arguments[0].strip())
         if program == 'None':
-            env.temp_data['std:program'] = None
+            env.ref_context.pop('std:program', None)
         else:
-            env.temp_data['std:program'] = program
+            env.ref_context['std:program'] = program
         return []
 
 
 class OptionXRefRole(XRefRole):
-    innernodeclass = addnodes.literal_emphasis
-
-    def _split(self, text, refnode, env):
-        try:
-            program, target = re.split(' (?=-|--|/)', text, 1)
-        except ValueError:
-            env.warn_node('Malformed :option: %r, does not contain option '
-                          'marker - or -- or /' % text, refnode)
-            return None, text
-        else:
-            program = ws_re.sub('-', program)
-            return program, target
-
     def process_link(self, env, refnode, has_explicit_title, title, target):
-        program = env.temp_data.get('std:program')
-        if not has_explicit_title:
-            if ' ' in title and not (title.startswith('/') or
-                                     title.startswith('-')):
-                program, target = self._split(title, refnode, env)
-                target = target.strip()
-        elif ' ' in target:
-            program, target = self._split(target, refnode, env)
-        refnode['refprogram'] = program
+        # validate content
+        if not re.match('(.+ )?[-/+]', target):
+            env.warn_node('Malformed :option: %r, does not contain option '
+                          'marker - or -- or / or +' % target, refnode)
+        refnode['std:program'] = env.ref_context.get('std:program')
         return title, target
 
 
@@ -327,7 +313,7 @@ class Glossary(Directive):
                 else:
                     messages.append(self.state.reporter.system_message(
                         2, 'glossary seems to be misformatted, check '
                         'indentation', source=source, line=lineno))
             else:
                 if not in_definition:
                     # first line of definition, determines indentation
@@ -338,7 +324,7 @@ class Glossary(Directive):
                 else:
                     messages.append(self.state.reporter.system_message(
                         2, 'glossary seems to be misformatted, check '
                         'indentation', source=source, line=lineno))
             was_empty = False
 
         # now, parse all the entries into a big definition list
@@ -359,7 +345,7 @@ class Glossary(Directive):
                 tmp.source = source
                 tmp.line = lineno
                 new_id, termtext, new_termnodes = \
                     make_termnodes_from_paragraph_node(env, tmp)
                 ids.append(new_id)
                 termtexts.append(termtext)
                 termnodes.extend(new_termnodes)
@@ -386,8 +372,6 @@ class Glossary(Directive):
         return messages + [node]
 
 
-token_re = re.compile('`(\w+)`', re.U)
-
 def token_xrefs(text):
     retnodes = []
     pos = 0
@@ -472,7 +456,7 @@ class StandardDomain(Domain):
        'productionlist': ProductionList,
    }
    roles = {
-        'option': OptionXRefRole(innernodeclass=addnodes.literal_emphasis),
+        'option': OptionXRefRole(),
        'envvar': EnvVarXRefRole(),
        # links to tokens in grammar productions
        'token': XRefRole(),
@@ -522,6 +506,21 @@ class StandardDomain(Domain):
             if fn == docname:
                 del self.data['anonlabels'][key]
 
+    def merge_domaindata(self, docnames, otherdata):
+        # XXX duplicates?
+        for key, data in otherdata['progoptions'].items():
+            if data[0] in docnames:
+                self.data['progoptions'][key] = data
+        for key, data in otherdata['objects'].items():
+            if data[0] in docnames:
+                self.data['objects'][key] = data
+        for key, data in otherdata['labels'].items():
+            if data[0] in docnames:
+                self.data['labels'][key] = data
+        for key, data in otherdata['anonlabels'].items():
+            if data[0] in docnames:
+                self.data['anonlabels'][key] = data
+
     def process_doc(self, env, docname, document):
         labels, anonlabels = self.data['labels'], self.data['anonlabels']
         for name, explicit in iteritems(document.nametypes):
@@ -532,7 +531,7 @@ class StandardDomain(Domain):
                 continue
             node = document.ids[labelid]
             if name.isdigit() or 'refuri' in node or \
               node.tagname.startswith('desc_'):
                 # ignore footnote labels, labels automatically generated from a
                 # link and object descriptions
                 continue
@@ -541,7 +540,7 @@ class StandardDomain(Domain):
                               'in ' + env.doc2path(labels[name][0]), node)
             anonlabels[name] = docname, labelid
             if node.tagname == 'section':
                 sectname = clean_astext(node[0])  # node[0] == title node
             elif node.tagname == 'figure':
                 for n in node:
                     if n.tagname == 'caption':
@@ -579,13 +578,13 @@ class StandardDomain(Domain):
             if node['refexplicit']:
                 # reference to anonymous label; the reference uses
                 # the supplied link caption
-                docname, labelid = self.data['anonlabels'].get(target, ('',''))
+                docname, labelid = self.data['anonlabels'].get(target, ('', ''))
                 sectname = node.astext()
             else:
                 # reference to named label; the final node will
                 # contain the section name after the label
                 docname, labelid, sectname = self.data['labels'].get(target,
-                                                                     ('','',''))
+                                                                     ('', '', ''))
             if not docname:
                 return None
             newnode = nodes.reference('', '', internal=True)
@@ -607,13 +606,22 @@ class StandardDomain(Domain):
             return newnode
         elif typ == 'keyword':
             # keywords are oddballs: they are referenced by named labels
-            docname, labelid, _ = self.data['labels'].get(target, ('','',''))
+            docname, labelid, _ = self.data['labels'].get(target, ('', '', ''))
             if not docname:
                 return None
             return make_refnode(builder, fromdocname, docname,
                                 labelid, contnode)
         elif typ == 'option':
-            progname = node['refprogram']
+            target = target.strip()
+            # most obvious thing: we are a flag option without program
+            if target.startswith(('-', '/', '+')):
+                progname = node.get('std:program')
+            else:
+                try:
+                    progname, target = re.split(r' (?=-|--|/|\+)', target, 1)
+                except ValueError:
+                    return None
+                progname = ws_re.sub('-', progname.strip())
             docname, labelid = self.data['progoptions'].get((progname, target),
                                                             ('', ''))
             if not docname:
@@ -633,6 +641,28 @@ class StandardDomain(Domain):
             return make_refnode(builder, fromdocname, docname,
                                 labelid, contnode)
 
+    def resolve_any_xref(self, env, fromdocname, builder, target,
+                         node, contnode):
+        results = []
+        ltarget = target.lower()  # :ref: lowercases its target automatically
+        for role in ('ref', 'option'):  # do not try "keyword"
+            res = self.resolve_xref(env, fromdocname, builder, role,
+                                    ltarget if role == 'ref' else target,
+                                    node, contnode)
+            if res:
+                results.append(('std:' + role, res))
+        # all others
+        for objtype in self.object_types:
+            key = (objtype, target)
+            if objtype == 'term':
+                key = (objtype, ltarget)
+            if key in self.data['objects']:
+                docname, labelid = self.data['objects'][key]
+                results.append(('std:' + self.role_for_objtype(objtype),
+                                make_refnode(builder, fromdocname, docname,
+                                             labelid, contnode)))
+        return results
+
     def get_objects(self):
         for (prog, option), info in iteritems(self.data['progoptions']):
             yield (option, option, 'option', info[0], info[1], 1)
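The new `:option:` resolution above leans on two small regexes that are easy to check in isolation: `(.+ )?[-/+]` accepts a target only if an option marker follows the optional program prefix, and the split uses a lookahead so the marker stays attached to the option rather than being consumed by the separator.

```python
import re

# "program option" split as done in StandardDomain.resolve_xref above:
# the lookahead keeps '-', '--', '/' or '+' on the option side.
progname, target = re.split(r' (?=-|--|/|\+)', 'rm -r', 1)

# validation as done in OptionXRefRole.process_link above
valid = bool(re.match('(.+ )?[-/+]', 'python -m'))
invalid = bool(re.match('(.+ )?[-/+]', 'verbose'))
```

With `maxsplit=1`, option arguments survive intact: `'ls -l --color'` splits into `'ls'` and `'-l --color'`.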
|
@ -33,26 +33,31 @@ from docutils.parsers.rst import roles, directives
|
|||||||
from docutils.parsers.rst.languages import en as english
|
from docutils.parsers.rst.languages import en as english
|
||||||
from docutils.parsers.rst.directives.html import MetaBody
|
from docutils.parsers.rst.directives.html import MetaBody
|
||||||
from docutils.writers import UnfilteredWriter
|
from docutils.writers import UnfilteredWriter
|
||||||
|
from docutils.frontend import OptionParser
|
||||||
|
|
||||||
from sphinx import addnodes
|
from sphinx import addnodes
|
||||||
from sphinx.util import url_re, get_matching_docs, docname_join, split_into, \
|
from sphinx.util import url_re, get_matching_docs, docname_join, split_into, \
|
||||||
FilenameUniqDict
|
FilenameUniqDict
|
||||||
from sphinx.util.nodes import clean_astext, make_refnode, WarningStream
|
from sphinx.util.nodes import clean_astext, make_refnode, WarningStream
|
||||||
from sphinx.util.osutil import SEP, fs_encoding, find_catalog_files
|
from sphinx.util.osutil import SEP, find_catalog_files, getcwd, fs_encoding
|
||||||
|
from sphinx.util.console import bold, purple
|
||||||
from sphinx.util.matching import compile_matchers
|
from sphinx.util.matching import compile_matchers
|
||||||
|
from sphinx.util.parallel import ParallelTasks, parallel_available, make_chunks
|
||||||
from sphinx.util.websupport import is_commentable
|
from sphinx.util.websupport import is_commentable
|
||||||
from sphinx.errors import SphinxError, ExtensionError
|
from sphinx.errors import SphinxError, ExtensionError
|
||||||
from sphinx.locale import _
|
from sphinx.locale import _
|
||||||
from sphinx.versioning import add_uids, merge_doctrees
|
from sphinx.versioning import add_uids, merge_doctrees
|
||||||
from sphinx.transforms import DefaultSubstitutions, MoveModuleTargets, \
|
from sphinx.transforms import DefaultSubstitutions, MoveModuleTargets, \
|
||||||
HandleCodeBlocks, SortIds, CitationReferences, Locale, \
|
HandleCodeBlocks, SortIds, CitationReferences, Locale, \
|
||||||
RemoveTranslatableInline, SphinxContentsFilter
|
RemoveTranslatableInline, SphinxContentsFilter
|
||||||
|
|
||||||
|
|
||||||
orig_role_function = roles.role
|
orig_role_function = roles.role
|
||||||
orig_directive_function = directives.directive
|
orig_directive_function = directives.directive
|
||||||
|
|
||||||
class ElementLookupError(Exception): pass
|
|
||||||
|
class ElementLookupError(Exception):
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
default_settings = {
|
default_settings = {
|
||||||
@ -69,7 +74,9 @@ default_settings = {
|
|||||||
|
|
||||||
# This is increased every time an environment attribute is added
|
# This is increased every time an environment attribute is added
|
||||||
# or changed to properly invalidate pickle files.
|
# or changed to properly invalidate pickle files.
|
||||||
ENV_VERSION = 42 + (sys.version_info[0] - 2)
|
#
|
||||||
|
# NOTE: increase base version by 2 to have distinct numbers for Py2 and 3
|
||||||
|
ENV_VERSION = 44 + (sys.version_info[0] - 2)
|
||||||
|
|
||||||
|
|
||||||
dummy_reporter = Reporter('', 4, 4)
|
dummy_reporter = Reporter('', 4, 4)
|
||||||
@ -105,6 +112,33 @@ class SphinxDummyWriter(UnfilteredWriter):
|
|||||||
pass
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
class SphinxFileInput(FileInput):
|
||||||
|
def __init__(self, app, env, *args, **kwds):
|
||||||
|
self.app = app
|
||||||
|
self.env = env
|
||||||
|
# don't call sys.exit() on IOErrors
|
||||||
|
kwds['handle_io_errors'] = False
|
||||||
|
kwds['error_handler'] = 'sphinx' # py3: handle error on open.
|
||||||
|
FileInput.__init__(self, *args, **kwds)
|
||||||
|
|
||||||
|
def decode(self, data):
|
||||||
|
if isinstance(data, text_type): # py3: `data` already decoded.
|
||||||
|
return data
|
||||||
|
return data.decode(self.encoding, 'sphinx') # py2: decoding
|
||||||
|
|
||||||
|
def read(self):
|
||||||
|
data = FileInput.read(self)
|
||||||
|
if self.app:
|
||||||
|
arg = [data]
|
||||||
|
self.app.emit('source-read', self.env.docname, arg)
|
||||||
|
data = arg[0]
|
||||||
|
if self.env.config.rst_epilog:
|
||||||
|
data = data + '\n' + self.env.config.rst_epilog + '\n'
|
||||||
|
if self.env.config.rst_prolog:
|
||||||
|
data = self.env.config.rst_prolog + '\n' + data
|
||||||
|
return data
|
||||||
|
|
||||||
|
|
||||||
class BuildEnvironment:
|
class BuildEnvironment:
|
||||||
"""
|
"""
|
||||||
The environment in which the ReST files are translated.
|
The environment in which the ReST files are translated.
|
||||||
@ -122,7 +156,7 @@ class BuildEnvironment:
|
|||||||
finally:
|
finally:
|
||||||
picklefile.close()
|
picklefile.close()
|
||||||
if env.version != ENV_VERSION:
|
if env.version != ENV_VERSION:
|
||||||
raise IOError('env version not current')
|
raise IOError('build environment version not current')
|
||||||
env.config.values = config.values
|
env.config.values = config.values
|
||||||
return env
|
return env
|
||||||
|
|
||||||
@@ -138,9 +172,9 @@ class BuildEnvironment:
         # remove potentially pickling-problematic values from config
         for key, val in list(vars(self.config).items()):
             if key.startswith('_') or \
                isinstance(val, types.ModuleType) or \
                isinstance(val, types.FunctionType) or \
                isinstance(val, class_types):
                 del self.config[key]
         try:
             pickle.dump(self, picklefile, pickle.HIGHEST_PROTOCOL)
@@ -181,8 +215,8 @@ class BuildEnvironment:
         # the source suffix.
 
         self.found_docs = set()     # contains all existing docnames
-        self.all_docs = {}          # docname -> mtime at the time of build
-                                    # contains all built docnames
+        self.all_docs = {}          # docname -> mtime at the time of reading
+                                    # contains all read docnames
         self.dependencies = {}      # docname -> set of dependent file
                                     # names, relative to documentation root
         self.reread_always = set()  # docnames to re-read unconditionally on
@@ -223,6 +257,10 @@ class BuildEnvironment:
 
         # temporary data storage while reading a document
         self.temp_data = {}
+        # context for cross-references (e.g. current module or class)
+        # this is similar to temp_data, but will for example be copied to
+        # attributes of "any" cross references
+        self.ref_context = {}
 
     def set_warnfunc(self, func):
         self._warnfunc = func
@@ -292,6 +330,50 @@ class BuildEnvironment:
         for domain in self.domains.values():
             domain.clear_doc(docname)
 
+    def merge_info_from(self, docnames, other, app):
+        """Merge global information gathered about *docnames* while reading them
+        from the *other* environment.
+
+        This possibly comes from a parallel build process.
+        """
+        docnames = set(docnames)
+        for docname in docnames:
+            self.all_docs[docname] = other.all_docs[docname]
+            if docname in other.reread_always:
+                self.reread_always.add(docname)
+            self.metadata[docname] = other.metadata[docname]
+            if docname in other.dependencies:
+                self.dependencies[docname] = other.dependencies[docname]
+            self.titles[docname] = other.titles[docname]
+            self.longtitles[docname] = other.longtitles[docname]
+            self.tocs[docname] = other.tocs[docname]
+            self.toc_num_entries[docname] = other.toc_num_entries[docname]
+            # toc_secnumbers is not assigned during read
+            if docname in other.toctree_includes:
+                self.toctree_includes[docname] = other.toctree_includes[docname]
+            self.indexentries[docname] = other.indexentries[docname]
+            if docname in other.glob_toctrees:
+                self.glob_toctrees.add(docname)
+            if docname in other.numbered_toctrees:
+                self.numbered_toctrees.add(docname)
+
+        self.images.merge_other(docnames, other.images)
+        self.dlfiles.merge_other(docnames, other.dlfiles)
+
+        for subfn, fnset in other.files_to_rebuild.items():
+            self.files_to_rebuild.setdefault(subfn, set()).update(fnset & docnames)
+        for key, data in other.citations.items():
+            # XXX duplicates?
+            if data[0] in docnames:
+                self.citations[key] = data
+        for version, changes in other.versionchanges.items():
+            self.versionchanges.setdefault(version, []).extend(
+                change for change in changes if change[1] in docnames)
+
+        for domainname, domain in self.domains.items():
+            domain.merge_domaindata(docnames, other.domaindata[domainname])
+        app.emit('env-merge-info', self, docnames, other)
+
     def doc2path(self, docname, base=True, suffix=None):
         """Return the filename for the document name.
 
@@ -407,13 +489,11 @@ class BuildEnvironment:
 
         return added, changed, removed
 
-    def update(self, config, srcdir, doctreedir, app=None):
+    def update(self, config, srcdir, doctreedir, app):
         """(Re-)read all files new or changed since last update.
 
-        Returns a summary, the total count of documents to reread and an
-        iterator that yields docnames as it processes them. Store all
-        environment docnames in the canonical format (ie using SEP as a
-        separator in place of os.path.sep).
+        Store all environment docnames in the canonical format (ie using SEP as
+        a separator in place of os.path.sep).
         """
         config_changed = False
         if self.config is None:
@@ -445,6 +525,8 @@ class BuildEnvironment:
         # this cache also needs to be updated every time
         self._nitpick_ignore = set(self.config.nitpick_ignore)
 
+        app.info(bold('updating environment: '), nonl=1)
+
         added, changed, removed = self.get_outdated_files(config_changed)
 
         # allow user intervention as well
@@ -459,30 +541,98 @@ class BuildEnvironment:
 
         msg += '%s added, %s changed, %s removed' % (len(added), len(changed),
                                                      len(removed))
+        app.info(msg)
 
-        def update_generator():
-            self.app = app
-
-            # clear all files no longer present
-            for docname in removed:
-                if app:
-                    app.emit('env-purge-doc', self, docname)
-                self.clear_doc(docname)
-
-            # read all new and changed files
-            for docname in sorted(added | changed):
-                yield docname
-                self.read_doc(docname, app=app)
-
-            if config.master_doc not in self.all_docs:
-                self.warn(None, 'master file %s not found' %
-                          self.doc2path(config.master_doc))
-
-            self.app = None
-            if app:
-                app.emit('env-updated', self)
-
-        return msg, len(added | changed), update_generator()
+        self.app = app
+
+        # clear all files no longer present
+        for docname in removed:
+            app.emit('env-purge-doc', self, docname)
+            self.clear_doc(docname)
+
+        # read all new and changed files
+        docnames = sorted(added | changed)
+        # allow changing and reordering the list of docs to read
+        app.emit('env-before-read-docs', self, docnames)
+
+        # check if we should do parallel or serial read
+        par_ok = False
+        if parallel_available and len(docnames) > 5 and app.parallel > 1:
+            par_ok = True
+            for extname, md in app._extension_metadata.items():
+                ext_ok = md.get('parallel_read_safe')
+                if ext_ok:
+                    continue
+                if ext_ok is None:
+                    app.warn('the %s extension does not declare if it '
+                             'is safe for parallel reading, assuming it '
+                             'isn\'t - please ask the extension author to '
+                             'check and make it explicit' % extname)
+                    app.warn('doing serial read')
+                else:
+                    app.warn('the %s extension is not safe for parallel '
+                             'reading, doing serial read' % extname)
+                par_ok = False
+                break
+        if par_ok:
+            self._read_parallel(docnames, app, nproc=app.parallel)
+        else:
+            self._read_serial(docnames, app)
+
+        if config.master_doc not in self.all_docs:
+            self.warn(None, 'master file %s not found' %
+                      self.doc2path(config.master_doc))
+
+        self.app = None
+        app.emit('env-updated', self)
+        return docnames
+
+    def _read_serial(self, docnames, app):
+        for docname in app.status_iterator(docnames, 'reading sources... ',
+                                           purple, len(docnames)):
+            # remove all inventory entries for that file
+            app.emit('env-purge-doc', self, docname)
+            self.clear_doc(docname)
+            self.read_doc(docname, app)
+
+    def _read_parallel(self, docnames, app, nproc):
+        # clear all outdated docs at once
+        for docname in docnames:
+            app.emit('env-purge-doc', self, docname)
+            self.clear_doc(docname)
+
+        def read_process(docs):
+            self.app = app
+            self.warnings = []
+            self.set_warnfunc(lambda *args: self.warnings.append(args))
+            for docname in docs:
+                self.read_doc(docname, app)
+            # allow pickling self to send it back
+            self.set_warnfunc(None)
+            del self.app
+            del self.domains
+            del self.config.values
+            del self.config
+            return self
+
+        def merge(docs, otherenv):
+            warnings.extend(otherenv.warnings)
+            self.merge_info_from(docs, otherenv, app)
+
+        tasks = ParallelTasks(nproc)
+        chunks = make_chunks(docnames, nproc)
+
+        warnings = []
+        for chunk in app.status_iterator(
+                chunks, 'reading sources... ', purple, len(chunks)):
+            tasks.add_task(read_process, chunk, merge)
+
+        # make sure all threads have finished
+        app.info(bold('waiting for workers...'))
+        tasks.join()
+
+        for warning in warnings:
+            self._warnfunc(*warning)
 
     def check_dependents(self, already):
         to_rewrite = self.assign_section_numbers()
@@ -496,7 +646,8 @@ class BuildEnvironment:
         """Custom decoding error handler that warns and replaces."""
         linestart = error.object.rfind(b'\n', 0, error.start)
         lineend = error.object.find(b'\n', error.start)
-        if lineend == -1: lineend = len(error.object)
+        if lineend == -1:
+            lineend = len(error.object)
         lineno = error.object.count(b'\n', 0, error.start) + 1
         self.warn(self.docname, 'undecodable source characters, '
                   'replacing with "?": %r' %
@@ -550,19 +701,8 @@ class BuildEnvironment:
         directives.directive = directive
         roles.role = role
 
-    def read_doc(self, docname, src_path=None, save_parsed=True, app=None):
-        """Parse a file and add/update inventory entries for the doctree.
-
-        If srcpath is given, read from a different source file.
-        """
-        # remove all inventory entries for that file
-        if app:
-            app.emit('env-purge-doc', self, docname)
-        self.clear_doc(docname)
-
-        if src_path is None:
-            src_path = self.doc2path(docname)
-
+    def read_doc(self, docname, app=None):
+        """Parse a file and add/update inventory entries for the doctree."""
         self.temp_data['docname'] = docname
         # defaults to the global default, but can be re-set in a document
@@ -576,6 +716,12 @@ class BuildEnvironment:
 
         self.patch_lookup_functions()
 
+        docutilsconf = path.join(self.srcdir, 'docutils.conf')
+        # read docutils.conf from source dir, not from current dir
+        OptionParser.standard_config_files[1] = docutilsconf
+        if path.isfile(docutilsconf):
+            self.note_dependency(docutilsconf)
+
         if self.config.default_role:
             role_fn, messages = roles.role(self.config.default_role, english,
                                            0, dummy_reporter)
@@ -587,38 +733,17 @@ class BuildEnvironment:
 
         codecs.register_error('sphinx', self.warn_and_replace)
 
-        class SphinxSourceClass(FileInput):
-            def __init__(self_, *args, **kwds):
-                # don't call sys.exit() on IOErrors
-                kwds['handle_io_errors'] = False
-                kwds['error_handler'] = 'sphinx'  # py3: handle error on open.
-                FileInput.__init__(self_, *args, **kwds)
-
-            def decode(self_, data):
-                if isinstance(data, text_type):  # py3: `data` already decoded.
-                    return data
-                return data.decode(self_.encoding, 'sphinx')  # py2: decoding
-
-            def read(self_):
-                data = FileInput.read(self_)
-                if app:
-                    arg = [data]
-                    app.emit('source-read', docname, arg)
-                    data = arg[0]
-                if self.config.rst_epilog:
-                    data = data + '\n' + self.config.rst_epilog + '\n'
-                if self.config.rst_prolog:
-                    data = self.config.rst_prolog + '\n' + data
-                return data
-
         # publish manually
         pub = Publisher(reader=SphinxStandaloneReader(),
                         writer=SphinxDummyWriter(),
-                        source_class=SphinxSourceClass,
                         destination_class=NullOutput)
         pub.set_components(None, 'restructuredtext', None)
         pub.process_programmatic_settings(None, self.settings, None)
-        pub.set_source(None, src_path)
+        src_path = self.doc2path(docname)
+        source = SphinxFileInput(app, self, source=None, source_path=src_path,
+                                 encoding=self.config.source_encoding)
+        pub.source = source
+        pub.settings._source = src_path
         pub.set_destination(None, None)
         pub.publish()
         doctree = pub.document
@@ -641,12 +766,12 @@ class BuildEnvironment:
         if app:
             app.emit('doctree-read', doctree)
 
-        # store time of build, for outdated files detection
+        # store time of reading, for outdated files detection
         # (Some filesystems have coarse timestamp resolution;
         # therefore time.time() can be older than filesystem's timestamp.
         # For example, FAT32 has 2sec timestamp resolution.)
         self.all_docs[docname] = max(
             time.time(), path.getmtime(self.doc2path(docname)))
 
         if self.versioning_condition:
             # get old doctree
@@ -679,21 +804,20 @@ class BuildEnvironment:
 
         # cleanup
         self.temp_data.clear()
+        self.ref_context.clear()
+        roles._roles.pop('', None)  # if a document has set a local default role
 
-        if save_parsed:
-            # save the parsed doctree
-            doctree_filename = self.doc2path(docname, self.doctreedir,
-                                             '.doctree')
-            dirname = path.dirname(doctree_filename)
-            if not path.isdir(dirname):
-                os.makedirs(dirname)
-            f = open(doctree_filename, 'wb')
-            try:
-                pickle.dump(doctree, f, pickle.HIGHEST_PROTOCOL)
-            finally:
-                f.close()
-        else:
-            return doctree
+        # save the parsed doctree
+        doctree_filename = self.doc2path(docname, self.doctreedir,
+                                         '.doctree')
+        dirname = path.dirname(doctree_filename)
+        if not path.isdir(dirname):
+            os.makedirs(dirname)
+        f = open(doctree_filename, 'wb')
+        try:
+            pickle.dump(doctree, f, pickle.HIGHEST_PROTOCOL)
+        finally:
+            f.close()
 
         # utilities to use while reading a document
 
@@ -704,13 +828,17 @@ class BuildEnvironment:
 
     @property
     def currmodule(self):
-        """Backwards compatible alias."""
-        return self.temp_data.get('py:module')
+        """Backwards compatible alias. Will be removed."""
+        self.warn(self.docname, 'env.currmodule is being referenced by an '
+                  'extension; this API will be removed in the future')
+        return self.ref_context.get('py:module')
 
     @property
     def currclass(self):
-        """Backwards compatible alias."""
-        return self.temp_data.get('py:class')
+        """Backwards compatible alias. Will be removed."""
+        self.warn(self.docname, 'env.currclass is being referenced by an '
+                  'extension; this API will be removed in the future')
+        return self.ref_context.get('py:class')
 
     def new_serialno(self, category=''):
         """Return a serial number, e.g. for index entry targets.
@@ -740,7 +868,7 @@ class BuildEnvironment:
     def note_versionchange(self, type, version, node, lineno):
         self.versionchanges.setdefault(version, []).append(
             (type, self.temp_data['docname'], lineno,
-             self.temp_data.get('py:module'),
+             self.ref_context.get('py:module'),
              self.temp_data.get('object'), node.astext()))
 
     # post-processing of read doctrees
@@ -755,7 +883,7 @@ class BuildEnvironment:
 
     def process_dependencies(self, docname, doctree):
         """Process docutils-generated dependency info."""
-        cwd = os.getcwd()
+        cwd = getcwd()
         frompath = path.join(path.normpath(self.srcdir), 'dummy')
         deps = doctree.settings.record_dependencies
         if not deps:
@@ -763,6 +891,8 @@ class BuildEnvironment:
         for dep in deps.list:
             # the dependency path is relative to the working dir, so get
             # one relative to the srcdir
+            if isinstance(dep, bytes):
+                dep = dep.decode(fs_encoding)
             relpath = relative_path(frompath,
                                     path.normpath(path.join(cwd, dep)))
             self.dependencies.setdefault(docname, set()).add(relpath)
@@ -846,7 +976,7 @@ class BuildEnvironment:
             # nodes are multiply inherited...
             if isinstance(node, nodes.authors):
                 md['authors'] = [author.astext() for author in node]
             elif isinstance(node, nodes.TextElement):  # e.g. author
                 md[node.__class__.__name__] = node.astext()
             else:
                 name, body = node
@@ -976,7 +1106,7 @@ class BuildEnvironment:
 
     def build_toc_from(self, docname, document):
         """Build a TOC from the doctree and store it in the inventory."""
         numentries = [0]  # nonlocal again...
 
         def traverse_in_section(node, cls):
             """Like traverse(), but stay within the same section."""
@@ -1102,7 +1232,6 @@ class BuildEnvironment:
                 stream=WarningStream(self._warnfunc))
         return doctree
 
-
     def get_and_resolve_doctree(self, docname, builder, doctree=None,
                                 prune_toctrees=True, includehidden=False):
         """Read the doctree from the pickle, resolve cross-references and
@@ -1117,7 +1246,8 @@ class BuildEnvironment:
         # now, resolve all toctree nodes
         for toctreenode in doctree.traverse(addnodes.toctree):
             result = self.resolve_toctree(docname, builder, toctreenode,
-                                          prune=prune_toctrees, includehidden=includehidden)
+                                          prune=prune_toctrees,
+                                          includehidden=includehidden)
             if result is None:
                 toctreenode.replace_self([])
             else:
@@ -1174,7 +1304,7 @@ class BuildEnvironment:
                 else:
                     # cull sub-entries whose parents aren't 'current'
                     if (collapse and depth > 1 and
                             'iscurrent' not in subnode.parent):
                         subnode.parent.remove(subnode)
                     else:
                         # recurse on visible children
@@ -1256,7 +1386,7 @@ class BuildEnvironment:
             child = toc.children[0]
             for refnode in child.traverse(nodes.reference):
                 if refnode['refuri'] == ref and \
                    not refnode['anchorname']:
                     refnode.children = [nodes.Text(title)]
         if not toc.children:
             # empty toc means: no titles will show up in the toctree
@@ -1346,49 +1476,23 @@ class BuildEnvironment:
                     domain = self.domains[node['refdomain']]
                 except KeyError:
                     raise NoUri
-                newnode = domain.resolve_xref(self, fromdocname, builder,
+                newnode = domain.resolve_xref(self, refdoc, builder,
                                               typ, target, node, contnode)
             # really hardwired reference types
+            elif typ == 'any':
+                newnode = self._resolve_any_reference(builder, node, contnode)
             elif typ == 'doc':
-                # directly reference to document by source name;
-                # can be absolute or relative
-                docname = docname_join(refdoc, target)
-                if docname in self.all_docs:
-                    if node['refexplicit']:
-                        # reference with explicit title
-                        caption = node.astext()
-                    else:
-                        caption = clean_astext(self.titles[docname])
-                    innernode = nodes.emphasis(caption, caption)
-                    newnode = nodes.reference('', '', internal=True)
-                    newnode['refuri'] = builder.get_relative_uri(
-                        fromdocname, docname)
-                    newnode.append(innernode)
+                newnode = self._resolve_doc_reference(builder, node, contnode)
             elif typ == 'citation':
-                docname, labelid = self.citations.get(target, ('', ''))
-                if docname:
-                    try:
-                        newnode = make_refnode(builder, fromdocname,
-                                               docname, labelid, contnode)
-                    except NoUri:
-                        # remove the ids we added in the CitationReferences
-                        # transform since they can't be transfered to
-                        # the contnode (if it's a Text node)
-                        if not isinstance(contnode, nodes.Element):
-                            del node['ids'][:]
-                        raise
-                elif 'ids' in node:
-                    # remove ids attribute that annotated at
-                    # transforms.CitationReference.apply.
-                    del node['ids'][:]
+                newnode = self._resolve_citation(builder, refdoc, node, contnode)
             # no new node found? try the missing-reference event
             if newnode is None:
                 newnode = builder.app.emit_firstresult(
                     'missing-reference', self, node, contnode)
-                # still not found? warn if in nit-picky mode
+                # still not found? warn if node wishes to be warned about or
+                # we are in nit-picky mode
                 if newnode is None:
-                    self._warn_missing_reference(
-                        fromdocname, typ, target, node, domain)
+                    self._warn_missing_reference(refdoc, typ, target, node, domain)
         except NoUri:
             newnode = contnode
         node.replace_self(newnode or contnode)
@@ -1399,7 +1503,7 @@ class BuildEnvironment:
         # allow custom references to be resolved
         builder.app.emit('doctree-resolved', doctree, fromdocname)
 
-    def _warn_missing_reference(self, fromdoc, typ, target, node, domain):
+    def _warn_missing_reference(self, refdoc, typ, target, node, domain):
         warn = node.get('refwarn')
         if self.config.nitpicky:
             warn = True
@ -1418,13 +1522,91 @@ class BuildEnvironment:
|
|||||||
msg = 'unknown document: %(target)s'
|
msg = 'unknown document: %(target)s'
|
||||||
elif typ == 'citation':
|
elif typ == 'citation':
|
||||||
msg = 'citation not found: %(target)s'
|
msg = 'citation not found: %(target)s'
|
||||||
elif node.get('refdomain', 'std') != 'std':
|
elif node.get('refdomain', 'std') not in ('', 'std'):
|
||||||
msg = '%s:%s reference target not found: %%(target)s' % \
|
msg = '%s:%s reference target not found: %%(target)s' % \
|
||||||
(node['refdomain'], typ)
|
(node['refdomain'], typ)
|
||||||
else:
|
else:
|
||||||
msg = '%s reference target not found: %%(target)s' % typ
|
msg = '%r reference target not found: %%(target)s' % typ
|
||||||
self.warn_node(msg % {'target': target}, node)
|
self.warn_node(msg % {'target': target}, node)
|
||||||
|
|
||||||
|
def _resolve_doc_reference(self, builder, node, contnode):
|
||||||
|
# directly reference to document by source name;
|
||||||
|
# can be absolute or relative
|
||||||
|
docname = docname_join(node['refdoc'], node['reftarget'])
|
||||||
|
if docname in self.all_docs:
|
||||||
|
if node['refexplicit']:
|
||||||
|
# reference with explicit title
|
||||||
|
caption = node.astext()
|
||||||
|
else:
|
||||||
|
caption = clean_astext(self.titles[docname])
|
||||||
|
innernode = nodes.emphasis(caption, caption)
|
||||||
|
newnode = nodes.reference('', '', internal=True)
|
||||||
|
newnode['refuri'] = builder.get_relative_uri(node['refdoc'], docname)
|
||||||
|
newnode.append(innernode)
|
||||||
|
return newnode
|
||||||
|
|
||||||
|
def _resolve_citation(self, builder, fromdocname, node, contnode):
|
||||||
|
docname, labelid = self.citations.get(node['reftarget'], ('', ''))
|
||||||
|
if docname:
|
||||||
|
try:
|
||||||
|
newnode = make_refnode(builder, fromdocname,
|
||||||
|
docname, labelid, contnode)
|
||||||
|
return newnode
|
||||||
|
except NoUri:
|
||||||
|
# remove the ids we added in the CitationReferences
|
||||||
|
# transform since they can't be transfered to
|
||||||
|
# the contnode (if it's a Text node)
|
||||||
|
+            if not isinstance(contnode, nodes.Element):
+                del node['ids'][:]
+                raise
+        elif 'ids' in node:
+            # remove ids attribute that annotated at
+            # transforms.CitationReference.apply.
+            del node['ids'][:]
+
+    def _resolve_any_reference(self, builder, node, contnode):
+        """Resolve reference generated by the "any" role."""
+        refdoc = node['refdoc']
+        target = node['reftarget']
+        results = []
+        # first, try resolving as :doc:
+        doc_ref = self._resolve_doc_reference(builder, node, contnode)
+        if doc_ref:
+            results.append(('doc', doc_ref))
+        # next, do the standard domain (makes this a priority)
+        results.extend(self.domains['std'].resolve_any_xref(
+            self, refdoc, builder, target, node, contnode))
+        for domain in self.domains.values():
+            if domain.name == 'std':
+                continue  # we did this one already
+            try:
+                results.extend(domain.resolve_any_xref(self, refdoc, builder,
+                                                       target, node, contnode))
+            except NotImplementedError:
+                # the domain doesn't yet support the new interface
+                # we have to manually collect possible references (SLOW)
+                for role in domain.roles:
+                    res = domain.resolve_xref(self, refdoc, builder, role, target,
+                                              node, contnode)
+                    if res:
+                        results.append(('%s:%s' % (domain.name, role), res))
+        # now, see how many matches we got...
+        if not results:
+            return None
+        if len(results) > 1:
+            nice_results = ' or '.join(':%s:' % r[0] for r in results)
+            self.warn_node('more than one target found for \'any\' cross-'
+                           'reference %r: could be %s' % (target, nice_results),
+                           node)
+        res_role, newnode = results[0]
+        # Override "any" class with the actual role type to get the styling
+        # approximately correct.
+        res_domain = res_role.split(':')[0]
+        if newnode and newnode[0].get('classes'):
+            newnode[0]['classes'].append(res_domain)
+            newnode[0]['classes'].append(res_role.replace(':', '-'))
+        return newnode
+
     def process_only_nodes(self, doctree, builder, fromdocname=None):
         # A comment on the comment() nodes being inserted: replacing by [] would
         # result in a "Losing ids" exception if there is a target node before
@@ -1595,7 +1777,7 @@ class BuildEnvironment:
                     # prefixes match: add entry as subitem of the
                     # previous entry
                     oldsubitems.setdefault(m.group(2), [[], {}])[0].\
                         extend(targets)
                     del newlist[i]
                     continue
                 oldkey = m.group(1)
@@ -1622,6 +1804,7 @@ class BuildEnvironment:
     def collect_relations(self):
         relations = {}
         getinc = self.toctree_includes.get
+
         def collect(parents, parents_set, docname, previous, next):
             # circular relationship?
             if docname in parents_set:
@@ -1661,8 +1844,8 @@ class BuildEnvironment:
             # same for children
             if includes:
                 for subindex, args in enumerate(zip(includes,
                                                     [None] + includes,
                                                     includes[1:] + [None])):
                     collect([(docname, subindex)] + parents,
                             parents_set.union([docname]), *args)
             relations[docname] = [parents[0][0], previous, next]
@@ -10,6 +10,9 @@
     :license: BSD, see LICENSE for details.
 """
 
+import traceback
+
+
 class SphinxError(Exception):
     """
     Base class for Sphinx errors that are shown to the user in a nicer
@@ -62,3 +65,13 @@ class PycodeError(Exception):
         if len(self.args) > 1:
             res += ' (exception was: %r)' % self.args[1]
         return res
+
+
+class SphinxParallelError(Exception):
+    def __init__(self, orig_exc, traceback):
+        self.orig_exc = orig_exc
+        self.traceback = traceback
+
+    def __str__(self):
+        return traceback.format_exception_only(
+            self.orig_exc.__class__, self.orig_exc)[0].strip()
@@ -30,7 +30,7 @@ from sphinx.application import ExtensionError
 from sphinx.util.nodes import nested_parse_with_titles
 from sphinx.util.compat import Directive
 from sphinx.util.inspect import getargspec, isdescriptor, safe_getmembers, \
     safe_getattr, safe_repr, is_builtin_class_method
 from sphinx.util.docstrings import prepare_docstring
 
 
@@ -50,11 +50,13 @@ class DefDict(dict):
     def __init__(self, default):
         dict.__init__(self)
         self.default = default
 
     def __getitem__(self, key):
         try:
             return dict.__getitem__(self, key)
         except KeyError:
             return self.default
 
     def __bool__(self):
         # docutils check "if option_spec"
         return True
@@ -92,6 +94,7 @@ class _MockModule(object):
         else:
             return _MockModule()
 
+
 def mock_import(modname):
     if '.' in modname:
         pkg, _n, mods = modname.rpartition('.')
@@ -104,12 +107,14 @@ def mock_import(modname):
 ALL = object()
 INSTANCEATTR = object()
 
+
 def members_option(arg):
     """Used to convert the :members: option to auto directives."""
     if arg is None:
         return ALL
     return [x.strip() for x in arg.split(',')]
 
+
 def members_set_option(arg):
     """Used to convert the :members: option to auto directives."""
     if arg is None:
@@ -118,6 +123,7 @@ def members_set_option(arg):
 
 SUPPRESS = object()
 
+
 def annotation_option(arg):
     if arg is None:
         # suppress showing the representation of the object
@@ -125,6 +131,7 @@ def annotation_option(arg):
     else:
         return arg
 
+
 def bool_option(arg):
     """Used to convert flag options to auto directives. (Instead of
     directives.flag(), which returns None).
@@ -201,6 +208,7 @@ def cut_lines(pre, post=0, what=None):
         lines.append('')
     return process
 
+
 def between(marker, what=None, keepempty=False, exclude=False):
     """Return a listener that either keeps, or if *exclude* is True excludes,
     lines between lines that match the *marker* regular expression. If no line
@@ -211,6 +219,7 @@ def between(marker, what=None, keepempty=False, exclude=False):
     be processed.
     """
     marker_re = re.compile(marker)
+
     def process(app, what_, name, obj, options, lines):
         if what and what_ not in what:
             return
@@ -325,7 +334,7 @@ class Documenter(object):
         # an autogenerated one
         try:
             explicit_modname, path, base, args, retann = \
                 py_ext_sig_re.match(self.name).groups()
         except AttributeError:
             self.directive.warn('invalid signature for auto%s (%r)' %
                                 (self.objtype, self.name))
@@ -340,7 +349,7 @@ class Documenter(object):
         parents = []
 
         self.modname, self.objpath = \
             self.resolve_name(modname, parents, path, base)
 
         if not self.modname:
             return False
@@ -637,19 +646,19 @@ class Documenter(object):
 
             keep = False
             if want_all and membername.startswith('__') and \
                     membername.endswith('__') and len(membername) > 4:
                 # special __methods__
                 if self.options.special_members is ALL and \
                         membername != '__doc__':
                     keep = has_doc or self.options.undoc_members
                 elif self.options.special_members and \
                         self.options.special_members is not ALL and \
                         membername in self.options.special_members:
                     keep = has_doc or self.options.undoc_members
             elif want_all and membername.startswith('_'):
                 # ignore members whose name starts with _ by default
                 keep = self.options.private_members and \
                     (has_doc or self.options.undoc_members)
             elif (namespace, membername) in attr_docs:
                 # keep documented attributes
                 keep = True
@@ -685,7 +694,7 @@ class Documenter(object):
                 self.env.temp_data['autodoc:class'] = self.objpath[0]
 
         want_all = all_members or self.options.inherited_members or \
             self.options.members is ALL
         # find out which members are documentable
         members_check_module, members = self.get_object_members(want_all)
 
@@ -707,11 +716,11 @@ class Documenter(object):
             # give explicitly separated module name, so that members
             # of inner classes can be documented
             full_mname = self.modname + '::' + \
                 '.'.join(self.objpath + [mname])
             documenter = classes[-1](self.directive, full_mname, self.indent)
             memberdocumenters.append((documenter, isattr))
         member_order = self.options.member_order or \
             self.env.config.autodoc_member_order
         if member_order == 'groupwise':
             # sort by group; relies on stable sort to keep items in the
             # same group sorted alphabetically
@@ -719,6 +728,7 @@ class Documenter(object):
         elif member_order == 'bysource' and self.analyzer:
             # sort by source order, by virtue of the module analyzer
             tagorder = self.analyzer.tagorder
+
             def keyfunc(entry):
                 fullname = entry[0].name.split('::')[1]
                 return tagorder.get(fullname, len(tagorder))
@@ -872,7 +882,7 @@ class ModuleDocumenter(Documenter):
                 self.directive.warn(
                     'missing attribute mentioned in :members: or __all__: '
                     'module %s, attribute %s' % (
                         safe_getattr(self.object, '__name__', '???'), mname))
             return False, ret
 
 
@@ -891,7 +901,7 @@ class ModuleLevelDocumenter(Documenter):
             modname = self.env.temp_data.get('autodoc:module')
             # ... or in the scope of a module directive
             if not modname:
-                modname = self.env.temp_data.get('py:module')
+                modname = self.env.ref_context.get('py:module')
             # ... else, it stays None, which means invalid
         return modname, parents + [base]
 
@@ -913,7 +923,7 @@ class ClassLevelDocumenter(Documenter):
         mod_cls = self.env.temp_data.get('autodoc:class')
         # ... or from a class directive
         if mod_cls is None:
-            mod_cls = self.env.temp_data.get('py:class')
+            mod_cls = self.env.ref_context.get('py:class')
         # ... if still None, there's no way to know
         if mod_cls is None:
             return None, []
@@ -923,7 +933,7 @@ class ClassLevelDocumenter(Documenter):
         if not modname:
             modname = self.env.temp_data.get('autodoc:module')
         if not modname:
-            modname = self.env.temp_data.get('py:module')
+            modname = self.env.ref_context.get('py:module')
         # ... else, it stays None, which means invalid
         return modname, parents + [base]
 
@@ -976,6 +986,7 @@ class DocstringSignatureMixin(object):
                 self.args, self.retann = result
         return Documenter.format_signature(self)
 
+
 class DocstringStripSignatureMixin(DocstringSignatureMixin):
     """
     Mixin for AttributeDocumenter to provide the
@@ -1007,7 +1018,7 @@ class FunctionDocumenter(DocstringSignatureMixin, ModuleLevelDocumenter):
 
     def format_args(self):
         if inspect.isbuiltin(self.object) or \
                inspect.ismethoddescriptor(self.object):
             # cannot introspect arguments of a C function or method
             return None
         try:
@@ -1070,8 +1081,8 @@ class ClassDocumenter(ModuleLevelDocumenter):
         # classes without __init__ method, default __init__ or
         # __init__ written in C?
         if initmeth is None or \
                 is_builtin_class_method(self.object, '__init__') or \
                 not(inspect.ismethod(initmeth) or inspect.isfunction(initmeth)):
             return None
         try:
             argspec = getargspec(initmeth)
@@ -1109,7 +1120,7 @@ class ClassDocumenter(ModuleLevelDocumenter):
         if not self.doc_as_attr and self.options.show_inheritance:
             self.add_line(u'', '<autodoc>')
             if hasattr(self.object, '__bases__') and len(self.object.__bases__):
-                bases = [b.__module__ == '__builtin__' and
+                bases = [b.__module__ in ('__builtin__', 'builtins') and
                          u':class:`%s`' % b.__name__ or
                          u':class:`%s.%s`' % (b.__module__, b.__name__)
                          for b in self.object.__bases__]
@@ -1142,7 +1153,7 @@ class ClassDocumenter(ModuleLevelDocumenter):
         # for new-style classes, no __init__ means default __init__
         if (initdocstring is not None and
             (initdocstring == object.__init__.__doc__ or  # for pypy
-             initdocstring.strip() == object.__init__.__doc__)): #for !pypy
+             initdocstring.strip() == object.__init__.__doc__)):  # for !pypy
             initdocstring = None
         if initdocstring:
             if content == 'init':
@@ -1186,7 +1197,7 @@ class ExceptionDocumenter(ClassDocumenter):
     @classmethod
     def can_document_member(cls, member, membername, isattr, parent):
         return isinstance(member, class_types) and \
             issubclass(member, BaseException)
 
 
 class DataDocumenter(ModuleLevelDocumenter):
@@ -1233,7 +1244,7 @@ class MethodDocumenter(DocstringSignatureMixin, ClassLevelDocumenter):
     @classmethod
     def can_document_member(cls, member, membername, isattr, parent):
         return inspect.isroutine(member) and \
             not isinstance(parent, ModuleDocumenter)
 
     def import_object(self):
         ret = ClassLevelDocumenter.import_object(self)
@@ -1257,7 +1268,7 @@ class MethodDocumenter(DocstringSignatureMixin, ClassLevelDocumenter):
 
     def format_args(self):
         if inspect.isbuiltin(self.object) or \
                inspect.ismethoddescriptor(self.object):
             # can never get arguments of a C function or method
             return None
         argspec = getargspec(self.object)
@@ -1272,7 +1283,7 @@ class MethodDocumenter(DocstringSignatureMixin, ClassLevelDocumenter):
             pass
 
 
-class AttributeDocumenter(DocstringStripSignatureMixin,ClassLevelDocumenter):
+class AttributeDocumenter(DocstringStripSignatureMixin, ClassLevelDocumenter):
     """
     Specialized Documenter subclass for attributes.
     """
@@ -1290,9 +1301,9 @@ class AttributeDocumenter(DocstringStripSignatureMixin,ClassLevelDocumenter):
     @classmethod
     def can_document_member(cls, member, membername, isattr, parent):
         isdatadesc = isdescriptor(member) and not \
             isinstance(member, cls.method_types) and not \
             type(member).__name__ in ("type", "method_descriptor",
                                       "instancemethod")
         return isdatadesc or (not isinstance(parent, ModuleDocumenter)
                               and not inspect.isroutine(member)
                               and not isinstance(member, class_types))
@@ -1303,7 +1314,7 @@ class AttributeDocumenter(DocstringStripSignatureMixin,ClassLevelDocumenter):
     def import_object(self):
         ret = ClassLevelDocumenter.import_object(self)
         if isdescriptor(self.object) and \
                not isinstance(self.object, self.method_types):
             self._datadescriptor = True
         else:
             # if it's not a data descriptor
@@ -1312,7 +1323,7 @@ class AttributeDocumenter(DocstringStripSignatureMixin,ClassLevelDocumenter):
 
     def get_real_modname(self):
         return self.get_attr(self.parent or self.object, '__module__', None) \
             or self.modname
 
     def add_directive_header(self, sig):
         ClassLevelDocumenter.add_directive_header(self, sig)
@@ -1479,7 +1490,7 @@ def add_documenter(cls):
         raise ExtensionError('autodoc documenter %r must be a subclass '
                              'of Documenter' % cls)
     # actually, it should be possible to override Documenters
-    #if cls.objtype in AutoDirective._registry:
+    # if cls.objtype in AutoDirective._registry:
     #     raise ExtensionError('autodoc documenter for %r is already '
     #                          'registered' % cls.objtype)
     AutoDirective._registry[cls.objtype] = cls
@@ -1504,7 +1515,7 @@ def setup(app):
     app.add_event('autodoc-process-signature')
     app.add_event('autodoc-skip-member')
 
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}
 
 
 class testcls:
@@ -432,11 +432,11 @@ def get_import_prefixes_from_env(env):
     """
     prefixes = [None]
 
-    currmodule = env.temp_data.get('py:module')
+    currmodule = env.ref_context.get('py:module')
     if currmodule:
         prefixes.insert(0, currmodule)
 
-    currclass = env.temp_data.get('py:class')
+    currclass = env.ref_context.get('py:class')
    if currclass:
         if currmodule:
             prefixes.insert(0, currmodule + "." + currclass)
@@ -570,4 +570,4 @@ def setup(app):
     app.connect('doctree-read', process_autosummary_toc)
     app.connect('builder-inited', process_generate_options)
     app.add_config_value('autosummary_generate', [], True)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}
@@ -265,4 +265,4 @@ def setup(app):
     app.add_config_value('coverage_ignore_c_items', {}, False)
     app.add_config_value('coverage_write_headline', True, False)
     app.add_config_value('coverage_skip_undoc_in_source', False, False)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}
@ -32,6 +32,7 @@ from sphinx.util.console import bold
|
|||||||
blankline_re = re.compile(r'^\s*<BLANKLINE>', re.MULTILINE)
|
blankline_re = re.compile(r'^\s*<BLANKLINE>', re.MULTILINE)
|
||||||
doctestopt_re = re.compile(r'#\s*doctest:.+$', re.MULTILINE)
|
doctestopt_re = re.compile(r'#\s*doctest:.+$', re.MULTILINE)
|
||||||
|
|
||||||
|
|
||||||
# set up the necessary directives
|
# set up the necessary directives
|
||||||
|
|
||||||
class TestDirective(Directive):
|
class TestDirective(Directive):
|
||||||
@ -79,30 +80,35 @@ class TestDirective(Directive):
|
|||||||
option_strings = self.options['options'].replace(',', ' ').split()
|
option_strings = self.options['options'].replace(',', ' ').split()
|
||||||
for option in option_strings:
|
for option in option_strings:
|
||||||
if (option[0] not in '+-' or option[1:] not in
|
if (option[0] not in '+-' or option[1:] not in
|
||||||
doctest.OPTIONFLAGS_BY_NAME):
|
doctest.OPTIONFLAGS_BY_NAME):
|
||||||
# XXX warn?
|
# XXX warn?
|
||||||
continue
|
continue
|
||||||
flag = doctest.OPTIONFLAGS_BY_NAME[option[1:]]
|
flag = doctest.OPTIONFLAGS_BY_NAME[option[1:]]
|
||||||
node['options'][flag] = (option[0] == '+')
|
node['options'][flag] = (option[0] == '+')
|
||||||
return [node]
|
return [node]
|
||||||
|
|
||||||
|
|
||||||
class TestsetupDirective(TestDirective):
|
class TestsetupDirective(TestDirective):
|
||||||
option_spec = {}
|
option_spec = {}
|
||||||
|
|
||||||
|
|
||||||
class TestcleanupDirective(TestDirective):
|
class TestcleanupDirective(TestDirective):
|
||||||
option_spec = {}
|
option_spec = {}
|
||||||
|
|
||||||
|
|
||||||
class DoctestDirective(TestDirective):
|
class DoctestDirective(TestDirective):
|
||||||
option_spec = {
|
option_spec = {
|
||||||
'hide': directives.flag,
|
'hide': directives.flag,
|
||||||
'options': directives.unchanged,
|
'options': directives.unchanged,
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
class TestcodeDirective(TestDirective):
|
class TestcodeDirective(TestDirective):
|
||||||
option_spec = {
|
option_spec = {
|
||||||
'hide': directives.flag,
|
'hide': directives.flag,
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
class TestoutputDirective(TestDirective):
|
class TestoutputDirective(TestDirective):
|
||||||
option_spec = {
|
option_spec = {
|
||||||
'hide': directives.flag,
|
'hide': directives.flag,
|
||||||
@ -112,6 +118,7 @@ class TestoutputDirective(TestDirective):
|
|||||||
|
|
||||||
parser = doctest.DocTestParser()
|
parser = doctest.DocTestParser()
|
||||||
|
|
||||||
|
|
||||||
# helper classes
|
# helper classes
|
||||||
|
|
||||||
class TestGroup(object):
|
class TestGroup(object):
|
||||||
@ -196,7 +203,7 @@ class DocTestBuilder(Builder):
|
|||||||
def init(self):
|
def init(self):
|
||||||
# default options
|
# default options
|
||||||
self.opt = doctest.DONT_ACCEPT_TRUE_FOR_1 | doctest.ELLIPSIS | \
|
self.opt = doctest.DONT_ACCEPT_TRUE_FOR_1 | doctest.ELLIPSIS | \
|
||||||
doctest.IGNORE_EXCEPTION_DETAIL
|
doctest.IGNORE_EXCEPTION_DETAIL
|
||||||
|
|
||||||
# HACK HACK HACK
|
# HACK HACK HACK
|
||||||
# doctest compiles its snippets with type 'single'. That is nice
|
# doctest compiles its snippets with type 'single'. That is nice
|
||||||
@ -247,6 +254,10 @@ Results of doctest builder run on %s
|
|||||||
# write executive summary
|
# write executive summary
|
||||||
def s(v):
|
def s(v):
|
||||||
return v != 1 and 's' or ''
|
return v != 1 and 's' or ''
|
||||||
|
repl = (self.total_tries, s(self.total_tries),
|
||||||
|
self.total_failures, s(self.total_failures),
|
||||||
|
self.setup_failures, s(self.setup_failures),
|
||||||
|
self.cleanup_failures, s(self.cleanup_failures))
|
||||||
self._out('''
|
self._out('''
|
||||||
Doctest summary
|
Doctest summary
|
||||||
===============
|
===============
|
||||||
@ -254,10 +265,7 @@ Doctest summary
|
|||||||
%5d failure%s in tests
|
%5d failure%s in tests
|
||||||
%5d failure%s in setup code
|
%5d failure%s in setup code
|
||||||
%5d failure%s in cleanup code
|
%5d failure%s in cleanup code
|
||||||
''' % (self.total_tries, s(self.total_tries),
|
''' % repl)
|
||||||
self.total_failures, s(self.total_failures),
|
|
||||||
self.setup_failures, s(self.setup_failures),
|
|
||||||
self.cleanup_failures, s(self.cleanup_failures)))
|
|
||||||
self.outfile.close()
|
self.outfile.close()
|
||||||
|
|
||||||
if self.total_failures or self.setup_failures or self.cleanup_failures:
|
if self.total_failures or self.setup_failures or self.cleanup_failures:
|
||||||
@ -290,11 +298,11 @@ Doctest summary
|
|||||||
def condition(node):
|
def condition(node):
|
||||||
return (isinstance(node, (nodes.literal_block, nodes.comment))
|
return (isinstance(node, (nodes.literal_block, nodes.comment))
|
||||||
and 'testnodetype' in node) or \
|
and 'testnodetype' in node) or \
|
||||||
isinstance(node, nodes.doctest_block)
|
isinstance(node, nodes.doctest_block)
|
||||||
else:
|
else:
|
||||||
def condition(node):
|
def condition(node):
|
||||||
return isinstance(node, (nodes.literal_block, nodes.comment)) \
|
return isinstance(node, (nodes.literal_block, nodes.comment)) \
|
||||||
and 'testnodetype' in node
|
and 'testnodetype' in node
|
||||||
for node in doctree.traverse(condition):
|
for node in doctree.traverse(condition):
|
||||||
source = 'test' in node and node['test'] or node.astext()
|
source = 'test' in node and node['test'] or node.astext()
|
||||||
if not source:
|
if not source:
|
||||||
@ -364,7 +372,7 @@ Doctest summary
|
|||||||
filename, 0, None)
|
filename, 0, None)
|
||||||
sim_doctest.globs = ns
|
sim_doctest.globs = ns
|
||||||
old_f = runner.failures
|
old_f = runner.failures
|
||||||
self.type = 'exec' # the snippet may contain multiple statements
|
self.type = 'exec' # the snippet may contain multiple statements
|
||||||
runner.run(sim_doctest, out=self._warn_out, clear_globs=False)
|
runner.run(sim_doctest, out=self._warn_out, clear_globs=False)
|
||||||
if runner.failures > old_f:
|
if runner.failures > old_f:
|
||||||
return False
|
return False
|
||||||
@@ -394,7 +402,7 @@ Doctest summary
                     new_opt = code[0].options.copy()
                     new_opt.update(example.options)
                     example.options = new_opt
                 self.type = 'single'  # as for ordinary doctests
             else:
                 # testcode and output separate
                 output = code[1] and code[1].code or ''
@@ -413,7 +421,7 @@ Doctest summary
                                            options=options)
                 test = doctest.DocTest([example], {}, group.name,
                                        filename, code[0].lineno, None)
                 self.type = 'exec'  # multiple statements again
             # DocTest.__init__ copies the globs namespace, which we don't want
             test.globs = ns
             # also don't clear the globs namespace after running the doctest
@@ -435,4 +443,4 @@ def setup(app):
     app.add_config_value('doctest_test_doctest_blocks', 'default', False)
     app.add_config_value('doctest_global_setup', '', False)
     app.add_config_value('doctest_global_cleanup', '', False)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -59,4 +59,4 @@ def setup_link_roles(app):
 def setup(app):
     app.add_config_value('extlinks', {}, 'env')
     app.connect('builder-inited', setup_link_roles)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -323,4 +323,4 @@ def setup(app):
     app.add_config_value('graphviz_dot', 'dot', 'html')
     app.add_config_value('graphviz_dot_args', [], 'html')
     app.add_config_value('graphviz_output_format', 'png', 'html')
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -73,4 +73,4 @@ def setup(app):
     app.add_node(ifconfig)
     app.add_directive('ifconfig', IfConfig)
     app.connect('doctree-resolved', process_ifconfig_nodes)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -39,13 +39,14 @@ r"""
 import re
 import sys
 import inspect
-import __builtin__ as __builtin__  # as __builtin__ is for lib2to3 compatibility
 try:
     from hashlib import md5
 except ImportError:
     from md5 import md5

 from six import text_type
+from six.moves import builtins

 from docutils import nodes
 from docutils.parsers.rst import directives

@@ -147,10 +148,10 @@ class InheritanceGraph(object):
         displayed node names.
         """
         all_classes = {}
-        builtins = vars(__builtin__).values()
+        py_builtins = vars(builtins).values()

         def recurse(cls):
-            if not show_builtins and cls in builtins:
+            if not show_builtins and cls in py_builtins:
                 return
             if not private_bases and cls.__name__.startswith('_'):
                 return
@@ -174,7 +175,7 @@ class InheritanceGraph(object):
             baselist = []
             all_classes[cls] = (nodename, fullname, baselist, tooltip)
             for base in cls.__bases__:
-                if not show_builtins and base in builtins:
+                if not show_builtins and base in py_builtins:
                     continue
                 if not private_bases and base.__name__.startswith('_'):
                     continue
@@ -194,7 +195,7 @@ class InheritanceGraph(object):
         completely general.
         """
         module = cls.__module__
-        if module == '__builtin__':
+        if module in ('__builtin__', 'builtins'):
             fullname = cls.__name__
         else:
             fullname = '%s.%s' % (module, cls.__name__)
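The hunk above widens the built-in-module check for the Python 2/3 port: the builtins module is named `__builtin__` on Python 2 and `builtins` on Python 3, and checking both keeps built-in classes displayed with a bare name. A standalone sketch of the same check:

```python
def class_name(cls):
    # Mirror of the check above: built-in classes get a bare name,
    # everything else is qualified with its defining module.
    module = cls.__module__
    if module in ('__builtin__', 'builtins'):
        return cls.__name__
    return '%s.%s' % (module, cls.__name__)
```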
@@ -310,7 +311,7 @@ class InheritanceDiagram(Directive):
         # Create a graph starting with the list of classes
         try:
             graph = InheritanceGraph(
-                class_names, env.temp_data.get('py:module'),
+                class_names, env.ref_context.get('py:module'),
                 parts=node['parts'],
                 private_bases='private-bases' in self.options)
         except InheritanceException as err:
@@ -407,4 +408,4 @@ def setup(app):
     app.add_config_value('inheritance_graph_attrs', {}, False),
     app.add_config_value('inheritance_node_attrs', {}, False),
     app.add_config_value('inheritance_edge_attrs', {}, False),
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -222,15 +222,21 @@ def load_mappings(app):

 def missing_reference(app, env, node, contnode):
     """Attempt to resolve a missing reference via intersphinx references."""
-    domain = node.get('refdomain')
-    if not domain:
-        # only objects in domains are in the inventory
-        return
     target = node['reftarget']
-    objtypes = env.domains[domain].objtypes_for_role(node['reftype'])
-    if not objtypes:
-        return
-    objtypes = ['%s:%s' % (domain, objtype) for objtype in objtypes]
+    if node['reftype'] == 'any':
+        # we search anything!
+        objtypes = ['%s:%s' % (domain.name, objtype)
+                    for domain in env.domains.values()
+                    for objtype in domain.object_types]
+    else:
+        domain = node.get('refdomain')
+        if not domain:
+            # only objects in domains are in the inventory
+            return
+        objtypes = env.domains[domain].objtypes_for_role(node['reftype'])
+        if not objtypes:
+            return
+        objtypes = ['%s:%s' % (domain, objtype) for objtype in objtypes]
     to_try = [(env.intersphinx_inventory, target)]
     in_set = None
     if ':' in target:
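The `any` branch added above has no domain or role to narrow the lookup, so it builds the search space from every object type of every registered domain. A standalone sketch of that comprehension, with the environment's domains stubbed as plain objects (the stub class and domain contents are illustrative, not Sphinx API):

```python
# Sketch of the ``any`` branch above: enumerate every objtype of every
# domain. ``FakeDomain`` is a stand-in for sphinx.domains.Domain.
class FakeDomain(object):
    def __init__(self, name, object_types):
        self.name = name
        self.object_types = object_types


domains = {
    'py': FakeDomain('py', {'function': None, 'class': None}),
    'std': FakeDomain('std', {'label': None}),
}
objtypes = ['%s:%s' % (domain.name, objtype)
            for domain in domains.values()
            for objtype in domain.object_types]
```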
@@ -248,7 +254,7 @@ def missing_reference(app, env, node, contnode):
             # get correct path in case of subdirectories
             uri = path.join(relative_path(node['refdoc'], env.srcdir), uri)
         newnode = nodes.reference('', '', internal=False, refuri=uri,
                                   reftitle=_('(in %s v%s)') % (proj, version))
         if node.get('refexplicit'):
             # use whatever title was given
             newnode.append(contnode)
@@ -276,4 +282,4 @@ def setup(app):
     app.add_config_value('intersphinx_cache_limit', 5, False)
     app.connect('missing-reference', missing_reference)
     app.connect('builder-inited', load_mappings)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -57,4 +57,4 @@ def setup(app):
     mathbase_setup(app, (html_visit_math, None), (html_visit_displaymath, None))
     app.add_config_value('jsmath_path', '', False)
     app.connect('builder-inited', builder_inited)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -16,9 +16,11 @@ from sphinx import addnodes
 from sphinx.locale import _
 from sphinx.errors import SphinxError

+
 class LinkcodeError(SphinxError):
     category = "linkcode error"

+
 def doctree_read(app, doctree):
     env = app.builder.env

@@ -68,7 +70,8 @@ def doctree_read(app, doctree):
                                         classes=['viewcode-link'])
             signode += onlynode

+
 def setup(app):
     app.connect('doctree-read', doctree_read)
     app.add_config_value('linkcode_resolve', None, '')
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -69,4 +69,4 @@ def setup(app):
     app.add_config_value('mathjax_inline', [r'\(', r'\)'], 'html')
     app.add_config_value('mathjax_display', [r'\[', r'\]'], 'html')
     app.connect('builder-inited', builder_inited)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -256,7 +256,7 @@ def setup(app):

     for name, (default, rebuild) in iteritems(Config._config_values):
         app.add_config_value(name, default, rebuild)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}


 def _process_docstring(app, what, name, obj, options, lines):

@@ -246,4 +246,4 @@ def setup(app):
     app.add_config_value('pngmath_latex_preamble', '', 'html')
     app.add_config_value('pngmath_add_tooltips', True, 'html')
     app.connect('build-finished', cleanup_tempdir)
-    return sphinx.__version__
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -150,6 +150,14 @@ def purge_todos(app, env, docname):
                              if todo['docname'] != docname]


+def merge_info(app, env, docnames, other):
+    if not hasattr(other, 'todo_all_todos'):
+        return
+    if not hasattr(env, 'todo_all_todos'):
+        env.todo_all_todos = []
+    env.todo_all_todos.extend(other.todo_all_todos)
+
+
 def visit_todo_node(self, node):
     self.visit_admonition(node)

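The `merge_info` handler added above is the other half of the parallel-read support: each worker process reads documents into its own environment, and the main process folds the per-worker todo lists back into the main environment via the `env-merge-info` event. The same function run standalone, with environments stubbed as plain objects:

```python
# The merge_info handler from the hunk above, exercised standalone.
# ``Env`` is a bare stand-in for Sphinx's BuildEnvironment.
class Env(object):
    pass


def merge_info(app, env, docnames, other):
    if not hasattr(other, 'todo_all_todos'):
        return
    if not hasattr(env, 'todo_all_todos'):
        env.todo_all_todos = []
    env.todo_all_todos.extend(other.todo_all_todos)


main, worker = Env(), Env()
worker.todo_all_todos = [{'docname': 'intro'}]
merge_info(None, main, ['intro'], worker)
```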
@@ -172,4 +180,5 @@ def setup(app):
     app.connect('doctree-read', process_todos)
     app.connect('doctree-resolved', process_todo_nodes)
     app.connect('env-purge-doc', purge_todos)
-    return sphinx.__version__
+    app.connect('env-merge-info', merge_info)
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -20,6 +20,7 @@ from sphinx.locale import _
 from sphinx.pycode import ModuleAnalyzer
 from sphinx.util import get_full_modname
 from sphinx.util.nodes import make_refnode
+from sphinx.util.console import blue


 def _get_full_modname(app, modname, attribute):
@@ -37,7 +38,7 @@ def _get_full_modname(app, modname, attribute):
         # It should be displayed only verbose mode.
         app.verbose(traceback.format_exc().rstrip())
         app.verbose('viewcode can\'t import %s, failed with error "%s"' %
                     (modname, e))
         return None


@@ -100,6 +101,16 @@ def doctree_read(app, doctree):
             signode += onlynode


+def env_merge_info(app, env, docnames, other):
+    if not hasattr(other, '_viewcode_modules'):
+        return
+    # create a _viewcode_modules dict on the main environment
+    if not hasattr(env, '_viewcode_modules'):
+        env._viewcode_modules = {}
+    # now merge in the information from the subprocess
+    env._viewcode_modules.update(other._viewcode_modules)
+
+
 def missing_reference(app, env, node, contnode):
     # resolve our "viewcode" reference nodes -- they need special treatment
     if node['reftype'] == 'viewcode':
@@ -116,10 +127,12 @@ def collect_pages(app):

     modnames = set(env._viewcode_modules)

-    app.builder.info(' (%d module code pages)' %
-                     len(env._viewcode_modules), nonl=1)
+    # app.builder.info(' (%d module code pages)' %
+    #                  len(env._viewcode_modules), nonl=1)

-    for modname, entry in iteritems(env._viewcode_modules):
+    for modname, entry in app.status_iterator(
+            iteritems(env._viewcode_modules), 'highlighting module code... ',
+            blue, len(env._viewcode_modules), lambda x: x[0]):
         if not entry:
             continue
         code, tags, used, refname = entry
@@ -162,15 +175,14 @@ def collect_pages(app):
         context = {
             'parents': parents,
             'title': modname,
-            'body': _('<h1>Source code for %s</h1>') % modname + \
-                '\n'.join(lines)
+            'body': (_('<h1>Source code for %s</h1>') % modname +
+                     '\n'.join(lines)),
         }
         yield (pagename, context, 'page.html')

     if not modnames:
         return

-    app.builder.info(' _modules/index', nonl=True)
     html = ['\n']
     # the stack logic is needed for using nested lists for submodules
     stack = ['']
@@ -190,8 +202,8 @@ def collect_pages(app):
     html.append('</ul>' * (len(stack) - 1))
     context = {
         'title': _('Overview: module code'),
-        'body': _('<h1>All modules for which code is available</h1>') + \
-            ''.join(html),
+        'body': (_('<h1>All modules for which code is available</h1>') +
+                 ''.join(html)),
     }

     yield ('_modules/index', context, 'page.html')
@@ -200,8 +212,9 @@ def collect_pages(app):
 def setup(app):
     app.add_config_value('viewcode_import', True, False)
     app.connect('doctree-read', doctree_read)
+    app.connect('env-merge-info', env_merge_info)
     app.connect('html-collect-pages', collect_pages)
     app.connect('missing-reference', missing_reference)
-    #app.add_config_value('viewcode_include_modules', [], 'env')
-    #app.add_config_value('viewcode_exclude_modules', [], 'env')
-    return sphinx.__version__
+    # app.add_config_value('viewcode_include_modules', [], 'env')
+    # app.add_config_value('viewcode_exclude_modules', [], 'env')
+    return {'version': sphinx.__version__, 'parallel_read_safe': True}

@@ -24,46 +24,32 @@ from sphinx.util.pycompat import htmlescape
 from sphinx.util.texescape import tex_hl_escape_map_new
 from sphinx.ext import doctest

-try:
-    import pygments
-    from pygments import highlight
-    from pygments.lexers import PythonLexer, PythonConsoleLexer, CLexer, \
-        TextLexer, RstLexer
-    from pygments.lexers import get_lexer_by_name, guess_lexer
-    from pygments.formatters import HtmlFormatter, LatexFormatter
-    from pygments.filters import ErrorToken
-    from pygments.styles import get_style_by_name
-    from pygments.util import ClassNotFound
-    from sphinx.pygments_styles import SphinxStyle, NoneStyle
-except ImportError:
-    pygments = None
-    lexers = None
-    HtmlFormatter = LatexFormatter = None
-else:
-    lexers = dict(
-        none = TextLexer(),
-        python = PythonLexer(),
-        pycon = PythonConsoleLexer(),
-        pycon3 = PythonConsoleLexer(python3=True),
-        rest = RstLexer(),
-        c = CLexer(),
-    )
-    for _lexer in lexers.values():
-        _lexer.add_filter('raiseonerror')
+from pygments import highlight
+from pygments.lexers import PythonLexer, PythonConsoleLexer, CLexer, \
+    TextLexer, RstLexer
+from pygments.lexers import get_lexer_by_name, guess_lexer
+from pygments.formatters import HtmlFormatter, LatexFormatter
+from pygments.filters import ErrorToken
+from pygments.styles import get_style_by_name
+from pygments.util import ClassNotFound
+from sphinx.pygments_styles import SphinxStyle, NoneStyle
+
+lexers = dict(
+    none = TextLexer(),
+    python = PythonLexer(),
+    pycon = PythonConsoleLexer(),
+    pycon3 = PythonConsoleLexer(python3=True),
+    rest = RstLexer(),
+    c = CLexer(),
+)
+for _lexer in lexers.values():
+    _lexer.add_filter('raiseonerror')


 escape_hl_chars = {ord(u'\\'): u'\\PYGZbs{}',
                    ord(u'{'): u'\\PYGZob{}',
                    ord(u'}'): u'\\PYGZcb{}'}

-# used if Pygments is not available
-_LATEX_STYLES = r'''
-\newcommand\PYGZbs{\char`\\}
-\newcommand\PYGZob{\char`\{}
-\newcommand\PYGZcb{\char`\}}
-'''
-
 # used if Pygments is available
 # use textcomp quote to get a true single quote
 _LATEX_ADD_STYLES = r'''
@@ -80,8 +66,6 @@ class PygmentsBridge(object):
     def __init__(self, dest='html', stylename='sphinx',
                  trim_doctest_flags=False):
         self.dest = dest
-        if not pygments:
-            return
         if stylename is None or stylename == 'sphinx':
             style = SphinxStyle
         elif stylename == 'none':
@@ -153,8 +137,6 @@ class PygmentsBridge(object):
     def highlight_block(self, source, lang, warn=None, force=False, **kwargs):
         if not isinstance(source, text_type):
             source = source.decode()
-        if not pygments:
-            return self.unhighlighted(source)

         # find out which lexer to use
         if lang in ('py', 'python'):
@@ -213,11 +195,6 @@ class PygmentsBridge(object):
             return hlsource.translate(tex_hl_escape_map_new)

     def get_stylesheet(self):
-        if not pygments:
-            if self.dest == 'latex':
-                return _LATEX_STYLES
-            # no HTML styles needed
-            return ''
         formatter = self.get_formatter()
         if self.dest == 'html':
             return formatter.get_style_defs('.highlight')

@@ -10,13 +10,16 @@
 """
 from __future__ import print_function

-import sys, os, time, re
+import re
+import os
+import sys
+import time
 from os import path
 from io import open

 TERM_ENCODING = getattr(sys.stdin, 'encoding', None)

-#try to import readline, unix specific enhancement
+# try to import readline, unix specific enhancement
 try:
     import readline
     if readline.__doc__ and 'libedit' in readline.__doc__:
@@ -33,7 +36,7 @@ from docutils.utils import column_width
 from sphinx import __version__
 from sphinx.util.osutil import make_filename
 from sphinx.util.console import purple, bold, red, turquoise, \
     nocolor, color_terminal
 from sphinx.util import texescape

 # function to get input from terminal -- overridden by the test suite
@@ -972,17 +975,20 @@ def mkdir_p(dir):
 class ValidationError(Exception):
     """Raised for validation errors."""

+
 def is_path(x):
     x = path.expanduser(x)
     if path.exists(x) and not path.isdir(x):
         raise ValidationError("Please enter a valid path name.")
     return x

+
 def nonempty(x):
     if not x:
         raise ValidationError("Please enter some text.")
     return x

+
 def choice(*l):
     def val(x):
         if x not in l:
@@ -990,17 +996,20 @@ def choice(*l):
         return x
     return val

+
 def boolean(x):
     if x.upper() not in ('Y', 'YES', 'N', 'NO'):
         raise ValidationError("Please enter either 'y' or 'n'.")
     return x.upper() in ('Y', 'YES')

+
 def suffix(x):
     if not (x[0:1] == '.' and len(x) > 1):
         raise ValidationError("Please enter a file suffix, "
                               "e.g. '.rst' or '.txt'.")
     return x

+
 def ok(x):
     return x

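The quickstart validators above follow one contract: each either returns the (possibly normalized) input or raises `ValidationError`, so `do_prompt` can re-ask on failure. Two of them, restated standalone so the behaviour can be exercised directly:

```python
# Two of the quickstart validators above, restated standalone.
class ValidationError(Exception):
    """Raised for validation errors."""


def boolean(x):
    # Normalizes y/yes/n/no (any case) to a bool; rejects anything else.
    if x.upper() not in ('Y', 'YES', 'N', 'NO'):
        raise ValidationError("Please enter either 'y' or 'n'.")
    return x.upper() in ('Y', 'YES')


def suffix(x):
    # Accepts only a dot-prefixed, non-empty file suffix.
    if not (x[0:1] == '.' and len(x) > 1):
        raise ValidationError("Please enter a file suffix, "
                              "e.g. '.rst' or '.txt'.")
    return x
```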
@@ -1097,7 +1106,7 @@ Enter the root path for documentation.''')
     do_prompt(d, 'path', 'Root path for the documentation', '.', is_path)

     while path.isfile(path.join(d['path'], 'conf.py')) or \
             path.isfile(path.join(d['path'], 'source', 'conf.py')):
         print()
         print(bold('Error: an existing conf.py has been found in the '
                    'selected root path.'))
@@ -1169,7 +1178,7 @@ document is a custom template, you can also set this to another filename.''')
               'index')

     while path.isfile(path.join(d['path'], d['master']+d['suffix'])) or \
             path.isfile(path.join(d['path'], 'source', d['master']+d['suffix'])):
         print()
         print(bold('Error: the master file %s has already been found in the '
                    'selected root path.' % (d['master']+d['suffix'])))
@@ -1256,10 +1265,10 @@ def generate(d, overwrite=True, silent=False):
     d['extensions'] = extensions
     d['copyright'] = time.strftime('%Y') + ', ' + d['author']
     d['author_texescaped'] = text_type(d['author']).\
         translate(texescape.tex_escape_map)
     d['project_doc'] = d['project'] + ' Documentation'
     d['project_doc_texescaped'] = text_type(d['project'] + ' Documentation').\
         translate(texescape.tex_escape_map)

     # escape backslashes and single quotes in strings that are put into
     # a Python string literal

@@ -17,22 +17,23 @@ from docutils.parsers.rst import roles

 from sphinx import addnodes
 from sphinx.locale import _
+from sphinx.errors import SphinxError
 from sphinx.util import ws_re
 from sphinx.util.nodes import split_explicit_title, process_index_entry, \
     set_role_source_info


 generic_docroles = {
-    'command' : addnodes.literal_strong,
-    'dfn' : nodes.emphasis,
-    'kbd' : nodes.literal,
-    'mailheader' : addnodes.literal_emphasis,
-    'makevar' : addnodes.literal_strong,
-    'manpage' : addnodes.literal_emphasis,
-    'mimetype' : addnodes.literal_emphasis,
-    'newsgroup' : addnodes.literal_emphasis,
-    'program' : addnodes.literal_strong,  # XXX should be an x-ref
-    'regexp' : nodes.literal,
+    'command': addnodes.literal_strong,
+    'dfn': nodes.emphasis,
+    'kbd': nodes.literal,
+    'mailheader': addnodes.literal_emphasis,
+    'makevar': addnodes.literal_strong,
+    'manpage': addnodes.literal_emphasis,
+    'mimetype': addnodes.literal_emphasis,
+    'newsgroup': addnodes.literal_emphasis,
+    'program': addnodes.literal_strong,  # XXX should be an x-ref
+    'regexp': nodes.literal,
 }

 for rolename, nodeclass in iteritems(generic_docroles):
@@ -40,6 +41,7 @@ for rolename, nodeclass in iteritems(generic_docroles):
     role = roles.CustomRole(rolename, generic, {'classes': [rolename]})
     roles.register_local_role(rolename, role)

+
 # -- generic cross-reference role ----------------------------------------------

 class XRefRole(object):
@@ -96,7 +98,11 @@ class XRefRole(object):
                  options={}, content=[]):
         env = inliner.document.settings.env
         if not typ:
-            typ = env.config.default_role
+            typ = env.temp_data.get('default_role')
+            if not typ:
+                typ = env.config.default_role
+            if not typ:
+                raise SphinxError('cannot determine default role!')
         else:
             typ = typ.lower()
         if ':' not in typ:
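The change above introduces a fallback chain for resolving the role type: a role set by a `default-role` directive (stored in `temp_data`) wins over the `default_role` config value, and failing both is an error. The same chain extracted as a plain function (the function name and `RuntimeError` stand in for the Sphinx internals):

```python
# Sketch of the default-role fallback chain added above.
# ``resolve_default_role`` is an illustrative name, not Sphinx API;
# RuntimeError stands in for SphinxError.
def resolve_default_role(temp_data, config_default):
    typ = temp_data.get('default_role')
    if not typ:
        typ = config_default
    if not typ:
        raise RuntimeError('cannot determine default role!')
    return typ
```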
@@ -158,6 +164,15 @@ class XRefRole(object):
         return [node], []


+class AnyXRefRole(XRefRole):
+    def process_link(self, env, refnode, has_explicit_title, title, target):
+        result = XRefRole.process_link(self, env, refnode, has_explicit_title,
+                                       title, target)
+        # add all possible context info (i.e. std:program, py:module etc.)
+        refnode.attributes.update(env.ref_context)
+        return result
+
+
 def indexmarkup_role(typ, rawtext, text, lineno, inliner,
                      options={}, content=[]):
     """Role for PEP/RFC references that generate an index entry."""
@ -221,6 +236,7 @@ def indexmarkup_role(typ, rawtext, text, lineno, inliner,
|
|||||||
|
|
||||||
_amp_re = re.compile(r'(?<!&)&(?![&\s])')
|
_amp_re = re.compile(r'(?<!&)&(?![&\s])')
|
||||||
|
|
||||||
|
|
||||||
def menusel_role(typ, rawtext, text, lineno, inliner, options={}, content=[]):
|
def menusel_role(typ, rawtext, text, lineno, inliner, options={}, content=[]):
|
||||||
text = utils.unescape(text)
|
text = utils.unescape(text)
|
||||||
if typ == 'menuselection':
|
if typ == 'menuselection':
|
||||||
@ -246,8 +262,10 @@ def menusel_role(typ, rawtext, text, lineno, inliner, options={}, content=[]):
|
|||||||
node['classes'].append(typ)
|
node['classes'].append(typ)
|
||||||
return [node], []
|
return [node], []
|
||||||
|
|
||||||
|
|
||||||
_litvar_re = re.compile('{([^}]+)}')
|
_litvar_re = re.compile('{([^}]+)}')
|
||||||
|
|
||||||
|
|
||||||
def emph_literal_role(typ, rawtext, text, lineno, inliner,
|
def emph_literal_role(typ, rawtext, text, lineno, inliner,
|
||||||
options={}, content=[]):
|
options={}, content=[]):
|
||||||
text = utils.unescape(text)
|
text = utils.unescape(text)
|
||||||
@ -266,6 +284,7 @@ def emph_literal_role(typ, rawtext, text, lineno, inliner,
|
|||||||
|
|
||||||
_abbr_re = re.compile('\((.*)\)$', re.S)
|
_abbr_re = re.compile('\((.*)\)$', re.S)
|
||||||
|
|
||||||
|
|
||||||
def abbr_role(typ, rawtext, text, lineno, inliner, options={}, content=[]):
|
def abbr_role(typ, rawtext, text, lineno, inliner, options={}, content=[]):
|
||||||
text = utils.unescape(text)
|
text = utils.unescape(text)
|
||||||
m = _abbr_re.search(text)
|
m = _abbr_re.search(text)
|
||||||
@ -311,6 +330,8 @@ specific_docroles = {
|
|||||||
'download': XRefRole(nodeclass=addnodes.download_reference),
|
'download': XRefRole(nodeclass=addnodes.download_reference),
|
||||||
# links to documents
|
# links to documents
|
||||||
'doc': XRefRole(warn_dangling=True),
|
'doc': XRefRole(warn_dangling=True),
|
||||||
|
# links to anything
|
||||||
|
'any': AnyXRefRole(warn_dangling=True),
|
||||||
|
|
||||||
'pep': indexmarkup_role,
|
'pep': indexmarkup_role,
|
||||||
'rfc': indexmarkup_role,
|
'rfc': indexmarkup_role,
|
||||||
|
@@ -522,3 +522,9 @@
 \gdef\@chappos{}
 }
 \fi
+
+% Define literal-block environment
+\RequirePackage{float}
+\floatstyle{plaintop}
+\newfloat{literal-block}{htbp}{loc}[chapter]
+\floatname{literal-block}{List}
sphinx/themes/basic/static/jquery.js (vendored): diff suppressed because the file is too large.
@@ -30,6 +30,7 @@ from sphinx.errors import ThemeError
 NODEFAULT = object()
 THEMECONF = 'theme.conf'
 
+
 class Theme(object):
     """
     Represents the theme chosen in the configuration.
@@ -94,7 +95,8 @@ class Theme(object):
             self.themedir = tempfile.mkdtemp('sxt')
             self.themedir_created = True
             for name in tinfo.namelist():
-                if name.endswith('/'): continue
+                if name.endswith('/'):
+                    continue
                 dirname = path.dirname(name)
                 if not path.isdir(path.join(self.themedir, dirname)):
                     os.makedirs(path.join(self.themedir, dirname))
@@ -34,6 +34,7 @@ default_substitutions = set([
     'today',
 ])
 
+
 class DefaultSubstitutions(Transform):
     """
     Replace some substitutions if they aren't defined in the document.
@@ -69,9 +70,9 @@ class MoveModuleTargets(Transform):
             if not node['ids']:
                 continue
             if ('ismod' in node and
-                node.parent.__class__ is nodes.section and
-                # index 0 is the section title node
-                node.parent.index(node) == 1):
+                    node.parent.__class__ is nodes.section and
+                    # index 0 is the section title node
+                    node.parent.index(node) == 1):
                 node.parent['ids'][0:0] = node['ids']
                 node.parent.remove(node)
@@ -86,10 +87,10 @@ class HandleCodeBlocks(Transform):
         # move doctest blocks out of blockquotes
         for node in self.document.traverse(nodes.block_quote):
             if all(isinstance(child, nodes.doctest_block) for child
                    in node.children):
                 node.replace_self(node.children)
         # combine successive doctest blocks
-        #for node in self.document.traverse(nodes.doctest_block):
-        #    if node not in node.parent.children:
-        #        continue
-        #    parindex = node.parent.index(node)
+        # for node in self.document.traverse(nodes.doctest_block):
+        #     if node not in node.parent.children:
+        #         continue
+        #     parindex = node.parent.index(node)
@@ -173,7 +174,7 @@ class Locale(Transform):
 
         parser = RSTParser()
 
-        #phase1: replace reference ids with translated names
+        # phase1: replace reference ids with translated names
         for node, msg in extract_messages(self.document):
             msgstr = catalog.gettext(msg)
             # XXX add marker to untranslated parts
@@ -198,7 +199,7 @@ class Locale(Transform):
                 pass
             # XXX doctest and other block markup
             if not isinstance(patch, nodes.paragraph):
-                continue # skip for now
+                continue  # skip for now
 
-            processed = False # skip flag
+            processed = False  # skip flag
 
@@ -281,15 +282,14 @@ class Locale(Transform):
                 node.children = patch.children
                 node['translated'] = True
 
-
-        #phase2: translation
+        # phase2: translation
         for node, msg in extract_messages(self.document):
             if node.get('translated', False):
                 continue
 
             msgstr = catalog.gettext(msg)
             # XXX add marker to untranslated parts
             if not msgstr or msgstr == msg:  # as-of-yet untranslated
                 continue
 
             # Avoid "Literal block expected; none found." warnings.
@@ -309,12 +309,13 @@ class Locale(Transform):
                 pass
             # XXX doctest and other block markup
             if not isinstance(patch, nodes.paragraph):
                 continue  # skip for now
 
             # auto-numbered foot note reference should use original 'ids'.
             def is_autonumber_footnote_ref(node):
                 return isinstance(node, nodes.footnote_reference) and \
                     node.get('auto') == 1
+
             def list_replace_or_append(lst, old, new):
                 if old in lst:
                     lst[lst.index(old)] = new
@@ -339,7 +340,7 @@ class Locale(Transform):
                 for id in new['ids']:
                     self.document.ids[id] = new
                 list_replace_or_append(
-                        self.document.autofootnote_refs, old, new)
+                    self.document.autofootnote_refs, old, new)
                 if refname:
                     list_replace_or_append(
                         self.document.footnote_refs.setdefault(refname, []),
@@ -404,6 +405,7 @@ class Locale(Transform):
             if len(old_refs) != len(new_refs):
                 env.warn_node('inconsistent term references in '
                               'translated message', node)
+
             def get_ref_key(node):
                 case = node["refdomain"], node["reftype"]
                 if case == ('std', 'term'):
@@ -29,15 +29,16 @@ from docutils.utils import relative_path
 import jinja2
 
 import sphinx
-from sphinx.errors import PycodeError
+from sphinx.errors import PycodeError, SphinxParallelError
 from sphinx.util.console import strip_colors
+from sphinx.util.osutil import fs_encoding
 
 # import other utilities; partly for backwards compatibility, so don't
 # prune unused ones indiscriminately
 from sphinx.util.osutil import SEP, os_path, relative_uri, ensuredir, walk, \
     mtimes_of_files, movefile, copyfile, copytimes, make_filename, ustrftime
 from sphinx.util.nodes import nested_parse_with_titles, split_explicit_title, \
     explicit_title_re, caption_ref_re
 from sphinx.util.matching import patfilter
 
 # Generally useful regular expressions.
@@ -129,6 +130,11 @@ class FilenameUniqDict(dict):
             del self[filename]
             self._existing.discard(unique)
 
+    def merge_other(self, docnames, other):
+        for filename, (docs, unique) in other.items():
+            for doc in docs & docnames:
+                self.add_file(doc, filename)
+
     def __getstate__(self):
         return self._existing
 
@@ -185,7 +191,11 @@ _DEBUG_HEADER = '''\
 def save_traceback(app):
     """Save the current exception's traceback in a temporary file."""
     import platform
-    exc = traceback.format_exc()
+    exc = sys.exc_info()[1]
+    if isinstance(exc, SphinxParallelError):
+        exc_format = '(Error in parallel process)\n' + exc.traceback
+    else:
+        exc_format = traceback.format_exc()
     fd, path = tempfile.mkstemp('.log', 'sphinx-err-')
     last_msgs = ''
     if app is not None:
@@ -200,11 +210,13 @@ def save_traceback(app):
                        last_msgs)).encode('utf-8'))
     if app is not None:
         for extname, extmod in iteritems(app._extensions):
+            modfile = getattr(extmod, '__file__', 'unknown')
+            if isinstance(modfile, bytes):
+                modfile = modfile.decode(fs_encoding, 'replace')
             os.write(fd, ('# %s (%s) from %s\n' % (
-                extname, app._extension_versions[extname],
-                getattr(extmod, '__file__', 'unknown'))
-            ).encode('utf-8'))
-    os.write(fd, exc.encode('utf-8'))
+                extname, app._extension_metadata[extname]['version'],
+                modfile)).encode('utf-8'))
+    os.write(fd, exc_format.encode('utf-8'))
     os.close(fd)
     return path
 
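The `save_traceback` hunk above stops formatting the exception eagerly and instead inspects `sys.exc_info()`, so that a traceback captured inside a forked worker can be reported verbatim. A minimal standalone sketch of that branch logic, where `ParallelError` and `format_current_exception` are hypothetical stand-ins for `SphinxParallelError` and the in-function code:

```python
import sys
import traceback


class ParallelError(Exception):
    """Stand-in for SphinxParallelError: carries a preformatted traceback."""
    def __init__(self, message, tb):
        Exception.__init__(self, message)
        self.traceback = tb


def format_current_exception():
    # Mirror of the hunk's logic: prefer the child process's saved
    # traceback when the exception came from a parallel worker.
    exc = sys.exc_info()[1]
    if isinstance(exc, ParallelError):
        return '(Error in parallel process)\n' + exc.traceback
    return traceback.format_exc()


try:
    raise ParallelError('worker died', 'Traceback (from worker)...\n')
except ParallelError:
    report = format_current_exception()

print(report.splitlines()[0])   # prints "(Error in parallel process)"
```

The point of the design change is that `traceback.format_exc()` in the parent would only show the `SphinxParallelError` raise site, not the worker's original failure.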
@@ -1,89 +1,89 @@
 # -*- coding: utf-8 -*-
 """
     sphinx.util.i18n
     ~~~~~~~~~~~~~~~~
 
     Builder superclass for all builders.
 
     :copyright: Copyright 2007-2014 by the Sphinx team, see AUTHORS.
     :license: BSD, see LICENSE for details.
 """
 
 from os import path
 from collections import namedtuple
 
 from babel.messages.pofile import read_po
 from babel.messages.mofile import write_mo
 
 from sphinx.util.osutil import walk
 
 
 LocaleFileInfoBase = namedtuple('CatalogInfo', 'base_dir,domain')
 
 
 class CatalogInfo(LocaleFileInfoBase):
 
     @property
     def po_file(self):
         return self.domain + '.po'
 
     @property
     def mo_file(self):
         return self.domain + '.mo'
 
     @property
     def po_path(self):
         return path.join(self.base_dir, self.po_file)
 
     @property
     def mo_path(self):
         return path.join(self.base_dir, self.mo_file)
 
     def is_outdated(self):
         return (
             not path.exists(self.mo_path) or
             path.getmtime(self.mo_path) < path.getmtime(self.po_path))
 
     def write_mo(self, locale):
         with open(self.po_path, 'rt') as po:
             with open(self.mo_path, 'wb') as mo:
                 write_mo(mo, read_po(po, locale))
 
 
 def get_catalogs(locale_dirs, locale, gettext_compact=False, force_all=False):
     """
     :param list locale_dirs:
        list of path as `['locale_dir1', 'locale_dir2', ...]` to find
        translation catalogs. Each path contains a structure such as
        `<locale>/LC_MESSAGES/domain.po`.
     :param str locale: a language as `'en'`
     :param boolean gettext_compact:
        * False: keep domains directory structure (default).
        * True: domains in the sub directory will be merged into 1 file.
     :param boolean force_all:
        Set True if you want to get all catalogs rather than updated catalogs.
        default is False.
     :return: [CatalogInfo(), ...]
     """
     if not locale:
         return []  # locale is not specified
 
     catalogs = set()
     for locale_dir in locale_dirs:
         base_dir = path.join(locale_dir, locale, 'LC_MESSAGES')
 
         if not path.exists(base_dir):
             continue  # locale path is not found
 
         for dirpath, dirnames, filenames in walk(base_dir, followlinks=True):
             filenames = [f for f in filenames if f.endswith('.po')]
             for filename in filenames:
                 base = path.splitext(filename)[0]
                 domain = path.relpath(path.join(dirpath, base), base_dir)
                 if gettext_compact and path.sep in domain:
                     domain = path.split(domain)[0]
                 cat = CatalogInfo(base_dir, domain)
                 if force_all or cat.is_outdated():
                     catalogs.add(cat)
 
     return catalogs
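The `CatalogInfo.is_outdated` method shown above decides whether a `.mo` catalog needs recompiling by comparing file modification times. A standalone sketch of the same comparison; the helper name `catalog_is_outdated` is illustrative, not part of Sphinx:

```python
import os
import tempfile
from os import path


def catalog_is_outdated(po_path, mo_path):
    # The same mtime test CatalogInfo.is_outdated performs:
    # rebuild when the compiled .mo is missing or older than its .po source.
    return (not path.exists(mo_path) or
            path.getmtime(mo_path) < path.getmtime(po_path))


tmpdir = tempfile.mkdtemp()
po = path.join(tmpdir, 'sphinx.po')
mo = path.join(tmpdir, 'sphinx.mo')
open(po, 'w').close()
fresh_missing = catalog_is_outdated(po, mo)   # True: no .mo yet
open(mo, 'w').close()
os.utime(po, (1000, 1000))                    # force an old .po mtime
os.utime(mo, (2000, 2000))                    # .mo is newer -> up to date
fresh_newer = catalog_is_outdated(po, mo)     # False
print(fresh_missing, fresh_newer)             # prints "True False"
```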
@@ -194,3 +194,9 @@ def abspath(pathdir):
     if isinstance(pathdir, bytes):
         pathdir = pathdir.decode(fs_encoding)
     return pathdir
+
+
+def getcwd():
+    if hasattr(os, 'getcwdu'):
+        return os.getcwdu()
+    return os.getcwd()
sphinx/util/parallel.py (new file)
@@ -0,0 +1,131 @@
+# -*- coding: utf-8 -*-
+"""
+    sphinx.util.parallel
+    ~~~~~~~~~~~~~~~~~~~~
+
+    Parallel building utilities.
+
+    :copyright: Copyright 2007-2014 by the Sphinx team, see AUTHORS.
+    :license: BSD, see LICENSE for details.
+"""
+
+import os
+import traceback
+
+try:
+    import multiprocessing
+    import threading
+except ImportError:
+    multiprocessing = threading = None
+
+from six.moves import queue
+
+from sphinx.errors import SphinxParallelError
+
+# our parallel functionality only works for the forking Process
+parallel_available = multiprocessing and (os.name == 'posix')
+
+
+class SerialTasks(object):
+    """Has the same interface as ParallelTasks, but executes tasks directly."""
+
+    def __init__(self, nproc=1):
+        pass
+
+    def add_task(self, task_func, arg=None, result_func=None):
+        if arg is not None:
+            res = task_func(arg)
+        else:
+            res = task_func()
+        if result_func:
+            result_func(res)
+
+    def join(self):
+        pass
+
+
+class ParallelTasks(object):
+    """Executes *nproc* tasks in parallel after forking."""
+
+    def __init__(self, nproc):
+        self.nproc = nproc
+        # list of threads to join when waiting for completion
+        self._taskid = 0
+        self._threads = {}
+        self._nthreads = 0
+        # queue of result objects to process
+        self.result_queue = queue.Queue()
+        self._nprocessed = 0
+        # maps tasks to result functions
+        self._result_funcs = {}
+        # allow only "nproc" worker processes at once
+        self._semaphore = threading.Semaphore(self.nproc)
+
+    def _process(self, pipe, func, arg):
+        try:
+            if arg is None:
+                ret = func()
+            else:
+                ret = func(arg)
+            pipe.send((False, ret))
+        except BaseException as err:
+            pipe.send((True, (err, traceback.format_exc())))
+
+    def _process_thread(self, tid, func, arg):
+        precv, psend = multiprocessing.Pipe(False)
+        proc = multiprocessing.Process(target=self._process,
+                                       args=(psend, func, arg))
+        proc.start()
+        result = precv.recv()
+        self.result_queue.put((tid, arg) + result)
+        proc.join()
+        self._semaphore.release()
+
+    def add_task(self, task_func, arg=None, result_func=None):
+        tid = self._taskid
+        self._taskid += 1
+        self._semaphore.acquire()
+        thread = threading.Thread(target=self._process_thread,
+                                  args=(tid, task_func, arg))
+        thread.setDaemon(True)
+        thread.start()
+        self._nthreads += 1
+        self._threads[tid] = thread
+        self._result_funcs[tid] = result_func or (lambda *x: None)
+        # try processing results already in parallel
+        try:
+            tid, arg, exc, result = self.result_queue.get(False)
+        except queue.Empty:
+            pass
+        else:
+            del self._threads[tid]
+            if exc:
+                raise SphinxParallelError(*result)
+            self._result_funcs.pop(tid)(arg, result)
+            self._nprocessed += 1
+
+    def join(self):
+        while self._nprocessed < self._nthreads:
+            tid, arg, exc, result = self.result_queue.get()
+            del self._threads[tid]
+            if exc:
+                raise SphinxParallelError(*result)
+            self._result_funcs.pop(tid)(arg, result)
+            self._nprocessed += 1
+
+        # there shouldn't be any threads left...
+        for t in self._threads.values():
+            t.join()
+
+
+def make_chunks(arguments, nproc, maxbatch=10):
+    # determine how many documents to read in one go
+    nargs = len(arguments)
+    chunksize = min(nargs // nproc, maxbatch)
+    if chunksize == 0:
+        chunksize = 1
+    nchunks, rest = divmod(nargs, chunksize)
+    if rest:
+        nchunks += 1
+    # partition documents in "chunks" that will be written by one Process
+    return [arguments[i*chunksize:(i+1)*chunksize] for i in range(nchunks)]
@@ -123,6 +123,9 @@ class path(text_type):
         """
         os.unlink(self)
 
+    def utime(self, arg):
+        os.utime(self, arg)
+
     def write_text(self, text, **kwargs):
         """
         Writes the given `text` to the file.
@@ -195,6 +198,9 @@ class path(text_type):
         """
         return self.__class__(os.path.join(self, *map(self.__class__, args)))
 
+    def listdir(self):
+        return os.listdir(self)
+
     __div__ = __truediv__ = joinpath
 
     def __repr__(self):
@@ -3,12 +3,9 @@
 import sys, os
 
 sys.path.append(os.path.abspath('.'))
-sys.path.append(os.path.abspath('..'))
 
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.jsmath', 'sphinx.ext.todo',
-              'sphinx.ext.coverage', 'sphinx.ext.autosummary',
-              'sphinx.ext.doctest', 'sphinx.ext.extlinks',
-              'sphinx.ext.viewcode', 'ext']
+              'sphinx.ext.coverage', 'sphinx.ext.extlinks', 'ext']
 
 jsmath_path = 'dummy.js'
 
@@ -18,7 +15,7 @@ master_doc = 'contents'
 source_suffix = '.txt'
 
 project = 'Sphinx <Tests>'
-copyright = '2010, Georg Brandl & Team'
+copyright = '2010-2014, Georg Brandl & Team'
 # If this is changed, remember to update the versionchanges!
 version = '0.6'
 release = '0.6alpha1'
@@ -34,7 +31,8 @@ html_theme = 'testtheme'
 html_theme_path = ['.']
 html_theme_options = {'testopt': 'testoverride'}
 html_sidebars = {'**': 'customsb.html',
-                 'contents': ['contentssb.html', 'localtoc.html'] }
+                 'contents': ['contentssb.html', 'localtoc.html',
+                              'globaltoc.html']}
 html_style = 'default.css'
 html_static_path = ['_static', 'templated.css_t']
 html_extra_path = ['robots.txt']
@@ -44,15 +42,15 @@ html_context = {'hckey': 'hcval', 'hckey_co': 'wrong_hcval_co'}
 htmlhelp_basename = 'SphinxTestsdoc'
 
 latex_documents = [
     ('contents', 'SphinxTests.tex', 'Sphinx Tests Documentation',
      'Georg Brandl \\and someone else', 'manual'),
 ]
 
 latex_additional_files = ['svgimg.svg']
 
 texinfo_documents = [
     ('contents', 'SphinxTests', 'Sphinx Tests',
      'Georg Brandl \\and someone else', 'Sphinx Testing', 'Miscellaneous'),
 ]
 
 man_pages = [
@@ -65,8 +63,6 @@ value_from_conf_py = 84
 coverage_c_path = ['special/*.h']
 coverage_c_regexes = {'function': r'^PyAPI_FUNC\(.*\)\s+([^_][\w_]+)'}
 
-autosummary_generate = ['autosummary']
-
 extlinks = {'issue': ('http://bugs.python.org/issue%s', 'issue '),
             'pyurl': ('http://python.org/%s', None)}
 
@@ -80,35 +76,13 @@ autodoc_mock_imports = [
 # modify tags from conf.py
 tags.add('confpytag')
 
-# -- linkcode
-
-if 'test_linkcode' in tags:
-    import glob
-
-    extensions.remove('sphinx.ext.viewcode')
-    extensions.append('sphinx.ext.linkcode')
-
-    exclude_patterns.extend(glob.glob('*.txt') + glob.glob('*/*.txt'))
-    exclude_patterns.remove('contents.txt')
-    exclude_patterns.remove('objects.txt')
-
-    def linkcode_resolve(domain, info):
-        if domain == 'py':
-            fn = info['module'].replace('.', '/')
-            return "http://foobar/source/%s.py" % fn
-        elif domain == "js":
-            return "http://foobar/js/" + info['fullname']
-        elif domain in ("c", "cpp"):
-            return "http://foobar/%s/%s" % (domain, "".join(info['names']))
-        else:
-            raise AssertionError()
-
 # -- extension API
 
 from docutils import nodes
 from sphinx import addnodes
 from sphinx.util.compat import Directive
 
 
 def userdesc_parse(env, sig, signode):
     x, y = sig.split(':')
     signode += addnodes.desc_name(x, x)
@@ -116,15 +90,19 @@ def userdesc_parse(env, sig, signode):
     signode[-1] += addnodes.desc_parameter(y, y)
     return x
 
+
 def functional_directive(name, arguments, options, content, lineno,
                          content_offset, block_text, state, state_machine):
     return [nodes.strong(text='from function: %s' % options['opt'])]
 
+
 class ClassDirective(Directive):
     option_spec = {'opt': lambda x: x}
 
     def run(self):
         return [nodes.strong(text='from class: %s' % self.options['opt'])]
 
+
 def setup(app):
     app.add_config_value('value_from_conf_py', 42, False)
     app.add_directive('funcdir', functional_directive, opt=lambda x: x)
|
|||||||
bom
|
bom
|
||||||
math
|
math
|
||||||
autodoc
|
autodoc
|
||||||
autosummary
|
|
||||||
metadata
|
metadata
|
||||||
extensions
|
extensions
|
||||||
doctest
|
|
||||||
extensions
|
extensions
|
||||||
versioning/index
|
|
||||||
footnote
|
footnote
|
||||||
lists
|
lists
|
||||||
|
|
||||||
|
http://sphinx-doc.org/
|
||||||
|
Latest reference <http://sphinx-doc.org/latest/>
|
||||||
Python <http://python.org/>
|
Python <http://python.org/>
|
||||||
|
|
||||||
Indices and tables
|
Indices and tables
|
||||||
@ -44,3 +43,13 @@ References
|
|||||||
|
|
||||||
.. [Ref1] Reference target.
|
.. [Ref1] Reference target.
|
||||||
.. [Ref_1] Reference target 2.
|
.. [Ref_1] Reference target 2.
|
||||||
|
|
||||||
|
Test for issue #1157
|
||||||
|
====================
|
||||||
|
|
||||||
|
This used to crash:
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
:hidden:
|
||||||
|
@@ -142,6 +142,7 @@ Adding \n to test unescaping.
 * :ref:`here <some-label>`
 * :ref:`my-figure`
 * :ref:`my-table`
+* :ref:`my-code-block`
 * :doc:`subdir/includes`
 * ``:download:`` is tested in includes.txt
 * :option:`Python -c option <python -c>`
@@ -228,8 +229,11 @@ Version markup
 Code blocks
 -----------
 
+.. _my-code-block:
+
 .. code-block:: ruby
    :linenos:
+   :caption: my ruby code
 
    def ruby?
       false
@@ -356,6 +360,25 @@ Only directive
 Always present, because set through conf.py/command line.
 
 
+Any role
+--------
+
+.. default-role:: any
+
+Test referencing to `headings <with>` and `objects <func_without_body>`.
+Also `modules <mod>` and `classes <Time>`.
+
+More domains:
+
+* `JS <bar.baz>`
+* `C <SphinxType>`
+* `myobj` (user markup)
+* `n::Array`
+* `perl -c`
+
+.. default-role::
+
+
 .. rubric:: Footnotes
 
 .. [#] Like footnotes.
@@ -170,6 +170,10 @@ Others
 
 .. cmdoption:: -c
 
+.. option:: +p
+
+Link to :option:`perl +p`.
+
 
 User markup
 ===========
tests/root/undecodable.txt (new file, 3 lines)
@@ -0,0 +1,3 @@
+:orphan:
+
+here: »
@@ -1,80 +1,80 @@
 # -*- coding: utf-8 -*-
 ## set this by test
 # import os
 # import sys
 # sys.path.insert(0, os.path.abspath('.'))
 
 from sphinx.writers.html import HTMLTranslator
 from sphinx.writers.latex import LaTeXTranslator
 from sphinx.writers.manpage import ManualPageTranslator
 from sphinx.writers.texinfo import TexinfoTranslator
 from sphinx.writers.text import TextTranslator
 from sphinx.writers.websupport import WebSupportTranslator
 from docutils.writers.docutils_xml import XMLTranslator
 
 
 project = 'test'
 master_doc = 'index'
 
 
 class ConfHTMLTranslator(HTMLTranslator):
     pass
 
 
 class ConfDirHTMLTranslator(HTMLTranslator):
     pass
 
 
 class ConfSingleHTMLTranslator(HTMLTranslator):
     pass
 
 
 class ConfPickleTranslator(HTMLTranslator):
     pass
 
 
 class ConfJsonTranslator(HTMLTranslator):
     pass
 
 
 class ConfLaTeXTranslator(LaTeXTranslator):
     pass
 
 
 class ConfManualPageTranslator(ManualPageTranslator):
     pass
 
 
 class ConfTexinfoTranslator(TexinfoTranslator):
     pass
 
 
 class ConfTextTranslator(TextTranslator):
     pass
 
 
 class ConfWebSupportTranslator(WebSupportTranslator):
     pass
 
 
 class ConfXMLTranslator(XMLTranslator):
     pass
 
 
 class ConfPseudoXMLTranslator(XMLTranslator):
     pass
 
 
 def setup(app):
     app.set_translator('html', ConfHTMLTranslator)
     app.set_translator('dirhtml', ConfDirHTMLTranslator)
     app.set_translator('singlehtml', ConfSingleHTMLTranslator)
     app.set_translator('pickle', ConfPickleTranslator)
     app.set_translator('json', ConfJsonTranslator)
     app.set_translator('latex', ConfLaTeXTranslator)
     app.set_translator('man', ConfManualPageTranslator)
     app.set_translator('texinfo', ConfTexinfoTranslator)
     app.set_translator('text', ConfTextTranslator)
     app.set_translator('websupport', ConfWebSupportTranslator)
     app.set_translator('xml', ConfXMLTranslator)
     app.set_translator('pseudoxml', ConfPseudoXMLTranslator)
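As context for the conf.py fixture above: each `app.set_translator(builder_name, translator_class)` call registers one translator class per builder name, and the last registration for a name wins. The registry pattern being exercised can be sketched with the standard library only (the `Registry`, `DefaultTranslator`, and `HTMLOverride` names here are illustrative, not Sphinx internals):

```python
# Minimal sketch of a builder-name -> translator-class override table,
# assuming dict-backed storage like the one set_translator() populates.
# Class names are hypothetical; only the lookup pattern mirrors the fixture.

class DefaultTranslator:
    pass

class HTMLOverride(DefaultTranslator):
    pass

class Registry:
    def __init__(self):
        self._overrides = {}

    def set_translator(self, builder_name, translator_class):
        # Last registration for a builder name wins.
        self._overrides[builder_name] = translator_class

    def get_translator(self, builder_name):
        # Fall back to the default when no override was registered.
        return self._overrides.get(builder_name, DefaultTranslator)

registry = Registry()
registry.set_translator('html', HTMLOverride)
print(registry.get_translator('html').__name__)   # -> HTMLOverride
print(registry.get_translator('latex').__name__)  # -> DefaultTranslator
```

The fixture registers one override per built-in builder precisely so a test can assert that every builder picks up its configured class.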
@@ -1,3 +1,3 @@
 =======================
 Test API set_translator
 =======================
@@ -1,9 +1,9 @@
 # -*- coding: utf-8 -*-
 
 import os
 import sys
 
 sys.path.insert(0, os.path.dirname(os.path.abspath('.')))
 
 project = 'test'
 master_doc = 'index'
@@ -1,6 +1,6 @@
 # -*- coding: utf-8 -*-
 
 from sphinx.writers.html import HTMLTranslator
 
 class ExtHTMLTranslator(HTMLTranslator):
     pass
@@ -1,3 +1,7 @@
+import sys, os
+
+sys.path.insert(0, os.path.abspath('.'))
+
 extensions = ['sphinx.ext.autosummary']
 
 # The suffix of source filenames.
@@ -1,6 +1,7 @@
 
 .. autosummary::
    :nosignatures:
    :toctree:
 
    dummy_module
+   sphinx
tests/roots/test-build-text/conf.py (new file, 2 lines)
@@ -0,0 +1,2 @@
+master_doc = 'contents'
+source_suffix = '.txt'

tests/roots/test-build-text/contents.txt (new file, 8 lines)
@@ -0,0 +1,8 @@
+.. toctree::
+
+   maxwidth
+   lineblock
+   nonascii_title
+   nonascii_table
+   nonascii_maxwidth
+   table

tests/roots/test-build-text/lineblock.txt (new file, 6 lines)
@@ -0,0 +1,6 @@
+* one
+
+  | line-block 1
+  | line-block 2
+
+followed paragraph.

tests/roots/test-build-text/maxwidth.txt (new file, 6 lines)
@@ -0,0 +1,6 @@
+.. seealso:: ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham
+
+* ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham
+* ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham ham
+
+spam egg

tests/roots/test-build-text/nonascii_maxwidth.txt (new file, 5 lines)
@@ -0,0 +1,5 @@
+abc abc abc abc abc abc abc abc abc abc abc abc abc abc abc abc abc abc abc abc abc abc abc
+
+日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語 日本語
+
+abc 日本語 abc 日本語 abc 日本語 abc 日本語 abc 日本語 abc 日本語 abc 日本語 abc 日本語 abc 日本語 abc 日本語 abc 日本語

tests/roots/test-build-text/nonascii_table.txt (new file, 7 lines)
@@ -0,0 +1,7 @@
+.. list-table::
+
+   - - spam
+     - egg
+
+   - - 日本語
+     - 日本語

tests/roots/test-build-text/nonascii_title.txt (new file, 2 lines)
@@ -0,0 +1,2 @@
+日本語
+======

tests/roots/test-build-text/table.txt (new file, 7 lines)
@@ -0,0 +1,7 @@
++-----+-----+
+| XXX | XXX |
++-----+-----+
+|     | XXX |
++-----+-----+
+| XXX |     |
++-----+-----+

tests/roots/test-circular/conf.py (new file, empty)

tests/roots/test-circular/contents.rst (new file, 4 lines)
@@ -0,0 +1,4 @@
+.. toctree::
+
+   sub
+

tests/roots/test-circular/sub.rst (new file, 3 lines)
@@ -0,0 +1,3 @@
+.. toctree::
+
+   contents
@@ -1,22 +1,35 @@
 Dedent
 ======
 
-Code blocks
------------
-
-.. code-block:: ruby
-   :linenos:
-   :dedent: 4
-
-   def ruby?
-      false
-   end
-
-
 Literal Include
 ---------------
 
+.. literalinclude:: literal.inc
+   :language: python
+   :lines: 10-11
+   :dedent: 0
+
+.. literalinclude:: literal.inc
+   :language: python
+   :lines: 10-11
+   :dedent: 1
+
+.. literalinclude:: literal.inc
+   :language: python
+   :lines: 10-11
+   :dedent: 2
+
+.. literalinclude:: literal.inc
+   :language: python
+   :lines: 10-11
+   :dedent: 3
+
 .. literalinclude:: literal.inc
    :language: python
    :lines: 10-11
    :dedent: 4
+
+.. literalinclude:: literal.inc
+   :language: python
+   :lines: 10-11
+   :dedent: 1000
tests/roots/test-directive-code/dedent_code.rst (new file, 53 lines)
@@ -0,0 +1,53 @@
+Dedent
+======
+
+Code blocks
+-----------
+
+.. code-block:: ruby
+   :linenos:
+   :dedent: 0
+
+   def ruby?
+      false
+   end
+
+.. code-block:: ruby
+   :linenos:
+   :dedent: 1
+
+   def ruby?
+      false
+   end
+
+.. code-block:: ruby
+   :linenos:
+   :dedent: 2
+
+   def ruby?
+      false
+   end
+
+.. code-block:: ruby
+   :linenos:
+   :dedent: 3
+
+   def ruby?
+      false
+   end
+
+.. code-block:: ruby
+   :linenos:
+   :dedent: 4
+
+   def ruby?
+      false
+   end
+
+.. code-block:: ruby
+   :linenos:
+   :dedent: 1000
+
+   def ruby?
+      false
+   end
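The fixtures above exercise a ``:dedent: N`` option on code blocks and literal includes, including a deliberately oversized value (1000). A minimal standalone sketch of that kind of transform, under the assumption that N leading columns are stripped per line and an oversized N simply empties each line rather than raising:

```python
def dedent_lines(lines, dedent):
    # Strip up to `dedent` leading columns from every line. Slicing past the
    # end of a short line yields '' instead of raising, which is one plausible
    # way a :dedent: 1000 input stays harmless (an assumption, not a claim
    # about Sphinx's exact implementation).
    return [line[dedent:] for line in lines]

code = ["    def ruby?\n", "       false\n", "    end\n"]
print(dedent_lines(code, 4))     # 4 leading columns removed from each line
print(dedent_lines(code, 1000))  # every line reduced to the empty string
```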
tests/roots/test-doctest/conf.py (new file, 5 lines)
@@ -0,0 +1,5 @@
+extensions = ['sphinx.ext.doctest']
+
+project = 'test project for doctest'
+master_doc = 'doctest.txt'
+source_suffix = '.txt'

@@ -125,5 +125,5 @@ Special directives
 
 .. testcleanup:: *
 
-   import test_doctest
-   test_doctest.cleanup_call()
+   import test_ext_doctest
+   test_ext_doctest.cleanup_call()
@@ -1,15 +1,15 @@
 docutils conf
 =============
 
 field-name-limit
 ----------------
 
 :short: desc
 :long long long long: long title
 
 option-limit
 ------------
 
 --short short desc
 --long-long-long-long long desc
 
@@ -1,8 +1,24 @@
 # -*- coding: utf-8 -*-
 
 import sys
 import os
 
 sys.path.insert(0, os.path.abspath('.'))
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.viewcode']
 master_doc = 'index'
+
+
+if 'test_linkcode' in tags:
+    extensions.remove('sphinx.ext.viewcode')
+    extensions.append('sphinx.ext.linkcode')
+
+    def linkcode_resolve(domain, info):
+        if domain == 'py':
+            fn = info['module'].replace('.', '/')
+            return "http://foobar/source/%s.py" % fn
+        elif domain == "js":
+            return "http://foobar/js/" + info['fullname']
+        elif domain in ("c", "cpp"):
+            return "http://foobar/%s/%s" % (domain, "".join(info['names']))
+        else:
+            raise AssertionError()
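The conf.py hunk above defines a ``linkcode_resolve`` callback, which ``sphinx.ext.linkcode`` calls to map a documented object to a source URL. Its per-domain mapping can be exercised standalone (the ``foobar`` URLs are the fixture's own placeholders, not real endpoints):

```python
def linkcode_resolve(domain, info):
    # Same mapping as the test fixture: pick a URL scheme per domain.
    if domain == 'py':
        # Python objects: turn the dotted module path into a file path.
        fn = info['module'].replace('.', '/')
        return "http://foobar/source/%s.py" % fn
    elif domain == "js":
        return "http://foobar/js/" + info['fullname']
    elif domain in ("c", "cpp"):
        return "http://foobar/%s/%s" % (domain, "".join(info['names']))
    else:
        raise AssertionError()

print(linkcode_resolve('py', {'module': 'spam.mod1'}))
# -> http://foobar/source/spam/mod1.py
print(linkcode_resolve('js', {'fullname': 'bar.baz'}))
# -> http://foobar/js/bar.baz
```

Returning a URL (or ``None`` to skip an object) is the whole contract; the fixture raises for unknown domains so the test suite notices unexpected calls.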
@@ -1,29 +1,34 @@
 viewcode
 ========
 
 .. py:module:: spam
 
 .. autofunction:: func1
 
 .. autofunction:: func2
 
 .. autofunction:: spam.mod1.func1
 
 .. autofunction:: spam.mod2.func2
 
 .. autofunction:: Class1
 
 .. autofunction:: Class2
 
 .. autofunction:: spam.mod1.Class1
 
 .. autofunction:: spam.mod2.Class2
 
 
 .. literalinclude:: spam/__init__.py
    :language: python
    :pyobject: func1
 
 .. literalinclude:: spam/mod1.py
    :language: python
    :pyobject: func1
+
+
+.. toctree::
+
+   objects
tests/roots/test-ext-viewcode/objects.rst (new file, 169 lines)
@@ -0,0 +1,169 @@
+Testing object descriptions
+===========================
+
+.. function:: func_without_module(a, b, *c[, d])
+
+   Does something.
+
+.. function:: func_without_body()
+
+.. function:: func_noindex
+   :noindex:
+
+.. function:: func_with_module
+   :module: foolib
+
+Referring to :func:`func with no index <func_noindex>`.
+Referring to :func:`nothing <>`.
+
+.. module:: mod
+   :synopsis: Module synopsis.
+   :platform: UNIX
+
+.. function:: func_in_module
+
+.. class:: Cls
+
+   .. method:: meth1
+
+   .. staticmethod:: meths
+
+   .. attribute:: attr
+
+.. explicit class given
+.. method:: Cls.meth2
+
+.. explicit module given
+.. exception:: Error(arg1, arg2)
+   :module: errmod
+
+.. data:: var
+
+
+.. currentmodule:: None
+
+.. function:: func_without_module2() -> annotation
+
+.. object:: long(parameter, \
+            list)
+   another one
+
+.. class:: TimeInt
+
+   Has only one parameter (triggers special behavior...)
+
+   :param moo: |test|
+   :type moo: |test|
+
+.. |test| replace:: Moo
+
+.. class:: Time(hour, minute, isdst)
+
+   :param year: The year.
+   :type year: TimeInt
+   :param TimeInt minute: The minute.
+   :param isdst: whether it's DST
+   :type isdst: * some complex
+                * expression
+   :returns: a new :class:`Time` instance
+   :rtype: :class:`Time`
+   :raises ValueError: if the values are out of range
+   :ivar int hour: like *hour*
+   :ivar minute: like *minute*
+   :vartype minute: int
+   :param hour: Some parameter
+   :type hour: DuplicateType
+   :param hour: Duplicate param. Should not lead to crashes.
+   :type hour: DuplicateType
+   :param .Cls extcls: A class from another module.
+
+
+C items
+=======
+
+.. c:function:: Sphinx_DoSomething()
+
+.. c:member:: SphinxStruct.member
+
+.. c:macro:: SPHINX_USE_PYTHON
+
+.. c:type:: SphinxType
+
+.. c:var:: sphinx_global
+
+
+Javascript items
+================
+
+.. js:function:: foo()
+
+.. js:data:: bar
+
+.. documenting the method of any object
+.. js:function:: bar.baz(href, callback[, errback])
+
+   :param string href: The location of the resource.
+   :param callback: Get's called with the data returned by the resource.
+   :throws InvalidHref: If the `href` is invalid.
+   :returns: `undefined`
+
+.. js:attribute:: bar.spam
+
+References
+==========
+
+Referencing :class:`mod.Cls` or :Class:`mod.Cls` should be the same.
+
+With target: :c:func:`Sphinx_DoSomething()` (parentheses are handled),
+:c:member:`SphinxStruct.member`, :c:macro:`SPHINX_USE_PYTHON`,
+:c:type:`SphinxType *` (pointer is handled), :c:data:`sphinx_global`.
+
+Without target: :c:func:`CFunction`. :c:func:`!malloc`.
+
+:js:func:`foo()`
+:js:func:`foo`
+
+:js:data:`bar`
+:js:func:`bar.baz()`
+:js:func:`bar.baz`
+:js:func:`~bar.baz()`
+
+:js:attr:`bar.baz`
+
+
+Others
+======
+
+.. envvar:: HOME
+
+.. program:: python
+
+.. cmdoption:: -c command
+
+.. program:: perl
+
+.. cmdoption:: -c
+
+.. option:: +p
+
+Link to :option:`perl +p`.
+
+
+User markup
+===========
+
+.. userdesc:: myobj:parameter
+
+   Description of userdesc.
+
+
+Referencing :userdescrole:`myobj`.
+
+
+CPP domain
+==========
+
+.. cpp:class:: n::Array<T,d>
+
+   .. cpp:function:: T& operator[]( unsigned j )
+                     const T& operator[]( unsigned j ) const
@@ -1,7 +1,7 @@
 from __future__ import absolute_import
 
 from .mod1 import func1, Class1
 from .mod2 import (
     func2,
     Class2,
 )
@@ -1,15 +1,15 @@
 """
 mod1
 """
 
 def func1(a, b):
     """
     this is func1
     """
     return a, b
 
 
 class Class1(object):
     """
     this is Class1
     """
|
|||||||
"""
|
"""
|
||||||
mod2
|
mod2
|
||||||
"""
|
"""
|
||||||
|
|
||||||
def func2(a, b):
|
def func2(a, b):
|
||||||
"""
|
"""
|
||||||
this is func2
|
this is func2
|
||||||
"""
|
"""
|
||||||
return a, b
|
return a, b
|
||||||
|
|
||||||
|
|
||||||
class Class2(object):
|
class Class2(object):
|
||||||
"""
|
"""
|
||||||
this is Class2
|
this is Class2
|
||||||
"""
|
"""
|
||||||
|
@ -1,15 +1,15 @@
|
|||||||
:tocdepth: 2
|
:tocdepth: 2
|
||||||
|
|
||||||
i18n with python domain refs
|
i18n with python domain refs
|
||||||
=============================
|
=============================
|
||||||
|
|
||||||
.. currentmodule:: sensitive
|
.. currentmodule:: sensitive
|
||||||
|
|
||||||
See this decorator: :func:`sensitive_variables`.
|
See this decorator: :func:`sensitive_variables`.
|
||||||
|
|
||||||
.. function:: sensitive_variables(*variables)
|
.. function:: sensitive_variables(*variables)
|
||||||
|
|
||||||
Some description
|
Some description
|
||||||
|
|
||||||
.. currentmodule:: reporting
|
.. currentmodule:: reporting
|
||||||
|
|
||||||
|
Some files were not shown because too many files have changed in this diff.