Merge branch 'master' into 3638_eqref_format
Commit af02bfa7b3
CHANGES (88 lines changed)
@@ -10,33 +10,103 @@ Deprecated

Features added
--------------

* C++, handle ``decltype(auto)``.
* #2406: C++, add proper parsing of expressions, including linking of identifiers.
* C++, add a ``cpp:expr`` role for inserting inline C++ expressions or types.
* #3638: Allow changing the text of references to equations via
  ``math_eqref_format``.
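A minimal ``conf.py`` sketch of the new option; the ``"Eq.{number}"`` format string and its placeholder name are assumptions for illustration, not taken from this diff:

.. code-block:: python

   # conf.py -- hypothetical use of the option added for #3638.
   # "{number}" is assumed to be replaced with the equation number
   # whenever an :eq: reference is rendered.
   math_eqref_format = "Eq.{number}"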
Features removed
----------------

* Configuration variables

  - html_use_smartypants
  - latex_keep_old_macro_names
  - latex_elements['footer']

* utility methods of the ``sphinx.application.Sphinx`` class (see the
  migration sketch after this list)

  - buildername (property)
  - _display_chunk()
  - old_status_iterator()
  - status_iterator()
  - _directive_helper()

* utility methods of the ``sphinx.environment.BuildEnvironment`` class

  - currmodule (property)
  - currclass (property)

* epub2 builder
* prefix and colorfunc parameters for warn()
* ``sphinx.util.compat`` module
* ``sphinx.util.nodes.process_only_nodes()``
* LaTeX environment ``notice``, use ``sphinxadmonition`` instead
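For extensions that still call the removed application-level helpers, a rough migration sketch; the replacement locations come from the deprecation messages elsewhere in this diff, while the loop itself is hypothetical:

.. code-block:: python

   # Hypothetical migration away from the removed app.status_iterator().
   from sphinx.util import status_iterator

   def read_all(app, docnames):
       # The module-level helper takes the iterable, a summary string,
       # and an optional length used for the progress display.
       for docname in status_iterator(docnames, 'reading doctrees... ',
                                      length=len(docnames)):
           doctree = app.env.get_doctree(docname)  # per-document work here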
Bugs fixed
----------

Testing
--------

Release 1.6 beta2 (in development)
Release 1.6 beta3 (in development)
==================================

Incompatible changes
--------------------

* LaTeX package ``eqparbox`` is not used and not loaded by Sphinx anymore
* LaTeX package ``multirow`` is not used and not loaded by Sphinx anymore
  (projects that relied on either package can load it themselves, as sketched
  below)
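If a project depended on Sphinx loading those LaTeX packages, it can load them explicitly. A minimal sketch; the ``latex_elements['preamble']`` key is the standard hook, and the package list simply mirrors the entries above:

.. code-block:: python

   # conf.py -- load the packages Sphinx no longer pulls in itself.
   latex_elements = {
       'preamble': r'\usepackage{multirow}\usepackage{eqparbox}',
   }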
Deprecated
----------

Features added
--------------

* ``LATEXMKOPTS`` variable for the Makefile in ``$BUILDDIR/latex`` to pass
  options to ``latexmk`` when executing ``make latexpdf``. Default is ``-f``.
  (refs #3695)

Bugs fixed
----------

* #3588: No compact (p tag) html output in the i18n document build even when
  :confval:`html_compact_lists` is True.
* The ``make latexpdf`` from 1.6b1 (for GNU/Linux and Mac OS, using
  ``latexmk``) aborted earlier in case of LaTeX errors than was the case with
  the 1.5 series, due to hard-coded usage of the ``--halt-on-error`` option.
  (refs #3695)
* #3683: sphinx.websupport module is not provided by default
* #3683: Failed to build document if builder.css_file.insert() is called
* #3714: viewcode extension not taking ``highlight_code='none'`` into account
* #3698: Moving :doc: to std domain broke backwards compatibility

Testing
--------

Release 1.6 beta2 (released Apr 29, 2017)
=========================================

Incompatible changes
--------------------

* #3345: Replace the custom smartypants code with Docutils' smart_quotes.
  Thanks to Dmitry Shachnev, and to Günter Milde at Docutils.

Deprecated
----------

* #3662: ``builder.css_files`` is deprecated. Please use the
  ``add_stylesheet()`` API instead (a short sketch follows this list).
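A minimal sketch of the replacement API named in this entry; the extension and stylesheet name are hypothetical:

.. code-block:: python

   # A tiny extension that registers a stylesheet through the public API
   # instead of appending to builder.css_files directly.
   def setup(app):
       app.add_stylesheet('my_extension.css')
       return {'version': '0.1', 'parallel_read_safe': True}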
Bugs fixed
----------

* #3661: sphinx-build crashes on parallel build
* #3669: gettext builder fails with "ValueError: substring not found"
* #3660: Sphinx always depends on sphinxcontrib-websupport and its dependencies

Release 1.6 beta1 (released Apr 24, 2017)
=========================================

@@ -229,6 +299,15 @@ Features added

Bugs fixed
----------

* #3614: Sphinx crashes with requests-2.5.0
* #3618: autodoc crashes with tupled arguments
* #3664: No space after the bullet in items of a latex list produced by Sphinx
* #3657: EPUB builder crashes if a document starting with genindex exists
* #3588: No compact (p tag) html output in the i18n document build even when
  :confval:`html_compact_lists` is True.
* #3685: AttributeError when using 3rd party domains
* #3702: LaTeX writer styles figure legends with a hard-coded ``\small``

Testing
--------

@@ -428,10 +507,8 @@ Incompatible changes

* QtHelpBuilder doesn't generate search page (ref: #2352)
* QtHelpBuilder uses ``nonav`` theme instead of default one
  to improve readability.
* latex: To provide good default settings to Japanese docs, Sphinx uses ``jsbook``
  as a docclass by default if the ``language`` is ``ja``.
* latex: To provide good default settings to Japanese docs, Sphinx uses
  ``jreport`` and ``jsbooks`` as a docclass by default if the ``language`` is
* latex: To provide good default settings to Japanese documents, Sphinx uses
  ``jreport`` and ``jsbook`` as docclass if :confval:`language` is
  ``ja``.
* ``sphinx-quickstart`` now allows a project version to be empty
* Fix :download: role on epub/qthelp builder. They ignore the role because they don't support it.

@@ -471,6 +548,7 @@ Incompatible changes

* Emit warnings that will be deprecated in Sphinx 1.6 by default.
  Users can change the behavior by setting the environment variable
  PYTHONWARNINGS. Please refer to :ref:`when-deprecation-warnings-are-displayed`.
* #2454: new JavaScript variable ``SOURCELINK_SUFFIX`` is added

Deprecated
----------
Makefile (1 line changed)
@@ -64,6 +64,7 @@ clean-backupfiles:

clean-generated:
	find . -name '.DS_Store' -exec rm -f {} +
	rm -rf Sphinx.egg-info/
	rm -rf doc/_build/
	rm -f sphinx/pycode/*.pickle
	rm -f utils/*3.py*
doc/_static/conf.py.txt (11 lines changed)
@@ -167,11 +167,6 @@ html_static_path = ['_static']

#
# html_last_updated_fmt = None

# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#
# html_use_smartypants = True

# Custom sidebar templates, maps document names to template names.
#
# html_sidebars = {}

@@ -280,12 +275,6 @@ latex_documents = [

#
# latex_appendices = []

# If false, will not define \strong, \code, \titleref, \crossref ... but only
# \sphinxstrong, ..., \sphinxtitleref, ... to help avoid clash with user added
# packages.
#
# latex_keep_old_macro_names = True

# If false, no module index is generated.
#
# latex_domain_indices = True
doc/_templates/indexsidebar.html (2 lines changed)
@@ -3,7 +3,7 @@

{%trans%}project{%endtrans%}</p>

<h3>Download</h3>
{% if version.endswith('a0') %}
{% if version.endswith('+') %}
<p>{%trans%}This documentation is for version <b><a href="changes.html">{{ version }}</a></b>, which is
not released yet.{%endtrans%}</p>
<p>{%trans%}You can use it from the
@@ -125,26 +125,6 @@ The builder's "name" must be given to the **-b** command-line option of

   .. autoattribute:: supported_image_types

.. module:: sphinx.builders.epub2
.. class:: Epub2Builder

   This builder produces the same output as the standalone HTML builder, but
   also generates an *epub* file for ebook readers. See :ref:`epub-faq` for
   details about it. For a definition of the epub format, have a look at
   `<http://idpf.org/epub>`_ or `<https://en.wikipedia.org/wiki/EPUB>`_.
   The builder creates *EPUB 2* files.

   .. autoattribute:: name

   .. autoattribute:: format

   .. autoattribute:: supported_image_types

   .. deprecated:: 1.5

      Since Sphinx 1.5, the epub3 builder is used as the default epub builder.
      The old EpubBuilder has been renamed to epub2.

.. module:: sphinx.builders.epub3
.. class:: Epub3Builder

@@ -202,6 +182,31 @@ The builder's "name" must be given to the **-b** command-line option of

   .. versionchanged:: 1.6
      Use of ``latexmk`` on GNU/Linux or Mac OS X.

   Since 1.6, ``make latexpdf`` (or ``make -C "<builddir>/latex"`` after a
   ``sphinx-build`` run) uses ``latexmk`` (on GNU/Linux and Mac OS X).
   It invokes ``latexmk`` with the ``-f`` option, which attempts to complete
   targets even in case of LaTeX processing errors. This can be overridden via
   the ``LATEXMKOPTS`` variable, for example:

   .. code-block:: console

      make latexpdf LATEXMKOPTS=""

   The ``pdflatex`` calls themselves obey the ``LATEXOPTS`` variable, whose
   default is ``--interaction=nonstopmode`` (same as ``-interaction
   nonstopmode``). To stop the compilation at the first error, use
   ``--halt-on-error``.

   Example:

   .. code-block:: console

      make latexpdf LATEXMKOPTS="-silent" LATEXOPTS="--halt-on-error"

   In case the first ``pdflatex`` run aborts with an error, this will stop
   further ``latexmk`` processing (no ``-f`` option). The console output
   will be kept to a bare minimum during target processing (``-silent``).

   .. autoattribute:: name

   .. autoattribute:: format
@@ -16,7 +16,7 @@ exclude_patterns = ['_build']

project = 'Sphinx'
copyright = '2007-2017, Georg Brandl and the Sphinx team'
version = sphinx.__released__
version = sphinx.__display_version__
release = version
show_authors = True
@@ -765,12 +765,6 @@ that use Sphinx's HTMLWriter class.

   The empty string is equivalent to ``'%b %d, %Y'`` (or a
   locale-dependent equivalent).

.. confval:: html_use_smartypants

   If true, `SmartyPants <http://daringfireball.net/projects/smartypants/>`_
   will be used to convert quotes and dashes to typographically correct
   entities. Default: ``True``.

.. confval:: html_add_permalinks

   Sphinx will add "permalinks" for each heading and description environment as
@@ -1615,26 +1609,6 @@ These options influence LaTeX output. See further :doc:`latex`.

   value selected the ``'inline'`` display. For backwards compatibility,
   ``True`` is still accepted.

.. confval:: latex_keep_old_macro_names

   If ``True``, the ``\strong``, ``\code``, ``\bfcode``, ``\email``,
   ``\tablecontinued``, ``\titleref``, ``\menuselection``, ``\accelerator``,
   ``\crossref``, ``\termref``, and ``\optional`` text styling macros are
   pre-defined by Sphinx and may be user-customized by ``\renewcommand``
   definitions inserted either via the ``'preamble'`` key or the :dudir:`raw
   <raw-data-pass-through>` directive. If ``False``, only ``\sphinxstrong``,
   etc. macros are defined (and may be redefined by the user).

   The default is ``False`` as it prevents macro name conflicts caused by
   LaTeX packages. For example (``lualatex`` or ``xelatex``) ``fontspec v2.6``
   has its own ``\strong`` macro.

   .. versionadded:: 1.4.5
   .. versionchanged:: 1.6
      Default was changed from ``True`` to ``False``.
   .. deprecated:: 1.6
      This setting will be removed in Sphinx 1.7.

.. confval:: latex_use_latex_multicolumn

   If ``False`` (default), the LaTeX writer uses for merged cells in grid
@@ -563,7 +563,7 @@ a visibility statement (``public``, ``private`` or ``protected``).

   .. cpp:class:: OuterScope::MyClass : public MyBase, MyOtherBase

A template class can be declared::
A class template can be declared::

   .. cpp:class:: template<typename T, std::size_t N> std::array

@@ -728,6 +728,17 @@ a visibility statement (``public``, ``private`` or ``protected``).

   Proxy to an element of a notional sequence that can be compared,
   indirected, or incremented.

   **Notation**

   .. cpp:var:: It r

      An lvalue.

   **Valid Expressions**

   - :cpp:expr:`*r`, when :cpp:expr:`r` is dereferenceable.
   - :cpp:expr:`++r`, with return type :cpp:expr:`It&`, when :cpp:expr:`r` is incrementable.

.. cpp:concept:: template<typename Cont> std::Container()

   Holder of elements, to which it can provide access via

@@ -740,6 +751,17 @@ a visibility statement (``public``, ``private`` or ``protected``).

   Proxy to an element of a notional sequence that can be compared,
   indirected, or incremented.

   **Notation**

   .. cpp:var:: It r

      An lvalue.

   **Valid Expressions**

   - :cpp:expr:`*r`, when :cpp:expr:`r` is dereferenceable.
   - :cpp:expr:`++r`, with return type :cpp:expr:`It&`, when :cpp:expr:`r` is incrementable.

.. cpp:concept:: template<typename Cont> std::Container()

   Holder of elements, to which it can provide access via

@@ -807,6 +829,30 @@ compatibility. E.g., ``Iterator{A, B, C}`` will be accepted as an introduction
even though it would not be valid C++.


Inline Expressions and Types
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. rst:role:: cpp:expr

   A role for inserting a C++ expression or type as inline text.
   For example::

      .. cpp:var:: int a = 42

      .. cpp:function:: int f(int i)

      An expression: :cpp:expr:`a * f(a)`.
      A type: :cpp:expr:`const MySortedContainer<int>&`.

   will be rendered as follows:

   .. cpp:var:: int a = 42

   .. cpp:function:: int f(int i)

   An expression: :cpp:expr:`a * f(a)`.
   A type: :cpp:expr:`const MySortedContainer<int>&`.

Namespacing
~~~~~~~~~~~

@@ -842,7 +888,7 @@ directive.

   .. cpp:function:: std::size_t size() const

declares ``size`` as a member function of the template class ``std::vector``.
declares ``size`` as a member function of the class template ``std::vector``.
Equivalently this could have been declared using::

   .. cpp:class:: template<typename T> \

@@ -922,7 +968,7 @@ These roles link to the given declaration types:

.. admonition:: Note on References with Template Parameters/Arguments

   Sphinx's syntax to give references a custom title can interfere with
   linking to template classes, if nothing follows the closing angle
   linking to class templates, if nothing follows the closing angle
   bracket, i.e. if the link looks like this: ``:cpp:class:`MyClass<int>```.
   This is interpreted as a link to ``int`` with a title of ``MyClass``.
   In this case, please escape the opening angle bracket with a backslash,

@@ -961,7 +1007,7 @@ In general the reference must include the template parameter declarations, e.g.,

Currently the lookup only succeeds if the template parameter identifiers are equal strings. That is,
``template\<typename UOuter> Wrapper::Outer`` will not work.

The inner template class can not be directly referenced, unless the current namespace
The inner class template can not be directly referenced, unless the current namespace
is changed or the following shorthand is used.
If a template parameter list is omitted, then the lookup will assume either a template or a non-template,
but not a partial template specialisation.
@@ -414,19 +414,7 @@ Let us now list some macros from the package file

- text styling commands (they have one argument): ``\sphinx<foo>`` with
  ``<foo>`` being one of ``strong``, ``bfcode``, ``email``, ``tablecontinued``,
  ``titleref``, ``menuselection``, ``accelerator``, ``crossref``, ``termref``,
  ``optional``. The non-prefixed macros will still be defined if the option
  :confval:`latex_keep_old_macro_names` has been set to ``True`` (default is
  ``False``), in which case the prefixed macros expand to the
  non-prefixed ones.

  .. versionadded:: 1.4.5
     Use of ``\sphinx`` prefixed macro names to limit possibilities of conflict
     with LaTeX packages.
  .. versionchanged:: 1.6
     The default value of :confval:`latex_keep_old_macro_names` changes to
     ``False``, and even if set to ``True``, if a non-prefixed macro
     already exists at ``sphinx.sty`` loading time, only the ``\sphinx``
     prefixed one will be defined. The setting will be removed at 1.7.
  ``optional``.

- more text styling commands: ``\sphinxstyle<bar>`` with ``<bar>`` one of
  ``indexentry``, ``indexextra``, ``indexpageref``, ``topictitle``,

@@ -438,7 +426,14 @@ Let us now list some macros from the package file

  the new macros are wrappers of the formerly hard-coded ``\texttt``,
  ``\emph``, ... The default definitions can be found in
  :file:`sphinx.sty`.
- paragraph level environments: for each admonition type ``<foo>``, the
- a :dudir:`figure` may have an optional legend with arbitrary body
  elements: they are rendered in a ``sphinxlegend`` environment. The default
  definition issues ``\small``, and ends with ``\par``.

  .. versionadded:: 1.5.6
     formerly, the ``\small`` was hardcoded in the LaTeX writer and the ending
     ``\par`` was lacking.
- for each admonition type ``<foo>``, the
  used environment is named ``sphinx<foo>``. They may be ``\renewenvironment``
  'd individually, and must then be defined with one argument (it is the heading
  of the notice, for example ``Warning:`` for the :dudir:`warning` directive, if
@@ -47,7 +47,9 @@ file :file:`blue.zip`, you can put it right in the directory containing

   html_theme_path = ["."]

The third form is a python package. If a theme you want to use is distributed
as a python package, you can use it after installing::
as a python package, you can use it after installing

.. code-block:: bash

   # installing theme package
   $ pip install sphinxjp.themes.dotted
mypy.ini (1 line changed)
@@ -2,7 +2,6 @@

python_version = 2.7
ignore_missing_imports = True
follow_imports = skip
fast_parser = True
incremental = True
check_untyped_defs = True
warn_unused_ignores = True
setup.py (6 lines changed)
@@ -51,15 +51,19 @@ requires = [

    'alabaster>=0.7,<0.8',
    'imagesize',
    'requests>=2.0.0',
    'sphinxcontrib-websupport',
    'typing',
    'setuptools',
    'sphinxcontrib-websupport',
]
extras_require = {
    # Environment Marker works for wheel 0.24 or later
    ':sys_platform=="win32"': [
        'colorama>=0.3.5',
    ],
    'websupport': [
        'sqlalchemy>=0.9',
        'whoosh>=2.0',
    ],
    'test': [
        'pytest',
        'mock',  # it would be better for 'test:python_version in 2.7'
@ -58,7 +58,7 @@ if __version__.endswith('+'):
|
||||
stdout=subprocess.PIPE, stderr=subprocess.PIPE)
|
||||
out, err = p.communicate()
|
||||
if out:
|
||||
__display_version__ += '/' + out.decode().strip() # type: ignore
|
||||
__display_version__ += '/' + out.decode().strip()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
|
@ -29,7 +29,7 @@ import sphinx
|
||||
from sphinx import package_dir, locale
|
||||
from sphinx.config import Config
|
||||
from sphinx.errors import ConfigError, ExtensionError, VersionRequirementError
|
||||
from sphinx.deprecation import RemovedInSphinx17Warning, RemovedInSphinx20Warning
|
||||
from sphinx.deprecation import RemovedInSphinx20Warning
|
||||
from sphinx.environment import BuildEnvironment
|
||||
from sphinx.events import EventManager
|
||||
from sphinx.extension import verify_required_extensions
|
||||
@ -39,10 +39,9 @@ from sphinx.registry import SphinxComponentRegistry
|
||||
from sphinx.util import pycompat # noqa: F401
|
||||
from sphinx.util import import_object
|
||||
from sphinx.util import logging
|
||||
from sphinx.util import status_iterator, old_status_iterator, display_chunk
|
||||
from sphinx.util.tags import Tags
|
||||
from sphinx.util.osutil import ENOENT
|
||||
from sphinx.util.console import bold, darkgreen # type: ignore
|
||||
from sphinx.util.console import bold # type: ignore
|
||||
from sphinx.util.docutils import is_html5_writer_available, directive_helper
|
||||
from sphinx.util.i18n import find_catalog_source_files
|
||||
|
||||
@ -60,7 +59,6 @@ if False:
|
||||
builtin_extensions = (
|
||||
'sphinx.builders.applehelp',
|
||||
'sphinx.builders.changes',
|
||||
'sphinx.builders.epub2',
|
||||
'sphinx.builders.epub3',
|
||||
'sphinx.builders.devhelp',
|
||||
'sphinx.builders.dummy',
|
||||
@ -315,13 +313,6 @@ class Sphinx(object):
|
||||
for node, settings in iteritems(self.enumerable_nodes):
|
||||
self.env.get_domain('std').enumerable_nodes[node] = settings # type: ignore
|
||||
|
||||
@property
|
||||
def buildername(self):
|
||||
# type: () -> unicode
|
||||
warnings.warn('app.buildername is deprecated. Please use app.builder.name instead',
|
||||
RemovedInSphinx17Warning)
|
||||
return self.builder.name
|
||||
|
||||
# ---- main "build" method -------------------------------------------------
|
||||
|
||||
def build(self, force_all=False, filenames=None):
|
||||
@ -357,31 +348,15 @@ class Sphinx(object):
|
||||
self.builder.cleanup()
|
||||
|
||||
# ---- logging handling ----------------------------------------------------
|
||||
def warn(self, message, location=None, prefix=None,
|
||||
type=None, subtype=None, colorfunc=None):
|
||||
# type: (unicode, unicode, unicode, unicode, unicode, Callable) -> None
|
||||
def warn(self, message, location=None, type=None, subtype=None):
|
||||
# type: (unicode, unicode, unicode, unicode) -> None
|
||||
"""Emit a warning.
|
||||
|
||||
If *location* is given, it should either be a tuple of (docname, lineno)
|
||||
or a string describing the location of the warning as well as possible.
|
||||
|
||||
*prefix* usually should not be changed.
|
||||
|
||||
*type* and *subtype* are used to suppress warnings with :confval:`suppress_warnings`.
|
||||
|
||||
.. note::
|
||||
|
||||
For warnings emitted during parsing, you should use
|
||||
:meth:`.BuildEnvironment.warn` since that will collect all
|
||||
warnings during parsing for later output.
|
||||
"""
|
||||
if prefix:
|
||||
warnings.warn('prefix option of warn() is now deprecated.',
|
||||
RemovedInSphinx17Warning)
|
||||
if colorfunc:
|
||||
warnings.warn('colorfunc option of warn() is now deprecated.',
|
||||
RemovedInSphinx17Warning)
|
||||
|
||||
warnings.warn('app.warning() is now deprecated. Use sphinx.util.logging instead.',
|
||||
RemovedInSphinx20Warning)
|
||||
logger.warning(message, type=type, subtype=subtype, location=location)
|
||||
@ -418,34 +393,6 @@ class Sphinx(object):
|
||||
RemovedInSphinx20Warning)
|
||||
logger.debug(message, *args, **kwargs)
|
||||
|
||||
def _display_chunk(chunk):
|
||||
# type: (Any) -> unicode
|
||||
warnings.warn('app._display_chunk() is now deprecated. '
|
||||
'Use sphinx.util.display_chunk() instead.',
|
||||
RemovedInSphinx17Warning)
|
||||
return display_chunk(chunk)
|
||||
|
||||
def old_status_iterator(self, iterable, summary, colorfunc=darkgreen,
|
||||
stringify_func=display_chunk):
|
||||
# type: (Iterable, unicode, Callable, Callable[[Any], unicode]) -> Iterator
|
||||
warnings.warn('app.old_status_iterator() is now deprecated. '
|
||||
'Use sphinx.util.status_iterator() instead.',
|
||||
RemovedInSphinx17Warning)
|
||||
for item in old_status_iterator(iterable, summary,
|
||||
color="darkgreen", stringify_func=stringify_func):
|
||||
yield item
|
||||
|
||||
# new version with progress info
|
||||
def status_iterator(self, iterable, summary, colorfunc=darkgreen, length=0,
|
||||
stringify_func=_display_chunk):
|
||||
# type: (Iterable, unicode, Callable, int, Callable[[Any], unicode]) -> Iterable
|
||||
warnings.warn('app.status_iterator() is now deprecated. '
|
||||
'Use sphinx.util.status_iterator() instead.',
|
||||
RemovedInSphinx17Warning)
|
||||
for item in status_iterator(iterable, summary, length=length, verbosity=self.verbosity,
|
||||
color="darkgreen", stringify_func=stringify_func):
|
||||
yield item
|
||||
|
||||
# ---- general extensibility interface -------------------------------------
|
||||
|
||||
def setup_extension(self, extname):
|
||||
@ -567,13 +514,6 @@ class Sphinx(object):
|
||||
self.enumerable_nodes[node] = (figtype, title_getter)
|
||||
self.add_node(node, **kwds)
|
||||
|
||||
def _directive_helper(self, obj, has_content=None, argument_spec=None, **option_spec):
|
||||
# type: (Any, bool, Tuple[int, int, bool], Any) -> Any
|
||||
warnings.warn('_directive_helper() is now deprecated. '
|
||||
'Please use sphinx.util.docutils.directive_helper() instead.',
|
||||
RemovedInSphinx17Warning)
|
||||
return directive_helper(obj, has_content, argument_spec, **option_spec)
|
||||
|
||||
def add_directive(self, name, obj, content=None, arguments=None, **options):
|
||||
# type: (unicode, Any, bool, Tuple[int, int, bool], Any) -> None
|
||||
logger.debug('[app] adding directive: %r',
|
||||
|
@ -91,9 +91,6 @@ class Builder(object):
|
||||
self.tags.add(self.name)
|
||||
self.tags.add("format_%s" % self.format)
|
||||
self.tags.add("builder_%s" % self.name)
|
||||
# compatibility aliases
|
||||
self.status_iterator = app.status_iterator
|
||||
self.old_status_iterator = app.old_status_iterator
|
||||
|
||||
# images that need to be copied over (source -> dest)
|
||||
self.images = {} # type: Dict[unicode, unicode]
|
||||
|
@ -25,6 +25,7 @@ except ImportError:
|
||||
Image = None
|
||||
|
||||
from docutils import nodes
|
||||
from docutils.utils import smartquotes
|
||||
|
||||
from sphinx import addnodes
|
||||
from sphinx.builders.html import StandaloneHTMLBuilder
|
||||
@ -32,7 +33,6 @@ from sphinx.util import logging
|
||||
from sphinx.util import status_iterator
|
||||
from sphinx.util.osutil import ensuredir, copyfile
|
||||
from sphinx.util.fileutil import copy_asset_file
|
||||
from sphinx.util.smartypants import sphinx_smarty_pants as ssp
|
||||
|
||||
if False:
|
||||
# For type annotation
|
||||
@ -94,6 +94,18 @@ Guide = namedtuple('Guide', ['type', 'title', 'uri'])
|
||||
NavPoint = namedtuple('NavPoint', ['navpoint', 'playorder', 'text', 'refuri', 'children'])
|
||||
|
||||
|
||||
def sphinx_smarty_pants(t, language='en'):
|
||||
# type: (unicode, str) -> unicode
|
||||
t = t.replace('"', '"')
|
||||
t = smartquotes.educateDashesOldSchool(t)
|
||||
t = smartquotes.educateQuotes(t, language)
|
||||
t = t.replace('"', '"')
|
||||
return t
|
||||
|
||||
|
||||
ssp = sphinx_smarty_pants
|
||||
|
||||
|
||||
# The epub publisher
|
||||
|
||||
class EpubBuilder(StandaloneHTMLBuilder):
|
||||
@ -437,7 +449,7 @@ class EpubBuilder(StandaloneHTMLBuilder):
|
||||
This method is overwritten for genindex pages in order to fix href link
|
||||
attributes.
|
||||
"""
|
||||
if pagename.startswith('genindex'):
|
||||
if pagename.startswith('genindex') and 'genindexentries' in addctx:
|
||||
if not self.use_index:
|
||||
return
|
||||
self.fix_genindex(addctx['genindexentries'])
|
||||
|
@ -190,7 +190,7 @@ class AppleHelpBuilder(StandaloneHTMLBuilder):
|
||||
|
||||
# Build the access page
|
||||
logger.info(bold('building access page...'), nonl=True)
|
||||
with codecs.open(path.join(language_dir, '_access.html'), 'w') as f:
|
||||
with codecs.open(path.join(language_dir, '_access.html'), 'w') as f: # type: ignore
|
||||
f.write(access_page_template % {
|
||||
'toc': htmlescape(toc, quote=True),
|
||||
'title': htmlescape(self.config.applehelp_title)
|
||||
|
@ -1,100 +0,0 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
sphinx.builders.epub2
|
||||
~~~~~~~~~~~~~~~~~~~~~
|
||||
|
||||
Build epub2 files.
|
||||
Originally derived from qthelp.py.
|
||||
|
||||
:copyright: Copyright 2007-2016 by the Sphinx team, see AUTHORS.
|
||||
:license: BSD, see LICENSE for details.
|
||||
"""
|
||||
|
||||
import warnings
|
||||
from os import path
|
||||
|
||||
from sphinx import package_dir
|
||||
from sphinx.builders import _epub_base
|
||||
from sphinx.util.osutil import make_filename
|
||||
from sphinx.deprecation import RemovedInSphinx17Warning
|
||||
|
||||
if False:
|
||||
# For type annotation
|
||||
from typing import Any, Dict # NOQA
|
||||
from sphinx.application import Sphinx # NOQA
|
||||
|
||||
|
||||
DOCTYPE = '''<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
|
||||
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">'''
|
||||
|
||||
|
||||
# The epub publisher
|
||||
|
||||
class Epub2Builder(_epub_base.EpubBuilder):
|
||||
"""
|
||||
Builder that outputs epub files.
|
||||
|
||||
It creates the metainfo files container.opf, toc.ncx, mimetype, and
|
||||
META-INF/container.xml. Afterwards, all necessary files are zipped to an
|
||||
epub file.
|
||||
"""
|
||||
name = 'epub2'
|
||||
|
||||
template_dir = path.join(package_dir, 'templates', 'epub2')
|
||||
doctype = DOCTYPE
|
||||
|
||||
# Finish by building the epub file
|
||||
def handle_finish(self):
|
||||
# type: () -> None
|
||||
"""Create the metainfo files and finally the epub."""
|
||||
self.get_toc()
|
||||
self.build_mimetype(self.outdir, 'mimetype')
|
||||
self.build_container(self.outdir, 'META-INF/container.xml')
|
||||
self.build_content(self.outdir, 'content.opf')
|
||||
self.build_toc(self.outdir, 'toc.ncx')
|
||||
self.build_epub(self.outdir, self.config.epub_basename + '.epub')
|
||||
|
||||
|
||||
def emit_deprecation_warning(app):
|
||||
# type: (Sphinx) -> None
|
||||
if app.builder.__class__ is Epub2Builder:
|
||||
warnings.warn('epub2 builder is deprecated. Please use epub3 builder instead.',
|
||||
RemovedInSphinx17Warning)
|
||||
|
||||
|
||||
def setup(app):
|
||||
# type: (Sphinx) -> Dict[unicode, Any]
|
||||
app.setup_extension('sphinx.builders.html')
|
||||
app.add_builder(Epub2Builder)
|
||||
app.connect('builder-inited', emit_deprecation_warning)
|
||||
|
||||
# config values
|
||||
app.add_config_value('epub_basename', lambda self: make_filename(self.project), None)
|
||||
app.add_config_value('epub_theme', 'epub', 'html')
|
||||
app.add_config_value('epub_theme_options', {}, 'html')
|
||||
app.add_config_value('epub_title', lambda self: self.html_title, 'html')
|
||||
app.add_config_value('epub_author', 'unknown', 'html')
|
||||
app.add_config_value('epub_language', lambda self: self.language or 'en', 'html')
|
||||
app.add_config_value('epub_publisher', 'unknown', 'html')
|
||||
app.add_config_value('epub_copyright', lambda self: self.copyright, 'html')
|
||||
app.add_config_value('epub_identifier', 'unknown', 'html')
|
||||
app.add_config_value('epub_scheme', 'unknown', 'html')
|
||||
app.add_config_value('epub_uid', 'unknown', 'env')
|
||||
app.add_config_value('epub_cover', (), 'env')
|
||||
app.add_config_value('epub_guide', (), 'env')
|
||||
app.add_config_value('epub_pre_files', [], 'env')
|
||||
app.add_config_value('epub_post_files', [], 'env')
|
||||
app.add_config_value('epub_exclude_files', [], 'env')
|
||||
app.add_config_value('epub_tocdepth', 3, 'env')
|
||||
app.add_config_value('epub_tocdup', True, 'env')
|
||||
app.add_config_value('epub_tocscope', 'default', 'env')
|
||||
app.add_config_value('epub_fix_images', False, 'env')
|
||||
app.add_config_value('epub_max_image_width', 0, 'env')
|
||||
app.add_config_value('epub_show_urls', 'inline', 'html')
|
||||
app.add_config_value('epub_use_index', lambda self: self.html_use_index, 'html')
|
||||
|
||||
return {
|
||||
'version': 'builtin',
|
||||
'parallel_read_safe': True,
|
||||
'parallel_write_safe': True,
|
||||
}
|
@ -19,6 +19,7 @@ from sphinx.config import string_classes, ENUM
|
||||
from sphinx.builders import _epub_base
|
||||
from sphinx.util import logging
|
||||
from sphinx.util.fileutil import copy_asset_file
|
||||
from sphinx.util.osutil import make_filename
|
||||
|
||||
if False:
|
||||
# For type annotation
|
||||
@ -225,12 +226,32 @@ class Epub3Builder(_epub_base.EpubBuilder):
|
||||
|
||||
def setup(app):
|
||||
# type: (Sphinx) -> Dict[unicode, Any]
|
||||
|
||||
app.setup_extension('sphinx.builders.epub2')
|
||||
|
||||
app.add_builder(Epub3Builder)
|
||||
|
||||
# config values
|
||||
app.add_config_value('epub_basename', lambda self: make_filename(self.project), None)
|
||||
app.add_config_value('epub_theme', 'epub', 'html')
|
||||
app.add_config_value('epub_theme_options', {}, 'html')
|
||||
app.add_config_value('epub_title', lambda self: self.html_title, 'html')
|
||||
app.add_config_value('epub_author', 'unknown', 'html')
|
||||
app.add_config_value('epub_language', lambda self: self.language or 'en', 'html')
|
||||
app.add_config_value('epub_publisher', 'unknown', 'html')
|
||||
app.add_config_value('epub_copyright', lambda self: self.copyright, 'html')
|
||||
app.add_config_value('epub_identifier', 'unknown', 'html')
|
||||
app.add_config_value('epub_scheme', 'unknown', 'html')
|
||||
app.add_config_value('epub_uid', 'unknown', 'env')
|
||||
app.add_config_value('epub_cover', (), 'env')
|
||||
app.add_config_value('epub_guide', (), 'env')
|
||||
app.add_config_value('epub_pre_files', [], 'env')
|
||||
app.add_config_value('epub_post_files', [], 'env')
|
||||
app.add_config_value('epub_exclude_files', [], 'env')
|
||||
app.add_config_value('epub_tocdepth', 3, 'env')
|
||||
app.add_config_value('epub_tocdup', True, 'env')
|
||||
app.add_config_value('epub_tocscope', 'default', 'env')
|
||||
app.add_config_value('epub_fix_images', False, 'env')
|
||||
app.add_config_value('epub_max_image_width', 0, 'env')
|
||||
app.add_config_value('epub_show_urls', 'inline', 'html')
|
||||
app.add_config_value('epub_use_index', lambda self: self.html_use_index, 'html')
|
||||
app.add_config_value('epub_description', 'unknown', 'epub3', string_classes)
|
||||
app.add_config_value('epub_contributor', 'unknown', 'epub3', string_classes)
|
||||
app.add_config_value('epub_writing_mode', 'horizontal', 'epub3',
|
||||
|
@ -31,7 +31,7 @@ from sphinx.locale import pairindextypes
|
||||
|
||||
if False:
|
||||
# For type annotation
|
||||
from typing import Any, Dict, Iterable, List, Set, Tuple # NOQA
|
||||
from typing import Any, DefaultDict, Dict, Iterable, List, Set, Tuple # NOQA
|
||||
from docutils import nodes # NOQA
|
||||
from sphinx.util.i18n import CatalogInfo # NOQA
|
||||
from sphinx.application import Sphinx # NOQA
|
||||
@ -122,7 +122,7 @@ class I18nBuilder(Builder):
|
||||
self.env.set_versioning_method(self.versioning_method,
|
||||
self.env.config.gettext_uuid)
|
||||
self.tags = I18nTags()
|
||||
self.catalogs = defaultdict(Catalog) # type: defaultdict[unicode, Catalog]
|
||||
self.catalogs = defaultdict(Catalog) # type: DefaultDict[unicode, Catalog]
|
||||
|
||||
def get_target_uri(self, docname, typ=None):
|
||||
# type: (unicode, unicode) -> unicode
|
||||
@ -194,14 +194,18 @@ ltz = LocalTimeZone()
|
||||
def should_write(filepath, new_content):
|
||||
if not path.exists(filepath):
|
||||
return True
|
||||
with open(filepath, 'r', encoding='utf-8') as oldpot: # type: ignore
|
||||
old_content = oldpot.read()
|
||||
old_header_index = old_content.index('"POT-Creation-Date:')
|
||||
new_header_index = new_content.index('"POT-Creation-Date:')
|
||||
old_body_index = old_content.index('"PO-Revision-Date:')
|
||||
new_body_index = new_content.index('"PO-Revision-Date:')
|
||||
return ((old_content[:old_header_index] != new_content[:new_header_index]) or
|
||||
(new_content[new_body_index:] != old_content[old_body_index:]))
|
||||
try:
|
||||
with open(filepath, 'r', encoding='utf-8') as oldpot: # type: ignore
|
||||
old_content = oldpot.read()
|
||||
old_header_index = old_content.index('"POT-Creation-Date:')
|
||||
new_header_index = new_content.index('"POT-Creation-Date:')
|
||||
old_body_index = old_content.index('"PO-Revision-Date:')
|
||||
new_body_index = new_content.index('"PO-Revision-Date:')
|
||||
return ((old_content[:old_header_index] != new_content[:new_header_index]) or
|
||||
(new_content[new_body_index:] != old_content[old_body_index:]))
|
||||
except ValueError:
|
||||
pass
|
||||
|
||||
return True
|
||||
|
||||
|
||||
|
@ -13,6 +13,7 @@ import os
|
||||
import re
|
||||
import sys
|
||||
import codecs
|
||||
import warnings
|
||||
import posixpath
|
||||
from os import path
|
||||
from hashlib import md5
|
||||
@ -38,6 +39,7 @@ from sphinx.util.docutils import is_html5_writer_available, __version_info__
|
||||
from sphinx.util.fileutil import copy_asset
|
||||
from sphinx.util.matching import patmatch, Matcher, DOTFILES
|
||||
from sphinx.config import string_classes
|
||||
from sphinx.deprecation import RemovedInSphinx20Warning
|
||||
from sphinx.locale import _, l_
|
||||
from sphinx.search import js_index
|
||||
from sphinx.theming import HTMLThemeFactory
|
||||
@ -45,8 +47,7 @@ from sphinx.builders import Builder
|
||||
from sphinx.application import ENV_PICKLE_FILENAME
|
||||
from sphinx.highlighting import PygmentsBridge
|
||||
from sphinx.util.console import bold, darkgreen # type: ignore
|
||||
from sphinx.writers.html import HTMLWriter, HTMLTranslator, \
|
||||
SmartyPantsHTMLTranslator
|
||||
from sphinx.writers.html import HTMLWriter, HTMLTranslator
|
||||
from sphinx.environment.adapters.asset import ImageAdapter
|
||||
from sphinx.environment.adapters.toctree import TocTree
|
||||
from sphinx.environment.adapters.indexentries import IndexEntries
|
||||
@ -59,7 +60,7 @@ if False:
|
||||
|
||||
# Experimental HTML5 Writer
|
||||
if is_html5_writer_available():
|
||||
from sphinx.writers.html5 import HTML5Translator, SmartyPantsHTML5Translator
|
||||
from sphinx.writers.html5 import HTML5Translator
|
||||
html5_ready = True
|
||||
else:
|
||||
html5_ready = False
|
||||
@ -87,6 +88,47 @@ def get_stable_hash(obj):
|
||||
return md5(text_type(obj).encode('utf8')).hexdigest()
|
||||
|
||||
|
||||
class CSSContainer(list):
|
||||
"""The container of stylesheets.
|
||||
|
||||
To support the extensions which access the container directly, this wraps
|
||||
the entry with Stylesheet class.
|
||||
"""
|
||||
def append(self, obj):
|
||||
if isinstance(obj, Stylesheet):
|
||||
super(CSSContainer, self).append(obj)
|
||||
else:
|
||||
super(CSSContainer, self).append(Stylesheet(obj, None, 'stylesheet'))
|
||||
|
||||
def insert(self, index, obj):
|
||||
warnings.warn('builder.css_files is deprecated. '
|
||||
'Please use app.add_stylesheet() instead.',
|
||||
RemovedInSphinx20Warning)
|
||||
if isinstance(obj, Stylesheet):
|
||||
super(CSSContainer, self).insert(index, obj)
|
||||
else:
|
||||
super(CSSContainer, self).insert(index, Stylesheet(obj, None, 'stylesheet'))
|
||||
|
||||
def extend(self, other):
|
||||
warnings.warn('builder.css_files is deprecated. '
|
||||
'Please use app.add_stylesheet() instead.',
|
||||
RemovedInSphinx20Warning)
|
||||
for item in other:
|
||||
self.append(item)
|
||||
|
||||
def __iadd__(self, other):
|
||||
warnings.warn('builder.css_files is deprecated. '
|
||||
'Please use app.add_stylesheet() instead.',
|
||||
RemovedInSphinx20Warning)
|
||||
for item in other:
|
||||
self.append(item)
|
||||
|
||||
def __add__(self, other):
|
||||
ret = CSSContainer(self)
|
||||
ret += other
|
||||
return ret
|
||||
|
||||
|
||||
class Stylesheet(text_type):
|
||||
"""The metadata of stylesheet.
|
||||
|
||||
@ -136,7 +178,7 @@ class StandaloneHTMLBuilder(Builder):
|
||||
script_files = ['_static/jquery.js', '_static/underscore.js',
|
||||
'_static/doctools.js'] # type: List[unicode]
|
||||
# Ditto for this one (Sphinx.add_stylesheet).
|
||||
css_files = [] # type: List[Dict[unicode, unicode]]
|
||||
css_files = CSSContainer() # type: List[Dict[unicode, unicode]]
|
||||
|
||||
imgpath = None # type: unicode
|
||||
domain_indices = [] # type: List[Tuple[unicode, Type[Index], List[Tuple[unicode, List[List[Union[unicode, int]]]]], bool]] # NOQA
|
||||
@ -220,15 +262,9 @@ class StandaloneHTMLBuilder(Builder):
|
||||
@property
|
||||
def default_translator_class(self):
|
||||
if self.config.html_experimental_html5_writer and html5_ready:
|
||||
if self.config.html_use_smartypants:
|
||||
return SmartyPantsHTML5Translator
|
||||
else:
|
||||
return HTML5Translator
|
||||
return HTML5Translator
|
||||
else:
|
||||
if self.config.html_use_smartypants:
|
||||
return SmartyPantsHTMLTranslator
|
||||
else:
|
||||
return HTMLTranslator
|
||||
return HTMLTranslator
|
||||
|
||||
def get_outdated_docs(self):
|
||||
# type: () -> Iterator[unicode]
|
||||
@ -354,7 +390,7 @@ class StandaloneHTMLBuilder(Builder):
|
||||
# typically doesn't include the time of day
|
||||
lufmt = self.config.html_last_updated_fmt
|
||||
if lufmt is not None:
|
||||
self.last_updated = format_date(lufmt or _('%b %d, %Y'),
|
||||
self.last_updated = format_date(lufmt or _('%b %d, %Y'), # type: ignore
|
||||
language=self.config.language)
|
||||
else:
|
||||
self.last_updated = None
|
||||
@ -787,7 +823,7 @@ class StandaloneHTMLBuilder(Builder):
|
||||
else:
|
||||
f = open(searchindexfn, 'rb') # type: ignore
|
||||
with f:
|
||||
self.indexer.load(f, self.indexer_format) # type: ignore
|
||||
self.indexer.load(f, self.indexer_format)
|
||||
except (IOError, OSError, ValueError):
|
||||
if keep:
|
||||
logger.warning('search index couldn\'t be loaded, but not all '
|
||||
@ -957,7 +993,7 @@ class StandaloneHTMLBuilder(Builder):
|
||||
else:
|
||||
f = open(searchindexfn + '.tmp', 'wb') # type: ignore
|
||||
with f:
|
||||
self.indexer.dump(f, self.indexer_format) # type: ignore
|
||||
self.indexer.dump(f, self.indexer_format)
|
||||
movefile(searchindexfn + '.tmp', searchindexfn)
|
||||
logger.info('done')
|
||||
|
||||
@ -1319,7 +1355,6 @@ def setup(app):
|
||||
app.add_config_value('html_static_path', [], 'html')
|
||||
app.add_config_value('html_extra_path', [], 'html')
|
||||
app.add_config_value('html_last_updated_fmt', None, 'html', string_classes)
|
||||
app.add_config_value('html_use_smartypants', True, 'html')
|
||||
app.add_config_value('html_sidebars', {}, 'html')
|
||||
app.add_config_value('html_additional_pages', {}, 'html')
|
||||
app.add_config_value('html_domain_indices', True, 'html', [list])
|
||||
|
@ -10,7 +10,6 @@
|
||||
"""
|
||||
|
||||
import os
|
||||
import warnings
|
||||
from os import path
|
||||
|
||||
from docutils import nodes
|
||||
@ -19,7 +18,6 @@ from docutils.utils import new_document
|
||||
from docutils.frontend import OptionParser
|
||||
|
||||
from sphinx import package_dir, addnodes, highlighting
|
||||
from sphinx.deprecation import RemovedInSphinx17Warning
|
||||
from sphinx.config import string_classes, ENUM
|
||||
from sphinx.errors import SphinxError
|
||||
from sphinx.locale import _
|
||||
@ -250,29 +248,6 @@ class LaTeXBuilder(Builder):
|
||||
path.join(self.srcdir, src), err)
|
||||
|
||||
|
||||
def validate_config_values(app):
|
||||
# type: (Sphinx) -> None
|
||||
if app.config.latex_toplevel_sectioning not in (None, 'part', 'chapter', 'section'):
|
||||
logger.warning('invalid latex_toplevel_sectioning, ignored: %s',
|
||||
app.config.latex_toplevel_sectioning)
|
||||
app.config.latex_toplevel_sectioning = None # type: ignore
|
||||
|
||||
if 'footer' in app.config.latex_elements:
|
||||
if 'postamble' in app.config.latex_elements:
|
||||
logger.warning("latex_elements['footer'] conflicts with "
|
||||
"latex_elements['postamble'], ignored.")
|
||||
else:
|
||||
warnings.warn("latex_elements['footer'] is deprecated. "
|
||||
"Use latex_elements['preamble'] instead.",
|
||||
RemovedInSphinx17Warning)
|
||||
app.config.latex_elements['postamble'] = app.config.latex_elements['footer']
|
||||
|
||||
if app.config.latex_keep_old_macro_names:
|
||||
warnings.warn("latex_keep_old_macro_names is deprecated. "
|
||||
"LaTeX markup since Sphinx 1.4.5 uses only prefixed macro names.",
|
||||
RemovedInSphinx17Warning)
|
||||
|
||||
|
||||
def default_latex_engine(config):
|
||||
# type: (Config) -> unicode
|
||||
""" Better default latex_engine settings for specific languages. """
|
||||
@ -295,7 +270,6 @@ def default_latex_docclass(config):
|
||||
def setup(app):
|
||||
# type: (Sphinx) -> Dict[unicode, Any]
|
||||
app.add_builder(LaTeXBuilder)
|
||||
app.connect('builder-inited', validate_config_values)
|
||||
|
||||
app.add_config_value('latex_engine', default_latex_engine, None,
|
||||
ENUM('pdflatex', 'xelatex', 'lualatex', 'platex'))
|
||||
@ -305,9 +279,9 @@ def setup(app):
|
||||
None)
|
||||
app.add_config_value('latex_logo', None, None, string_classes)
|
||||
app.add_config_value('latex_appendices', [], None)
|
||||
app.add_config_value('latex_keep_old_macro_names', False, None)
|
||||
app.add_config_value('latex_use_latex_multicolumn', False, None)
|
||||
app.add_config_value('latex_toplevel_sectioning', None, None, [str])
|
||||
app.add_config_value('latex_toplevel_sectioning', None, None,
|
||||
ENUM('part', 'chapter', 'section'))
|
||||
app.add_config_value('latex_domain_indices', True, None, [list])
|
||||
app.add_config_value('latex_show_urls', 'no', None)
|
||||
app.add_config_value('latex_show_pagerefs', False, None)
|
||||
|
@ -16,7 +16,7 @@ import threading
|
||||
from os import path
|
||||
|
||||
from requests.exceptions import HTTPError
|
||||
from six.moves import queue, html_parser # type: ignore
|
||||
from six.moves import queue, html_parser
|
||||
from six.moves.urllib.parse import unquote
|
||||
from docutils import nodes
|
||||
|
||||
@ -105,8 +105,8 @@ class CheckExternalLinksBuilder(Builder):
|
||||
open(path.join(self.outdir, 'output.txt'), 'w').close()
|
||||
|
||||
# create queues and worker threads
|
||||
self.wqueue = queue.Queue()
|
||||
self.rqueue = queue.Queue()
|
||||
self.wqueue = queue.Queue() # type: queue.Queue
|
||||
self.rqueue = queue.Queue() # type: queue.Queue
|
||||
self.workers = [] # type: List[threading.Thread]
|
||||
for i in range(self.app.config.linkcheck_workers):
|
||||
thread = threading.Thread(target=self.check_thread)
|
||||
|
@ -206,7 +206,7 @@ class QtHelpBuilder(StandaloneHTMLBuilder):
|
||||
|
||||
# write the project file
|
||||
with codecs.open(path.join(outdir, outname + '.qhp'), 'w', 'utf-8') as f: # type: ignore # NOQA
|
||||
f.write(project_template % { # type: ignore
|
||||
f.write(project_template % {
|
||||
'outname': htmlescape(outname),
|
||||
'title': htmlescape(self.config.html_title),
|
||||
'version': htmlescape(self.config.version),
|
||||
@ -223,7 +223,7 @@ class QtHelpBuilder(StandaloneHTMLBuilder):
|
||||
|
||||
logger.info('writing collection project file...')
|
||||
with codecs.open(path.join(outdir, outname + '.qhcp'), 'w', 'utf-8') as f: # type: ignore # NOQA
|
||||
f.write(collection_template % { # type: ignore
|
||||
f.write(collection_template % {
|
||||
'outname': htmlescape(outname),
|
||||
'title': htmlescape(self.config.html_short_title),
|
||||
'homepage': htmlescape(homepage),
|
||||
|
@ -9,8 +9,6 @@
|
||||
:license: BSD, see LICENSE for details.
|
||||
"""
|
||||
|
||||
from sphinxcontrib.websupport.builder import WebSupportBuilder
|
||||
|
||||
if False:
|
||||
# For type annotation
|
||||
from typing import Any, Dict # NOQA
|
||||
@ -19,7 +17,11 @@ if False:
|
||||
|
||||
def setup(app):
|
||||
# type: (Sphinx) -> Dict[unicode, Any]
|
||||
app.add_builder(WebSupportBuilder)
|
||||
try:
|
||||
from sphinxcontrib.websupport.builder import WebSupportBuilder
|
||||
app.add_builder(WebSupportBuilder)
|
||||
except ImportError:
|
||||
pass
|
||||
|
||||
return {
|
||||
'version': 'builtin',
|
||||
|
@@ -10,11 +10,11 @@

"""


class RemovedInSphinx17Warning(DeprecationWarning):
class RemovedInSphinx18Warning(DeprecationWarning):
    pass


class RemovedInSphinx18Warning(PendingDeprecationWarning):
class RemovedInSphinx19Warning(PendingDeprecationWarning):
    pass


@@ -22,4 +22,4 @@ class RemovedInSphinx20Warning(PendingDeprecationWarning):
    pass


RemovedInNextVersionWarning = RemovedInSphinx17Warning
RemovedInNextVersionWarning = RemovedInSphinx18Warning
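A rough sketch of how a project might surface the pending-deprecation classes renamed above while testing its own extensions; the class names come from the diff, the filter choice is hypothetical:

.. code-block:: python

   # Show Sphinx's pending deprecation warnings during a local test run.
   import warnings
   from sphinx.deprecation import RemovedInSphinx18Warning, RemovedInSphinx19Warning

   warnings.simplefilter('default', RemovedInSphinx18Warning)
   warnings.simplefilter('default', RemovedInSphinx19Warning)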
File diff suppressed because it is too large
@ -20,12 +20,13 @@ import warnings
|
||||
from os import path
|
||||
from collections import defaultdict
|
||||
|
||||
from six import StringIO, itervalues, class_types, next
|
||||
from six import BytesIO, itervalues, class_types, next
|
||||
from six.moves import cPickle as pickle
|
||||
|
||||
from docutils.io import NullOutput
|
||||
from docutils.core import Publisher
|
||||
from docutils.utils import Reporter, get_source_line
|
||||
from docutils.utils.smartquotes import smartchars
|
||||
from docutils.parsers.rst import roles
|
||||
from docutils.parsers.rst.languages import en as english
|
||||
from docutils.frontend import OptionParser
|
||||
@ -46,7 +47,7 @@ from sphinx.errors import SphinxError, ExtensionError
|
||||
from sphinx.locale import _
|
||||
from sphinx.transforms import SphinxTransformer
|
||||
from sphinx.versioning import add_uids, merge_doctrees
|
||||
from sphinx.deprecation import RemovedInSphinx17Warning, RemovedInSphinx20Warning
|
||||
from sphinx.deprecation import RemovedInSphinx20Warning
|
||||
from sphinx.environment.adapters.indexentries import IndexEntries
|
||||
from sphinx.environment.adapters.toctree import TocTree
|
||||
|
||||
@ -121,7 +122,7 @@ class BuildEnvironment(object):
|
||||
@classmethod
|
||||
def loads(cls, string, app=None):
|
||||
# type: (unicode, Sphinx) -> BuildEnvironment
|
||||
io = StringIO(string)
|
||||
io = BytesIO(string)
|
||||
return cls.load(io, app)
|
||||
|
||||
@classmethod
|
||||
@ -156,7 +157,7 @@ class BuildEnvironment(object):
|
||||
@classmethod
|
||||
def dumps(cls, env):
|
||||
# type: (BuildEnvironment) -> unicode
|
||||
io = StringIO()
|
||||
io = BytesIO()
|
||||
cls.dump(env, io)
|
||||
return io.getvalue()
|
||||
|
||||
@ -671,6 +672,10 @@ class BuildEnvironment(object):
|
||||
self.settings['trim_footnote_reference_space'] = \
|
||||
self.config.trim_footnote_reference_space
|
||||
self.settings['gettext_compact'] = self.config.gettext_compact
|
||||
language = (self.config.language or 'en').replace('_', '-')
|
||||
self.settings['language_code'] = language
|
||||
if language in smartchars.quotes: # We enable smartypants by default
|
||||
self.settings['smart_quotes'] = True
|
||||
|
||||
docutilsconf = path.join(self.srcdir, 'docutils.conf')
|
||||
# read docutils.conf from source dir, not from current dir
|
||||
@ -767,24 +772,6 @@ class BuildEnvironment(object):
|
||||
"""Returns the docname of the document currently being parsed."""
|
||||
return self.temp_data['docname']
|
||||
|
||||
@property
|
||||
def currmodule(self):
|
||||
# type: () -> None
|
||||
"""Backwards compatible alias. Will be removed."""
|
||||
warnings.warn('env.currmodule is deprecated. '
|
||||
'Use env.ref_context["py:module"] instead.',
|
||||
RemovedInSphinx17Warning)
|
||||
return self.ref_context.get('py:module')
|
||||
|
||||
@property
|
||||
def currclass(self):
|
||||
# type: () -> None
|
||||
"""Backwards compatible alias. Will be removed."""
|
||||
warnings.warn('env.currclass is deprecated. '
|
||||
'Use env.ref_context["py:class"] instead.',
|
||||
RemovedInSphinx17Warning)
|
||||
return self.ref_context.get('py:class')
|
||||
|
||||
def new_serialno(self, category=''):
|
||||
# type: (unicode) -> int
|
||||
"""Return a serial number, e.g. for index entry targets.
|
||||
|
@ -48,7 +48,7 @@ try:
|
||||
if sys.version_info >= (3,):
|
||||
import typing
|
||||
else:
|
||||
typing = None # type: ignore
|
||||
typing = None
|
||||
except ImportError:
|
||||
typing = None
|
||||
|
||||
@ -473,10 +473,19 @@ def formatargspec(function, args, varargs=None, varkw=None, defaults=None,
|
||||
|
||||
for i, arg in enumerate(args):
|
||||
arg_fd = StringIO()
|
||||
arg_fd.write(format_arg_with_annotation(arg))
|
||||
if defaults and i >= defaults_start:
|
||||
arg_fd.write(' = ' if arg in annotations else '=')
|
||||
arg_fd.write(object_description(defaults[i - defaults_start])) # type: ignore
|
||||
if isinstance(arg, list):
|
||||
# support tupled arguments list (only for py2): def foo((x, y))
|
||||
arg_fd.write('(')
|
||||
arg_fd.write(format_arg_with_annotation(arg[0]))
|
||||
for param in arg[1:]:
|
||||
arg_fd.write(', ')
|
||||
arg_fd.write(format_arg_with_annotation(param))
|
||||
arg_fd.write(')')
|
||||
else:
|
||||
arg_fd.write(format_arg_with_annotation(arg))
|
||||
if defaults and i >= defaults_start:
|
||||
arg_fd.write(' = ' if arg in annotations else '=')
|
||||
arg_fd.write(object_description(defaults[i - defaults_start])) # type: ignore
|
||||
formatted.append(arg_fd.getvalue())
|
||||
|
||||
if varargs:
|
||||
|
@ -259,8 +259,7 @@ def find_autosummary_in_files(filenames):
|
||||
with codecs.open(filename, 'r', encoding='utf-8', # type: ignore
|
||||
errors='ignore') as f:
|
||||
lines = f.read().splitlines()
|
||||
documented.extend(find_autosummary_in_lines(lines, # type: ignore
|
||||
filename=filename))
|
||||
documented.extend(find_autosummary_in_lines(lines, filename=filename))
|
||||
return documented
|
||||
|
||||
|
||||
@ -273,7 +272,7 @@ def find_autosummary_in_docstring(name, module=None, filename=None):
|
||||
try:
|
||||
real_name, obj, parent, modname = import_by_name(name)
|
||||
lines = pydoc.getdoc(obj).splitlines()
|
||||
return find_autosummary_in_lines(lines, module=name, filename=filename)
|
||||
return find_autosummary_in_lines(lines, module=name, filename=filename) # type: ignore
|
||||
except AttributeError:
|
||||
pass
|
||||
except ImportError as e:
|
||||
|
@ -307,7 +307,7 @@ class DocTestBuilder(Builder):
|
||||
self.outfile = None # type: IO
|
||||
self.outfile = codecs.open(path.join(self.outdir, 'output.txt'), # type: ignore
|
||||
'w', encoding='utf-8')
|
||||
self.outfile.write(('Results of doctest builder run on %s\n' # type: ignore
|
||||
self.outfile.write(('Results of doctest builder run on %s\n'
|
||||
'==================================%s\n') %
|
||||
(date, '=' * len(date)))
|
||||
|
||||
|
@ -66,7 +66,7 @@ def process_ifconfig_nodes(app, doctree, docname):
|
||||
except Exception as err:
|
||||
# handle exceptions in a clean fashion
|
||||
from traceback import format_exception_only
|
||||
msg = ''.join(format_exception_only(err.__class__, err)) # type: ignore
|
||||
msg = ''.join(format_exception_only(err.__class__, err))
|
||||
newnode = doctree.reporter.error('Exception occured in '
|
||||
'ifconfig expression: \n%s' %
|
||||
msg, base_node=node)
|
||||
|
@ -42,7 +42,7 @@ import inspect
|
||||
try:
|
||||
from hashlib import md5
|
||||
except ImportError:
|
||||
from md5 import md5 # type: ignore
|
||||
from md5 import md5
|
||||
|
||||
from six import text_type
|
||||
from six.moves import builtins
|
||||
|
@ -156,7 +156,7 @@ def collect_pages(app):
|
||||
# construct a page name for the highlighted source
|
||||
pagename = '_modules/' + modname.replace('.', '/')
|
||||
# highlight the source using the builder's highlighter
|
||||
if env.config.highlight_language in ('python3', 'default'):
|
||||
if env.config.highlight_language in ('python3', 'default', 'none'):
|
||||
lexer = env.config.highlight_language
|
||||
else:
|
||||
lexer = 'python'
|
||||
|
File diff suppressed because it is too large
@ -557,7 +557,7 @@ def valid_dir(d):
|
||||
|
||||
|
||||
class MyFormatter(optparse.IndentedHelpFormatter):
|
||||
def format_usage(self, usage):
|
||||
def format_usage(self, usage): # type: ignore
|
||||
# type: (str) -> str
|
||||
return usage
|
||||
|
||||
|
@ -133,7 +133,7 @@ class SphinxComponentRegistry(object):
|
||||
directive = type(directivename, # type: ignore
|
||||
(GenericObject, object),
|
||||
{'indextemplate': indextemplate,
|
||||
'parse_node': staticmethod(parse_node), # type: ignore
|
||||
'parse_node': staticmethod(parse_node),
|
||||
'doc_field_types': doc_field_types})
|
||||
|
||||
stddomain = self.domains['std']
|
||||
|
@@ -41,7 +41,7 @@ from sphinx.util import import_object

if False:
# For type annotation
from typing import Dict, List # NOQA
from typing import Any, Dict, List # NOQA


class BaseSplitter(object):

@@ -65,8 +65,8 @@ class MecabSplitter(BaseSplitter):
def __init__(self, options):
# type: (Dict) -> None
super(MecabSplitter, self).__init__(options)
self.ctypes_libmecab = None # type: ignore
self.ctypes_mecab = None # type: ignore
self.ctypes_libmecab = None # type: Any
self.ctypes_mecab = None # type: Any
if not native_module:
self.init_ctypes(options)
else:

@@ -25,16 +25,6 @@
\usepackage<%= sphinxpkgoptions %>{sphinx}
<%= sphinxsetup %>
<%= geometry %>
\usepackage{multirow}
\let\originalmutirow\multirow\protected\def\multirow{%
\sphinxdeprecationwarning{\multirow}{1.6}{1.7}
{Sphinx does not use package multirow. Its loading will be removed at 1.7.}%
\originalmultirow}%
\usepackage{eqparbox}
\let\originaleqparbox\eqparbox\protected\def\eqparbox{%
\sphinxdeprecationwarning{\eqparbox}{1.6}{1.7}
{Sphinx does not use package eqparbox. Its loading will be removed at 1.7.}%
\originaleqparbox}%
<%= usepackages %>
<%= hyperref %>
<%= contentsname %>

@ -14,8 +14,10 @@ ALLIMGS = $(wildcard *.png *.gif *.jpg *.jpeg)
|
||||
|
||||
# Prefix for archive names
|
||||
ARCHIVEPRREFIX =
|
||||
# Additional LaTeX options
|
||||
LATEXOPTS =
|
||||
# Additional LaTeX options (used via latexmkrc/latexmkjarc file)
|
||||
LATEXOPTS = --interaction=nonstopmode
|
||||
# Additional latexmk options
|
||||
LATEXMKOPTS = -f
|
||||
# format: pdf or dvi
|
||||
FMT = pdf
|
||||
|
||||
@ -40,11 +42,11 @@ PDFLATEX = $(LATEX)
|
||||
{% if latex_engine == 'platex' -%}
|
||||
%.dvi: %.tex $(ALLIMGS) FORCE_MAKE
|
||||
for f in *.pdf; do extractbb "$$f"; done
|
||||
$(LATEX) $(LATEXOPTS) '$<'
|
||||
$(LATEX) $(LATEXMKOPTS) '$<'
|
||||
|
||||
{% elif latex_engine != 'xelatex' -%}
|
||||
%.dvi: %.tex FORCE_MAKE
|
||||
$(LATEX) $(LATEXOPTS) '$<'
|
||||
$(LATEX) $(LATEXMKOPTS) '$<'
|
||||
|
||||
{% endif -%}
|
||||
%.ps: %.dvi
|
||||
@ -56,7 +58,9 @@ PDFLATEX = $(LATEX)
|
||||
{%- else -%}
|
||||
%.pdf: %.tex FORCE_MAKE
|
||||
{%- endif %}
|
||||
$(PDFLATEX) $(LATEXOPTS) '$<'
|
||||
$(PDFLATEX) $(LATEXMKOPTS) '$<'
|
||||
|
||||
all: $(ALLPDF)
|
||||
|
||||
all-dvi: $(ALLDVI)
|
||||
|
||||
@ -64,8 +68,6 @@ all-ps: $(ALLPS)
|
||||
|
||||
all-pdf: $(ALLPDF)
|
||||
|
||||
all: $(ALLPDF)
|
||||
|
||||
zip: all-$(FMT)
|
||||
mkdir $(ARCHIVEPREFIX)docs-$(FMT)
|
||||
cp $(ALLPDF) $(ARCHIVEPREFIX)docs-$(FMT)
|
||||
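
A hedged usage sketch, not part of the patch (build directory and target are assumptions): the new LATEXMKOPTS variable can be overridden on the make command line, for example cleared so latexmk stops at the first LaTeX error instead of forcing on with -f:

import subprocess

# 'build/latex' is an assumed output directory; 'all-pdf' is the Makefile
# target defined in the hunk above.
subprocess.check_call(['make', '-C', 'build/latex', 'all-pdf', 'LATEXMKOPTS='])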
|
@ -1,4 +1,4 @@
|
||||
$latex = 'platex --halt-on-error --interaction=nonstopmode -kanji=utf8 %O %S';
|
||||
$latex = 'platex $LATEXOPTS -kanji=utf8 %O %S';
|
||||
$dvipdf = 'dvipdfmx %O -o %D %S';
|
||||
$makeindex = 'rm -f %D; mendex -U -f -d %B.dic -s python.ist %S || echo "mendex exited with error code $? (ignoring)" && : >> %D';
|
||||
add_cus_dep( "glo", "gls", 0, "makeglo" );
|
||||
|
@ -1,7 +1,7 @@
|
||||
$latex = 'latex --halt-on-error --interaction=nonstopmode %O %S';
|
||||
$pdflatex = 'pdflatex --halt-on-error --interaction=nonstopmode %O %S';
|
||||
$lualatex = 'lualatex --halt-on-error --interaction=nonstopmode %O %S';
|
||||
$xelatex = 'xelatex --no-pdf --halt-on-error --interaction=nonstopmode %O %S';
|
||||
$latex = 'latex $LATEXOPTS %O %S';
|
||||
$pdflatex = 'pdflatex $LATEXOPTS %O %S';
|
||||
$lualatex = 'lualatex $LATEXOPTS %O %S';
|
||||
$xelatex = 'xelatex --no-pdf $LATEXOPTS %O %S';
|
||||
$makeindex = 'makeindex -s python.ist %O -o %D %S';
|
||||
add_cus_dep( "glo", "gls", 0, "makeglo" );
|
||||
sub makeglo {
|
||||
|
@ -6,7 +6,7 @@
|
||||
%
|
||||
|
||||
\NeedsTeXFormat{LaTeX2e}[1995/12/01]
|
||||
\ProvidesPackage{sphinx}[2017/03/26 v1.6 LaTeX package (Sphinx markup)]
|
||||
\ProvidesPackage{sphinx}[2017/05/01 v1.6 LaTeX package (Sphinx markup)]
|
||||
|
||||
% provides \ltx@ifundefined
|
||||
% (many packages load ltxcmds: graphicx does for pdftex and lualatex but
|
||||
@ -16,15 +16,16 @@
|
||||
|
||||
%% for deprecation warnings
|
||||
\newcommand\sphinxdeprecationwarning[4]{% #1 the deprecated macro or name,
|
||||
% #2 = version when deprecated, #3 = version when removed, #4 = message
|
||||
% #2 = when deprecated, #3 = when removed, #4 = additional info
|
||||
\edef\spx@tempa{\detokenize{#1}}%
|
||||
\ltx@ifundefined{sphinx_depr_\spx@tempa}{%
|
||||
\global\expandafter\let\csname sphinx_depr_\spx@tempa\endcsname\spx@tempa
|
||||
\expandafter\AtEndDocument\expandafter{\expandafter\let\expandafter
|
||||
\sphinxdeprecatedmacro\csname sphinx_depr_\spx@tempa\endcsname
|
||||
\PackageWarningNoLine{sphinx}{^^J**** SPHINX DEPRECATION WARNING:^^J
|
||||
\sphinxdeprecatedmacro\space will be (or has been)
|
||||
deprecated at Sphinx #2^^J and will be removed at Sphinx #3.^^J
|
||||
\sphinxdeprecatedmacro^^J
|
||||
\@spaces- is deprecated at Sphinx #2^^J
|
||||
\@spaces- and removed at Sphinx #3.^^J
|
||||
#4^^J****}}%
|
||||
}{% warning already emitted (at end of latex log), don't repeat
|
||||
}}
|
||||
@ -41,10 +42,29 @@
|
||||
\RequirePackage{textcomp}
|
||||
\RequirePackage{titlesec}
|
||||
\@ifpackagelater{titlesec}{2016/03/15}%
|
||||
{\@ifpackagelater{titlesec}{2016/03/21}{}%
|
||||
{\AtEndDocument{\PackageWarningNoLine{sphinx}{^^J%
|
||||
******** ERROR !! PLEASE UPDATE titlesec.sty !!********^^J%
|
||||
******** THIS VERSION SWALLOWS SECTION NUMBERS.********}}}}{}
|
||||
{\@ifpackagelater{titlesec}{2016/03/21}%
|
||||
{}%
|
||||
{\newif\ifsphinx@ttlpatch@ok
|
||||
\IfFileExists{etoolbox.sty}{%
|
||||
\RequirePackage{etoolbox}%
|
||||
\patchcmd{\ttlh@hang}{\parindent\z@}{\parindent\z@\leavevmode}%
|
||||
{\sphinx@ttlpatch@oktrue}{}%
|
||||
\ifsphinx@ttlpatch@ok
|
||||
\patchcmd{\ttlh@hang}{\noindent}{}{}{\sphinx@ttlpatch@okfalse}%
|
||||
\fi
|
||||
}{}%
|
||||
\ifsphinx@ttlpatch@ok
|
||||
\typeout{^^J Package Sphinx Info: ^^J
|
||||
**** titlesec 2.10.1 successfully patched for bugfix ****^^J}%
|
||||
\else
|
||||
\AtEndDocument{\PackageWarningNoLine{sphinx}{^^J%
|
||||
******** titlesec 2.10.1 has a bug, (section numbers disappear) ......|^^J%
|
||||
******** and Sphinx could not patch it, perhaps because your local ...|^^J%
|
||||
******** copy is already fixed without a changed release date. .......|^^J%
|
||||
******** If not, you must update titlesec! ...........................|}}%
|
||||
\fi
|
||||
}%
|
||||
}{}
|
||||
\RequirePackage{tabulary}
|
||||
% tabulary has a bug with its re-definition of \multicolumn in its first pass
|
||||
% which is not \long. But now Sphinx does not use LaTeX's \multicolumn but its
|
||||
@ -185,7 +205,6 @@
|
||||
\DeclareStringOption[.5\dimexpr\inv@mag in\relax]{marginpar}
|
||||
\fi
|
||||
|
||||
\DeclareBoolOption{dontkeepoldnames} % \ifspx@opt@dontkeepoldnames = \iffalse
|
||||
\DeclareStringOption[0]{maxlistdepth}% \newcommand*\spx@opt@maxlistdepth{0}
|
||||
|
||||
% dimensions, we declare the \dimen registers here.
|
||||
@ -1140,14 +1159,6 @@
|
||||
\begin{sphinx#1}{#2}}
|
||||
% in end part, need to go around a LaTeX's "feature"
|
||||
{\edef\spx@temp{\noexpand\end{sphinx\spx@noticetype}}\spx@temp}
|
||||
% use of ``notice'' is for backwards compatibility and will be removed in
|
||||
% Sphinx 1.7.
|
||||
\newenvironment{notice}
|
||||
{\sphinxdeprecationwarning {notice}{1.6}{1.7}{%
|
||||
This document was probably built with a Sphinx extension using ``notice''^^J
|
||||
environment. At Sphinx 1.7, ``notice'' environment will be removed. Please^^J
|
||||
report to extension author to use ``sphinxadmonition'' instead.^^J%
|
||||
****}\begin{sphinxadmonition}}{\end{sphinxadmonition}}
|
||||
|
||||
|
||||
%% PYTHON DOCS MACROS AND ENVIRONMENTS
|
||||
@ -1186,26 +1197,19 @@
|
||||
% {fulllineitems} is the main environment for object descriptions.
|
||||
%
|
||||
\newcommand{\py@itemnewline}[1]{%
|
||||
\kern\labelsep
|
||||
\@tempdima\linewidth
|
||||
\advance\@tempdima \leftmargin\makebox[\@tempdima][l]{#1}%
|
||||
\advance\@tempdima \labelwidth\makebox[\@tempdima][l]{#1}%
|
||||
\kern-\labelsep
|
||||
}
|
||||
|
||||
\newenvironment{fulllineitems}{
|
||||
\begin{list}{}{\labelwidth \leftmargin \labelsep \z@
|
||||
\newenvironment{fulllineitems}{%
|
||||
\begin{list}{}{\labelwidth \leftmargin
|
||||
\rightmargin \z@ \topsep -\parskip \partopsep \parskip
|
||||
\itemsep -\parsep
|
||||
\let\makelabel=\py@itemnewline}
|
||||
\let\makelabel=\py@itemnewline}%
|
||||
}{\end{list}}
|
||||
|
||||
% Redefine description environment so that it is usable inside fulllineitems.
|
||||
%
|
||||
% FIXME: use sphinxdescription, do not redefine description environment!
|
||||
\renewcommand{\description}{%
|
||||
\list{}{\labelwidth\z@
|
||||
\itemindent-\leftmargin
|
||||
\labelsep5pt\relax
|
||||
\let\makelabel=\descriptionlabel}}
|
||||
|
||||
% Signatures, possibly multi-line
|
||||
%
|
||||
\newlength{\py@argswidth}
|
||||
@ -1311,7 +1315,6 @@
|
||||
%% TEXT STYLING
|
||||
%
|
||||
% Some custom font markup commands.
|
||||
% *** the macros without \sphinx prefix are still defined farther down ***
|
||||
\protected\def\sphinxstrong#1{{\textbf{#1}}}
|
||||
% to obtain straight quotes we execute \@noligs as patched by upquote, and
|
||||
% \scantokens is needed in cases where it would be too late for the macro to
|
||||
@ -1347,37 +1350,6 @@
|
||||
\long\protected\def\sphinxoptional#1{%
|
||||
{\textnormal{\Large[}}{#1}\hspace{0.5mm}{\textnormal{\Large]}}}
|
||||
|
||||
\ifspx@opt@dontkeepoldnames\else
|
||||
\let\spx@alreadydefinedlist\@empty
|
||||
\typeout{** (sphinx) defining (legacy) text style macros without \string\sphinx\space prefix}
|
||||
\typeout{** if clashes with packages, do not set latex_keep_old_macro_names=True
|
||||
in conf.py}
|
||||
\@for\@tempa:=code,strong,bfcode,email,tablecontinued,titleref,%
|
||||
menuselection,accelerator,crossref,termref,optional\do
|
||||
{% first, check if command with no prefix already exists
|
||||
\ltx@ifundefined{\@tempa}{%
|
||||
% give it the meaning defined so far with \sphinx prefix
|
||||
\expandafter\let\csname\@tempa\expandafter\endcsname
|
||||
\csname sphinx\@tempa\endcsname
|
||||
% redefine the \sphinx prefixed macro to expand to non-prefixed one
|
||||
\expandafter\def\csname sphinx\@tempa\expandafter\endcsname
|
||||
\expandafter{\csname\@tempa\endcsname}%
|
||||
}{\edef\spx@alreadydefinedlist{\spx@alreadydefinedlist{\@tempa}}}%
|
||||
}%
|
||||
\ifx\spx@alreadydefinedlist\@empty\else
|
||||
\expandafter\@tfor\expandafter\@tempa\expandafter:\expandafter=\spx@alreadydefinedlist\do
|
||||
{% emit warning now
|
||||
\PackageWarning{sphinx}{not redefining already existing \@backslashchar\@tempa\space!^^J%
|
||||
Anyhow, Sphinx mark-up uses only \string\sphinx\@tempa.}%
|
||||
% and also at end of log for better visibility
|
||||
\expandafter\sphinxdeprecationwarning\expandafter{\csname\@tempa\endcsname}{1.6}{1.7}
|
||||
{\sphinxdeprecatedmacro\space already existed at Sphinx loading time! Not redefined!^^J
|
||||
Sphinx mark-up uses only \string\sphinx\expandafter\@gobble\sphinxdeprecatedmacro.}%
|
||||
}%
|
||||
\fi
|
||||
\sphinxdeprecationwarning{latex_keep_old_macro_names=True}{1.6}{1.7}{}%
|
||||
\fi
|
||||
|
||||
% additional customizable styling
|
||||
% FIXME: convert this to package options ?
|
||||
\protected\def\sphinxstyleindexentry {\texttt}
|
||||
@ -1394,10 +1366,8 @@
|
||||
\protected\def\sphinxstyleliteralstrong {\sphinxbfcode}
|
||||
\protected\def\sphinxstyleabbreviation {\textsc}
|
||||
\protected\def\sphinxstyleliteralintitle {\sphinxcode}
|
||||
|
||||
% LaTeX writer uses macros to hide double quotes from \sphinxcode's \@noligs
|
||||
\protected\def\sphinxquotedblleft{``}
|
||||
\protected\def\sphinxquotedblright{''}
|
||||
% figure legend comes after caption and may contain arbitrary body elements
|
||||
\newenvironment{sphinxlegend}{\par\small}{\par}
|
||||
|
||||
% Tell TeX about pathological hyphenation cases:
|
||||
\hyphenation{Base-HTTP-Re-quest-Hand-ler}
|
||||
|
@ -116,7 +116,7 @@ class DefaultSubstitutions(SphinxTransform):
|
||||
text = self.config[refname]
|
||||
if refname == 'today' and not text:
|
||||
# special handling: can also specify a strftime format
|
||||
text = format_date(self.config.today_fmt or _('%b %d, %Y'),
|
||||
text = format_date(self.config.today_fmt or _('%b %d, %Y'), # type: ignore
|
||||
language=self.config.language)
|
||||
ref.replace_self(nodes.Text(text, text))
|
||||
|
||||
|
@ -215,12 +215,12 @@ class Locale(SphinxTransform):
|
||||
for child in patch.children:
|
||||
child.parent = node
|
||||
node.children = patch.children
|
||||
node['translated'] = True
|
||||
node['translated'] = True # to avoid double translation
|
||||
|
||||
# phase2: translation
|
||||
for node, msg in extract_messages(self.document):
|
||||
if node.get('translated', False):
|
||||
continue
|
||||
if node.get('translated', False): # to avoid double translation
|
||||
continue # skip if the node is already translated by phase1
|
||||
|
||||
msgstr = catalog.gettext(msg)
|
||||
# XXX add marker to untranslated parts
|
||||
@ -395,7 +395,7 @@ class Locale(SphinxTransform):
|
||||
if isinstance(node, IMAGE_TYPE_NODES):
|
||||
node.update_all_atts(patch)
|
||||
|
||||
node['translated'] = True
|
||||
node['translated'] = True # to avoid double translation
|
||||
|
||||
if 'index' in self.config.gettext_additional_targets:
|
||||
# Extract and translate messages for index entries.
|
||||
@ -415,6 +415,12 @@ class Locale(SphinxTransform):
|
||||
node['raw_entries'] = entries
|
||||
node['entries'] = new_entries
|
||||
|
||||
# remove translated attribute that is used for avoiding double translation.
|
||||
def has_translatable(node):
|
||||
return isinstance(node, nodes.Element) and 'translated' in node
|
||||
for node in self.document.traverse(has_translatable):
|
||||
node.delattr('translated')
|
||||
|
||||
|
||||
class RemoveTranslatableInline(SphinxTransform):
|
||||
"""
|
||||
|
@ -9,9 +9,13 @@
|
||||
:license: BSD, see LICENSE for details.
|
||||
"""
|
||||
|
||||
import warnings
|
||||
|
||||
from docutils import nodes
|
||||
from docutils.utils import get_source_line
|
||||
|
||||
from sphinx import addnodes
|
||||
from sphinx.deprecation import RemovedInSphinx20Warning
|
||||
from sphinx.environment import NoUri
|
||||
from sphinx.locale import _
|
||||
from sphinx.transforms import SphinxTransform
|
||||
@ -27,6 +31,35 @@ if False:
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class DocReferenceMigrator(SphinxTransform):
|
||||
"""Migrate :doc: reference to std domain."""
|
||||
|
||||
default_priority = 5 # before ReferencesResolver
|
||||
|
||||
def apply(self):
|
||||
# type: () -> None
|
||||
for node in self.document.traverse(addnodes.pending_xref):
|
||||
if node.get('reftype') == 'doc' and node.get('refdomain') is None:
|
||||
source, line = get_source_line(node)
|
||||
if source and line:
|
||||
location = "%s:%s" % (source, line)
|
||||
elif source:
|
||||
location = "%s:" % source
|
||||
elif line:
|
||||
location = "<unknown>:%s" % line
|
||||
else:
|
||||
location = None
|
||||
|
||||
message = ('Invalid pending_xref node detected. '
|
||||
':doc: reference should have refdomain=std attribute.')
|
||||
if location:
|
||||
warnings.warn("%s: %s" % (location, message),
|
||||
RemovedInSphinx20Warning)
|
||||
else:
|
||||
warnings.warn(message, RemovedInSphinx20Warning)
|
||||
node['refdomain'] = 'std'
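
As a hedged illustration of what the migrator above normalizes (the helper name is hypothetical, not defined by this patch): an extension that builds a :doc: cross-reference directly can set refdomain='std' up front and never trigger the warning.

from docutils import nodes
from sphinx import addnodes

def make_doc_xref(docname, text):
    # hypothetical helper: the node shape DocReferenceMigrator leaves behind
    return addnodes.pending_xref('', nodes.inline('', text),
                                 refdomain='std', reftype='doc',
                                 reftarget=docname, refexplicit=True)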
|
||||
|
||||
|
||||
class ReferencesResolver(SphinxTransform):
|
||||
"""
|
||||
Resolves cross-references on doctrees.
|
||||
@ -165,6 +198,7 @@ class OnlyNodeTransform(SphinxTransform):
|
||||
|
||||
def setup(app):
|
||||
# type: (Sphinx) -> Dict[unicode, Any]
|
||||
app.add_post_transform(DocReferenceMigrator)
|
||||
app.add_post_transform(ReferencesResolver)
|
||||
app.add_post_transform(OnlyNodeTransform)
|
||||
|
||||
|
@ -34,6 +34,7 @@ from sphinx.util import logging
|
||||
from sphinx.util.console import strip_colors, colorize, bold, term_width_line # type: ignore
|
||||
from sphinx.util.fileutil import copy_asset_file
|
||||
from sphinx.util.osutil import fs_encoding
|
||||
from sphinx.util import smartypants # noqa
|
||||
|
||||
# import other utilities; partly for backwards compatibility, so don't
|
||||
# prune unused ones indiscriminately
|
||||
|
@ -1,48 +0,0 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
sphinx.util.compat
|
||||
~~~~~~~~~~~~~~~~~~
|
||||
|
||||
Stuff for docutils compatibility.
|
||||
|
||||
:copyright: Copyright 2007-2017 by the Sphinx team, see AUTHORS.
|
||||
:license: BSD, see LICENSE for details.
|
||||
"""
|
||||
from __future__ import absolute_import
|
||||
|
||||
import sys
|
||||
import warnings
|
||||
|
||||
from docutils.parsers.rst import Directive # noqa
|
||||
from docutils import __version__ as _du_version
|
||||
|
||||
from sphinx.deprecation import RemovedInSphinx17Warning
|
||||
|
||||
docutils_version = tuple(int(x) for x in _du_version.split('.')[:2])
|
||||
|
||||
if False:
|
||||
# For type annotation
|
||||
from typing import Any, Dict # NOQA
|
||||
|
||||
|
||||
class _DeprecationWrapper(object):
|
||||
def __init__(self, mod, deprecated):
|
||||
# type: (Any, Dict) -> None
|
||||
self._mod = mod
|
||||
self._deprecated = deprecated
|
||||
|
||||
def __getattr__(self, attr):
|
||||
# type: (str) -> Any
|
||||
if attr in self._deprecated:
|
||||
warnings.warn("sphinx.util.compat.%s is deprecated and will be "
|
||||
"removed in Sphinx 1.7, please use the standard "
|
||||
"library version instead." % attr,
|
||||
RemovedInSphinx17Warning, stacklevel=2)
|
||||
return self._deprecated[attr]
|
||||
return getattr(self._mod, attr)
|
||||
|
||||
|
||||
sys.modules[__name__] = _DeprecationWrapper(sys.modules[__name__], dict( # type: ignore
|
||||
docutils_version = docutils_version,
|
||||
Directive = Directive,
|
||||
))
|
@ -359,7 +359,8 @@ class DocFieldTransformer(object):
|
||||
else:
|
||||
fieldtype, content = entry
|
||||
fieldtypes = types.get(fieldtype.name, {})
|
||||
env = self.directive.state.document.settings.env
|
||||
new_list += fieldtype.make_field(fieldtypes, self.directive.domain,
|
||||
content, env=self.directive.env)
|
||||
content, env=env)
|
||||
|
||||
node.replace_self(new_list)
|
||||
|
@ -51,7 +51,7 @@ def copy_asset_file(source, destination, context=None, renderer=None):
|
||||
if destination.lower().endswith('_t'):
|
||||
destination = destination[:-2]
|
||||
with codecs.open(destination, 'w', encoding='utf-8') as fdst: # type: ignore
|
||||
fdst.write(renderer.render_string(fsrc.read(), context)) # type: ignore
|
||||
fdst.write(renderer.render_string(fsrc.read(), context))
|
||||
else:
|
||||
copyfile(source, destination)
|
||||
|
||||
|
@ -118,5 +118,5 @@ def parse_data_uri(uri):
|
||||
elif prop:
|
||||
mimetype = prop
|
||||
|
||||
image_data = base64.b64decode(data) # type: ignore
|
||||
image_data = base64.b64decode(data)
|
||||
return DataURI(mimetype, charset, image_data)
|
||||
|
@ -11,14 +11,12 @@
|
||||
from __future__ import absolute_import
|
||||
|
||||
import re
|
||||
import warnings
|
||||
|
||||
from six import text_type
|
||||
|
||||
from docutils import nodes
|
||||
|
||||
from sphinx import addnodes
|
||||
from sphinx.deprecation import RemovedInSphinx17Warning
|
||||
from sphinx.locale import pairindextypes
|
||||
from sphinx.util import logging
|
||||
|
||||
@ -354,30 +352,6 @@ def set_role_source_info(inliner, lineno, node):
|
||||
node.source, node.line = inliner.reporter.get_source_and_line(lineno)
|
||||
|
||||
|
||||
def process_only_nodes(doctree, tags):
|
||||
# type: (nodes.Node, Tags) -> None
|
||||
# A comment on the comment() nodes being inserted: replacing by [] would
|
||||
# result in a "Losing ids" exception if there is a target node before
|
||||
# the only node, so we make sure docutils can transfer the id to
|
||||
# something, even if it's just a comment and will lose the id anyway...
|
||||
warnings.warn('process_only_nodes() is deprecated. '
|
||||
'Use sphinx.environment.apply_post_transforms() instead.',
|
||||
RemovedInSphinx17Warning)
|
||||
|
||||
for node in doctree.traverse(addnodes.only):
|
||||
try:
|
||||
ret = tags.eval_condition(node['expr'])
|
||||
except Exception as err:
|
||||
logger.warning('exception while evaluating only directive expression: %s', err,
|
||||
location=node)
|
||||
node.replace_self(node.children or nodes.comment())
|
||||
else:
|
||||
if ret:
|
||||
node.replace_self(node.children or nodes.comment())
|
||||
else:
|
||||
node.replace_self(nodes.comment())
|
||||
|
||||
|
||||
# monkey-patch Element.copy to copy the rawsource and line
|
||||
|
||||
def _new_copy(self):
|
||||
|
@ -88,7 +88,7 @@ class ParallelTasks(object):
|
||||
failed = False
|
||||
except BaseException as err:
|
||||
failed = True
|
||||
errmsg = traceback.format_exception_only(err.__class__, err)[0].strip() # type: ignore # NOQA
|
||||
errmsg = traceback.format_exception_only(err.__class__, err)[0].strip()
|
||||
ret = (errmsg, traceback.format_exc())
|
||||
logging.convert_serializable(collector.logs)
|
||||
pipe.send((failed, collector.logs, ret))
|
||||
|
@ -79,7 +79,7 @@ if PY3:
|
||||
return text_type(tree)
|
||||
else:
|
||||
# no need to refactor on 2.x versions
|
||||
convert_with_2to3 = None # type: ignore
|
||||
convert_with_2to3 = None
|
||||
|
||||
|
||||
# htmlescape()
|
||||
|
@ -36,6 +36,16 @@ except ImportError:
|
||||
# for requests < 2.4.0
|
||||
InsecureRequestWarning = None
|
||||
|
||||
try:
|
||||
from requests.packages.urllib3.exceptions import InsecurePlatformWarning
|
||||
except ImportError:
|
||||
try:
|
||||
# for Debian-jessie
|
||||
from urllib3.exceptions import InsecurePlatformWarning # type: ignore
|
||||
except ImportError:
|
||||
# for requests < 2.4.0
|
||||
InsecurePlatformWarning = None
|
||||
|
||||
# try to load requests[security] (but only if SSL is available)
|
||||
try:
|
||||
import ssl
|
||||
@ -48,8 +58,8 @@ else:
|
||||
pkg_resources.VersionConflict):
|
||||
if not getattr(ssl, 'HAS_SNI', False):
|
||||
# don't complain on each url processed about the SSL issue
|
||||
requests.packages.urllib3.disable_warnings(
|
||||
requests.packages.urllib3.exceptions.InsecurePlatformWarning)
|
||||
if InsecurePlatformWarning:
|
||||
requests.packages.urllib3.disable_warnings(InsecurePlatformWarning)
|
||||
warnings.warn(
|
||||
'Some links may return broken results due to being unable to '
|
||||
'check the Server Name Indication (SNI) in the returned SSL cert '
|
||||
@ -110,7 +120,7 @@ def _get_tls_cacert(url, config):
|
||||
certs = getattr(config, 'tls_cacerts', None)
|
||||
if not certs:
|
||||
return True
|
||||
elif isinstance(certs, (string_types, tuple)): # type: ignore
|
||||
elif isinstance(certs, (string_types, tuple)):
|
||||
return certs # type: ignore
|
||||
else:
|
||||
hostname = urlsplit(url)[1]
|
||||
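
For context, a hedged conf.py sketch matching the branches above (host name and paths are placeholders): tls_cacerts may be a single CA bundle (the string/tuple branch) or a mapping keyed by host name (the else branch).

# conf.py -- illustrative values only
tls_cacerts = {
    'docs.example.org': '/etc/ssl/certs/example-ca.pem',
}
# or one bundle for every host:
# tls_cacerts = '/etc/ssl/certs/ca-bundle.crt'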
|
@ -1,312 +1,280 @@
|
||||
r"""
|
||||
This is based on SmartyPants.py by `Chad Miller`_ <smartypantspy@chad.org>,
|
||||
version 1.5_1.6.
|
||||
|
||||
Copyright and License
|
||||
=====================
|
||||
|
||||
SmartyPants_ license::
|
||||
|
||||
Copyright (c) 2003 John Gruber
|
||||
(http://daringfireball.net/)
|
||||
All rights reserved.
|
||||
|
||||
Redistribution and use in source and binary forms, with or without
|
||||
modification, are permitted provided that the following conditions are
|
||||
met:
|
||||
|
||||
* Redistributions of source code must retain the above copyright
|
||||
notice, this list of conditions and the following disclaimer.
|
||||
|
||||
* Redistributions in binary form must reproduce the above copyright
|
||||
notice, this list of conditions and the following disclaimer in
|
||||
the documentation and/or other materials provided with the
|
||||
distribution.
|
||||
|
||||
* Neither the name "SmartyPants" nor the names of its contributors
|
||||
may be used to endorse or promote products derived from this
|
||||
software without specific prior written permission.
|
||||
|
||||
This software is provided by the copyright holders and contributors "as
|
||||
is" and any express or implied warranties, including, but not limited
|
||||
to, the implied warranties of merchantability and fitness for a
|
||||
particular purpose are disclaimed. In no event shall the copyright
|
||||
owner or contributors be liable for any direct, indirect, incidental,
|
||||
special, exemplary, or consequential damages (including, but not
|
||||
limited to, procurement of substitute goods or services; loss of use,
|
||||
data, or profits; or business interruption) however caused and on any
|
||||
theory of liability, whether in contract, strict liability, or tort
|
||||
(including negligence or otherwise) arising in any way out of the use
|
||||
of this software, even if advised of the possibility of such damage.
|
||||
|
||||
|
||||
smartypants.py license::
|
||||
|
||||
smartypants.py is a derivative work of SmartyPants.
|
||||
|
||||
Redistribution and use in source and binary forms, with or without
|
||||
modification, are permitted provided that the following conditions are
|
||||
met:
|
||||
|
||||
* Redistributions of source code must retain the above copyright
|
||||
notice, this list of conditions and the following disclaimer.
|
||||
|
||||
* Redistributions in binary form must reproduce the above copyright
|
||||
notice, this list of conditions and the following disclaimer in
|
||||
the documentation and/or other materials provided with the
|
||||
distribution.
|
||||
|
||||
This software is provided by the copyright holders and contributors "as
|
||||
is" and any express or implied warranties, including, but not limited
|
||||
to, the implied warranties of merchantability and fitness for a
|
||||
particular purpose are disclaimed. In no event shall the copyright
|
||||
owner or contributors be liable for any direct, indirect, incidental,
|
||||
special, exemplary, or consequential damages (including, but not
|
||||
limited to, procurement of substitute goods or services; loss of use,
|
||||
data, or profits; or business interruption) however caused and on any
|
||||
theory of liability, whether in contract, strict liability, or tort
|
||||
(including negligence or otherwise) arising in any way out of the use
|
||||
of this software, even if advised of the possibility of such damage.
|
||||
|
||||
.. _Chad Miller: http://web.chad.org/
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
sphinx.util.smartypants
|
||||
~~~~~~~~~~~~~~~~~~~~~~~
|
||||
|
||||
This code is copied from docutils’ docutils/utils/smartquotes.py
|
||||
version 1.7.1 (from 2017-03-19). It should be removed in the future.
|
||||
|
||||
:copyright: © 2010 Günter Milde,
|
||||
original `SmartyPants`_: © 2003 John Gruber
|
||||
smartypants.py: © 2004, 2007 Chad Miller
|
||||
:license: Released under the terms of the `2-Clause BSD license`_, in short:
|
||||
|
||||
Copying and distribution of this file, with or without modification,
|
||||
are permitted in any medium without royalty provided the copyright
|
||||
notices and this notice are preserved.
|
||||
This file is offered as-is, without any warranty.
|
||||
|
||||
.. _SmartyPants: http://daringfireball.net/projects/smartypants/
|
||||
.. _2-Clause BSD license: http://www.spdx.org/licenses/BSD-2-Clause
|
||||
|
||||
See the LICENSE file and the original docutils code for details.
|
||||
"""
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import re
|
||||
from docutils.utils import smartquotes
|
||||
from sphinx.util.docutils import __version_info__ as docutils_version
|
||||
|
||||
if False:
|
||||
# For type annotation
|
||||
from typing import Tuple # NOQA
|
||||
if False: # For type annotation
|
||||
from typing import Iterable, Iterator, Tuple # NOQA
|
||||
|
||||
|
||||
def sphinx_smarty_pants(t):
|
||||
# type: (unicode) -> unicode
|
||||
t = t.replace('"', '"')
|
||||
t = educate_dashes_oldschool(t)
|
||||
t = educate_quotes(t) # type: ignore
|
||||
t = t.replace('"', '"')
|
||||
return t
|
||||
|
||||
|
||||
# Constants for quote education.
|
||||
punct_class = r"""[!"#\$\%'()*+,-.\/:;<=>?\@\[\\\]\^_`{|}~]"""
|
||||
end_of_word_class = r"""[\s.,;:!?)]"""
|
||||
close_class = r"""[^\ \t\r\n\[\{\(\-]"""
|
||||
dec_dashes = r"""–|—"""
|
||||
|
||||
# Special case if the very first character is a quote
|
||||
# followed by punctuation at a non-word-break. Close the quotes by brute force:
|
||||
single_quote_start_re = re.compile(r"""^'(?=%s\\B)""" % (punct_class,))
|
||||
double_quote_start_re = re.compile(r"""^"(?=%s\\B)""" % (punct_class,))
|
||||
|
||||
# Special case for double sets of quotes, e.g.:
|
||||
# <p>He said, "'Quoted' words in a larger quote."</p>
|
||||
double_quote_sets_re = re.compile(r""""'(?=\w)""")
|
||||
single_quote_sets_re = re.compile(r"""'"(?=\w)""")
|
||||
|
||||
# Special case for decade abbreviations (the '80s):
|
||||
decade_abbr_re = re.compile(r"""\b'(?=\d{2}s)""")
|
||||
|
||||
# Get most opening double quotes:
|
||||
opening_double_quotes_regex = re.compile(r"""
|
||||
(
|
||||
\s | # a whitespace char, or
|
||||
  | # a non-breaking space entity, or
|
||||
-- | # dashes, or
|
||||
&[mn]dash; | # named dash entities
|
||||
%s | # or decimal entities
|
||||
&\#x201[34]; # or hex
|
||||
)
|
||||
" # the quote
|
||||
(?=\w) # followed by a word character
|
||||
""" % (dec_dashes,), re.VERBOSE)
|
||||
|
||||
# Double closing quotes:
|
||||
closing_double_quotes_regex = re.compile(r"""
|
||||
#(%s)? # character that indicates the quote should be closing
|
||||
"
|
||||
(?=%s)
|
||||
""" % (close_class, end_of_word_class), re.VERBOSE)
|
||||
|
||||
closing_double_quotes_regex_2 = re.compile(r"""
|
||||
(%s) # character that indicates the quote should be closing
|
||||
"
|
||||
""" % (close_class,), re.VERBOSE)
|
||||
|
||||
# Get most opening single quotes:
|
||||
opening_single_quotes_regex = re.compile(r"""
|
||||
(
|
||||
\s | # a whitespace char, or
|
||||
  | # a non-breaking space entity, or
|
||||
-- | # dashes, or
|
||||
&[mn]dash; | # named dash entities
|
||||
%s | # or decimal entities
|
||||
&\#x201[34]; # or hex
|
||||
)
|
||||
' # the quote
|
||||
(?=\w) # followed by a word character
|
||||
""" % (dec_dashes,), re.VERBOSE)
|
||||
|
||||
closing_single_quotes_regex = re.compile(r"""
|
||||
(%s)
|
||||
'
|
||||
(?!\s | s\b | \d)
|
||||
""" % (close_class,), re.VERBOSE)
|
||||
|
||||
closing_single_quotes_regex_2 = re.compile(r"""
|
||||
(%s)
|
||||
'
|
||||
(\s | s\b)
|
||||
""" % (close_class,), re.VERBOSE)
|
||||
|
||||
|
||||
def educate_quotes(s):
|
||||
# type: (str) -> str
|
||||
def educateQuotes(text, language='en'):
|
||||
# type: (unicode, unicode) -> unicode
|
||||
"""
|
||||
Parameter: String.
|
||||
|
||||
Returns: The string, with "educated" curly quote HTML entities.
|
||||
Parameter: - text string (unicode or bytes).
|
||||
- language (`BCP 47` language tag.)
|
||||
Returns: The `text`, with "educated" curly quote characters.
|
||||
|
||||
Example input: "Isn't this fun?"
|
||||
Example output: “Isn’t this fun?”
|
||||
Example output: “Isn’t this fun?“;
|
||||
"""
|
||||
|
||||
smart = smartquotes.smartchars(language)
|
||||
smart.apostrophe = u'’'
|
||||
|
||||
# oldtext = text
|
||||
punct_class = r"""[!"#\$\%'()*+,-.\/:;<=>?\@\[\\\]\^_`{|}~]"""
|
||||
|
||||
# Special case if the very first character is a quote
|
||||
# followed by punctuation at a non-word-break. Close the quotes
|
||||
# by brute force:
|
||||
s = single_quote_start_re.sub("’", s)
|
||||
s = double_quote_start_re.sub("”", s)
|
||||
# followed by punctuation at a non-word-break.
|
||||
# Close the quotes by brute force:
|
||||
text = re.sub(r"""^'(?=%s\\B)""" % (punct_class,), smart.csquote, text)
|
||||
text = re.sub(r"""^"(?=%s\\B)""" % (punct_class,), smart.cpquote, text)
|
||||
|
||||
# Special case for double sets of quotes, e.g.:
|
||||
# <p>He said, "'Quoted' words in a larger quote."</p>
|
||||
s = double_quote_sets_re.sub("“‘", s)
|
||||
s = single_quote_sets_re.sub("‘“", s)
|
||||
text = re.sub(r""""'(?=\w)""", smart.opquote + smart.osquote, text)
|
||||
text = re.sub(r"""'"(?=\w)""", smart.osquote + smart.opquote, text)
|
||||
|
||||
# Special case for decade abbreviations (the '80s):
|
||||
s = decade_abbr_re.sub("’", s)
|
||||
text = re.sub(r"""\b'(?=\d{2}s)""", smart.csquote, text)
|
||||
|
||||
s = opening_single_quotes_regex.sub(r"\1‘", s)
|
||||
s = closing_single_quotes_regex.sub(r"\1’", s)
|
||||
s = closing_single_quotes_regex_2.sub(r"\1’\2", s)
|
||||
close_class = r"""[^\ \t\r\n\[\{\(\-]"""
|
||||
dec_dashes = r"""–|—"""
|
||||
|
||||
# Get most opening single quotes:
|
||||
opening_single_quotes_regex = re.compile(r"""
|
||||
(
|
||||
\s | # a whitespace char, or
|
||||
&nbsp; | # a non-breaking space entity, or
|
||||
-- | # dashes, or
|
||||
&[mn]dash; | # named dash entities
|
||||
%s | # or decimal entities
|
||||
&\#x201[34]; # or hex
|
||||
)
|
||||
' # the quote
|
||||
(?=\w) # followed by a word character
|
||||
""" % (dec_dashes,), re.VERBOSE | re.UNICODE)
|
||||
text = opening_single_quotes_regex.sub(r'\1' + smart.osquote, text)
|
||||
|
||||
# In many locales, single closing quotes are different from apostrophe:
|
||||
if smart.csquote != smart.apostrophe:
|
||||
apostrophe_regex = re.compile(r"(?<=(\w|\d))'(?=\w)", re.UNICODE)
|
||||
text = apostrophe_regex.sub(smart.apostrophe, text)
|
||||
|
||||
closing_single_quotes_regex = re.compile(r"""
|
||||
(%s)
|
||||
'
|
||||
(?!\s | # whitespace
|
||||
s\b |
|
||||
\d # digits ('80s)
|
||||
)
|
||||
""" % (close_class,), re.VERBOSE | re.UNICODE)
|
||||
text = closing_single_quotes_regex.sub(r'\1' + smart.csquote, text)
|
||||
|
||||
closing_single_quotes_regex = re.compile(r"""
|
||||
(%s)
|
||||
'
|
||||
(\s | s\b)
|
||||
""" % (close_class,), re.VERBOSE | re.UNICODE)
|
||||
text = closing_single_quotes_regex.sub(r'\1%s\2' % smart.csquote, text)
|
||||
|
||||
# Any remaining single quotes should be opening ones:
|
||||
s = s.replace("'", "‘")
|
||||
text = re.sub(r"""'""", smart.osquote, text)
|
||||
|
||||
s = opening_double_quotes_regex.sub(r"\1“", s)
|
||||
s = closing_double_quotes_regex.sub(r"”", s)
|
||||
s = closing_double_quotes_regex_2.sub(r"\1”", s)
|
||||
# Get most opening double quotes:
|
||||
opening_double_quotes_regex = re.compile(r"""
|
||||
(
|
||||
\s | # a whitespace char, or
|
||||
&nbsp; | # a non-breaking space entity, or
|
||||
-- | # dashes, or
|
||||
&[mn]dash; | # named dash entities
|
||||
%s | # or decimal entities
|
||||
&\#x201[34]; # or hex
|
||||
)
|
||||
" # the quote
|
||||
(?=\w) # followed by a word character
|
||||
""" % (dec_dashes,), re.VERBOSE)
|
||||
text = opening_double_quotes_regex.sub(r'\1' + smart.opquote, text)
|
||||
|
||||
# Double closing quotes:
|
||||
closing_double_quotes_regex = re.compile(r"""
|
||||
#(%s)? # character that indicates the quote should be closing
|
||||
"
|
||||
(?=\s)
|
||||
""" % (close_class,), re.VERBOSE)
|
||||
text = closing_double_quotes_regex.sub(smart.cpquote, text)
|
||||
|
||||
closing_double_quotes_regex = re.compile(r"""
|
||||
(%s) # character that indicates the quote should be closing
|
||||
"
|
||||
""" % (close_class,), re.VERBOSE)
|
||||
text = closing_double_quotes_regex.sub(r'\1' + smart.cpquote, text)
|
||||
|
||||
# Any remaining quotes should be opening ones.
|
||||
return s.replace('"', "“")
|
||||
text = re.sub(r'"', smart.opquote, text)
|
||||
|
||||
return text
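
A hedged usage sketch of the function above; the output shown follows its own docstring example:

from sphinx.util.smartypants import educateQuotes

print(educateQuotes('"Isn\'t this fun?"', language='en'))
# roughly: “Isn’t this fun?”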
|
||||
|
||||
|
||||
def educate_quotes_latex(s, dquotes=("``", "''")):
|
||||
# type: (str, Tuple[str, str]) -> unicode
|
||||
"""
|
||||
Parameter: String.
|
||||
|
||||
Returns: The string, with double quotes corrected to LaTeX quotes.
|
||||
|
||||
Example input: "Isn't this fun?"
|
||||
Example output: ``Isn't this fun?'';
|
||||
def educate_tokens(text_tokens, attr='1', language='en'):
|
||||
# type: (Iterable[Tuple[str, unicode]], unicode, unicode) -> Iterator
|
||||
"""Return iterator that "educates" the items of `text_tokens`.
|
||||
"""
|
||||
|
||||
# Special case if the very first character is a quote
|
||||
# followed by punctuation at a non-word-break. Close the quotes
|
||||
# by brute force:
|
||||
s = single_quote_start_re.sub("\x04", s)
|
||||
s = double_quote_start_re.sub("\x02", s)
|
||||
# Parse attributes:
|
||||
# 0 : do nothing
|
||||
# 1 : set all
|
||||
# 2 : set all, using old school en- and em- dash shortcuts
|
||||
# 3 : set all, using inverted old school en and em- dash shortcuts
|
||||
#
|
||||
# q : quotes
|
||||
# b : backtick quotes (``double'' only)
|
||||
# B : backtick quotes (``double'' and `single')
|
||||
# d : dashes
|
||||
# D : old school dashes
|
||||
# i : inverted old school dashes
|
||||
# e : ellipses
|
||||
# w : convert " entities to " for Dreamweaver users
|
||||
|
||||
# Special case for double sets of quotes, e.g.:
|
||||
# <p>He said, "'Quoted' words in a larger quote."</p>
|
||||
s = double_quote_sets_re.sub("\x01\x03", s)
|
||||
s = single_quote_sets_re.sub("\x03\x01", s)
|
||||
convert_quot = False # translate " entities into normal quotes?
|
||||
do_dashes = 0
|
||||
do_backticks = 0
|
||||
do_quotes = False
|
||||
do_ellipses = False
|
||||
do_stupefy = False
|
||||
|
||||
# Special case for decade abbreviations (the '80s):
|
||||
s = decade_abbr_re.sub("\x04", s)
|
||||
if attr == "0": # Do nothing.
|
||||
pass
|
||||
elif attr == "1": # Do everything, turn all options on.
|
||||
do_quotes = True
|
||||
do_backticks = 1
|
||||
do_dashes = 1
|
||||
do_ellipses = True
|
||||
elif attr == "2":
|
||||
# Do everything, turn all options on, use old school dash shorthand.
|
||||
do_quotes = True
|
||||
do_backticks = 1
|
||||
do_dashes = 2
|
||||
do_ellipses = True
|
||||
elif attr == "3":
|
||||
# Do everything, use inverted old school dash shorthand.
|
||||
do_quotes = True
|
||||
do_backticks = 1
|
||||
do_dashes = 3
|
||||
do_ellipses = True
|
||||
elif attr == "-1": # Special "stupefy" mode.
|
||||
do_stupefy = True
|
||||
else:
|
||||
if "q" in attr:
|
||||
do_quotes = True
|
||||
if "b" in attr:
|
||||
do_backticks = 1
|
||||
if "B" in attr:
|
||||
do_backticks = 2
|
||||
if "d" in attr:
|
||||
do_dashes = 1
|
||||
if "D" in attr:
|
||||
do_dashes = 2
|
||||
if "i" in attr:
|
||||
do_dashes = 3
|
||||
if "e" in attr:
|
||||
do_ellipses = True
|
||||
if "w" in attr:
|
||||
convert_quot = True
|
||||
|
||||
s = opening_single_quotes_regex.sub("\\1\x03", s)
|
||||
s = closing_single_quotes_regex.sub("\\1\x04", s)
|
||||
s = closing_single_quotes_regex_2.sub("\\1\x04\\2", s)
|
||||
prev_token_last_char = " "
|
||||
# Last character of the previous text token. Used as
|
||||
# context to curl leading quote characters correctly.
|
||||
|
||||
# Any remaining single quotes should be opening ones:
|
||||
s = s.replace("'", "\x03")
|
||||
for (ttype, text) in text_tokens:
|
||||
|
||||
s = opening_double_quotes_regex.sub("\\1\x01", s)
|
||||
s = closing_double_quotes_regex.sub("\x02", s)
|
||||
s = closing_double_quotes_regex_2.sub("\\1\x02", s)
|
||||
# skip HTML and/or XML tags as well as empty text tokens
|
||||
# without updating the last character
|
||||
if ttype == 'tag' or not text:
|
||||
yield text
|
||||
continue
|
||||
|
||||
# Any remaining quotes should be opening ones.
|
||||
s = s.replace('"', "\x01")
|
||||
# skip literal text (math, literal, raw, ...)
|
||||
if ttype == 'literal':
|
||||
prev_token_last_char = text[-1:]
|
||||
yield text
|
||||
continue
|
||||
|
||||
# Finally, replace all helpers with quotes.
|
||||
return s.replace("\x01", dquotes[0]).replace("\x02", dquotes[1]).\
|
||||
replace("\x03", "`").replace("\x04", "'")
|
||||
last_char = text[-1:] # Remember last char before processing.
|
||||
|
||||
text = smartquotes.processEscapes(text)
|
||||
|
||||
if convert_quot:
|
||||
text = re.sub('"', '"', text)
|
||||
|
||||
if do_dashes == 1:
|
||||
text = smartquotes.educateDashes(text)
|
||||
elif do_dashes == 2:
|
||||
text = smartquotes.educateDashesOldSchool(text)
|
||||
elif do_dashes == 3:
|
||||
text = smartquotes.educateDashesOldSchoolInverted(text)
|
||||
|
||||
if do_ellipses:
|
||||
text = smartquotes.educateEllipses(text)
|
||||
|
||||
# Note: backticks need to be processed before quotes.
|
||||
if do_backticks:
|
||||
text = smartquotes.educateBackticks(text, language)
|
||||
|
||||
if do_backticks == 2:
|
||||
text = smartquotes.educateSingleBackticks(text, language)
|
||||
|
||||
if do_quotes:
|
||||
# Replace plain quotes to prevent conversion to
|
||||
# 2-character sequence in French.
|
||||
context = prev_token_last_char.replace('"', ';').replace("'", ';')
|
||||
text = educateQuotes(context + text, language)[1:]
|
||||
|
||||
if do_stupefy:
|
||||
text = smartquotes.stupefyEntities(text, language)
|
||||
|
||||
# Remember last char as context for the next token
|
||||
prev_token_last_char = last_char
|
||||
|
||||
text = smartquotes.processEscapes(text, restore=True)
|
||||
|
||||
yield text
|
||||
|
||||
|
||||
def educate_backticks(s):
|
||||
# type: (unicode) -> unicode
|
||||
"""
|
||||
Parameter: String.
|
||||
Returns: The string, with ``backticks'' -style double quotes
|
||||
translated into HTML curly quote entities.
|
||||
Example input: ``Isn't this fun?''
|
||||
Example output: “Isn't this fun?”
|
||||
"""
|
||||
return s.replace("``", "“").replace("''", "”")
|
||||
if docutils_version < (0, 13, 2):
|
||||
# Monkey patch the old docutils versions to fix the issue mentioned
|
||||
# at https://sourceforge.net/p/docutils/bugs/313/
|
||||
smartquotes.educateQuotes = educateQuotes
|
||||
|
||||
# And the one mentioned at https://sourceforge.net/p/docutils/bugs/317/
|
||||
smartquotes.educate_tokens = educate_tokens
|
||||
|
||||
def educate_single_backticks(s):
|
||||
# type: (unicode) -> unicode
|
||||
"""
|
||||
Parameter: String.
|
||||
Returns: The string, with `backticks' -style single quotes
|
||||
translated into HTML curly quote entities.
|
||||
|
||||
Example input: `Isn't this fun?'
|
||||
Example output: ‘Isn’t this fun?’
|
||||
"""
|
||||
return s.replace('`', "‘").replace("'", "’")
|
||||
|
||||
|
||||
def educate_dashes_oldschool(s):
|
||||
# type: (unicode) -> unicode
|
||||
"""
|
||||
Parameter: String.
|
||||
|
||||
Returns: The string, with each instance of "--" translated to
|
||||
an en-dash HTML entity, and each "---" translated to
|
||||
an em-dash HTML entity.
|
||||
"""
|
||||
return s.replace('---', "—").replace('--', "–")
|
||||
|
||||
|
||||
def educate_dashes_oldschool_inverted(s):
|
||||
# type: (unicode) -> unicode
|
||||
"""
|
||||
Parameter: String.
|
||||
|
||||
Returns: The string, with each instance of "--" translated to
|
||||
an em-dash HTML entity, and each "---" translated to
|
||||
an en-dash HTML entity. Two reasons why: First, unlike the
|
||||
en- and em-dash syntax supported by
|
||||
educate_dashes_oldschool(), it's compatible with existing
|
||||
entries written before SmartyPants 1.1, back when "--" was
|
||||
only used for em-dashes. Second, em-dashes are more
|
||||
common than en-dashes, and so it sort of makes sense that
|
||||
the shortcut should be shorter to type. (Thanks to Aaron
|
||||
Swartz for the idea.)
|
||||
"""
|
||||
return s.replace('---', "–").replace('--', "—")
|
||||
|
||||
|
||||
def educate_ellipses(s):
|
||||
# type: (unicode) -> unicode
|
||||
"""
|
||||
Parameter: String.
|
||||
Returns: The string, with each instance of "..." translated to
|
||||
an ellipsis HTML entity.
|
||||
|
||||
Example input: Huh...?
|
||||
Example output: Huh…?
|
||||
"""
|
||||
return s.replace('...', "…").replace('. . .', "…")
|
||||
# Fix the issue with French quotes mentioned at
|
||||
# https://sourceforge.net/p/docutils/mailman/message/35760696/
|
||||
quotes = smartquotes.smartchars.quotes
|
||||
quotes['fr'] = (u'«\u00a0', u'\u00a0»', u'“', u'”')
|
||||
quotes['fr-ch'] = u'«»‹›'
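
A hedged usage sketch (not part of the patch) of the replacement educate_tokens() installed above for docutils releases older than 0.13.2; the token types follow the docutils smart_quotes transform:

from sphinx.util.smartypants import educate_tokens

# 'literal' tokens pass through untouched, plain 'text' tokens get educated.
tokens = [('text', '"Smart" -- dashes and ellipses... '), ('literal', '"raw"')]
print(''.join(educate_tokens(tokens, attr='2', language='en')))
# roughly: “Smart” – dashes and ellipses… "raw"   (attr '2' = old-school dashes)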
|
||||
|
@ -7,4 +7,8 @@
|
||||
:license: BSD, see LICENSE for details.
|
||||
"""
|
||||
|
||||
from sphinxcontrib.websupport.utils import is_commentable # NOQA
|
||||
try:
|
||||
from sphinxcontrib.websupport.utils import is_commentable # NOQA
|
||||
except ImportError:
|
||||
def is_commentable(node):
|
||||
raise RuntimeError
|
||||
|
@ -12,11 +12,17 @@
|
||||
import warnings
|
||||
|
||||
from sphinx.deprecation import RemovedInSphinx20Warning
|
||||
from sphinxcontrib.websupport import WebSupport # NOQA
|
||||
from sphinxcontrib.websupport import errors # NOQA
|
||||
from sphinxcontrib.websupport.search import BaseSearch, SEARCH_ADAPTERS # NOQA
|
||||
from sphinxcontrib.websupport.storage import StorageBackend # NOQA
|
||||
|
||||
warnings.warn('sphinx.websupport module is now provided as sphinxcontrib.webuspport. '
|
||||
'sphinx.websupport will be removed in Sphinx-2.0. Please use it instaed',
|
||||
RemovedInSphinx20Warning)
|
||||
try:
|
||||
from sphinxcontrib.websupport import WebSupport # NOQA
|
||||
from sphinxcontrib.websupport import errors # NOQA
|
||||
from sphinxcontrib.websupport.search import BaseSearch, SEARCH_ADAPTERS # NOQA
|
||||
from sphinxcontrib.websupport.storage import StorageBackend # NOQA
|
||||
|
||||
warnings.warn('sphinx.websupport module is now provided as sphinxcontrib-websupport. '
|
||||
'sphinx.websupport will be removed at Sphinx-2.0. '
|
||||
'Please use the package instead.',
|
||||
RemovedInSphinx20Warning)
|
||||
except ImportError:
|
||||
warnings.warn('Since Sphinx-1.6, sphinx.websupport module is now separated to '
|
||||
'sphinxcontrib-websupport package. Please add it into your dependency list.')
|
||||
|
@ -22,7 +22,6 @@ from sphinx import addnodes
|
||||
from sphinx.locale import admonitionlabels, _
|
||||
from sphinx.util import logging
|
||||
from sphinx.util.images import get_image_size
|
||||
from sphinx.util.smartypants import sphinx_smarty_pants
|
||||
|
||||
if False:
|
||||
# For type annotation
|
||||
@ -74,7 +73,6 @@ class HTMLTranslator(BaseTranslator):
|
||||
# type: (StandaloneHTMLBuilder, Any, Any) -> None
|
||||
BaseTranslator.__init__(self, *args, **kwds)
|
||||
self.highlighter = builder.highlighter
|
||||
self.no_smarty = 0
|
||||
self.builder = builder
|
||||
self.highlightlang = self.highlightlang_base = \
|
||||
builder.config.highlight_language
|
||||
@ -685,10 +683,6 @@ class HTMLTranslator(BaseTranslator):
|
||||
BaseTranslator.visit_option_group(self, node)
|
||||
self.context[-2] = self.context[-2].replace(' ', ' ')
|
||||
|
||||
def bulk_text_processor(self, text):
|
||||
# type: (unicode) -> unicode
|
||||
return text
|
||||
|
||||
# overwritten
|
||||
def visit_Text(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
@ -710,8 +704,6 @@ class HTMLTranslator(BaseTranslator):
|
||||
else:
|
||||
if self.in_mailto and self.settings.cloak_email_addresses:
|
||||
encoded = self.cloak_email(encoded)
|
||||
else:
|
||||
encoded = self.bulk_text_processor(encoded)
|
||||
self.body.append(encoded)
|
||||
|
||||
def visit_note(self, node):
|
||||
@ -786,7 +778,6 @@ class HTMLTranslator(BaseTranslator):
|
||||
# type: (nodes.Node) -> None
|
||||
self.depart_admonition(node)
|
||||
|
||||
# these are only handled specially in the SmartyPantsHTMLTranslator
|
||||
def visit_literal_emphasis(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
return self.visit_emphasis(node)
|
||||
@ -875,94 +866,3 @@ class HTMLTranslator(BaseTranslator):
|
||||
def unknown_visit(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
raise NotImplementedError('Unknown node: ' + node.__class__.__name__)
|
||||
|
||||
|
||||
class SmartyPantsHTMLTranslator(HTMLTranslator):
|
||||
"""
|
||||
Handle ordinary text via smartypants, converting quotes and dashes
|
||||
to the correct entities.
|
||||
"""
|
||||
|
||||
def __init__(self, *args, **kwds):
|
||||
# type: (Any, Any) -> None
|
||||
self.no_smarty = 0
|
||||
HTMLTranslator.__init__(self, *args, **kwds)
|
||||
|
||||
def visit_literal(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
try:
|
||||
# this raises SkipNode
|
||||
HTMLTranslator.visit_literal(self, node)
|
||||
finally:
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_literal_block(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
try:
|
||||
HTMLTranslator.visit_literal_block(self, node)
|
||||
except nodes.SkipNode:
|
||||
# HTMLTranslator raises SkipNode for simple literal blocks,
|
||||
# but not for parsed literal blocks
|
||||
self.no_smarty -= 1
|
||||
raise
|
||||
|
||||
def depart_literal_block(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
HTMLTranslator.depart_literal_block(self, node)
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_literal_emphasis(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
self.visit_emphasis(node)
|
||||
|
||||
def depart_literal_emphasis(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.depart_emphasis(node)
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_literal_strong(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
self.visit_strong(node)
|
||||
|
||||
def depart_literal_strong(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.depart_strong(node)
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_desc_signature(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
HTMLTranslator.visit_desc_signature(self, node)
|
||||
|
||||
def depart_desc_signature(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty -= 1
|
||||
HTMLTranslator.depart_desc_signature(self, node)
|
||||
|
||||
def visit_productionlist(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
try:
|
||||
HTMLTranslator.visit_productionlist(self, node)
|
||||
finally:
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_option(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
HTMLTranslator.visit_option(self, node)
|
||||
|
||||
def depart_option(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty -= 1
|
||||
HTMLTranslator.depart_option(self, node)
|
||||
|
||||
def bulk_text_processor(self, text):
|
||||
# type: (unicode) -> unicode
|
||||
if self.no_smarty <= 0:
|
||||
return sphinx_smarty_pants(text)
|
||||
return text
|
||||
|
@ -21,7 +21,6 @@ from sphinx import addnodes
|
||||
from sphinx.locale import admonitionlabels, _
|
||||
from sphinx.util import logging
|
||||
from sphinx.util.images import get_image_size
|
||||
from sphinx.util.smartypants import sphinx_smarty_pants
|
||||
|
||||
if False:
|
||||
# For type annotation
|
||||
@ -44,7 +43,6 @@ class HTML5Translator(BaseTranslator):
|
||||
# type: (StandaloneHTMLBuilder, Any, Any) -> None
|
||||
BaseTranslator.__init__(self, *args, **kwds)
|
||||
self.highlighter = builder.highlighter
|
||||
self.no_smarty = 0
|
||||
self.builder = builder
|
||||
self.highlightlang = self.highlightlang_base = \
|
||||
builder.config.highlight_language
|
||||
@ -628,10 +626,6 @@ class HTML5Translator(BaseTranslator):
|
||||
# type: (nodes.Node) -> None
|
||||
self.body.append('</td>')
|
||||
|
||||
def bulk_text_processor(self, text):
|
||||
# type: (unicode) -> unicode
|
||||
return text
|
||||
|
||||
# overwritten
|
||||
def visit_Text(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
@ -653,8 +647,6 @@ class HTML5Translator(BaseTranslator):
|
||||
else:
|
||||
if self.in_mailto and self.settings.cloak_email_addresses:
|
||||
encoded = self.cloak_email(encoded)
|
||||
else:
|
||||
encoded = self.bulk_text_processor(encoded)
|
||||
self.body.append(encoded)
|
||||
|
||||
def visit_note(self, node):
|
||||
@ -729,7 +721,6 @@ class HTML5Translator(BaseTranslator):
|
||||
# type: (nodes.Node) -> None
|
||||
self.depart_admonition(node)
|
||||
|
||||
# these are only handled specially in the SmartyPantsHTML5Translator
|
||||
def visit_literal_emphasis(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
return self.visit_emphasis(node)
|
||||
@ -830,94 +821,3 @@ class HTML5Translator(BaseTranslator):
|
||||
def unknown_visit(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
raise NotImplementedError('Unknown node: ' + node.__class__.__name__)
|
||||
|
||||
|
||||
class SmartyPantsHTML5Translator(HTML5Translator):
|
||||
"""
|
||||
Handle ordinary text via smartypants, converting quotes and dashes
|
||||
to the correct entities.
|
||||
"""
|
||||
|
||||
def __init__(self, *args, **kwds):
|
||||
# type: (Any, Any) -> None
|
||||
self.no_smarty = 0
|
||||
HTML5Translator.__init__(self, *args, **kwds)
|
||||
|
||||
def visit_literal(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
try:
|
||||
# this raises SkipNode
|
||||
HTML5Translator.visit_literal(self, node)
|
||||
finally:
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_literal_block(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
try:
|
||||
HTML5Translator.visit_literal_block(self, node)
|
||||
except nodes.SkipNode:
|
||||
# HTML5Translator raises SkipNode for simple literal blocks,
|
||||
# but not for parsed literal blocks
|
||||
self.no_smarty -= 1
|
||||
raise
|
||||
|
||||
def depart_literal_block(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
HTML5Translator.depart_literal_block(self, node)
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_literal_emphasis(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
self.visit_emphasis(node)
|
||||
|
||||
def depart_literal_emphasis(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.depart_emphasis(node)
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_literal_strong(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
self.visit_strong(node)
|
||||
|
||||
def depart_literal_strong(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.depart_strong(node)
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_desc_signature(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
HTML5Translator.visit_desc_signature(self, node)
|
||||
|
||||
def depart_desc_signature(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty -= 1
|
||||
HTML5Translator.depart_desc_signature(self, node)
|
||||
|
||||
def visit_productionlist(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
try:
|
||||
HTML5Translator.visit_productionlist(self, node)
|
||||
finally:
|
||||
self.no_smarty -= 1
|
||||
|
||||
def visit_option(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty += 1
|
||||
HTML5Translator.visit_option(self, node)
|
||||
|
||||
def depart_option(self, node):
|
||||
# type: (nodes.Node) -> None
|
||||
self.no_smarty -= 1
|
||||
HTML5Translator.depart_option(self, node)
|
||||
|
||||
def bulk_text_processor(self, text):
|
||||
# type: (unicode) -> unicode
|
||||
if self.no_smarty <= 0:
|
||||
return sphinx_smarty_pants(text)
|
||||
return text
|
||||
|
@ -30,7 +30,6 @@ from sphinx.util.i18n import format_date
from sphinx.util.nodes import clean_astext, traverse_parent
from sphinx.util.template import LaTeXRenderer
from sphinx.util.texescape import tex_escape_map, tex_replace_map
from sphinx.util.smartypants import educate_quotes_latex

if False:
    # For type annotation
@ -58,7 +57,7 @@ DEFAULT_SETTINGS = {
    'classoptions': '',
    'extraclassoptions': '',
    'maxlistdepth': '',
    'sphinxpkgoptions': 'dontkeepoldnames',
    'sphinxpkgoptions': '',
    'sphinxsetup': '',
    'passoptionstopackages': '',
    'geometry': '\\usepackage{geometry}',
@ -548,8 +547,6 @@ class LaTeXTranslator(nodes.NodeVisitor):
        self.elements.update({
            'releasename': _('Release'),
        })
        if builder.config.latex_keep_old_macro_names:
            self.elements['sphinxpkgoptions'] = ''
        if document.settings.docclass == 'howto':
            docclass = builder.config.latex_docclass.get('howto', 'article')
        else:
@ -558,7 +555,7 @@ class LaTeXTranslator(nodes.NodeVisitor):
        if builder.config.today:
            self.elements['date'] = builder.config.today
        else:
            self.elements['date'] = format_date(builder.config.today_fmt or _('%b %d, %Y'),
            self.elements['date'] = format_date(builder.config.today_fmt or _('%b %d, %Y'), # type: ignore # NOQA
                                                language=builder.config.language)
        if builder.config.latex_logo:
            # no need for \\noindent here, used in flushright
@ -1834,12 +1831,10 @@ class LaTeXTranslator(nodes.NodeVisitor):
            self.unrestrict_footnote(node)

    def visit_legend(self, node):
        # type: (nodes.Node) -> None
        self.body.append('{\\small ')
        self.body.append('\n\\begin{sphinxlegend}')

    def depart_legend(self, node):
        # type: (nodes.Node) -> None
        self.body.append('}')
        self.body.append('\\end{sphinxlegend}\n')

    def visit_admonition(self, node):
        # type: (nodes.Node) -> None
@ -2533,10 +2528,6 @@ class LaTeXTranslator(nodes.NodeVisitor):
    def visit_Text(self, node):
        # type: (nodes.Node) -> None
        text = self.encode(node.astext())
        if not self.no_contractions and not self.in_parsed_literal:
            text = educate_quotes_latex(text, # type: ignore
                                        dquotes=("\\sphinxquotedblleft{}",
                                                 "\\sphinxquotedblright{}"))
        self.body.append(text)

    def depart_Text(self, node):
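The ``visit_Text`` hunk above routes every text node through ``educate_quotes_latex`` with LaTeX-safe replacement commands. A small usage sketch, assuming only the call shape visible in the hunk:

from sphinx.util.smartypants import educate_quotes_latex

text = 'She said "hello" to the "world"'
# straight double quotes become \sphinxquotedblleft{} / \sphinxquotedblright{}
print(educate_quotes_latex(text,
                           dquotes=("\\sphinxquotedblleft{}",
                                    "\\sphinxquotedblright{}")))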
@ -107,7 +107,7 @@ class ManualPageTranslator(BaseTranslator):
        if builder.config.today:
            self._docinfo['date'] = builder.config.today
        else:
            self._docinfo['date'] = format_date(builder.config.today_fmt or _('%b %d, %Y'),
            self._docinfo['date'] = format_date(builder.config.today_fmt or _('%b %d, %Y'), # type: ignore # NOQA
                                                language=builder.config.language)
        self._docinfo['copyright'] = builder.config.copyright
        self._docinfo['version'] = builder.config.version

@ -238,7 +238,7 @@ class TexinfoTranslator(nodes.NodeVisitor):
            'project': self.escape(self.builder.config.project),
            'copyright': self.escape(self.builder.config.copyright),
            'date': self.escape(self.builder.config.today or
                                format_date(self.builder.config.today_fmt or _('%b %d, %Y'),
                                format_date(self.builder.config.today_fmt or _('%b %d, %Y'), # type: ignore # NOQA
                                            language=self.builder.config.language))
        })
        # title

@ -661,7 +661,7 @@ class TextTranslator(nodes.NodeVisitor):
            self.add_text(''.join(out) + self.nl)

        def writerow(row):
            # type: (list[List[unicode]]) -> None
            # type: (List[List[unicode]]) -> None
            lines = zip_longest(*row)
            for line in lines:
                out = ['|']
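The LaTeX, manpage and Texinfo writers above all resolve the document date the same way: an explicit ``today`` config value wins, otherwise ``today_fmt`` (or a default format) is rendered through ``format_date`` with the configured language. A hedged stand-alone sketch of that fallback order (``resolve_date`` is illustrative only; the real ``format_date`` also handles locale-aware month names):

from datetime import datetime


def resolve_date(today, today_fmt=None):
    # mirrors the fallback order used in the writers; not the real helper
    if today:
        return today
    return datetime.now().strftime(today_fmt or '%b %d, %Y')


print(resolve_date(''))                         # e.g. 'May 01, 2017'
print(resolve_date('release day', '%Y-%m-%d'))  # 'release day'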
@ -8,6 +8,8 @@ Pygments>=2.0
docutils>=0.11
snowballstemmer>=1.1
babel
sqlalchemy>=0.9
whoosh>=2.0
alabaster
sphinx_rtd_theme
sphinxcontrib-websupport
@ -426,6 +426,26 @@ More domains:
.. default-role::


Smart quotes
------------

* Smart "quotes" in English 'text'.
* Smart --- long and -- short dashes.
* Ellipsis...
* No smartypants in literal blocks: ``foo--"bar"...``.

.. only:: html

   .. LaTeX does not like Cyrillic letters in this test, so it is HTML only.

   .. rst-class:: language-ru

   Этот "абзац" должен использовать 'русские' кавычки.

   .. rst-class:: language-fr

   Il dit : "C'est 'super' !"

.. rubric:: Footnotes

.. [#] Like footnotes.
@ -24,6 +24,7 @@ sys.path.insert(0, os.path.abspath(os.path.join(testroot, os.path.pardir)))
# filter warnings of test dependencies
warnings.filterwarnings('ignore', category=DeprecationWarning, module='site')  # virtualenv
warnings.filterwarnings('ignore', category=ImportWarning, module='backports')
warnings.filterwarnings('ignore', category=ImportWarning, module='pkgutil')
warnings.filterwarnings('ignore', category=ImportWarning, module='pytest_cov')
warnings.filterwarnings('ignore', category=PendingDeprecationWarning, module=r'_pytest\..*')
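The conftest hunk above silences warnings emitted by test dependencies; each filter matches on the warning category plus a regex over the emitting module's name. A minimal illustration of the mechanism (the module name here is arbitrary):

import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    # ignore DeprecationWarning raised from modules whose name matches __main__
    warnings.filterwarnings('ignore', category=DeprecationWarning, module='__main__')
    warnings.warn('old API', DeprecationWarning)

print(len(caught))  # 0: the warning matched the ignore filter and was dropped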
@ -28,15 +28,6 @@ def test_html_translator(app, status, warning):
    # no set_translator()
    translator_class = app.builder.get_translator_class()
    assert translator_class
    assert translator_class.__name__ == 'SmartyPantsHTMLTranslator'


@pytest.mark.sphinx('html', confoverrides={
    'html_use_smartypants': False})
def test_html_with_smartypants(app, status, warning):
    # no set_translator(), html_use_smartypants=False
    translator_class = app.builder.get_translator_class()
    assert translator_class
    assert translator_class.__name__ == 'HTMLTranslator'


@ -58,7 +58,7 @@ def nonascii_srcdir(request):
    [
        # note: no 'html' - if it's ok with dirhtml it's ok with html
        'dirhtml', 'singlehtml', 'latex', 'texinfo', 'pickle', 'json', 'text',
        'htmlhelp', 'qthelp', 'epub2', 'epub', 'applehelp', 'changes', 'xml',
        'htmlhelp', 'qthelp', 'epub', 'applehelp', 'changes', 'xml',
        'pseudoxml', 'man', 'linkcheck',
    ],
)
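The first two tests above only assert which translator class the builder reports; the user-facing counterpart is registering a translator with ``app.set_translator()`` from a conf.py or extension ``setup()``. A toy sketch of that registration (``ShoutingHTMLTranslator`` is made up for illustration, not part of Sphinx):

from sphinx.writers.html import HTMLTranslator


class ShoutingHTMLTranslator(HTMLTranslator):
    """Toy translator: upper-cases every text node (illustration only)."""

    def visit_Text(self, node):
        self.body.append(self.encode(node.astext()).upper())

    def depart_Text(self, node):
        pass


def setup(app):
    # after this, app.builder.get_translator_class() reports the custom class
    app.set_translator('html', ShoutingHTMLTranslator)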
@ -162,7 +162,7 @@ def test_nested_toc(app):
|
||||
app.build()
|
||||
|
||||
# toc.ncx
|
||||
toc = EPUBElementTree.fromstring((app.outdir / 'toc.ncx').text())
|
||||
toc = EPUBElementTree.fromstring((app.outdir / 'toc.ncx').bytes())
|
||||
assert toc.find("./ncx:docTitle/ncx:text").text == 'Python documentation'
|
||||
|
||||
# toc.ncx / navPoint
|
||||
@ -175,7 +175,7 @@ def test_nested_toc(app):
|
||||
navpoints = toc.findall("./ncx:navMap/ncx:navPoint")
|
||||
assert len(navpoints) == 4
|
||||
assert navinfo(navpoints[0]) == ('navPoint1', '1', 'index.xhtml',
|
||||
"Welcome to Sphinx Tests's documentation!")
|
||||
u"Welcome to Sphinx Tests’s documentation!")
|
||||
assert navpoints[0].findall("./ncx:navPoint") == []
|
||||
|
||||
# toc.ncx / nested navPoints
|
||||
@ -192,11 +192,11 @@ def test_nested_toc(app):
|
||||
anchor = elem.find("./xhtml:a")
|
||||
return (anchor.get('href'), anchor.text)
|
||||
|
||||
nav = EPUBElementTree.fromstring((app.outdir / 'nav.xhtml').text())
|
||||
nav = EPUBElementTree.fromstring((app.outdir / 'nav.xhtml').bytes())
|
||||
toc = nav.findall("./xhtml:body/xhtml:nav/xhtml:ol/xhtml:li")
|
||||
assert len(toc) == 4
|
||||
assert navinfo(toc[0]) == ('index.xhtml',
|
||||
"Welcome to Sphinx Tests's documentation!")
|
||||
u"Welcome to Sphinx Tests’s documentation!")
|
||||
assert toc[0].findall("./xhtml:ol") == []
|
||||
|
||||
# nav.xhtml / nested toc
|
||||
|
@ -297,6 +297,13 @@ def test_static_output(app):
|
||||
# tests for ``any`` role
|
||||
(".//a[@href='#with']/span", 'headings'),
|
||||
(".//a[@href='objects.html#func_without_body']/code/span", 'objects'),
|
||||
# tests for smartypants
|
||||
(".//li", u'Smart “quotes” in English ‘text’.'),
|
||||
(".//li", u'Smart — long and – short dashes.'),
|
||||
(".//li", u'Ellipsis…'),
|
||||
(".//li//code//span[@class='pre']", 'foo--"bar"...'),
|
||||
(".//p", u'Этот «абзац» должен использовать „русские“ кавычки.'),
|
||||
(".//p", u'Il dit : « C’est “super” ! »'),
|
||||
],
|
||||
'objects.html': [
|
||||
(".//dt[@id='mod.Cls.meth1']", ''),
|
||||
|
@ -14,19 +14,13 @@
|
||||
:license: BSD, see LICENSE for details.
|
||||
"""
|
||||
|
||||
import os
|
||||
import re
|
||||
from itertools import cycle, chain
|
||||
import xml.etree.cElementTree as ElementTree
|
||||
|
||||
from six import PY3
|
||||
import pytest
|
||||
from html5lib import getTreeBuilder, HTMLParser
|
||||
|
||||
from sphinx import __display_version__
|
||||
from sphinx.util.docutils import is_html5_writer_available
|
||||
|
||||
from util import remove_unicode_literals, strip_escseq, skip_unless
|
||||
from test_build_html import flat_dict, tail_check, check_xpath
|
||||
|
||||
TREE_BUILDER = getTreeBuilder('etree', implementation=ElementTree)
|
||||
@ -35,7 +29,8 @@ HTML_PARSER = HTMLParser(TREE_BUILDER, namespaceHTMLElements=False)
|
||||
|
||||
etree_cache = {}
|
||||
|
||||
@skip_unless(is_html5_writer_available())
|
||||
|
||||
@pytest.mark.skipif(not is_html5_writer_available(), reason='HTML5 writer is not available')
|
||||
@pytest.fixture(scope='module')
|
||||
def cached_etree_parse():
|
||||
def parse(fname):
|
||||
@ -50,7 +45,7 @@ def cached_etree_parse():
|
||||
etree_cache.clear()
|
||||
|
||||
|
||||
@skip_unless(is_html5_writer_available())
|
||||
@pytest.mark.skipif(not is_html5_writer_available(), reason='HTML5 writer is not available')
|
||||
@pytest.mark.parametrize("fname,expect", flat_dict({
|
||||
'images.html': [
|
||||
(".//img[@src='_images/img.png']", ''),
|
||||
|
@ -24,7 +24,7 @@ from sphinx.util.osutil import cd, ensuredir
|
||||
from sphinx.util import docutils
|
||||
from sphinx.writers.latex import LaTeXTranslator
|
||||
|
||||
from util import SkipTest, remove_unicode_literals, strip_escseq, skip_if
|
||||
from util import remove_unicode_literals, strip_escseq
|
||||
from test_build_html import ENV_WARNINGS
|
||||
|
||||
|
||||
@ -77,7 +77,7 @@ def compile_latex_document(app):
|
||||
'SphinxTests.tex'],
|
||||
stdout=PIPE, stderr=PIPE)
|
||||
except OSError: # most likely the latex executable was not found
|
||||
raise SkipTest
|
||||
raise pytest.skip.Exception
|
||||
else:
|
||||
stdout, stderr = p.communicate()
|
||||
if p.returncode != 0:
|
||||
@ -90,7 +90,7 @@ def compile_latex_document(app):
|
||||
def skip_if_requested(testfunc):
|
||||
if 'SKIP_LATEX_BUILD' in os.environ:
|
||||
msg = 'Skip LaTeX builds because SKIP_LATEX_BUILD is set'
|
||||
return skip_if(True, msg)(testfunc)
|
||||
return pytest.mark.skipif(True, reason=msg)(testfunc)
|
||||
else:
|
||||
return testfunc
|
||||
|
||||
@ -98,7 +98,7 @@ def skip_if_requested(testfunc):
|
||||
def skip_if_stylefiles_notfound(testfunc):
|
||||
if kpsetest(*STYLEFILES) is False:
|
||||
msg = 'not running latex, the required styles do not seem to be installed'
|
||||
return skip_if(True, msg)(testfunc)
|
||||
return pytest.mark.skipif(True, reason=msg)(testfunc)
|
||||
else:
|
||||
return testfunc
|
||||
|
||||
|
@ -19,7 +19,7 @@ import pytest
|
||||
|
||||
from sphinx.writers.texinfo import TexinfoTranslator
|
||||
|
||||
from util import SkipTest, remove_unicode_literals, strip_escseq
|
||||
from util import remove_unicode_literals, strip_escseq
|
||||
from test_build_html import ENV_WARNINGS
|
||||
|
||||
|
||||
@ -58,7 +58,7 @@ def test_texinfo(app, status, warning):
|
||||
p = Popen(['makeinfo', '--no-split', 'SphinxTests.texi'],
|
||||
stdout=PIPE, stderr=PIPE)
|
||||
except OSError:
|
||||
raise SkipTest # most likely makeinfo was not found
|
||||
raise pytest.skip.Exception # most likely makeinfo was not found
|
||||
else:
|
||||
stdout, stderr = p.communicate()
|
||||
retcode = p.returncode
|
||||
|
@ -8,12 +8,11 @@
|
||||
:copyright: Copyright 2007-2017 by the Sphinx team, see AUTHORS.
|
||||
:license: BSD, see LICENSE for details.
|
||||
"""
|
||||
import pytest
|
||||
|
||||
from docutils.utils import column_width
|
||||
from sphinx.writers.text import MAXWIDTH
|
||||
|
||||
from util import with_app
|
||||
|
||||
|
||||
def with_text_app(*args, **kw):
|
||||
default_kw = {
|
||||
@ -21,7 +20,7 @@ def with_text_app(*args, **kw):
|
||||
'testroot': 'build-text',
|
||||
}
|
||||
default_kw.update(kw)
|
||||
return with_app(*args, **default_kw)
|
||||
return pytest.mark.sphinx(*args, **default_kw)
|
||||
|
||||
|
||||
@with_text_app()
|
||||
|
@ -12,7 +12,6 @@
|
||||
import re
|
||||
|
||||
from docutils import nodes
|
||||
from sphinx.util.nodes import process_only_nodes
|
||||
import pytest
|
||||
|
||||
|
||||
@ -46,7 +45,7 @@ def test_sectioning(app, status, warning):
|
||||
|
||||
app.builder.build(['only'])
|
||||
doctree = app.env.get_doctree('only')
|
||||
process_only_nodes(doctree, app.builder.tags)
|
||||
app.env.apply_post_transforms(doctree, 'only')
|
||||
|
||||
parts = [getsects(n)
|
||||
for n in [_n for _n in doctree.children if isinstance(_n, nodes.section)]]
|
||||
|
@ -12,7 +12,7 @@
|
||||
import re
|
||||
|
||||
import pytest
|
||||
from util import path, SkipTest
|
||||
from util import path
|
||||
|
||||
|
||||
def regex_count(expr, result):
|
||||
@ -78,7 +78,7 @@ def test_docutils_source_link_with_nonascii_file(app, status, warning):
|
||||
(srcdir / (mb_name + '.txt')).write_text('')
|
||||
except UnicodeEncodeError:
|
||||
from path import FILESYSTEMENCODING
|
||||
raise SkipTest(
|
||||
raise pytest.skip.Exception(
|
||||
'nonascii filename not supported on this filesystem encoding: '
|
||||
'%s', FILESYSTEMENCODING)
|
||||
|
||||
|
@ -16,11 +16,9 @@ import pytest
|
||||
|
||||
from sphinx import addnodes
|
||||
from sphinx.domains.cpp import DefinitionParser, DefinitionError, NoOldIdError
|
||||
from sphinx.domains.cpp import Symbol
|
||||
from sphinx.domains.cpp import Symbol, _max_id, _id_prefix
|
||||
import sphinx.domains.cpp as cppDomain
|
||||
|
||||
ids = []
|
||||
|
||||
|
||||
def parse(name, string):
|
||||
class Config(object):
|
||||
@ -28,18 +26,14 @@ def parse(name, string):
|
||||
cpp_paren_attributes = ["paren_attr"]
|
||||
parser = DefinitionParser(string, None, Config())
|
||||
ast = parser.parse_declaration(name)
|
||||
if not parser.eof:
|
||||
print("Parsing stopped at", parser.pos)
|
||||
print(string)
|
||||
print('-' * parser.pos + '^')
|
||||
raise DefinitionError("")
|
||||
parser.assert_end()
|
||||
# The scopedness would usually have been set by CPPEnumObject
|
||||
if name == "enum":
|
||||
ast.scoped = None # simulate unscoped enum
|
||||
return ast
|
||||
|
||||
|
||||
def check(name, input, idv1output=None, idv2output=None, output=None):
def check(name, input, idDict, output=None):
    # first a simple check of the AST
    if output is None:
        output = input
@ -58,37 +52,43 @@ def check(name, input, idv1output=None, idv2output=None, output=None):
    parentNode += signode
    ast.describe_signature(signode, 'lastIsName', symbol, options={})

    if idv2output:
        idv2output = "_CPPv2" + idv2output
    try:
        idv1 = ast.get_id_v1()
        assert idv1 is not None
    except NoOldIdError:
        idv1 = None
    try:
        idv2 = ast.get_id_v2()
        assert idv2 is not None
    except NoOldIdError:
        idv2 = None
    if idv1 != idv1output or idv2 != idv2output:
    idExpected = [None]
    for i in range(1, _max_id + 1):
        if i in idDict:
            idExpected.append(idDict[i])
        else:
            idExpected.append(idExpected[i - 1])
    idActual = [None]
    for i in range(1, _max_id + 1):
        try:
            id = ast.get_id(version=i)
            assert id is not None
            idActual.append(id[len(_id_prefix[i]):])
        except NoOldIdError:
            idActual.append(None)

    res = [True]
    for i in range(1, _max_id + 1):
        res.append(idExpected[i] == idActual[i])

    if not all(res):
        print("input: %s" % text_type(input).rjust(20))
        print(" %s %s" % ("Id v1".rjust(20), "Id v2".rjust(20)))
        print("result: %s %s" % (str(idv1).rjust(20), str(idv2).rjust(20)))
        print("expected: %s %s" % (str(idv1output).rjust(20),
                                   str(idv2output).rjust(20)))
        for i in range(1, _max_id + 1):
            if res[i]:
                continue
            print("Error in id version %d." % i)
            print("result: %s" % str(idActual[i]))
            print("expected: %s" % str(idExpected[i]))
        print(rootSymbol.dump(0))
        raise DefinitionError("")
    ids.append(ast.get_id_v2())
    # print ".. %s:: %s" % (name, input)

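The rewritten ``check`` above replaces the ``idv1output``/``idv2output`` pair with an ``idDict`` keyed by id version; versions missing from the dict inherit the expected value of the previous version. A small stand-alone sketch of that expansion step (``max_id`` stands in for the imported ``_max_id``):

def expand_expected(id_dict, max_id):
    # a test only lists the versions where the mangled name changes;
    # every later version inherits the most recent explicit entry
    expected = [None]
    for version in range(1, max_id + 1):
        if version in id_dict:
            expected.append(id_dict[version])
        else:
            expected.append(expected[version - 1])
    return expected


# {2: "1fRA10_i", 3: "1fRAL10E_i"} -> [None, None, '1fRA10_i', '1fRAL10E_i']
print(expand_expected({2: "1fRA10_i", 3: "1fRAL10E_i"}, 3))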
def test_fundamental_types():
|
||||
# see http://en.cppreference.com/w/cpp/language/types
|
||||
for t, id_v2 in cppDomain._id_fundamental_v2.items():
|
||||
if t == "decltype(auto)":
|
||||
continue
|
||||
|
||||
def makeIdV1():
|
||||
if t == 'decltype(auto)':
|
||||
return None
|
||||
id = t.replace(" ", "-").replace("long", "l").replace("int", "i")
|
||||
id = id.replace("bool", "b").replace("char", "c")
|
||||
id = id.replace("wc_t", "wchar_t").replace("c16_t", "char16_t")
|
||||
@ -100,55 +100,141 @@ def test_fundamental_types():
|
||||
if t == "std::nullptr_t":
|
||||
id = "NSt9nullptr_tE"
|
||||
return "1f%s" % id
|
||||
check("function", "void f(%s arg)" % t, makeIdV1(), makeIdV2())
|
||||
check("function", "void f(%s arg)" % t, {1: makeIdV1(), 2:makeIdV2()})
|
||||
|
||||
|
||||
def test_expressions():
|
||||
def exprCheck(expr, id):
|
||||
ids = 'IE1CIA%s_1aE'
|
||||
check('class', 'template<> C<a[%s]>' % expr, {2:ids % expr, 3:ids % id})
|
||||
# primary
|
||||
exprCheck('nullptr', 'LDnE')
|
||||
exprCheck('true', 'L1E')
|
||||
exprCheck('false', 'L0E')
|
||||
exprCheck('5', 'L5E')
|
||||
exprCheck('5.0', 'L5.0E')
|
||||
exprCheck('"abc\\"cba"', 'LA8_KcE')
|
||||
# TODO: test the rest
|
||||
exprCheck('(... + Ns)', '(... + Ns)')
|
||||
exprCheck('(5)', 'L5E')
|
||||
exprCheck('C', '1C')
|
||||
# postfix
|
||||
exprCheck('A(2)', 'cl1AL2EE')
|
||||
exprCheck('A[2]', 'ix1AL2E')
|
||||
exprCheck('a.b.c', 'dtdt1a1b1c')
|
||||
exprCheck('a->b->c', 'ptpt1a1b1c')
|
||||
exprCheck('i++', 'pp1i')
|
||||
exprCheck('i--', 'mm1i')
|
||||
# TODO, those with prefixes
|
||||
# unary
|
||||
exprCheck('++5', 'pp_L5E')
|
||||
exprCheck('--5', 'mm_L5E')
|
||||
exprCheck('*5', 'deL5E')
|
||||
exprCheck('&5', 'adL5E')
|
||||
exprCheck('+5', 'psL5E')
|
||||
exprCheck('-5', 'ngL5E')
|
||||
exprCheck('!5', 'ntL5E')
|
||||
exprCheck('~5', 'coL5E')
|
||||
# cast
|
||||
exprCheck('(int)2', 'cviL2E')
|
||||
# binary op
|
||||
exprCheck('5 || 42', 'ooL5EL42E')
|
||||
exprCheck('5 && 42', 'aaL5EL42E')
|
||||
exprCheck('5 | 42', 'orL5EL42E')
|
||||
exprCheck('5 ^ 42', 'eoL5EL42E')
|
||||
exprCheck('5 & 42', 'anL5EL42E')
|
||||
# ['==', '!=']
|
||||
exprCheck('5 == 42', 'eqL5EL42E')
|
||||
exprCheck('5 != 42', 'neL5EL42E')
|
||||
# ['<=', '>=', '<', '>']
|
||||
exprCheck('5 <= 42', 'leL5EL42E')
|
||||
exprCheck('5 >= 42', 'geL5EL42E')
|
||||
exprCheck('5 < 42', 'ltL5EL42E')
|
||||
exprCheck('5 > 42', 'gtL5EL42E')
|
||||
# ['<<', '>>']
|
||||
exprCheck('5 << 42', 'lsL5EL42E')
|
||||
exprCheck('5 >> 42', 'rsL5EL42E')
|
||||
# ['+', '-']
|
||||
exprCheck('5 + 42', 'plL5EL42E')
|
||||
exprCheck('5 - 42', 'miL5EL42E')
|
||||
# ['*', '/', '%']
|
||||
exprCheck('5 * 42', 'mlL5EL42E')
|
||||
exprCheck('5 / 42', 'dvL5EL42E')
|
||||
exprCheck('5 % 42', 'rmL5EL42E')
|
||||
# ['.*', '->*']
|
||||
exprCheck('5 .* 42', 'dsL5EL42E')
|
||||
exprCheck('5 ->* 42', 'pmL5EL42E')
|
||||
# conditional
|
||||
# TODO
|
||||
# assignment
|
||||
exprCheck('a = 5', 'aS1aL5E')
|
||||
exprCheck('a *= 5', 'mL1aL5E')
|
||||
exprCheck('a /= 5', 'dV1aL5E')
|
||||
exprCheck('a %= 5', 'rM1aL5E')
|
||||
exprCheck('a += 5', 'pL1aL5E')
|
||||
exprCheck('a -= 5', 'mI1aL5E')
|
||||
exprCheck('a >>= 5', 'rS1aL5E')
|
||||
exprCheck('a <<= 5', 'lS1aL5E')
|
||||
exprCheck('a &= 5', 'aN1aL5E')
|
||||
exprCheck('a ^= 5', 'eO1aL5E')
|
||||
exprCheck('a |= 5', 'oR1aL5E')
|
||||
|
||||
# Additional tests
|
||||
# a < expression that starts with something that could be a template
|
||||
exprCheck('A < 42', 'lt1AL42E')
|
||||
check('function', 'template<> void f(A<B, 2> &v)',
|
||||
{2:"IE1fR1AI1BX2EE", 3:"IE1fR1AI1BXL2EEE"})
|
||||
exprCheck('A<1>::value', 'N1AIXL1EEE5valueE')
|
||||
check('class', "template<int T = 42> A", {2:"I_iE1A"})
|
||||
check('enumerator', 'A = std::numeric_limits<unsigned long>::max()', {2:"1A"})
|
||||
|
||||
|
||||
def test_type_definitions():
|
||||
check("type", "public bool b", "b", "1b", "bool b")
|
||||
check("type", "bool A::b", "A::b", "N1A1bE")
|
||||
check("type", "bool *b", "b", "1b")
|
||||
check("type", "bool *const b", "b", "1b")
|
||||
check("type", "bool *volatile const b", "b", "1b")
|
||||
check("type", "bool *volatile const b", "b", "1b")
|
||||
check("type", "bool *volatile const *b", "b", "1b")
|
||||
check("type", "bool &b", "b", "1b")
|
||||
check("type", "bool b[]", "b", "1b")
|
||||
check("type", "std::pair<int, int> coord", "coord", "5coord")
|
||||
check("type", "long long int foo", "foo", "3foo")
|
||||
check("type", "public bool b", {1:"b", 2:"1b"}, "bool b")
|
||||
check("type", "bool A::b", {1:"A::b", 2:"N1A1bE"})
|
||||
check("type", "bool *b", {1:"b", 2:"1b"})
|
||||
check("type", "bool *const b", {1:"b", 2:"1b"})
|
||||
check("type", "bool *volatile const b", {1:"b", 2:"1b"})
|
||||
check("type", "bool *volatile const b", {1:"b", 2:"1b"})
|
||||
check("type", "bool *volatile const *b", {1:"b", 2:"1b"})
|
||||
check("type", "bool &b", {1:"b", 2:"1b"})
|
||||
check("type", "bool b[]", {1:"b", 2:"1b"})
|
||||
check("type", "std::pair<int, int> coord", {1:"coord", 2:"5coord"})
|
||||
check("type", "long long int foo", {1:"foo", 2:"3foo"})
|
||||
check("type", 'std::vector<std::pair<std::string, long long>> module::blah',
|
||||
"module::blah", "N6module4blahE")
|
||||
check("type", "std::function<void()> F", "F", "1F")
|
||||
check("type", "std::function<R(A1, A2)> F", "F", "1F")
|
||||
check("type", "std::function<R(A1, A2, A3)> F", "F", "1F")
|
||||
check("type", "std::function<R(A1, A2, A3, As...)> F", "F", "1F")
|
||||
{1:"module::blah", 2:"N6module4blahE"})
|
||||
check("type", "std::function<void()> F", {1:"F", 2:"1F"})
|
||||
check("type", "std::function<R(A1, A2)> F", {1:"F", 2:"1F"})
|
||||
check("type", "std::function<R(A1, A2, A3)> F", {1:"F", 2:"1F"})
|
||||
check("type", "std::function<R(A1, A2, A3, As...)> F", {1:"F", 2:"1F"})
|
||||
check("type", "MyContainer::const_iterator",
|
||||
"MyContainer::const_iterator", "N11MyContainer14const_iteratorE")
|
||||
{1:"MyContainer::const_iterator", 2:"N11MyContainer14const_iteratorE"})
|
||||
check("type",
|
||||
"public MyContainer::const_iterator",
|
||||
"MyContainer::const_iterator", "N11MyContainer14const_iteratorE",
|
||||
{1:"MyContainer::const_iterator", 2:"N11MyContainer14const_iteratorE"},
|
||||
output="MyContainer::const_iterator")
|
||||
# test decl specs on right
|
||||
check("type", "bool const b", "b", "1b")
|
||||
check("type", "bool const b", {1:"b", 2:"1b"})
|
||||
# test name in global scope
|
||||
check("type", "bool ::B::b", "B::b", "N1B1bE")
|
||||
check("type", "bool ::B::b", {1:"B::b", 2:"N1B1bE"})
|
||||
|
||||
check('type', 'A = B', None, '1A')
|
||||
check('type', 'A = B', {2:'1A'})
|
||||
|
||||
# from breathe#267 (named function parameters for function pointers
|
||||
check('type', 'void (*gpio_callback_t)(struct device *port, uint32_t pin)',
|
||||
'gpio_callback_t', '15gpio_callback_t')
|
||||
check('type', 'void (*f)(std::function<void(int i)> g)', 'f', '1f')
|
||||
{1:'gpio_callback_t', 2:'15gpio_callback_t'})
|
||||
check('type', 'void (*f)(std::function<void(int i)> g)', {1:'f', 2:'1f'})
|
||||
|
||||
|
||||
def test_concept_definitions():
|
||||
check('concept', 'template<typename Param> A::B::Concept',
|
||||
None, 'I0EN1A1B7ConceptE')
|
||||
{2:'I0EN1A1B7ConceptE'})
|
||||
check('concept', 'template<typename A, typename B, typename ...C> Foo',
|
||||
None, 'I00DpE3Foo')
|
||||
{2:'I00DpE3Foo'})
|
||||
check('concept', 'template<typename Param> A::B::Concept()',
|
||||
None, 'I0EN1A1B7ConceptE')
|
||||
{2:'I0EN1A1B7ConceptE'})
|
||||
check('concept', 'template<typename A, typename B, typename ...C> Foo()',
|
||||
None, 'I00DpE3Foo')
|
||||
{2:'I00DpE3Foo'})
|
||||
with pytest.raises(DefinitionError):
|
||||
parse('concept', 'Foo')
|
||||
with pytest.raises(DefinitionError):
|
||||
@ -157,256 +243,257 @@ def test_concept_definitions():
|
||||
|
||||
def test_member_definitions():
|
||||
check('member', ' const std::string & name = 42',
|
||||
"name__ssCR", "4name", output='const std::string &name = 42')
|
||||
check('member', ' const std::string & name', "name__ssCR", "4name",
|
||||
{1:"name__ssCR", 2:"4name"}, output='const std::string &name = 42')
|
||||
check('member', ' const std::string & name', {1:"name__ssCR", 2:"4name"},
|
||||
output='const std::string &name')
|
||||
check('member', ' const std::string & name [ n ]',
|
||||
"name__ssCRA", "4name", output='const std::string &name[n]')
|
||||
{1:"name__ssCRA", 2:"4name"}, output='const std::string &name[n]')
|
||||
check('member', 'const std::vector< unsigned int, long> &name',
|
||||
"name__std::vector:unsigned-i.l:CR",
|
||||
"4name", output='const std::vector<unsigned int, long> &name')
|
||||
check('member', 'module::myclass foo[n]', "foo__module::myclassA", "3foo")
|
||||
check('member', 'int *const p', 'p__iPC', '1p')
|
||||
check('member', 'extern int myInt', 'myInt__i', '5myInt')
|
||||
check('member', 'thread_local int myInt', 'myInt__i', '5myInt')
|
||||
check('member', 'extern thread_local int myInt', 'myInt__i', '5myInt')
|
||||
check('member', 'thread_local extern int myInt', 'myInt__i', '5myInt',
|
||||
{1:"name__std::vector:unsigned-i.l:CR", 2:"4name"},
|
||||
output='const std::vector<unsigned int, long> &name')
|
||||
check('member', 'module::myclass foo[n]', {1:"foo__module::myclassA", 2:"3foo"})
|
||||
check('member', 'int *const p', {1:'p__iPC', 2:'1p'})
|
||||
check('member', 'extern int myInt', {1:'myInt__i', 2:'5myInt'})
|
||||
check('member', 'thread_local int myInt', {1:'myInt__i', 2:'5myInt'})
|
||||
check('member', 'extern thread_local int myInt', {1:'myInt__i', 2:'5myInt'})
|
||||
check('member', 'thread_local extern int myInt', {1:'myInt__i', 2:'5myInt'},
|
||||
'extern thread_local int myInt')
|
||||
|
||||
|
||||
def test_function_definitions():
|
||||
check('function', 'operator bool() const', "castto-b-operatorC", "NKcvbEv")
|
||||
check('function', 'operator bool() const', {1:"castto-b-operatorC", 2:"NKcvbEv"})
|
||||
check('function', 'A::operator bool() const',
|
||||
"A::castto-b-operatorC", "NK1AcvbEv")
|
||||
{1:"A::castto-b-operatorC", 2:"NK1AcvbEv"})
|
||||
check('function', 'A::operator bool() volatile const &',
|
||||
"A::castto-b-operatorVCR", "NVKR1AcvbEv")
|
||||
{1:"A::castto-b-operatorVCR", 2:"NVKR1AcvbEv"})
|
||||
check('function', 'A::operator bool() volatile const &&',
|
||||
"A::castto-b-operatorVCO", "NVKO1AcvbEv")
|
||||
{1:"A::castto-b-operatorVCO", 2:"NVKO1AcvbEv"})
|
||||
check('function', 'bool namespaced::theclass::method(arg1, arg2)',
|
||||
"namespaced::theclass::method__arg1.arg2",
|
||||
"N10namespaced8theclass6methodE4arg14arg2")
|
||||
{1:"namespaced::theclass::method__arg1.arg2",
|
||||
2:"N10namespaced8theclass6methodE4arg14arg2"})
|
||||
x = 'std::vector<std::pair<std::string, int>> &module::test(register int ' \
|
||||
'foo, bar, std::string baz = "foobar, blah, bleh") const = 0'
|
||||
check('function', x, "module::test__i.bar.ssC",
|
||||
"NK6module4testEi3barNSt6stringE")
|
||||
check('function', x, {1:"module::test__i.bar.ssC",
|
||||
2:"NK6module4testEi3barNSt6stringE"})
|
||||
check('function', 'void f(std::pair<A, B>)',
|
||||
"f__std::pair:A.B:", "1fNSt4pairI1A1BEE")
|
||||
{1:"f__std::pair:A.B:", 2:"1fNSt4pairI1A1BEE"})
|
||||
check('function', 'explicit module::myclass::foo::foo()',
|
||||
"module::myclass::foo::foo", "N6module7myclass3foo3fooEv")
|
||||
{1:"module::myclass::foo::foo", 2:"N6module7myclass3foo3fooEv"})
|
||||
check('function', 'module::myclass::foo::~foo()',
|
||||
"module::myclass::foo::~foo", "N6module7myclass3fooD0Ev")
|
||||
{1:"module::myclass::foo::~foo", 2:"N6module7myclass3fooD0Ev"})
|
||||
check('function', 'int printf(const char *fmt, ...)',
|
||||
"printf__cCP.z", "6printfPKcz")
|
||||
{1:"printf__cCP.z", 2:"6printfPKcz"})
|
||||
check('function', 'int foo(const unsigned int j)',
|
||||
"foo__unsigned-iC", "3fooKj")
|
||||
{1:"foo__unsigned-iC", 2:"3fooKj"})
|
||||
check('function', 'int foo(const int *const ptr)',
|
||||
"foo__iCPC", "3fooPCKi")
|
||||
{1:"foo__iCPC", 2:"3fooPCKi"})
|
||||
check('function', 'module::myclass::operator std::vector<std::string>()',
|
||||
"module::myclass::castto-std::vector:ss:-operator",
|
||||
"N6module7myclasscvNSt6vectorINSt6stringEEEEv")
|
||||
{1:"module::myclass::castto-std::vector:ss:-operator",
|
||||
2:"N6module7myclasscvNSt6vectorINSt6stringEEEEv"})
|
||||
check('function',
|
||||
'void operator()(const boost::array<VertexID, 2> &v) const',
|
||||
"call-operator__boost::array:VertexID.2:CRC",
|
||||
"NKclERKN5boost5arrayI8VertexIDX2EEE")
|
||||
{1:"call-operator__boost::array:VertexID.2:CRC",
|
||||
2:"NKclERKN5boost5arrayI8VertexIDX2EEE",
|
||||
3:"NKclERKN5boost5arrayI8VertexIDXL2EEEE"})
|
||||
check('function',
|
||||
'void operator()(const boost::array<VertexID, 2, "foo, bar"> &v) const',
|
||||
'call-operator__boost::array:VertexID.2."foo,--bar":CRC',
|
||||
'NKclERKN5boost5arrayI8VertexIDX2EX"foo, bar"EEE')
|
||||
{1:'call-operator__boost::array:VertexID.2."foo,--bar":CRC',
|
||||
2:'NKclERKN5boost5arrayI8VertexIDX2EX"foo, bar"EEE',
|
||||
3:'NKclERKN5boost5arrayI8VertexIDXL2EEXLA9_KcEEEE'})
|
||||
check('function', 'MyClass::MyClass(MyClass::MyClass&&)',
|
||||
"MyClass::MyClass__MyClass::MyClassRR",
|
||||
"N7MyClass7MyClassERRN7MyClass7MyClassE")
|
||||
check('function', 'constexpr int get_value()', "get_valueCE", "9get_valuev")
|
||||
{1:"MyClass::MyClass__MyClass::MyClassRR",
|
||||
2:"N7MyClass7MyClassERRN7MyClass7MyClassE"})
|
||||
check('function', 'constexpr int get_value()', {1:"get_valueCE", 2:"9get_valuev"})
|
||||
check('function', 'static constexpr int get_value()',
|
||||
"get_valueCE", "9get_valuev")
|
||||
{1:"get_valueCE", 2:"9get_valuev"})
|
||||
check('function', 'int get_value() const noexcept',
|
||||
"get_valueC", "NK9get_valueEv")
|
||||
{1:"get_valueC", 2:"NK9get_valueEv"})
|
||||
check('function', 'int get_value() const noexcept = delete',
|
||||
"get_valueC", "NK9get_valueEv")
|
||||
{1:"get_valueC", 2:"NK9get_valueEv"})
|
||||
check('function', 'int get_value() volatile const',
|
||||
"get_valueVC", "NVK9get_valueEv")
|
||||
{1:"get_valueVC", 2:"NVK9get_valueEv"})
|
||||
check('function', 'MyClass::MyClass(MyClass::MyClass&&) = default',
|
||||
"MyClass::MyClass__MyClass::MyClassRR",
|
||||
"N7MyClass7MyClassERRN7MyClass7MyClassE")
|
||||
{1:"MyClass::MyClass__MyClass::MyClassRR",
|
||||
2:"N7MyClass7MyClassERRN7MyClass7MyClassE"})
|
||||
check('function', 'virtual MyClass::a_virtual_function() const override',
|
||||
"MyClass::a_virtual_functionC", "NK7MyClass18a_virtual_functionEv")
|
||||
check('function', 'A B() override', "B", "1Bv")
|
||||
check('function', 'A B() final', "B", "1Bv")
|
||||
check('function', 'A B() final override', "B", "1Bv")
|
||||
check('function', 'A B() override final', "B", "1Bv",
|
||||
{1:"MyClass::a_virtual_functionC", 2:"NK7MyClass18a_virtual_functionEv"})
|
||||
check('function', 'A B() override', {1:"B", 2:"1Bv"})
|
||||
check('function', 'A B() final', {1:"B", 2:"1Bv"})
|
||||
check('function', 'A B() final override', {1:"B", 2:"1Bv"})
|
||||
check('function', 'A B() override final', {1:"B", 2:"1Bv"},
|
||||
output='A B() final override')
|
||||
check('function', 'MyClass::a_member_function() volatile',
|
||||
"MyClass::a_member_functionV", "NV7MyClass17a_member_functionEv")
|
||||
{1:"MyClass::a_member_functionV", 2:"NV7MyClass17a_member_functionEv"})
|
||||
check('function', 'MyClass::a_member_function() volatile const',
|
||||
"MyClass::a_member_functionVC", "NVK7MyClass17a_member_functionEv")
|
||||
{1:"MyClass::a_member_functionVC", 2:"NVK7MyClass17a_member_functionEv"})
|
||||
check('function', 'MyClass::a_member_function() &&',
|
||||
"MyClass::a_member_functionO", "NO7MyClass17a_member_functionEv")
|
||||
{1:"MyClass::a_member_functionO", 2:"NO7MyClass17a_member_functionEv"})
|
||||
check('function', 'MyClass::a_member_function() &',
|
||||
"MyClass::a_member_functionR", "NR7MyClass17a_member_functionEv")
|
||||
{1:"MyClass::a_member_functionR", 2:"NR7MyClass17a_member_functionEv"})
|
||||
check('function', 'MyClass::a_member_function() const &',
|
||||
"MyClass::a_member_functionCR", "NKR7MyClass17a_member_functionEv")
|
||||
{1:"MyClass::a_member_functionCR", 2:"NKR7MyClass17a_member_functionEv"})
|
||||
check('function', 'int main(int argc, char *argv[])',
|
||||
"main__i.cPA", "4mainiA_Pc")
|
||||
{1:"main__i.cPA", 2:"4mainiA_Pc"})
|
||||
check('function', 'MyClass &MyClass::operator++()',
|
||||
"MyClass::inc-operator", "N7MyClassppEv")
|
||||
{1:"MyClass::inc-operator", 2:"N7MyClassppEv"})
|
||||
check('function', 'MyClass::pointer MyClass::operator->()',
|
||||
"MyClass::pointer-operator", "N7MyClassptEv")
|
||||
{1:"MyClass::pointer-operator", 2:"N7MyClassptEv"})
|
||||
|
||||
x = 'std::vector<std::pair<std::string, int>> &module::test(register int ' \
|
||||
'foo, bar[n], std::string baz = "foobar, blah, bleh") const = 0'
|
||||
check('function', x, "module::test__i.barA.ssC",
|
||||
"NK6module4testEiAn_3barNSt6stringE")
|
||||
check('function', x, {1:"module::test__i.barA.ssC",
|
||||
2:"NK6module4testEiAn_3barNSt6stringE",
|
||||
3:"NK6module4testEiA1n_3barNSt6stringE"})
|
||||
check('function',
|
||||
'int foo(Foo f = Foo(double(), std::make_pair(int(2), double(3.4))))',
|
||||
"foo__Foo", "3foo3Foo")
|
||||
check('function', 'int foo(A a = x(a))', "foo__A", "3foo1A")
|
||||
{1:"foo__Foo", 2:"3foo3Foo"})
|
||||
check('function', 'int foo(A a = x(a))', {1:"foo__A", 2:"3foo1A"})
|
||||
with pytest.raises(DefinitionError):
|
||||
parse('function', 'int foo(B b=x(a)')
|
||||
with pytest.raises(DefinitionError):
|
||||
parse('function', 'int foo)C c=x(a))')
|
||||
with pytest.raises(DefinitionError):
|
||||
parse('function', 'int foo(D d=x(a')
|
||||
check('function', 'int foo(const A&... a)', "foo__ACRDp", "3fooDpRK1A")
|
||||
check('function', 'virtual void f()', "f", "1fv")
|
||||
check('function', 'int foo(const A&... a)', {1:"foo__ACRDp", 2:"3fooDpRK1A"})
|
||||
check('function', 'virtual void f()', {1:"f", 2:"1fv"})
|
||||
# test for ::nestedName, from issue 1738
|
||||
check("function", "result(int val, ::std::error_category const &cat)",
|
||||
"result__i.std::error_categoryCR", "6resultiRNSt14error_categoryE")
|
||||
check("function", "int *f()", "f", "1fv")
|
||||
{1:"result__i.std::error_categoryCR", 2:"6resultiRNSt14error_categoryE"})
|
||||
check("function", "int *f()", {1:"f", 2:"1fv"})
|
||||
# tests derived from issue #1753 (skip to keep sanity)
|
||||
check("function", "f(int (&array)[10])", None, "1fRA10_i")
|
||||
check("function", "void f(int (&array)[10])", None, "1fRA10_i")
|
||||
check("function", "void f(float *q(double))", None, "1fFPfdE")
|
||||
check("function", "void f(float *(*q)(double))", None, "1fPFPfdE")
|
||||
check("function", "void f(float (*q)(double))", None, "1fPFfdE")
|
||||
check("function", "int (*f(double d))(float)", "f__double", "1fd")
|
||||
check("function", "int (*f(bool b))[5]", "f__b", "1fb")
|
||||
check("function", "f(int (&array)[10])", {2:"1fRA10_i", 3:"1fRAL10E_i"})
|
||||
check("function", "void f(int (&array)[10])", {2:"1fRA10_i", 3:"1fRAL10E_i"})
|
||||
check("function", "void f(float *q(double))", {2:"1fFPfdE"})
|
||||
check("function", "void f(float *(*q)(double))", {2:"1fPFPfdE"})
|
||||
check("function", "void f(float (*q)(double))", {2:"1fPFfdE"})
|
||||
check("function", "int (*f(double d))(float)", {1:"f__double", 2:"1fd"})
|
||||
check("function", "int (*f(bool b))[5]", {1:"f__b", 2:"1fb"})
|
||||
check("function", "int (*A::f(double d) const)(float)",
|
||||
"A::f__doubleC", "NK1A1fEd")
|
||||
{1:"A::f__doubleC", 2:"NK1A1fEd"})
|
||||
check("function", "void f(std::shared_ptr<int(double)> ptr)",
|
||||
None, "1fNSt10shared_ptrIFidEEE")
|
||||
check("function", "void f(int *const p)", "f__iPC", "1fPCi")
|
||||
check("function", "void f(int *volatile const p)", "f__iPVC", "1fPVCi")
|
||||
{2:"1fNSt10shared_ptrIFidEEE"})
|
||||
check("function", "void f(int *const p)", {1:"f__iPC", 2:"1fPCi"})
|
||||
check("function", "void f(int *volatile const p)", {1:"f__iPVC", 2:"1fPVCi"})
|
||||
|
||||
check('function', 'extern int f()', 'f', '1fv')
|
||||
check('function', 'extern int f()', {1:'f', 2:'1fv'})
|
||||
|
||||
# TODO: make tests for functions in a template, e.g., Test<int&&()>
|
||||
# such that the id generation for function type types is correct.
|
||||
|
||||
check('function', 'friend std::ostream &f(std::ostream&, int)',
|
||||
'f__osR.i', '1fRNSt7ostreamEi')
|
||||
{1:'f__osR.i', 2:'1fRNSt7ostreamEi'})
|
||||
|
||||
# from breathe#223
|
||||
check('function', 'void f(struct E e)', 'f__E', '1f1E')
|
||||
check('function', 'void f(class E e)', 'f__E', '1f1E')
|
||||
check('function', 'void f(typename E e)', 'f__E', '1f1E')
|
||||
check('function', 'void f(enum E e)', 'f__E', '1f1E')
|
||||
check('function', 'void f(union E e)', 'f__E', '1f1E')
|
||||
check('function', 'void f(struct E e)', {1:'f__E', 2:'1f1E'})
|
||||
check('function', 'void f(class E e)', {1:'f__E', 2:'1f1E'})
|
||||
check('function', 'void f(typename E e)', {1:'f__E', 2:'1f1E'})
|
||||
check('function', 'void f(enum E e)', {1:'f__E', 2:'1f1E'})
|
||||
check('function', 'void f(union E e)', {1:'f__E', 2:'1f1E'})
|
||||
|
||||
# pointer to member (function)
|
||||
check('function', 'void f(int C::*)', None, '1fM1Ci')
|
||||
check('function', 'void f(int C::* p)', None, '1fM1Ci')
|
||||
check('function', 'void f(int ::C::* p)', None, '1fM1Ci')
|
||||
check('function', 'void f(int C::* const)', None, '1fKM1Ci')
|
||||
check('function', 'void f(int C::* const&)', None, '1fRKM1Ci')
|
||||
check('function', 'void f(int C::* volatile)', None, '1fVM1Ci')
|
||||
check('function', 'void f(int C::* const volatile)', None, '1fVKM1Ci',
|
||||
check('function', 'void f(int C::*)', {2:'1fM1Ci'})
|
||||
check('function', 'void f(int C::* p)', {2:'1fM1Ci'})
|
||||
check('function', 'void f(int ::C::* p)', {2:'1fM1Ci'})
|
||||
check('function', 'void f(int C::* const)', {2:'1fKM1Ci'})
|
||||
check('function', 'void f(int C::* const&)', {2:'1fRKM1Ci'})
|
||||
check('function', 'void f(int C::* volatile)', {2:'1fVM1Ci'})
|
||||
check('function', 'void f(int C::* const volatile)', {2:'1fVKM1Ci'},
|
||||
output='void f(int C::* volatile const)')
|
||||
check('function', 'void f(int C::* volatile const)', None, '1fVKM1Ci')
|
||||
check('function', 'void f(int (C::*)(float, double))', None, '1fM1CFifdE')
|
||||
check('function', 'void f(int (C::* p)(float, double))', None, '1fM1CFifdE')
|
||||
check('function', 'void f(int (::C::* p)(float, double))', None, '1fM1CFifdE')
|
||||
check('function', 'void f(void (C::*)() const &)', None, '1fM1CKRFvvE')
|
||||
check('function', 'int C::* f(int, double)', None, '1fid')
|
||||
check('function', 'void f(int C::* *)', None, '1fPM1Ci')
|
||||
check('function', 'void f(int C::* volatile const)', {2:'1fVKM1Ci'})
|
||||
check('function', 'void f(int (C::*)(float, double))', {2:'1fM1CFifdE'})
|
||||
check('function', 'void f(int (C::* p)(float, double))', {2:'1fM1CFifdE'})
|
||||
check('function', 'void f(int (::C::* p)(float, double))', {2:'1fM1CFifdE'})
|
||||
check('function', 'void f(void (C::*)() const &)', {2:'1fM1CKRFvvE'})
|
||||
check('function', 'int C::* f(int, double)', {2:'1fid'})
|
||||
check('function', 'void f(int C::* *)', {2:'1fPM1Ci'})
|
||||
|
||||
|
||||
def test_operators():
|
||||
check('function', 'void operator new [ ] ()',
|
||||
"new-array-operator", "nav", output='void operator new[]()')
|
||||
{1:"new-array-operator", 2:"nav"}, output='void operator new[]()')
|
||||
check('function', 'void operator delete ()',
|
||||
"delete-operator", "dlv", output='void operator delete()')
|
||||
{1:"delete-operator", 2:"dlv"}, output='void operator delete()')
|
||||
check('function', 'operator bool() const',
|
||||
"castto-b-operatorC", "NKcvbEv", output='operator bool() const')
|
||||
{1:"castto-b-operatorC", 2:"NKcvbEv"}, output='operator bool() const')
|
||||
|
||||
check('function', 'void operator * ()',
|
||||
"mul-operator", "mlv", output='void operator*()')
|
||||
{1:"mul-operator", 2:"mlv"}, output='void operator*()')
|
||||
check('function', 'void operator - ()',
|
||||
"sub-operator", "miv", output='void operator-()')
|
||||
{1:"sub-operator", 2:"miv"}, output='void operator-()')
|
||||
check('function', 'void operator + ()',
|
||||
"add-operator", "plv", output='void operator+()')
|
||||
{1:"add-operator", 2:"plv"}, output='void operator+()')
|
||||
check('function', 'void operator = ()',
|
||||
"assign-operator", "aSv", output='void operator=()')
|
||||
{1:"assign-operator", 2:"aSv"}, output='void operator=()')
|
||||
check('function', 'void operator / ()',
|
||||
"div-operator", "dvv", output='void operator/()')
|
||||
{1:"div-operator", 2:"dvv"}, output='void operator/()')
|
||||
check('function', 'void operator % ()',
|
||||
"mod-operator", "rmv", output='void operator%()')
|
||||
{1:"mod-operator", 2:"rmv"}, output='void operator%()')
|
||||
check('function', 'void operator ! ()',
|
||||
"not-operator", "ntv", output='void operator!()')
|
||||
{1:"not-operator", 2:"ntv"}, output='void operator!()')
|
||||
|
||||
check('function', 'void operator "" _udl()',
|
||||
None, 'li4_udlv', output='void operator""_udl()')
|
||||
{2:'li4_udlv'}, output='void operator""_udl()')
|
||||
|
||||
|
||||
def test_class_definitions():
|
||||
check('class', 'public A', "A", "1A", output='A')
|
||||
check('class', 'private A', "A", "1A")
|
||||
check('class', 'A final', 'A', '1A')
|
||||
check('class', 'public A', {1:"A", 2:"1A"}, output='A')
|
||||
check('class', 'private A', {1:"A", 2:"1A"})
|
||||
check('class', 'A final', {1:'A', 2:'1A'})
|
||||
|
||||
# test bases
|
||||
check('class', 'A', "A", "1A")
|
||||
check('class', 'A::B::C', "A::B::C", "N1A1B1CE")
|
||||
check('class', 'A : B', "A", "1A")
|
||||
check('class', 'A : private B', "A", "1A", output='A : B')
|
||||
check('class', 'A : public B', "A", "1A")
|
||||
check('class', 'A : B, C', "A", "1A")
|
||||
check('class', 'A : B, protected C, D', "A", "1A")
|
||||
check('class', 'A : virtual private B', 'A', '1A', output='A : virtual B')
|
||||
check('class', 'A : B, virtual C', 'A', '1A')
|
||||
check('class', 'A : public virtual B', 'A', '1A')
|
||||
check('class', 'A : B, C...', 'A', '1A')
|
||||
check('class', 'A : B..., C', 'A', '1A')
|
||||
check('class', 'A', {1:"A", 2:"1A"})
|
||||
check('class', 'A::B::C', {1:"A::B::C", 2:"N1A1B1CE"})
|
||||
check('class', 'A : B', {1:"A", 2:"1A"})
|
||||
check('class', 'A : private B', {1:"A", 2:"1A"}, output='A : B')
|
||||
check('class', 'A : public B', {1:"A", 2:"1A"})
|
||||
check('class', 'A : B, C', {1:"A", 2:"1A"})
|
||||
check('class', 'A : B, protected C, D', {1:"A", 2:"1A"})
|
||||
check('class', 'A : virtual private B', {1:'A', 2:'1A'}, output='A : virtual B')
|
||||
check('class', 'A : B, virtual C', {1:'A', 2:'1A'})
|
||||
check('class', 'A : public virtual B', {1:'A', 2:'1A'})
|
||||
check('class', 'A : B, C...', {1:'A', 2:'1A'})
|
||||
check('class', 'A : B..., C', {1:'A', 2:'1A'})
|
||||
|
||||
|
||||
def test_enum_definitions():
|
||||
check('enum', 'A', None, "1A")
|
||||
check('enum', 'A : std::underlying_type<B>::type', None, "1A")
|
||||
check('enum', 'A : unsigned int', None, "1A")
|
||||
check('enum', 'public A', None, "1A", output='A')
|
||||
check('enum', 'private A', None, "1A")
|
||||
check('enum', 'A', {2:"1A"})
|
||||
check('enum', 'A : std::underlying_type<B>::type', {2:"1A"})
|
||||
check('enum', 'A : unsigned int', {2:"1A"})
|
||||
check('enum', 'public A', {2:"1A"}, output='A')
|
||||
check('enum', 'private A', {2:"1A"})
|
||||
|
||||
check('enumerator', 'A', None, "1A")
|
||||
check('enumerator', 'A = std::numeric_limits<unsigned long>::max()',
|
||||
None, "1A")
|
||||
check('enumerator', 'A', {2:"1A"})
|
||||
check('enumerator', 'A = std::numeric_limits<unsigned long>::max()', {2:"1A"})
|
||||
|
||||
|
||||
def test_templates():
|
||||
check('class', "A<T>", None, "IE1AI1TE", output="template<> A<T>")
|
||||
check('class', "A<T>", {2:"IE1AI1TE"}, output="template<> A<T>")
|
||||
# first just check which objects support templating
|
||||
check('class', "template<> A", None, "IE1A")
|
||||
check('function', "template<> void A()", None, "IE1Av")
|
||||
check('member', "template<> A a", None, "IE1a")
|
||||
check('type', "template<> a = A", None, "IE1a")
|
||||
check('class', "template<> A", {2:"IE1A"})
|
||||
check('function', "template<> void A()", {2:"IE1Av"})
|
||||
check('member', "template<> A a", {2:"IE1a"})
|
||||
check('type', "template<> a = A", {2:"IE1a"})
|
||||
with pytest.raises(DefinitionError):
|
||||
parse('enum', "template<> A")
|
||||
with pytest.raises(DefinitionError):
|
||||
parse('enumerator', "template<> A")
|
||||
# then all the real tests
|
||||
check('class', "template<typename T1, typename T2> A", None, "I00E1A")
|
||||
check('type', "template<> a", None, "IE1a")
|
||||
check('class', "template<typename T1, typename T2> A", {2:"I00E1A"})
|
||||
check('type', "template<> a", {2:"IE1a"})
|
||||
|
||||
check('class', "template<typename T> A", None, "I0E1A")
|
||||
check('class', "template<class T> A", None, "I0E1A")
|
||||
check('class', "template<typename ...T> A", None, "IDpE1A")
|
||||
check('class', "template<typename...> A", None, "IDpE1A")
|
||||
check('class', "template<typename = Test> A", None, "I0E1A")
|
||||
check('class', "template<typename T = Test> A", None, "I0E1A")
|
||||
check('class', "template<typename T> A", {2:"I0E1A"})
|
||||
check('class', "template<class T> A", {2:"I0E1A"})
|
||||
check('class', "template<typename ...T> A", {2:"IDpE1A"})
|
||||
check('class', "template<typename...> A", {2:"IDpE1A"})
|
||||
check('class', "template<typename = Test> A", {2:"I0E1A"})
|
||||
check('class', "template<typename T = Test> A", {2:"I0E1A"})
|
||||
|
||||
check('class', "template<template<typename> typename T> A",
|
||||
None, "II0E0E1A")
|
||||
check('class', "template<int> A", None, "I_iE1A")
|
||||
check('class', "template<int T> A", None, "I_iE1A")
|
||||
check('class', "template<int... T> A", None, "I_DpiE1A")
|
||||
check('class', "template<int T = 42> A", None, "I_iE1A")
|
||||
check('class', "template<int = 42> A", None, "I_iE1A")
|
||||
check('class', "template<template<typename> typename T> A", {2:"II0E0E1A"})
|
||||
check('class', "template<int> A", {2:"I_iE1A"})
|
||||
check('class', "template<int T> A", {2:"I_iE1A"})
|
||||
check('class', "template<int... T> A", {2:"I_DpiE1A"})
|
||||
check('class', "template<int T = 42> A", {2:"I_iE1A"})
|
||||
check('class', "template<int = 42> A", {2:"I_iE1A"})
|
||||
|
||||
# from #2058
|
||||
check('function',
|
||||
@ -414,8 +501,8 @@ def test_templates():
|
||||
"inline std::basic_ostream<Char, Traits> &operator<<("
|
||||
"std::basic_ostream<Char, Traits> &os, "
|
||||
"const c_string_view_base<const Char, Traits> &str)",
|
||||
None, "I00ElsRNSt13basic_ostreamI4Char6TraitsEE"
|
||||
"RK18c_string_view_baseIK4Char6TraitsE")
|
||||
{2:"I00ElsRNSt13basic_ostreamI4Char6TraitsEE"
|
||||
"RK18c_string_view_baseIK4Char6TraitsE"})
|
||||
|
||||
# template introductions
|
||||
with pytest.raises(DefinitionError):
|
||||
@ -423,63 +510,64 @@ def test_templates():
|
||||
with pytest.raises(DefinitionError):
|
||||
parse('enumerator', 'abc::ns::foo{id_0, id_1, id_2} A')
|
||||
check('class', 'abc::ns::foo{id_0, id_1, id_2} xyz::bar',
|
||||
None, 'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barE')
|
||||
{2:'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barE'})
|
||||
check('class', 'abc::ns::foo{id_0, id_1, ...id_2} xyz::bar',
|
||||
None, 'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barE')
|
||||
{2:'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barE'})
|
||||
check('class', 'abc::ns::foo{id_0, id_1, id_2} xyz::bar<id_0, id_1, id_2>',
|
||||
None, 'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barI4id_04id_14id_2EE')
|
||||
{2:'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barI4id_04id_14id_2EE'})
|
||||
check('class', 'abc::ns::foo{id_0, id_1, ...id_2} xyz::bar<id_0, id_1, id_2...>',
|
||||
None, 'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barI4id_04id_1Dp4id_2EE')
|
||||
{2:'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barI4id_04id_1Dp4id_2EE'})
|
||||
|
||||
check('class', 'template<> Concept{U} A<int>::B',
|
||||
None, 'IEI0EX7ConceptI1UEEN1AIiE1BE')
|
||||
check('class', 'template<> Concept{U} A<int>::B', {2:'IEI0EX7ConceptI1UEEN1AIiE1BE'})
|
||||
|
||||
check('type', 'abc::ns::foo{id_0, id_1, id_2} xyz::bar = ghi::qux',
|
||||
None, 'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barE')
|
||||
{2:'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barE'})
|
||||
check('type', 'abc::ns::foo{id_0, id_1, ...id_2} xyz::bar = ghi::qux',
|
||||
None, 'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barE')
|
||||
{2:'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barE'})
|
||||
check('function', 'abc::ns::foo{id_0, id_1, id_2} void xyz::bar()',
|
||||
None, 'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barEv')
|
||||
{2:'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barEv'})
|
||||
check('function', 'abc::ns::foo{id_0, id_1, ...id_2} void xyz::bar()',
|
||||
None, 'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barEv')
|
||||
{2:'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barEv'})
|
||||
check('member', 'abc::ns::foo{id_0, id_1, id_2} ghi::qux xyz::bar',
|
||||
None, 'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barE')
|
||||
{2:'I000EXN3abc2ns3fooEI4id_04id_14id_2EEN3xyz3barE'})
|
||||
check('member', 'abc::ns::foo{id_0, id_1, ...id_2} ghi::qux xyz::bar',
|
||||
None, 'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barE')
|
||||
check('concept', 'Iterator{T, U} Another',
|
||||
None, 'I00EX8IteratorI1T1UEE7Another')
|
||||
{2:'I00DpEXN3abc2ns3fooEI4id_04id_1sp4id_2EEN3xyz3barE'})
|
||||
check('concept', 'Iterator{T, U} Another', {2:'I00EX8IteratorI1T1UEE7Another'})
|
||||
check('concept', 'template<typename ...Pack> Numerics = (... && Numeric<Pack>)',
|
||||
None, 'IDpE8Numerics')
|
||||
{2:'IDpE8Numerics'})
|
||||
|
||||
|
||||
def test_template_args():
|
||||
# from breathe#218
|
||||
check('function',
|
||||
"template<typename F> "
|
||||
"void allow(F *f, typename func<F, B, G!=1>::type tt)",
|
||||
None, "I0E5allowP1FN4funcI1F1BXG!=1EE4typeE")
|
||||
"void allow(F *f, typename func<F, B, G != 1>::type tt)",
|
||||
{2:"I0E5allowP1FN4funcI1F1BXG != 1EE4typeE",
|
||||
3:"I0E5allowP1FN4funcI1F1BXne1GL1EEE4typeE"})
|
||||
# from #3542
|
||||
check('type', "template<typename T> "
|
||||
"enable_if_not_array_t = std::enable_if_t<!is_array<T>::value, int>",
|
||||
None, "I0E21enable_if_not_array_t")
|
||||
{2:"I0E21enable_if_not_array_t"})
|
||||
|
||||
|
||||
def test_attributes():
|
||||
# style: C++
|
||||
check('member', '[[]] int f', 'f__i', '1f')
|
||||
check('member', '[ [ ] ] int f', 'f__i', '1f',
|
||||
check('member', '[[]] int f', {1:'f__i', 2:'1f'})
|
||||
check('member', '[ [ ] ] int f', {1:'f__i', 2:'1f'},
|
||||
# this will fail when the proper grammar is implemented
|
||||
output='[[ ]] int f')
|
||||
check('member', '[[a]] int f', 'f__i', '1f')
|
||||
check('member', '[[a]] int f', {1:'f__i', 2:'1f'})
|
||||
# style: GNU
|
||||
check('member', '__attribute__(()) int f', 'f__i', '1f')
|
||||
check('member', '__attribute__((a)) int f', 'f__i', '1f')
|
||||
check('member', '__attribute__((a, b)) int f', 'f__i', '1f')
|
||||
check('member', '__attribute__(()) int f', {1:'f__i', 2:'1f'})
|
||||
check('member', '__attribute__((a)) int f', {1:'f__i', 2:'1f'})
|
||||
check('member', '__attribute__((a, b)) int f', {1:'f__i', 2:'1f'})
|
||||
# style: user-defined id
|
||||
check('member', 'id_attr int f', 'f__i', '1f')
|
||||
check('member', 'id_attr int f', {1:'f__i', 2:'1f'})
|
||||
# style: user-defined paren
|
||||
check('member', 'paren_attr() int f', 'f__i', '1f')
|
||||
check('member', 'paren_attr(a) int f', 'f__i', '1f')
|
||||
check('member', 'paren_attr("") int f', 'f__i', '1f')
|
||||
check('member', 'paren_attr(()[{}][]{}) int f', 'f__i', '1f')
|
||||
check('member', 'paren_attr() int f', {1:'f__i', 2:'1f'})
|
||||
check('member', 'paren_attr(a) int f', {1:'f__i', 2:'1f'})
|
||||
check('member', 'paren_attr("") int f', {1:'f__i', 2:'1f'})
|
||||
check('member', 'paren_attr(()[{}][]{}) int f', {1:'f__i', 2:'1f'})
|
||||
with pytest.raises(DefinitionError):
|
||||
parse('member', 'paren_attr(() int f')
|
||||
with pytest.raises(DefinitionError):
|
||||
@ -495,7 +583,7 @@ def test_attributes():
|
||||
|
||||
# position: decl specs
|
||||
check('function', 'static inline __attribute__(()) void f()',
|
||||
'f', '1fv',
|
||||
{1:'f', 2:'1fv'},
|
||||
output='__attribute__(()) static inline void f()')
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ def test_process_doc(app):
|
||||
list_item)])
|
||||
|
||||
assert_node(toctree[0][0],
|
||||
[compact_paragraph, reference, "Welcome to Sphinx Tests's documentation!"])
|
||||
[compact_paragraph, reference, u"Welcome to Sphinx Tests’s documentation!"])
|
||||
assert_node(toctree[0][0][0], reference, anchorname='')
|
||||
assert_node(toctree[0][1][0], addnodes.toctree,
|
||||
caption="Table of Contents", glob=False, hidden=False,
|
||||
@ -150,7 +150,7 @@ def test_get_toc_for(app):
|
||||
addnodes.toctree)])],
|
||||
[list_item, compact_paragraph])]) # [2][0]
|
||||
assert_node(toctree[0][0],
|
||||
[compact_paragraph, reference, "Welcome to Sphinx Tests's documentation!"])
|
||||
[compact_paragraph, reference, u"Welcome to Sphinx Tests’s documentation!"])
|
||||
assert_node(toctree[0][1][2],
|
||||
([compact_paragraph, reference, "subsection"],
|
||||
[bullet_list, list_item, compact_paragraph, reference, "subsubsection"]))
|
||||
@ -177,7 +177,7 @@ def test_get_toc_for_only(app):
|
||||
addnodes.toctree)])],
|
||||
[list_item, compact_paragraph])]) # [2][0]
|
||||
assert_node(toctree[0][0],
|
||||
[compact_paragraph, reference, "Welcome to Sphinx Tests's documentation!"])
|
||||
[compact_paragraph, reference, u"Welcome to Sphinx Tests’s documentation!"])
|
||||
assert_node(toctree[0][1][1],
|
||||
([compact_paragraph, reference, "Section for HTML"],
|
||||
[bullet_list, addnodes.toctree]))
|
||||
|
@ -12,7 +12,6 @@
|
||||
import re
|
||||
|
||||
import pytest
|
||||
from util import SkipTest
|
||||
|
||||
|
||||
@pytest.mark.sphinx(
|
||||
@ -40,9 +39,9 @@ def test_jsmath(app, status, warning):
|
||||
def test_imgmath_png(app, status, warning):
|
||||
app.builder.build_all()
|
||||
if "LaTeX command 'latex' cannot be run" in warning.getvalue():
|
||||
raise SkipTest('LaTeX command "latex" is not available')
|
||||
raise pytest.skip.Exception('LaTeX command "latex" is not available')
|
||||
if "dvipng command 'dvipng' cannot be run" in warning.getvalue():
|
||||
raise SkipTest('dvipng command "dvipng" is not available')
|
||||
raise pytest.skip.Exception('dvipng command "dvipng" is not available')
|
||||
|
||||
content = (app.outdir / 'index.html').text()
|
||||
html = (r'<div class="math">\s*<p>\s*<img src="_images/math/\w+.png"'
|
||||
@ -56,9 +55,9 @@ def test_imgmath_png(app, status, warning):
|
||||
def test_imgmath_svg(app, status, warning):
|
||||
app.builder.build_all()
|
||||
if "LaTeX command 'latex' cannot be run" in warning.getvalue():
|
||||
raise SkipTest('LaTeX command "latex" is not available')
|
||||
raise pytest.skip.Exception('LaTeX command "latex" is not available')
|
||||
if "dvisvgm command 'dvisvgm' cannot be run" in warning.getvalue():
|
||||
raise SkipTest('dvisvgm command "dvisvgm" is not available')
|
||||
raise pytest.skip.Exception('dvisvgm command "dvisvgm" is not available')
|
||||
|
||||
content = (app.outdir / 'index.html').text()
|
||||
html = (r'<div class="math">\s*<p>\s*<img src="_images/math/\w+.svg"'
|
||||
|
@ -14,11 +14,12 @@ import pickle
|
||||
|
||||
from docutils import frontend, utils, nodes
|
||||
from docutils.parsers.rst import Parser as RstParser
|
||||
from docutils.transforms.universal import SmartQuotes
|
||||
|
||||
from sphinx import addnodes
|
||||
from sphinx.util import texescape
|
||||
from sphinx.util.docutils import sphinx_domains
|
||||
from sphinx.writers.html import HTMLWriter, SmartyPantsHTMLTranslator
|
||||
from sphinx.writers.html import HTMLWriter, HTMLTranslator
|
||||
from sphinx.writers.latex import LaTeXWriter, LaTeXTranslator
|
||||
import pytest
|
||||
|
||||
@ -31,6 +32,7 @@ def settings(app):
|
||||
optparser = frontend.OptionParser(
|
||||
components=(RstParser, HTMLWriter, LaTeXWriter))
|
||||
settings = optparser.get_default_values()
|
||||
settings.smart_quotes = True
|
||||
settings.env = app.builder.env
|
||||
settings.env.temp_data['docname'] = 'dummy'
|
||||
domain_context = sphinx_domains(settings.env)
|
||||
@ -46,6 +48,7 @@ def parse(settings):
|
||||
document['file'] = 'dummy'
|
||||
parser = RstParser()
|
||||
parser.parse(rst, document)
|
||||
SmartQuotes(document, startnode=None).apply()
|
||||
for msg in document.traverse(nodes.system_message):
|
||||
if msg['level'] == 1:
|
||||
msg.replace_self([])
|
||||
@ -62,7 +65,7 @@ class ForgivingTranslator:
|
||||
pass
|
||||
|
||||
|
||||
class ForgivingHTMLTranslator(SmartyPantsHTMLTranslator, ForgivingTranslator):
|
||||
class ForgivingHTMLTranslator(HTMLTranslator, ForgivingTranslator):
|
||||
pass
|
||||
|
||||
|
||||
@ -178,8 +181,8 @@ def get_verifier(verify, verify_re):
|
||||
# verify smarty-pants quotes
|
||||
'verify',
|
||||
'"John"',
|
||||
'<p>“John”</p>',
|
||||
r'\sphinxquotedblleft{}John\sphinxquotedblright{}',
|
||||
u'<p>“John”</p>',
|
||||
u"“John”",
|
||||
),
|
||||
(
|
||||
# ... but not in literal text
|
||||
|
@ -34,14 +34,14 @@ def test_docinfo(app, status, warning):
        'field name': u'This is a generic bibliographic field.',
        'field name 2': (u'Generic bibliographic fields may contain multiple '
                         u'body elements.\n\nLike this.'),
        'status': u'This is a "work in progress"',
        'status': u'This is a “work in progress”',
        'version': u'1',
        'copyright': (u'This document has been placed in the public domain. '
                      u'You\nmay do with it as you wish. You may copy, modify,'
                      u'\nredistribute, reattribute, sell, buy, rent, lease,\n'
                      u'destroy, or improve it, quote it at length, excerpt,\n'
                      u'incorporate, collate, fold, staple, or mutilate it, or '
                      u'do\nanything else to it that your or anyone else\'s '
                      u'do\nanything else to it that your or anyone else’s '
                      u'heart\ndesires.'),
        'contact': u'goodger@python.org',
        'date': u'2006-05-21',
@ -16,8 +16,6 @@ from six import PY2, text_type, StringIO
from six.moves import input
import pytest

from util import SkipTest

from sphinx import application
from sphinx import quickstart as qs
from sphinx.util.console import nocolor, coloron

@ -121,7 +119,7 @@ def test_do_prompt_with_nonascii():
    try:
        qs.do_prompt(d, 'k1', 'Q1', default=u'\u65e5\u672c')
    except UnicodeEncodeError:
        raise SkipTest(
        raise pytest.skip.Exception(
            'non-ASCII console input not supported on this encoding: %s',
            qs.TERM_ENCODING)
    assert d['k1'] == u'\u30c9\u30a4\u30c4'
tests/util.py

@ -11,11 +11,9 @@ import os
import re
import sys
import warnings
from functools import wraps
from xml.etree import ElementTree

from six import string_types
from six import StringIO

import pytest

@ -26,7 +24,6 @@ from sphinx import application
from sphinx.builders.latex import LaTeXBuilder
from sphinx.ext.autodoc import AutoDirective
from sphinx.pycode import ModuleAnalyzer
from sphinx.deprecation import RemovedInSphinx17Warning

from path import path
@ -201,160 +198,3 @@ def find_files(root, suffix=None):

def strip_escseq(text):
    return re.sub('\x1b.*?m', '', text)


# #############################################
# DEPRECATED implementations


def gen_with_app(*args, **kwargs):
    """
    **DEPRECATED**: use pytest.mark.parametrize instead.

    Decorate a test generator to pass a SphinxTestApp as the first argument to
    the test generator when it's executed.
    """
    def generator(func):
        @wraps(func)
        def deco(*args2, **kwargs2):
            status, warning = StringIO(), StringIO()
            kwargs['status'] = status
            kwargs['warning'] = warning
            app = SphinxTestApp(*args, **kwargs)
            try:
                for item in func(app, status, warning, *args2, **kwargs2):
                    yield item
            finally:
                app.cleanup()
        return deco
    return generator


def skip_if(condition, msg=None):
    """
    **DEPRECATED**: use pytest.mark.skipif instead.

    Decorator to skip test if condition is true.
    """
    return pytest.mark.skipif(condition, reason=(msg or 'conditional skip'))


def skip_unless(condition, msg=None):
    """
    **DEPRECATED**: use pytest.mark.skipif instead.

    Decorator to skip test if condition is false.
    """
    return pytest.mark.skipif(not condition, reason=(msg or 'conditional skip'))


def with_tempdir(func):
    """
    **DEPRECATED**: use tempdir fixture instead.
    """
    return func


def raises(exc, func, *args, **kwds):
    """
    **DEPRECATED**: use pytest.raises instead.

    Raise AssertionError if ``func(*args, **kwds)`` does not raise *exc*.
    """
    with pytest.raises(exc):
        func(*args, **kwds)


def raises_msg(exc, msg, func, *args, **kwds):
    """
    **DEPRECATED**: use pytest.raises instead.

    Raise AssertionError if ``func(*args, **kwds)`` does not raise *exc*,
    and check if the message contains *msg*.
    """
    with pytest.raises(exc) as excinfo:
        func(*args, **kwds)
    assert msg in str(excinfo.value)


def assert_true(v1, msg=''):
    """
    **DEPRECATED**: use assert instead.
    """
    assert v1, msg


def assert_equal(v1, v2, msg=''):
    """
    **DEPRECATED**: use assert instead.
    """
    assert v1 == v2, msg


def assert_in(x, thing, msg=''):
    """
    **DEPRECATED**: use assert instead.
    """
    if x not in thing:
        assert False, msg or '%r is not in %r' % (x, thing)


def assert_not_in(x, thing, msg=''):
    """
    **DEPRECATED**: use assert instead.
    """
    if x in thing:
        assert False, msg or '%r is in %r' % (x, thing)


class ListOutput(object):
    """
    File-like object that collects written text in a list.
    """
    def __init__(self, name):
        self.name = name
        self.content = []

    def reset(self):
        del self.content[:]

    def write(self, text):
        self.content.append(text)


# **DEPRECATED**: use pytest.skip instead.
SkipTest = pytest.skip.Exception


class _DeprecationWrapper(object):
    def __init__(self, mod, deprecated):
        self._mod = mod
        self._deprecated = deprecated

    def __getattr__(self, attr):
        if attr in self._deprecated:
            obj, instead = self._deprecated[attr]
            warnings.warn("tests/util.py::%s is deprecated and will be "
                          "removed in Sphinx 1.7, please use %s instead."
                          % (attr, instead),
                          RemovedInSphinx17Warning, stacklevel=2)
            return obj
        return getattr(self._mod, attr)


sys.modules[__name__] = _DeprecationWrapper(sys.modules[__name__], dict(  # type: ignore
    with_app=(pytest.mark.sphinx, 'pytest.mark.sphinx'),
    TestApp=(SphinxTestApp, 'SphinxTestApp'),
    gen_with_app=(gen_with_app, 'pytest.mark.parametrize'),
    skip_if=(skip_if, 'pytest.skipif'),
    skip_unless=(skip_unless, 'pytest.skipif'),
    with_tempdir=(with_tempdir, 'tmpdir pytest fixture'),
    raises=(raises, 'pytest.raises'),
    raises_msg=(raises_msg, 'pytest.raises'),
    assert_true=(assert_true, 'assert'),
    assert_equal=(assert_equal, 'assert'),
    assert_in=(assert_in, 'assert'),
    assert_not_in=(assert_not_in, 'assert'),
    ListOutput=(ListOutput, 'StringIO'),
    SkipTest=(SkipTest, 'pytest.skip'),
))
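The block removed above is the layer of deprecated helpers that ``tests/util.py`` had been carrying; each docstring already names the pytest idiom meant to replace it. A short sketch of what a migrated test looks like, with made-up test values:

import sys

import pytest


@pytest.mark.skipif(sys.version_info < (3,), reason='Python 3 only')  # was: skip_if / skip_unless
def test_division():
    assert 7 // 2 == 3                      # was: assert_equal(7 // 2, 3)
    assert 'py' in 'pytest'                 # was: assert_in('py', 'pytest')
    with pytest.raises(ZeroDivisionError):  # was: raises(ZeroDivisionError, ...)
        1 / 0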
@ -5,6 +5,7 @@ from __future__ import print_function
import os
import re
import sys
import argparse
from datetime import datetime
from contextlib import contextmanager

@ -14,17 +15,22 @@ package_dir = os.path.abspath(os.path.join(script_dir, '..'))
RELEASE_TYPE = {'a': 'alpha', 'b': 'beta'}


def stringify_version(version_info):
def stringify_version(version_info, in_develop=True):
    if version_info[2] == 0:
        return '.'.join(str(v) for v in version_info[:2])
        version = '.'.join(str(v) for v in version_info[:2])
    else:
        return '.'.join(str(v) for v in version_info[:3])
        version = '.'.join(str(v) for v in version_info[:3])

    if not in_develop and version_info[3] != 'final':
        version += version_info[3][0] + str(version_info[4])

    return version


def bump_version(path, version_info):
    version = stringify_version(version_info)
def bump_version(path, version_info, in_develop=True):
    version = stringify_version(version_info, in_develop)
    release = version
    if version_info[3] != 'final':
        if in_develop:
            version += '+'

    with open(path, 'r+') as f:
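With the new ``in_develop`` flag, ``stringify_version`` only appends the pre-release suffix when an actual release is being prepared. A runnable sketch of the resulting strings, assuming ``version_info`` tuples shaped like the one in ``sphinx/__init__.py``:

def stringify_version(version_info, in_develop=True):
    # copied from the diff above so the example runs on its own
    if version_info[2] == 0:
        version = '.'.join(str(v) for v in version_info[:2])
    else:
        version = '.'.join(str(v) for v in version_info[:3])
    if not in_develop and version_info[3] != 'final':
        version += version_info[3][0] + str(version_info[4])
    return version


print(stringify_version((1, 6, 0, 'beta', 3)))                     # -> 1.6 (development form)
print(stringify_version((1, 6, 0, 'beta', 3), in_develop=False))   # -> 1.6b3 (release form)
print(stringify_version((1, 6, 1, 'final', 0), in_develop=False))  # -> 1.6.1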
@ -143,19 +149,25 @@ class Changes(object):
            f.write(body)


def main():
    if len(sys.argv) != 2:
        print("bump_version.py [version]")
        return -1
def parse_options(argv):
    parser = argparse.ArgumentParser()
    parser.add_argument('version', help='A version number (cf. 1.6b0)')
    parser.add_argument('--in-develop', action='store_true')
    options = parser.parse_args(argv)
    options.version = parse_version(options.version)
    return options

    version_info = parse_version(sys.argv[-1])

def main():
    options = parse_options(sys.argv[1:])

    with processing("Rewriting sphinx/__init__.py"):
        bump_version(os.path.join(package_dir, 'sphinx/__init__.py'), version_info)
        bump_version(os.path.join(package_dir, 'sphinx/__init__.py'),
                     options.version, options.in_develop)

    with processing('Rewriting CHANGES'):
        changes = Changes(os.path.join(package_dir, 'CHANGES'))
        if changes.version_info == version_info:
        if changes.version_info == options.version:
            if changes.in_development:
                changes.finalize_release_date()
            else:

@ -163,7 +175,7 @@ def main():
        else:
            if changes.in_development:
                print('WARNING: last version is not released yet: %s' % changes.version)
            changes.add_release(version_info)
            changes.add_release(options.version)


if __name__ == '__main__':
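``main()`` now delegates argument handling to ``parse_options``, so the script takes the version plus an explicit ``--in-develop`` flag instead of reading ``sys.argv[-1]`` directly. A self-contained sketch of just the argparse part; the ``parse_version()`` conversion is left out here, so the version stays a plain string:

import argparse


def parse_options(argv):
    # same parser as in bump_version.py, minus the parse_version() conversion
    parser = argparse.ArgumentParser()
    parser.add_argument('version', help='A version number (cf. 1.6b0)')
    parser.add_argument('--in-develop', action='store_true')
    return parser.parse_args(argv)


opts = parse_options(['--in-develop', '1.6b3'])
print(opts.version, opts.in_develop)  # -> 1.6b3 True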
@ -4,7 +4,12 @@ Release checklist
* open https://travis-ci.org/sphinx-doc/sphinx/branches and check stable branch is green
* Check `git status`
* Run `make style-check`
* if final major release ...
* Update sphinx/locale/sphinx.pot if first major release (beta1)

  * Run `python setup.py extract_messages`
  * Run `(cd sphinx/locale; tx push -s)`

* Update sphinx/locale/<lang>/ files if final major release ...

  * Run `(cd sphinx/locale; tx pull -a -f)`
  * Run `python setup.py compile_catalog`

@ -21,7 +26,7 @@ Release checklist
* `git push origin stable --tags`
* open https://readthedocs.org/dashboard/sphinx/versions/ and enable the released version
* Add new version/milestone to tracker categories
* `python utils/bump_version.py a.b.cb0` (ex. 1.5.3b0)
* `python utils/bump_version.py --in-develop a.b.cb0` (ex. 1.5.3b0)
* Check diff by `git diff`
* `git commit -am 'Bump version'`
* `git push origin stable`