Mirror of https://github.com/sphinx-doc/sphinx.git
(synced 2025-02-25 18:55:22 -06:00)

Merge branch 'master' into latex_font_for_pdflatex

This commit is contained in: commit 2e22bd0a23

CHANGES (27 changed lines)
@@ -17,6 +17,8 @@ Incompatible changes
   MathJax configuration may have to set the old MathJax path or update their
   configuration for version 3. See :mod:`sphinx.ext.mathjax`.
 * #7784: i18n: The msgid for alt text of image is changed
+* #5560: napoleon: :confval:`napoleon_use_param` also affect "other parameters"
+  section
 * #7996: manpage: Make a section directory on build manpage by default (see
   :confval:`man_make_section_directory`)
 * #8380: html search: search results are wrapped with ``<p>`` instead of
@@ -75,9 +77,16 @@ Incompatible changes
 Deprecated
 ----------
 
+* pending_xref node for viewcode extension
+* ``sphinx.builders.linkcheck.CheckExternalLinksBuilder.broken``
+* ``sphinx.builders.linkcheck.CheckExternalLinksBuilder.good``
+* ``sphinx.builders.linkcheck.CheckExternalLinksBuilder.redirected``
+* ``sphinx.builders.linkcheck.node_line_or_0()``
 * ``sphinx.ext.autodoc.AttributeDocumenter.isinstanceattribute()``
 * ``sphinx.ext.autodoc.directive.DocumenterBridge.reporter``
 * ``sphinx.ext.autodoc.importer.get_module_members()``
+* ``sphinx.ext.autosummary.generate._simple_info()``
+* ``sphinx.ext.autosummary.generate._simple_warn()``
 
 Features added
 --------------
@@ -85,6 +94,8 @@ Features added
 * #8022: autodoc: autodata and autoattribute directives does not show right-hand
   value of the variable if docstring contains ``:meta hide-value:`` in
   info-field-list
+* #8514: autodoc: Default values of overloaded functions are taken from actual
+  implementation if they're ellipsis
 * #8619: html: kbd role generates customizable HTML tags for compound keys
 * #8634: html: Allow to change the order of JS/CSS via ``priority`` parameter
   for :meth:`Sphinx.add_js_file()` and :meth:`Sphinx.add_css_file()`
@@ -93,12 +104,16 @@ Features added
   :event:`html-page-context` event
 * #8649: imgconverter: Skip availability check if builder supports the image
   type
+* #8573: napoleon: Allow to change the style of custom sections using
+  :confval:`napoleon_custom_styles`
 * #6241: mathjax: Include mathjax.js only on the document using equations
 * #8651: std domain: cross-reference for a rubric having inline item is broken
+* #8681: viewcode: Support incremental build
 * #8132: Add :confval:`project_copyright` as an alias of :confval:`copyright`
 * #207: Now :confval:`highlight_language` supports multiple languages
 * #2030: :rst:dir:`code-block` and :rst:dir:`literalinclude` supports automatic
   dedent via no-argument ``:dedent:`` option
+* C++, also hyperlink operator overloads in expressions and alias declarations.
 
 Bugs fixed
 ----------
@@ -110,19 +125,27 @@ Bugs fixed
 * #8315: autodoc: Failed to resolve struct.Struct type annotation
 * #8652: autodoc: All variable comments in the module are ignored if the module
   contains invalid type comments
+* #8693: autodoc: Default values for overloaded functions are rendered as string
 * #8306: autosummary: mocked modules are documented as empty page when using
   :recursive: option
 * #8618: html: kbd role produces incorrect HTML when compound-key separators (-,
   + or ^) are used as keystrokes
 * #8629: html: A type warning for html_use_opensearch is shown twice
+* #8714: html: kbd role with "Caps Lock" rendered incorrectly
 * #8665: html theme: Could not override globaltoc_maxdepth in theme.conf
+* #4304: linkcheck: Fix race condition that could lead to checking the
+  availability of the same URL twice
 * #8094: texinfo: image files on the different directory with document are not
   copied
+* #8720: viewcode: module pages are generated for epub on incremental build
+* #8704: viewcode: anchors are generated in incremental build after singlehtml
 * #8671: :confval:`highlight_options` is not working
 * #8341: C, fix intersphinx lookup types for names in declarations.
 * C, C++: in general fix intersphinx and role lookup types.
 * #8683: :confval:`html_last_updated_fmt` does not support UTC offset (%z)
 * #8683: :confval:`html_last_updated_fmt` generates wrong time zone for %Z
+* #1112: ``download`` role creates duplicated copies when relative path is
+  specified
 
 Testing
 --------
@@ -145,6 +168,10 @@ Features added
 Bugs fixed
 ----------
 
+* #8655: autodoc: Failed to generate document if target module contains an
+  object that raises an exception on ``hasattr()``
+* C, ``expr`` role should start symbol lookup in the current scope.
+
 Testing
 --------
 
@@ -2,10 +2,12 @@
 Extending Sphinx
 ================
 
-This guide is aimed at those wishing to develop their own extensions for
-Sphinx. Sphinx possesses significant extensibility capabilities including the
-ability to hook into almost every point of the build process. If you simply
-wish to use Sphinx with existing extensions, refer to :doc:`/usage/index`.
+This guide is aimed at giving a quick introduction for those wishing to
+develop their own extensions for Sphinx. Sphinx possesses significant
+extensibility capabilities including the ability to hook into almost every
+point of the build process. If you simply wish to use Sphinx with existing
+extensions, refer to :doc:`/usage/index`. For a more detailed discussion of
+the extension interface see :doc:`/extdev/index`.
 
 .. toctree::
    :maxdepth: 2
@@ -61,6 +61,31 @@ The following is a list of deprecated interfaces.
      - 6.0
      - ``docutils.utils.smartyquotes``
 
+   * - pending_xref node for viewcode extension
+     - 3.5
+     - 5.0
+     - ``sphinx.ext.viewcode.viewcode_anchor``
+
+   * - ``sphinx.builders.linkcheck.CheckExternalLinksBuilder.broken``
+     - 3.5
+     - 5.0
+     - N/A
+
+   * - ``sphinx.builders.linkcheck.CheckExternalLinksBuilder.good``
+     - 3.5
+     - 5.0
+     - N/A
+
+   * - ``sphinx.builders.linkcheck.CheckExternalLinksBuilder.redirected``
+     - 3.5
+     - 5.0
+     - N/A
+
+   * - ``sphinx.builders.linkcheck.node_line_or_0()``
+     - 3.5
+     - 5.0
+     - ``sphinx.util.nodes.get_node_line()``
+
    * - ``sphinx.ext.autodoc.AttributeDocumenter.isinstanceattribute()``
      - 3.5
      - 5.0
@@ -71,6 +96,16 @@ The following is a list of deprecated interfaces.
      - 5.0
      - ``sphinx.ext.autodoc.ModuleDocumenter.get_module_members()``
 
+   * - ``sphinx.ext.autosummary.generate._simple_info()``
+     - 3.5
+     - 5.0
+     - :ref:`logging-api`
+
+   * - ``sphinx.ext.autosummary.generate._simple_warn()``
+     - 3.5
+     - 5.0
+     - :ref:`logging-api`
+
    * - The ``follow_wrapped`` argument of ``sphinx.util.inspect.signature()``
      - 3.4
      - 5.0
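The deprecated attributes added to the table above (``good``, ``broken``, ``redirected``, …) are typically kept working behind properties that warn on access, as the linkcheck diff later in this commit does. A minimal generic sketch of that pattern, using the stdlib ``DeprecationWarning`` in place of Sphinx's ``RemovedInSphinx50Warning`` and a hypothetical ``LinkStore`` class:

```python
import warnings


class RemovedInVersion50Warning(DeprecationWarning):
    """Stand-in for Sphinx's RemovedInSphinx50Warning (illustrative only)."""


class LinkStore:
    """Keep a renamed public attribute alive behind a warning property."""

    def __init__(self) -> None:
        self._good = set()  # new private storage for the old ``good`` attribute

    @property
    def good(self):
        # Old name still works, but each access emits a deprecation warning.
        warnings.warn("%s.good is deprecated." % self.__class__.__name__,
                      RemovedInVersion50Warning, stacklevel=2)
        return self._good
```

Callers keep functioning unchanged during the deprecation window, while the warning (with ``stacklevel=2``) points at the caller's line rather than the property body.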
@@ -171,8 +171,8 @@ Keys that you may want to override include:
    "Bjornstrup". You can also set this to ``''`` to disable fncychap.
 
    Default: ``'\\usepackage[Bjarne]{fncychap}'`` for English documents,
    ``'\\usepackage[Sonny]{fncychap}'`` for internationalized documents, and
    ``''`` for Japanese documents.
 
 ``'preamble'``
    Additional preamble content. One may move all needed macros into some file
@@ -276,7 +276,7 @@ Keys that don't need to be overridden unless in special cases are:
    "inputenc" package inclusion.
 
    Default: ``'\\usepackage[utf8]{inputenc}'`` when using pdflatex, else
    ``''``
 
    .. versionchanged:: 1.4.3
       Previously ``'\\usepackage[utf8]{inputenc}'`` was used for all
@@ -360,8 +360,8 @@ Keys that don't need to be overridden unless in special cases are:
    ``'pdflatex'`` support Greek Unicode input in :rst:dir:`math` context.
    For example ``:math:`α``` (U+03B1) will render as :math:`\alpha`.
 
-   For wider Unicode support in math input, see the discussion of
-   :confval:`latex_engine`.
+   Default: ``'\\usepackage{textalpha}'`` or ``''`` if ``fontenc`` does not
+   include the ``LGR`` option.
 
    .. versionadded:: 2.0
 
@@ -379,7 +379,7 @@ Keys that don't need to be overridden unless in special cases are:
    <latexsphinxsetup>`.
 
    Default: ``'\\usepackage{geometry}'`` (or
    ``'\\usepackage[dvipdfm]{geometry}'`` for Japanese documents)
 
    .. versionadded:: 1.5
 
@@ -762,14 +762,14 @@ macros may be significant.
 |warningbdcolors|
    The colour for the admonition frame.
 
    Default: ``{rgb}{0,0,0}`` (black)
 
 .. only:: latex
 
    |wgbdcolorslatex|
       The colour for the admonition frame.
 
       Default: ``{rgb}{0,0,0}`` (black)
 
 |warningbgcolors|
    The background colours for the respective admonitions.
@@ -546,4 +546,28 @@ sure that "sphinx.ext.napoleon" is enabled in `conf.py`::
    If an attribute is documented in the docstring without a type and
    has an annotation in the class body, that type is used.
 
    .. versionadded:: 3.4
+
+.. confval:: napoleon_custom_sections
+
+   Add a list of custom sections to include, expanding the list of parsed sections.
+
+   *Defaults to None.*
+
+   The entries can either be strings or tuples, depending on the intention:
+
+   * To create a custom "generic" section, just pass a string.
+   * To create an alias for an existing section, pass a tuple containing the
+     alias name and the original, in that order.
+   * To create a custom section that displays like the parameters or returns
+     section, pass a tuple containing the custom section name and a string
+     value, "params_style" or "returns_style".
+
+   If an entry is just a string, it is interpreted as a header for a generic
+   section. If the entry is a tuple/list/indexed container, the first entry
+   is the name of the section, the second is the section key to emulate. If the
+   second entry value is "params_style" or "returns_style", the custom section
+   will be displayed like the parameters section or returns section.
+
+   .. versionadded:: 1.8
+   .. versionchanged:: 3.5
+      Support ``params_style`` and ``returns_style``
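Based on the entry semantics documented in the hunk above, a ``conf.py`` fragment exercising all three entry forms might look like this (the section names are illustrative, not taken from the commit):

```python
# conf.py (fragment) -- sketch of the napoleon_custom_sections entry forms
extensions = ['sphinx.ext.napoleon']

napoleon_custom_sections = [
    'Side Effects',                      # plain string -> new generic section
    ('Class Attributes', 'Attributes'),  # tuple -> alias for existing "Attributes"
    ('Config Options', 'params_style'),  # tuple -> rendered like the Parameters section
]
```

With this configuration, a docstring section headed "Config Options" would be parsed and rendered with the same field layout as "Parameters".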
setup.py (2 changed lines)
@@ -44,7 +44,7 @@ extras_require = {
     'lint': [
         'flake8>=3.5.0',
         'isort',
-        'mypy>=0.790',
+        'mypy>=0.800',
         'docutils-stubs',
     ],
     'test': [
@@ -88,7 +88,7 @@ class Stylesheet(str):
 
     def __new__(cls, filename: str, *args: str, priority: int = 500, **attributes: Any
                 ) -> "Stylesheet":
-        self = str.__new__(cls, filename)  # type: ignore
+        self = str.__new__(cls, filename)
         self.filename = filename
         self.priority = priority
         self.attributes = attributes
@@ -113,7 +113,7 @@ class JavaScript(str):
     priority = None  # type: int
 
     def __new__(cls, filename: str, priority: int = 500, **attributes: str) -> "JavaScript":
-        self = str.__new__(cls, filename)  # type: ignore
+        self = str.__new__(cls, filename)
         self.filename = filename
         self.priority = priority
         self.attributes = attributes
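The ``str.__new__`` pattern used by ``Stylesheet`` and ``JavaScript`` above — a string value that also carries metadata — can be sketched in isolation. The class name ``Asset`` here is hypothetical; this is a simplified model of the pattern, not Sphinx's implementation:

```python
from typing import Any


class Asset(str):
    """A str subclass that also carries filename metadata and a priority.

    Because str is immutable, the string value must be fixed in __new__;
    extra attributes can then be attached to the instance as usual.
    """

    def __new__(cls, filename: str, priority: int = 500,
                **attributes: Any) -> "Asset":
        self = str.__new__(cls, filename)
        self.filename = filename
        self.priority = priority
        self.attributes = attributes
        return self
```

An ``Asset`` compares equal to its plain filename string (so existing code that treats these values as paths keeps working), while new code can sort by ``priority`` or read the extra attributes.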
@@ -9,7 +9,7 @@
 """
 
 import re
-from typing import Any, Dict
+from typing import Any, Dict, List
 
 from docutils import nodes
 
@@ -38,18 +38,29 @@ class KeyboardTransform(SphinxPostTransform):
     default_priority = 400
     builders = ('html',)
     pattern = re.compile(r'(?<=.)(-|\+|\^|\s+)(?=.)')
+    multiwords_keys = (('caps', 'lock'),
+                       ('page' 'down'),
+                       ('page', 'up'),
+                       ('scroll' 'lock'),
+                       ('num', 'lock'),
+                       ('sys' 'rq'),
+                       ('back' 'space'))
 
     def run(self, **kwargs: Any) -> None:
         matcher = NodeMatcher(nodes.literal, classes=["kbd"])
         for node in self.document.traverse(matcher):  # type: nodes.literal
             parts = self.pattern.split(node[-1].astext())
-            if len(parts) == 1:
+            if len(parts) == 1 or self.is_multiwords_key(parts):
                 continue
 
             node['classes'].append('compound')
             node.pop()
             while parts:
-                key = parts.pop(0)
+                if self.is_multiwords_key(parts):
+                    key = ''.join(parts[:3])
+                    parts[:3] = []
+                else:
+                    key = parts.pop(0)
                 node += nodes.literal('', key, classes=["kbd"])
 
             try:
@@ -59,6 +70,16 @@ class KeyboardTransform(SphinxPostTransform):
             except IndexError:
                 pass
 
+    def is_multiwords_key(self, parts: List[str]) -> bool:
+        if len(parts) >= 3 and parts[1].strip() == '':
+            name = parts[0].lower(), parts[2].lower()
+            if name in self.multiwords_keys:
+                return True
+            else:
+                return False
+        else:
+            return False
+
 
 def setup(app: Sphinx) -> Dict[str, Any]:
     app.add_post_transform(KeyboardTransform)
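The separator split and the multi-word key check added in the hunk above can be exercised standalone. A minimal sketch (free functions instead of the transform class): note that entries written as ``('page' 'down')`` in the diff rely on Python's implicit string concatenation and so are single strings like ``'pagedown'`` rather than pairs, which is why every entry below is spelled as an explicit tuple.

```python
import re

# Same separator pattern as KeyboardTransform: split on -, +, ^ or whitespace,
# but only between two characters; the capturing group keeps the separator.
pattern = re.compile(r'(?<=.)(-|\+|\^|\s+)(?=.)')

# Known two-word key names, as explicit pairs for clarity.
multiwords_keys = {('caps', 'lock'), ('page', 'down'), ('page', 'up'),
                   ('scroll', 'lock'), ('num', 'lock'), ('sys', 'rq'),
                   ('back', 'space')}


def is_multiwords_key(parts):
    """Return True if the first three parts spell a known two-word key name
    (word, whitespace separator, word)."""
    if len(parts) >= 3 and parts[1].strip() == '':
        return (parts[0].lower(), parts[2].lower()) in multiwords_keys
    return False
```

So ``"Caps Lock"`` splits into three parts but is recognised as a single key and left as one ``<kbd>``, while ``"Ctrl+X"`` still becomes a compound of two keys.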
@ -14,21 +14,24 @@ import re
|
|||||||
import socket
|
import socket
|
||||||
import threading
|
import threading
|
||||||
import time
|
import time
|
||||||
|
import warnings
|
||||||
from datetime import datetime, timezone
|
from datetime import datetime, timezone
|
||||||
from email.utils import parsedate_to_datetime
|
from email.utils import parsedate_to_datetime
|
||||||
from html.parser import HTMLParser
|
from html.parser import HTMLParser
|
||||||
from os import path
|
from os import path
|
||||||
from typing import Any, Dict, List, NamedTuple, Optional, Set, Tuple
|
from typing import Any, Dict, List, NamedTuple, Optional, Set, Tuple, cast
|
||||||
from urllib.parse import unquote, urlparse
|
from urllib.parse import unquote, urlparse
|
||||||
|
|
||||||
from docutils import nodes
|
from docutils import nodes
|
||||||
from docutils.nodes import Element, Node
|
from docutils.nodes import Element
|
||||||
from requests import Response
|
from requests import Response
|
||||||
from requests.exceptions import HTTPError, TooManyRedirects
|
from requests.exceptions import HTTPError, TooManyRedirects
|
||||||
|
|
||||||
from sphinx.application import Sphinx
|
from sphinx.application import Sphinx
|
||||||
from sphinx.builders import Builder
|
from sphinx.builders.dummy import DummyBuilder
|
||||||
|
from sphinx.deprecation import RemovedInSphinx50Warning
|
||||||
from sphinx.locale import __
|
from sphinx.locale import __
|
||||||
|
from sphinx.transforms.post_transforms import SphinxPostTransform
|
||||||
from sphinx.util import encode_uri, logging, requests
|
from sphinx.util import encode_uri, logging, requests
|
||||||
from sphinx.util.console import darkgray, darkgreen, purple, red, turquoise # type: ignore
|
from sphinx.util.console import darkgray, darkgreen, purple, red, turquoise # type: ignore
|
||||||
from sphinx.util.nodes import get_node_line
|
from sphinx.util.nodes import get_node_line
|
||||||
@ -37,6 +40,10 @@ logger = logging.getLogger(__name__)
|
|||||||
|
|
||||||
uri_re = re.compile('([a-z]+:)?//') # matches to foo:// and // (a protocol relative URL)
|
uri_re = re.compile('([a-z]+:)?//') # matches to foo:// and // (a protocol relative URL)
|
||||||
|
|
||||||
|
Hyperlink = NamedTuple('Hyperlink', (('next_check', float),
|
||||||
|
('uri', Optional[str]),
|
||||||
|
('docname', Optional[str]),
|
||||||
|
('lineno', Optional[int])))
|
||||||
RateLimit = NamedTuple('RateLimit', (('delay', float), ('next_check', float)))
|
RateLimit = NamedTuple('RateLimit', (('delay', float), ('next_check', float)))
|
||||||
|
|
||||||
DEFAULT_REQUEST_HEADERS = {
|
DEFAULT_REQUEST_HEADERS = {
|
||||||
@ -52,6 +59,8 @@ def node_line_or_0(node: Element) -> int:
|
|||||||
PriorityQueue items must be comparable. The line number is part of the
|
PriorityQueue items must be comparable. The line number is part of the
|
||||||
tuple used by the PriorityQueue, keep an homogeneous type for comparison.
|
tuple used by the PriorityQueue, keep an homogeneous type for comparison.
|
||||||
"""
|
"""
|
||||||
|
warnings.warn('node_line_or_0() is deprecated.',
|
||||||
|
RemovedInSphinx50Warning, stacklevel=2)
|
||||||
return get_node_line(node) or 0
|
return get_node_line(node) or 0
|
||||||
|
|
||||||
|
|
||||||
@ -89,7 +98,7 @@ def check_anchor(response: requests.requests.Response, anchor: str) -> bool:
|
|||||||
return parser.found
|
return parser.found
|
||||||
|
|
||||||
|
|
||||||
class CheckExternalLinksBuilder(Builder):
|
class CheckExternalLinksBuilder(DummyBuilder):
|
||||||
"""
|
"""
|
||||||
Checks for broken external links.
|
Checks for broken external links.
|
||||||
"""
|
"""
|
||||||
@ -98,14 +107,15 @@ class CheckExternalLinksBuilder(Builder):
|
|||||||
'%(outdir)s/output.txt')
|
'%(outdir)s/output.txt')
|
||||||
|
|
||||||
def init(self) -> None:
|
def init(self) -> None:
|
||||||
|
self.hyperlinks = {} # type: Dict[str, Hyperlink]
|
||||||
self.to_ignore = [re.compile(x) for x in self.app.config.linkcheck_ignore]
|
self.to_ignore = [re.compile(x) for x in self.app.config.linkcheck_ignore]
|
||||||
self.anchors_ignore = [re.compile(x)
|
self.anchors_ignore = [re.compile(x)
|
||||||
for x in self.app.config.linkcheck_anchors_ignore]
|
for x in self.app.config.linkcheck_anchors_ignore]
|
||||||
self.auth = [(re.compile(pattern), auth_info) for pattern, auth_info
|
self.auth = [(re.compile(pattern), auth_info) for pattern, auth_info
|
||||||
in self.app.config.linkcheck_auth]
|
in self.app.config.linkcheck_auth]
|
||||||
self.good = set() # type: Set[str]
|
self._good = set() # type: Set[str]
|
||||||
self.broken = {} # type: Dict[str, str]
|
self._broken = {} # type: Dict[str, str]
|
||||||
self.redirected = {} # type: Dict[str, Tuple[str, int]]
|
self._redirected = {} # type: Dict[str, Tuple[str, int]]
|
||||||
# set a timeout for non-responding servers
|
# set a timeout for non-responding servers
|
||||||
socket.setdefaulttimeout(5.0)
|
socket.setdefaulttimeout(5.0)
|
||||||
# create output file
|
# create output file
|
||||||
@ -123,6 +133,33 @@ class CheckExternalLinksBuilder(Builder):
|
|||||||
thread.start()
|
thread.start()
|
||||||
self.workers.append(thread)
|
self.workers.append(thread)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def good(self):
|
||||||
|
warnings.warn(
|
||||||
|
"%s.%s is deprecated." % (self.__class__.__name__, "good"),
|
||||||
|
RemovedInSphinx50Warning,
|
||||||
|
stacklevel=2,
|
||||||
|
)
|
||||||
|
return self._good
|
||||||
|
|
||||||
|
@property
|
||||||
|
def broken(self):
|
||||||
|
warnings.warn(
|
||||||
|
"%s.%s is deprecated." % (self.__class__.__name__, "broken"),
|
||||||
|
RemovedInSphinx50Warning,
|
||||||
|
stacklevel=2,
|
||||||
|
)
|
||||||
|
return self._broken
|
||||||
|
|
||||||
|
@property
|
||||||
|
def redirected(self):
|
||||||
|
warnings.warn(
|
||||||
|
"%s.%s is deprecated." % (self.__class__.__name__, "redirected"),
|
||||||
|
RemovedInSphinx50Warning,
|
||||||
|
stacklevel=2,
|
||||||
|
)
|
||||||
|
return self._redirected
|
||||||
|
|
||||||
def check_thread(self) -> None:
|
def check_thread(self) -> None:
|
||||||
kwargs = {}
|
kwargs = {}
|
||||||
if self.app.config.linkcheck_timeout:
|
if self.app.config.linkcheck_timeout:
|
||||||
@ -251,14 +288,14 @@ class CheckExternalLinksBuilder(Builder):
|
|||||||
if rex.match(uri):
|
if rex.match(uri):
|
||||||
return 'ignored', '', 0
|
return 'ignored', '', 0
|
||||||
else:
|
else:
|
||||||
self.broken[uri] = ''
|
self._broken[uri] = ''
|
||||||
return 'broken', '', 0
|
return 'broken', '', 0
|
||||||
elif uri in self.good:
|
elif uri in self._good:
|
||||||
return 'working', 'old', 0
|
return 'working', 'old', 0
|
||||||
elif uri in self.broken:
|
elif uri in self._broken:
|
||||||
return 'broken', self.broken[uri], 0
|
return 'broken', self._broken[uri], 0
|
||||||
elif uri in self.redirected:
|
elif uri in self._redirected:
|
||||||
return 'redirected', self.redirected[uri][0], self.redirected[uri][1]
|
return 'redirected', self._redirected[uri][0], self._redirected[uri][1]
|
||||||
for rex in self.to_ignore:
|
for rex in self.to_ignore:
|
||||||
if rex.match(uri):
|
if rex.match(uri):
|
||||||
return 'ignored', '', 0
|
return 'ignored', '', 0
|
||||||
@ -270,11 +307,11 @@ class CheckExternalLinksBuilder(Builder):
|
|||||||
break
|
break
|
||||||
|
|
||||||
if status == "working":
|
if status == "working":
|
||||||
self.good.add(uri)
|
self._good.add(uri)
|
||||||
elif status == "broken":
|
elif status == "broken":
|
||||||
self.broken[uri] = info
|
self._broken[uri] = info
|
||||||
elif status == "redirected":
|
elif status == "redirected":
|
||||||
self.redirected[uri] = (info, code)
|
self._redirected[uri] = (info, code)
|
||||||
|
|
||||||
return (status, info, code)
|
return (status, info, code)
|
||||||
|
|
||||||
@ -396,46 +433,6 @@ class CheckExternalLinksBuilder(Builder):
|
|||||||
lineno, uri + ' to ' + info)
|
lineno, uri + ' to ' + info)
|
||||||
self.write_linkstat(linkstat)
|
self.write_linkstat(linkstat)
|
||||||
|
|
||||||
def get_target_uri(self, docname: str, typ: str = None) -> str:
|
|
||||||
return ''
|
|
||||||
|
|
||||||
def get_outdated_docs(self) -> Set[str]:
|
|
||||||
return self.env.found_docs
|
|
||||||
|
|
||||||
def prepare_writing(self, docnames: Set[str]) -> None:
|
|
||||||
return
|
|
||||||
|
|
||||||
def write_doc(self, docname: str, doctree: Node) -> None:
|
|
||||||
logger.info('')
|
|
||||||
n = 0
|
|
||||||
|
|
||||||
# reference nodes
|
|
||||||
for refnode in doctree.traverse(nodes.reference):
|
|
||||||
if 'refuri' not in refnode:
|
|
||||||
continue
|
|
||||||
uri = refnode['refuri']
|
|
||||||
lineno = node_line_or_0(refnode)
|
|
||||||
uri_info = (CHECK_IMMEDIATELY, uri, docname, lineno)
|
|
||||||
self.wqueue.put(uri_info, False)
|
|
||||||
n += 1
|
|
||||||
|
|
||||||
# image nodes
|
|
||||||
for imgnode in doctree.traverse(nodes.image):
|
|
||||||
uri = imgnode['candidates'].get('?')
|
|
||||||
if uri and '://' in uri:
|
|
||||||
lineno = node_line_or_0(imgnode)
|
|
||||||
uri_info = (CHECK_IMMEDIATELY, uri, docname, lineno)
|
|
||||||
self.wqueue.put(uri_info, False)
|
|
||||||
n += 1
|
|
||||||
|
|
||||||
done = 0
|
|
||||||
while done < n:
|
|
||||||
self.process_result(self.rqueue.get())
|
|
||||||
done += 1
|
|
||||||
|
|
||||||
if self.broken:
|
|
||||||
self.app.statuscode = 1
|
|
||||||
|
|
||||||
def write_entry(self, what: str, docname: str, filename: str, line: int,
|
def write_entry(self, what: str, docname: str, filename: str, line: int,
|
||||||
uri: str) -> None:
|
uri: str) -> None:
|
||||||
with open(path.join(self.outdir, 'output.txt'), 'a') as output:
|
with open(path.join(self.outdir, 'output.txt'), 'a') as output:
|
||||||
@ -447,14 +444,58 @@ class CheckExternalLinksBuilder(Builder):
|
|||||||
output.write('\n')
|
output.write('\n')
|
||||||
|
|
||||||
def finish(self) -> None:
|
def finish(self) -> None:
|
||||||
|
logger.info('')
|
||||||
|
n = 0
|
||||||
|
|
||||||
|
for hyperlink in self.hyperlinks.values():
|
||||||
|
self.wqueue.put(hyperlink, False)
|
||||||
|
n += 1
|
||||||
|
|
||||||
|
done = 0
|
||||||
|
while done < n:
|
||||||
|
+            self.process_result(self.rqueue.get())
+            done += 1
+
+        if self._broken:
+            self.app.statuscode = 1
 
         self.wqueue.join()
         # Shutdown threads.
         for worker in self.workers:
             self.wqueue.put((CHECK_IMMEDIATELY, None, None, None), False)
 
 
+class HyperlinkCollector(SphinxPostTransform):
+    builders = ('linkcheck',)
+    default_priority = 800
+
+    def run(self, **kwargs: Any) -> None:
+        builder = cast(CheckExternalLinksBuilder, self.app.builder)
+        hyperlinks = builder.hyperlinks
+
+        # reference nodes
+        for refnode in self.document.traverse(nodes.reference):
+            if 'refuri' not in refnode:
+                continue
+            uri = refnode['refuri']
+            lineno = get_node_line(refnode)
+            uri_info = Hyperlink(CHECK_IMMEDIATELY, uri, self.env.docname, lineno)
+            if uri not in hyperlinks:
+                hyperlinks[uri] = uri_info
+
+        # image nodes
+        for imgnode in self.document.traverse(nodes.image):
+            uri = imgnode['candidates'].get('?')
+            if uri and '://' in uri:
+                lineno = get_node_line(imgnode)
+                uri_info = Hyperlink(CHECK_IMMEDIATELY, uri, self.env.docname, lineno)
+                if uri not in hyperlinks:
+                    hyperlinks[uri] = uri_info
 
 
 def setup(app: Sphinx) -> Dict[str, Any]:
     app.add_builder(CheckExternalLinksBuilder)
+    app.add_post_transform(HyperlinkCollector)
 
     app.add_config_value('linkcheck_ignore', [], None)
     app.add_config_value('linkcheck_auth', [], None)
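The deduplication in HyperlinkCollector above boils down to "the first occurrence of a URI wins", so each link is checked only once. A stdlib-only sketch of that guard (the `Hyperlink` record here is a simplified stand-in for the builder's actual type, not its exact field layout):

```python
from typing import Dict, List, NamedTuple, Tuple


class Hyperlink(NamedTuple):
    # Simplified stand-in for the linkcheck builder's Hyperlink record.
    uri: str
    docname: str
    lineno: int


def collect(found: List[Tuple[str, str, int]],
            hyperlinks: Dict[str, Hyperlink]) -> None:
    # Mirror of the "if uri not in hyperlinks" guard: keep the first
    # (docname, lineno) seen for each URI.
    for uri, docname, lineno in found:
        if uri not in hyperlinks:
            hyperlinks[uri] = Hyperlink(uri, docname, lineno)


links: Dict[str, Hyperlink] = {}
collect([('https://example.com', 'index', 10),
         ('https://example.com', 'usage', 42)], links)
print(links['https://example.com'].docname)  # → index
```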
@@ -137,8 +137,7 @@ class ASTIdentifier(ASTBaseBase):
                                               reftype='identifier',
                                               reftarget=targetText, modname=None,
                                               classname=None)
-            key = symbol.get_lookup_key()
-            pnode['c:parent_key'] = key
+            pnode['c:parent_key'] = symbol.get_lookup_key()
             if self.is_anon():
                 pnode += nodes.strong(text="[anonymous]")
             else:
@@ -3204,7 +3203,8 @@ class CObject(ObjectDescription[ASTDeclaration]):
     def parse_pre_v3_type_definition(self, parser: DefinitionParser) -> ASTDeclaration:
         return parser.parse_pre_v3_type_definition()
 
-    def describe_signature(self, signode: TextElement, ast: Any, options: Dict) -> None:
+    def describe_signature(self, signode: TextElement, ast: ASTDeclaration,
+                           options: Dict) -> None:
         ast.describe_signature(signode, 'lastIsName', self.env, options)
 
     def run(self) -> List[Node]:
@@ -3642,7 +3642,7 @@ class CExprRole(SphinxRole):
                            location=self.get_source_info())
             # see below
             return [self.node_type(text, text, classes=classes)], []
-        parentSymbol = self.env.temp_data.get('cpp:parent_symbol', None)
+        parentSymbol = self.env.temp_data.get('c:parent_symbol', None)
         if parentSymbol is None:
             parentSymbol = self.env.domaindata['c']['root_symbol']
         # ...most if not all of these classes should really apply to the individual references,
@@ -1592,6 +1592,15 @@ class ASTOperator(ASTBase):
         identifier = str(self)
         if mode == 'lastIsName':
             signode += addnodes.desc_name(identifier, identifier)
+        elif mode == 'markType':
+            targetText = prefix + identifier + templateArgs
+            pnode = addnodes.pending_xref('', refdomain='cpp',
+                                          reftype='identifier',
+                                          reftarget=targetText, modname=None,
+                                          classname=None)
+            pnode['cpp:parent_key'] = symbol.get_lookup_key()
+            pnode += nodes.Text(identifier)
+            signode += pnode
         else:
             signode += addnodes.desc_addname(identifier, identifier)
@@ -10,6 +10,7 @@
 
 import os
 import pickle
+import posixpath
 from collections import defaultdict
 from copy import copy
 from datetime import datetime
@@ -340,9 +341,9 @@ class BuildEnvironment:
         docdir = path.dirname(self.doc2path(docname or self.docname,
                                             base=None))
         rel_fn = path.join(docdir, filename)
-        # the path.abspath() might seem redundant, but otherwise artifacts
-        # such as ".." will remain in the path
-        return rel_fn, path.abspath(path.join(self.srcdir, rel_fn))
+        return (posixpath.normpath(rel_fn),
+                path.normpath(path.join(self.srcdir, rel_fn)))
 
     @property
     def found_docs(self) -> Set[str]:
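The switch to `posixpath.normpath` above matters because document-relative references can contain `..` segments that survive a plain join. A quick stdlib illustration (the paths are invented for the example):

```python
import posixpath

# A file referenced from a nested document directory.
docdir = 'guide/advanced'
filename = '../../_static/logo.png'
rel_fn = posixpath.join(docdir, filename)

# Without normalization, the ".." segments remain in the joined path.
print(rel_fn)                      # → guide/advanced/../../_static/logo.png
print(posixpath.normpath(rel_fn))  # → _static/logo.png
```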
@@ -1338,8 +1338,11 @@ class FunctionDocumenter(DocstringSignatureMixin, ModuleLevelDocumenter):  # typ
             documenter.objpath = [None]
             sigs.append(documenter.format_signature())
         if overloaded:
+            actual = inspect.signature(self.object,
+                                       type_aliases=self.config.autodoc_type_aliases)
             __globals__ = safe_getattr(self.object, '__globals__', {})
             for overload in self.analyzer.overloads.get('.'.join(self.objpath)):
+                overload = self.merge_default_value(actual, overload)
                 overload = evaluate_signature(overload, __globals__,
                                               self.config.autodoc_type_aliases)
 
@@ -1348,6 +1351,16 @@ class FunctionDocumenter(DocstringSignatureMixin, ModuleLevelDocumenter):  # typ
 
         return "\n".join(sigs)
 
+    def merge_default_value(self, actual: Signature, overload: Signature) -> Signature:
+        """Merge default values of actual implementation to the overload variants."""
+        parameters = list(overload.parameters.values())
+        for i, param in enumerate(parameters):
+            actual_param = actual.parameters.get(param.name)
+            if actual_param and param.default == '...':
+                parameters[i] = param.replace(default=actual_param.default)
+
+        return overload.replace(parameters=parameters)
+
     def annotate_to_first_argument(self, func: Callable, typ: Type) -> None:
         """Annotate type hint to the first argument of function if needed."""
         try:
@@ -2096,8 +2109,16 @@ class MethodDocumenter(DocstringSignatureMixin, ClassLevelDocumenter):  # type:
             documenter.objpath = [None]
             sigs.append(documenter.format_signature())
         if overloaded:
+            if inspect.isstaticmethod(self.object, cls=self.parent, name=self.object_name):
+                actual = inspect.signature(self.object, bound_method=False,
+                                           type_aliases=self.config.autodoc_type_aliases)
+            else:
+                actual = inspect.signature(self.object, bound_method=True,
+                                           type_aliases=self.config.autodoc_type_aliases)
+
             __globals__ = safe_getattr(self.object, '__globals__', {})
             for overload in self.analyzer.overloads.get('.'.join(self.objpath)):
+                overload = self.merge_default_value(actual, overload)
                 overload = evaluate_signature(overload, __globals__,
                                               self.config.autodoc_type_aliases)
 
@@ -2110,6 +2131,16 @@ class MethodDocumenter(DocstringSignatureMixin, ClassLevelDocumenter):  # type:
 
         return "\n".join(sigs)
 
+    def merge_default_value(self, actual: Signature, overload: Signature) -> Signature:
+        """Merge default values of actual implementation to the overload variants."""
+        parameters = list(overload.parameters.values())
+        for i, param in enumerate(parameters):
+            actual_param = actual.parameters.get(param.name)
+            if actual_param and param.default == '...':
+                parameters[i] = param.replace(default=actual_param.default)
+
+        return overload.replace(parameters=parameters)
+
     def annotate_to_first_argument(self, func: Callable, typ: Type) -> None:
         """Annotate type hint to the first argument of function if needed."""
         try:
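The `merge_default_value` helper added above exists because `@overload` stubs parsed from source carry the placeholder string `'...'` as a default; the real default has to be copied in from the implementation's signature. The core move is `Parameter.replace` from the stdlib, sketched here outside autodoc (the function `f` and the stub signature are made-up examples):

```python
from inspect import Parameter, Signature


def merge_default_value(actual: Signature, overload: Signature) -> Signature:
    # Same idea as the Documenter method above: replace the stub
    # placeholder "..." with the real implementation's default.
    parameters = list(overload.parameters.values())
    for i, param in enumerate(parameters):
        actual_param = actual.parameters.get(param.name)
        if actual_param and param.default == '...':
            parameters[i] = param.replace(default=actual_param.default)
    return overload.replace(parameters=parameters)


def f(x, flag=True):  # the "actual" implementation
    return x


actual = Signature.from_callable(f)
# An overload variant as parsed from a stub: default is the string '...'.
overload = Signature([Parameter('x', Parameter.POSITIONAL_OR_KEYWORD),
                      Parameter('flag', Parameter.POSITIONAL_OR_KEYWORD,
                                default='...')])
merged = merge_default_value(actual, overload)
print(merged)  # → (x, flag=True)
```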
@@ -97,10 +97,14 @@ def setup_documenters(app: Any) -> None:
 
 
 def _simple_info(msg: str) -> None:
+    warnings.warn('_simple_info() is deprecated.',
+                  RemovedInSphinx50Warning, stacklevel=2)
     print(msg)
 
 
 def _simple_warn(msg: str) -> None:
+    warnings.warn('_simple_warn() is deprecated.',
+                  RemovedInSphinx50Warning, stacklevel=2)
     print('WARNING: ' + msg, file=sys.stderr)
 
 
@@ -253,10 +253,15 @@ class Config:
         * To create a custom "generic" section, just pass a string.
         * To create an alias for an existing section, pass a tuple containing the
           alias name and the original, in that order.
+        * To create a custom section that displays like the parameters or returns
+          section, pass a tuple containing the custom section name and a string
+          value, "params_style" or "returns_style".
 
         If an entry is just a string, it is interpreted as a header for a generic
         section. If the entry is a tuple/list/indexed container, the first entry
-        is the name of the section, the second is the section key to emulate.
+        is the name of the section, the second is the section key to emulate. If the
+        second entry value is "params_style" or "returns_style", the custom section
+        will be displayed like the parameters section or returns section.
 
     napoleon_attr_annotations : :obj:`bool` (Defaults to True)
         Use the type annotations of class attributes that are documented in the docstring
@@ -544,11 +544,18 @@ class GoogleDocstring:
                     self._sections[entry.lower()] = self._parse_custom_generic_section
                 else:
                     # otherwise, assume entry is container;
-                    # [0] is new section, [1] is the section to alias.
-                    # in the case of key mismatch, just handle as generic section.
-                    self._sections[entry[0].lower()] = \
-                        self._sections.get(entry[1].lower(),
-                                           self._parse_custom_generic_section)
+                    if entry[1] == "params_style":
+                        self._sections[entry[0].lower()] = \
+                            self._parse_custom_params_style_section
+                    elif entry[1] == "returns_style":
+                        self._sections[entry[0].lower()] = \
+                            self._parse_custom_returns_style_section
+                    else:
+                        # [0] is new section, [1] is the section to alias.
+                        # in the case of key mismatch, just handle as generic section.
+                        self._sections[entry[0].lower()] = \
+                            self._sections.get(entry[1].lower(),
+                                               self._parse_custom_generic_section)
 
     def _parse(self) -> None:
         self._parsed_lines = self._consume_empty()
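The branching added to `_load_custom_sections` above is a small dispatch table keyed by the second tuple element. A standalone sketch of the same lookup, with string handler names standing in for the bound parser methods (the names are placeholders, not napoleon's API):

```python
from typing import Dict, List, Tuple, Union

Entry = Union[str, Tuple[str, str]]


def build_sections(custom_sections: List[Entry]) -> Dict[str, str]:
    # Known sections map to their handlers, as in GoogleDocstring._sections.
    sections = {'parameters': 'parse_params', 'returns': 'parse_returns'}
    for entry in custom_sections:
        if isinstance(entry, str):
            sections[entry.lower()] = 'parse_generic'
        elif entry[1] == 'params_style':
            sections[entry[0].lower()] = 'parse_params_style'
        elif entry[1] == 'returns_style':
            sections[entry[0].lower()] = 'parse_returns_style'
        else:
            # Alias: fall back to a generic section on key mismatch.
            sections[entry[0].lower()] = sections.get(entry[1].lower(),
                                                      'parse_generic')
    return sections


result = build_sections(['Usage',
                         ('Side Effects', 'params_style'),
                         ('Outputs', 'returns')])
print(result['side effects'])  # → parse_params_style
```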
@@ -636,6 +643,13 @@ class GoogleDocstring:
             # for now, no admonition for simple custom sections
             return self._parse_generic_section(section, False)
 
+    def _parse_custom_params_style_section(self, section: str) -> List[str]:
+        return self._format_fields(section, self._consume_fields())
+
+    def _parse_custom_returns_style_section(self, section: str) -> List[str]:
+        fields = self._consume_returns_section()
+        return self._format_fields(section, fields)
+
     def _parse_usage_section(self, section: str) -> List[str]:
         header = ['.. rubric:: Usage:', '']
         block = ['.. code-block:: python', '']
@@ -682,7 +696,13 @@ class GoogleDocstring:
         return self._parse_generic_section(_('Notes'), use_admonition)
 
     def _parse_other_parameters_section(self, section: str) -> List[str]:
-        return self._format_fields(_('Other Parameters'), self._consume_fields())
+        if self._config.napoleon_use_param:
+            # Allow to declare multiple parameters at once (ex: x, y: int)
+            fields = self._consume_fields(multiple=True)
+            return self._format_docutils_params(fields)
+        else:
+            fields = self._consume_fields()
+            return self._format_fields(_('Other Parameters'), fields)
 
     def _parse_parameters_section(self, section: str) -> List[str]:
         if self._config.napoleon_use_param:
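With the changes above, a project can opt custom sections into parameter- or returns-style rendering from its build configuration, for example:

```python
# conf.py
extensions = ['sphinx.ext.napoleon']

napoleon_custom_sections = [
    'Usage',                               # plain custom "generic" section
    ('Also See', 'See Also'),              # alias for an existing section
    ('Side Effects', 'params_style'),      # rendered like Parameters
    ('Computed Values', 'returns_style'),  # rendered like Returns
]
```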
@@ -8,8 +8,11 @@
     :license: BSD, see LICENSE for details.
 """
 
+import posixpath
 import traceback
-from typing import Any, Dict, Iterable, Iterator, Set, Tuple
+import warnings
+from os import path
+from typing import Any, Dict, Generator, Iterable, Optional, Set, Tuple, cast
 
 from docutils import nodes
 from docutils.nodes import Element, Node
@@ -17,16 +20,32 @@ from docutils.nodes import Element, Node
 import sphinx
 from sphinx import addnodes
 from sphinx.application import Sphinx
+from sphinx.builders import Builder
+from sphinx.builders.html import StandaloneHTMLBuilder
+from sphinx.deprecation import RemovedInSphinx50Warning
 from sphinx.environment import BuildEnvironment
 from sphinx.locale import _, __
 from sphinx.pycode import ModuleAnalyzer
+from sphinx.transforms.post_transforms import SphinxPostTransform
 from sphinx.util import get_full_modname, logging, status_iterator
 from sphinx.util.nodes import make_refnode
 
 logger = logging.getLogger(__name__)
 
+OUTPUT_DIRNAME = '_modules'
+
+
+class viewcode_anchor(Element):
+    """Node for viewcode anchors.
+
+    This node will be processed in the resolving phase.
+    For viewcode supported builders, they will be all converted to the anchors.
+    For not supported builders, they will be removed.
+    """
+
 
-def _get_full_modname(app: Sphinx, modname: str, attribute: str) -> str:
+def _get_full_modname(app: Sphinx, modname: str, attribute: str) -> Optional[str]:
     try:
         return get_full_modname(modname, attribute)
     except AttributeError:
@@ -44,14 +63,21 @@ def _get_full_modname(app: Sphinx, modname: str, attribute: str) -> str:
         return None
 
 
+def is_supported_builder(builder: Builder) -> bool:
+    if builder.format != 'html':
+        return False
+    elif builder.name == 'singlehtml':
+        return False
+    elif builder.name.startswith('epub') and not builder.config.viewcode_enable_epub:
+        return False
+    else:
+        return True
+
+
 def doctree_read(app: Sphinx, doctree: Node) -> None:
     env = app.builder.env
     if not hasattr(env, '_viewcode_modules'):
         env._viewcode_modules = {}  # type: ignore
-    if app.builder.name == "singlehtml":
-        return
-    if app.builder.name.startswith("epub") and not env.config.viewcode_enable_epub:
-        return
 
     def has_tag(modname: str, fullname: str, docname: str, refname: str) -> bool:
         entry = env._viewcode_modules.get(modname, None)  # type: ignore
@@ -108,13 +134,8 @@ def doctree_read(app: Sphinx, doctree: Node) -> None:
             # only one link per name, please
             continue
         names.add(fullname)
-        pagename = '_modules/' + modname.replace('.', '/')
-        inline = nodes.inline('', _('[source]'), classes=['viewcode-link'])
-        onlynode = addnodes.only(expr='html')
-        onlynode += addnodes.pending_xref('', inline, reftype='viewcode', refdomain='std',
-                                          refexplicit=False, reftarget=pagename,
-                                          refid=fullname, refdoc=env.docname)
-        signode += onlynode
+        pagename = posixpath.join(OUTPUT_DIRNAME, modname.replace('.', '/'))
+        signode += viewcode_anchor(reftarget=pagename, refid=fullname, refdoc=env.docname)
 
 
 def env_merge_info(app: Sphinx, env: BuildEnvironment, docnames: Iterable[str],
@@ -128,20 +149,80 @@ def env_merge_info(app: Sphinx, env: BuildEnvironment, docnames: Iterable[str],
     env._viewcode_modules.update(other._viewcode_modules)  # type: ignore
 
 
+class ViewcodeAnchorTransform(SphinxPostTransform):
+    """Convert or remove viewcode_anchor nodes depends on builder."""
+    default_priority = 100
+
+    def run(self, **kwargs: Any) -> None:
+        if is_supported_builder(self.app.builder):
+            self.convert_viewcode_anchors()
+        else:
+            self.remove_viewcode_anchors()
+
+    def convert_viewcode_anchors(self) -> None:
+        for node in self.document.traverse(viewcode_anchor):
+            anchor = nodes.inline('', _('[source]'), classes=['viewcode-link'])
+            refnode = make_refnode(self.app.builder, node['refdoc'], node['reftarget'],
+                                   node['refid'], anchor)
+            node.replace_self(refnode)
+
+    def remove_viewcode_anchors(self) -> None:
+        for node in self.document.traverse(viewcode_anchor):
+            node.parent.remove(node)
+
+
 def missing_reference(app: Sphinx, env: BuildEnvironment, node: Element, contnode: Node
-                      ) -> Node:
+                      ) -> Optional[Node]:
     # resolve our "viewcode" reference nodes -- they need special treatment
     if node['reftype'] == 'viewcode':
+        warnings.warn('viewcode extension is no longer use pending_xref node. '
+                      'Please update your extension.', RemovedInSphinx50Warning)
         return make_refnode(app.builder, node['refdoc'], node['reftarget'],
                             node['refid'], contnode)
 
     return None
 
 
-def collect_pages(app: Sphinx) -> Iterator[Tuple[str, Dict[str, Any], str]]:
+def get_module_filename(app: Sphinx, modname: str) -> Optional[str]:
+    """Get module filename for *modname*."""
+    source_info = app.emit_firstresult('viewcode-find-source', modname)
+    if source_info:
+        return None
+    else:
+        try:
+            filename, source = ModuleAnalyzer.get_module_source(modname)
+            return filename
+        except Exception:
+            return None
+
+
+def should_generate_module_page(app: Sphinx, modname: str) -> bool:
+    """Check generation of module page is needed."""
+    module_filename = get_module_filename(app, modname)
+    if module_filename is None:
+        # Always (re-)generate module page when module filename is not found.
+        return True
+
+    builder = cast(StandaloneHTMLBuilder, app.builder)
+    basename = modname.replace('.', '/') + builder.out_suffix
+    page_filename = path.join(app.outdir, '_modules/', basename)
+
+    try:
+        if path.getmtime(module_filename) <= path.getmtime(page_filename):
+            # generation is not needed if the HTML page is newer than module file.
+            return False
+    except IOError:
+        pass
+
+    return True
+
+
+def collect_pages(app: Sphinx) -> Generator[Tuple[str, Dict[str, Any], str], None, None]:
     env = app.builder.env
     if not hasattr(env, '_viewcode_modules'):
         return
+    if not is_supported_builder(app.builder):
+        return
     highlighter = app.builder.highlighter  # type: ignore
     urito = app.builder.get_relative_uri
 
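The `should_generate_module_page` helper above is a plain mtime freshness test: skip regeneration when the rendered page is at least as new as the module source, and regenerate whenever the page is missing. A self-contained sketch of that comparison (the file names are invented):

```python
import os
import tempfile
from os import path


def is_page_fresh(module_filename: str, page_filename: str) -> bool:
    # Fresh when the page is at least as new as the module source.
    try:
        return path.getmtime(module_filename) <= path.getmtime(page_filename)
    except OSError:
        # Missing page -> must (re-)generate.
        return False


with tempfile.TemporaryDirectory() as tmp:
    module = path.join(tmp, 'mod.py')
    page = path.join(tmp, 'mod.html')
    open(module, 'w').close()
    open(page, 'w').close()
    # Force deterministic mtimes instead of relying on the clock.
    os.utime(module, (100, 100))
    os.utime(page, (200, 200))
    fresh = is_page_fresh(module, page)
    stale = is_page_fresh(module, path.join(tmp, 'missing.html'))
print(fresh, stale)  # → True False
```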
@@ -154,9 +235,12 @@ def collect_pages(app: Sphinx) -> Iterator[Tuple[str, Dict[str, Any], str]]:
                                    app.verbosity, lambda x: x[0]):
         if not entry:
             continue
+        if not should_generate_module_page(app, modname):
+            continue
+
         code, tags, used, refname = entry
         # construct a page name for the highlighted source
-        pagename = '_modules/' + modname.replace('.', '/')
+        pagename = posixpath.join(OUTPUT_DIRNAME, modname.replace('.', '/'))
         # highlight the source using the builder's highlighter
         if env.config.highlight_language in ('python3', 'default', 'none'):
             lexer = env.config.highlight_language
@@ -188,10 +272,10 @@ def collect_pages(app: Sphinx) -> Iterator[Tuple[str, Dict[str, Any], str]]:
             parent = parent.rsplit('.', 1)[0]
             if parent in modnames:
                 parents.append({
-                    'link': urito(pagename, '_modules/' +
-                                  parent.replace('.', '/')),
+                    'link': urito(pagename,
+                                  posixpath.join(OUTPUT_DIRNAME, parent.replace('.', '/'))),
                     'title': parent})
-        parents.append({'link': urito(pagename, '_modules/index'),
+        parents.append({'link': urito(pagename, posixpath.join(OUTPUT_DIRNAME, 'index')),
                         'title': _('Module code')})
         parents.reverse()
         # putting it all together
@@ -220,7 +304,8 @@ def collect_pages(app: Sphinx) -> Iterator[Tuple[str, Dict[str, Any], str]]:
             html.append('</ul>')
             stack.append(modname + '.')
         html.append('<li><a href="%s">%s</a></li>\n' % (
-            urito('_modules/index', '_modules/' + modname.replace('.', '/')),
+            urito(posixpath.join(OUTPUT_DIRNAME, 'index'),
+                  posixpath.join(OUTPUT_DIRNAME, modname.replace('.', '/'))),
             modname))
     html.append('</ul>' * (len(stack) - 1))
     context = {
@@ -229,7 +314,7 @@ def collect_pages(app: Sphinx) -> Iterator[Tuple[str, Dict[str, Any], str]]:
                       ''.join(html)),
     }
 
-    yield ('_modules/index', context, 'page.html')
+    yield (posixpath.join(OUTPUT_DIRNAME, 'index'), context, 'page.html')
 
 
 def setup(app: Sphinx) -> Dict[str, Any]:
@@ -244,6 +329,7 @@ def setup(app: Sphinx) -> Dict[str, Any]:
     # app.add_config_value('viewcode_exclude_modules', [], 'env')
     app.add_event('viewcode-find-source')
    app.add_event('viewcode-follow-imported')
+    app.add_post_transform(ViewcodeAnchorTransform)
     return {
         'version': sphinx.__display_version__,
         'env_version': 1,
@@ -8,7 +8,7 @@
     :license: BSD, see LICENSE for details.
 """
 
-from typing import Any, Dict, List, Tuple, Type, cast
+from typing import Any, Dict, List, Optional, Tuple, Type, cast
 
 from docutils import nodes
 from docutils.nodes import Element
@@ -150,7 +150,7 @@ class ReferencesResolver(SphinxPostTransform):
         return newnode
 
     def warn_missing_reference(self, refdoc: str, typ: str, target: str,
-                               node: pending_xref, domain: Domain) -> None:
+                               node: pending_xref, domain: Optional[Domain]) -> None:
         warn = node.get('refwarn')
         if self.config.nitpicky:
             warn = True
@@ -482,6 +482,19 @@ def is_builtin_class_method(obj: Any, attr_name: str) -> bool:
     return getattr(builtins, name, None) is cls
 
 
+class DefaultValue:
+    """A simple wrapper for default value of the parameters of overload functions."""
+
+    def __init__(self, value: str) -> None:
+        self.value = value
+
+    def __eq__(self, other: object) -> bool:
+        return self.value == other
+
+    def __repr__(self) -> str:
+        return self.value
+
+
 def _should_unwrap(subject: Callable) -> bool:
     """Check the function should be unwrapped on getting signature."""
     if (safe_getattr(subject, '__globals__', None) and
@@ -687,7 +700,7 @@ def signature_from_ast(node: ast.FunctionDef, code: str = '') -> inspect.Signatu
             if defaults[i] is Parameter.empty:
                 default = Parameter.empty
             else:
-                default = ast_unparse(defaults[i], code)
+                default = DefaultValue(ast_unparse(defaults[i], code))
 
             annotation = ast_unparse(arg.annotation, code) or Parameter.empty
             params.append(Parameter(arg.arg, Parameter.POSITIONAL_ONLY,
@@ -697,7 +710,7 @@ def signature_from_ast(node: ast.FunctionDef, code: str = '') -> inspect.Signatu
             if defaults[i + posonlyargs] is Parameter.empty:
                 default = Parameter.empty
             else:
-                default = ast_unparse(defaults[i + posonlyargs], code)
+                default = DefaultValue(ast_unparse(defaults[i + posonlyargs], code))
 
             annotation = ast_unparse(arg.annotation, code) or Parameter.empty
             params.append(Parameter(arg.arg, Parameter.POSITIONAL_OR_KEYWORD,
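The `DefaultValue` wrapper introduced above keeps the unparsed source text of a default while still comparing equal to the plain string, and its `__repr__` renders the default without quotes so signatures read as they appeared in source. A minimal demonstration of both behaviors:

```python
class DefaultValue:
    """Wrap a default's source text; compares equal to the raw string."""

    def __init__(self, value: str) -> None:
        self.value = value

    def __eq__(self, other: object) -> bool:
        return self.value == other

    def __repr__(self) -> str:
        return self.value


d = DefaultValue('...')
print(d == '...')  # equality against the unparsed source text → True
print(repr(d))     # renders without quotes, as written in source → ...
```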
@@ -1203,7 +1203,6 @@ class LaTeXTranslator(SphinxTranslator):
         return isinstance(node.parent, nodes.TextElement)
 
     def visit_image(self, node: Element) -> None:
-        attrs = node.attributes
         pre = []    # type: List[str]
         # in reverse order
         post = []   # type: List[str]
@@ -1213,27 +1212,27 @@ class LaTeXTranslator(SphinxTranslator):
             is_inline = self.is_inline(node.parent)
         else:
             is_inline = self.is_inline(node)
-        if 'width' in attrs:
-            if 'scale' in attrs:
-                w = self.latex_image_length(attrs['width'], attrs['scale'])
+        if 'width' in node:
+            if 'scale' in node:
+                w = self.latex_image_length(node['width'], node['scale'])
             else:
-                w = self.latex_image_length(attrs['width'])
+                w = self.latex_image_length(node['width'])
             if w:
                 include_graphics_options.append('width=%s' % w)
-        if 'height' in attrs:
-            if 'scale' in attrs:
-                h = self.latex_image_length(attrs['height'], attrs['scale'])
+        if 'height' in node:
+            if 'scale' in node:
+                h = self.latex_image_length(node['height'], node['scale'])
             else:
-                h = self.latex_image_length(attrs['height'])
+                h = self.latex_image_length(node['height'])
             if h:
                 include_graphics_options.append('height=%s' % h)
-        if 'scale' in attrs:
+        if 'scale' in node:
             if not include_graphics_options:
                 # if no "width" nor "height", \sphinxincludegraphics will fit
                 # to the available text width if oversized after rescaling.
                 include_graphics_options.append('scale=%s'
-                                                % (float(attrs['scale']) / 100.0))
+                                                % (float(node['scale']) / 100.0))
-        if 'align' in attrs:
+        if 'align' in node:
             align_prepost = {
                 # By default latex aligns the top of an image.
                 (1, 'top'): ('', ''),
@@ -1247,8 +1246,8 @@ class LaTeXTranslator(SphinxTranslator):
                 (0, 'right'): ('{\\hspace*{\\fill}', '}'),
             }
             try:
-                pre.append(align_prepost[is_inline, attrs['align']][0])
-                post.append(align_prepost[is_inline, attrs['align']][1])
+                pre.append(align_prepost[is_inline, node['align']][0])
+                post.append(align_prepost[is_inline, node['align']][1])
             except KeyError:
                 pass
         if self.in_parsed_literal:
@@ -1205,11 +1205,10 @@ class TexinfoTranslator(SphinxTranslator):
             # ignore remote images
             return
         name, ext = path.splitext(uri)
-        attrs = node.attributes
         # width and height ignored in non-tex output
-        width = self.tex_image_length(attrs.get('width', ''))
-        height = self.tex_image_length(attrs.get('height', ''))
-        alt = self.escape_arg(attrs.get('alt', ''))
+        width = self.tex_image_length(node.get('width', ''))
+        height = self.tex_image_length(node.get('height', ''))
+        alt = self.escape_arg(node.get('alt', ''))
         filename = "%s-figures/%s" % (self.elements['filename'][:-5], name)  # type: ignore
         self.body.append('\n@image{%s,%s,%s,%s,%s}\n' %
                          (filename, width, height, alt, ext[1:]))
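The two hunks above drop the `attrs = node.attributes` indirection because docutils `Element` objects already expose mapping-style access (`node['width']`, `'width' in node`, `node.get(...)`) that proxies the same `attributes` dict. A minimal stand-in class (hypothetical, for illustration only, not the real docutils implementation) showing the equivalence the diff relies on:

```python
class FakeElement:
    """Sketch of docutils.nodes.Element's mapping protocol: item access,
    membership tests and ``get`` all proxy the ``attributes`` dict, which
    is why ``node['width']`` can replace ``node.attributes['width']``."""

    def __init__(self, **attributes):
        self.attributes = attributes

    def __getitem__(self, key):
        return self.attributes[key]

    def __contains__(self, key):
        return key in self.attributes

    def get(self, key, default=None):
        return self.attributes.get(key, default)


node = FakeElement(width='50%', scale=50)
assert ('width' in node) and ('align' not in node)
assert node['width'] == node.attributes['width'] == '50%'
assert node.get('alt', '') == ''
```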
tests/roots/test-domain-c/ns_lookup.rst (new file, 13 lines)
@@ -0,0 +1,13 @@
+.. c:namespace:: ns_lookup
+
+.. c:var:: int i
+
+.. c:function:: void f(int j)
+
+   - :c:var:`i`
+   - :c:var:`j`
+   - :c:expr:`i`
+   - :c:expr:`j`
+
+- :c:var:`i`
+- :c:expr:`i`
|
|||||||
|
|
||||||
|
|
||||||
@overload
|
@overload
|
||||||
def sum(x: int, y: int) -> int:
|
def sum(x: int, y: int = 0) -> int:
|
||||||
...
|
...
|
||||||
|
|
||||||
|
|
||||||
@overload
|
@overload
|
||||||
def sum(x: "float", y: "float") -> "float":
|
def sum(x: "float", y: "float" = 0.0) -> "float":
|
||||||
...
|
...
|
||||||
|
|
||||||
|
|
||||||
@overload
|
@overload
|
||||||
def sum(x: str, y: str) -> str:
|
def sum(x: str, y: str = ...) -> str:
|
||||||
...
|
...
|
||||||
|
|
||||||
|
|
||||||
def sum(x, y):
|
def sum(x, y=None):
|
||||||
"""docstring"""
|
"""docstring"""
|
||||||
return x + y
|
return x + y
|
||||||
|
|
||||||
@ -25,18 +25,18 @@ class Math:
|
|||||||
"""docstring"""
|
"""docstring"""
|
||||||
|
|
||||||
@overload
|
@overload
|
||||||
def sum(self, x: int, y: int) -> int:
|
def sum(self, x: int, y: int = 0) -> int:
|
||||||
...
|
...
|
||||||
|
|
||||||
@overload
|
@overload
|
||||||
def sum(self, x: "float", y: "float") -> "float":
|
def sum(self, x: "float", y: "float" = 0.0) -> "float":
|
||||||
...
|
...
|
||||||
|
|
||||||
@overload
|
@overload
|
||||||
def sum(self, x: str, y: str) -> str:
|
def sum(self, x: str, y: str = ...) -> str:
|
||||||
...
|
...
|
||||||
|
|
||||||
def sum(self, x, y):
|
def sum(self, x, y=None):
|
||||||
"""docstring"""
|
"""docstring"""
|
||||||
return x + y
|
return x + y
|
||||||
|
|
||||||
|
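The fixture change above adds default values to the `@overload` stubs. A short sketch of why this matters: at runtime the stubs are erased and plain introspection only sees the implementation's signature, so the defaults on the stubs are only visible to tools (like autodoc here) that inspect the overload definitions themselves.

```python
from inspect import signature
from typing import overload


@overload
def sum(x: int, y: int = 0) -> int: ...
@overload
def sum(x: str, y: str = ...) -> str: ...
def sum(x, y=None):  # implementation; the stubs above are erased at runtime
    return x + y


# Ordinary introspection reports only the implementation's defaults.
assert str(signature(sum)) == '(x, y=None)'
assert sum(1, 2) == 3
```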
@@ -1 +0,0 @@
-exclude_patterns = ['_build']

@@ -1,6 +0,0 @@
-.. image:: http://localhost:7777/
-   :target: http://localhost:7777/
-
-`weblate.org`_
-
-.. _weblate.org: http://localhost:7777/
@@ -573,40 +573,3 @@ def test_limit_rate_bails_out_after_waiting_max_time(app):
     checker.rate_limits = {"localhost": RateLimit(90.0, 0.0)}
     next_check = checker.limit_rate(FakeResponse())
     assert next_check is None
-
-
-@pytest.mark.sphinx(
-    'linkcheck', testroot='linkcheck-localserver-two-links', freshenv=True,
-)
-def test_priorityqueue_items_are_comparable(app):
-    with http_server(OKHandler):
-        app.builder.build_all()
-    content = (app.outdir / 'output.json').read_text()
-    rows = [json.loads(x) for x in sorted(content.splitlines())]
-    assert rows == [
-        {
-            'filename': 'index.rst',
-            # Should not be None.
-            'lineno': 0,
-            'status': 'working',
-            'code': 0,
-            'uri': 'http://localhost:7777/',
-            'info': '',
-        },
-        {
-            'filename': 'index.rst',
-            'lineno': 0,
-            'status': 'working',
-            'code': 0,
-            'uri': 'http://localhost:7777/',
-            'info': '',
-        },
-        {
-            'filename': 'index.rst',
-            'lineno': 4,
-            'status': 'working',
-            'code': 0,
-            'uri': 'http://localhost:7777/',
-            'info': '',
-        }
-    ]
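The removed test was named after a real constraint: items placed in a `PriorityQueue` are compared against each other to break priority ties, so plain dicts (which are unorderable) raise `TypeError` unless a comparable tie-breaker sits before them in the tuple. A stdlib-only sketch of that pattern:

```python
import queue

# Equal-priority entries are ordered by the sequence counter (second field),
# so the unorderable dict payload is never compared.
q = queue.PriorityQueue()
q.put((1, 0, {'uri': 'http://example.com/a'}))
q.put((1, 1, {'uri': 'http://example.com/b'}))

first = q.get()
assert first[2]['uri'].endswith('/a')  # FIFO among equal priorities
```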
@@ -598,6 +598,13 @@ def test_build_function_param_target(app, warning):
     ]


+@pytest.mark.sphinx(testroot='domain-c', confoverrides={'nitpicky': True})
+def test_build_ns_lookup(app, warning):
+    app.builder.build_all()
+    ws = filter_warnings(warning, "ns_lookup")
+    assert len(ws) == 0
+
+
 def _get_obj(app, queryName):
     domain = app.env.get_domain('c')
     for name, dispname, objectType, docname, anchor, prio in domain.get_objects():
@@ -138,6 +138,11 @@ def test_env_relfn2path(app):
     assert relfn == '../logo.jpg'
     assert absfn == app.srcdir.parent / 'logo.jpg'

+    # relative path traversal
+    relfn, absfn = app.env.relfn2path('subdir/../logo.jpg', 'index')
+    assert relfn == 'logo.jpg'
+    assert absfn == app.srcdir / 'logo.jpg'
+
     # omit docname (w/ current docname)
     app.env.temp_data['docname'] = 'subdir/document'
     relfn, absfn = app.env.relfn2path('images/logo.jpg')
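The new assertions above check that `..` segments are normalized away when resolving a source-relative path. A stdlib-only sketch of that behavior (a hypothetical helper, not Sphinx's actual implementation): resolve the filename against the directory of the referencing document, then normalize.

```python
import posixpath

def relfn2path_sketch(filename, docname):
    """Join *filename* onto the directory of *docname* and collapse any
    ``..`` components, as the new test expects."""
    docdir = posixpath.dirname(docname)
    return posixpath.normpath(posixpath.join(docdir, filename))

assert relfn2path_sketch('subdir/../logo.jpg', 'index') == 'logo.jpg'
assert relfn2path_sketch('images/logo.jpg', 'subdir/document') == 'subdir/images/logo.jpg'
```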
@@ -2067,17 +2067,17 @@ def test_overload(app):
         '   docstring',
         '',
         '',
-        '   .. py:method:: Math.sum(x: int, y: int) -> int',
-        '                  Math.sum(x: float, y: float) -> float',
-        '                  Math.sum(x: str, y: str) -> str',
+        '   .. py:method:: Math.sum(x: int, y: int = 0) -> int',
+        '                  Math.sum(x: float, y: float = 0.0) -> float',
+        '                  Math.sum(x: str, y: str = None) -> str',
         '      :module: target.overload',
         '',
         '      docstring',
         '',
         '',
-        '.. py:function:: sum(x: int, y: int) -> int',
-        '                 sum(x: float, y: float) -> float',
-        '                 sum(x: str, y: str) -> str',
+        '.. py:function:: sum(x: int, y: int = 0) -> int',
+        '                 sum(x: float, y: float = 0.0) -> float',
+        '                 sum(x: str, y: str = None) -> str',
         '   :module: target.overload',
         '',
         '   docstring',
@@ -647,13 +647,13 @@ def test_autodoc_typehints_none_for_overload(app):
         '   docstring',
         '',
         '',
-        '   .. py:method:: Math.sum(x, y)',
+        '   .. py:method:: Math.sum(x, y=None)',
         '      :module: target.overload',
         '',
         '      docstring',
         '',
         '',
-        '.. py:function:: sum(x, y)',
+        '.. py:function:: sum(x, y=None)',
         '   :module: target.overload',
         '',
         '   docstring',
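With `autodoc_typehints = 'none'` the annotations are dropped but the newly shown defaults survive, which is how `Math.sum(x, y)` becomes `Math.sum(x, y=None)` in the expected output above. A small `inspect`-based sketch of rendering a signature without annotations while keeping defaults:

```python
from inspect import Parameter, Signature

# Build a signature carrying only names and defaults (no annotations),
# mirroring what the 'none' typehints mode leaves to display.
sig = Signature([
    Parameter('x', Parameter.POSITIONAL_OR_KEYWORD),
    Parameter('y', Parameter.POSITIONAL_OR_KEYWORD, default=None),
])
assert str(sig) == '(x, y=None)'
```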
@@ -1072,10 +1072,27 @@ You should listen to me!
 Sooper Warning:
     Stop hitting yourself!
 """, """:Warns: **Stop hitting yourself!**
+"""),
+                      ("""\
+Params Style:
+    arg1 (int): Description of arg1
+    arg2 (str): Description of arg2
+
+""", """\
+:Params Style: * **arg1** (*int*) -- Description of arg1
+               * **arg2** (*str*) -- Description of arg2
+"""),
+                      ("""\
+Returns Style:
+    description of custom section
+
+""", """:Returns Style: description of custom section
 """))

         testConfig = Config(napoleon_custom_sections=['Really Important Details',
-                                                      ('Sooper Warning', 'warns')])
+                                                      ('Sooper Warning', 'warns'),
+                                                      ('Params Style', 'params_style'),
+                                                      ('Returns Style', 'returns_style')])

         for docstring, expected in docstrings:
             actual = str(GoogleDocstring(docstring, testConfig))
@@ -1441,12 +1458,18 @@ Parameters
 ----------
 param1 : :class:`MyClass <name.space.MyClass>` instance

+Other Parameters
+----------------
+param2 : :class:`MyClass <name.space.MyClass>` instance
+
 """

         config = Config(napoleon_use_param=False)
         actual = str(NumpyDocstring(docstring, config))
         expected = """\
 :Parameters: **param1** (:class:`MyClass <name.space.MyClass>` instance)
+
+:Other Parameters: **param2** (:class:`MyClass <name.space.MyClass>` instance)
 """
         self.assertEqual(expected, actual)

@@ -1455,6 +1478,9 @@ param1 : :class:`MyClass <name.space.MyClass>` instance
         expected = """\
 :param param1:
 :type param1: :class:`MyClass <name.space.MyClass>` instance
+
+:param param2:
+:type param2: :class:`MyClass <name.space.MyClass>` instance
 """
         self.assertEqual(expected, actual)

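The two hunks above cover both rendering modes: with `napoleon_use_param=False` the "Other Parameters" section stays a single `:Other Parameters:` field, while with `napoleon_use_param=True` each entry now becomes a `:param:`/`:type:` pair. A stdlib-only sketch (hypothetical helper, not napoleon's internals) of the second transformation:

```python
def to_param_fields(entries):
    """Turn (name, type) pairs into the :param:/:type: field lines that
    napoleon_use_param=True now also emits for "Other Parameters"."""
    lines = []
    for name, typ in entries:
        lines.append(':param %s:' % name)
        lines.append(':type %s: %s' % (name, typ))
    return lines

fields = to_param_fields(
    [('param2', ':class:`MyClass <name.space.MyClass>` instance')])
assert fields == [
    ':param param2:',
    ':type param2: :class:`MyClass <name.space.MyClass>` instance',
]
```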
@@ -49,6 +49,27 @@ def test_viewcode(app, status, warning):
             '<span> """</span></div>\n') in result


+@pytest.mark.sphinx('epub', testroot='ext-viewcode')
+def test_viewcode_epub_default(app, status, warning):
+    app.builder.build_all()
+
+    assert not (app.outdir / '_modules/spam/mod1.xhtml').exists()
+
+    result = (app.outdir / 'index.xhtml').read_text()
+    assert result.count('href="_modules/spam/mod1.xhtml#func1"') == 0
+
+
+@pytest.mark.sphinx('epub', testroot='ext-viewcode',
+                    confoverrides={'viewcode_enable_epub': True})
+def test_viewcode_epub_enabled(app, status, warning):
+    app.builder.build_all()
+
+    assert (app.outdir / '_modules/spam/mod1.xhtml').exists()
+
+    result = (app.outdir / 'index.xhtml').read_text()
+    assert result.count('href="_modules/spam/mod1.xhtml#func1"') == 2
+
+
 @pytest.mark.sphinx(testroot='ext-viewcode', tags=['test_linkcode'])
 def test_linkcode(app, status, warning):
     app.builder.build(['objects'])
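The two new tests pin down the gating the pair of builds exercises: module pages are skipped for epub output unless `viewcode_enable_epub` is set. A minimal sketch of that condition (hypothetical function name; the config default of `False` matches the behavior the first test asserts):

```python
def should_generate_module_page(builder_name, viewcode_enable_epub=False):
    """Skip viewcode module pages for epub builders unless explicitly
    enabled via viewcode_enable_epub."""
    if builder_name.startswith('epub') and not viewcode_enable_epub:
        return False
    return True

assert should_generate_module_page('epub') is False
assert should_generate_module_page('epub', viewcode_enable_epub=True) is True
assert should_generate_module_page('html') is True
```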
@@ -284,6 +284,13 @@ def get_verifier(verify, verify_re):
         '<p><kbd class="kbd docutils literal notranslate">-</kbd></p>',
         '\\sphinxkeyboard{\\sphinxupquote{\\sphinxhyphen{}}}',
     ),
+    (
+        # kbd role
+        'verify',
+        ':kbd:`Caps Lock`',
+        '<p><kbd class="kbd docutils literal notranslate">Caps Lock</kbd></p>',
+        '\\sphinxkeyboard{\\sphinxupquote{Caps Lock}}',
+    ),
     (
         # non-interpolation of dashes in option role
         'verify_re',
@@ -18,10 +18,10 @@ for stable releases
 * ``python utils/bump_version.py --in-develop X.Y.Zb0`` (ex. 1.5.3b0)
 * Check diff by ``git diff``
 * ``git commit -am 'Bump version'``
-* ``git push origin X.Y --tags``
-* ``git checkout master``
-* ``git merge X.Y``
-* ``git push origin master``
+* ``git push origin X.Y.x --tags``
+* ``git checkout X.x``
+* ``git merge X.Y.x``
+* ``git push origin X.x``
 * Add new version/milestone to tracker categories
 * Write announcement and send to sphinx-dev, sphinx-users and python-announce

@@ -43,10 +43,10 @@ for first beta releases
 * ``python utils/bump_version.py --in-develop X.Y.0b2`` (ex. 1.6.0b2)
 * Check diff by ``git diff``
 * ``git commit -am 'Bump version'``
-* ``git checkout -b X.Y``
-* ``git push origin X.Y --tags``
+* ``git checkout -b X.x``
+* ``git push origin X.x --tags``
 * ``git checkout master``
-* ``git merge X.Y``
+* ``git merge X.x``
 * ``python utils/bump_version.py --in-develop A.B.0b0`` (ex. 1.7.0b0)
 * Check diff by ``git diff``
 * ``git commit -am 'Bump version'``

@@ -71,9 +71,9 @@ for other beta releases
 * ``python utils/bump_version.py --in-develop X.Y.0bM`` (ex. 1.6.0b3)
 * Check diff by ``git diff``
 * ``git commit -am 'Bump version'``
-* ``git push origin X.Y --tags``
+* ``git push origin X.x --tags``
 * ``git checkout master``
-* ``git merge X.Y``
+* ``git merge X.x``
 * ``git push origin master``
 * Add new version/milestone to tracker categories
 * Write announcement and send to sphinx-dev, sphinx-users and python-announce

@@ -99,9 +99,9 @@ for major releases
 * ``python utils/bump_version.py --in-develop X.Y.1b0`` (ex. 1.6.1b0)
 * Check diff by ``git diff``
 * ``git commit -am 'Bump version'``
-* ``git push origin X.Y --tags``
+* ``git push origin X.x --tags``
 * ``git checkout master``
-* ``git merge X.Y``
+* ``git merge X.x``
 * ``git push origin master``
 * open https://github.com/sphinx-doc/sphinx/settings/branches and make ``A.B`` branch *not* protected
 * ``git checkout A.B`` (checkout old stable)