Merge branch '3.x'

Takeshi KOMIYA 2020-03-28 23:07:02 +09:00
commit 50fd2ff510
15 changed files with 467 additions and 224 deletions

View File

@ -47,6 +47,14 @@ Features added
Bugs fixed
----------
* #7364: autosummary: crashed when :confval:`autosummary_generate` is False
* #7370: autosummary: raises UnboundLocalError when unknown module given
* #7367: C++, alternate operator spellings are now supported.
* C, alternate operator spellings are now supported.
* #7368: C++, comma operator in expressions, pack expansion in template
argument lists, and more comprehensive error messages in some cases.
* C, C++, fix crash and wrong duplicate warnings related to anon symbols.
Testing
--------

View File

@ -204,6 +204,14 @@ These are the basic steps needed to start developing on Sphinx.
#. Wait for a core developer to review your changes.
Translations
~~~~~~~~~~~~
The Sphinx core messages and documentation are translated on `Transifex
<https://www.transifex.com/>`_. Please join `Sphinx project on Transifex
<https://www.transifex.com/sphinx-doc/>`_ and translate them.
Core Developers
~~~~~~~~~~~~~~~
@ -228,39 +236,6 @@ The following are some general guidelines for core developers:
author in the commit message and any relevant :file:`CHANGES` entry.
Locale updates
~~~~~~~~~~~~~~
The parts of messages in Sphinx that go into builds are translated into several
locales. The translations are kept as gettext ``.po`` files translated from the
master template ``sphinx/locale/sphinx.pot``.
Sphinx uses `Babel <http://babel.pocoo.org/en/latest/>`_ to extract messages
and maintain the catalog files. It is integrated in ``setup.py``:
* Use ``python setup.py extract_messages`` to update the ``.pot`` template.
* Use ``python setup.py update_catalog`` to update all existing language
catalogs in ``sphinx/locale/*/LC_MESSAGES`` with the current messages in the
template file.
* Use ``python setup.py compile_catalog`` to compile the ``.po`` files to binary
``.mo`` files and ``.js`` files.
When an updated ``.po`` file is submitted, run compile_catalog to commit both
the source and the compiled catalogs.
When a new locale is submitted, add a new directory with the ISO 639-1 language
identifier and put ``sphinx.po`` in there. Don't forget to update the possible
values for :confval:`language` in ``doc/usage/configuration.rst``.
The Sphinx core messages can also be translated on `Transifex
<https://www.transifex.com/>`_. There exists a client tool named ``tx`` in the
Python package "transifex_client", which can be used to pull translations in
``.po`` format from Transifex. To do this, go to ``sphinx/locale`` and then run
``tx pull -f -l LANG`` where LANG is an existing language identifier. It is
good practice to run ``python setup.py update_catalog`` afterwards to make sure
the ``.po`` file has the canonical Babel formatting.
Coding Guide
------------
@ -439,3 +414,42 @@ and other ``test_*.py`` files under ``tests`` directory.
.. versionadded:: 1.8
Sphinx also runs JavaScript tests.
Release procedures
------------------
The release procedures are listed on ``utils/release-checklist``.
Locale Updates
~~~~~~~~~~~~~~
The parts of messages in Sphinx that go into builds are translated into several
locales. The translations are kept as gettext ``.po`` files translated from the
master template :file:`sphinx/locale/sphinx.pot`.
Sphinx uses `Babel <http://babel.pocoo.org/en/latest/>`_ to extract messages
and maintain the catalog files. It is integrated in ``setup.py``:
* Use ``python setup.py extract_messages`` to update the ``.pot`` template.
* Use ``python setup.py update_catalog`` to update all existing language
catalogs in ``sphinx/locale/*/LC_MESSAGES`` with the current messages in the
template file.
* Use ``python setup.py compile_catalog`` to compile the ``.po`` files to binary
``.mo`` files and ``.js`` files.
When an updated ``.po`` file is submitted, run compile_catalog to commit both
the source and the compiled catalogs.
When a new locale is submitted, add a new directory with the ISO 639-1 language
identifier and put ``sphinx.po`` in there. Don't forget to update the possible
values for :confval:`language` in ``doc/usage/configuration.rst``.
The Sphinx core messages can also be translated on `Transifex
<https://www.transifex.com/>`_. There exists a client tool named ``tx`` in the
Python package "transifex_client", which can be used to pull translations in
``.po`` format from Transifex. To do this, go to ``sphinx/locale`` and then run
``tx pull -f -l LANG`` where LANG is an existing language identifier. It is
good practice to run ``python setup.py update_catalog`` afterwards to make sure
the ``.po`` file has the canonical Babel formatting.
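
The catalog-update commands above lend themselves to a tiny helper script. A minimal sketch (not part of the Sphinx repository; the function name is made up) that chains them with ``subprocess``:

import subprocess
import sys

def refresh_catalogs(compile_mo: bool = False) -> None:
    # update the .pot template, then refresh the per-language catalogs
    commands = [["extract_messages"], ["update_catalog"]]
    if compile_mo:
        # also build the binary .mo and .js files before committing
        commands.append(["compile_catalog"])
    for cmd in commands:
        subprocess.run([sys.executable, "setup.py", *cmd], check=True)

Running ``refresh_catalogs(compile_mo=True)`` after a ``tx pull`` mirrors the manual steps described above.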

View File

@ -248,6 +248,7 @@ Documentation using sphinx_rtd_theme
* `PyPy <http://doc.pypy.org/>`__
* `python-sqlparse <https://sqlparse.readthedocs.io/>`__
* `PyVISA <https://pyvisa.readthedocs.io/>`__
* `pyvista <https://docs.pyvista.org/>`__
* `Read The Docs <https://docs.readthedocs.io/>`__
* `ROCm Platform <https://rocm-documentation.readthedocs.io/>`__
* `Free your information from their silos (French) <http://redaction-technique.org/>`__ (customized)

View File

@ -31,7 +31,7 @@ clean-backupfiles:
clean-generated:
find . -name '.DS_Store' -exec rm -f {} +
rm -rf Sphinx.egg-info/
rm -rf dists/
rm -rf dist/
rm -rf doc/_build/
rm -f sphinx/pycode/*.pickle
rm -f utils/*3.py*
@ -50,7 +50,7 @@ clean-buildfiles:
.PHONY: clean-mypyfiles
clean-mypyfiles:
rm -rf .mypy_cache/
rm -rf **/.mypy_cache/
.PHONY: style-check
style-check:

View File

@ -92,8 +92,8 @@
create a customized documentation using Sphinx written by the matplotlib
developers.{%endtrans%}</p>
<p>{%trans%}There is a <a href="http://docs.sphinx-users.jp/">Japanese translation</a>
of this documentation, thanks to the Japanese Sphinx user group.{%endtrans%}</p>
<p>{%trans%}There is a translation team in <a href="https://www.transifex.com/sphinx-doc/sphinx-doc/dashboard/">Transifex</a>
of this documentation, thanks to the Sphinx document translators.{%endtrans%}</p>
<p>{%trans%}A Japanese book about Sphinx has been published by O'Reilly:
<a href="https://www.oreilly.co.jp/books/9784873116488/">Sphinxをはじめよう /
Learning Sphinx</a>.{%endtrans%}</p>

View File

@ -1,4 +1,4 @@
:tocdepth: 2
:tocdepth: 1
.. default-role:: any

View File

@ -53,21 +53,21 @@ _keywords = [
# these are ordered by precedence
_expression_bin_ops = [
['||'],
['&&'],
['|'],
['^'],
['&'],
['==', '!='],
['||', 'or'],
['&&', 'and'],
['|', 'bitor'],
['^', 'xor'],
['&', 'bitand'],
['==', '!=', 'not_eq'],
['<=', '>=', '<', '>'],
['<<', '>>'],
['+', '-'],
['*', '/', '%'],
['.*', '->*']
]
_expression_unary_ops = ["++", "--", "*", "&", "+", "-", "!", "~"]
_expression_unary_ops = ["++", "--", "*", "&", "+", "-", "!", "not", "~", "compl"]
_expression_assignment_ops = ["=", "*=", "/=", "%=", "+=", "-=",
">>=", "<<=", "&=", "^=", "|="]
">>=", "<<=", "&=", "and_eq", "^=", "xor_eq", "|=", "or_eq"]
_max_id = 1
_id_prefix = [None, 'c.', 'Cv2.']
@ -423,11 +423,16 @@ class ASTUnaryOpExpr(ASTExpression):
self.expr = expr
def _stringify(self, transform: StringifyTransform) -> str:
if self.op[0] in 'cn':
return transform(self.op) + " " + transform(self.expr)
else:
return transform(self.op) + transform(self.expr)
def describe_signature(self, signode: TextElement, mode: str,
env: "BuildEnvironment", symbol: "Symbol") -> None:
signode.append(nodes.Text(self.op))
if self.op[0] in 'cn':
signode.append(nodes.Text(" "))
self.expr.describe_signature(signode, mode, env, symbol)
@ -1670,7 +1675,7 @@ class Symbol:
onMissingQualifiedSymbol,
ancestorLookupType=None,
matchSelf=False,
recurseInAnon=True,
recurseInAnon=False,
searchInSiblings=False)
assert lookupResult is not None # we create symbols all the way, so that can't happen
symbols = list(lookupResult.symbols)
@ -1981,6 +1986,10 @@ class DefinitionParser(BaseParser):
_prefix_keys = ('struct', 'enum', 'union')
@property
def language(self) -> str:
return 'C'
def _parse_string(self) -> str:
if self.current_char != '"':
return None
@ -2241,7 +2250,11 @@ class DefinitionParser(BaseParser):
self.skip_ws()
for op in _expression_unary_ops:
# TODO: hmm, should we be able to backtrack here?
if self.skip_string(op):
if op[0] in 'cn':
res = self.skip_word(op)
else:
res = self.skip_string(op)
if res:
expr = self._parse_cast_expression()
return ASTUnaryOpExpr(op, expr)
if self.skip_word_and_ws('sizeof'):
@ -2313,6 +2326,10 @@ class DefinitionParser(BaseParser):
pos = self.pos
oneMore = False
for op in _expression_bin_ops[opId]:
if op[0] in 'abcnox':
if not self.skip_word(op):
continue
else:
if not self.skip_string(op):
continue
if op == '&' and self.current_char == '&':
@ -2353,6 +2370,10 @@ class DefinitionParser(BaseParser):
oneMore = False
self.skip_ws()
for op in _expression_assignment_ops:
if op[0] in 'abcnox':
if not self.skip_word(op):
continue
else:
if not self.skip_string(op):
continue
expr = self._parse_logical_or_expression()
@ -2680,7 +2701,7 @@ class DefinitionParser(BaseParser):
restrict=restrict, volatile=volatile, const=const,
attrs=attrs)
if typed and self.current_char == '(': # note: peeking, not skipping
# maybe this is the beginning of params,try that first,
# maybe this is the beginning of params, try that first,
# otherwise assume it's noptr->declarator > ( ptr-declarator )
pos = self.pos
try:
@ -2689,7 +2710,10 @@ class DefinitionParser(BaseParser):
typed)
return res
except DefinitionError as exParamQual:
prevErrors.append((exParamQual, "If declId and parameters"))
msg = "If declarator-id with parameters"
if paramMode == 'function':
msg += " (e.g., 'void f(int arg)')"
prevErrors.append((exParamQual, msg))
self.pos = pos
try:
assert self.current_char == '('
@ -2706,7 +2730,10 @@ class DefinitionParser(BaseParser):
return ASTDeclaratorParen(inner=inner, next=next)
except DefinitionError as exNoPtrParen:
self.pos = pos
prevErrors.append((exNoPtrParen, "If parenthesis in noptr-declarator"))
msg = "If parenthesis in noptr-declarator"
if paramMode == 'function':
msg += " (e.g., 'void (*f(int arg))(double)')"
prevErrors.append((exNoPtrParen, msg))
header = "Error in declarator"
raise self._make_multi_error(prevErrors, header)
pos = self.pos
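
The ``skip_word``/``skip_string`` dispatch added above is what keeps the alternate spellings from being matched inside identifiers: a keyword-like operator must end at a word boundary, while punctuation operators are matched literally. A self-contained sketch of the same dispatch, outside the real parser classes (``match_unary_op`` and its use of ``re`` are illustrative, not Sphinx API):

import re

# mirrors the operator table added to sphinx.domains.c above
_expression_unary_ops = ["++", "--", "*", "&", "+", "-", "!", "not", "~", "compl"]

def match_unary_op(text: str, pos: int = 0):
    for op in _expression_unary_ops:
        if op[0].isalpha():            # 'not', 'compl' -> the op[0] in 'cn' case
            m = re.compile(r'\b%s\b' % op).match(text, pos)
        else:                          # punctuation -> plain string match
            m = re.compile(re.escape(op)).match(text, pos)
        if m:
            return op, m.end()
    return None, pos

print(match_unary_op("not x"))   # ('not', 3)
print(match_unary_op("note"))    # (None, 0) -- an identifier, not an operator
print(match_unary_op("!x"))      # ('!', 1)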

View File

@ -306,6 +306,7 @@ _operator_re = re.compile(r'''(?x)
| ->\*? | \,
| (<<|>>)=? | && | \|\|
| [!<>=/*%+|&^~-]=?
| (\b(and|and_eq|bitand|bitor|compl|not|not_eq|or|or_eq|xor|xor_eq)\b)
''')
_fold_operator_re = re.compile(r'''(?x)
->\* | \.\* | \,
@ -464,37 +465,37 @@ _id_operator_v2 = {
# '-(unary)' : 'ng',
# '&(unary)' : 'ad',
# '*(unary)' : 'de',
'~': 'co',
'~': 'co', 'compl': 'co',
'+': 'pl',
'-': 'mi',
'*': 'ml',
'/': 'dv',
'%': 'rm',
'&': 'an',
'|': 'or',
'^': 'eo',
'&': 'an', 'bitand': 'an',
'|': 'or', 'bitor': 'or',
'^': 'eo', 'xor': 'eo',
'=': 'aS',
'+=': 'pL',
'-=': 'mI',
'*=': 'mL',
'/=': 'dV',
'%=': 'rM',
'&=': 'aN',
'|=': 'oR',
'^=': 'eO',
'&=': 'aN', 'and_eq': 'aN',
'|=': 'oR', 'or_eq': 'oR',
'^=': 'eO', 'xor_eq': 'eO',
'<<': 'ls',
'>>': 'rs',
'<<=': 'lS',
'>>=': 'rS',
'==': 'eq',
'!=': 'ne',
'!=': 'ne', 'not_eq': 'ne',
'<': 'lt',
'>': 'gt',
'<=': 'le',
'>=': 'ge',
'!': 'nt',
'&&': 'aa',
'||': 'oo',
'!': 'nt', 'not': 'nt',
'&&': 'aa', 'and': 'aa',
'||': 'oo', 'or': 'oo',
'++': 'pp',
'--': 'mm',
',': 'cm',
@ -511,8 +512,8 @@ _id_operator_unary_v2 = {
'&': 'ad',
'+': 'ps',
'-': 'ng',
'!': 'nt',
'~': 'co'
'!': 'nt', 'not': 'nt',
'~': 'co', 'compl': 'co'
}
_id_char_from_prefix = {
None: 'c', 'u8': 'c',
@ -520,21 +521,21 @@ _id_char_from_prefix = {
} # type: Dict[Any, str]
# these are ordered by precedence
_expression_bin_ops = [
['||'],
['&&'],
['|'],
['^'],
['&'],
['==', '!='],
['||', 'or'],
['&&', 'and'],
['|', 'bitor'],
['^', 'xor'],
['&', 'bitand'],
['==', '!=', 'not_eq'],
['<=', '>=', '<', '>'],
['<<', '>>'],
['+', '-'],
['*', '/', '%'],
['.*', '->*']
]
_expression_unary_ops = ["++", "--", "*", "&", "+", "-", "!", "~"]
_expression_unary_ops = ["++", "--", "*", "&", "+", "-", "!", "not", "~", "compl"]
_expression_assignment_ops = ["=", "*=", "/=", "%=", "+=", "-=",
">>=", "<<=", "&=", "^=", "|="]
">>=", "<<=", "&=", "and_eq", "^=", "|=", "xor_eq", "or_eq"]
_id_explicit_cast = {
'dynamic_cast': 'dc',
'static_cast': 'sc',
@ -1260,6 +1261,9 @@ class ASTUnaryOpExpr(ASTExpression):
self.expr = expr
def _stringify(self, transform: StringifyTransform) -> str:
if self.op[0] in 'cn':
return transform(self.op) + " " + transform(self.expr)
else:
return transform(self.op) + transform(self.expr)
def get_id(self, version: int) -> str:
@ -1268,6 +1272,8 @@ class ASTUnaryOpExpr(ASTExpression):
def describe_signature(self, signode: TextElement, mode: str,
env: "BuildEnvironment", symbol: "Symbol") -> None:
signode.append(nodes.Text(self.op))
if self.op[0] in 'cn':
signode.append(nodes.Text(' '))
self.expr.describe_signature(signode, mode, env, symbol)
@ -1499,8 +1505,38 @@ class ASTBinOpExpr(ASTExpression):
self.exprs[i].describe_signature(signode, mode, env, symbol)
class ASTBracedInitList(ASTBase):
def __init__(self, exprs: List[Union[ASTExpression, "ASTBracedInitList"]],
trailingComma: bool) -> None:
self.exprs = exprs
self.trailingComma = trailingComma
def get_id(self, version: int) -> str:
return "il%sE" % ''.join(e.get_id(version) for e in self.exprs)
def _stringify(self, transform: StringifyTransform) -> str:
exprs = [transform(e) for e in self.exprs]
trailingComma = ',' if self.trailingComma else ''
return '{%s%s}' % (', '.join(exprs), trailingComma)
def describe_signature(self, signode: TextElement, mode: str,
env: "BuildEnvironment", symbol: "Symbol") -> None:
verify_description_mode(mode)
signode.append(nodes.Text('{'))
first = True
for e in self.exprs:
if not first:
signode.append(nodes.Text(', '))
else:
first = False
e.describe_signature(signode, mode, env, symbol)
if self.trailingComma:
signode.append(nodes.Text(','))
signode.append(nodes.Text('}'))
class ASTAssignmentExpr(ASTExpression):
def __init__(self, exprs: List[ASTExpression], ops: List[str]):
def __init__(self, exprs: List[Union[ASTExpression, ASTBracedInitList]], ops: List[str]):
assert len(exprs) > 0
assert len(exprs) == len(ops) + 1
self.exprs = exprs
@ -1534,6 +1570,31 @@ class ASTAssignmentExpr(ASTExpression):
self.exprs[i].describe_signature(signode, mode, env, symbol)
class ASTCommaExpr(ASTExpression):
def __init__(self, exprs: List[ASTExpression]):
assert len(exprs) > 0
self.exprs = exprs
def _stringify(self, transform: StringifyTransform) -> str:
return ', '.join(transform(e) for e in self.exprs)
def get_id(self, version: int) -> str:
id_ = _id_operator_v2[',']
res = []
for i in range(len(self.exprs) - 1):
res.append(id_)
res.append(self.exprs[i].get_id(version))
res.append(self.exprs[-1].get_id(version))
return ''.join(res)
def describe_signature(self, signode: TextElement, mode: str,
env: "BuildEnvironment", symbol: "Symbol") -> None:
self.exprs[0].describe_signature(signode, mode, env, symbol)
for i in range(1, len(self.exprs)):
signode.append(nodes.Text(', '))
self.exprs[i].describe_signature(signode, mode, env, symbol)
class ASTFallbackExpr(ASTExpression):
def __init__(self, expr: str):
self.expr = expr
@ -1584,6 +1645,8 @@ class ASTOperatorBuildIn(ASTOperator):
def get_id(self, version: int) -> str:
if version == 1:
ids = _id_operator_v1
if self.op not in ids:
raise NoOldIdError()
else:
ids = _id_operator_v2
if self.op not in ids:
@ -1592,7 +1655,7 @@ class ASTOperatorBuildIn(ASTOperator):
return ids[self.op]
def _stringify(self, transform: StringifyTransform) -> str:
if self.op in ('new', 'new[]', 'delete', 'delete[]'):
if self.op in ('new', 'new[]', 'delete', 'delete[]') or self.op[0] in "abcnox":
return 'operator ' + self.op
else:
return 'operator' + self.op
@ -1650,9 +1713,11 @@ class ASTTemplateArgConstant(ASTBase):
class ASTTemplateArgs(ASTBase):
def __init__(self, args: List[Union["ASTType", ASTTemplateArgConstant]]) -> None:
def __init__(self, args: List[Union["ASTType", ASTTemplateArgConstant]],
packExpansion: bool) -> None:
assert args is not None
self.args = args
self.packExpansion = packExpansion
def get_id(self, version: int) -> str:
if version == 1:
@ -1664,13 +1729,21 @@ class ASTTemplateArgs(ASTBase):
res = []
res.append('I')
for a in self.args:
if len(self.args) > 0:
for a in self.args[:-1]:
res.append(a.get_id(version))
if self.packExpansion:
res.append('J')
res.append(self.args[-1].get_id(version))
if self.packExpansion:
res.append('E')
res.append('E')
return ''.join(res)
def _stringify(self, transform: StringifyTransform) -> str:
res = ', '.join(transform(a) for a in self.args)
if self.packExpansion:
res += '...'
return '<' + res + '>'
def describe_signature(self, signode: TextElement, mode: str,
@ -1683,6 +1756,8 @@ class ASTTemplateArgs(ASTBase):
signode += nodes.Text(', ')
first = False
a.describe_signature(signode, 'markType', env, symbol=symbol)
if self.packExpansion:
signode += nodes.Text('...')
signode += nodes.Text('>')
@ -2633,7 +2708,7 @@ class ASTDeclaratorParen(ASTDeclarator):
##############################################################################################
class ASTPackExpansionExpr(ASTExpression):
def __init__(self, expr: ASTExpression):
def __init__(self, expr: Union[ASTExpression, ASTBracedInitList]):
self.expr = expr
def _stringify(self, transform: StringifyTransform) -> str:
@ -2650,7 +2725,7 @@ class ASTPackExpansionExpr(ASTExpression):
class ASTParenExprList(ASTBase):
def __init__(self, exprs: List[ASTExpression]) -> None:
def __init__(self, exprs: List[Union[ASTExpression, ASTBracedInitList]]) -> None:
self.exprs = exprs
def get_id(self, version: int) -> str:
@ -2674,35 +2749,6 @@ class ASTParenExprList(ASTBase):
signode.append(nodes.Text(')'))
class ASTBracedInitList(ASTBase):
def __init__(self, exprs: List[ASTExpression], trailingComma: bool) -> None:
self.exprs = exprs
self.trailingComma = trailingComma
def get_id(self, version: int) -> str:
return "il%sE" % ''.join(e.get_id(version) for e in self.exprs)
def _stringify(self, transform: StringifyTransform) -> str:
exprs = [transform(e) for e in self.exprs]
trailingComma = ',' if self.trailingComma else ''
return '{%s%s}' % (', '.join(exprs), trailingComma)
def describe_signature(self, signode: TextElement, mode: str,
env: "BuildEnvironment", symbol: "Symbol") -> None:
verify_description_mode(mode)
signode.append(nodes.Text('{'))
first = True
for e in self.exprs:
if not first:
signode.append(nodes.Text(', '))
else:
first = False
e.describe_signature(signode, mode, env, symbol)
if self.trailingComma:
signode.append(nodes.Text(','))
signode.append(nodes.Text('}'))
class ASTInitializer(ASTBase):
def __init__(self, value: Union[ASTExpression, ASTBracedInitList],
hasAssign: bool = True) -> None:
@ -4113,7 +4159,7 @@ class Symbol:
ancestorLookupType=None,
templateShorthand=False,
matchSelf=False,
recurseInAnon=True,
recurseInAnon=False,
correctPrimaryTemplateArgs=True,
searchInSiblings=False)
assert lookupResult is not None # we create symbols all the way, so that can't happen
@ -4568,6 +4614,10 @@ class DefinitionParser(BaseParser):
super().__init__(definition, location=location)
self.config = config
@property
def language(self) -> str:
return 'C++'
def _parse_string(self) -> str:
if self.current_char != '"':
return None
@ -4743,7 +4793,7 @@ class DefinitionParser(BaseParser):
self.pos = pos
# fall back to a paren expression
try:
res = self._parse_expression(inTemplate=False)
res = self._parse_expression()
self.skip_ws()
if not self.skip_string(')'):
self.fail("Expected ')' in end of parenthesized expression.")
@ -4791,7 +4841,9 @@ class DefinitionParser(BaseParser):
return None
def _parse_initializer_list(self, name: str, open: str, close: str
) -> Tuple[List[ASTExpression], bool]:
) -> Tuple[List[Union[ASTExpression,
ASTBracedInitList]],
bool]:
# Parse open and close with the actual initializer-list in between
# -> initializer-clause '...'[opt]
# | initializer-list ',' initializer-clause '...'[opt]
@ -4801,11 +4853,11 @@ class DefinitionParser(BaseParser):
if self.skip_string(close):
return [], False
exprs = [] # type: List[ASTExpression]
exprs = [] # type: List[Union[ASTExpression, ASTBracedInitList]]
trailingComma = False
while True:
self.skip_ws()
expr = self._parse_expression(inTemplate=False)
expr = self._parse_initializer_clause()
self.skip_ws()
if self.skip_string('...'):
exprs.append(ASTPackExpansionExpr(expr))
@ -4835,6 +4887,12 @@ class DefinitionParser(BaseParser):
return None
return ASTParenExprList(exprs)
def _parse_initializer_clause(self) -> Union[ASTExpression, ASTBracedInitList]:
bracedInitList = self._parse_braced_init_list()
if bracedInitList is not None:
return bracedInitList
return self._parse_assignment_expression(inTemplate=False)
def _parse_braced_init_list(self) -> ASTBracedInitList:
# -> '{' initializer-list ','[opt] '}'
# | '{' '}'
@ -4894,7 +4952,7 @@ class DefinitionParser(BaseParser):
self.fail("Expected '(' in '%s'." % cast)
def parser():
return self._parse_expression(inTemplate=False)
return self._parse_expression()
expr = self._parse_expression_fallback([')'], parser)
self.skip_ws()
if not self.skip_string(")"):
@ -4915,7 +4973,7 @@ class DefinitionParser(BaseParser):
try:
def parser():
return self._parse_expression(inTemplate=False)
return self._parse_expression()
expr = self._parse_expression_fallback([')'], parser)
prefix = ASTTypeId(expr, isType=False)
if not self.skip_string(')'):
@ -4962,7 +5020,7 @@ class DefinitionParser(BaseParser):
self.skip_ws()
if prefixType in ['expr', 'cast', 'typeid']:
if self.skip_string_and_ws('['):
expr = self._parse_expression(inTemplate=False)
expr = self._parse_expression()
self.skip_ws()
if not self.skip_string(']'):
self.fail("Expected ']' in end of postfix expression.")
@ -5016,7 +5074,11 @@ class DefinitionParser(BaseParser):
self.skip_ws()
for op in _expression_unary_ops:
# TODO: hmm, should we be able to backtrack here?
if self.skip_string(op):
if op[0] in 'cn':
res = self.skip_word(op)
else:
res = self.skip_string(op)
if res:
expr = self._parse_cast_expression()
return ASTUnaryOpExpr(op, expr)
if self.skip_word_and_ws('sizeof'):
@ -5049,7 +5111,7 @@ class DefinitionParser(BaseParser):
if self.skip_word_and_ws('noexcept'):
if not self.skip_string_and_ws('('):
self.fail("Expecting '(' after 'noexcept'.")
expr = self._parse_expression(inTemplate=False)
expr = self._parse_expression()
self.skip_ws()
if not self.skip_string(')'):
self.fail("Expecting ')' to end 'noexcept'.")
@ -5144,6 +5206,10 @@ class DefinitionParser(BaseParser):
pos = self.pos
oneMore = False
for op in _expression_bin_ops[opId]:
if op[0] in 'abcnox':
if not self.skip_word(op):
continue
else:
if not self.skip_string(op):
continue
if op == '&' and self.current_char == '&':
@ -5178,7 +5244,7 @@ class DefinitionParser(BaseParser):
# logical-or-expression
# | logical-or-expression "?" expression ":" assignment-expression
# | logical-or-expression assignment-operator initializer-clause
exprs = []
exprs = [] # type: List[Union[ASTExpression, ASTBracedInitList]]
ops = []
orExpr = self._parse_logical_or_expression(inTemplate=inTemplate)
exprs.append(orExpr)
@ -5187,9 +5253,13 @@ class DefinitionParser(BaseParser):
oneMore = False
self.skip_ws()
for op in _expression_assignment_ops:
if op[0] in 'anox':
if not self.skip_word(op):
continue
else:
if not self.skip_string(op):
continue
expr = self._parse_logical_or_expression(False)
expr = self._parse_initializer_clause()
exprs.append(expr)
ops.append(op)
oneMore = True
@ -5206,11 +5276,19 @@ class DefinitionParser(BaseParser):
# TODO: use _parse_conditional_expression_tail
return orExpr
def _parse_expression(self, inTemplate: bool) -> ASTExpression:
def _parse_expression(self) -> ASTExpression:
# -> assignment-expression
# | expression "," assignment-expresion
# TODO: actually parse the second production
return self._parse_assignment_expression(inTemplate=inTemplate)
exprs = [self._parse_assignment_expression(inTemplate=False)]
while True:
self.skip_ws()
if not self.skip_string(','):
break
exprs.append(self._parse_assignment_expression(inTemplate=False))
if len(exprs) == 1:
return exprs[0]
else:
return ASTCommaExpr(exprs)
def _parse_expression_fallback(self, end: List[str],
parser: Callable[[], ASTExpression],
@ -5289,13 +5367,21 @@ class DefinitionParser(BaseParser):
return ASTOperatorType(type)
def _parse_template_argument_list(self) -> ASTTemplateArgs:
# template-argument-list: (but we include the < and > here
# template-argument ...[opt]
# template-argument-list, template-argument ...[opt]
# template-argument:
# constant-expression
# type-id
# id-expression
self.skip_ws()
if not self.skip_string_and_ws('<'):
return None
if self.skip_string('>'):
return ASTTemplateArgs([])
return ASTTemplateArgs([], False)
prevErrors = []
templateArgs = [] # type: List[Union[ASTType, ASTTemplateArgConstant]]
packExpansion = False
while 1:
pos = self.pos
parsedComma = False
@ -5303,31 +5389,35 @@ class DefinitionParser(BaseParser):
try:
type = self._parse_type(named=False)
self.skip_ws()
if self.skip_string('>'):
if self.skip_string_and_ws('...'):
packExpansion = True
parsedEnd = True
if not self.skip_string('>'):
self.fail('Expected ">" after "..." in template argument list.')
elif self.skip_string('>'):
parsedEnd = True
elif self.skip_string(','):
parsedComma = True
else:
self.fail('Expected ">" or "," in template argument list.')
self.fail('Expected "...>", ">" or "," in template argument list.')
templateArgs.append(type)
except DefinitionError as e:
prevErrors.append((e, "If type argument"))
self.pos = pos
try:
# actually here we shouldn't use the fallback parser (hence allow=False),
# because if actually took the < in an expression, then we _will_ fail,
# which is handled elsewhere. E.g., :cpp:expr:`A <= 0`.
def parser():
return self._parse_constant_expression(inTemplate=True)
value = self._parse_expression_fallback(
[',', '>'], parser, allow=False)
value = self._parse_constant_expression(inTemplate=True)
self.skip_ws()
if self.skip_string('>'):
if self.skip_string_and_ws('...'):
packExpansion = True
parsedEnd = True
if not self.skip_string('>'):
self.fail('Expected ">" after "..." in template argument list.')
elif self.skip_string('>'):
parsedEnd = True
elif self.skip_string(','):
parsedComma = True
else:
self.fail('Expected ">" or "," in template argument list.')
self.fail('Expected "...>", ">" or "," in template argument list.')
templateArgs.append(ASTTemplateArgConstant(value))
except DefinitionError as e:
self.pos = pos
@ -5337,7 +5427,9 @@ class DefinitionParser(BaseParser):
if parsedEnd:
assert not parsedComma
break
return ASTTemplateArgs(templateArgs)
else:
assert not packExpansion
return ASTTemplateArgs(templateArgs, packExpansion)
def _parse_nested_name(self, memberPointer: bool = False) -> ASTNestedName:
names = [] # type: List[ASTNestedNameElement]
@ -5427,7 +5519,7 @@ class DefinitionParser(BaseParser):
if not self.skip_string(')'):
self.fail("Expected ')' after 'decltype(auto'.")
return ASTTrailingTypeSpecDecltypeAuto()
expr = self._parse_expression(inTemplate=False)
expr = self._parse_expression()
self.skip_ws()
if not self.skip_string(')'):
self.fail("Expected ')' after 'decltype(<expr>'.")
@ -5440,7 +5532,6 @@ class DefinitionParser(BaseParser):
if self.skip_word_and_ws(k):
prefix = k
break
nestedName = self._parse_nested_name()
return ASTTrailingTypeSpecName(prefix, nestedName)
@ -5450,7 +5541,7 @@ class DefinitionParser(BaseParser):
self.skip_ws()
if not self.skip_string('('):
if paramMode == 'function':
self.fail('Expecting "(" in parameters_and_qualifiers.')
self.fail('Expecting "(" in parameters-and-qualifiers.')
else:
return None
args = []
@ -5463,7 +5554,7 @@ class DefinitionParser(BaseParser):
self.skip_ws()
if not self.skip_string(')'):
self.fail('Expected ")" after "..." in '
'parameters_and_qualifiers.')
'parameters-and-qualifiers.')
break
# note: it seems that function arguments can always be named,
# even in function pointers and similar.
@ -5478,7 +5569,7 @@ class DefinitionParser(BaseParser):
break
else:
self.fail(
'Expecting "," or ")" in parameters_and_qualifiers, '
'Expecting "," or ")" in parameters-and-qualifiers, '
'got "%s".' % self.current_char)
# TODO: why did we have this bail-out?
@ -5509,7 +5600,7 @@ class DefinitionParser(BaseParser):
exceptionSpec = 'noexcept'
self.skip_ws()
if self.skip_string('('):
self.fail('Parameterised "noexcept" not implemented.')
self.fail('Parameterised "noexcept" not yet implemented.')
self.skip_ws()
override = self.skip_word_and_ws('override')
@ -5672,7 +5763,7 @@ class DefinitionParser(BaseParser):
continue
def parser():
return self._parse_expression(inTemplate=False)
return self._parse_expression()
value = self._parse_expression_fallback([']'], parser)
if not self.skip_string(']'):
self.fail("Expected ']' in end of array operator.")
@ -5734,6 +5825,42 @@ class DefinitionParser(BaseParser):
if typed and self.skip_string("..."):
next = self._parse_declarator(named, paramMode, False)
return ASTDeclaratorParamPack(next=next)
if typed and self.current_char == '(': # note: peeking, not skipping
if paramMode == "operatorCast":
# TODO: we should be able to parse cast operators which return
# function pointers. For now, just hax it and ignore.
return ASTDeclaratorNameParamQual(declId=None, arrayOps=[],
paramQual=None)
# maybe this is the beginning of params and quals,try that first,
# otherwise assume it's noptr->declarator > ( ptr-declarator )
pos = self.pos
try:
# assume this is params and quals
res = self._parse_declarator_name_suffix(named, paramMode,
typed)
return res
except DefinitionError as exParamQual:
prevErrors.append((exParamQual,
"If declarator-id with parameters-and-qualifiers"))
self.pos = pos
try:
assert self.current_char == '('
self.skip_string('(')
# TODO: hmm, if there is a name, it must be in inner, right?
# TODO: hmm, if there must be parameters, they must be
# inside, right?
inner = self._parse_declarator(named, paramMode, typed)
if not self.skip_string(')'):
self.fail("Expected ')' in \"( ptr-declarator )\"")
next = self._parse_declarator(named=False,
paramMode="type",
typed=typed)
return ASTDeclaratorParen(inner=inner, next=next)
except DefinitionError as exNoPtrParen:
self.pos = pos
prevErrors.append((exNoPtrParen, "If parenthesis in noptr-declarator"))
header = "Error in declarator"
raise self._make_multi_error(prevErrors, header)
if typed: # pointer to member
pos = self.pos
try:
@ -5760,48 +5887,18 @@ class DefinitionParser(BaseParser):
break
next = self._parse_declarator(named, paramMode, typed)
return ASTDeclaratorMemPtr(name, const, volatile, next=next)
if typed and self.current_char == '(': # note: peeking, not skipping
if paramMode == "operatorCast":
# TODO: we should be able to parse cast operators which return
# function pointers. For now, just hax it and ignore.
return ASTDeclaratorNameParamQual(declId=None, arrayOps=[],
paramQual=None)
# maybe this is the beginning of params and quals,try that first,
# otherwise assume it's noptr->declarator > ( ptr-declarator )
pos = self.pos
try:
# assume this is params and quals
res = self._parse_declarator_name_suffix(named, paramMode,
typed)
res = self._parse_declarator_name_suffix(named, paramMode, typed)
# this is a heuristic for error messages, for when there is a < after a
# nested name, but it was not a successful template argument list
if self.current_char == '<':
self.otherErrors.append(self._make_multi_error(prevErrors, ""))
return res
except DefinitionError as exParamQual:
prevErrors.append((exParamQual, "If declId, parameters, and qualifiers"))
self.pos = pos
try:
assert self.current_char == '('
self.skip_string('(')
# TODO: hmm, if there is a name, it must be in inner, right?
# TODO: hmm, if there must be parameters, they must b
# inside, right?
inner = self._parse_declarator(named, paramMode, typed)
if not self.skip_string(')'):
self.fail("Expected ')' in \"( ptr-declarator )\"")
next = self._parse_declarator(named=False,
paramMode="type",
typed=typed)
return ASTDeclaratorParen(inner=inner, next=next)
except DefinitionError as exNoPtrParen:
self.pos = pos
prevErrors.append((exNoPtrParen, "If parenthesis in noptr-declarator"))
header = "Error in declarator"
raise self._make_multi_error(prevErrors, header)
pos = self.pos
try:
return self._parse_declarator_name_suffix(named, paramMode, typed)
except DefinitionError as e:
self.pos = pos
prevErrors.append((e, "If declarator-id"))
header = "Error in declarator or parameters and qualifiers"
header = "Error in declarator or parameters-and-qualifiers"
raise self._make_multi_error(prevErrors, header)
def _parse_initializer(self, outer: str = None, allowFallback: bool = True
@ -5866,7 +5963,6 @@ class DefinitionParser(BaseParser):
raise Exception('Internal error, unknown outer "%s".' % outer)
if outer != 'operatorCast':
assert named
if outer in ('type', 'function'):
# We allow type objects to just be a name.
# Some functions don't have normal return types: constructors,
@ -5974,10 +6070,10 @@ class DefinitionParser(BaseParser):
if eExpr is None:
raise eType
errs = []
errs.append((eExpr, "If default is an expression"))
errs.append((eType, "If default is a type"))
errs.append((eExpr, "If default template argument is an expression"))
errs.append((eType, "If default template argument is a type"))
msg = "Error in non-type template parameter"
msg += " or constrianted template paramter."
msg += " or constrained template parameter."
raise self._make_multi_error(errs, msg)
def _parse_type_using(self) -> ASTTypeUsing:
@ -6060,8 +6156,8 @@ class DefinitionParser(BaseParser):
self.skip_ws()
if not self.skip_string("<"):
self.fail("Expected '<' after 'template'")
while 1:
prevErrors = []
while 1:
self.skip_ws()
if self.skip_word('template'):
# declare a template template parameter
@ -6114,6 +6210,7 @@ class DefinitionParser(BaseParser):
if self.skip_string('>'):
return ASTTemplateParams(templateParams)
elif self.skip_string(','):
prevErrors = []
continue
else:
header = "Error in template parameter list."
@ -6333,7 +6430,7 @@ class DefinitionParser(BaseParser):
def parse_expression(self) -> Union[ASTExpression, ASTType]:
pos = self.pos
try:
expr = self._parse_expression(False)
expr = self._parse_expression()
self.skip_ws()
self.assert_end()
return expr
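
One consequence of the new ``ASTCommaExpr`` above is visible in the v2 ids: a comma expression is mangled by interleaving the operator id ``cm`` between the operand ids, which is what the ``'a, 5'`` → ``cm1aL5E`` test below checks. A hand-rolled sketch of that composition (operand ids are written out manually here, not produced by the real AST classes):

def comma_expr_id(operand_ids):
    # mirrors ASTCommaExpr.get_id(): 'cm' (the id of ',') precedes every
    # operand except the last
    comma_id = 'cm'
    parts = []
    for operand in operand_ids[:-1]:
        parts.append(comma_id)
        parts.append(operand)
    parts.append(operand_ids[-1])
    return ''.join(parts)

print(comma_expr_id(['1a', 'L5E']))   # cm1aL5E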

View File

@ -718,6 +718,8 @@ def process_generate_options(app: Sphinx) -> None:
env = app.builder.env
genfiles = [env.doc2path(x, base=None) for x in env.found_docs
if os.path.isfile(env.doc2path(x))]
elif genfiles is False:
pass
else:
ext = list(app.config.source_suffix)
genfiles = [genfile + (ext[0] if not genfile.endswith(tuple(ext)) else '')
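
For reference, the branch added above closes the gap where ``autosummary_generate = False`` fell through to the list-handling code and crashed (#7364). A condensed sketch of the three-way handling, using made-up helper names rather than the real ``process_generate_options`` internals:

def resolve_genfiles(genfiles, found_docs, source_suffixes):
    if genfiles is True:
        # generate stubs for every source document that exists on disk
        return list(found_docs)
    elif genfiles is False:
        # generation disabled: nothing to do (previously fell through and crashed)
        return []
    else:
        # an explicit list of documents: ensure each entry carries a source suffix
        return [f if f.endswith(tuple(source_suffixes)) else f + source_suffixes[0]
                for f in genfiles]

print(resolve_genfiles(False, ['index', 'api'], ['.rst']))    # []
print(resolve_genfiles(['api'], ['index', 'api'], ['.rst']))  # ['api.rst']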

View File

@ -275,7 +275,7 @@ def generate_autosummary_docs(sources: List[str], output_dir: str = None,
try:
name, obj, parent, mod_name = import_by_name(entry.name)
except ImportError as e:
_warn(__('[autosummary] failed to import %r: %s') % (name, e))
_warn(__('[autosummary] failed to import %r: %s') % (entry.name, e))
continue
content = generate_autosummary_content(name, obj, parent, template, entry.template,
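
The one-word change above is the whole fix for #7370: when ``import_by_name`` raises, ``name`` was never bound, so formatting the warning with it raised ``UnboundLocalError`` instead of reporting the failed import. A standalone reproduction of the pattern (``import_by_name`` is stubbed out here; the real one lives in ``sphinx.ext.autosummary``):

def import_by_name(name):
    # stand-in that always fails, like an unknown module would
    raise ImportError("no module named %r" % name)

def generate_for(entry_name):
    try:
        name, obj, parent, mod_name = import_by_name(entry_name)
    except ImportError as e:
        # old: ... % (name, e)  -> UnboundLocalError, `name` was never assigned
        # new: report the name we were asked to import
        print('[autosummary] failed to import %r: %s' % (entry_name, e))

generate_for('not.a.module')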

View File

@ -154,19 +154,23 @@ class BaseParser:
result = [header, '\n']
for e in errors:
if len(e[1]) > 0:
ident = ' '
indent = ' '
result.append(e[1])
result.append(':\n')
for line in str(e[0]).split('\n'):
if len(line) == 0:
continue
result.append(ident)
result.append(indent)
result.append(line)
result.append('\n')
else:
result.append(str(e[0]))
return DefinitionError(''.join(result))
@property
def language(self) -> str:
raise NotImplementedError
def status(self, msg: str) -> None:
# for debugging
indicator = '-' * self.pos + '^'
@ -176,8 +180,8 @@ class BaseParser:
errors = []
indicator = '-' * self.pos + '^'
exMain = DefinitionError(
'Invalid definition: %s [error at %d]\n %s\n %s' %
(msg, self.pos, self.definition, indicator))
'Invalid %s declaration: %s [error at %d]\n %s\n %s' %
(self.language, msg, self.pos, self.definition, indicator))
errors.append((exMain, "Main error"))
for err in self.otherErrors:
errors.append((err, "Potential other error"))
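
With the new ``language`` property, both domains now prefix their parse errors with the language, so a bad declaration reports ``Invalid C++ declaration: ...`` rather than the generic ``Invalid definition: ...``. A minimal sketch of the pattern (simplified classes, not the real ``BaseParser``/``DefinitionError``):

class ParseError(Exception):
    pass

class MiniParser:
    def __init__(self, definition: str) -> None:
        self.definition = definition
        self.pos = 0

    @property
    def language(self) -> str:
        raise NotImplementedError

    def fail(self, msg: str) -> None:
        indicator = '-' * self.pos + '^'
        raise ParseError('Invalid %s declaration: %s [error at %d]\n  %s\n  %s'
                         % (self.language, msg, self.pos, self.definition, indicator))

class MiniCppParser(MiniParser):
    @property
    def language(self) -> str:
        return 'C++'

try:
    MiniCppParser('void 1f()').fail('Expected identifier in nested name.')
except ParseError as e:
    print(e)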

View File

@ -0,0 +1,5 @@
.. c:struct:: anon_dup_decl
.. c:struct:: @a.A
.. c:struct:: @b.A
.. c:struct:: A

View File

@ -0,0 +1,4 @@
.. cpp:namespace:: anon_dup_decl
.. cpp:class:: @a::A
.. cpp:class:: @b::A
.. cpp:class:: A

View File

@ -172,7 +172,9 @@ def test_expressions():
exprCheck('+5')
exprCheck('-5')
exprCheck('!5')
exprCheck('not 5')
exprCheck('~5')
exprCheck('compl 5')
exprCheck('sizeof(T)')
exprCheck('sizeof -42')
exprCheck('alignof(T)')
@ -180,13 +182,19 @@ def test_expressions():
exprCheck('(int)2')
# binary op
exprCheck('5 || 42')
exprCheck('5 or 42')
exprCheck('5 && 42')
exprCheck('5 and 42')
exprCheck('5 | 42')
exprCheck('5 bitor 42')
exprCheck('5 ^ 42')
exprCheck('5 xor 42')
exprCheck('5 & 42')
exprCheck('5 bitand 42')
# ['==', '!=']
exprCheck('5 == 42')
exprCheck('5 != 42')
exprCheck('5 not_eq 42')
# ['<=', '>=', '<', '>']
exprCheck('5 <= 42')
exprCheck('5 >= 42')
@ -215,8 +223,11 @@ def test_expressions():
exprCheck('a >>= 5')
exprCheck('a <<= 5')
exprCheck('a &= 5')
exprCheck('a and_eq 5')
exprCheck('a ^= 5')
exprCheck('a xor_eq 5')
exprCheck('a |= 5')
exprCheck('a or_eq 5')
def test_type_definitions():
@ -463,6 +474,15 @@ def test_build_domain_c(app, status, warning):
assert len(ws) == 0
@pytest.mark.sphinx(testroot='domain-c', confoverrides={'nitpicky': True})
def test_build_domain_c_anon_dup_decl(app, status, warning):
app.builder.build_all()
ws = filter_warnings(warning, "anon-dup-decl")
assert len(ws) == 2
assert "WARNING: c:identifier reference target not found: @a" in ws[0]
assert "WARNING: c:identifier reference target not found: @b" in ws[1]
def test_cfunction(app):
text = (".. c:function:: PyObject* "
"PyType_GenericAlloc(PyTypeObject *type, Py_ssize_t nitems)")

View File

@ -196,7 +196,9 @@ def test_expressions():
exprCheck('+5', 'psL5E')
exprCheck('-5', 'ngL5E')
exprCheck('!5', 'ntL5E')
exprCheck('not 5', 'ntL5E')
exprCheck('~5', 'coL5E')
exprCheck('compl 5', 'coL5E')
exprCheck('sizeof...(a)', 'sZ1a')
exprCheck('sizeof(T)', 'st1T')
exprCheck('sizeof -42', 'szngL42E')
@ -220,13 +222,19 @@ def test_expressions():
exprCheck('(int)2', 'cviL2E')
# binary op
exprCheck('5 || 42', 'ooL5EL42E')
exprCheck('5 or 42', 'ooL5EL42E')
exprCheck('5 && 42', 'aaL5EL42E')
exprCheck('5 and 42', 'aaL5EL42E')
exprCheck('5 | 42', 'orL5EL42E')
exprCheck('5 bitor 42', 'orL5EL42E')
exprCheck('5 ^ 42', 'eoL5EL42E')
exprCheck('5 xor 42', 'eoL5EL42E')
exprCheck('5 & 42', 'anL5EL42E')
exprCheck('5 bitand 42', 'anL5EL42E')
# ['==', '!=']
exprCheck('5 == 42', 'eqL5EL42E')
exprCheck('5 != 42', 'neL5EL42E')
exprCheck('5 not_eq 42', 'neL5EL42E')
# ['<=', '>=', '<', '>']
exprCheck('5 <= 42', 'leL5EL42E')
exprCheck('A <= 42', 'le1AL42E')
@ -260,8 +268,14 @@ def test_expressions():
exprCheck('a >>= 5', 'rS1aL5E')
exprCheck('a <<= 5', 'lS1aL5E')
exprCheck('a &= 5', 'aN1aL5E')
exprCheck('a and_eq 5', 'aN1aL5E')
exprCheck('a ^= 5', 'eO1aL5E')
exprCheck('a xor_eq 5', 'eO1aL5E')
exprCheck('a |= 5', 'oR1aL5E')
exprCheck('a or_eq 5', 'oR1aL5E')
exprCheck('a = {1, 2, 3}', 'aS1ailL1EL2EL3EE')
# comma operator
exprCheck('a, 5', 'cm1aL5E')
# Additional tests
# a < expression that starts with something that could be a template
@ -530,30 +544,62 @@ def test_function_definitions():
def test_operators():
check('function', 'void operator new [ ] ()',
{1: "new-array-operator", 2: "nav"}, output='void operator new[]()')
check('function', 'void operator delete ()',
{1: "delete-operator", 2: "dlv"}, output='void operator delete()')
check('function', 'operator bool() const',
{1: "castto-b-operatorC", 2: "NKcvbEv"}, output='operator bool() const')
check('function', 'void operator new()', {1: "new-operator", 2: "nwv"})
check('function', 'void operator new[]()', {1: "new-array-operator", 2: "nav"})
check('function', 'void operator delete()', {1: "delete-operator", 2: "dlv"})
check('function', 'void operator delete[]()', {1: "delete-array-operator", 2: "dav"})
check('function', 'operator bool() const', {1: "castto-b-operatorC", 2: "NKcvbEv"})
check('function', 'void operator""_udl()', {2: 'li4_udlv'})
check('function', 'void operator * ()',
{1: "mul-operator", 2: "mlv"}, output='void operator*()')
check('function', 'void operator - ()',
{1: "sub-operator", 2: "miv"}, output='void operator-()')
check('function', 'void operator + ()',
{1: "add-operator", 2: "plv"}, output='void operator+()')
check('function', 'void operator = ()',
{1: "assign-operator", 2: "aSv"}, output='void operator=()')
check('function', 'void operator / ()',
{1: "div-operator", 2: "dvv"}, output='void operator/()')
check('function', 'void operator % ()',
{1: "mod-operator", 2: "rmv"}, output='void operator%()')
check('function', 'void operator ! ()',
{1: "not-operator", 2: "ntv"}, output='void operator!()')
check('function', 'void operator "" _udl()',
{2: 'li4_udlv'}, output='void operator""_udl()')
check('function', 'void operator~()', {1: "inv-operator", 2: "cov"})
check('function', 'void operator compl()', {2: "cov"})
check('function', 'void operator+()', {1: "add-operator", 2: "plv"})
check('function', 'void operator-()', {1: "sub-operator", 2: "miv"})
check('function', 'void operator*()', {1: "mul-operator", 2: "mlv"})
check('function', 'void operator/()', {1: "div-operator", 2: "dvv"})
check('function', 'void operator%()', {1: "mod-operator", 2: "rmv"})
check('function', 'void operator&()', {1: "and-operator", 2: "anv"})
check('function', 'void operator bitand()', {2: "anv"})
check('function', 'void operator|()', {1: "or-operator", 2: "orv"})
check('function', 'void operator bitor()', {2: "orv"})
check('function', 'void operator^()', {1: "xor-operator", 2: "eov"})
check('function', 'void operator xor()', {2: "eov"})
check('function', 'void operator=()', {1: "assign-operator", 2: "aSv"})
check('function', 'void operator+=()', {1: "add-assign-operator", 2: "pLv"})
check('function', 'void operator-=()', {1: "sub-assign-operator", 2: "mIv"})
check('function', 'void operator*=()', {1: "mul-assign-operator", 2: "mLv"})
check('function', 'void operator/=()', {1: "div-assign-operator", 2: "dVv"})
check('function', 'void operator%=()', {1: "mod-assign-operator", 2: "rMv"})
check('function', 'void operator&=()', {1: "and-assign-operator", 2: "aNv"})
check('function', 'void operator and_eq()', {2: "aNv"})
check('function', 'void operator|=()', {1: "or-assign-operator", 2: "oRv"})
check('function', 'void operator or_eq()', {2: "oRv"})
check('function', 'void operator^=()', {1: "xor-assign-operator", 2: "eOv"})
check('function', 'void operator xor_eq()', {2: "eOv"})
check('function', 'void operator<<()', {1: "lshift-operator", 2: "lsv"})
check('function', 'void operator>>()', {1: "rshift-operator", 2: "rsv"})
check('function', 'void operator<<=()', {1: "lshift-assign-operator", 2: "lSv"})
check('function', 'void operator>>=()', {1: "rshift-assign-operator", 2: "rSv"})
check('function', 'void operator==()', {1: "eq-operator", 2: "eqv"})
check('function', 'void operator!=()', {1: "neq-operator", 2: "nev"})
check('function', 'void operator not_eq()', {2: "nev"})
check('function', 'void operator<()', {1: "lt-operator", 2: "ltv"})
check('function', 'void operator>()', {1: "gt-operator", 2: "gtv"})
check('function', 'void operator<=()', {1: "lte-operator", 2: "lev"})
check('function', 'void operator>=()', {1: "gte-operator", 2: "gev"})
check('function', 'void operator!()', {1: "not-operator", 2: "ntv"})
check('function', 'void operator not()', {2: "ntv"})
check('function', 'void operator&&()', {1: "sand-operator", 2: "aav"})
check('function', 'void operator and()', {2: "aav"})
check('function', 'void operator||()', {1: "sor-operator", 2: "oov"})
check('function', 'void operator or()', {2: "oov"})
check('function', 'void operator++()', {1: "inc-operator", 2: "ppv"})
check('function', 'void operator--()', {1: "dec-operator", 2: "mmv"})
check('function', 'void operator,()', {1: "comma-operator", 2: "cmv"})
check('function', 'void operator->*()', {1: "pointer-by-pointer-operator", 2: "pmv"})
check('function', 'void operator->()', {1: "pointer-operator", 2: "ptv"})
check('function', 'void operator()()', {1: "call-operator", 2: "clv"})
check('function', 'void operator[]()', {1: "subscript-operator", 2: "ixv"})
def test_class_definitions():
@ -582,6 +628,12 @@ def test_class_definitions():
{2: 'I0E7has_varI1TNSt6void_tIDTadN1T3varEEEEE'})
check('class', 'template<typename ...Ts> T<int (*)(Ts)...>',
{2: 'IDpE1TIJPFi2TsEEE'})
check('class', 'template<int... Is> T<(Is)...>',
{2: 'I_DpiE1TIJX(Is)EEE', 3: 'I_DpiE1TIJX2IsEEE'})
def test_union_definitions():
check('union', 'A', {2: "1A"})
@ -860,6 +912,15 @@ def test_build_domain_cpp_backslash_ok(app, status, warning):
assert "WARNING: Parsing of expression failed. Using fallback parser." in ws[0]
@pytest.mark.sphinx(testroot='domain-cpp', confoverrides={'nitpicky': True})
def test_build_domain_cpp_anon_dup_decl(app, status, warning):
app.builder.build_all()
ws = filter_warnings(warning, "anon-dup-decl")
assert len(ws) == 2
assert "WARNING: cpp:identifier reference target not found: @a" in ws[0]
assert "WARNING: cpp:identifier reference target not found: @b" in ws[1]
@pytest.mark.sphinx(testroot='domain-cpp')
def test_build_domain_cpp_misuse_of_roles(app, status, warning):
app.builder.build_all()