Merge branch '3.x'

Takeshi KOMIYA 2020-03-28 23:07:02 +09:00
commit 50fd2ff510
15 changed files with 467 additions and 224 deletions


@@ -47,6 +47,14 @@ Features added

 Bugs fixed
 ----------

+* #7364: autosummary: crashed when :confval:`autosummary_generate` is False
+* #7370: autosummary: raises UnboundLocalError when unknown module given
+* #7367: C++, alternate operator spellings are now supported.
+* C, alternate operator spellings are now supported.
+* #7368: C++, comma operator in expressions, pack expansion in template
+  argument lists, and more comprehensive error messages in some cases.
+* C, C++, fix crash and wrong duplicate warnings related to anon symbols.
+
 Testing
 --------


@@ -204,6 +204,14 @@ These are the basic steps needed to start developing on Sphinx.

 #. Wait for a core developer to review your changes.

+Translations
+~~~~~~~~~~~~
+
+The Sphinx core messages and documentations are translated on `Transifex
+<https://www.transifex.com/>`_.  Please join `Sphinx project on Transifex
+<https://www.transifex.com/sphinx-doc/>`_ and translate them.
+
 Core Developers
 ~~~~~~~~~~~~~~~
@@ -228,39 +236,6 @@ The following are some general guidelines for core developers:
   author in the commit message and any relevant :file:`CHANGES` entry.

-Locale updates
-~~~~~~~~~~~~~~
-
-The parts of messages in Sphinx that go into builds are translated into several
-locales.  The translations are kept as gettext ``.po`` files translated from the
-master template ``sphinx/locale/sphinx.pot``.
-
-Sphinx uses `Babel <http://babel.pocoo.org/en/latest/>`_ to extract messages
-and maintain the catalog files.  It is integrated in ``setup.py``:
-
-* Use ``python setup.py extract_messages`` to update the ``.pot`` template.
-* Use ``python setup.py update_catalog`` to update all existing language
-  catalogs in ``sphinx/locale/*/LC_MESSAGES`` with the current messages in the
-  template file.
-* Use ``python setup.py compile_catalog`` to compile the ``.po`` files to binary
-  ``.mo`` files and ``.js`` files.
-
-When an updated ``.po`` file is submitted, run compile_catalog to commit both
-the source and the compiled catalogs.
-
-When a new locale is submitted, add a new directory with the ISO 639-1 language
-identifier and put ``sphinx.po`` in there.  Don't forget to update the possible
-values for :confval:`language` in ``doc/usage/configuration.rst``.
-
-The Sphinx core messages can also be translated on `Transifex
-<https://www.transifex.com/>`_.  There exists a client tool named ``tx`` in the
-Python package "transifex_client", which can be used to pull translations in
-``.po`` format from Transifex.  To do this, go to ``sphinx/locale`` and then run
-``tx pull -f -l LANG`` where LANG is an existing language identifier.  It is
-good practice to run ``python setup.py update_catalog`` afterwards to make sure
-the ``.po`` file has the canonical Babel formatting.
-
 Coding Guide
 ------------
@@ -439,3 +414,42 @@ and other ``test_*.py`` files under ``tests`` directory.

 .. versionadded:: 1.8

    Sphinx also runs JavaScript tests.
+
+Release procedures
+------------------
+
+The release procedures are listed on ``utils/release-checklist``.
+
+Locale Updates
+~~~~~~~~~~~~~~
+
+The parts of messages in Sphinx that go into builds are translated into several
+locales.  The translations are kept as gettext ``.po`` files translated from the
+master template :file:`sphinx/locale/sphinx.pot`.
+
+Sphinx uses `Babel <http://babel.pocoo.org/en/latest/>`_ to extract messages
+and maintain the catalog files.  It is integrated in ``setup.py``:
+
+* Use ``python setup.py extract_messages`` to update the ``.pot`` template.
+* Use ``python setup.py update_catalog`` to update all existing language
+  catalogs in ``sphinx/locale/*/LC_MESSAGES`` with the current messages in the
+  template file.
+* Use ``python setup.py compile_catalog`` to compile the ``.po`` files to binary
+  ``.mo`` files and ``.js`` files.
+
+When an updated ``.po`` file is submitted, run compile_catalog to commit both
+the source and the compiled catalogs.
+
+When a new locale is submitted, add a new directory with the ISO 639-1 language
+identifier and put ``sphinx.po`` in there.  Don't forget to update the possible
+values for :confval:`language` in ``doc/usage/configuration.rst``.
+
+The Sphinx core messages can also be translated on `Transifex
+<https://www.transifex.com/>`_.  There exists a client tool named ``tx`` in the
+Python package "transifex_client", which can be used to pull translations in
+``.po`` format from Transifex.  To do this, go to ``sphinx/locale`` and then run
+``tx pull -f -l LANG`` where LANG is an existing language identifier.  It is
+good practice to run ``python setup.py update_catalog`` afterwards to make sure
+the ``.po`` file has the canonical Babel formatting.
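The whole workflow above revolves around gettext ``.po`` catalogs. As a rough, self-contained illustration of the file format that ``update_catalog`` keeps in sync (this is not Babel's implementation, and the sample strings are made up):

```python
import re

# A tiny, made-up .po fragment: a translated entry and an untranslated one.
PO_SAMPLE = '''\
msgid "Builds finished."
msgstr "Compilación terminada."

msgid "Testing"
msgstr ""
'''

def parse_po(text):
    """Return {msgid: msgstr} for simple single-line entries.

    Real .po files also carry comments, plural forms, and multi-line
    strings; Babel's ``read_po`` handles all of that.
    """
    pairs = re.findall(r'msgid "(.*)"\nmsgstr "(.*)"', text)
    return dict(pairs)

catalog = parse_po(PO_SAMPLE)
assert catalog["Builds finished."] == "Compilación terminada."
assert catalog["Testing"] == ""  # empty msgstr means "not yet translated"
```

An entry with an empty ``msgstr`` is exactly what ``extract_messages`` produces and what translators on Transifex fill in.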


@@ -248,6 +248,7 @@ Documentation using sphinx_rtd_theme
 * `PyPy <http://doc.pypy.org/>`__
 * `python-sqlparse <https://sqlparse.readthedocs.io/>`__
 * `PyVISA <https://pyvisa.readthedocs.io/>`__
+* `pyvista <https://docs.pyvista.org/>`__
 * `Read The Docs <https://docs.readthedocs.io/>`__
 * `ROCm Platform <https://rocm-documentation.readthedocs.io/>`__
 * `Free your information from their silos (French) <http://redaction-technique.org/>`__ (customized)


@@ -31,7 +31,7 @@ clean-backupfiles:
 clean-generated:
 	find . -name '.DS_Store' -exec rm -f {} +
 	rm -rf Sphinx.egg-info/
-	rm -rf dists/
+	rm -rf dist/
 	rm -rf doc/_build/
 	rm -f sphinx/pycode/*.pickle
 	rm -f utils/*3.py*
@@ -50,7 +50,7 @@ clean-buildfiles:
 .PHONY: clean-mypyfiles
 clean-mypyfiles:
-	rm -rf .mypy_cache/
+	rm -rf **/.mypy_cache/
 .PHONY: style-check
 style-check:


@@ -92,8 +92,8 @@
     create a customized documentation using Sphinx written by the matplotlib
     developers.{%endtrans%}</p>
-  <p>{%trans%}There is a <a href="http://docs.sphinx-users.jp/">Japanese translation</a>
-  of this documentation, thanks to the Japanese Sphinx user group.{%endtrans%}</p>
+  <p>{%trans%}There is a translation team in <a href="https://www.transifex.com/sphinx-doc/sphinx-doc/dashboard/">Transifex</a>
+  of this documentation, thanks to the Sphinx document translators.{%endtrans%}</p>
   <p>{%trans%}A Japanese book about Sphinx has been published by O'Reilly:
   <a href="https://www.oreilly.co.jp/books/9784873116488/">Sphinxをはじめよう /
   Learning Sphinx</a>.{%endtrans%}</p>


@@ -1,4 +1,4 @@
-:tocdepth: 2
+:tocdepth: 1

 .. default-role:: any


@@ -53,21 +53,21 @@ _keywords = [

 # these are ordered by preceedence
 _expression_bin_ops = [
-    ['||'],
-    ['&&'],
-    ['|'],
-    ['^'],
-    ['&'],
-    ['==', '!='],
+    ['||', 'or'],
+    ['&&', 'and'],
+    ['|', 'bitor'],
+    ['^', 'xor'],
+    ['&', 'bitand'],
+    ['==', '!=', 'not_eq'],
     ['<=', '>=', '<', '>'],
     ['<<', '>>'],
     ['+', '-'],
     ['*', '/', '%'],
     ['.*', '->*']
 ]
-_expression_unary_ops = ["++", "--", "*", "&", "+", "-", "!", "~"]
+_expression_unary_ops = ["++", "--", "*", "&", "+", "-", "!", "not", "~", "compl"]
 _expression_assignment_ops = ["=", "*=", "/=", "%=", "+=", "-=",
-                              ">>=", "<<=", "&=", "^=", "|="]
+                              ">>=", "<<=", "&=", "and_eq", "^=", "xor_eq", "|=", "or_eq"]
 _max_id = 1
 _id_prefix = [None, 'c.', 'Cv2.']
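The spellings added above (``and``, ``not``, ``xor_eq``, …) are ordinary identifiers, so the parser changes below switch from plain prefix matching (``skip_string``) to word-boundary matching (``skip_word``) for them. A minimal standalone sketch of the distinction (the real methods live on Sphinx's base parser class and track position internally):

```python
import re

def skip_string(source, pos, s):
    # Prefix match, as used for punctuation operators like '!=':
    return pos + len(s) if source.startswith(s, pos) else None

def skip_word(source, pos, word):
    # Word-boundary match, needed for keyword operators like 'not':
    m = re.compile(r'\b' + re.escape(word) + r'\b').match(source, pos)
    return m.end() if m else None

# 'not' must not be recognized at the start of an identifier:
assert skip_string("notable", 0, "not") == 3   # wrong for keyword operators
assert skip_word("notable", 0, "not") is None  # correct
assert skip_word("not x", 0, "not") == 3
```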
@@ -423,11 +423,16 @@ class ASTUnaryOpExpr(ASTExpression):
         self.expr = expr

     def _stringify(self, transform: StringifyTransform) -> str:
-        return transform(self.op) + transform(self.expr)
+        if self.op[0] in 'cn':
+            return transform(self.op) + " " + transform(self.expr)
+        else:
+            return transform(self.op) + transform(self.expr)

     def describe_signature(self, signode: TextElement, mode: str,
                            env: "BuildEnvironment", symbol: "Symbol") -> None:
         signode.append(nodes.Text(self.op))
+        if self.op[0] in 'cn':
+            signode.append(nodes.Text(" "))
         self.expr.describe_signature(signode, mode, env, symbol)
@@ -1670,7 +1675,7 @@ class Symbol:
                                           onMissingQualifiedSymbol,
                                           ancestorLookupType=None,
                                           matchSelf=False,
-                                          recurseInAnon=True,
+                                          recurseInAnon=False,
                                           searchInSiblings=False)
         assert lookupResult is not None  # we create symbols all the way, so that can't happen
         symbols = list(lookupResult.symbols)
@@ -1981,6 +1986,10 @@
     _prefix_keys = ('struct', 'enum', 'union')

+    @property
+    def language(self) -> str:
+        return 'C'
+
     def _parse_string(self) -> str:
         if self.current_char != '"':
             return None
@@ -2241,7 +2250,11 @@
         self.skip_ws()
         for op in _expression_unary_ops:
             # TODO: hmm, should we be able to backtrack here?
-            if self.skip_string(op):
+            if op[0] in 'cn':
+                res = self.skip_word(op)
+            else:
+                res = self.skip_string(op)
+            if res:
                 expr = self._parse_cast_expression()
                 return ASTUnaryOpExpr(op, expr)
         if self.skip_word_and_ws('sizeof'):
@@ -2313,8 +2326,12 @@
             pos = self.pos
             oneMore = False
             for op in _expression_bin_ops[opId]:
-                if not self.skip_string(op):
-                    continue
+                if op[0] in 'abcnox':
+                    if not self.skip_word(op):
+                        continue
+                else:
+                    if not self.skip_string(op):
+                        continue
                 if op == '&' and self.current_char == '&':
                     # don't split the && 'token'
                     self.pos -= 1
@@ -2353,8 +2370,12 @@
             oneMore = False
             self.skip_ws()
             for op in _expression_assignment_ops:
-                if not self.skip_string(op):
-                    continue
+                if op[0] in 'abcnox':
+                    if not self.skip_word(op):
+                        continue
+                else:
+                    if not self.skip_string(op):
+                        continue
                 expr = self._parse_logical_or_expression()
                 exprs.append(expr)
                 ops.append(op)
@@ -2680,7 +2701,7 @@
                                   restrict=restrict, volatile=volatile, const=const,
                                   attrs=attrs)
         if typed and self.current_char == '(':  # note: peeking, not skipping
-            # maybe this is the beginning of params,try that first,
+            # maybe this is the beginning of params, try that first,
             # otherwise assume it's noptr->declarator > ( ptr-declarator )
             pos = self.pos
             try:
@@ -2689,7 +2710,10 @@
                                                            typed)
                 return res
             except DefinitionError as exParamQual:
-                prevErrors.append((exParamQual, "If declId and parameters"))
+                msg = "If declarator-id with parameters"
+                if paramMode == 'function':
+                    msg += " (e.g., 'void f(int arg)')"
+                prevErrors.append((exParamQual, msg))
                 self.pos = pos
             try:
                 assert self.current_char == '('
@@ -2706,7 +2730,10 @@
                 return ASTDeclaratorParen(inner=inner, next=next)
             except DefinitionError as exNoPtrParen:
                 self.pos = pos
-                prevErrors.append((exNoPtrParen, "If parenthesis in noptr-declarator"))
+                msg = "If parenthesis in noptr-declarator"
+                if paramMode == 'function':
+                    msg += " (e.g., 'void (*f(int arg))(double)')"
+                prevErrors.append((exNoPtrParen, msg))
             header = "Error in declarator"
             raise self._make_multi_error(prevErrors, header)
         pos = self.pos


@@ -306,6 +306,7 @@ _operator_re = re.compile(r'''(?x)
         |  ->\*?  |  \,
         |  (<<|>>)=?  |  &&  |  \|\|
         |  [!<>=/*%+|&^~-]=?
+        |  (\b(and|and_eq|bitand|bitor|compl|not|not_eq|or|or_eq|xor|xor_eq)\b)
 ''')
 _fold_operator_re = re.compile(r'''(?x)
         ->\*  |  \.\*  |  \,
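The new alternation only matches the eleven alternative tokens as whole words (note the ``\b`` anchors), so an identifier such as ``android`` is not cut into ``and`` plus ``roid``. A reduced sketch of just that branch:

```python
import re

# Just the alternative-token branch added to _operator_re above:
alt_token = re.compile(
    r'\b(and|and_eq|bitand|bitor|compl|not|not_eq|or|or_eq|xor|xor_eq)\b')

# Backtracking past 'xor' finds the longer 'xor_eq' spelling:
assert alt_token.match('xor_eq rhs').group(1) == 'xor_eq'
# No match inside an ordinary identifier:
assert alt_token.match('android') is None
```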
@@ -464,37 +465,37 @@ _id_operator_v2 = {
     # '-(unary)' : 'ng',
     # '&(unary)' : 'ad',
     # '*(unary)' : 'de',
-    '~': 'co',
+    '~': 'co', 'compl': 'co',
     '+': 'pl',
     '-': 'mi',
     '*': 'ml',
     '/': 'dv',
     '%': 'rm',
-    '&': 'an',
-    '|': 'or',
-    '^': 'eo',
+    '&': 'an', 'bitand': 'an',
+    '|': 'or', 'bitor': 'or',
+    '^': 'eo', 'xor': 'eo',
     '=': 'aS',
     '+=': 'pL',
     '-=': 'mI',
     '*=': 'mL',
     '/=': 'dV',
     '%=': 'rM',
-    '&=': 'aN',
-    '|=': 'oR',
-    '^=': 'eO',
+    '&=': 'aN', 'and_eq': 'aN',
+    '|=': 'oR', 'or_eq': 'oR',
+    '^=': 'eO', 'xor_eq': 'eO',
     '<<': 'ls',
     '>>': 'rs',
     '<<=': 'lS',
     '>>=': 'rS',
     '==': 'eq',
-    '!=': 'ne',
+    '!=': 'ne', 'not_eq': 'ne',
     '<': 'lt',
     '>': 'gt',
     '<=': 'le',
     '>=': 'ge',
-    '!': 'nt',
-    '&&': 'aa',
-    '||': 'oo',
+    '!': 'nt', 'not': 'nt',
+    '&&': 'aa', 'and': 'aa',
+    '||': 'oo', 'or': 'oo',
     '++': 'pp',
     '--': 'mm',
     ',': 'cm',
@@ -511,8 +512,8 @@ _id_operator_unary_v2 = {
     '&': 'ad',
     '+': 'ps',
     '-': 'ng',
-    '!': 'nt',
-    '~': 'co'
+    '!': 'nt', 'not': 'nt',
+    '~': 'co', 'compl': 'co'
 }
 _id_char_from_prefix = {
     None: 'c', 'u8': 'c',
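Each alternative spelling is simply aliased to the v2 (Itanium-style) id of its symbolic form, so both spellings mangle identically. A small sketch of the version-2 branch of ``ASTUnaryOpExpr.get_id`` over an excerpt of the table above (``'1x'`` stands in for an already-computed operand id):

```python
# Excerpt of _id_operator_unary_v2 with the aliases added above:
_id_operator_unary_v2 = {'!': 'nt', 'not': 'nt', '~': 'co', 'compl': 'co'}

def unary_expr_id(op, operand_id):
    # Sketch of ASTUnaryOpExpr.get_id for id version 2:
    # operator id followed by the operand's id.
    return _id_operator_unary_v2[op] + operand_id

assert unary_expr_id('not', '1x') == 'nt1x'
# 'not x' and '!x' produce the same id, so cross-references stay stable:
assert unary_expr_id('not', '1x') == unary_expr_id('!', '1x')
```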
@@ -520,21 +521,21 @@
 }  # type: Dict[Any, str]

 # these are ordered by preceedence
 _expression_bin_ops = [
-    ['||'],
-    ['&&'],
-    ['|'],
-    ['^'],
-    ['&'],
-    ['==', '!='],
+    ['||', 'or'],
+    ['&&', 'and'],
+    ['|', 'bitor'],
+    ['^', 'xor'],
+    ['&', 'bitand'],
+    ['==', '!=', 'not_eq'],
     ['<=', '>=', '<', '>'],
     ['<<', '>>'],
     ['+', '-'],
     ['*', '/', '%'],
     ['.*', '->*']
 ]
-_expression_unary_ops = ["++", "--", "*", "&", "+", "-", "!", "~"]
+_expression_unary_ops = ["++", "--", "*", "&", "+", "-", "!", "not", "~", "compl"]
 _expression_assignment_ops = ["=", "*=", "/=", "%=", "+=", "-=",
-                              ">>=", "<<=", "&=", "^=", "|="]
+                              ">>=", "<<=", "&=", "and_eq", "^=", "|=", "xor_eq", "or_eq"]
 _id_explicit_cast = {
     'dynamic_cast': 'dc',
     'static_cast': 'sc',
@@ -1260,7 +1261,10 @@ class ASTUnaryOpExpr(ASTExpression):
         self.expr = expr

     def _stringify(self, transform: StringifyTransform) -> str:
-        return transform(self.op) + transform(self.expr)
+        if self.op[0] in 'cn':
+            return transform(self.op) + " " + transform(self.expr)
+        else:
+            return transform(self.op) + transform(self.expr)

     def get_id(self, version: int) -> str:
         return _id_operator_unary_v2[self.op] + self.expr.get_id(version)
@@ -1268,6 +1272,8 @@
     def describe_signature(self, signode: TextElement, mode: str,
                            env: "BuildEnvironment", symbol: "Symbol") -> None:
         signode.append(nodes.Text(self.op))
+        if self.op[0] in 'cn':
+            signode.append(nodes.Text(' '))
         self.expr.describe_signature(signode, mode, env, symbol)
@@ -1499,8 +1505,38 @@ class ASTBinOpExpr(ASTExpression):
             self.exprs[i].describe_signature(signode, mode, env, symbol)

+class ASTBracedInitList(ASTBase):
+    def __init__(self, exprs: List[Union[ASTExpression, "ASTBracedInitList"]],
+                 trailingComma: bool) -> None:
+        self.exprs = exprs
+        self.trailingComma = trailingComma
+
+    def get_id(self, version: int) -> str:
+        return "il%sE" % ''.join(e.get_id(version) for e in self.exprs)
+
+    def _stringify(self, transform: StringifyTransform) -> str:
+        exprs = [transform(e) for e in self.exprs]
+        trailingComma = ',' if self.trailingComma else ''
+        return '{%s%s}' % (', '.join(exprs), trailingComma)
+
+    def describe_signature(self, signode: TextElement, mode: str,
+                           env: "BuildEnvironment", symbol: "Symbol") -> None:
+        verify_description_mode(mode)
+        signode.append(nodes.Text('{'))
+        first = True
+        for e in self.exprs:
+            if not first:
+                signode.append(nodes.Text(', '))
+            else:
+                first = False
+            e.describe_signature(signode, mode, env, symbol)
+        if self.trailingComma:
+            signode.append(nodes.Text(','))
+        signode.append(nodes.Text('}'))
+
 class ASTAssignmentExpr(ASTExpression):
-    def __init__(self, exprs: List[ASTExpression], ops: List[str]):
+    def __init__(self, exprs: List[Union[ASTExpression, ASTBracedInitList]], ops: List[str]):
         assert len(exprs) > 0
         assert len(exprs) == len(ops) + 1
         self.exprs = exprs
@@ -1534,6 +1570,31 @@ class ASTAssignmentExpr(ASTExpression):
             self.exprs[i].describe_signature(signode, mode, env, symbol)

+class ASTCommaExpr(ASTExpression):
+    def __init__(self, exprs: List[ASTExpression]):
+        assert len(exprs) > 0
+        self.exprs = exprs
+
+    def _stringify(self, transform: StringifyTransform) -> str:
+        return ', '.join(transform(e) for e in self.exprs)
+
+    def get_id(self, version: int) -> str:
+        id_ = _id_operator_v2[',']
+        res = []
+        for i in range(len(self.exprs) - 1):
+            res.append(id_)
+            res.append(self.exprs[i].get_id(version))
+        res.append(self.exprs[-1].get_id(version))
+        return ''.join(res)
+
+    def describe_signature(self, signode: TextElement, mode: str,
+                           env: "BuildEnvironment", symbol: "Symbol") -> None:
+        self.exprs[0].describe_signature(signode, mode, env, symbol)
+        for i in range(1, len(self.exprs)):
+            signode.append(nodes.Text(', '))
+            self.exprs[i].describe_signature(signode, mode, env, symbol)
+
 class ASTFallbackExpr(ASTExpression):
     def __init__(self, expr: str):
         self.expr = expr
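``ASTCommaExpr.get_id`` emits the comma operator id (``'cm'``) once per comma, in prefix form, before the corresponding operand ids. A standalone sketch over pre-computed operand ids (the ``'1a'``-style ids are placeholders):

```python
def comma_expr_id(operand_ids, comma_id='cm'):
    # Sketch of ASTCommaExpr.get_id: one 'cm' per comma, in prefix form,
    # each followed by its left operand; the last operand closes the id.
    res = []
    for i in range(len(operand_ids) - 1):
        res.append(comma_id)
        res.append(operand_ids[i])
    res.append(operand_ids[-1])
    return ''.join(res)

assert comma_expr_id(['1a']) == '1a'  # single expression: no comma id
assert comma_expr_id(['1a', '1b', '1c']) == 'cm1acm1b1c'
```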
@@ -1584,6 +1645,8 @@ class ASTOperatorBuildIn(ASTOperator):
     def get_id(self, version: int) -> str:
         if version == 1:
             ids = _id_operator_v1
+            if self.op not in ids:
+                raise NoOldIdError()
         else:
             ids = _id_operator_v2
         if self.op not in ids:
@@ -1592,7 +1655,7 @@ class ASTOperatorBuildIn(ASTOperator):
         return ids[self.op]

     def _stringify(self, transform: StringifyTransform) -> str:
-        if self.op in ('new', 'new[]', 'delete', 'delete[]'):
+        if self.op in ('new', 'new[]', 'delete', 'delete[]') or self.op[0] in "abcnox":
             return 'operator ' + self.op
         else:
             return 'operator' + self.op
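The extra ``self.op[0] in "abcnox"`` test covers the alternative spellings: each starts with one of those letters, and, like ``new`` and ``delete``, they need a separating space after ``operator`` to stay valid C++. Isolated as a plain function:

```python
def stringify_operator(op):
    # Sketch of ASTOperatorBuildIn._stringify: keyword-like operators
    # (new/delete and the alternative spellings) take a separating space.
    if op in ('new', 'new[]', 'delete', 'delete[]') or op[0] in "abcnox":
        return 'operator ' + op
    return 'operator' + op

assert stringify_operator('not_eq') == 'operator not_eq'
assert stringify_operator('!=') == 'operator!='
assert stringify_operator('new[]') == 'operator new[]'
```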
@@ -1650,9 +1713,11 @@ class ASTTemplateArgConstant(ASTBase):

 class ASTTemplateArgs(ASTBase):
-    def __init__(self, args: List[Union["ASTType", ASTTemplateArgConstant]]) -> None:
+    def __init__(self, args: List[Union["ASTType", ASTTemplateArgConstant]],
+                 packExpansion: bool) -> None:
         assert args is not None
         self.args = args
+        self.packExpansion = packExpansion

     def get_id(self, version: int) -> str:
         if version == 1:
@@ -1664,13 +1729,21 @@
         res = []
         res.append('I')
-        for a in self.args:
-            res.append(a.get_id(version))
+        if len(self.args) > 0:
+            for a in self.args[:-1]:
+                res.append(a.get_id(version))
+            if self.packExpansion:
+                res.append('J')
+            res.append(self.args[-1].get_id(version))
+            if self.packExpansion:
+                res.append('E')
         res.append('E')
         return ''.join(res)

     def _stringify(self, transform: StringifyTransform) -> str:
         res = ', '.join(transform(a) for a in self.args)
+        if self.packExpansion:
+            res += '...'
         return '<' + res + '>'

     def describe_signature(self, signode: TextElement, mode: str,
@@ -1683,6 +1756,8 @@
             signode += nodes.Text(', ')
             first = False
             a.describe_signature(signode, 'markType', env, symbol=symbol)
+        if self.packExpansion:
+            signode += nodes.Text('...')
         signode += nodes.Text('>')
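In the v2 id scheme the argument list is wrapped in ``I``…``E``, and a trailing pack expansion additionally wraps the last argument in ``J``…``E``. The new ``get_id`` branch, extracted as a standalone function over pre-computed argument ids (the single-letter ids below are placeholders):

```python
def template_args_id(arg_ids, pack_expansion):
    # Sketch of ASTTemplateArgs.get_id (version 2) from the hunk above:
    # arguments wrapped in I...E, trailing pack wrapped in J...E.
    res = ['I']
    if len(arg_ids) > 0:
        for a in arg_ids[:-1]:
            res.append(a)
        if pack_expansion:
            res.append('J')
        res.append(arg_ids[-1])
        if pack_expansion:
            res.append('E')
    res.append('E')
    return ''.join(res)

assert template_args_id(['i'], False) == 'IiE'    # e.g. <int>
assert template_args_id(['i'], True) == 'IJiEE'   # e.g. <Args...>
assert template_args_id([], False) == 'IE'        # empty <>
```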
@@ -2633,7 +2708,7 @@ class ASTDeclaratorParen(ASTDeclarator):
 ##############################################################################################

 class ASTPackExpansionExpr(ASTExpression):
-    def __init__(self, expr: ASTExpression):
+    def __init__(self, expr: Union[ASTExpression, ASTBracedInitList]):
         self.expr = expr

     def _stringify(self, transform: StringifyTransform) -> str:
@@ -2650,7 +2725,7 @@
 class ASTParenExprList(ASTBase):
-    def __init__(self, exprs: List[ASTExpression]) -> None:
+    def __init__(self, exprs: List[Union[ASTExpression, ASTBracedInitList]]) -> None:
         self.exprs = exprs

     def get_id(self, version: int) -> str:
@@ -2674,35 +2749,6 @@ class ASTParenExprList(ASTBase):
         signode.append(nodes.Text(')'))

-class ASTBracedInitList(ASTBase):
-    def __init__(self, exprs: List[ASTExpression], trailingComma: bool) -> None:
-        self.exprs = exprs
-        self.trailingComma = trailingComma
-
-    def get_id(self, version: int) -> str:
-        return "il%sE" % ''.join(e.get_id(version) for e in self.exprs)
-
-    def _stringify(self, transform: StringifyTransform) -> str:
-        exprs = [transform(e) for e in self.exprs]
-        trailingComma = ',' if self.trailingComma else ''
-        return '{%s%s}' % (', '.join(exprs), trailingComma)
-
-    def describe_signature(self, signode: TextElement, mode: str,
-                           env: "BuildEnvironment", symbol: "Symbol") -> None:
-        verify_description_mode(mode)
-        signode.append(nodes.Text('{'))
-        first = True
-        for e in self.exprs:
-            if not first:
-                signode.append(nodes.Text(', '))
-            else:
-                first = False
-            e.describe_signature(signode, mode, env, symbol)
-        if self.trailingComma:
-            signode.append(nodes.Text(','))
-        signode.append(nodes.Text('}'))
-
 class ASTInitializer(ASTBase):
     def __init__(self, value: Union[ASTExpression, ASTBracedInitList],
                  hasAssign: bool = True) -> None:
@@ -4113,7 +4159,7 @@ class Symbol:
                                           ancestorLookupType=None,
                                           templateShorthand=False,
                                           matchSelf=False,
-                                          recurseInAnon=True,
+                                          recurseInAnon=False,
                                           correctPrimaryTemplateArgs=True,
                                           searchInSiblings=False)
         assert lookupResult is not None  # we create symbols all the way, so that can't happen
@@ -4568,6 +4614,10 @@ class DefinitionParser(BaseParser):
         super().__init__(definition, location=location)
         self.config = config

+    @property
+    def language(self) -> str:
+        return 'C++'
+
     def _parse_string(self) -> str:
         if self.current_char != '"':
             return None
@@ -4743,7 +4793,7 @@
             self.pos = pos
             # fall back to a paren expression
             try:
-                res = self._parse_expression(inTemplate=False)
+                res = self._parse_expression()
                 self.skip_ws()
                 if not self.skip_string(')'):
                     self.fail("Expected ')' in end of parenthesized expression.")
@@ -4791,7 +4841,9 @@
             return None

     def _parse_initializer_list(self, name: str, open: str, close: str
-                                ) -> Tuple[List[ASTExpression], bool]:
+                                ) -> Tuple[List[Union[ASTExpression,
+                                                      ASTBracedInitList]],
+                                           bool]:
         # Parse open and close with the actual initializer-list inbetween
         # -> initializer-clause '...'[opt]
         #  | initializer-list ',' initializer-clause '...'[opt]
@@ -4801,11 +4853,11 @@
         if self.skip_string(close):
             return [], False

-        exprs = []  # type: List[ASTExpression]
+        exprs = []  # type: List[Union[ASTExpression, ASTBracedInitList]]
         trailingComma = False
         while True:
             self.skip_ws()
-            expr = self._parse_expression(inTemplate=False)
+            expr = self._parse_initializer_clause()
             self.skip_ws()
             if self.skip_string('...'):
                 exprs.append(ASTPackExpansionExpr(expr))
@@ -4835,6 +4887,12 @@
             return None
         return ASTParenExprList(exprs)

+    def _parse_initializer_clause(self) -> Union[ASTExpression, ASTBracedInitList]:
+        bracedInitList = self._parse_braced_init_list()
+        if bracedInitList is not None:
+            return bracedInitList
+        return self._parse_assignment_expression(inTemplate=False)
+
     def _parse_braced_init_list(self) -> ASTBracedInitList:
         # -> '{' initializer-list ','[opt] '}'
         #  | '{' '}'
@@ -4894,7 +4952,7 @@
             self.fail("Expected '(' in '%s'." % cast)

         def parser():
-            return self._parse_expression(inTemplate=False)
+            return self._parse_expression()
         expr = self._parse_expression_fallback([')'], parser)
         self.skip_ws()
         if not self.skip_string(")"):
@@ -4915,7 +4973,7 @@
             try:
                 def parser():
-                    return self._parse_expression(inTemplate=False)
+                    return self._parse_expression()
                 expr = self._parse_expression_fallback([')'], parser)
                 prefix = ASTTypeId(expr, isType=False)
                 if not self.skip_string(')'):
@@ -4962,7 +5020,7 @@
             self.skip_ws()
             if prefixType in ['expr', 'cast', 'typeid']:
                 if self.skip_string_and_ws('['):
-                    expr = self._parse_expression(inTemplate=False)
+                    expr = self._parse_expression()
                     self.skip_ws()
                     if not self.skip_string(']'):
                         self.fail("Expected ']' in end of postfix expression.")
@@ -5016,7 +5074,11 @@
         self.skip_ws()
         for op in _expression_unary_ops:
             # TODO: hmm, should we be able to backtrack here?
-            if self.skip_string(op):
+            if op[0] in 'cn':
+                res = self.skip_word(op)
+            else:
+                res = self.skip_string(op)
+            if res:
                 expr = self._parse_cast_expression()
                 return ASTUnaryOpExpr(op, expr)
         if self.skip_word_and_ws('sizeof'):
@@ -5049,7 +5111,7 @@
         if self.skip_word_and_ws('noexcept'):
             if not self.skip_string_and_ws('('):
                 self.fail("Expecting '(' after 'noexcept'.")
-            expr = self._parse_expression(inTemplate=False)
+            expr = self._parse_expression()
             self.skip_ws()
             if not self.skip_string(')'):
                 self.fail("Expecting ')' to end 'noexcept'.")
@@ -5144,8 +5206,12 @@
             pos = self.pos
             oneMore = False
             for op in _expression_bin_ops[opId]:
-                if not self.skip_string(op):
-                    continue
+                if op[0] in 'abcnox':
+                    if not self.skip_word(op):
+                        continue
+                else:
+                    if not self.skip_string(op):
+                        continue
                 if op == '&' and self.current_char == '&':
                     # don't split the && 'token'
                     self.pos -= 1
@@ -5178,7 +5244,7 @@ class DefinitionParser(BaseParser):
         # logical-or-expression
         # | logical-or-expression "?" expression ":" assignment-expression
         # | logical-or-expression assignment-operator initializer-clause
-        exprs = []
+        exprs = []  # type: List[Union[ASTExpression, ASTBracedInitList]]
         ops = []
         orExpr = self._parse_logical_or_expression(inTemplate=inTemplate)
         exprs.append(orExpr)
@@ -5187,9 +5253,13 @@ class DefinitionParser(BaseParser):
             oneMore = False
             self.skip_ws()
             for op in _expression_assignment_ops:
-                if not self.skip_string(op):
-                    continue
-                expr = self._parse_logical_or_expression(False)
+                if op[0] in 'anox':
+                    if not self.skip_word(op):
+                        continue
+                else:
+                    if not self.skip_string(op):
+                        continue
+                expr = self._parse_initializer_clause()
                 exprs.append(expr)
                 ops.append(op)
                 oneMore = True
@@ -5206,11 +5276,19 @@ class DefinitionParser(BaseParser):
         # TODO: use _parse_conditional_expression_tail
         return orExpr

-    def _parse_expression(self, inTemplate: bool) -> ASTExpression:
+    def _parse_expression(self) -> ASTExpression:
         # -> assignment-expression
         #  | expression "," assignment-expresion
-        # TODO: actually parse the second production
-        return self._parse_assignment_expression(inTemplate=inTemplate)
+        exprs = [self._parse_assignment_expression(inTemplate=False)]
+        while True:
+            self.skip_ws()
+            if not self.skip_string(','):
+                break
+            exprs.append(self._parse_assignment_expression(inTemplate=False))
+        if len(exprs) == 1:
+            return exprs[0]
+        else:
+            return ASTCommaExpr(exprs)
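The rewritten `_parse_expression` implements the previously unhandled production `expression "," assignment-expression` by collecting sub-expressions left to right and wrapping them in a comma-expression node only when more than one was parsed. The same shape can be sketched independently of the Sphinx AST classes (operands are reduced to single letters here for illustration):

```python
def parse_expression(text, pos=0):
    # Minimal stand-in for the patched _parse_expression: parse
    # single-letter operands separated by ',' and fold them into a
    # ('comma', [...]) node, keeping the single-operand case unwrapped
    # exactly as the patch does with ASTCommaExpr.
    exprs = [text[pos]]
    pos += 1
    while pos < len(text) and text[pos] == ',':
        pos += 1              # consume ','
        exprs.append(text[pos])
        pos += 1
    if len(exprs) == 1:
        return exprs[0]
    return ('comma', exprs)

assert parse_expression('a') == 'a'
assert parse_expression('a,b,c') == ('comma', ['a', 'b', 'c'])
```

Keeping the one-element case unwrapped means existing callers that only ever see a single assignment-expression are unaffected by the new node type.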
     def _parse_expression_fallback(self, end: List[str],
                                    parser: Callable[[], ASTExpression],
@@ -5289,13 +5367,21 @@ class DefinitionParser(BaseParser):
         return ASTOperatorType(type)

     def _parse_template_argument_list(self) -> ASTTemplateArgs:
+        # template-argument-list: (but we include the < and > here
+        #   template-argument ...[opt]
+        #   template-argument-list, template-argument ...[opt]
+        # template-argument:
+        #   constant-expression
+        #   type-id
+        #   id-expression
         self.skip_ws()
         if not self.skip_string_and_ws('<'):
             return None
         if self.skip_string('>'):
-            return ASTTemplateArgs([])
+            return ASTTemplateArgs([], False)
         prevErrors = []
         templateArgs = []  # type: List[Union[ASTType, ASTTemplateArgConstant]]
+        packExpansion = False
         while 1:
             pos = self.pos
             parsedComma = False
@@ -5303,31 +5389,35 @@ class DefinitionParser(BaseParser):
             try:
                 type = self._parse_type(named=False)
                 self.skip_ws()
-                if self.skip_string('>'):
+                if self.skip_string_and_ws('...'):
+                    packExpansion = True
+                    parsedEnd = True
+                    if not self.skip_string('>'):
+                        self.fail('Expected ">" after "..." in template argument list.')
+                elif self.skip_string('>'):
                     parsedEnd = True
                 elif self.skip_string(','):
                     parsedComma = True
                 else:
-                    self.fail('Expected ">" or "," in template argument list.')
+                    self.fail('Expected "...>", ">" or "," in template argument list.')
                 templateArgs.append(type)
             except DefinitionError as e:
                 prevErrors.append((e, "If type argument"))
                 self.pos = pos
                 try:
-                    # actually here we shouldn't use the fallback parser (hence allow=False),
-                    # because if actually took the < in an expression, then we _will_ fail,
-                    # which is handled elsewhere. E.g., :cpp:expr:`A <= 0`.
-                    def parser():
-                        return self._parse_constant_expression(inTemplate=True)
-                    value = self._parse_expression_fallback(
-                        [',', '>'], parser, allow=False)
+                    value = self._parse_constant_expression(inTemplate=True)
                     self.skip_ws()
-                    if self.skip_string('>'):
+                    if self.skip_string_and_ws('...'):
+                        packExpansion = True
+                        parsedEnd = True
+                        if not self.skip_string('>'):
+                            self.fail('Expected ">" after "..." in template argument list.')
+                    elif self.skip_string('>'):
                         parsedEnd = True
                     elif self.skip_string(','):
                         parsedComma = True
                     else:
-                        self.fail('Expected ">" or "," in template argument list.')
+                        self.fail('Expected "...>", ">" or "," in template argument list.')
                     templateArgs.append(ASTTemplateArgConstant(value))
                 except DefinitionError as e:
                     self.pos = pos
@@ -5337,7 +5427,9 @@ class DefinitionParser(BaseParser):
             if parsedEnd:
                 assert not parsedComma
                 break
-        return ASTTemplateArgs(templateArgs)
+            else:
+                assert not packExpansion
+        return ASTTemplateArgs(templateArgs, packExpansion)

     def _parse_nested_name(self, memberPointer: bool = False) -> ASTNestedName:
         names = []  # type: List[ASTNestedNameElement]
@@ -5427,7 +5519,7 @@ class DefinitionParser(BaseParser):
             if not self.skip_string(')'):
                 self.fail("Expected ')' after 'decltype(auto'.")
             return ASTTrailingTypeSpecDecltypeAuto()
-        expr = self._parse_expression(inTemplate=False)
+        expr = self._parse_expression()
         self.skip_ws()
         if not self.skip_string(')'):
             self.fail("Expected ')' after 'decltype(<expr>'.")
@@ -5440,7 +5532,6 @@ class DefinitionParser(BaseParser):
             if self.skip_word_and_ws(k):
                 prefix = k
                 break
-
         nestedName = self._parse_nested_name()
         return ASTTrailingTypeSpecName(prefix, nestedName)
@@ -5450,7 +5541,7 @@ class DefinitionParser(BaseParser):
         self.skip_ws()
         if not self.skip_string('('):
             if paramMode == 'function':
-                self.fail('Expecting "(" in parameters_and_qualifiers.')
+                self.fail('Expecting "(" in parameters-and-qualifiers.')
             else:
                 return None
         args = []
@@ -5463,7 +5554,7 @@ class DefinitionParser(BaseParser):
                 self.skip_ws()
                 if not self.skip_string(')'):
                     self.fail('Expected ")" after "..." in '
-                              'parameters_and_qualifiers.')
+                              'parameters-and-qualifiers.')
                 break
             # note: it seems that function arguments can always be named,
             # even in function pointers and similar.
@@ -5478,7 +5569,7 @@ class DefinitionParser(BaseParser):
                     break
                 else:
                     self.fail(
-                        'Expecting "," or ")" in parameters_and_qualifiers, '
+                        'Expecting "," or ")" in parameters-and-qualifiers, '
                         'got "%s".' % self.current_char)

         # TODO: why did we have this bail-out?
@@ -5509,7 +5600,7 @@ class DefinitionParser(BaseParser):
                 exceptionSpec = 'noexcept'
                 self.skip_ws()
                 if self.skip_string('('):
-                    self.fail('Parameterised "noexcept" not implemented.')
+                    self.fail('Parameterised "noexcept" not yet implemented.')

         self.skip_ws()
         override = self.skip_word_and_ws('override')
@@ -5672,7 +5763,7 @@ class DefinitionParser(BaseParser):
                     continue

                 def parser():
-                    return self._parse_expression(inTemplate=False)
+                    return self._parse_expression()
                 value = self._parse_expression_fallback([']'], parser)
                 if not self.skip_string(']'):
                     self.fail("Expected ']' in end of array operator.")
@@ -5734,6 +5825,42 @@ class DefinitionParser(BaseParser):
         if typed and self.skip_string("..."):
             next = self._parse_declarator(named, paramMode, False)
             return ASTDeclaratorParamPack(next=next)
+        if typed and self.current_char == '(':  # note: peeking, not skipping
+            if paramMode == "operatorCast":
+                # TODO: we should be able to parse cast operators which return
+                # function pointers. For now, just hax it and ignore.
+                return ASTDeclaratorNameParamQual(declId=None, arrayOps=[],
+                                                  paramQual=None)
+            # maybe this is the beginning of params and quals,try that first,
+            # otherwise assume it's noptr->declarator > ( ptr-declarator )
+            pos = self.pos
+            try:
+                # assume this is params and quals
+                res = self._parse_declarator_name_suffix(named, paramMode,
+                                                         typed)
+                return res
+            except DefinitionError as exParamQual:
+                prevErrors.append((exParamQual,
+                                   "If declarator-id with parameters-and-qualifiers"))
+                self.pos = pos
+                try:
+                    assert self.current_char == '('
+                    self.skip_string('(')
+                    # TODO: hmm, if there is a name, it must be in inner, right?
+                    # TODO: hmm, if there must be parameters, they must be
+                    #       inside, right?
+                    inner = self._parse_declarator(named, paramMode, typed)
+                    if not self.skip_string(')'):
+                        self.fail("Expected ')' in \"( ptr-declarator )\"")
+                    next = self._parse_declarator(named=False,
+                                                  paramMode="type",
+                                                  typed=typed)
+                    return ASTDeclaratorParen(inner=inner, next=next)
+                except DefinitionError as exNoPtrParen:
+                    self.pos = pos
+                    prevErrors.append((exNoPtrParen, "If parenthesis in noptr-declarator"))
+                    header = "Error in declarator"
+                    raise self._make_multi_error(prevErrors, header)
         if typed:  # pointer to member
             pos = self.pos
             try:
@@ -5760,48 +5887,18 @@ class DefinitionParser(BaseParser):
                     break
             next = self._parse_declarator(named, paramMode, typed)
             return ASTDeclaratorMemPtr(name, const, volatile, next=next)
-        if typed and self.current_char == '(':  # note: peeking, not skipping
-            if paramMode == "operatorCast":
-                # TODO: we should be able to parse cast operators which return
-                # function pointers. For now, just hax it and ignore.
-                return ASTDeclaratorNameParamQual(declId=None, arrayOps=[],
-                                                  paramQual=None)
-            # maybe this is the beginning of params and quals,try that first,
-            # otherwise assume it's noptr->declarator > ( ptr-declarator )
-            pos = self.pos
-            try:
-                # assume this is params and quals
-                res = self._parse_declarator_name_suffix(named, paramMode,
-                                                         typed)
-                return res
-            except DefinitionError as exParamQual:
-                prevErrors.append((exParamQual, "If declId, parameters, and qualifiers"))
-                self.pos = pos
-                try:
-                    assert self.current_char == '('
-                    self.skip_string('(')
-                    # TODO: hmm, if there is a name, it must be in inner, right?
-                    # TODO: hmm, if there must be parameters, they must b
-                    #       inside, right?
-                    inner = self._parse_declarator(named, paramMode, typed)
-                    if not self.skip_string(')'):
-                        self.fail("Expected ')' in \"( ptr-declarator )\"")
-                    next = self._parse_declarator(named=False,
-                                                  paramMode="type",
-                                                  typed=typed)
-                    return ASTDeclaratorParen(inner=inner, next=next)
-                except DefinitionError as exNoPtrParen:
-                    self.pos = pos
-                    prevErrors.append((exNoPtrParen, "If parenthesis in noptr-declarator"))
-                    header = "Error in declarator"
-                    raise self._make_multi_error(prevErrors, header)
         pos = self.pos
         try:
-            return self._parse_declarator_name_suffix(named, paramMode, typed)
+            res = self._parse_declarator_name_suffix(named, paramMode, typed)
+            # this is a heuristic for error messages, for when there is a < after a
+            # nested name, but it was not a successful template argument list
+            if self.current_char == '<':
+                self.otherErrors.append(self._make_multi_error(prevErrors, ""))
+            return res
         except DefinitionError as e:
             self.pos = pos
             prevErrors.append((e, "If declarator-id"))
-            header = "Error in declarator or parameters and qualifiers"
+            header = "Error in declarator or parameters-and-qualifiers"
             raise self._make_multi_error(prevErrors, header)

     def _parse_initializer(self, outer: str = None, allowFallback: bool = True
@@ -5866,7 +5963,6 @@ class DefinitionParser(BaseParser):
             raise Exception('Internal error, unknown outer "%s".' % outer)
         if outer != 'operatorCast':
             assert named
-
         if outer in ('type', 'function'):
             # We allow type objects to just be a name.
             # Some functions don't have normal return types: constructors,
@@ -5974,10 +6070,10 @@ class DefinitionParser(BaseParser):
                 if eExpr is None:
                     raise eType
                 errs = []
-                errs.append((eExpr, "If default is an expression"))
-                errs.append((eType, "If default is a type"))
+                errs.append((eExpr, "If default template argument is an expression"))
+                errs.append((eType, "If default template argument is a type"))
                 msg = "Error in non-type template parameter"
-                msg += " or constrianted template paramter."
+                msg += " or constrained template parameter."
                 raise self._make_multi_error(errs, msg)

     def _parse_type_using(self) -> ASTTypeUsing:
@@ -6060,8 +6156,8 @@ class DefinitionParser(BaseParser):
         self.skip_ws()
         if not self.skip_string("<"):
             self.fail("Expected '<' after 'template'")
+        prevErrors = []
         while 1:
-            prevErrors = []
             self.skip_ws()
             if self.skip_word('template'):
                 # declare a tenplate template parameter
@@ -6114,6 +6210,7 @@ class DefinitionParser(BaseParser):
                 if self.skip_string('>'):
                     return ASTTemplateParams(templateParams)
                 elif self.skip_string(','):
+                    prevErrors = []
                     continue
                 else:
                     header = "Error in template parameter list."
@@ -6333,7 +6430,7 @@ class DefinitionParser(BaseParser):
     def parse_expression(self) -> Union[ASTExpression, ASTType]:
         pos = self.pos
         try:
-            expr = self._parse_expression(False)
+            expr = self._parse_expression()
             self.skip_ws()
             self.assert_end()
             return expr


@@ -718,6 +718,8 @@ def process_generate_options(app: Sphinx) -> None:
         env = app.builder.env
         genfiles = [env.doc2path(x, base=None) for x in env.found_docs
                     if os.path.isfile(env.doc2path(x))]
+    elif genfiles is False:
+        pass
     else:
         ext = list(app.config.source_suffix)
         genfiles = [genfile + (ext[0] if not genfile.endswith(tuple(ext)) else '')
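This hunk fixes #7364: `autosummary_generate` may be `True` (scan every source document), a list of document names, or `False` (generation disabled), and the `False` case previously fell through to the list branch and was iterated. The three-way dispatch can be sketched as follows (function and parameter names here are illustrative, not the Sphinx API):

```python
def resolve_genfiles(autosummary_generate, found_docs):
    # True  -> derive the file list from every known document;
    # False -> generate nothing (the case the fix adds);
    # list  -> use exactly the documents named in the config value.
    if autosummary_generate is True:
        return list(found_docs)
    elif autosummary_generate is False:
        return []
    else:
        return list(autosummary_generate)

assert resolve_genfiles(False, ['a', 'b']) == []
assert resolve_genfiles(True, ['a', 'b']) == ['a', 'b']
assert resolve_genfiles(['c'], ['a', 'b']) == ['c']
```

Testing identity against `True`/`False` rather than truthiness is the key point: an empty list and `False` are both falsy but mean different things here.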


@@ -275,7 +275,7 @@ def generate_autosummary_docs(sources: List[str], output_dir: str = None,
         try:
             name, obj, parent, mod_name = import_by_name(entry.name)
         except ImportError as e:
-            _warn(__('[autosummary] failed to import %r: %s') % (name, e))
+            _warn(__('[autosummary] failed to import %r: %s') % (entry.name, e))
             continue

         content = generate_autosummary_content(name, obj, parent, template, entry.template,


@@ -154,19 +154,23 @@ class BaseParser:
         result = [header, '\n']
         for e in errors:
             if len(e[1]) > 0:
-                ident = '  '
+                indent = '  '
                 result.append(e[1])
                 result.append(':\n')
                 for line in str(e[0]).split('\n'):
                     if len(line) == 0:
                         continue
-                    result.append(ident)
+                    result.append(indent)
                     result.append(line)
                     result.append('\n')
             else:
                 result.append(str(e[0]))
         return DefinitionError(''.join(result))

+    @property
+    def language(self) -> str:
+        raise NotImplementedError
+
     def status(self, msg: str) -> None:
         # for debugging
         indicator = '-' * self.pos + '^'
@@ -176,8 +180,8 @@ class BaseParser:
         errors = []
         indicator = '-' * self.pos + '^'
         exMain = DefinitionError(
-            'Invalid definition: %s [error at %d]\n  %s\n  %s' %
-            (msg, self.pos, self.definition, indicator))
+            'Invalid %s declaration: %s [error at %d]\n  %s\n  %s' %
+            (self.language, msg, self.pos, self.definition, indicator))
         errors.append((exMain, "Main error"))
         for err in self.otherErrors:
             errors.append((err, "Potential other error"))
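With the new abstract `language` property, each parser subclass (the C and C++ domains) identifies itself in diagnostics, while the caret indicator built from `'-' * self.pos + '^'` pinpoints the offset where parsing stopped. The resulting message shape can be reproduced in isolation (illustrative function, not the Sphinx API):

```python
def format_parse_error(language, msg, pos, definition):
    # Mirror the error layout in BaseParser.fail: the declaration text on
    # one line, and a caret line underneath marking the failing offset.
    indicator = '-' * pos + '^'
    return ('Invalid %s declaration: %s [error at %d]\n  %s\n  %s'
            % (language, msg, pos, definition, indicator))

err = format_parse_error('C++', 'expected identifier', 4, 'int 1x;')
assert err.splitlines()[-1] == '  ----^'
```

The two-space indent on both the definition and the indicator keeps the caret aligned under the offending character.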


@@ -0,0 +1,5 @@
+.. c:struct:: anon_dup_decl
+.. c:struct:: @a.A
+.. c:struct:: @b.A
+.. c:struct:: A


@@ -0,0 +1,4 @@
+.. cpp:namespace:: anon_dup_decl
+.. cpp:class:: @a::A
+.. cpp:class:: @b::A
+.. cpp:class:: A


@@ -172,7 +172,9 @@ def test_expressions():
     exprCheck('+5')
     exprCheck('-5')
     exprCheck('!5')
+    exprCheck('not 5')
     exprCheck('~5')
+    exprCheck('compl 5')
     exprCheck('sizeof(T)')
     exprCheck('sizeof -42')
     exprCheck('alignof(T)')
@@ -180,13 +182,19 @@ def test_expressions():
     exprCheck('(int)2')
     # binary op
     exprCheck('5 || 42')
+    exprCheck('5 or 42')
     exprCheck('5 && 42')
+    exprCheck('5 and 42')
     exprCheck('5 | 42')
+    exprCheck('5 bitor 42')
     exprCheck('5 ^ 42')
+    exprCheck('5 xor 42')
     exprCheck('5 & 42')
+    exprCheck('5 bitand 42')
     # ['==', '!=']
     exprCheck('5 == 42')
     exprCheck('5 != 42')
+    exprCheck('5 not_eq 42')
     # ['<=', '>=', '<', '>']
     exprCheck('5 <= 42')
     exprCheck('5 >= 42')
@@ -215,8 +223,11 @@ def test_expressions():
     exprCheck('a >>= 5')
     exprCheck('a <<= 5')
     exprCheck('a &= 5')
+    exprCheck('a and_eq 5')
     exprCheck('a ^= 5')
+    exprCheck('a xor_eq 5')
     exprCheck('a |= 5')
+    exprCheck('a or_eq 5')
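The checks above exercise the alternate operator spellings (from `iso646.h` in C; built-in alternative tokens in C++), each of which must parse identically to its symbolic form. The full correspondence can be tabulated (a sketch drawn from the language standards, not from the Sphinx source):

```python
# Alternative tokens and their primary spellings (ISO C <iso646.h> /
# C++ alternative tokens). Every first letter falls in 'abcnox', which
# is exactly the guard the parser change uses before word-matching.
ALTERNATE_SPELLINGS = {
    'and': '&&', 'and_eq': '&=', 'bitand': '&', 'bitor': '|',
    'compl': '~', 'not': '!', 'not_eq': '!=',
    'or': '||', 'or_eq': '|=', 'xor': '^', 'xor_eq': '^=',
}

assert all(name[0] in 'abcnox' for name in ALTERNATE_SPELLINGS)
assert len(ALTERNATE_SPELLINGS) == 11
```

This also explains why the assignment-operator branch only needs the narrower guard `'anox'`: no compound-assignment alternative token begins with `b` or `c`.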
 def test_type_definitions():
@@ -463,6 +474,15 @@ def test_build_domain_c(app, status, warning):
     assert len(ws) == 0


+@pytest.mark.sphinx(testroot='domain-c', confoverrides={'nitpicky': True})
+def test_build_domain_c_anon_dup_decl(app, status, warning):
+    app.builder.build_all()
+    ws = filter_warnings(warning, "anon-dup-decl")
+    assert len(ws) == 2
+    assert "WARNING: c:identifier reference target not found: @a" in ws[0]
+    assert "WARNING: c:identifier reference target not found: @b" in ws[1]
+
+
 def test_cfunction(app):
     text = (".. c:function:: PyObject* "
             "PyType_GenericAlloc(PyTypeObject *type, Py_ssize_t nitems)")


@@ -196,7 +196,9 @@ def test_expressions():
     exprCheck('+5', 'psL5E')
     exprCheck('-5', 'ngL5E')
     exprCheck('!5', 'ntL5E')
+    exprCheck('not 5', 'ntL5E')
     exprCheck('~5', 'coL5E')
+    exprCheck('compl 5', 'coL5E')
     exprCheck('sizeof...(a)', 'sZ1a')
     exprCheck('sizeof(T)', 'st1T')
     exprCheck('sizeof -42', 'szngL42E')
@@ -220,13 +222,19 @@ def test_expressions():
     exprCheck('(int)2', 'cviL2E')
     # binary op
     exprCheck('5 || 42', 'ooL5EL42E')
+    exprCheck('5 or 42', 'ooL5EL42E')
     exprCheck('5 && 42', 'aaL5EL42E')
+    exprCheck('5 and 42', 'aaL5EL42E')
     exprCheck('5 | 42', 'orL5EL42E')
+    exprCheck('5 bitor 42', 'orL5EL42E')
     exprCheck('5 ^ 42', 'eoL5EL42E')
+    exprCheck('5 xor 42', 'eoL5EL42E')
     exprCheck('5 & 42', 'anL5EL42E')
+    exprCheck('5 bitand 42', 'anL5EL42E')
     # ['==', '!=']
     exprCheck('5 == 42', 'eqL5EL42E')
     exprCheck('5 != 42', 'neL5EL42E')
+    exprCheck('5 not_eq 42', 'neL5EL42E')
     # ['<=', '>=', '<', '>']
     exprCheck('5 <= 42', 'leL5EL42E')
     exprCheck('A <= 42', 'le1AL42E')
@@ -260,8 +268,14 @@ def test_expressions():
     exprCheck('a >>= 5', 'rS1aL5E')
     exprCheck('a <<= 5', 'lS1aL5E')
     exprCheck('a &= 5', 'aN1aL5E')
+    exprCheck('a and_eq 5', 'aN1aL5E')
     exprCheck('a ^= 5', 'eO1aL5E')
+    exprCheck('a xor_eq 5', 'eO1aL5E')
     exprCheck('a |= 5', 'oR1aL5E')
+    exprCheck('a or_eq 5', 'oR1aL5E')
+    exprCheck('a = {1, 2, 3}', 'aS1ailL1EL2EL3EE')
+    # comma operator
+    exprCheck('a, 5', 'cm1aL5E')
     # Additional tests
     # a < expression that starts with something that could be a template
@@ -530,30 +544,62 @@ def test_function_definitions():

 def test_operators():
-    check('function', 'void operator new [ ] ()',
-          {1: "new-array-operator", 2: "nav"}, output='void operator new[]()')
-    check('function', 'void operator delete ()',
-          {1: "delete-operator", 2: "dlv"}, output='void operator delete()')
-    check('function', 'operator bool() const',
-          {1: "castto-b-operatorC", 2: "NKcvbEv"}, output='operator bool() const')
-    check('function', 'void operator * ()',
-          {1: "mul-operator", 2: "mlv"}, output='void operator*()')
-    check('function', 'void operator - ()',
-          {1: "sub-operator", 2: "miv"}, output='void operator-()')
-    check('function', 'void operator + ()',
-          {1: "add-operator", 2: "plv"}, output='void operator+()')
-    check('function', 'void operator = ()',
-          {1: "assign-operator", 2: "aSv"}, output='void operator=()')
-    check('function', 'void operator / ()',
-          {1: "div-operator", 2: "dvv"}, output='void operator/()')
-    check('function', 'void operator % ()',
-          {1: "mod-operator", 2: "rmv"}, output='void operator%()')
-    check('function', 'void operator ! ()',
-          {1: "not-operator", 2: "ntv"}, output='void operator!()')
-
-    check('function', 'void operator "" _udl()',
-          {2: 'li4_udlv'}, output='void operator""_udl()')
+    check('function', 'void operator new()', {1: "new-operator", 2: "nwv"})
+    check('function', 'void operator new[]()', {1: "new-array-operator", 2: "nav"})
+    check('function', 'void operator delete()', {1: "delete-operator", 2: "dlv"})
+    check('function', 'void operator delete[]()', {1: "delete-array-operator", 2: "dav"})
+    check('function', 'operator bool() const', {1: "castto-b-operatorC", 2: "NKcvbEv"})
+    check('function', 'void operator""_udl()', {2: 'li4_udlv'})
+    check('function', 'void operator~()', {1: "inv-operator", 2: "cov"})
+    check('function', 'void operator compl()', {2: "cov"})
+    check('function', 'void operator+()', {1: "add-operator", 2: "plv"})
+    check('function', 'void operator-()', {1: "sub-operator", 2: "miv"})
+    check('function', 'void operator*()', {1: "mul-operator", 2: "mlv"})
+    check('function', 'void operator/()', {1: "div-operator", 2: "dvv"})
+    check('function', 'void operator%()', {1: "mod-operator", 2: "rmv"})
+    check('function', 'void operator&()', {1: "and-operator", 2: "anv"})
+    check('function', 'void operator bitand()', {2: "anv"})
+    check('function', 'void operator|()', {1: "or-operator", 2: "orv"})
+    check('function', 'void operator bitor()', {2: "orv"})
+    check('function', 'void operator^()', {1: "xor-operator", 2: "eov"})
+    check('function', 'void operator xor()', {2: "eov"})
+    check('function', 'void operator=()', {1: "assign-operator", 2: "aSv"})
+    check('function', 'void operator+=()', {1: "add-assign-operator", 2: "pLv"})
+    check('function', 'void operator-=()', {1: "sub-assign-operator", 2: "mIv"})
+    check('function', 'void operator*=()', {1: "mul-assign-operator", 2: "mLv"})
+    check('function', 'void operator/=()', {1: "div-assign-operator", 2: "dVv"})
+    check('function', 'void operator%=()', {1: "mod-assign-operator", 2: "rMv"})
+    check('function', 'void operator&=()', {1: "and-assign-operator", 2: "aNv"})
+    check('function', 'void operator and_eq()', {2: "aNv"})
+    check('function', 'void operator|=()', {1: "or-assign-operator", 2: "oRv"})
+    check('function', 'void operator or_eq()', {2: "oRv"})
+    check('function', 'void operator^=()', {1: "xor-assign-operator", 2: "eOv"})
+    check('function', 'void operator xor_eq()', {2: "eOv"})
+    check('function', 'void operator<<()', {1: "lshift-operator", 2: "lsv"})
+    check('function', 'void operator>>()', {1: "rshift-operator", 2: "rsv"})
+    check('function', 'void operator<<=()', {1: "lshift-assign-operator", 2: "lSv"})
+    check('function', 'void operator>>=()', {1: "rshift-assign-operator", 2: "rSv"})
+    check('function', 'void operator==()', {1: "eq-operator", 2: "eqv"})
+    check('function', 'void operator!=()', {1: "neq-operator", 2: "nev"})
+    check('function', 'void operator not_eq()', {2: "nev"})
+    check('function', 'void operator<()', {1: "lt-operator", 2: "ltv"})
+    check('function', 'void operator>()', {1: "gt-operator", 2: "gtv"})
+    check('function', 'void operator<=()', {1: "lte-operator", 2: "lev"})
+    check('function', 'void operator>=()', {1: "gte-operator", 2: "gev"})
+    check('function', 'void operator!()', {1: "not-operator", 2: "ntv"})
+    check('function', 'void operator not()', {2: "ntv"})
+    check('function', 'void operator&&()', {1: "sand-operator", 2: "aav"})
+    check('function', 'void operator and()', {2: "aav"})
+    check('function', 'void operator||()', {1: "sor-operator", 2: "oov"})
+    check('function', 'void operator or()', {2: "oov"})
+    check('function', 'void operator++()', {1: "inc-operator", 2: "ppv"})
+    check('function', 'void operator--()', {1: "dec-operator", 2: "mmv"})
+    check('function', 'void operator,()', {1: "comma-operator", 2: "cmv"})
+    check('function', 'void operator->*()', {1: "pointer-by-pointer-operator", 2: "pmv"})
+    check('function', 'void operator->()', {1: "pointer-operator", 2: "ptv"})
+    check('function', 'void operator()()', {1: "call-operator", 2: "clv"})
+    check('function', 'void operator[]()', {1: "subscript-operator", 2: "ixv"})


 def test_class_definitions():
@@ -582,6 +628,12 @@ def test_class_definitions():
           {2: 'I0E7has_varI1TNSt6void_tIDTadN1T3varEEEEE'})

+    check('class', 'template<typename ...Ts> T<int (*)(Ts)...>',
+          {2: 'IDpE1TIJPFi2TsEEE'})
+    check('class', 'template<int... Is> T<(Is)...>',
+          {2: 'I_DpiE1TIJX(Is)EEE', 3: 'I_DpiE1TIJX2IsEEE'})
+

 def test_union_definitions():
     check('union', 'A', {2: "1A"})
@@ -860,6 +912,15 @@ def test_build_domain_cpp_backslash_ok(app, status, warning):
     assert "WARNING: Parsing of expression failed. Using fallback parser." in ws[0]


+@pytest.mark.sphinx(testroot='domain-cpp', confoverrides={'nitpicky': True})
+def test_build_domain_cpp_anon_dup_decl(app, status, warning):
+    app.builder.build_all()
+    ws = filter_warnings(warning, "anon-dup-decl")
+    assert len(ws) == 2
+    assert "WARNING: cpp:identifier reference target not found: @a" in ws[0]
+    assert "WARNING: cpp:identifier reference target not found: @b" in ws[1]
+
+
 @pytest.mark.sphinx(testroot='domain-cpp')
 def test_build_domain_cpp_misuse_of_roles(app, status, warning):
     app.builder.build_all()