MegEngine 天元 / MegEngine
Commit e0e18964
Commit e0e18964
Authored Nov 09, 2020 by Megvii Engine Team

docs(mge): formatting math in docstring

GitOrigin-RevId: e6447a278624e3923bf1c41ce7b325f980ee71ab
Parent: 18ec5341
1 changed file with 15 additions and 15 deletions:

imperative/python/megengine/functional/nn.py (+15, -15)
@@ -441,22 +441,22 @@ def softplus(inp: Tensor) -> Tensor:
 def logsoftmax(inp: Tensor, axis: Union[int, Sequence[int]]) -> Tensor:
     r"""
-    Applies the :math:`\log(\text{Softmax}(x))` function to an n-dimensional
-    input Tensor. The LogSoftmax formulation can be simplified as:
+    Applies the :math:`\log(\text{softmax}(x))` function to an n-dimensional
+    input tensor. The :math:`\text{logsoftmax}(x)` formulation can be simplified as:

     .. math::
-        \text{LogSoftmax}(x_{i}) = \log(\frac{\exp(x_i) }{ \sum_j \exp(x_j)} )
+        \text{logsoftmax}(x_{i}) = \log(\frac{\exp(x_i) }{ \sum_j \exp(x_j)} )

     For numerical stability the implementation follows this transformation:

     .. math::
-        \operatorname{logsoftmax}(x)
+        \text{logsoftmax}(x)
         = \log (\frac{\exp (x)}{\sum_{i}(\exp (x_{i}))})
         = x - \log (\sum_{i}(\exp (x_{i})))
-        = x - logsumexp(x)
+        = x - \text{logsumexp}(x)

     :param inp: input tensor.
-    :param axis: axis along which logsoftmax will be applied.
+    :param axis: axis along which :math:`\text{logsoftmax}(x)` will be applied.

     Examples:
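The identity this hunk documents, logsoftmax(x) = x - logsumexp(x), can be sanity-checked with plain NumPy. This sketch is illustrative only and is not part of the diff or the MegEngine API:

```python
import numpy as np

def logsumexp(x):
    # Max-shift for numerical stability: b + log(sum(exp(x - b)))
    b = x.max()
    return b + np.log(np.exp(x - b).sum())

def logsoftmax(x):
    # The docstring identity: logsoftmax(x) = x - logsumexp(x)
    return x - logsumexp(x)

x = np.array([1.0, 2.0, 3.0])
naive = np.log(np.exp(x) / np.exp(x).sum())  # direct log(softmax(x))
print(np.allclose(logsoftmax(x), naive))     # True
```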
@@ -487,8 +487,8 @@ def logsigmoid(inp: Tensor) -> Tensor:
     .. math::
         \text{logsigmoid}(x) = \log(\frac{ 1 }{ 1 + \exp(-x)})
-        = \log(1/(1 + exp(-x)))
-        = - \log(1 + exp(-x))
+        = \log(1/(1 + \exp(-x)))
+        = - \log(1 + \exp(-x))
         = - \text{softplus}(-x)

     :param inp: input tensor.
@@ -524,14 +524,14 @@ def logsumexp(
     .. math::
-        \operatorname{logsumexp}(\boldsymbol{x})= \log \sum_{j=1}^{n} \exp \left(x_{j}\right)
+        \text{logsumexp}(x)= \log \sum_{j=1}^{n} \exp \left(x_{j}\right)

     For numerical stability, the implementation follows this transformation:

     .. math::
-        \operatorname{logsumexp}(\boldsymbol{x})= \log \sum_{j=1}^{n} \exp \left(x_{j}\right)
-        = \operatorname{logsumexp}(\boldsymbol{x})=b+\log \sum_{j=1}^{n} \exp \left(x_{j}-b\right)
+        \text{logsumexp}(x)= \log \sum_{j=1}^{n} \exp \left(x_{j}\right)
+        = \text{logsumexp}(x)=b+\log \sum_{j=1}^{n} \exp \left(x_{j}-b\right)

     where
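The max-shift transformation in this hunk is what keeps logsumexp finite for large inputs, since exp overflows float64 well before x reaches 1000. A minimal NumPy sketch (illustrative only, not MegEngine code):

```python
import numpy as np

x = np.array([1000.0, 1001.0])

# Naive form overflows: exp(1000.0) is inf in float64.
with np.errstate(over="ignore"):
    naive = np.log(np.exp(x).sum())          # inf

# Shifted form b + log(sum(exp(x - b))) with b = max(x) stays finite,
# because the largest exponentiated value is exp(0) = 1.
b = x.max()
stable = b + np.log(np.exp(x - b).sum())
print(naive, stable)
```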
@@ -578,10 +578,10 @@ def _get_softmax_axis(ndim: int) -> int:
 def softmax(inp: Tensor, axis: Optional[int] = None) -> Tensor:
     r"""
-    Applies a softmax function. Softmax is defined as:
+    Applies a :math:`\text{softmax}(x)` function. :math:`\text{softmax}(x)` is defined as:

     .. math::
-        \text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}
+        \text{softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}

     It is applied to all elements along axis, and rescales elements so that
     they stay in the range `[0, 1]` and sum to 1.
@@ -589,8 +589,8 @@ def softmax(inp: Tensor, axis: Optional[int] = None) -> Tensor:
     See :class:`~megengine.module.activation.Softmax` for more details.

     :param inp: input tensor.
-    :param axis: an axis along which softmax will be applied. By default,
-        softmax will apply along the highest ranked axis.
+    :param axis: an axis along which :math:`\text{softmax}(x)` will be applied. By default,
+        :math:`\text{softmax}(x)` will apply along the highest ranked axis.

     Examples:
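The rescaling property the docstring states (outputs in `[0, 1]` along the chosen axis, summing to 1) can be checked with a small NumPy sketch. This is an illustration of the formula only, not the MegEngine implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the per-axis max first; softmax is shift-invariant,
    # so this changes nothing mathematically but avoids overflow.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

x = np.array([[1.0, 2.0, 3.0], [0.0, 0.0, 0.0]])
y = softmax(x)
print(y.sum(axis=-1))   # each row sums to 1
```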