Greenplum / Annotated Deep Learning Paper Implementations
Commit ce23bc10
Synced successfully 9 months ago · Star 0 · Fork 0
Commit ce23bc10
Authored Jul 25, 2021 by Varuna Jayasiri

gat v2 docs

Parent: 800df5c4
Showing 8 changed files with 1953 additions and 4 deletions (+1953 −4)
docs/graphs/gatv2/experiment.html    +1194  −0
docs/graphs/gatv2/index.html          +589  −0
docs/graphs/gatv2/readme.html         +149  −0
docs/graphs/index.html                  +1  −0
docs/index.html                         +1  −0
docs/sitemap.xml                       +16  −2
labml_nn/graphs/gatv2/__init__.py       +2  −2
readme.md                               +1  −0
docs/graphs/gatv2/experiment.html (new file, mode 100644)
This diff is collapsed.

docs/graphs/gatv2/index.html (new file, mode 100644)
This diff is collapsed.
docs/graphs/gatv2/readme.html (new file, mode 100644)
<!DOCTYPE html>
<html>
<head>
    <meta http-equiv="content-type" content="text/html;charset=utf-8"/>
    <meta name="viewport" content="width=device-width, initial-scale=1.0"/>
    <meta name="description" content=""/>
    <meta name="twitter:card" content="summary"/>
    <meta name="twitter:image:src" content="https://avatars1.githubusercontent.com/u/64068543?s=400&v=4"/>
    <meta name="twitter:title" content="Graph Attention Networks v2 (GATv2)"/>
    <meta name="twitter:description" content=""/>
    <meta name="twitter:site" content="@labmlai"/>
    <meta name="twitter:creator" content="@labmlai"/>
    <meta property="og:url" content="https://nn.labml.ai/graphs/gatv2/readme.html"/>
    <meta property="og:title" content="Graph Attention Networks v2 (GATv2)"/>
    <meta property="og:image" content="https://avatars1.githubusercontent.com/u/64068543?s=400&v=4"/>
    <meta property="og:site_name" content="LabML Neural Networks"/>
    <meta property="og:type" content="object"/>
    <meta property="og:title" content="Graph Attention Networks v2 (GATv2)"/>
    <meta property="og:description" content=""/>
    <title>Graph Attention Networks v2 (GATv2)</title>
    <link rel="shortcut icon" href="/icon.png"/>
    <link rel="stylesheet" href="../../pylit.css">
    <link rel="canonical" href="https://nn.labml.ai/graphs/gatv2/readme.html"/>
    <!-- Global site tag (gtag.js) - Google Analytics -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=G-4V3HC8HBLH"></script>
    <script>
        window.dataLayer = window.dataLayer || [];
        function gtag() { dataLayer.push(arguments); }
        gtag('js', new Date());
        gtag('config', 'G-4V3HC8HBLH');
    </script>
</head>
<body>
<div id='container'>
    <div id="background"></div>
    <div class='section'>
        <div class='docs'>
            <p>
                <a class="parent" href="/">home</a>
                <a class="parent" href="../index.html">graphs</a>
                <a class="parent" href="index.html">gatv2</a>
            </p>
            <p>
                <a href="https://github.com/lab-ml/labml_nn/tree/master/labml_nn/graphs/gatv2/readme.md">
                    <img alt="Github" src="https://img.shields.io/github/stars/lab-ml/nn?style=social" style="max-width:100%;"/></a>
                <a href="https://twitter.com/labmlai" rel="nofollow">
                    <img alt="Twitter" src="https://img.shields.io/twitter/follow/labmlai?style=social" style="max-width:100%;"/></a>
            </p>
        </div>
    </div>
    <div class='section' id='section-0'>
        <div class='docs'>
            <div class='section-link'>
                <a href='#section-0'>#</a>
            </div>
            <h1><a href="https://nn.labml.ai/graph/gatv2/index.html">Graph Attention Networks v2 (GATv2)</a></h1>
            <p>This is a <a href="https://pytorch.org">PyTorch</a> implementation of the GATv2 operator from the paper
                <a href="https://arxiv.org/abs/2105.14491">How Attentive are Graph Attention Networks?</a>.</p>
            <p>GATv2s work on graph data.
                A graph consists of nodes and edges connecting them.
                For example, in the Cora dataset the nodes are research papers and the edges are citations that
                connect the papers.</p>
            <p>The GATv2 operator fixes the static attention problem of the standard GAT:
                because the linear layers in the standard GAT are applied one right after the other, the ranking
                of attended nodes is unconditioned on the query node.
                In contrast, in GATv2 every node can attend to any other node.</p>
            <p>Here is <a href="https://nn.labml.ai/graph/gatv2/experiment.html">the training code</a> for training
                a two-layer GATv2 on the Cora dataset.</p>
            <p><a href="https://app.labml.ai/run/8e27ad82ed2611ebabb691fb2028a868"><img alt="View Run" src="https://img.shields.io/badge/labml-experiment-brightgreen"/></a></p>
        </div>
        <div class='code'></div>
    </div>
</div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.4/MathJax.js?config=TeX-AMS_HTML"></script>
<!-- MathJax configuration -->
<script type="text/x-mathjax-config">
    MathJax.Hub.Config({
        tex2jax: {
            inlineMath: [['$', '$']],
            displayMath: [['$$', '$$']],
            processEscapes: true,
            processEnvironments: true
        },
        // Center justify equations in code and markdown cells. Elsewhere
        // we use CSS to left justify single line equations in code cells.
        displayAlign: 'center',
        "HTML-CSS": { fonts: ["TeX"] }
    });
</script>
<script>
    function handleImages() {
        var images = document.querySelectorAll('p>img')
        console.log(images);
        for (var i = 0; i < images.length; ++i) {
            handleImage(images[i])
        }
    }
    function handleImage(img) {
        img.parentElement.style.textAlign = 'center'
        var modal = document.createElement('div')
        modal.id = 'modal'
        var modalContent = document.createElement('div')
        modal.appendChild(modalContent)
        var modalImage = document.createElement('img')
        modalContent.appendChild(modalImage)
        var span = document.createElement('span')
        span.classList.add('close')
        span.textContent = 'x'
        modal.appendChild(span)
        img.onclick = function () {
            console.log('clicked')
            document.body.appendChild(modal)
            modalImage.src = img.src
        }
        span.onclick = function () {
            document.body.removeChild(modal)
        }
    }
    handleImages()
</script>
</body>
</html>
\ No newline at end of file
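The readme above states the static-attention problem only informally. As an illustration (not part of this commit), here is a minimal NumPy sketch contrasting the two scoring functions; the names `g_l`, `g_r`, `a_l`, `a_r` follow the source comments in `labml_nn/graphs/gatv2/__init__.py`, while the toy sizes, random features, and full connectivity are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 5, 4
g_l = rng.normal(size=(n, d))   # W_l h_i, one row per query node i
g_r = rng.normal(size=(n, d))   # W_r h_j, one row per key node j

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# Standard GAT: e_ij = LeakyReLU(a_l . g_l[i] + a_r . g_r[j]).
# The key term a_r . g_r[j] is added outside any query-key interaction and
# LeakyReLU is monotone, so argmax_j e_ij is identical for every query i
# ("static" attention: one globally favourite key).
a_l = rng.normal(size=(d,))
a_r = rng.normal(size=(d,))
e_gat = leaky_relu((g_l @ a_l)[:, None] + (g_r @ a_r)[None, :])
assert len(set(e_gat.argmax(axis=1))) == 1

# GATv2: e_ij = a . LeakyReLU(g_l[i] + g_r[j]); the non-linearity mixes
# query and key *before* the final projection, so the ranking of keys can
# differ per query ("dynamic" attention).
a = rng.normal(size=(d,))
e_v2 = leaky_relu(g_l[:, None, :] + g_r[None, :, :]) @ a
```

The only change is where the non-linearity sits relative to the final projection by `a`, which is exactly the fix the readme describes.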
docs/graphs/index.html
@@ -69,6 +69,7 @@
 <h1>Graph Neural Networks</h1>
 <ul>
 <li><a href="gat/index.html">Graph Attention Networks (GAT)</a></li>
+<li><a href="gatv2/index.html">Graph Attention Networks v2 (GATv2)</a></li>
 </ul>
 </div>
 <div class='code'>
...
docs/index.html
@@ -115,6 +115,7 @@ implementations.</p>
 <h4>✨ Graph Neural Networks</h4>
 <ul>
 <li><a href="graphs/gat/index.html">Graph Attention Networks (GAT)</a></li>
+<li><a href="gatv2/index.html">Graph Attention Networks v2 (GATv2)</a></li>
 </ul>
 <h4>✨ <a href="cfr/index.html">Counterfactual Regret Minimization (CFR)</a></h4>
 <p>Solving games with incomplete information such as poker with CFR.</p>
...
docs/sitemap.xml
@@ -281,7 +281,7 @@
 <url>
 <loc>https://nn.labml.ai/index.html</loc>
-<lastmod>2021-07-17T16:30:00+00:00</lastmod>
+<lastmod>2021-07-25T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
...
@@ -741,9 +741,23 @@
 </url>
+<url>
+<loc>https://nn.labml.ai/graphs/gatv2/index.html</loc>
+<lastmod>2021-07-25T16:30:00+00:00</lastmod>
+<priority>1.00</priority>
+</url>
+<url>
+<loc>https://nn.labml.ai/graphs/gatv2/experiment.html</loc>
+<lastmod>2021-07-25T16:30:00+00:00</lastmod>
+<priority>1.00</priority>
+</url>
 <url>
 <loc>https://nn.labml.ai/graphs/index.html</loc>
-<lastmod>2021-07-08T16:30:00+00:00</lastmod>
+<lastmod>2021-07-25T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
...
labml_nn/graphs/gatv2/__init__.py
@@ -117,7 +117,7 @@ class GraphAttentionV2Layer(Module):
 # We calculate these for each head $k$. *We have omitted $\cdot^k$ for simplicity*.
 #
 # $$e_{ij} = a(\mathbf{W_l} \overrightarrow{h_i}, \mathbf{W_r} \overrightarrow{h_j}) =
-# a(\overrightarrow{{g_l}_i}}, \overrightarrow{{g_r}_j}})$$
+# a(\overrightarrow{{g_l}_i}, \overrightarrow{{g_r}_j})$$
 #
 # $e_{ij}$ is the attention score (importance) from node $j$ to node $i$.
 # We calculate this for each head.
@@ -131,7 +131,7 @@ class GraphAttentionV2Layer(Module):
 #
 # $$e_{ij} = \mathbf{a}^\top \text{LeakyReLU} \Big(
 # \Big[
-# \overrightarrow{{g_l}_i}} + \overrightarrow{{g_r}_j}}
+# \overrightarrow{{g_l}_i} + \overrightarrow{{g_r}_j}
 # \Big] \Big)$$
 # First we calculate
...
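The formula this hunk repairs, $e_{ij} = \mathbf{a}^\top \text{LeakyReLU}\big(\overrightarrow{{g_l}_i} + \overrightarrow{{g_r}_j}\big)$ per head $k$, can be exercised end to end with a small NumPy sketch. This is a toy stand-in, not the repository's PyTorch layer: the dimensions, random weights, and dense all-ones adjacency are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, in_dim, n_heads, d_head = 4, 8, 2, 4
h = rng.normal(size=(n_nodes, in_dim))          # node features h_i
adj = np.ones((n_nodes, n_nodes), dtype=bool)   # toy graph: all edges present

W_l = rng.normal(size=(n_heads, in_dim, d_head))  # left (query-side) projection
W_r = rng.normal(size=(n_heads, in_dim, d_head))  # right (key-side) projection
a = rng.normal(size=(n_heads, d_head))            # attention vector per head

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

out = []
for k in range(n_heads):
    g_l = h @ W_l[k]                    # g_l[i] = W_l h_i
    g_r = h @ W_r[k]                    # g_r[j] = W_r h_j
    # e_ij = a^T LeakyReLU(g_l[i] + g_r[j]) for every pair (i, j)
    e = leaky_relu(g_l[:, None, :] + g_r[None, :, :]) @ a[k]
    e = np.where(adj, e, -np.inf)       # attend only along edges
    # Softmax over neighbours j gives attention weights alpha_ij
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    out.append(alpha @ g_r)             # aggregate neighbour values
out = np.concatenate(out, axis=1)       # heads concatenated: (n, n_heads * d_head)
```

Each attention row is a proper distribution over neighbours, and the two stray `}` braces the hunk removes were the only difference between the rendered docs and this computation.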
readme.md
@@ -62,6 +62,7 @@ implementations almost weekly.
 #### ✨ Graph Neural Networks
 * [Graph Attention Networks (GAT)](https://nn.labml.ai/graphs/gat/index.html)
+* [Graph Attention Networks v2 (GATv2)](https://nn.labml.ai/graphs/gatv2/index.html)
 #### ✨ [Counterfactual Regret Minimization (CFR)](https://nn.labml.ai/cfr/index.html)
...