
Deep Learning (37): Graph Neural Networks (GNN), Part 2

This installment walks through a few simple examples for different scenarios; all of them use datasets built into torch_geometric.

Table of Contents

  • Deep Learning (37): Graph Neural Networks (GNN), Part 2
    • 1. Node classification on a single graph
    • 2. Graph classification over multiple graphs
    • 3. Cluster-GCN: when the graph is very large

1. Node classification on a single graph

from torch_geometric.datasets import Planetoid  # used to download the dataset
from torch_geometric.transforms import NormalizeFeatures
from torch_geometric.nn import GCNConv
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE
import torch
from torch.nn import Linear
import torch.nn.functional as F


# Visualization helper
def visualize(h, color):
    z = TSNE(n_components=2).fit_transform(h.detach().cpu().numpy())
    plt.figure(figsize=(10, 10))
    plt.xticks([])
    plt.yticks([])
    plt.scatter(z[:, 0], z[:, 1], s=70, c=color, cmap="Set2")
    plt.show()


# Load the data
dataset = Planetoid(root='data/Planetoid', name='Cora', transform=NormalizeFeatures())  # transform = preprocessing
print(f'Dataset: {dataset}:')
print('======================')
print(f'Number of graphs: {len(dataset)}')
print(f'Number of features: {dataset.num_features}')
print(f'Number of classes: {dataset.num_classes}')

data = dataset[0]  # Get the first graph object.
print()
print(data)
print('===========================================================================================================')

# Gather some statistics about the graph.
print(f'Number of nodes: {data.num_nodes}')
print(f'Number of edges: {data.num_edges}')
print(f'Average node degree: {data.num_edges / data.num_nodes:.2f}')
print(f'Number of training nodes: {data.train_mask.sum()}')
print(f'Training node label rate: {int(data.train_mask.sum()) / data.num_nodes:.2f}')
print(f'Has isolated nodes: {data.has_isolated_nodes()}')
print(f'Has self-loops: {data.has_self_loops()}')
print(f'Is undirected: {data.is_undirected()}')


# Model definition
class GCN(torch.nn.Module):
    def __init__(self, hidden_channels):
        super().__init__()
        torch.manual_seed(1234567)
        self.conv1 = GCNConv(dataset.num_features, hidden_channels)
        self.conv2 = GCNConv(hidden_channels, dataset.num_classes)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index)
        x = x.relu()
        x = F.dropout(x, p=0.5, training=self.training)
        x = self.conv2(x, edge_index)
        return x


model = GCN(hidden_channels=16)
print(model)

# Train the model
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)
criterion = torch.nn.CrossEntropyLoss()


def train():
    model.train()
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = criterion(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()
    return loss


def test():
    model.eval()
    out = model(data.x, data.edge_index)
    pred = out.argmax(dim=1)
    test_correct = pred[data.test_mask] == data.y[data.test_mask]
    test_acc = int(test_correct.sum()) / int(data.test_mask.sum())
    return test_acc


for epoch in range(1, 101):
    loss = train()
    print(f'Epoch: {epoch:03d}, Loss: {loss:.4f}')

test_acc = test()
print(f'Test Accuracy: {test_acc:.4f}')

model.eval()
out = model(data.x, data.edge_index)
visualize(out, color=data.y)
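
Cora's data object also carries a validation mask (data.val_mask). If you want to watch the held-out validation nodes as well, a minimal sketch in the style of the test() function above (my own addition, not part of the original script; it assumes the model, data, and criterion defined earlier) would be:

def evaluate(mask):
    # Accuracy of the current model on the nodes selected by `mask`
    # (e.g. data.val_mask or data.test_mask).
    model.eval()
    with torch.no_grad():
        out = model(data.x, data.edge_index)
    pred = out.argmax(dim=1)
    correct = pred[mask] == data.y[mask]
    return int(correct.sum()) / int(mask.sum())


print(f'Val Accuracy: {evaluate(data.val_mask):.4f}')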

2. Graph classification over multiple graphs

  • Graphs can also be batched, in the same way images and text are batched.
  • Compared with classifying the nodes of a single graph, there is one extra aggregation step: the node features are pooled into a global feature that serves as the encoding of the whole graph (a toy sketch of this batching and pooling step follows right after this list).
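
Before the full example, here is a minimal, self-contained sketch (a toy example of my own, not from the original post) of what batching graphs means in PyG: several graphs are merged into one large disconnected graph, a batch vector records which graph each node belongs to, and global_mean_pool averages node features per graph:

import torch
from torch_geometric.data import Data, Batch
from torch_geometric.nn import global_mean_pool

# Two tiny graphs: 3 nodes and 2 nodes, each node with 4 features.
g1 = Data(x=torch.randn(3, 4), edge_index=torch.tensor([[0, 1, 2], [1, 2, 0]]))
g2 = Data(x=torch.randn(2, 4), edge_index=torch.tensor([[0, 1], [1, 0]]))

batch = Batch.from_data_list([g1, g2])
print(batch.batch)   # tensor([0, 0, 0, 1, 1]) -> node-to-graph assignment
out = global_mean_pool(batch.x, batch.batch)
print(out.shape)     # torch.Size([2, 4]) -> one pooled vector per graph
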
import torch
from torch_geometric.datasets import TUDataset  # molecule datasets: https://chrsmrrs.github.io/datasets/
from torch_geometric.loader import DataLoader
from torch.nn import Linear
import torch.nn.functional as F
from torch_geometric.nn import GCNConv
from torch_geometric.nn import global_mean_pool

# Load the data
dataset = TUDataset(root='data/TUDataset', name='MUTAG')
print(f'Dataset: {dataset}:')
print('====================')
print(f'Number of graphs: {len(dataset)}')
print(f'Number of features: {dataset.num_features}')
print(f'Number of classes: {dataset.num_classes}')

data = dataset[0]  # Get the first graph object.
print(data)
print('=============================================================')

# Gather some statistics about the first graph.
# print(f'Number of nodes: {data.num_nodes}')
# print(f'Number of edges: {data.num_edges}')
# print(f'Average node degree: {data.num_edges / data.num_nodes:.2f}')
# print(f'Has isolated nodes: {data.has_isolated_nodes()}')
# print(f'Has self-loops: {data.has_self_loops()}')
# print(f'Is undirected: {data.is_undirected()}')

train_dataset = dataset  # for simplicity, the whole dataset is used for training
print(f'Number of training graphs: {len(train_dataset)}')

# Load the data with a DataLoader
train_loader = DataLoader(train_dataset, batch_size=8, shuffle=True)
for step, data in enumerate(train_loader):
    print(f'Step {step + 1}:')
    print('=======')
    print(f'Number of graphs in the current batch: {data.num_graphs}')
    print(data)
    print()


# Model definition
class GCN(torch.nn.Module):
    def __init__(self, hidden_channels):
        super(GCN, self).__init__()
        torch.manual_seed(12345)
        self.conv1 = GCNConv(dataset.num_node_features, hidden_channels)
        self.conv2 = GCNConv(hidden_channels, hidden_channels)
        self.conv3 = GCNConv(hidden_channels, hidden_channels)
        self.lin = Linear(hidden_channels, dataset.num_classes)

    def forward(self, x, edge_index, batch):
        # 1. Obtain node embeddings
        x = self.conv1(x, edge_index)
        x = x.relu()
        x = self.conv2(x, edge_index)
        x = x.relu()
        x = self.conv3(x, edge_index)
        # 2. Readout: average node embeddings per graph
        x = global_mean_pool(x, batch)  # [batch_size, hidden_channels]
        # 3. Apply a final classifier
        x = F.dropout(x, p=0.5, training=self.training)
        x = self.lin(x)
        return x


model = GCN(hidden_channels=64)
print(model)

# Training
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
criterion = torch.nn.CrossEntropyLoss()


def train():
    model.train()
    for data in train_loader:  # Iterate in batches over the training dataset.
        out = model(data.x, data.edge_index, data.batch)  # Perform a single forward pass.
        loss = criterion(out, data.y)  # Compute the loss.
        loss.backward()  # Derive gradients.
        optimizer.step()  # Update parameters based on gradients.
        optimizer.zero_grad()  # Clear gradients.


def test(loader):
    model.eval()
    correct = 0
    for data in loader:  # Iterate in batches over the training/test dataset.
        out = model(data.x, data.edge_index, data.batch)
        pred = out.argmax(dim=1)  # Use the class with highest probability.
        correct += int((pred == data.y).sum())  # Check against ground-truth labels.
    return correct / len(loader.dataset)  # Derive ratio of correct predictions.


for epoch in range(1, 3):
    train()
    train_acc = test(train_loader)
    print(f'Epoch: {epoch:03d}, Train Acc: {train_acc:.4f}')
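
In the example above the entire MUTAG dataset is used for training, which is fine for a demo but gives no honest test number. A common variant (a sketch of my own, assuming MUTAG's 188 graphs and the usual PyG-tutorial style split) is to shuffle and slice the dataset before building the loaders:

torch.manual_seed(12345)
dataset = dataset.shuffle()        # random permutation of the graphs
train_dataset = dataset[:150]      # first 150 graphs for training
test_dataset = dataset[150:]       # remaining graphs for testing

train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)
test_loader = DataLoader(test_dataset, batch_size=64, shuffle=False)

# The same train()/test() functions then report both numbers:
# train_acc = test(train_loader); test_acc = test(test_loader)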

3. Cluster-GCN: when the graph is very large

  • With a plain GCN, the more layers there are, the heavier the computation becomes.
  • If the GCN computation and update are done per cluster, the amount of data handled at a time is much smaller.

But a problem remains: once a large graph is clustered into many small subgraphs, the biggest issue is how to avoid losing the connections between those subgraphs. The answer: in each batch, a random set of n subgraphs is joined back together (so the edges between them are restored) before computing.

  • Use torch_geometric's built-in utilities:

    • first partition the graph with the cluster method (ClusterData)
    • then build batches with ClusterLoader

In other words: partition first, then assign the partitions to batches.

# What do we do when the graph is extremely large?
# What goes wrong when the graph has a huge number of nodes and edges?
# With more layers, GPU memory runs out.

import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv
from torch_geometric.datasets import Planetoid
from torch_geometric.transforms import NormalizeFeatures
from torch_geometric.loader import ClusterData, ClusterLoader

dataset = Planetoid(root='data/Planetoid', name='PubMed', transform=NormalizeFeatures())
print(f'Dataset: {dataset}:')
print('==================')
print(f'Number of graphs: {len(dataset)}')
print(f'Number of features: {dataset.num_features}')
print(f'Number of classes: {dataset.num_classes}')

data = dataset[0]  # Get the first graph object.
print(data)
print('===============================================================================================================')

# Gather some statistics about the graph.
print(f'Number of nodes: {data.num_nodes}')
print(f'Number of edges: {data.num_edges}')
print(f'Average node degree: {data.num_edges / data.num_nodes:.2f}')
print(f'Number of training nodes: {data.train_mask.sum()}')
print(f'Training node label rate: {int(data.train_mask.sum()) / data.num_nodes:.3f}')
print(f'Has isolated nodes: {data.has_isolated_nodes()}')
print(f'Has self-loops: {data.has_self_loops()}')
print(f'Is undirected: {data.is_undirected()}')

# Partition the graph and build batches; with 128 parts and batch_size=32, one epoch has 4 batches.
torch.manual_seed(12345)
cluster_data = ClusterData(data, num_parts=128)  # 1. Partition the graph.
train_loader = ClusterLoader(cluster_data, batch_size=32, shuffle=True)  # 2. Build batches from the partitions.

total_num_nodes = 0
for step, sub_data in enumerate(train_loader):
    print(f'Step {step + 1}:')
    print('=======')
    print(f'Number of nodes in the current batch: {sub_data.num_nodes}')
    print(sub_data)
    print()
    total_num_nodes += sub_data.num_nodes

print(f'Iterated over {total_num_nodes} of {data.num_nodes} nodes!')


# Model definition
class GCN(torch.nn.Module):
    def __init__(self, hidden_channels):
        super(GCN, self).__init__()
        torch.manual_seed(12345)
        self.conv1 = GCNConv(dataset.num_node_features, hidden_channels)
        self.conv2 = GCNConv(hidden_channels, dataset.num_classes)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index)
        x = x.relu()
        x = F.dropout(x, p=0.5, training=self.training)
        x = self.conv2(x, edge_index)
        return x


model = GCN(hidden_channels=16)
print(model)

# Train the model: mini-batches of clusters for training, the full graph for evaluation
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)
criterion = torch.nn.CrossEntropyLoss()


def train():
    model.train()
    for sub_data in train_loader:
        out = model(sub_data.x, sub_data.edge_index)
        loss = criterion(out[sub_data.train_mask], sub_data.y[sub_data.train_mask])
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()


def test():
    model.eval()
    out = model(data.x, data.edge_index)
    pred = out.argmax(dim=1)
    accs = []
    for mask in [data.train_mask, data.val_mask, data.test_mask]:
        correct = pred[mask] == data.y[mask]
        accs.append(int(correct.sum()) / int(mask.sum()))
    return accs


for epoch in range(1, 51):
    train()
    train_acc, val_acc, test_acc = test()
    print(f'Epoch: {epoch:03d}, Train: {train_acc:.4f}, Val Acc: {val_acc:.4f}, Test Acc: {test_acc:.4f}')

These are still fairly basic examples; the next post will cover how to define your own dataset, plus some more advanced cases.
All the project code is on GitHub; feel free to take a look.

