ResNet
Many network architectures are named as a name plus a number, e.g. ResNet18, where the number denotes the depth of the network. Does that mean ResNet18 is literally an 18-layer network? The 18 actually counts only the layers that carry weights, i.e. the convolutional and fully connected layers; pooling layers and BN layers are not counted. The ResNet paper gives a structure table for each variant.

ps: Before reading this post, be sure to read "ResNet | 认识残差网络" first.

References:

- Resnet-18网络图示理解
- ResNet 18 的结构解读
- 通过Pytorch实现ResNet18

Here we walk through ResNet-18. A few points to keep straight:

- The 18 counts the convolutional and fully connected layers; it excludes pooling layers, BN layers, and the like.
- BN is batch normalization; ReLU is the activation function.
- `lambda x: x` means the output equals the input; this identity mapping is the shortcut (residual) connection.
- One ResNet block contains 2 basic blocks, so one ResNet block adds 2 shortcut connections.
- Between ResNet blocks the shortcut is a 1×1 convolution; within a ResNet block it is `lambda x: x` (the identity).
- Shortcuts between ResNet blocks are drawn as thick arrows; shortcuts within a block as thin arrows.
- A 3×3 convolution with s=2, p=1 shrinks the feature map; with s=1, p=1 it leaves the feature map size unchanged.

Now look at the structure diagram of ResNet-18: the red box encloses the 18 layers (conv1 contributes 1; the four stages each have 2 basic blocks with 2 convolutions, contributing 4 × 2 × 2 = 16; the fully connected layer contributes 1; 1 + 16 + 1 = 18).

Here is the code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


# Residual block (basic block)
class ResBlock(nn.Module):
    def __init__(self, inchannel, outchannel, stride=1):
        super(ResBlock, self).__init__()
        # The two consecutive conv layers inside the residual block
        self.left = nn.Sequential(
            nn.Conv2d(inchannel, outchannel, kernel_size=(1, 3), stride=stride, padding=(0, 1), bias=False),
            nn.BatchNorm2d(outchannel),
            nn.ReLU(inplace=True),
            nn.Conv2d(outchannel, outchannel, kernel_size=(1, 3), stride=1, padding=(0, 1), bias=False),
            nn.BatchNorm2d(outchannel)
        )
        self.shortcut = nn.Sequential()
        if stride != 1 or inchannel != outchannel:
            # Projection shortcut: make the shape match the output of the two conv layers
            self.shortcut = nn.Sequential(
                nn.Conv2d(inchannel, outchannel, kernel_size=1, stride=stride, bias=False),
                nn.BatchNorm2d(outchannel)
            )

    def forward(self, x):
        out = self.left(x)
        # Add the (possibly projected) input to the conv branch: the basic ResNet structure
        out = out + self.shortcut(x)
        out = F.relu(out)
        return out


class ResNet(nn.Module):
    def __init__(self, ResBlock, num_classes=1000):
        super(ResNet, self).__init__()
        self.inchannel = 64
        self.conv1 = nn.Sequential(
            nn.Conv2d(1, 64, kernel_size=(1, 3), stride=1, padding=(0, 1), bias=False),
            nn.BatchNorm2d(64),
            nn.ReLU()
        )
        self.layer1 = self.make_layer(ResBlock, 64, 2, stride=1)
        self.layer2 = self.make_layer(ResBlock, 128, 2, stride=2)
        self.layer3 = self.make_layer(ResBlock, 256, 2, stride=2)
        self.layer4 = self.make_layer(ResBlock, 512, 2, stride=2)
        self.fc = nn.Linear(15872, num_classes)

    # This function stacks repeated copies of the same residual block
    def make_layer(self, block, channels, num_blocks, stride):
        strides = [stride] + [1] * (num_blocks - 1)
        layers = []
        for stride in strides:
            layers.append(block(self.inchannel, channels, stride))
            self.inchannel = channels
        return nn.Sequential(*layers)

    def forward(self, x):
        # Here the overall structure of ResNet-18 is laid out clearly
        out = self.conv1(x)
        out = self.layer1(out)
        out = self.layer2(out)
        out = self.layer3(out)
        out = self.layer4(out)
        out = F.avg_pool2d(out, kernel_size=(1, 4))
        out = out.view(out.size(0), -1)
        # -1 instead of a hard-coded 1000, so the reshape also works when num_classes != 1000
        out = self.fc(out).reshape((out.shape[0], 1, 1, -1))
        return out
```

How many blocks each stage contains is decided by make_layer:

```python
# This function stacks repeated copies of the same residual block
def make_layer(self, block, channels, num_blocks, stride):
    strides = [stride] + [1] * (num_blocks - 1)
    layers = []
    for stride in strides:
        layers.append(block(self.inchannel, channels, stride))
        self.inchannel = channels
    return nn.Sequential(*layers)
```

The accompanying diagram is roughly as follows (the figure was found online, so its parameters do not match this code, but the structure is the same).
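The stride list built by make_layer is what makes only the first block of a stage downsample, while the remaining blocks keep the feature map size. That logic can be sketched on its own, without any PyTorch:

```python
def stage_strides(num_blocks, stride):
    # The first block uses the given stride (2 when the stage downsamples);
    # all remaining blocks in the stage use stride 1.
    return [stride] + [1] * (num_blocks - 1)

print(stage_strides(2, 2))  # [2, 1]  -> layer2/3/4: first block halves the width
print(stage_strides(2, 1))  # [1, 1]  -> layer1: no downsampling at all
```

This is also why the projection shortcut (the 1×1 conv) only appears once per stage: only the first block changes the stride or the channel count.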
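The hard-coded `in_features=15872` in `self.fc` pins down the expected input size: 15872 = 512 × 31, which works out to an input of shape (N, 1, 1, 992). That width of 992 is my inference from the arithmetic, not something stated in the post. A quick sketch of the size bookkeeping:

```python
def out_width(w, k, s, p):
    # Standard conv/pool output-size formula: floor((w + 2p - k) / s) + 1
    return (w + 2 * p - k) // s + 1

w = 992                          # assumed input width (inferred, not from the post)
# conv1 and layer1 use stride 1 with k=3, p=1, so the width is unchanged.
for _ in range(3):               # layer2..layer4: the first block of each halves the width
    w = out_width(w, k=3, s=2, p=1)
w = out_width(w, k=4, s=4, p=0)  # F.avg_pool2d(kernel_size=(1, 4)); stride defaults to the kernel size
print(512 * w)                   # 15872 -> matches nn.Linear(15872, num_classes)
```

The widths go 992 → 496 → 248 → 124 → 31, and 512 channels × width 31 gives the 15872 features entering the fully connected layer.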