
SAConv Convolution

The SAConv convolution module is a higher-accuracy, faster "plug-and-play" convolution. Many methods have been proposed to reduce model redundancy and accelerate inference, but they tend to focus on removing unimportant filters or building efficient compute units, while overlooking the pattern redundancy inside the features themselves.
Original paper: Split to Be Slim: An Overlooked Redundancy in Vanilla Convolution

Many features within the same layer show similar yet not identical patterns, and it is hard to tell whether such similar-pattern features are redundant or carry important details. Therefore, instead of directly removing features whose redundancy is uncertain, the paper proposes a split-based convolution unit, called SPConv, which tolerates this similar-pattern redundancy while requiring very little computation.

SPConv structure diagram

First, the input features are split into a representative part and an uncertain part. The representative part is processed with relatively heavy operations to extract the important information, while the uncertain part is processed with lightweight operations to extract the hidden details. Finally, a parameter-free feature-fusion module recalibrates and fuses the two groups of features. The proposed SPConv is a plug-and-play module that can replace the vanilla convolution in existing networks; a simplified sketch of this split-and-fuse idea is given below.
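The following is a minimal, illustrative sketch of the split-and-fuse idea, not the paper's exact implementation: the class name SPConvSketch, the ratio argument, and the single 3x3 / 1x1 paths are assumptions made here for clarity (the paper additionally combines group-wise and point-wise convolution on the representative part).

import torch
import torch.nn as nn
import torch.nn.functional as F

class SPConvSketch(nn.Module):
    """Sketch: heavy path for representative channels, light path for
    uncertain channels, parameter-free fusion of the two outputs."""
    def __init__(self, in_channels, out_channels, ratio=0.5):  # ratio is an assumed split hyper-parameter
        super().__init__()
        self.rep_ch = int(in_channels * ratio)      # representative part
        self.unc_ch = in_channels - self.rep_ch     # uncertain part
        self.heavy = nn.Conv2d(self.rep_ch, out_channels, 3, padding=1, bias=False)  # heavier 3x3 path
        self.light = nn.Conv2d(self.unc_ch, out_channels, 1, bias=False)             # lightweight 1x1 path

    def forward(self, x):
        x_rep, x_unc = torch.split(x, [self.rep_ch, self.unc_ch], dim=1)
        y_rep = self.heavy(x_rep)
        y_unc = self.light(x_unc)
        # parameter-free fusion: softmax over the global-average-pooled responses of the two branches
        s = torch.stack([F.adaptive_avg_pool2d(y_rep, 1),
                         F.adaptive_avg_pool2d(y_unc, 1)], dim=0)  # (2, B, C, 1, 1)
        w = torch.softmax(s, dim=0)
        return w[0] * y_rep + w[1] * y_unc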

Without any bells and whistles, SPConv-based networks achieve SOTA accuracy and GPU inference speed. The main contributions are:
(1) Feature redundancy in vanilla convolution is rethought: the input is split into a representative part and an uncertain part, and a different information-extraction path is applied to each;
(2) A plug-and-play SPConv module is designed that can seamlessly replace vanilla convolution in existing networks, potentially exceeding SOTA accuracy and GPU inference speed while using fewer FLOPs and parameters.

Code Implementation

import torch
import torch.nn as nn


def autopad(k, p=None):
    # YOLOv5-style padding helper assumed by SAConv2d: 'same' padding for a given kernel size
    if p is None:
        p = k // 2 if isinstance(k, int) else [x // 2 for x in k]
    return p


class ConvAWS2d(nn.Conv2d):
    # Conv2d with adaptive weight standardization: weights are normalized to zero mean / unit std,
    # then rescaled by the per-channel buffers weight_gamma and weight_beta.
    def __init__(self, in_channels, out_channels, kernel_size, stride=1,
                 padding=0, dilation=1, groups=1, bias=True):
        super().__init__(in_channels, out_channels, kernel_size, stride=stride,
                         padding=padding, dilation=dilation, groups=groups, bias=bias)
        self.register_buffer('weight_gamma', torch.ones(self.out_channels, 1, 1, 1))
        self.register_buffer('weight_beta', torch.zeros(self.out_channels, 1, 1, 1))

    def _get_weight(self, weight):
        weight_mean = weight.mean(dim=1, keepdim=True).mean(dim=2, keepdim=True).mean(dim=3, keepdim=True)
        weight = weight - weight_mean
        std = torch.sqrt(weight.view(weight.size(0), -1).var(dim=1) + 1e-5).view(-1, 1, 1, 1)
        weight = weight / std
        weight = self.weight_gamma * weight + self.weight_beta
        return weight

    def forward(self, x):
        weight = self._get_weight(self.weight)
        return super()._conv_forward(x, weight, None)

    def _load_from_state_dict(self, state_dict, prefix, local_metadata, strict,
                              missing_keys, unexpected_keys, error_msgs):
        # If the checkpoint does not contain weight_gamma/weight_beta, recover them from the raw loaded weights.
        self.weight_gamma.data.fill_(-1)
        super()._load_from_state_dict(state_dict, prefix, local_metadata, strict,
                                      missing_keys, unexpected_keys, error_msgs)
        if self.weight_gamma.data.mean() > 0:
            return
        weight = self.weight.data
        weight_mean = weight.mean(dim=1, keepdim=True).mean(dim=2, keepdim=True).mean(dim=3, keepdim=True)
        self.weight_beta.data.copy_(weight_mean)
        std = torch.sqrt(weight.view(weight.size(0), -1).var(dim=1) + 1e-5).view(-1, 1, 1, 1)
        self.weight_gamma.data.copy_(std)


class SAConv2d(ConvAWS2d):
    def __init__(self, in_channels, out_channels, kernel_size, s=1, p=None,
                 g=1, d=1, act=True, bias=True):
        super().__init__(in_channels, out_channels, kernel_size, stride=s,
                         padding=autopad(kernel_size, p), dilation=d, groups=g, bias=bias)
        # per-pixel switch that blends the small- and large-dilation branches
        self.switch = torch.nn.Conv2d(self.in_channels, 1, kernel_size=1, stride=s, bias=True)
        self.switch.weight.data.fill_(0)
        self.switch.bias.data.fill_(1)
        # learnable difference added to the weights of the large-dilation branch
        self.weight_diff = torch.nn.Parameter(torch.Tensor(self.weight.size()))
        self.weight_diff.data.zero_()
        self.pre_context = torch.nn.Conv2d(self.in_channels, self.in_channels, kernel_size=1, bias=True)
        self.pre_context.weight.data.fill_(0)
        self.pre_context.bias.data.fill_(0)
        self.post_context = torch.nn.Conv2d(self.out_channels, self.out_channels, kernel_size=1, bias=True)
        self.post_context.weight.data.fill_(0)
        self.post_context.bias.data.fill_(0)
        self.bn = nn.BatchNorm2d(out_channels)
        self.act = nn.SiLU() if act is True else (act if isinstance(act, nn.Module) else nn.Identity())

    def forward(self, x):
        # pre-context: add a global-context vector to the input
        avg_x = torch.nn.functional.adaptive_avg_pool2d(x, output_size=1)
        avg_x = self.pre_context(avg_x)
        avg_x = avg_x.expand_as(x)
        x = x + avg_x
        # switch: predicted from a locally averaged version of the input
        avg_x = torch.nn.functional.pad(x, pad=(2, 2, 2, 2), mode="reflect")
        avg_x = torch.nn.functional.avg_pool2d(avg_x, kernel_size=5, stride=1, padding=0)
        switch = self.switch(avg_x)
        # sac: same standardized weights at the original dilation and at 3x dilation
        weight = self._get_weight(self.weight)
        out_s = super()._conv_forward(x, weight, None)
        ori_p = self.padding
        ori_d = self.dilation
        self.padding = tuple(3 * p for p in self.padding)
        self.dilation = tuple(3 * d for d in self.dilation)
        weight = weight + self.weight_diff
        out_l = super()._conv_forward(x, weight, None)
        out = switch * out_s + (1 - switch) * out_l
        self.padding = ori_p
        self.dilation = ori_d
        # post-context: add a global-context vector to the output
        avg_x = torch.nn.functional.adaptive_avg_pool2d(out, output_size=1)
        avg_x = self.post_context(avg_x)
        avg_x = avg_x.expand_as(out)
        out = out + avg_x
        return self.act(self.bn(out))
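As a quick sanity check (assuming the imports and the autopad helper shown above), SAConv2d can be used as a drop-in replacement for a standard convolution layer; with stride 1 the spatial size is preserved:

x = torch.randn(1, 64, 32, 32)                 # dummy feature map
conv = SAConv2d(64, 128, kernel_size=3, s=1)
y = conv(x)
print(y.shape)                                 # expected: torch.Size([1, 128, 32, 32])

Internally, the forward pass runs the same standardized weights twice, once at the original dilation and once at 3x dilation (with the learned weight_diff added), and blends the two results per pixel via the switch branch before batch norm and activation.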
