
Prerequisite: [Pytorch] 前向传播和反向传播示例_友人小A的博客-CSDN博客 (a forward/backward propagation walkthrough)

Contents

Introduction

Leaf Nodes

Tensor AutoGrad Functions


Introduction

torch.autograd is PyTorch's automatic differentiation engine, and it is what powers neural network training. torch.autograd requires only minimal changes to existing code: declare the tensors whose gradients should be computed with the attribute requires_grad=True. As of this writing, PyTorch supports autograd only for floating-point tensor types (half, float, double, and bfloat16) and complex tensor types (cfloat, cdouble). [From the official documentation]
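A minimal sketch of that workflow (the tensor names here are illustrative, not from the official docs): once a leaf tensor is created with requires_grad=True, autograd records every operation applied to it, and backward() populates its .grad attribute.

import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)  # user-created leaf tensor
y = (x ** 2).sum()   # forward pass, recorded by autograd
y.backward()         # backward pass: computes dy/dx
print(x.grad)        # tensor([4., 6.]), since dy/dx_i = 2 * x_i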

Leaf Nodes

A leaf node is originally a concept from discrete mathematics: in a tree, a node with no child nodes is called a leaf node, or "leaf" for short. Equivalently, a leaf is a node with out-degree 0, also called a terminal node.

So what counts as a leaf node in PyTorch? Based on the official definition:

  • Every tensor with requires_grad=False is, by convention, a leaf tensor.
  • A tensor with requires_grad=True is a leaf tensor (leaf Tensor) only if it was created by the user, meaning it is not the result of an operation and therefore has grad_fn=None.

Example 1

import torch


def test_training_pipeline2():
    input_data = [[4, 4, 4, 4],
                  [9, 9, 9, 9]]  # 2x4
    input = torch.tensor(input_data, dtype=torch.float32)  # requires_grad=False
    output = torch.sqrt(input)

    target_data = [1, 2, 3, 4]
    target = torch.tensor(target_data, dtype=torch.float32)  # requires_grad=False
    loss_fn = torch.nn.MSELoss()
    # note: target (4,) broadcasts against output (2,4); PyTorch warns about the size mismatch
    loss = loss_fn(input=output, target=target)

    print("\ninput.is_leaf:", input.is_leaf)
    print("output.requires_grad:", output.requires_grad)
    print("output.is_leaf:", output.is_leaf)
    print("target.is_leaf:", target.is_leaf)
    print("loss.requires_grad:", loss.requires_grad)
    print("loss.is_leaf:", loss.is_leaf)
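Since none of these tensors requires grad, autograd tracks nothing: input, output, target, and loss all report is_leaf=True, and both requires_grad values print False.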

Example 2

def test_training_pipeline2():
    input_data = [[4, 4, 4, 4],
                  [9, 9, 9, 9]]  # 2x4
    input = torch.tensor(input_data, dtype=torch.float32)  # requires_grad=False
    output = torch.sqrt(input)
    output.requires_grad_(True)  # flip requires_grad on the already-created tensor

    target_data = [1, 2, 3, 4]
    target = torch.tensor(target_data, dtype=torch.float32)  # requires_grad=False
    loss_fn = torch.nn.MSELoss()
    loss = loss_fn(input=output, target=target)

    print("\ninput.is_leaf:", input.is_leaf)
    print("output.requires_grad:", output.requires_grad)
    print("output.is_leaf:", output.is_leaf)
    print("target.is_leaf:", target.is_leaf)
    print("loss.requires_grad:", loss.requires_grad)
    print("loss.is_leaf:", loss.is_leaf)
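Because input does not require grad, output is created with grad_fn=None; requires_grad_(True) then turns it into a grad-requiring leaf. So output.is_leaf remains True, while loss now reports requires_grad=True and is_leaf=False.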

Example 3

def test_training_pipeline5():
    input = torch.rand(1, requires_grad=True)
    output = torch.unique(input=input, sorted=True, return_inverse=False,
                          return_counts=False, dim=None)
    print("\ninput.is_leaf:", input.is_leaf)
    print("output.requires_grad:", output.requires_grad)
    print("output.is_leaf:", output.is_leaf)
    output.backward()  # fails: torch.unique has no registered derivative
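Here output is a non-leaf (requires_grad=True, is_leaf=False), but torch.unique is not differentiable, so on recent PyTorch builds backward() raises a RuntimeError along the lines of "the derivative for '_unique2' is not implemented".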

Example 4

def test_training_pipeline3():
    input_data = [[4, 4, 4, 4],
                  [9, 9, 9, 9]]  # 2x4
    input_a = torch.tensor(input_data, dtype=torch.float32, requires_grad=True)
    input_b = torch.tensor(input_data, dtype=torch.float32, requires_grad=True)
    output = torch.ne(input_a, input_b)  # element-wise "not equal" -> bool tensor

    print("\ninput_a.is_leaf:", input_a.is_leaf)
    print("input_b.is_leaf:", input_b.is_leaf)
    print("output.dtype:", output.dtype)
    print("output.requires_grad:", output.requires_grad)
    print("output.is_leaf:", output.is_leaf)
    output.backward()   # raises an error
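Comparison ops are not differentiable: output has dtype torch.bool and requires_grad=False (which, by the convention above, makes it a leaf), so backward() raises a RuntimeError because output has no grad_fn.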


Example 5

def test_training_pipeline7():
    input_data = [[4, 4, 4, 4],
                  [9, 9, 9, 9]]  # 2x4
    input_a = torch.tensor(input_data, dtype=torch.float32, requires_grad=True)
    input_b = torch.tensor(input_data, dtype=torch.float32)
    output = torch.add(input_a, input_b)

    print("\ninput_a.requires_grad:", input_a.requires_grad)
    print("input_b.requires_grad:", input_b.requires_grad)
    print("output.requires_grad:", output.requires_grad)
    print("output.is_leaf:", output.is_leaf)

    grad = torch.ones_like(output)
    input_b[0][0] = 10  # fine: input_b does not require grad
    input_a[0][0] = 10  # error: in-place write to a leaf that requires grad
    output.backward(grad)
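The in-place write to input_b succeeds, but the one to input_a fails immediately, before backward() is even reached, with a RuntimeError like "a leaf Variable that requires grad is being used in an in-place operation".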

Example 6

def test_training_pipeline9():
    x = torch.tensor([1.0], requires_grad=True)
    y = x + 2
    z = 2 * y  # <-- dz/dy = 2, a constant: y's value is not saved for backward
    y[0] = -2.0  # in-place write to the non-leaf y
    print("\nx.is_leaf:", x.is_leaf)
    print("y.is_leaf:", y.is_leaf)
    print("z.is_leaf:", z.is_leaf)
    print("\nx.requires_grad:", x.requires_grad)
    print("y.requires_grad:", y.requires_grad)
    print("z.requires_grad:", z.requires_grad)
    z.backward()


def test_training_pipeline9b():  # renamed from test_training_pipeline9 so both variants can coexist
    x = torch.tensor([1.0], requires_grad=True)
    y = x + 2
    z = y * y  # <-- dz/dy = 2*y: y's value IS saved for backward
    y[0] = -2.0  # in-place write bumps y's version counter
    print("\nx.is_leaf:", x.is_leaf)
    print("y.is_leaf:", y.is_leaf)
    print("z.is_leaf:", z.is_leaf)
    print("\nx.requires_grad:", x.requires_grad)
    print("y.requires_grad:", y.requires_grad)
    print("z.requires_grad:", z.requires_grad)
    z.backward()  # error: a saved variable was modified by an in-place operation
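The contrast between the two variants: in the first, dz/dy is the constant 2, so autograd never saves y and the in-place write is harmless; in the second, y is saved for the backward of y * y, the write bumps its version counter, and z.backward() raises a RuntimeError complaining that a variable needed for gradient computation has been modified by an in-place operation.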


Tensor AutoGrad Functions

  1. Tensor.grad

  2. Tensor.requires_grad

  3. Tensor.is_leaf

  4. Tensor.backward(gradient=None, retain_graph=None, create_graph=False)

  5. Tensor.detach()

  6. Tensor.detach_()

  7. Tensor.retain_grad()
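A short sketch of how a few of these fit together (illustrative values, not from the original post): retain_grad() keeps the gradient of a non-leaf tensor, and detach() returns a tensor cut off from the graph.

import torch

x = torch.tensor([1.0], requires_grad=True)  # leaf tensor
y = x * 3                                    # non-leaf: .grad is normally not kept
y.retain_grad()                              # ask autograd to keep y.grad anyway
z = (y ** 2).sum()
z.backward()
print(x.grad)   # tensor([18.]): dz/dx = 2*y * 3 = 18 at x = 1
print(y.grad)   # tensor([6.]):  dz/dy = 2*y = 6
d = y.detach()  # shares data with y, but requires_grad=False and is_leaf=True
print(d.requires_grad, d.is_leaf)  # False True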


