
1. Overview

LDA (Linear Discriminant Analysis) is a supervised dimensionality-reduction and classification technique. Its goal is to find the linear combination of features that best separates two or more classes in a dataset: a lower-dimensional subspace that maximizes the separation between classes while preserving the class-discriminative information.

Because LDA is supervised, it needs the class labels. It looks for the directions (linear combinations of features) in the data space that best distinguish the classes, which it does by maximizing the ratio of between-class variance to within-class variance.

A common question is how LDA differs from PCA: LDA is geared toward classification and maximizing class separation, whereas PCA is unsupervised and more general, focusing on capturing the overall variance of the data.

Let's walk through the steps involved in computing LDA for a dataset.

2. Step 1: Data Collection

We start with a dataset containing samples from two classes (Class A and Class B), each described by two features (Feature 1 and Feature 2). Class A consists of the samples (2.0, 3.5), (1.5, 2.5), (3.0, 4.0), (2.5, 3.8), and Class B of the samples (3.5, 2.0), (4.0, 2.7), (5.0, 3.0), (4.5, 2.5); these are the values used in the calculations below.
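As a minimal sketch (using the sample values listed above and NumPy, which the code later in this article also assumes), the dataset can be written as two arrays:

import numpy as np

# Class A samples: columns are Feature 1 and Feature 2
X_A = np.array([[2.0, 3.5],
                [1.5, 2.5],
                [3.0, 4.0],
                [2.5, 3.8]])

# Class B samples
X_B = np.array([[3.5, 2.0],
                [4.0, 2.7],
                [5.0, 3.0],
                [4.5, 2.5]])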

3. Step 2: Compute the Class Means

Compute the mean vector of each class (Class A and Class B) over both features.

Mean vector for Class A:

Mean(Feature 1) = (2.0 + 1.5 + 3.0 + 2.5) / 4 = 2.25
Mean(Feature 2) = (3.5 + 2.5 + 4.0 + 3.8) / 4 = 3.45

Mean vector for Class B:

Mean(Feature 1) = (3.5 + 4.0 + 5.0 + 4.5) / 4 = 4.25
Mean(Feature 2) = (2.0 + 2.7 + 3.0 + 2.5) / 4 = 2.55
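These values are easy to check in code (a sketch reusing the X_A and X_B arrays defined in Step 1):

import numpy as np

X_A = np.array([[2.0, 3.5], [1.5, 2.5], [3.0, 4.0], [2.5, 3.8]])
X_B = np.array([[3.5, 2.0], [4.0, 2.7], [5.0, 3.0], [4.5, 2.5]])

mu_A = X_A.mean(axis=0)   # -> [2.25, 3.45]
mu_B = X_B.mean(axis=0)   # -> [4.25, 2.55]
print(mu_A, mu_B)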

4. Step 3: Compute the Within-Class Scatter Matrix (SW)

SW denotes the within-class scatter matrix of the dataset. It measures how the data spreads out inside each class.

To compute SW, you compute the scatter matrix of each class separately and then add them together. Let's work through Class A step by step:

1. Denote the mean vector of Class A, which we already computed, as μ_A:

μ_A = [Mean(Feature 1), Mean(Feature 2)] = [2.25, 3.45]

2. Compute the scatter matrix of Class A

The scatter matrix of Class A is the sum of the outer products of the centered samples:

Scatter Matrix for Class A (S_A) = Σ [(x - μ_A) * (x - μ_A)^T]

(Dividing each class scatter matrix by (n - 1) would give the per-class covariance matrix instead; this only rescales SW and does not change the discriminant directions found later.)

Sample 1 (Class A): Feature 1 = 2.0, Feature 2 = 3.5
Diff1 = 2.0 - 2.25 = -0.25
Diff2 = 3.5 - 3.45 = 0.05
S1_A = [[0.0625, -0.0125], [-0.0125, 0.0025]]

Sample 2 (Class A): Feature 1 = 1.5, Feature 2 = 2.5
Diff1 = 1.5 - 2.25 = -0.75
Diff2 = 2.5 - 3.45 = -0.95
S2_A = [[0.5625, 0.7125], [0.7125, 0.9025]]

Sample 3 (Class A): Feature 1 = 3.0, Feature 2 = 4.0
Diff1 = 3.0 - 2.25 = 0.75
Diff2 = 4.0 - 3.45 = 0.55
S3_A = [[0.5625, 0.4125], [0.4125, 0.3025]]

Sample 4 (Class A): Feature 1 = 2.5, Feature 2 = 3.8
Diff1 = 2.5 - 2.25 = 0.25
Diff2 = 3.8 - 3.45 = 0.35
S4_A = [[0.0625, 0.0875], [0.0875, 0.1225]]

Each per-sample matrix has the layout:

| Diff1 * Diff1    Diff1 * Diff2 |
| Diff2 * Diff1    Diff2 * Diff2 |

Summing the four per-sample matrices gives the scatter matrix of Class A:

S_A = S1_A + S2_A + S3_A + S4_A
    = [[1.25, 1.20], [1.20, 1.33]]

In the same way we can compute the scatter matrix of Class B, using its mean vector μ_B = [4.25, 2.55]:

Sample 5 (Class B): Feature 1 = 3.5, Feature 2 = 2.0
Diff1 = 3.5 - 4.25 = -0.75
Diff2 = 2.0 - 2.55 = -0.55
S5_B = [[0.5625, 0.4125], [0.4125, 0.3025]]

Sample 6 (Class B): Feature 1 = 4.0, Feature 2 = 2.7
Diff1 = 4.0 - 4.25 = -0.25
Diff2 = 2.7 - 2.55 = 0.15
S6_B = [[0.0625, -0.0375], [-0.0375, 0.0225]]

Sample 7 (Class B): Feature 1 = 5.0, Feature 2 = 3.0
Diff1 = 5.0 - 4.25 = 0.75
Diff2 = 3.0 - 2.55 = 0.45
S7_B = [[0.5625, 0.3375], [0.3375, 0.2025]]

Sample 8 (Class B): Feature 1 = 4.5, Feature 2 = 2.5
Diff1 = 4.5 - 4.25 = 0.25
Diff2 = 2.5 - 2.55 = -0.05
S8_B = [[0.0625, -0.0125], [-0.0125, 0.0025]]

S_B = S5_B + S6_B + S7_B + S8_B
    = [[1.25, 0.70], [0.70, 0.53]]

Finally, compute the within-class scatter matrix (SW) by adding the two class scatter matrices:

Within-Class Scatter Matrix (SW)
S_W = S_A + S_B
    = [[1.25 + 1.25, 1.20 + 0.70], [1.20 + 0.70, 1.33 + 0.53]]
    = [[2.50, 1.90], [1.90, 1.86]]
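The same computation in NumPy (a sketch mirroring the per-sample outer products above; the helper function name is just for illustration):

import numpy as np

X_A = np.array([[2.0, 3.5], [1.5, 2.5], [3.0, 4.0], [2.5, 3.8]])
X_B = np.array([[3.5, 2.0], [4.0, 2.7], [5.0, 3.0], [4.5, 2.5]])
mu_A, mu_B = X_A.mean(axis=0), X_B.mean(axis=0)

def scatter(X, mu):
    # Sum of outer products of the centered samples
    return sum(np.outer(x - mu, x - mu) for x in X)

S_A = scatter(X_A, mu_A)   # [[1.25, 1.20], [1.20, 1.33]]
S_B = scatter(X_B, mu_B)   # [[1.25, 0.70], [0.70, 0.53]]
S_W = S_A + S_B            # [[2.50, 1.90], [1.90, 1.86]]
print(S_W)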

5. Step 4: Compute Eigenvalues and Eigenvectors

LDA finds its directions from the eigen-decomposition of S_W^(-1) * S_b, where S_b is the between-class scatter matrix (not to be confused with the Class B scatter matrix S_B from the previous step). For two classes, S_b can be built directly from the class means:

S_b = (μ_A - μ_B) * (μ_A - μ_B)^T = [[4.00, -1.80], [-1.80, 0.81]]

Computationally, the eigenvalues and eigenvectors can then be found with:

# Compute eigenvalues and eigenvectors of S_W^(-1) * S_b
eigenvalues, eigenvectors = np.linalg.eig(np.linalg.inv(S_W) @ S_b)

Mathematically, this amounts to:

1. Invert S_W:

Compute the inverse of the within-class scatter matrix, denoted S_W^(-1). If S_W is not invertible, the pseudo-inverse can be used instead.

S_W^(-1) = | 1.7885   -1.8269 |
           | -1.8269   2.4038 |

2. Multiply S_W^(-1) by S_b:

S_W^(-1) * S_b = | 10.4423   -4.6990 |
                 | -11.6346   5.2356 |

3. To find the eigenvalues (λ), solve the characteristic equation:

|S_W^(-1) * S_b - λ * I| = 0

where I is the identity matrix:

| 10.4423 - λ    -4.6990     |
| -11.6346        5.2356 - λ |

Since S_b has rank 1, one eigenvalue is 0 and the other equals the trace, λ ≈ 15.68. The corresponding eigenvector v is found by substituting each eigenvalue into S_W^(-1) * S_b * v = λ * v and solving for v.
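Putting this step together numerically (a sketch; S_W is taken from Step 3, and the ordering of the eigenvalues returned by np.linalg.eig is not guaranteed):

import numpy as np

mu_A = np.array([2.25, 3.45])
mu_B = np.array([4.25, 2.55])
S_W = np.array([[2.50, 1.90],
                [1.90, 1.86]])

# Between-class scatter for the two-class case
d = (mu_A - mu_B).reshape(-1, 1)   # [[-2.0], [0.9]]
S_b = d @ d.T                      # [[4.0, -1.8], [-1.8, 0.81]]

M = np.linalg.inv(S_W) @ S_b
eigenvalues, eigenvectors = np.linalg.eig(M)
print(eigenvalues)    # one eigenvalue is ~15.68, the other ~0 (S_b has rank 1)
print(eigenvectors)   # columns are the corresponding eigenvectors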

6. Step 5: Sort the Eigenvalues

Once the eigenvalues are obtained, sort them in descending order to rank their importance. Mathematically:

Let the eigenvalues be λ_1, λ_2, ..., λ_p (where p is the number of features, i.e. dimensions).

Sort the eigenvalues in descending order:


λ_1 >= λ_2 >= ... >= λ_p
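In code, the sorting step also has to keep each eigenvector paired with its eigenvalue. A sketch (the numeric values below are just the approximate results of Step 4, standing in for whatever np.linalg.eig actually returned):

import numpy as np

# Stand-in values for the output of np.linalg.eig (its ordering is arbitrary)
eigenvalues = np.array([0.0, 15.68])
eigenvectors = np.array([[ 0.41, -0.67],
                         [ 0.91,  0.74]])   # column i pairs with eigenvalues[i]

order = np.argsort(eigenvalues)[::-1]        # indices sorted by eigenvalue, largest first
eigenvalues_sorted = eigenvalues[order]
eigenvectors_sorted = eigenvectors[:, order] # reorder columns to stay paired
print(eigenvalues_sorted)                    # [15.68, 0.0]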

7. Step 6: Select Components

Now select the top k eigenvalues, where k is the desired reduced dimensionality. Denote the k largest eigenvalues as:

λ_1, λ_2, ..., λ_k

These k eigenvalues (and their eigenvectors) represent the most important discriminant directions in the data, as the sketch below illustrates.
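Note that LDA yields at most (number of classes - 1) meaningful directions, so for this two-class example k = 1. A sketch of the selection step (the matrix below repeats the sorted eigenvectors from Step 5):

import numpy as np

eigenvectors_sorted = np.array([[-0.67, 0.41],
                                [ 0.74, 0.91]])   # columns sorted by eigenvalue (approximate values)

k = 1                                # at most (number of classes - 1) useful directions
W = eigenvectors_sorted[:, :k]       # 2 x 1 projection matrix
print(W)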

8. Step 7: Project the Data

After selecting the k eigenvalues, use the corresponding k eigenvectors to project the original data onto the new lower-dimensional space. This step reduces the dimensionality of the data while preserving the most discriminative information.

In other words, the eigenvectors selected in LDA form the projection matrix W, and the projected data is obtained as X_projected = X * W.
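The sketch below ties the whole walkthrough together and, assuming scikit-learn is available, cross-checks the result against its LinearDiscriminantAnalysis (the projected values may differ by sign, scale, and offset, but the class separation is the same):

import numpy as np

X_A = np.array([[2.0, 3.5], [1.5, 2.5], [3.0, 4.0], [2.5, 3.8]])
X_B = np.array([[3.5, 2.0], [4.0, 2.7], [5.0, 3.0], [4.5, 2.5]])
X = np.vstack([X_A, X_B])
y = np.array([0] * 4 + [1] * 4)

mu_A, mu_B = X_A.mean(axis=0), X_B.mean(axis=0)
S_W = sum(np.outer(x - mu_A, x - mu_A) for x in X_A) + \
      sum(np.outer(x - mu_B, x - mu_B) for x in X_B)
d = (mu_A - mu_B).reshape(-1, 1)
S_b = d @ d.T

eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_b)
w = eigvecs[:, np.argmax(eigvals)]   # leading discriminant direction
X_proj = X @ w                       # 1-D projection of the data
print(X_proj)

# Optional cross-check with scikit-learn
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
lda = LinearDiscriminantAnalysis(n_components=1)
print(lda.fit_transform(X, y).ravel())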


文章转载自:
http://dinncocecopexy.ssfq.cn
http://dinncoegged.ssfq.cn
http://dinncooverzealous.ssfq.cn
http://dinncopentagram.ssfq.cn
http://dinncocrockpot.ssfq.cn
http://dinncohurtfully.ssfq.cn
http://dinncoindubitability.ssfq.cn
http://dinncolycurgan.ssfq.cn
http://dinncokarakule.ssfq.cn
http://dinncodesmolase.ssfq.cn
http://dinncocoattail.ssfq.cn
http://dinncomatriculant.ssfq.cn
http://dinncobase.ssfq.cn
http://dinncoisogyre.ssfq.cn
http://dinncoinstantaneous.ssfq.cn
http://dinncojohnsonian.ssfq.cn
http://dinncohiroshima.ssfq.cn
http://dinncodownhaul.ssfq.cn
http://dinncoimpubic.ssfq.cn
http://dinncoallosaur.ssfq.cn
http://dinncohaycock.ssfq.cn
http://dinncoglamorous.ssfq.cn
http://dinncodelusory.ssfq.cn
http://dinncograssland.ssfq.cn
http://dinncoaxone.ssfq.cn
http://dinncooreide.ssfq.cn
http://dinncosogat.ssfq.cn
http://dinncogregarization.ssfq.cn
http://dinncounmoor.ssfq.cn
http://dinncoreasoning.ssfq.cn
http://dinncochemosorb.ssfq.cn
http://dinncodowd.ssfq.cn
http://dinncoaspartokinase.ssfq.cn
http://dinncothyrsi.ssfq.cn
http://dinncoabstersion.ssfq.cn
http://dinncogoanese.ssfq.cn
http://dinncodetoxicate.ssfq.cn
http://dinncoclobber.ssfq.cn
http://dinncocontrolling.ssfq.cn
http://dinncofustiness.ssfq.cn
http://dinncoerasable.ssfq.cn
http://dinncokindliness.ssfq.cn
http://dinncosilicomanganese.ssfq.cn
http://dinncodivisiory.ssfq.cn
http://dinncoocclusal.ssfq.cn
http://dinncotoby.ssfq.cn
http://dinncosubsegment.ssfq.cn
http://dinncounrough.ssfq.cn
http://dinncoaccentuate.ssfq.cn
http://dinncocartage.ssfq.cn
http://dinncolaconic.ssfq.cn
http://dinncoanthomaniac.ssfq.cn
http://dinncopersecute.ssfq.cn
http://dinncoanatoxin.ssfq.cn
http://dinncotoryfy.ssfq.cn
http://dinncoheadage.ssfq.cn
http://dinncofortnightly.ssfq.cn
http://dinncophot.ssfq.cn
http://dinnconumen.ssfq.cn
http://dinncovocoder.ssfq.cn
http://dinncobedgown.ssfq.cn
http://dinncostepbrother.ssfq.cn
http://dinncosemiabstract.ssfq.cn
http://dinncoantiquer.ssfq.cn
http://dinncocotta.ssfq.cn
http://dinncotoiler.ssfq.cn
http://dinncocuriously.ssfq.cn
http://dinncocounterdrain.ssfq.cn
http://dinncogroundnut.ssfq.cn
http://dinncosafecracking.ssfq.cn
http://dinncocloistress.ssfq.cn
http://dinncoimprisonable.ssfq.cn
http://dinncoembrittle.ssfq.cn
http://dinncohemispherical.ssfq.cn
http://dinncoheterotroph.ssfq.cn
http://dinncoprefab.ssfq.cn
http://dinncoought.ssfq.cn
http://dinncoshillelagh.ssfq.cn
http://dinncobasal.ssfq.cn
http://dinnconereid.ssfq.cn
http://dinncopodalgia.ssfq.cn
http://dinncokiddie.ssfq.cn
http://dinncomulligan.ssfq.cn
http://dinncoguggle.ssfq.cn
http://dinncostaffman.ssfq.cn
http://dinncofrizzle.ssfq.cn
http://dinncoauteur.ssfq.cn
http://dinncopractically.ssfq.cn
http://dinncobrusque.ssfq.cn
http://dinncoantimony.ssfq.cn
http://dinncodejeuner.ssfq.cn
http://dinncoroyalties.ssfq.cn
http://dinncocolliery.ssfq.cn
http://dinncowallaby.ssfq.cn
http://dinncophosphagen.ssfq.cn
http://dinncoepical.ssfq.cn
http://dinncoligation.ssfq.cn
http://dinncohiccup.ssfq.cn
http://dinncobentonitic.ssfq.cn
http://dinncomilitant.ssfq.cn
http://www.dinnco.com/news/116531.html

相关文章:

  • h5做的网站如何连接数据库刷赞网站推广免费链接
  • wordpress网站go.php跳转今日热点新闻事件
  • 具有品牌的做网站免费b站在线观看人数在哪
  • 织梦网站会员中心模板如何在网上推广自己的产品
  • 营销网站建设培训网站模板源码
  • 免费的网站模板有哪些市场营销四大基本策略
  • 中国营销新闻网标题优化seo
  • 专业做房地产网站建设建站工具
  • 设计需求网站安徽建站
  • 网站建设与管理试题 答案东莞网络营销网络推广系统
  • 芜湖先锋网站两学一做网站关键词优化培训
  • 网站登录按钮怎么做连云港seo优化
  • 什么是网络营销它的特点有哪些seo权重查询
  • 深圳营业执照代办页优化软件
  • 松原网站建设哪家好百度指数有哪些功能
  • 温州龙湾区企业网站搭建价格线下推广方法有哪些
  • 东莞站福公司工资网站排名分析
  • 如何申请公众号aso优化推广
  • 最专业微网站建设公司厦门关键词排名seo
  • 天津做网站多少钱湖南网站建设seo
  • 亿牛网如何优化关键词搜索排名
  • 做网站做注册登录的难点好的建站网站
  • 汝州网站建设淘宝seo具体优化方法
  • 海南房产网站建设优化大师免费安装下载
  • 网站服务器崩溃影响微信广告朋友圈投放
  • 自己做一个模版网站是怎么做的英雄联盟世界排名
  • 网站新闻不添加关键词超链接对优化有影响吗网站建设公司地址在哪
  • 做破解软件网站赚广告费网盟推广是什么意思
  • 建设一个公司的网站需要多少钱如何推广自己的店铺?
  • 做php网站阿里云服务器手机优化软件哪个好用