TensorFlow - Word Embedding

Word embedding is the concept of mapping discrete objects, such as words, to vectors of real numbers; it converts discrete input objects into useful vector representations.

Word embeddings look like the following:

blue: (0.01359, 0.00075997, 0.24608, ..., -0.2524, 1.0048, 0.06259)
blues: (0.01396, 0.11887, -0.48963, ..., 0.033483, -0.10007, 0.1158)
orange: (-0.24776, -0.12359, 0.20986, ..., 0.079717, 0.23865, -0.014213)
oranges: (-0.35609, 0.21854, 0.080944, ..., -0.35413, 0.38511, -0.070976)
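
Because words become vectors, their similarity can be measured geometrically: related words such as "blue" and "blues" should end up close together. Below is a minimal sketch of that idea using cosine similarity; the toy 3-dimensional vectors are made up for illustration, not the real embeddings shown above.

import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors; 1.0 means the same direction.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical toy embeddings (illustrative values only).
blue   = np.array([0.1, 0.9, 0.3])
blues  = np.array([0.2, 0.8, 0.4])
orange = np.array([-0.7, 0.1, -0.5])

print(cosine_similarity(blue, blues))   # high: related words
print(cosine_similarity(blue, orange))  # lower: unrelated words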

Word2vec

Word2vec is the most common approach for learning embeddings without supervision. It trains the model so that a given input word predicts the word's context, using the skip-gram formulation.
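
For intuition, a skip-gram model turns every sentence into (target, context) training pairs, pairing each word with its neighbors inside a small window. A quick standalone sketch with window size 1 (made-up sentence, independent of the full example below):

sentence = "one three five".split()
pairs = []
for i, target in enumerate(sentence):
    for j in (i - 1, i + 1):            # neighbors inside the window
        if 0 <= j < len(sentence):
            pairs.append((target, sentence[j]))
print(pairs)
# [('one', 'three'), ('three', 'one'), ('three', 'five'), ('five', 'three')]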

TensorFlow provides several ways to implement such a model, with increasing levels of sophistication and optimization, using multithreading concepts and higher-level abstractions.
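
The core building block is tf.nn.embedding_lookup, which treats a variable as a table of row vectors and gathers the rows for the requested word indices. A minimal TensorFlow 1.x sketch of just this step, with a made-up toy table:

import tensorflow as tf

# A toy lookup table: 5 "words", 3-dimensional embeddings.
table = tf.constant([[0.0, 0.1, 0.2],
                     [1.0, 1.1, 1.2],
                     [2.0, 2.1, 2.2],
                     [3.0, 3.1, 3.2],
                     [4.0, 4.1, 4.2]])
ids = tf.constant([1, 3])                  # word indices to look up
rows = tf.nn.embedding_lookup(table, ids)  # selects rows 1 and 3

with tf.Session() as sess:
    print(sess.run(rows))  # [[1.  1.1 1.2], [3.  3.1 3.2]]

The full example below uses the same lookup, but makes the table a trainable variable and learns it with a noise-contrastive estimation (NCE) loss.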

import os
import math
import numpy as np
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

batch_size = 64
embedding_dimension = 5
negative_samples = 8
LOG_DIR = "logs/word2vec_intro"

digit_to_word_map = {1: "One", 2: "Two", 3: "Three", 4: "Four", 5: "Five",
                     6: "Six", 7: "Seven", 8: "Eight", 9: "Nine"}
sentences = []

# Create two kinds of sentences - sequences of odd and even digits.
for i in range(10000):
    rand_odd_ints = np.random.choice(range(1, 10, 2), 3)
    sentences.append(" ".join([digit_to_word_map[r] for r in rand_odd_ints]))
    rand_even_ints = np.random.choice(range(2, 10, 2), 3)
    sentences.append(" ".join([digit_to_word_map[r] for r in rand_even_ints]))

# Map words to indices
word2index_map = {}
index = 0
for sent in sentences:
    for word in sent.lower().split():
        if word not in word2index_map:
            word2index_map[word] = index
            index += 1

index2word_map = {index: word for word, index in word2index_map.items()}
vocabulary_size = len(index2word_map)

# Generate skip-gram pairs
skip_gram_pairs = []
for sent in sentences:
    tokenized_sent = sent.lower().split()
    for i in range(1, len(tokenized_sent) - 1):
        word_context_pair = [[word2index_map[tokenized_sent[i - 1]],
                              word2index_map[tokenized_sent[i + 1]]],
                             word2index_map[tokenized_sent[i]]]
        skip_gram_pairs.append([word_context_pair[1], word_context_pair[0][0]])
        skip_gram_pairs.append([word_context_pair[1], word_context_pair[0][1]])

def get_skipgram_batch(batch_size):
    instance_indices = list(range(len(skip_gram_pairs)))
    np.random.shuffle(instance_indices)
    batch = instance_indices[:batch_size]
    x = [skip_gram_pairs[i][0] for i in batch]
    y = [[skip_gram_pairs[i][1]] for i in batch]
    return x, y

# Example batch (in a notebook these expressions would display their values)
x_batch, y_batch = get_skipgram_batch(8)
print(x_batch)
print(y_batch)
print([index2word_map[word] for word in x_batch])
print([index2word_map[word[0]] for word in y_batch])

# Input data, labels
train_inputs = tf.placeholder(tf.int32, shape=[batch_size])
train_labels = tf.placeholder(tf.int32, shape=[batch_size, 1])

# The embedding lookup table is currently only implemented on CPU
with tf.name_scope("embeddings"):
    embeddings = tf.Variable(
        tf.random_uniform([vocabulary_size, embedding_dimension], -1.0, 1.0),
        name='embedding')
    # This is essentially a lookup table
    embed = tf.nn.embedding_lookup(embeddings, train_inputs)

# Create variables for the NCE loss
nce_weights = tf.Variable(
    tf.truncated_normal([vocabulary_size, embedding_dimension],
                        stddev=1.0 / math.sqrt(embedding_dimension)))
nce_biases = tf.Variable(tf.zeros([vocabulary_size]))

loss = tf.reduce_mean(
    tf.nn.nce_loss(weights=nce_weights, biases=nce_biases, inputs=embed,
                   labels=train_labels, num_sampled=negative_samples,
                   num_classes=vocabulary_size))
tf.summary.scalar("NCE_loss", loss)

# Learning-rate decay
global_step = tf.Variable(0, trainable=False)
learningRate = tf.train.exponential_decay(learning_rate=0.1,
                                          global_step=global_step,
                                          decay_steps=1000,
                                          decay_rate=0.95,
                                          staircase=True)
train_step = tf.train.GradientDescentOptimizer(learningRate).minimize(loss)
merged = tf.summary.merge_all()

with tf.Session() as sess:
    train_writer = tf.summary.FileWriter(LOG_DIR,
                                         graph=tf.get_default_graph())
    saver = tf.train.Saver()

    with open(os.path.join(LOG_DIR, 'metadata.tsv'), "w") as metadata:
        metadata.write('Name\tClass\n')
        for k, v in index2word_map.items():
            metadata.write('%s\t%d\n' % (v, k))

    config = projector.ProjectorConfig()
    embedding = config.embeddings.add()
    embedding.tensor_name = embeddings.name
    # Link this tensor to its metadata file (e.g. labels).
    embedding.metadata_path = os.path.join(LOG_DIR, 'metadata.tsv')
    projector.visualize_embeddings(train_writer, config)
    tf.global_variables_initializer().run()

    for step in range(1000):
        x_batch, y_batch = get_skipgram_batch(batch_size)
        summary, _ = sess.run([merged, train_step],
                              feed_dict={train_inputs: x_batch,
                                         train_labels: y_batch})
        train_writer.add_summary(summary, step)

        if step % 100 == 0:
            saver.save(sess, os.path.join(LOG_DIR, "w2v_model.ckpt"), step)
            loss_value = sess.run(loss,
                                  feed_dict={train_inputs: x_batch,
                                             train_labels: y_batch})
            print("Loss at %d: %.5f" % (step, loss_value))

    # Normalize the embeddings before using them
    norm = tf.sqrt(tf.reduce_sum(tf.square(embeddings), 1, keep_dims=True))
    normalized_embeddings = embeddings / norm
    normalized_embeddings_matrix = sess.run(normalized_embeddings)

    ref_word = normalized_embeddings_matrix[word2index_map["one"]]
    cosine_dists = np.dot(normalized_embeddings_matrix, ref_word)
    ff = np.argsort(cosine_dists)[::-1][1:10]
    for f in ff:
        print(index2word_map[f])
        print(cosine_dists[f])

The above code generates the following output:

[Figure: Word2vec output]
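
Because the script writes summaries, checkpoints, and projector metadata to LOG_DIR, the learned embeddings can also be explored interactively in TensorBoard's Projector tab by running tensorboard --logdir logs/word2vec_intro from the shell (assuming TensorBoard is installed alongside TensorFlow 1.x).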
