
Table of Contents

    • 1. Complete Code
    • 2. Step-by-Step Implementation
      • 2.1 Imports
      • 2.2 Data Preparation
      • 2.3 Character Tokenization
      • 2.4 Building the Dataset
      • 2.5 Defining the Model
      • 2.6 Model Training
      • 2.7 Model Inference
    • 3. Overall Summary

Text generation with an RNN and Unicode tokenization

1. Complete Code

Here we use TensorFlow for the implementation; the full code is as follows:

# Full code
import tensorflow as tf
import keras_nlp
import numpy as np

tokenizer = keras_nlp.tokenizers.UnicodeCodepointTokenizer(vocabulary_size=400)

# tokens -> ids
ids = tokenizer(['Why are you so funny?', 'how can i get you'])
# ids -> tokens
tokenizer.detokenize(ids)

def split_input_target(sequence):
    input_text = sequence[:-1]
    target_text = sequence[1:]
    return input_text, target_text

# Prepare the data
text = open('./shakespeare.txt', 'rb').read().decode(encoding='utf-8')
dataset = tf.data.Dataset.from_tensor_slices(tokenizer(text))
dataset = dataset.batch(64, drop_remainder=True)
dataset = dataset.map(split_input_target).batch(64)

input, output = dataset.take(1).get_single_element()

# Define the model
d_model = 512
rnn_units = 1025

class CustomModel(tf.keras.Model):
    def __init__(self, vocabulary_size, d_model, rnn_units):
        super().__init__()
        self.embedding = tf.keras.layers.Embedding(vocabulary_size, d_model)
        self.gru = tf.keras.layers.GRU(rnn_units, return_sequences=True, return_state=True)
        self.dense = tf.keras.layers.Dense(vocabulary_size, activation='softmax')

    def call(self, inputs, states=None, return_state=False, training=False):
        x = inputs
        x = self.embedding(x)
        if states is None:
            states = self.gru.get_initial_state(x)
        x, states = self.gru(x, initial_state=states, training=training)
        x = self.dense(x, training=training)
        if return_state:
            return x, states
        else:
            return x

model = CustomModel(tokenizer.vocabulary_size(), d_model, rnn_units)

# Inspect the model structure
model(input)
model.summary()

# Configure the model
model.compile(
    loss=tf.losses.SparseCategoricalCrossentropy(),
    optimizer='adam',
    metrics=['accuracy']
)

# Train the model
model.fit(dataset, epochs=3)

# Inference
class InferenceModel(tf.keras.Model):
    def __init__(self, model, tokenizer):
        super().__init__()
        self.model = model
        self.tokenizer = tokenizer

    def generate(self, inputs, length, return_states=False):
        inputs = tf.constant(inputs)[tf.newaxis]
        states = None
        input_ids = self.tokenizer(inputs).to_tensor()
        outputs = []
        for i in range(length):
            predicted_logits, states = self.model(inputs=input_ids, states=states, return_state=True)
            input_ids = tf.argmax(predicted_logits, axis=-1)
            # Keep only the prediction for the last position of the sequence
            outputs.append(input_ids[0][-1].numpy())
        outputs = self.tokenizer.detokenize(outputs).numpy().decode('utf-8')
        if return_states:
            return outputs, states
        else:
            return outputs

infere = InferenceModel(model, tokenizer)

# Run generation
start_chars = 'hello'
outputs = infere.generate(start_chars, 1000)
print(start_chars + outputs)

2. Step-by-Step Implementation

2.1 Imports

First, import tensorflow, keras_nlp, and numpy:

import tensorflow as tf
import keras_nlp
import numpy as np

2.2 Data Preparation

The data comes from Shakespeare's works, available at storage.googleapis.com/download.tensorflow.org/data/shakespeare.txt; download it and save it locally as shakespeare.txt.
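As a minimal sketch of the download step (not shown in the original post), tf.keras.utils.get_file can fetch and cache the corpus; the local filename is an assumption and you can point the later open() call at the returned path instead:

import tensorflow as tf

# Download the corpus once and cache it locally; get_file returns the cached path.
path = tf.keras.utils.get_file(
    'shakespeare.txt',
    origin='https://storage.googleapis.com/download.tensorflow.org/data/shakespeare.txt')
print(path)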

2.3 Character Tokenization

Here we use Unicode tokenization: every single character is treated as one token.

tokenizer = keras_nlp.tokenizers.UnicodeCodepointTokenizer(vocabulary_size=400)

# tokens -> ids
ids = tokenizer(['Why are you so funny?', 'how can i get you'])

# ids -> tokens
tokenizer.detokenize(ids)
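As a quick sanity check (my addition, assuming code points below vocabulary_size are kept as-is), the ids are simply the Unicode code points of the characters:

sample_ids = tokenizer(['Why'])
print(sample_ids)                # expected to match the code points below
print([ord(c) for c in 'Why'])   # [87, 104, 121]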

2.4 Building the Dataset

Build the dataset from the tokenizer and the text data:

def split_input_target(sequence):
    input_text = sequence[:-1]
    target_text = sequence[1:]
    return input_text, target_text

text = open('./shakespeare.txt', 'rb').read().decode(encoding='utf-8')
dataset = tf.data.Dataset.from_tensor_slices(tokenizer(text))
dataset = dataset.batch(64, drop_remainder=True)
dataset = dataset.map(split_input_target).batch(64)

input, output = dataset.take(1).get_single_element()
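To make the shifting explicit, here is a small illustration (a toy sequence I added, not part of the original data) of what split_input_target returns: the input drops the last token and the target drops the first, so the model learns to predict the next character at every position.

toy = tf.constant([10, 11, 12, 13, 14])
toy_input, toy_target = split_input_target(toy)
print(toy_input.numpy())   # [10 11 12 13]
print(toy_target.numpy())  # [11 12 13 14]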

2.5 Defining the Model

d_model = 512
rnn_units = 1025

class CustomModel(tf.keras.Model):
    def __init__(self, vocabulary_size, d_model, rnn_units):
        super().__init__()
        # Embedding -> GRU -> Dense projection over the vocabulary
        self.embedding = tf.keras.layers.Embedding(vocabulary_size, d_model)
        self.gru = tf.keras.layers.GRU(rnn_units, return_sequences=True, return_state=True)
        self.dense = tf.keras.layers.Dense(vocabulary_size, activation='softmax')

    def call(self, inputs, states=None, return_state=False, training=False):
        x = inputs
        x = self.embedding(x)
        if states is None:
            states = self.gru.get_initial_state(x)
        x, states = self.gru(x, initial_state=states, training=training)
        x = self.dense(x, training=training)
        if return_state:
            return x, states
        else:
            return x

model = CustomModel(tokenizer.vocabulary_size(), d_model, rnn_units)

# Inspect the model structure
model(input)
model.summary()
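A quick shape check (my addition; the exact numbers follow from the batch and vocabulary sizes used above): a batch of (64, 63) token ids should produce an output of shape (64, 63, 400), and return_state=True additionally returns the GRU state of shape (64, rnn_units).

outputs, states = model(input, return_state=True)
print(outputs.shape)  # (64, 63, 400)
print(states.shape)   # (64, 1025)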

2.6 Model Training

model.compile(
    loss=tf.losses.SparseCategoricalCrossentropy(),
    optimizer='adam',
    metrics=['accuracy']
)

model.fit(dataset, epochs=3)
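Optionally (not part of the original post), weights can be saved during training with the standard Keras checkpoint callback so the trained model can be restored later; the checkpoint path below is a hypothetical example:

checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(
    filepath='./checkpoints/ckpt_{epoch}.weights.h5',  # hypothetical path
    save_weights_only=True)
model.fit(dataset, epochs=3, callbacks=[checkpoint_cb])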

2.7 Model Inference

Define an InferenceModel that wraps the trained model and tokenizer for generation:

class InferenceModel(tf.keras.Model):
    def __init__(self, model, tokenizer):
        super().__init__()
        self.model = model
        self.tokenizer = tokenizer

    def generate(self, inputs, length, return_states=False):
        inputs = tf.constant(inputs)[tf.newaxis]
        states = None
        input_ids = self.tokenizer(inputs).to_tensor()
        outputs = []
        for i in range(length):
            predicted_logits, states = self.model(inputs=input_ids, states=states, return_state=True)
            input_ids = tf.argmax(predicted_logits, axis=-1)
            # Keep only the prediction for the last position of the sequence
            outputs.append(input_ids[0][-1].numpy())
        outputs = self.tokenizer.detokenize(outputs).numpy().decode('utf-8')
        if return_states:
            return outputs, states
        else:
            return outputs

infere = InferenceModel(model, tokenizer)

start_chars = 'hello'
outputs = infere.generate(start_chars, 1000)
print(start_chars + outputs)

The generated output is shown below; the quality is quite poor:

hellonofur us:
medous, teserwomador.
walled o y.
as
t aderemowate tinievearetyedust. manonels,
w?
workeneastily.
watrenerdores aner'shra
palathermalod, te a y, s adousced an
ptit: mamerethus:
bas as t: uaruriryedinesm's lesoureris lares palit al ancoup, maly thitts?
b veatrt
watyeleditenchitr sts, on fotearen, medan ur
tiblainou-lele priniseryo, ofonet manad plenerulyo
thilyr't th
palezedorine.
ti dous slas, sed, ang atad t,
wanti shew.
e
upede wadraredorenksenche:
wedemen stamesly ateara tiafin t t pes:
t: tus mo at
io my.
ane hbrelely berenerusedus' m tr;
p outellilid ng
ait tevadwantstry.
arafincara, es fody
'es pra aluserelyonine
pales corseryea aburures
angab:
sunelyothe: s al, chtaburoly o oonis s tioute tt,
pro.
tedeslenali: s 't ing h
sh, age de, anet: hathes: s es'tht,
as:
wedly at s serinechamai:
mored t.
t monatht t athoumonches le.
chededondirineared
ter
p y
letinalys
ani
aconen,
t rs:
t;et, tes-
luste aly,
thonort aly one telus, s mpsantenam ranthinarrame! a
pul; bon
s fofuly
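Part of the degeneration comes from the greedy tf.argmax decoding above, which always picks the single most likely character. A common alternative (my sketch, not used in the original post) is temperature sampling with tf.random.categorical over the prediction for the last position:

def sample_next_id(predictions, temperature=1.0):
    # predictions: (1, seq_len, vocab_size) softmax output of the model;
    # convert the last position back to log-probabilities and sample from it.
    last_logits = tf.math.log(predictions[0, -1] + 1e-9) / temperature
    next_id = tf.random.categorical(last_logits[tf.newaxis, :], num_samples=1)
    return next_id[0, 0].numpy()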

3. Overall Summary

An RNN combined with Unicode tokenization can generate text, but the quality of the results leaves a lot to be desired!

