
Deep learning with TensorFlow and CIFAR-10

Stack Overflow user
Asked on 2020-12-12 20:39:46
1 answer · 50 views · 0 followers · 0 votes

I am trying to classify CIFAR-10 images with softmax, but the model is not learning anything.

The code below prints

 0 None
 1 None 
 2 None 

and so on. How can I fix my code, or find out why it always prints None?

import tensorflow as tf
import numpy as np
from tensorflow.keras.datasets import cifar10, mnist
(x_train,y_train),(x_test,y_test) = cifar10.load_data()
x = tf.placeholder(tf.float32,[None,3072])
y_ = tf.placeholder(tf.float32,[None])
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
#reshaping the y_train and x_train
y_train = y_train.reshape((5000,10))
x_test =x_test.reshape(10000,3072)
x_train = x_train.reshape(50000,3072)
y_test =y_test.reshape(1000,10)

# Data normalization
x_train = x_train/255
y_train = y_train/255
W1 = tf.Variable(tf.zeros([3072,10]))
b1 = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x,W1)+b1)
# this is a cross entropy
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y),reduction_indices=[1]))
# let's train to get the minimum loss using back+forward propagation
train_step = tf.train.AdamOptimizer(0.5).minimize(cross_entropy)
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
for i in range(1000):
    for j in range(50):
       print(j,sess.run(train_step,feed_dict={x : x_train , y_ : y_train[j]}))

1 Answer

Stack Overflow user

Accepted answer

Posted on 2020-12-13 02:23:25

Of course it returns None: the documentation says that `minimize()` returns an operation that updates the variables in var_list, not a value.

If you want to see the loss value, you can write the following:

_, loss = sess.run([train_step, cross_entropy], feed_dict={x : x_train , y_ : y_train[j]})
print(j, loss)
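Separately, the label reshapes in the question (`y_train.reshape((5000,10))`, `y_test.reshape(1000,10)`) cannot produce valid one-hot targets: `cifar10.load_data()` returns integer class labels of shape `(n, 1)`, and `reshape` only rearranges them, so the cross entropy is computed against garbage (and dividing `y_train` by 255 makes it worse). A minimal numpy sketch of converting integer labels to one-hot vectors, using a hypothetical `one_hot` helper for illustration:

```python
import numpy as np

def one_hot(labels, num_classes=10):
    """Expand integer class labels of shape (n,) or (n, 1)
    into a (n, num_classes) one-hot matrix."""
    labels = np.asarray(labels).reshape(-1)            # (n, 1) -> (n,)
    out = np.zeros((labels.size, num_classes), dtype=np.float32)
    out[np.arange(labels.size), labels] = 1.0          # one 1.0 per row
    return out

# e.g. labels as returned by cifar10.load_data(): shape (n, 1)
y = one_hot([[3], [0], [9]])
print(y.shape)   # (3, 10)
```

With this, `y_train = one_hot(y_train)` has shape `(50000, 10)` and matches the `[None, 10]` softmax output; only the images, not the labels, should be divided by 255.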
0 votes
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/65269606