TensorFlow Concepts
Elinx · 2017-09-19
AI is becoming more and more democratized, and sooner or later AI development will be an essential skill for every programmer, so why hesitate? Let's start learning. Google's TensorFlow is arguably the most promising framework right now, but is it actually easy to learn? We shall see. This post introduces the basic concepts of TensorFlow.
1. Basic Elements
1.1 constant
The prototype of a constant is tf.constant(value, dtype=None, shape=None, name='Const', verify_shape=False); the value can be a scalar, a vector, a matrix, and so on. For example:
import tensorflow as tf

def const_literal():
    a = tf.constant(2, name='a')
    b = tf.constant(3, name='b')
    x = tf.add(a, b, name='add')
    with tf.Session() as sess:
        writer = tf.summary.FileWriter('./graphs', sess.graph)
        print(sess.run(x))
        writer.close()

def const_tensor():
    a = tf.constant([2, 2], name='a')
    b = tf.constant([[0, 1], [2, 3]], name='b')
    x = tf.add(a, b, name='add')
    y = tf.multiply(a, b, name='mul')  # element-wise multiply
    with tf.Session() as sess:
        x, y = sess.run([x, y])
        print('x:')
        print(x)
        print('y:')
        print(y)

def const_zeros():
    """tf.zeros and tf.ones have the same API"""
    a = tf.zeros([2, 3], tf.int32)
    b = tf.zeros_like(a, tf.float32)
    with tf.Session() as sess:
        print(sess.run(a))
        print(sess.run(b))

def const_fill(val):
    """fill the tensor with a value"""
    a = tf.fill([2, 3], val)
    with tf.Session() as sess:
        print(sess.run(a))

def const_linear(start, stop, num):
    """linearly spaced numbers in [start, stop]; only float32 and float64 are permitted"""
    a = tf.linspace(start, stop, num)
    b = tf.range(start, stop, 1.0)
    with tf.Session() as sess:
        print(sess.run(a))
        print(sess.run(b))

def const_random():
    """
    tf.random_normal(shape, mean=0.0, stddev=1.0, dtype=tf.float32, seed=None, name=None)
    tf.truncated_normal(shape, mean=0.0, stddev=1.0, dtype=tf.float32, seed=None, name=None)
    tf.random_uniform(shape, minval=0, maxval=None, dtype=tf.float32, seed=None, name=None)
    tf.random_shuffle(value, seed=None, name=None)
    tf.random_crop(value, size, seed=None, name=None)
    tf.multinomial(logits, num_samples, seed=None, name=None)
    tf.random_gamma(shape, alpha, beta=None, dtype=tf.float32, seed=None, name=None)
    """
    pass

def const_graph():
    """no need to use my_const explicitly, it is already part of the compute graph"""
    my_const = tf.constant([1.0, 2.0], name='my_const')
    with tf.Session() as sess:
        print(sess.graph.as_graph_def())

if __name__ == '__main__':
    # const_linear(0.0, 99.0, 99)
    const_graph()
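The random ops whose signatures are listed in const_random can be run like any other constant op. Below is a minimal sketch of a few of them, assuming the TensorFlow 1.x API used throughout this post; the helper name const_random_demo and the shapes/seeds are just for illustration:

def const_random_demo():
    """exercise a few of the random ops listed above"""
    normal = tf.random_normal([2, 3], mean=0.0, stddev=1.0, seed=42)
    uniform = tf.random_uniform([2, 3], minval=0, maxval=10, seed=42)
    shuffled = tf.random_shuffle(tf.constant([1, 2, 3, 4, 5]))  # shuffles along the first dimension
    with tf.Session() as sess:
        print(sess.run(normal))
        print(sess.run(uniform))
        print(sess.run(shuffled))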
Tensorboard
View the graph in TensorBoard: run tensorboard --logdir=./graphs against the directory written by the FileWriter above, then open the Graphs tab in the browser.
Variable
A constant is an operation, defined when the (sub)graph is constructed, whereas Variable is a class that represents a variable. Constants live inside the graph definition; Variables can live on the parameter servers.
A Variable must be explicitly initialized before use, otherwise an uninitialized-variable error is raised.
You can evaluate values with eval(); only Tensors have an eval() method (Operations have run() instead). Tensor.eval() is equivalent to tf.get_default_session().run(t).
Every Variable has an initializer; a Variable can only be eval()'d after it has been initialized or successfully assigned.
import tensorflow as tf

def test_eval():
    W = tf.constant(10)
    with tf.Session():
        print(W.eval())  # 10

def test_eval_Variable():
    W = tf.Variable(10)
    with tf.Session() as sess:
        print(sess.run(W.initializer))  # None (running an initializer op returns no value)
        print(W.eval())  # 10

def test_eval_Variable_all():
    W = tf.Variable(10)
    with tf.Session():
        print(W.initializer.eval())  # error: Operation object has no attribute 'eval'
        print(W.eval())

def initialize_properly():
    W = tf.Variable(10)
    with tf.Session() as sess:
        # this way
        tf.global_variables_initializer().run()
        print(W.eval())
        print(sess.run(W))

def run_multiple_times():
    W = tf.Variable(10)
    a_times_two = W.assign(2 * W)
    with tf.Session():
        tf.global_variables_initializer().run()
        print(W.eval())  # 10
        print(a_times_two.eval())  # 20
        print(a_times_two.eval())  # 40

if __name__ == '__main__':
    test_eval()
    test_eval_Variable()
    test_eval_Variable_all()
Placeholders
In ordinary programming terms a placeholder and a Variable look similar, but in TensorFlow a placeholder represents the input/output data fed into the graph, roughly analogous to I/O in C/C++, while a Variable represents parameters that are updated, iterated on and stored during training, which is closer to a variable in the usual sense. The concrete differences are:
- A Variable must be initialized with a Tensor; a placeholder is not (and cannot be) initialized.
- A Variable's value can be updated during training.
- Variables can be shared and can be marked non-trainable.
- The learned values of Variables can be saved to disk.
- Creating a Variable automatically creates three ops: the variable op, the initializer op, and the ops for the initial value.
- Variable is a class; placeholder is a function.
- In a distributed setting, Variables live on the parameter servers and are shared across the workers.
- A Variable must be initialized before use and its shape is fixed; a placeholder must be fed data at run time (see the sketch below).
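A minimal sketch of feeding a placeholder, assuming the TensorFlow 1.x API; the names a, b and the shape [3] are just for illustration:

import tensorflow as tf

# a placeholder for a vector of 3 floats; it holds no data until it is fed
a = tf.placeholder(tf.float32, shape=[3], name='a')
b = tf.constant([5, 5, 5], tf.float32, name='b')
c = tf.add(a, b, name='add')

with tf.Session() as sess:
    # feed_dict supplies a concrete value for the placeholder at run time
    print(sess.run(c, feed_dict={a: [1, 2, 3]}))  # [6. 7. 8.]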
Session
import tensorflow as tf

x = tf.Variable(3, name='x')
y = tf.Variable(4, name='y')
f = x*x*y + y + 2

with tf.Session() as sess:
    x.initializer.run()
    y.initializer.run()
    # result = f.eval()
    # result = sess.run(f)
    result = tf.get_default_session().run(f)

tf.reset_default_graph()
print(result)
result = None
- This snippet shows three ways to evaluate f: f.eval(), sess.run(f), and tf.get_default_session().run(f).
- InteractiveSession automatically creates a Session and installs it as the default session, so no with block is needed (see the sketch below).
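A minimal sketch of the same computation using InteractiveSession; the variable names simply mirror the snippet above:

import tensorflow as tf

x = tf.Variable(3, name='x')
y = tf.Variable(4, name='y')
f = x*x*y + y + 2

sess = tf.InteractiveSession()  # becomes the default session immediately
x.initializer.run()             # no explicit session object needed
y.initializer.run()
print(f.eval())                 # 42
sess.close()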
Graph operations
x1 = tf.Variable(1)
x1.graph is tf.get_default_graph()  # True

graph = tf.Graph()
with graph.as_default():
    x2 = tf.Variable(2)

x2.graph is graph                   # True
x2.graph is tf.get_default_graph()  # False
- Any node you create is automatically placed in the default graph.
- You can also assign a node to a specific graph, which is especially useful when a program manages multiple graphs.
Lifecycle of a node
w = tf.constant(3)
x = w + 2
y = x + 5
z = x * 3

with tf.Session() as sess:
    print(y.eval())
    print(z.eval())
- In this code, evaluating y first requires evaluating w and x; evaluating z also needs w and x, but the second run cannot reuse the results of the first.
- After a graph run, all node values except Variables are dropped; a Variable's lifetime starts at its initialization and ends when the session is closed.
- To make the evaluation more efficient, compute y and z in a single run:
with tf.Session() as sess:
    y_val, z_val = sess.run([y, z])
    print(y_val)
    print(z_val)