TensorFlow 2.0 Tutorial: 4 - A Few Common Techniques
2019-04-15
不会停的蜗牛
In 1 - Build the Simplest Model we learned the basic workflow of building a neural network with TF 2.0.
In 2 - Recognizing Fashion MNIST we learned how to do image recognition and visualization.
In 3 - Implementations of Several RNN Models we compared the code for several common RNN models.
Here we will go over a few basic techniques:
- feature standardization
- plotting learning curves
- callbacks
%matplotlib inline
%load_ext tensorboard.notebook
import matplotlib as mpl
import matplotlib.pyplot as plt
import numpy as np
import os
import pandas as pd
import sklearn
import sys
import tensorflow as tf
from tensorflow import keras # tf.keras
import time
assert sys.version_info >= (3, 5) # Python ≥3.5 required
assert tf.__version__ >= "2.0" # TensorFlow ≥2.0 required
fashion_mnist = keras.datasets.fashion_mnist
(X_train_full, y_train_full), (X_test, y_test) = (
fashion_mnist.load_data())
X_valid, X_train = X_train_full[:5000], X_train_full[5000:]
y_valid, y_train = y_train_full[:5000], y_train_full[5000:]
plt.imshow(X_train[0], cmap="binary")
plt.show()
class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
"Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]
1. Building a Neural Network
The process below goes from building a basic model to evaluating it to making predictions; almost every model follows this same workflow.
- First, build a basic network:
the input layer flattens each 28x28 image into a 1x784 vector,
the hidden layers define the number of neurons and the activation function,
the output layer defines the number of classes and uses softmax to produce probabilities.
model = keras.models.Sequential([
keras.layers.Flatten(input_shape=[28, 28]),
keras.layers.Dense(300, activation="relu"),
keras.layers.Dense(100, activation="relu"),
keras.layers.Dense(10, activation="softmax")
])
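As a sanity check, the parameter counts that model.summary() reports below can be worked out by hand: each Dense layer has inputs × units weights plus units biases.
784 * 300 + 300 = 235,500 for the first hidden layer,
300 * 100 + 100 = 30,100 for the second hidden layer,
100 * 10 + 10 = 1,010 for the output layer,
so 266,610 trainable parameters in total.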
Then inspect the model,
- compile the model, defining the loss, optimizer, and metrics,
- train the model with the simplest fit,
- evaluate the model with model.evaluate,
- and finally make predictions on new data.
model.summary()
model.compile(loss="sparse_categorical_crossentropy",
optimizer="sgd",
metrics=["accuracy"])
history = model.fit(X_train, y_train, epochs=10,
validation_data=(X_valid, y_valid))
model.evaluate(X_test, y_test)
n_new = 10
X_new = X_test[:n_new]
y_proba = model.predict(X_new)
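y_proba has shape (n_new, 10): one row per image, one probability per class. A quick way to inspect it:
print(y_proba.shape)  # (10, 10)
print(y_proba[0].round(2))  # probabilities for the first image, rounded to 2 decimals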
2. Common Techniques
1. We can standardize the features as a preprocessing step:
from sklearn.preprocessing import StandardScaler
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train.astype(np.float32).reshape(-1, 1)).reshape(-1, 28, 28)
X_valid_scaled = scaler.transform(X_valid.astype(np.float32).reshape(-1, 1)).reshape(-1, 28, 28)
X_test_scaled = scaler.transform(X_test.astype(np.float32).reshape(-1, 1)).reshape(-1, 28, 28)
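Because the scaler is fit on the data reshaped to (-1, 1), it computes a single mean and standard deviation over all pixels rather than one per pixel position. A quick check that the result is roughly zero-mean with unit variance:
print(X_train_scaled.mean(), X_train_scaled.std())  # approximately 0.0 and 1.0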
Then use the standardized data when training and evaluating the model, and compare the results with the previous run:
history = model.fit(X_train_scaled, y_train, epochs=20,
validation_data=(X_valid_scaled, y_valid))
model.evaluate(X_test_scaled, y_test)
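Note that history is reassigned by this second fit, so to compare the two runs numerically you need to keep both history objects around, for example (history_raw and history_scaled are hypothetical names, not used above; with metrics=["accuracy"], TF 2.0 stores the keys accuracy and val_accuracy):
best_raw = max(history_raw.history["val_accuracy"])        # best validation accuracy, unscaled run
best_scaled = max(history_scaled.history["val_accuracy"])  # best validation accuracy, scaled run
print(best_raw, best_scaled)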
2. We can also plot the learning curves with pd.DataFrame(history.history).plot:
def plot_learning_curves(history):
pd.DataFrame(history.history).plot(figsize=(8, 5))
plt.grid(True)
plt.gca().set_ylim(0, 1)
plt.show()
plot_learning_curves(history)
3. To get the predicted class IDs, we can use either of the following two approaches:
y_pred = y_proba.argmax(axis=1)
y_pred = model.predict_classes(X_new)
To look at the top k classes:
k = 3
top_k = np.argsort(-y_proba, axis=1)[:, :k]
top_k
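To turn these indices into readable labels and their probabilities, one option (a sketch using the class_names list defined earlier):
top_k_names = [[class_names[i] for i in row] for row in top_k]
top_k_proba = np.take_along_axis(y_proba, top_k, axis=1)  # probability of each of the top k classes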
4. fit() can also take callbacks:
Callbacks are a collection of functions invoked during training; they can hook in TensorBoard, EarlyStopping, ModelCheckpoint, and so on.
The model and compile step are unchanged; we just pass callbacks to fit:
model = keras.models.Sequential([
keras.layers.Flatten(input_shape=[28, 28]),
keras.layers.Dense(300, activation="relu"),
keras.layers.Dense(100, activation="relu"),
keras.layers.Dense(10, activation="softmax")
])
model.compile(loss="sparse_categorical_crossentropy",
optimizer="sgd", metrics=["accuracy"])
root_logdir = os.path.join(os.curdir, "my_logs")  # root directory for the TensorBoard logs
logdir = os.path.join(root_logdir, "run_{}".format(time.time()))
callbacks = [
keras.callbacks.TensorBoard(logdir),
keras.callbacks.EarlyStopping(patience=5),
keras.callbacks.ModelCheckpoint("my_mnist_model.h5", save_best_only=True),
]
history = model.fit(X_train_scaled, y_train, epochs=50,
validation_data=(X_valid_scaled, y_valid),
callbacks=callbacks)
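Since ModelCheckpoint with save_best_only=True keeps the model that had the best validation loss, after training (or after EarlyStopping stops it) we can roll back to that saved model; TensorBoard can then be opened on the log directory (a minimal sketch; my_logs matches the root_logdir defined above):
model = keras.models.load_model("my_mnist_model.h5")  # restore the best saved model
model.evaluate(X_test_scaled, y_test)
%tensorboard --logdir=./my_logs  # line magic from the tensorboard notebook extension loaded at the top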
Further reading:
https://github.com/ageron/tf2_course/blob/master/01_neural_nets_with_keras.ipynb
Hello everyone!
I'm 不会停的蜗牛 Alice.
I love artificial intelligence and write a bit of hands-on machine learning content every day.
Welcome to follow me!