
【Keras】Installing Keras with Anaconda and Running It in an IDE (VScode)

2019-08-15  BG大龍

Copyright notice: this blogger's skills are limited, so guidance and corrections are very welcome.

1. Note!!!

Keras runs on top of a backend, usually one of TensorFlow, Theano, or CNTK. Following the mainstream choice, TensorFlow is recommended as the Keras backend.

So install TensorFlow first, then install Keras inside the TensorFlow virtual environment.
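Once Keras has been imported at least once, the chosen backend is recorded in its configuration file. A minimal sketch for checking it from Python (it assumes the default config location ~/.keras/keras.json, which Keras creates on first import):

import json
import os

# Keras records its backend choice in ~/.keras/keras.json
config_path = os.path.expanduser("~/.keras/keras.json")
with open(config_path) as f:
    config = json.load(f)

print(config["backend"])  # expected: "tensorflow"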


2. References:

1. Installing TensorFlow & Keras under Anaconda - qy13913453196's blog - CSDN

https://blog.csdn.net/qy13913453196/article/details/82589792

2. Setting up a Keras environment on Windows + Anaconda - qq_22885109's blog - CSDN

https://blog.csdn.net/qq_22885109/article/details/80995134


3. Installation steps

【1】Install TensorFlow

How do you install TensorFlow? I wrote a dedicated post about it:

BG大龍: 【TensorFlow】Installing TensorFlow with Anaconda and Running It in an IDE (VScode) - zhuanlan.zhihu.com


【2】Activate the TensorFlow virtual environment and install Keras inside it

Activate the environment. In cmd, enter: conda activate tensorflow_env

In cmd, enter: pip install keras

Installation output will appear. Since the Tsinghua mirror channel was shut down on 2019-05-16, packages can only be downloaded from servers abroad, so wait patiently while the progress bar crawls along…

You may also run into an error like the one shown below; that is fine, just run pip install keras again.

[screenshot: pip install error message]

Keep waiting until pip reports a successful install; that means Keras is in place.


【3】Verify from the command line

Activate the environment. In cmd, enter: conda activate tensorflow_env


In cmd, enter: python
Then enter: import keras
If "Using TensorFlow backend" appears, the installation works.
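Beyond the import message, a minimal extra check is to print the installed version and ask Keras which backend it is using (keras.backend.backend() should return "tensorflow"):

import keras                # prints "Using TensorFlow backend." on import
from keras import backend as K

print(keras.__version__)    # installed Keras version
print(K.backend())          # should print "tensorflow"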


【4】Verify in a Jupyter notebook based on the TensorFlow environment

Enter: import keras
If "Using TensorFlow backend" appears in the output, it works.
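If you are unsure which environment the notebook kernel is actually running in, here is a quick sketch of a check cell (sys.executable should point inside tensorflow_env):

import sys
import keras               # prints "Using TensorFlow backend."

print(sys.executable)      # should point into the tensorflow_env environment
print(keras.__version__)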


【5】Run an actual example to verify

(1) Here is the official link, the Keras Chinese documentation:

Keras: The Python Deep Learning Library - Keras Chinese docs - keras-cn.readthedocs.io

(2) The example comes from:

Sequential model - Keras Chinese docs


The MLP binary-classification code:
One small detail from my own testing: with VS Code as my Python IDE I wrote activation="relu" with double quotes, while with PyCharm I wrote activation='relu' with single quotes. Python itself treats single and double quotes identically, so this is just each IDE's default formatting style rather than a hard requirement.

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Dropout

# Generate dummy data
x_train = np.random.random((1000, 20))
y_train = np.random.randint(2, size=(1000, 1))
x_test = np.random.random((100, 20))
y_test = np.random.randint(2, size=(100, 1))

model = Sequential()
model.add(Dense(64, input_dim=20, activation="relu"))
model.add(Dropout(0.5))
model.add(Dense(64, activation="relu"))
model.add(Dropout(0.5))
model.add(Dense(1, activation="sigmoid"))

model.compile(loss="binary_crossentropy",
              optimizer="rmsprop",
              metrics=["accuracy"])
model.fit(x_train, y_train,
          epochs=20,
          batch_size=128)
score = model.evaluate(x_test, y_test, batch_size=128)
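The script above never prints the final result. A small optional addition, where the entries of score line up with model.metrics_names:

print(model.metrics_names)  # e.g. ['loss', 'acc']
print(score)                # [test loss, test accuracy]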

The output:

Epoch 1/20
1000/1000 [==============================] - 1s 882us/step - loss: 0.7117 - acc: 0.5040
Epoch 2/20
1000/1000 [==============================] - 0s 39us/step - loss: 0.7049 - acc: 0.5020
Epoch 3/20
1000/1000 [==============================] - 0s 43us/step - loss: 0.7016 - acc: 0.5000
Epoch 4/20
1000/1000 [==============================] - 0s 39us/step - loss: 0.7031 - acc: 0.5260
Epoch 5/20
1000/1000 [==============================] - 0s 41us/step - loss: 0.7024 - acc: 0.4930
Epoch 6/20
1000/1000 [==============================] - 0s 52us/step - loss: 0.6999 - acc: 0.5040
Epoch 7/20
1000/1000 [==============================] - 0s 47us/step - loss: 0.6974 - acc: 0.5150
Epoch 8/20
1000/1000 [==============================] - 0s 40us/step - loss: 0.6937 - acc: 0.5250
Epoch 9/20
1000/1000 [==============================] - 0s 39us/step - loss: 0.6912 - acc: 0.5260
Epoch 10/20
1000/1000 [==============================] - 0s 37us/step - loss: 0.6891 - acc: 0.5260
Epoch 11/20
1000/1000 [==============================] - 0s 41us/step - loss: 0.6919 - acc: 0.5210
Epoch 12/20
1000/1000 [==============================] - 0s 43us/step - loss: 0.6926 - acc: 0.5190
Epoch 13/20
1000/1000 [==============================] - 0s 44us/step - loss: 0.6897 - acc: 0.5350
Epoch 14/20
1000/1000 [==============================] - 0s 41us/step - loss: 0.6940 - acc: 0.5140
Epoch 15/20
1000/1000 [==============================] - 0s 44us/step - loss: 0.6928 - acc: 0.5300
Epoch 16/20
1000/1000 [==============================] - 0s 56us/step - loss: 0.6925 - acc: 0.5360
Epoch 17/20
1000/1000 [==============================] - 0s 50us/step - loss: 0.6906 - acc: 0.5400
Epoch 18/20
1000/1000 [==============================] - 0s 44us/step - loss: 0.6882 - acc: 0.5330
Epoch 19/20
1000/1000 [==============================] - 0s 37us/step - loss: 0.6923 - acc: 0.5420
Epoch 20/20
1000/1000 [==============================] - 0s 40us/step - loss: 0.6893 - acc: 0.5280
100/100 [==============================] - 0s 10us/step

Good luck with your learning…
