
Python Crawler Diary 8: Scraping Douyu Danmu in Real Time via the API

2017-06-03  梅花鹿数据rieuse

One: Preface

I've been meaning to scrape Douyu danmu (live-chat barrage) for a while, but exams kept eating my time, and although Douyu has opened its barrage API, I couldn't find a project on GitHub that connects to it cleanly. So I read a number of articles, wrote some code, and this post sums it up. It also lays the groundwork for later data analysis: first a simple word cloud of the danmu, then visualizations of the data from individual rooms.
Code: github.com/rieuse/DouyuTV


The room scraped this time is 芜湖大司马's Douyu channel, because it draws a large audience, which makes it convenient for analysis (the streamer also happens to be from my hometown). For every danmu message, the uid, nickname, user level, and danmu text are saved to MongoDB.

A quick look at the result: the crawler prints each danmu to the console as it arrives and writes the record to MongoDB.

Two: Environment

Python 3 with pymongo, requests, and BeautifulSoup4 (using the lxml parser), plus a MongoDB server running locally; multiprocessing, socket, time, and re come from the standard library.
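A quick sanity check of that environment might look like this (a minimal sketch; the 2-second server-selection timeout is just an arbitrary choice of mine):

import bs4
import pymongo
import requests

# Fail fast if MongoDB is not reachable on localhost.
mongo = pymongo.MongoClient('localhost', 27017, serverSelectionTimeoutMS=2000)
print('MongoDB', mongo.server_info()['version'])
print('pymongo', pymongo.version, '| requests', requests.__version__, '| bs4', bs4.__version__)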
Three: Example Analysis

To scrape the danmu, the first thing to read is the official developer documentation. It describes a TCP barrage server (openbarrage.douyutv.com, port 8601) speaking a plain-text protocol of 'key@=value/' pairs, where every packet starts with a little-endian length header. The helper below builds that header and sends a message:

def sendmsg(msgstr):
    # Packet layout: [4-byte length] [4-byte length again] [4-byte message type] [body]
    msg = msgstr.encode('utf-8')
    data_length = len(msg) + 8            # body + 4-byte length copy + 4-byte type code
    code = 689                            # client-to-server message type code
    msgHead = int.to_bytes(data_length, 4, 'little') \
              + int.to_bytes(data_length, 4, 'little') + int.to_bytes(code, 4, 'little')
    client.send(msgHead)
    sent = 0
    while sent < len(msg):                # send() may deliver only part of the body at once
        tn = client.send(msg[sent:])
        sent = sent + tn
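To make the header concrete, here is a minimal sketch of the bytes this produces for a short message (689 is the client-to-server type code used above):

msg = 'type@=keeplive/tick@=0/\0'.encode('utf-8')
length = len(msg) + 8                                  # body + length copy + type code
head = int.to_bytes(length, 4, 'little') * 2 + int.to_bytes(689, 4, 'little')
print((head + msg).hex())                              # the full packet as it goes on the wire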
After connecting, log in to the barrage server for the target room:

msg = 'type@=loginreq/username@=rieuse/password@=douyu/roomid@={}/\0'.format(roomid)
sendmsg(msg)
Then join the room's barrage group; gid -9999 asks for the full danmu stream:

msg_more = 'type@=joingroup/rid@={}/gid@=-9999/\0'.format(roomid)
sendmsg(msg_more)
Finally, send a keep-alive message every 15 seconds so the server does not drop the connection:
def keeplive():
    while True:
        msg = 'type@=keeplive/tick@=' + str(int(time.time())) + '/\0'
        sendmsg(msg)
        time.sleep(15)

At this point the main functionality of the API is clear, and what remains is the concrete implementation. A few details need care; for example, a message may carry no level field, in which case the level defaults to 0:

if not level_more:
    level_more = [b'0']   # keep it a list so level_more[0] is still bytes
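The server replies in the same 'key@=value/' text format. The full code below extracts fields with regular expressions; as an alternative, here is a hedged sketch of a tiny parser that splits one message body into a dict (the function name and the sample message are my own, not from the Douyu docs):

def parse_stt(body):
    # Turn one 'key@=value/' message body into a dict (illustrative helper).
    text = body.decode('utf-8', errors='ignore').rstrip('\x00')
    fields = {}
    for pair in text.split('/'):
        if '@=' in pair:
            key, _, value = pair.partition('@=')
            # Undo the protocol's escaping: '@S' stands for '/', '@A' for '@'.
            fields[key] = value.replace('@S', '/').replace('@A', '@')
    return fields

sample = b'type@=chatmsg/rid@=17732/uid@=123/nn@=rieuse/level@=21/txt@=666/\x00'
print(parse_stt(sample)['txt'])   # -> 666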

Four: Full Code

The complete code is below (also at github.com/rieuse/DouyuTV):

import multiprocessing
import socket
import time
import re
import pymongo
import requests
from bs4 import BeautifulSoup

# MongoDB collection that will hold the scraped danmu.
clients = pymongo.MongoClient('localhost')
db = clients["DouyuTV_danmu"]
col = db["info"]

# TCP connection to Douyu's open barrage server.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
host = socket.gethostbyname("openbarrage.douyutv.com")
port = 8601
client.connect((host, port))

# Regexes that pull the interesting fields out of each raw chat message.
danmu_path = re.compile(b'txt@=(.+?)/cid@')
uid_path = re.compile(b'uid@=(.+?)/nn@')
nickname_path = re.compile(b'nn@=(.+?)/txt@')
level_path = re.compile(b'level@=([1-9][0-9]?)/sahf')

def sendmsg(msgstr):
    msg = msgstr.encode('utf-8')
    data_length = len(msg) + 8
    code = 689
    msgHead = int.to_bytes(data_length, 4, 'little') \
              + int.to_bytes(data_length, 4, 'little') + int.to_bytes(code, 4, 'little')
    client.send(msgHead)
    sent = 0
    while sent < len(msg):
        tn = client.send(msg[sent:])
        sent = sent + tn


def start(roomid):
    msg = 'type@=loginreq/username@=rieuse/password@=douyu/roomid@={}/\0'.format(roomid)
    sendmsg(msg)
    msg_more = 'type@=joingroup/rid@={}/gid@=-9999/\0'.format(roomid)
    sendmsg(msg_more)

    print('--------------- Connected to the live room of {} ---------------'.format(get_name(roomid)))
    while True:
        data = client.recv(1024)
        if not data:
            break
        uid_more = uid_path.findall(data)
        nickname_more = nickname_path.findall(data)
        level_more = level_path.findall(data)
        danmu_more = danmu_path.findall(data)
        if not level_more:
            level_more = [b'0']          # some messages carry no level field
        for i in range(len(danmu_more)):
            try:
                product = {
                    'uid': uid_more[i].decode('utf-8'),
                    'nickname': nickname_more[i].decode('utf-8'),
                    'level': level_more[i].decode('utf-8') if i < len(level_more) else '0',
                    'danmu': danmu_more[i].decode('utf-8')
                }
                print(product)
                col.insert_one(product)
                print('Saved to MongoDB')
            except Exception as e:
                print(e)


def keeplive():
    while True:
        msg = 'type@=keeplive/tick@=' + str(int(time.time())) + '/\0'
        sendmsg(msg)
        time.sleep(15)


def get_name(roomid):
    # Fetch the room page and pull the streamer's display name out of the HTML.
    r = requests.get("http://www.douyu.com/" + roomid)
    soup = BeautifulSoup(r.text, 'lxml')
    return soup.find('a', {'class': 'zb-name'}).string


if __name__ == '__main__':
    room_id = input('Enter the room ID: ')
    # One process receives and stores danmu, the other keeps the connection alive.
    p1 = multiprocessing.Process(target=start, args=(room_id,))
    p2 = multiprocessing.Process(target=keeplive)
    p1.start()
    p2.start()
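One caveat about the listing above: client.recv(1024) returns arbitrary chunks, so a message can be split across two reads and the regexes will miss it. Below is a hedged sketch of length-prefixed framing, assuming the server uses the same envelope that sendmsg builds (a 4-byte little-endian length, a copy of that length, a 4-byte type code, then the body):

def recv_exact(sock, n):
    # Read exactly n bytes from the socket (helper for the sketch below).
    buf = b''
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError('socket closed')
        buf += chunk
    return buf


def recv_packet(sock):
    # Return one message body, assuming server packets mirror the client envelope.
    length = int.from_bytes(recv_exact(sock, 4), 'little')   # first length field
    rest = recv_exact(sock, length)                          # length copy + type code + body
    return rest[8:]                                          # skip 4-byte length and 4-byte type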

Five: Follow-up Use of the Danmu

Here the uid, nickname, level, and danmu text of each message are saved to MongoDB, so they can be pulled straight back out for later analysis. If only the danmu text is needed, it would be enough to write just that to a txt file.
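For example, a minimal sketch of reading the saved danmu back out of MongoDB and dumping only the text to a txt file for the later word cloud (the database and collection names match the code above; the output filename is my own choice):

import pymongo

col = pymongo.MongoClient('localhost')['DouyuTV_danmu']['info']

# One danmu per line, text only.
with open('danmu.txt', 'w', encoding='utf-8') as f:
    for doc in col.find({}, {'danmu': 1, '_id': 0}):
        f.write(doc['danmu'] + '\n')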
My GitHub repo holds this crawler along with the basics I have been learning; if you like it, star and follow and we can learn together: github.com/rieuse/DouyuTV

Keep it up!