丁双波 / zzsn_spider · Commits

Commit 41161484 · authored Dec 13, 2023 by 薛凌堃
Parent: b931eea2

    年报topic调整 (adjust the annual-report topic)
Showing 5 changed files, with 6 additions and 69 deletions (+6 −69):

  comData/annualReport/client.conf            +0 −63  (deleted)
  comData/annualReport/fbs_annualreport.py    +1 −1
  comData/annualReport/fbs_annualreport_1.py  +1 −1
  comData/annualReport/证监会-年报.py          +2 −2
  comData/annualReport/雪球网-年报.py          +2 −2
comData/annualReport/client.conf  (deleted, 100644 → 0)
    # connect timeout in seconds
    # default value is 30s
    connect_timeout = 300

    # network timeout in seconds
    # default value is 30s
    network_timeout = 600

    # the base path to store log files
    #base_path=/home/tarena/django-project/cc_shop1/cc_shop1/logs

    # tracker_server can ocur more than once, and tracker_server format is
    # "host:port", host can be hostname or ip address
    tracker_server = 114.115.215.96:22122

    #standard log level as syslog, case insensitive, value list:
    ### emerg for emergency
    ### alert
    ### crit for critical
    ### error
    ### warn for warning
    ### notice
    ### info
    ### debug
    log_level = info

    # if use connection pool
    # default value is false
    # since V4.05
    use_connection_pool = false

    # connections whose the idle time exceeds this time will be closed
    # unit: second
    # default value is 3600
    # since V4.05
    connection_pool_max_idle_time = 3600

    # if load FastDFS parameters from tracker server
    # since V4.05
    # default value is false
    load_fdfs_parameters_from_tracker = false

    # if use storage ID instead of IP address
    # same as tracker.conf
    # valid only when load_fdfs_parameters_from_tracker is false
    # default value is false
    # since V4.05
    use_storage_id = false

    # specify storage ids filename, can use relative or absolute path
    # same as tracker.conf
    # valid only when load_fdfs_parameters_from_tracker is false
    # since V4.05
    storage_ids_filename = storage_ids.conf

    #HTTP settings
    http.tracker_server_port = 80

    #use "#include" directive to include HTTP other settiongs
    ##include http.conf
\ No newline at end of file
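
The deleted client.conf above is a standard FastDFS client configuration (tracker at 114.115.215.96:22122, 300 s connect / 600 s network timeouts). For context, here is a minimal sketch of how such a file is typically consumed from Python — assuming the py3 fork of fdfs_client-py, a package and usage that do not appear anywhere in this commit:

    # Hypothetical usage sketch; the fdfs_client package, config path, and
    # file name are assumptions, not part of this diff.
    from fdfs_client.client import Fdfs_client, get_tracker_conf

    # Parse client.conf (tracker_server, timeouts, log_level, ...)
    tracker_conf = get_tracker_conf('client.conf')
    client = Fdfs_client(tracker_conf)

    # Upload a local annual-report PDF via the configured tracker
    result = client.upload_by_filename('annual_report_2023.pdf')
    print(result)  # dict containing the remote file ID on success

Deleting the file suggests the FastDFS upload path was retired or its configuration moved elsewhere; the diff itself does not say.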
comData/annualReport/fbs_annualreport.py
    ...
    @@ -19,7 +19,7 @@ cursor = baseCore.cursor
     def sendKafka(dic_news):
         try:
             # 114.116.116.241
             producer = KafkaProducer(bootstrap_servers=['114.115.159.144:9092'], max_request_size=1024 * 1024 * 20)
    -        kafka_result = producer.send("researchReportTopic",
    +        kafka_result = producer.send("researchReportYearTopic",
                                          json.dumps(dic_news, ensure_ascii=False).encode('utf8'))
             print(kafka_result.get(timeout=10))
    ...
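
Pieced together from the hunk above, the sendKafka helper after this commit reads roughly as follows. This is a sketch, not the complete file: the except branch and surrounding module code sit outside the hunk, so their bodies here are assumptions.

    import json
    from kafka import KafkaProducer  # kafka-python

    def sendKafka(dic_news):
        try:
            # 114.116.116.241
            producer = KafkaProducer(bootstrap_servers=['114.115.159.144:9092'],
                                     max_request_size=1024 * 1024 * 20)  # allow ~20 MB messages
            # Topic renamed by this commit: researchReportTopic -> researchReportYearTopic
            kafka_result = producer.send("researchReportYearTopic",
                                         json.dumps(dic_news, ensure_ascii=False).encode('utf8'))
            print(kafka_result.get(timeout=10))  # block until the broker acks, or raise
        except Exception as e:
            print(e)  # assumed handler; the real except branch is elided from the diff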
comData/annualReport/fbs_annualreport_1.py
    ...
    @@ -19,7 +19,7 @@ cursor = baseCore.cursor
     def sendKafka(dic_news):
         try:
             # 114.116.116.241
             producer = KafkaProducer(bootstrap_servers=['114.115.159.144:9092'], max_request_size=1024 * 1024 * 20)
    -        kafka_result = producer.send("researchReportTopic",
    +        kafka_result = producer.send("researchReportYearTopic",
                                          json.dumps(dic_news, ensure_ascii=False).encode('utf8'))
             print(kafka_result.get(timeout=10))
    ...
comData/annualReport/证监会-年报.py
     import json
    ...
    @@ -180,7 +180,7 @@ def SpiderByZJH(url, payload, dic_info, num, start_time):
         # 将相应字段通过kafka传输保存 (save the relevant fields by sending them over Kafka)
         try:
             producer = KafkaProducer(bootstrap_servers=['114.115.159.144:9092'], max_request_size=1024 * 1024 * 20)
    -        kafka_result = producer.send("researchReportTopic",
    +        kafka_result = producer.send("researchReportYearTopic",
                                          json.dumps(dic_news, ensure_ascii=False).encode('utf8'))
             print(kafka_result.get(timeout=10))
    ...
comData/annualReport/雪球网-年报.py
     # -*- coding: utf-8 -*-
    ...
    @@ -221,7 +221,7 @@ def spider_annual_report(dict_info,num):
         # 将相应字段通过kafka传输保存 (save the relevant fields by sending them over Kafka)
         try:
             producer = KafkaProducer(bootstrap_servers=['114.115.159.144:9092'], max_request_size=1024 * 1024 * 20)
    -        kafka_result = producer.send("researchReportTopic",
    +        kafka_result = producer.send("researchReportYearTopic",
                                          json.dumps(dic_news, ensure_ascii=False).encode('utf8'))
             print(kafka_result.get(timeout=10))
    ...
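
One operational implication of this commit: all four spiders now publish to researchReportYearTopic, so any downstream consumer still subscribed to researchReportTopic will silently stop receiving annual reports. A minimal smoke test for the new topic, assuming kafka-python — no consumer code appears in this diff:

    from kafka import KafkaConsumer

    # Subscribe to the topic the spiders write to after this commit.
    consumer = KafkaConsumer('researchReportYearTopic',
                             bootstrap_servers=['114.115.159.144:9092'],
                             auto_offset_reset='earliest')
    for msg in consumer:
        print(msg.topic, msg.offset, msg.value[:120])  # peek at each record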