Will I get a performance boost by combining concurrent.futures with Flask?
I am wondering whether I can use concurrent.futures together with Flask. Here is an example.
import requests
from flask import Flask
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=10)
app = Flask(__name__)

@app.route("/path/<xxx>")
def hello(xxx):
    # hand the blocking work to the thread pool and return immediately
    f = executor.submit(task, xxx)
    return "OK"

def task(xxx):
    resp = requests.get("some_url")
    # save to mongodb

app.run()
The task is I/O-bound and no return value is needed. Requests will not be frequent; I estimate 10/s at most.
I tested it and it works. What I want to know is whether I can get a performance boost by using multiple threads this way. Will Flask somehow block the task?
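One caveat about the fire-and-forget pattern above (a minimal sketch of my own, reusing the executor and task names from the snippet): exceptions raised inside a submitted task are stored on the returned Future and silently discarded unless the Future is inspected, so a done-callback is one way to make failures visible.

import logging
from concurrent.futures import ThreadPoolExecutor

logger = logging.getLogger(__name__)
executor = ThreadPoolExecutor(max_workers=10)

def log_exception(future):
    # called once the task finishes; exception() does not block on a done future
    exc = future.exception()
    if exc is not None:
        logger.error("background task failed: %r", exc)

def task(xxx):
    ...  # blocking I/O as in the question

f = executor.submit(task, "some-value")
f.add_done_callback(log_exception)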
That depends on more than just Flask, such as what you are running in front of Flask (gunicorn, gevent, uwsgi, nginx, etc.). If you find that your request to "some_url" really is a bottleneck, pushing it to another thread may give you a boost, but again that depends on your individual situation; many elements in a web stack can make the whole thing "slow".
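For context (an assumption about a typical deployment, not something from the question): a production WSGI server usually already provides process- and thread-level concurrency, which affects how much an extra thread pool inside the app buys you. For example, gunicorn can run several worker processes, each with its own request threads:

gunicorn --worker-class gthread --workers 4 --threads 8 app:app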
Rather than using multithreading inside the Flask process (which gets complicated quickly), pushing the blocking I/O to a helper process may be a better solution. You could publish a Redis message to a process running on an asyncio event loop, and that will scale nicely.
app.py
from flask import Flask
import redis

r = redis.StrictRedis(host='127.0.0.1', port=6379)
app = Flask(__name__)

@app.route("/")
def hello():
    # send your message to the other process with redis
    r.publish('some-channel', 'some data')
    return "OK"

if __name__ == '__main__':
    app.run(port=4000, debug=True)
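To sanity-check the publisher on its own (my own sketch, not part of the original answer; it assumes a local Redis on the default port and the same synchronous redis-py client as app.py), you can subscribe from a separate Python shell and then hit the Flask route:

import redis

r = redis.StrictRedis(host='127.0.0.1', port=6379)
p = r.pubsub()
p.subscribe('some-channel')
for message in p.listen():
    # the first message is the subscribe confirmation; published data follows
    print(message)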
helper.py
import asyncio
import asyncio_redis
import aiohttp

@asyncio.coroutine
def get_page():
    # get some url
    req = yield from aiohttp.get('http://example.com')
    data = yield from req.read()

    # insert into mongo using Motor or some other async DBAPI
    #yield from insert_into_database(data)

@asyncio.coroutine
def run():
    # Create connection
    connection = yield from asyncio_redis.Connection.create(host='127.0.0.1', port=6379)

    # Create subscriber.
    subscriber = yield from connection.start_subscribe()

    # Subscribe to channel.
    yield from subscriber.subscribe(['some-channel'])

    # Inside a while loop, wait for incoming events.
    while True:
        reply = yield from subscriber.next_published()
        print('Received: ', repr(reply.value), 'on channel', reply.channel)
        yield from get_page()

    # When finished, close the connection.
    connection.close()

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(run())
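Note that helper.py above uses the old generator-based coroutine style (@asyncio.coroutine / yield from) and the aiohttp.get() shortcut, which only run on older Python and aiohttp versions. A rough modern equivalent, assuming redis-py 4.2+ (redis.asyncio) in place of asyncio_redis and a current aiohttp, might look like the sketch below; treat it as an illustration, not a drop-in replacement.

import asyncio

import aiohttp
import redis.asyncio as aioredis

async def get_page(session):
    # fetch some url on the event loop instead of a worker thread
    async with session.get('http://example.com') as resp:
        data = await resp.read()
    # insert into mongo using Motor or some other async driver
    # await insert_into_database(data)

async def run():
    r = aioredis.Redis(host='127.0.0.1', port=6379)
    pubsub = r.pubsub()
    await pubsub.subscribe('some-channel')

    async with aiohttp.ClientSession() as session:
        # wait for messages published by the Flask app
        async for message in pubsub.listen():
            if message['type'] != 'message':
                continue  # skip subscribe confirmations
            print('Received:', message['data'], 'on channel', message['channel'])
            await get_page(session)

if __name__ == '__main__':
    asyncio.run(run())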