I've been exploring the Oxygen package to build an API, and I've run into a problem I can't solve.
using Oxygen

@get "/model" function()
    sleep(0.15)
    return json("ok")
end

serve(port = 8001)
I get about 560 RPS (concurrency = 100):
ab -n1000 -c100 'http://localhost:8001/model'
Server Software:
Server Hostname: localhost
Server Port: 8001
Document Path: /model
Document Length: 4 bytes
Concurrency Level: 100
Time taken for tests: 1.756 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 83000 bytes
HTML transferred: 4000 bytes
Requests per second: 569.62 [#/sec] (mean)
Time per request: 175.556 [ms] (mean)
Time per request: 1.756 [ms] (mean, across all concurrent requests)
Transfer rate: 46.17 [Kbytes/sec] received
When I build a second Oxygen API that calls the first one:
using Oxygen
using HTTP

@get "/test" function()
    response = HTTP.get("http://localhost:8001/model") # sleep(0.15) - just this line is different
    return json("ok")
end

serve(port = 8002)
Even with 4 threads and serveparallel, I only reach about 100 RPS. I expected a number close to 560.
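For reference, the multithreaded variant looks roughly like this (a sketch; it assumes Oxygen's serveparallel and launching Julia with --threads 4):

```julia
# Start with: julia --threads 4 threaded.jl
using Oxygen, HTTP

@get "/test" function()
    # outbound call to the first API on port 8001
    HTTP.get("http://localhost:8001/model")
    return json("ok")
end

# serveparallel distributes request handling across the available threads
serveparallel(port = 8002)
```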
ab -n1000 -c100 'http://localhost:8002/test'
Server Software:
Server Hostname: localhost
Server Port: 8002
Document Path: /test
Document Length: 4 bytes
Concurrency Level: 100
Time taken for tests: 9.815 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 83000 bytes
HTML transferred: 4000 bytes
Requests per second: 101.89 [#/sec] (mean)
Time per request: 981.455 [ms] (mean)
Time per request: 9.815 [ms] (mean, across all concurrent requests)
Transfer rate: 8.26 [Kbytes/sec] received
I think the problem lies in the HTTP.get call inside the Julia API; maybe there is a better way to do this, or some way to speed it up. Any tips are welcome. I tested with: Julia versions 1.9.4 and 1.10.1, Oxygen versions 1.4.9 and 1.5.0, HTTP version 1.10.2.
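One thing that may be worth checking (this is an assumption on my part, not something confirmed above): HTTP.jl routes outbound requests through a per-host connection pool with a fairly small default limit, so 100 concurrent inbound requests could end up queuing on a handful of outbound connections. A sketch of raising that limit:

```julia
using Oxygen, HTTP

# Raise HTTP.jl's default per-host connection limit so that 100
# concurrent inbound requests are not serialized on outbound calls.
HTTP.set_default_connection_limit!(100)

@get "/test" function()
    HTTP.get("http://localhost:8001/model")
    return json("ok")
end

serve(port = 8002)
```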
If I do the same thing in Python with FastAPI and requests:
import requests
from fastapi import FastAPI

app = FastAPI()

@app.get("/test")
def test():
    response = requests.get("http://localhost:8001/model")
    return "ok"
With a single worker (uvicorn api_test_nested:app --host 0.0.0.0 --port 8003 --workers 1) I get around 220 RPS, and with 4 workers I reach almost 450 RPS.
ab -n1000 -c100 'http://localhost:8003/test'
Server Software: uvicorn
Server Hostname: localhost
Server Port: 8003
Document Path: /test
Document Length: 6 bytes
Concurrency Level: 100
Time taken for tests: 4.436 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 149000 bytes
HTML transferred: 6000 bytes
Requests per second: 225.42 [#/sec] (mean)
Time per request: 443.618 [ms] (mean)
Time per request: 4.436 [ms] (mean, across all concurrent requests)
Transfer rate: 32.80 [Kbytes/sec] received
Progress: I changed the /test API to use curl instead of HTTP.jl and got a huge improvement, so I'm almost certain something odd is going on in HTTP.jl.
using Oxygen

@get "/test" function()
    response = run(`curl http://localhost:8001/model`) # instead of HTTP.request("GET", "http://localhost:8001/model")
    return json("ok")
end

serve(port = 8002)
With this small change I went from 100 RPS to roughly 440 RPS, much closer to the 560 RPS I get from /model, and about twice FastAPI's throughput. Any suggestions?
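As a middle ground between HTTP.jl and shelling out to curl, one untested variant would be Julia's stdlib Downloads.jl, which drives libcurl in-process instead of spawning a subprocess per request:

```julia
using Oxygen, Downloads

@get "/test" function()
    # Downloads.request uses libcurl in-process; no external `curl` subprocess
    resp = Downloads.request("http://localhost:8001/model")
    return json("ok")
end

serve(port = 8002)
```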
You seem to be wondering about two things; that holds for this example as well. I tried running a multiprocess version of the code on my machine using Distributed.jl, and performance improved nicely as I added more processes.
# multiprocess.jl
using Oxygen, HTTP, Distributed
@everywhere using HTTP  # workers also need HTTP.jl loaded

@get "/test" function()
    future = Distributed.@spawnat :any begin
        HTTP.get("http://localhost:8001/model")
    end
    fetch(future)
    return json("ok")
end

serve(port = 8002)
With only 1 process:
$ julia multiprocess.jl
$ ab -n 1000 -c 100 http://localhost:8002/test
Server Software:
Server Hostname: localhost
Server Port: 8002
Document Path: /test
Document Length: 4 bytes
Concurrency Level: 100
Time taken for tests: 9.825 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 89000 bytes
HTML transferred: 4000 bytes
Requests per second: 101.78 [#/sec] (mean)
Time per request: 982.481 [ms] (mean)
Time per request: 9.825 [ms] (mean, across all concurrent requests)
Transfer rate: 8.85 [Kbytes/sec] received
With 4 processes:
$ julia -p 4 multiprocess.jl
$ ab -n 1000 -c 100 http://localhost:8002/test
Server Software:
Server Hostname: localhost
Server Port: 8002
Document Path: /test
Document Length: 4 bytes
Concurrency Level: 100
Time taken for tests: 2.603 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 89000 bytes
HTML transferred: 4000 bytes
Requests per second: 384.24 [#/sec] (mean)
Time per request: 260.252 [ms] (mean)
Time per request: 2.603 [ms] (mean, across all concurrent requests)
Transfer rate: 33.40 [Kbytes/sec] received
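For completeness, the same scaling should be reachable without the -p flag by adding workers from inside the script (a sketch using Distributed's addprocs; not benchmarked here):

```julia
# multiprocess_addprocs.jl -- hypothetical variant of multiprocess.jl
using Distributed
addprocs(4)               # spawn 4 local worker processes
@everywhere using HTTP    # load HTTP.jl on every worker

using Oxygen

@get "/test" function()
    # offload the blocking outbound call to any free worker
    fetch(@spawnat :any HTTP.get("http://localhost:8001/model"))
    return json("ok")
end

serve(port = 8002)
```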