    Python REST frameworks performance comparison

    Date: 10.08.2020


    Modern DevOps solutions allow us to scale the applications quickly. Scaling doesn’t always mean that we are slowly gaining customers and need to add more instances or servers. Scaling up can also happen periodically as the demand for our app might be higher in the evening or during the weekends.

    There is also a possibility of reaching unexpected traffic peaks related to particular events, or when our app goes viral. No matter the reason, it is essential to keep up with these changes. Keep in mind that tools used for scaling work best if the architecture of the app supports scaling in the first place.

    One example of an architecture that scales well is microservices. According to a study published by Mike Loukides and Steve Swoyer, over 76% of organizations use microservices. Each microservice is deployed separately in its own container. In the case of higher traffic or demand, we can easily replicate a microservice, assigning more resources to it. When the demand decreases again, we can remove the replicas. Microservices typically communicate with each other using RESTful APIs, so it is essential to select the right framework for a particular microservice. The focus of this blog post is a performance comparison of Python REST frameworks.

    Libraries under test

    We gathered 11 popular Python frameworks that can be used as REST API servers and put them under the same synthetic test. The table below lists the selected Python REST frameworks with the most recent versions available at the time of the analysis.

    Library Version
    aiohttp 3.6.2
    bottle 0.12.18
    djangorestframework 3.11.0
    eve 1.1.2
    falcon 2.0.0
    fastapi 0.60.1
    Flask 1.1.2
    hug 2.6.1
    pyramid 1.10.4
    sanic 20.6.3
    tornado 6.0.4

    Test conditions

    Tests were performed on a single machine (i.e., REST API server and client were executed on localhost, and requests were performed using loopback interface). Hardware that ran the tests was as follows:

    CPU: Intel Core i7-8550U @ 8x 4GHz

    MEM: 16 GB

    OS: Ubuntu 20.04 focal

    Kernel: x86_64 Linux 5.4.0-40-generic

    All servers were run in parallel, each on a separate port. Outputs were redirected to /dev/null so that printing any additional information wouldn’t slow down the responses. Every server used a similar, most basic “Hello World” example with a single endpoint (‘/’), and requests were made using the GET method.
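As an illustration of the kind of endpoint that was benchmarked, the Flask variant of such a minimal "Hello World" server looks roughly like this (the other frameworks used their equivalent quickstart examples; the port is an assumption to adapt):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    # single endpoint ('/') returning a plain "Hello World" body for GET requests
    return "Hello World"

# app.run(port=8000)  # start the development server on the chosen port
```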

    Although the servers were run all at the same time, they were tested separately. In order to avoid the influence of varying load from OS processes, tests were repeated ten times using the Round Robin approach, as shown in the Figure below. In the first round of tests, response times from all servers were gathered, and then the whole process was repeated.

    To measure the number of requests that a particular framework can handle, we used the wrk tool. It allows us to spawn several threads, where each thread can use multiple connections (clients) to perform requests. One of the outputs of wrk is the number of requests per second, which is the metric of most interest in this comparison.

    Test results

    Let’s take a look at the most exciting part of this blog post, namely the results. The Figure shows the mean value and standard deviation of 10 test runs. As the standard deviations show, the results were repeatable, so the excellent performance is not a matter of luck.

    Sanic is by far the best framework in terms of requests per second, followed by fastapi and aiohttp. On the other hand, Flask, eve (which is powered by Flask) and djangorestframework might not be the best choice if you are a speed-oriented kind of developer.

    Library Requests/sec
    sanic 10377
    aiohttp 7947
    fastapi 7501
    falcon 4904
    bottle 3900
    tornado 3819
    hug 3512
    pyramid 3346
    Flask 2272
    eve 1401
    djangorestframework 584

    To better grasp the differences between the libraries, take a look at the table below, which defines the slowest framework as a baseline and shows how many times faster the other libraries are:

    Library Relative performance
    sanic 17.77
    aiohttp 13.61
    fastapi 12.84
    falcon 8.4
    bottle 6.68
    tornado 6.54
    hug 6.01
    pyramid 5.73
    Flask 3.89
    eve 2.4
    djangorestframework 1
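The relative numbers above are simply each framework's mean throughput divided by that of the slowest framework. A quick way to reproduce them from the requests-per-second table:

```python
# Mean requests/sec from the table above
rps = {
    "sanic": 10377, "aiohttp": 7947, "fastapi": 7501, "falcon": 4904,
    "bottle": 3900, "tornado": 3819, "hug": 3512, "pyramid": 3346,
    "Flask": 2272, "eve": 1401, "djangorestframework": 584,
}

baseline = min(rps.values())  # slowest framework (djangorestframework) as baseline
relative = {lib: round(r / baseline, 2) for lib, r in rps.items()}
```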

    NOTE 1: The exact values of requests per second might vary based on OS, hardware, load, and many other factors. The purpose of these tests is to compare the relative performance of different frameworks and to point out the ones that are noticeably faster than others. If you are required to achieve, e.g., a minimum of 4000 rps, make sure that the selected library can handle such traffic under your conditions.

    NOTE 2: The processing done by the library was minimal (Hello world example) with default settings. If you add logic to your endpoints, data processing, database connections, and so on, your results will be different.


    If you are interested in repeating these tests, you might find the following bash command useful:

    wrk --duration 10s --threads 5 --connections 20 http://localhost:8000 | awk '/Requests\/sec:/{printf "%s\n", $2;}'

    It performs wrk-based benchmarking using five threads and 20 connections on localhost:8000 (change it to your needs) and then extracts only the number of requests per second from the whole wrk output.

    Then you can easily repeat this test several times using a loop in bash:

    for run in {1..10}; do wrk -d10s -t5 -c20 http://localhost:8000 | awk '/Requests\/sec:/{printf "%s\n", $2;}'; done
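If you prefer Python over awk for the extraction step, a small parser for wrk's text report might look like the sketch below (the commented-out subprocess driver and port are assumptions to adapt to your setup):

```python
import re

def parse_rps(wrk_output: str) -> float:
    """Extract the Requests/sec value from wrk's text report."""
    match = re.search(r"Requests/sec:\s*([\d.]+)", wrk_output)
    if match is None:
        raise ValueError("no Requests/sec line found in wrk output")
    return float(match.group(1))

# Example driver: run wrk via subprocess and parse its report
# import subprocess
# out = subprocess.run(
#     ["wrk", "-d10s", "-t5", "-c20", "http://localhost:8000"],
#     capture_output=True, text=True, check=True,
# ).stdout
# print(parse_rps(out))
```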


    Other materials

    If you are interested in more details about aiohttp framework, take a look at the Routing order in aiohttp library in Python.


    Mateusz Buczkowski

    Mateusz Buczkowski is a senior software engineer and the leader of the R&D team. At Grandmetric, Mateusz is responsible for the shape of the software projects and takes part in backend development. On top of that, he researches and develops IoT and wireless solutions. As an R&D engineer, he took part in two FP7 EU projects, namely 5GNOW and SOLDER, where he worked on solutions that could be used in 5th Generation wireless networks.

    Adam Hopkins
    7 January 2021 at 07:19

    Sanic project maintainer here (I saw a lot of traffic coming from your post to the repo). Thanks for sharing your results. I would be curious to know about what kind of endpoints you were testing against.

    Also, as a complete aside, we are about to merge into master for release in March a branch that increases performance anywhere from 7-10% on average.

    Mateusz Buczkowski
    12 January 2021 at 14:46

    Hi Adam,
    it’s great to hear that our post had an impact on people’s interest in the Sanic project. In this test scenario, we focused on the easiest setup and used the examples provided by the authors in the docs or readme. We were making GET requests on endpoints without any further processing (like DB connections or data validation).
    In the future, we would like to extend this kind of test. Is there anything in particular that you would be interested in, like performance under specific conditions, requested payload size, etc.?

    Shaowei Su
    10 June 2021 at 00:30

    Curious about the testing endpoints for the various frameworks, e.g., FastAPI: is it defined using async def so that the framework runs with ASGI?

    Mateusz Buczkowski
    30 June 2021 at 13:21

    All of the frameworks were tested using the most basic example provided by the authors. In the case of FastAPI, it was in fact an async function returning a “Hello world” JSON.


