My team and I fell in love with FastAPI when we started using it almost a year and a half ago. It really is a great framework: excellent performance, easy to use with async code, tight integration with Pydantic, and the convenience of BackgroundTasks.
We still love it, and I personally think it's currently the best Python web framework for building APIs. However, we are now moving away from BackgroundTasks due to a few drawbacks that come with using it.
What is BackgroundTasks?
You can use BackgroundTasks to run a task that takes some time to complete, but that the person making a request to your backend does not really have to wait for before you respond to them. Examples include sending a "forgot your password" email, copying files between two servers, or running some costly calculation. Let's use "send a forgot-password email" as the example, simply because it's quite a common thing to do.
Assume we have the following code and are not using background tasks:
# user_view.py
@app.post("/forgot-password/{email}")
async def send_forget_password_email(email: str):
    await user_domain.send_forget_password_email(email)
    return {
        "message": "A reset-password email has been sent "
        "if an account with that email exists on our platform."
    }
# user_domain.py
async def send_forget_password_email(email: str) -> None:
    user_exists = await user_model.user_exists(email=email)  # 1
    if user_exists:
        password_reset_token = await auth_service.get_password_reset_token(
            email
        )  # 2
        await email_service.send_forget_password_email(
            email=email, password_reset_token=password_reset_token
        )  # 3
In the code above, the user has to wait for #1, #2, and #3 to complete before they get the message that the email has been sent. Let's say that step #1 takes 20ms, #2 takes 700ms, and #3 takes 250ms. Let's also say that it takes 100ms for the user's request to reach your backend and the same for your response to reach the user. That means it would take 1170ms (20+700+250+100+100) for the user to be informed that the request was successful.
Let's now instead look at the code if we use background tasks.
# user_view.py
@app.post(
    "/forgot-password/{email}",
    status_code=status.HTTP_202_ACCEPTED
)
async def send_forget_password_email(
    email: str, background_tasks: BackgroundTasks
):
    background_tasks.add_task(
        user_domain.send_forget_password_email, email
    )
    return {
        "message": "A reset-password email has been sent "
        "if an account with that email exists on our platform."
    }
# user_domain.py
async def send_forget_password_email(email: str) -> None:
    user_exists = await user_model.user_exists(email=email)  # 1
    if user_exists:
        password_reset_token = await auth_service.get_password_reset_token(
            email
        )  # 2
        await email_service.send_forget_password_email(
            email=email, password_reset_token=password_reset_token
        )  # 3
When using a background task, we no longer need to wait for #1, #2, and #3 to complete before responding. This means that the user only has to wait (around) 200ms before they are informed that the request was successful. That is an 82.9% reduction in time. And with such a small code change! Amazing!
Now that we know what BackgroundTasks is, let's take a look at the gotchas/drawbacks.
The gotchas
No persistence
If your service crashes or is restarted while a background task is queued or running, there is no way for the task to continue or be restarted after the service comes back up. Worse, you won't even know that the background task was started but never finished.
This might not be super important when sending a "forgotten password" email, but there might be other things you do in background tasks that you definitely want to know about if they were not completed.
Untraceable errors
Everyone hates when an error occurs. But if you just use background tasks as above, you will get the worst kind of error there is: an error that contains no information about what caused it.
Let's say that we use a background task and user_model.user_exists raises InvalidEmailException(f"'{email}' is not a valid email") (you should probably have checked email validity earlier, but let's go with this as an example). The error that you will see in the console will look something like this:
example-api | ERROR: Exception in ASGI application
example-api | Traceback (most recent call last):
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 369, in run_asgi
example-api | result = await app(self.scope, self.receive, self.send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 59, in __call__
example-api | return await self.app(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/fastapi/applications.py", line 208, in __call__
example-api | await super().__call__(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/applications.py", line 112, in __call__
example-api | await self.middleware_stack(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/errors.py", line 181, in __call__
example-api | raise exc from None
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/errors.py", line 159, in __call__
example-api | await self.app(scope, receive, _send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/cors.py", line 86, in __call__
example-api | await self.simple_response(scope, receive, send, request_headers=headers)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/cors.py", line 142, in simple_response
example-api | await self.app(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 26, in __call__
example-api | await response(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/responses.py", line 224, in __call__
example-api | await run_until_first_complete(
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/concurrency.py", line 24, in run_until_first_complete
example-api | [task.result() for task in done]
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/concurrency.py", line 24, in <listcomp>
example-api | [task.result() for task in done]
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/responses.py", line 216, in stream_response
example-api | async for chunk in self.body_iterator:
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 56, in body_stream
example-api | task.result()
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 38, in coro
example-api | await self.app(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 26, in __call__
example-api | await response(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/responses.py", line 224, in __call__
example-api | await run_until_first_complete(
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/concurrency.py", line 24, in run_until_first_complete
example-api | [task.result() for task in done]
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/concurrency.py", line 24, in <listcomp>
example-api | [task.result() for task in done]
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/responses.py", line 216, in stream_response
example-api | async for chunk in self.body_iterator:
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 56, in body_stream
example-api | task.result()
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 38, in coro
example-api | await self.app(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/exceptions.py", line 86, in __call__
example-api | raise RuntimeError(msg) from exc
example-api | RuntimeError: Caught handled exception, but response already started.
example-api | 2021-08-09 10:29:31,539 ERROR [uvicorn.error:372][MainThread] Exception in ASGI application
example-api | Traceback (most recent call last):
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 369, in run_asgi
example-api | result = await app(self.scope, self.receive, self.send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 59, in __call__
example-api | return await self.app(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/fastapi/applications.py", line 208, in __call__
example-api | await super().__call__(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/applications.py", line 112, in __call__
example-api | await self.middleware_stack(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/errors.py", line 181, in __call__
example-api | raise exc from None
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/errors.py", line 159, in __call__
example-api | await self.app(scope, receive, _send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/cors.py", line 86, in __call__
example-api | await self.simple_response(scope, receive, send, request_headers=headers)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/cors.py", line 142, in simple_response
example-api | await self.app(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 26, in __call__
example-api | await response(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/responses.py", line 224, in __call__
example-api | await run_until_first_complete(
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/concurrency.py", line 24, in run_until_first_complete
example-api | [task.result() for task in done]
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/concurrency.py", line 24, in <listcomp>
example-api | [task.result() for task in done]
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/responses.py", line 216, in stream_response
example-api | async for chunk in self.body_iterator:
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 56, in body_stream
example-api | task.result()
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 38, in coro
example-api | await self.app(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 26, in __call__
example-api | await response(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/responses.py", line 224, in __call__
example-api | await run_until_first_complete(
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/concurrency.py", line 24, in run_until_first_complete
example-api | [task.result() for task in done]
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/concurrency.py", line 24, in <listcomp>
example-api | [task.result() for task in done]
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/responses.py", line 216, in stream_response
example-api | async for chunk in self.body_iterator:
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 56, in body_stream
example-api | task.result()
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/middleware/base.py", line 38, in coro
example-api | await self.app(scope, receive, send)
example-api | File "/home/python/.cache/pypoetry/virtualenvs/example-o9msT97p-py3.9/lib/python3.9/site-packages/starlette/exceptions.py", line 86, in __call__
example-api | raise RuntimeError(msg) from exc
example-api | RuntimeError: Caught handled exception, but response already started.
Isn't that helpful? /s This error is raised by starlette, one of FastAPI's dependencies. And as you can see, there is no way to figure out what actually caused the error, so there is no good way of debugging this and finding the cause, especially if you have a lot of traffic on your server.
Global error handlers prevent BackgroundTasks from being triggered
Thank you to hexarobi for making me aware of this gotcha after reading this blog.
If an error is raised all the way up to a global error handler, then the background task won't be triggered. Whether this is a problem depends on your use case.
To exemplify this, let's say that you want to send a warning to a user whenever someone tries to log in to their account with the wrong password. Also, assume that we have a global error handler that returns a 403 on the exception InvalidCredentials.
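That global error handler itself isn't shown in the snippets below, but a minimal sketch of what it could look like, using FastAPI's @app.exception_handler decorator (the import path for the exceptions module is an assumption for illustration), might be:
# main.py (minimal sketch of the assumed global error handler)
from fastapi import FastAPI, Request, status
from fastapi.responses import JSONResponse

from example import exceptions  # hypothetical import path for InvalidCredentials

app = FastAPI()


@app.exception_handler(exceptions.InvalidCredentials)
async def invalid_credentials_handler(
    request: Request, exc: exceptions.InvalidCredentials
) -> JSONResponse:
    # The handler builds a brand new response; anything attached to the
    # original response (including its BackgroundTasks) is discarded.
    return JSONResponse(
        status_code=status.HTTP_403_FORBIDDEN,
        content={"errors": "Incorrect username or password"},
    )
With that assumed handler in place, the login endpoint and domain code look like this: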
# auth_view.py
@router.post(
    "/oauth/token",
    response_model=auth_view_schemas.UserAuthTokenResponse,
)
async def login(
    credentials: auth_view_schemas.PasswordGrantTypePayload,
    response: Response,
    background_tasks: BackgroundTasks
) -> dto.JSON:
    authentication_tokens = await auth_domain.get_tokens(
        credentials, background_tasks
    )
    response.set_cookie(
        # ... omitted
    )
    return authentication_tokens.dict()
# auth_domain.py
async def get_tokens(
    credentials: auth_view_schemas.PasswordGrantTypePayload,
    background_tasks: BackgroundTasks
):
    try:
        await verify_credentials(credentials)
    except exceptions.InvalidCredentials as e:
        background_tasks.add_task(
            send_invalid_password_warning, credentials.email
        )
        raise e
    return await generate_authentication_tokens(credentials.email)
With the above code, the global error handler will take care of exceptions.InvalidCredentials if verify_credentials raises it. The global error handler creates a new response and returns it. This means that the background task that was attached to the original response will never be triggered, since that response is never returned. In other words, the user will never be notified that someone tried to log in with their email.
Mitigating the drawbacks
Making sure background tasks run even on raised errors
This is most easily mitigated by catching the error in the view layer and then returning the appropriate response.
# auth_view.py
@router.post(
    "/oauth/token",
    response_model=auth_view_schemas.UserAuthTokenResponse,
)
async def login(
    credentials: auth_view_schemas.PasswordGrantTypePayload,
    response: Response,
    background_tasks: BackgroundTasks
) -> dto.JSON:
    try:
        authentication_tokens = await auth_domain.get_tokens(
            credentials, background_tasks
        )
    except exceptions.InvalidCredentials:
        return JSONResponse(
            status_code=status.HTTP_403_FORBIDDEN,
            content={"errors": "Incorrect username or password"},
        )
    # ... rest of function
Another way would be to create a new background task in the global error handler and return it with the response. I'll leave most of the details up to the reader ;) but a rough sketch of the idea follows.
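The key piece is that a Starlette response can carry its own background task, so the handler can attach one to the new response it creates. A minimal sketch, assuming the exception carries the attempted email on an exc.email attribute and that the import paths exist as shown:
# main.py (sketch only; exc.email and the import paths are assumptions)
from fastapi import FastAPI, Request, status
from fastapi.responses import JSONResponse
from starlette.background import BackgroundTask

from example import exceptions  # hypothetical import path
from example.auth_domain import send_invalid_password_warning  # hypothetical

app = FastAPI()


@app.exception_handler(exceptions.InvalidCredentials)
async def invalid_credentials_handler(
    request: Request, exc: exceptions.InvalidCredentials
) -> JSONResponse:
    return JSONResponse(
        status_code=status.HTTP_403_FORBIDDEN,
        content={"errors": "Incorrect username or password"},
        # The task is attached to the response the handler returns,
        # so Starlette runs it after this response has been sent.
        background=BackgroundTask(send_invalid_password_warning, exc.email),
    )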
Making the untraceable errors traceable
There is actually a very nice way to make the error traceable by using a decorator.
The following decorator:
import inspect
import logging
import uuid
from functools import wraps
from typing import Any, Callable

logger = logging.getLogger(__name__)


def background_task_wrapper(func: Callable) -> Callable:
    task_name = func.__name__

    @wraps(func)
    async def wrapper(*args: Any, **kwargs: Any) -> None:
        task_id = uuid.uuid4()
        func_args = inspect.signature(func).bind(*args, **kwargs).arguments
        func_args_str = ", ".join(
            "{}={!r}".format(*item) for item in func_args.items()
        )
        logger.info(
            f"[{task_id}] Started {task_name} with arguments: {func_args_str}"
        )
        try:
            await func(*args, **kwargs)
            logger.info(f"[{task_id}] Finished {task_name} Successfully")
        except Exception as e:  # 4
            logger.error(
                f"[{task_id}] Failed Permanently {task_name} with error: {e}"
            )
            # 5
    return wrapper
gives the following log:
example-api | 2021-08-09 12:25:05,263 INFO [example.user_model.user_exists:20][MainThread] [4a683449-f5eb-463d-8eef-f7dc8ac8ba57] Started user_exists with arguments: email='hello there@examle.com'
example-api | 2021-08-09 12:25:05,263 ERROR [example.user_model.user_exists:26][MainThread] [4a683449-f5eb-463d-8eef-f7dc8ac8ba57] Failed Permanently user_exists with error: 'hello there@examle.com' is not a valid email
You don't get the whole traceback, but the log contains (what I think is) the most important information: the name of the function that failed and the arguments that were passed to it when it failed. (Please note that the line numbers are "incorrect", as they are the line numbers of where the logger statements are inside the wrapper.)
An important thing to notice is that we want to catch ALL exceptions (#4) and we do not want to re-raise the exception we catch (#5). If we don't catch all exceptions, or if we raise the exception again, we will still get the unhelpful but super long error log in our terminal.
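To put the decorator to use, apply it to the async function that you hand to add_task (or to any async function you want traced this way). A minimal sketch, reusing send_forget_password_email from earlier and assuming the decorator lives in a hypothetical example.utils module:
# user_domain.py
from example.utils import background_task_wrapper  # hypothetical import path


@background_task_wrapper
async def send_forget_password_email(email: str) -> None:
    user_exists = await user_model.user_exists(email=email)
    if user_exists:
        password_reset_token = await auth_service.get_password_reset_token(email)
        await email_service.send_forget_password_email(
            email=email, password_reset_token=password_reset_token
        )
The view code does not change: background_tasks.add_task(...) is called exactly as before, and every run is now logged with a task id, the function name, and its arguments.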
Making the task persistent
I'm sorry, but this can't be mitigated by just using a decorator or some other simple thing. You could semi-manually look through the logs to try to identify which background tasks started before the crash but were never logged as successful. You could save the calls to a DB and then mark them as completed (or delete them) once they finish. But that would just be stupid.
What we ended up doing is switching to RabbitMQ, which is a message broker. We had started to need a message broker for other reasons anyway, so using it to replace BackgroundTasks was an easy choice for solving the task-persistency problem.
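We won't cover our RabbitMQ setup here, but to give a rough idea of the shape this takes: the endpoint publishes a message instead of scheduling a task, and a separate consumer process performs the actual work and can pick messages back up after a restart. Below is a minimal sketch (not our actual code) using the aio-pika client, with an illustrative queue name and connection URL; a real setup would declare a durable queue and reuse a single connection instead of opening one per request.
# user_view.py (sketch only; queue name, URL, and consumer are illustrative)
import json

import aio_pika


@app.post("/forgot-password/{email}", status_code=status.HTTP_202_ACCEPTED)
async def send_forget_password_email(email: str):
    connection = await aio_pika.connect_robust("amqp://guest:guest@rabbitmq/")
    async with connection:
        channel = await connection.channel()
        # Unlike a queued BackgroundTask, a persistent message on a durable
        # queue survives a service restart and is processed by the consumer.
        await channel.default_exchange.publish(
            aio_pika.Message(
                body=json.dumps({"email": email}).encode(),
                delivery_mode=aio_pika.DeliveryMode.PERSISTENT,
            ),
            routing_key="forgot-password-emails",
        )
    return {
        "message": "A reset-password email has been sent "
        "if an account with that email exists on our platform."
    }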