Examples¶
requests¶
HTTP requests serve as a canonical example of an unreliable operation.
decorators¶
Using decorators to retry functions is a popular choice, and waiter
supports this pattern.
[2]:
from waiter import wait
import requests
backoff = wait(0.1) * 2
url = 'https://httpbin.org/status/200'
@backoff.retrying(OSError)
def get_url(url):
    return requests.get(url)

get_url(url)
[2]:
<Response [200]>
functions¶
But there’s a problem with this approach: the implementer of the unreliable function chooses the retry strategy instead of the caller. In practice this means the decorated function is often just a wrapper around the underlying implementation.
The above example could just as easily be a partially bound function, and that is in fact how the waiter
decorators are implemented. This approach also facilitates using sessions, which should be done for repeated requests anyway.
[3]:
from functools import partial
get_url = partial(backoff.retry, OSError, requests.Session().get)
get_url(url)
[3]:
<Response [200]>
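If `partial` is unfamiliar: it pre-binds leading positional arguments, so the call above supplies `url` as the final argument. A toy illustration, where `retry` is a hypothetical stand-in with the same argument order as `backoff.retry`:

```python
from functools import partial

def retry(exception, func, *args):
    """Stand-in with backoff.retry's argument shape: exception, callable, then args."""
    try:
        return func(*args)
    except exception:
        return func(*args)  # one naive retry

get = partial(retry, ValueError, str.upper)
assert get('ok') == 'OK'  # equivalent to retry(ValueError, str.upper, 'ok')
```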
This in turn raises the question of whether get_url
is worth abstracting at all. The completely inlined variation is arguably just as readable.
[4]:
backoff.retry(OSError, requests.Session().get, url)
[4]:
<Response [200]>
[5]:
backoff.poll(lambda r: r.ok, requests.Session().get, url)
[5]:
<Response [200]>
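poll inverts retry: rather than catching exceptions, it repeats the call until a predicate on the result is satisfied. A stdlib-only sketch of that semantic (not waiter's implementation):

```python
import time

def poll(predicate, func, *args, delays=(0.0, 0.0)):
    """Call func until predicate(result) is true; one attempt per delay."""
    for delay in delays:
        result = func(*args)
        if predicate(result):
            return result
        time.sleep(delay)
    raise ValueError('predicate never satisfied')

counter = iter(range(3))
assert poll(lambda n: n >= 1, next, counter) == 1  # first result 0 fails the predicate
```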
iteration¶
But even the functional approach breaks down if the unreliable code is more naturally expressed as a block, or there are multiple failure conditions, or logging is required, etc. It’s not worth creating what amounts to a domain-specific language just to avoid a for-loop.
[6]:
import logging
def get_url(url):
    """Retry and log both connection and http errors."""
    with requests.Session() as session:
        for _ in backoff[:1]:
            try:
                resp = session.get(url)
            except OSError:
                logging.exception(url)
                continue
            if resp.ok:
                return resp
            logging.error(f'{url} {resp.status_code}')
    return None

get_url('https://httpbin.org/status/404')
ERROR:root:https://httpbin.org/status/404 404
ERROR:root:https://httpbin.org/status/404 404
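The slice backoff[:1] caps the number of waits; assuming waiter yields once immediately and then once per delay, a single delay yields two loop iterations, which matches the two logged errors above. The capping itself is ordinary itertools.islice behavior, as this stdlib-only analogue shows:

```python
import itertools
import time

def waits(delays):
    """Yield immediately, then sleep before each subsequent yield."""
    yield 0.0
    for delay in delays:
        time.sleep(delay)
        yield delay

# one capped delay -> two iterations of the retry loop
attempts = list(waits(itertools.islice(itertools.repeat(0.0), 1)))
assert len(attempts) == 2
```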
asyncio¶
waiter
also supports async iteration and coroutine functions.
[7]:
import httpx
async def get_url(url):
    async with httpx.AsyncClient() as client:
        return await backoff.retry(OSError, client.get, url)

get_url(url)
[7]:
<coroutine object get_url at 0x7fc2447babc8>
[8]:
async def get_url(url):
    async with httpx.AsyncClient() as client:
        async for _ in backoff:
            resp = await client.get(url)
            if resp.status_code == 200:
                return resp

get_url(url)
[8]:
<coroutine object get_url at 0x7fc2447badc8>
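The cells display coroutine objects because the notebook does not await them; in a script they would be driven by an event loop, e.g. with asyncio.run. The async-for pattern can be sketched with the stdlib alone (an illustrative analogue, not waiter's code):

```python
import asyncio

async def awaits(delays):
    """Async analogue of iteration: yield immediately, then sleep between yields."""
    yield 0.0
    for delay in delays:
        await asyncio.sleep(delay)
        yield delay

async def poll_async(predicate, func):
    """Call the coroutine function until the predicate is satisfied."""
    async for _ in awaits([0.0, 0.0]):
        result = await func()
        if predicate(result):
            return result
    return None

calls = []

async def flaky():
    calls.append(1)
    return len(calls)  # grows by one per call

assert asyncio.run(poll_async(lambda n: n >= 2, flaky)) == 2
```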