The only issue I've run into has been with my attempt to reuse the same AsyncClient to make multiple concurrent requests to the same remote host. It looks like this issue may have been fixed in 0.10 or 0.11 so I'll be upgrading soon to check.
Also, be sure to check out the other fantastic projects by Encode (https://github.com/encode). I stumbled upon httpx after using Starlette and Uvicorn for one of our microservices and have been pleasantly surprised by how easy they are to set up and use.
Do you know if those exist in butterfly?
I'm already deep in aiohttp and it's not an easy task to learn an async client (at least in my case, but I'm no dev) so if I do switch it would be for more features (retry option on exceptions, limit requests per host and per time frame...). But that's only my opinion.
This is a huge win compared to requests. AFAICT requests is too flexible (read: easier to misuse) and difficult to retrofit with type annotations now. The author of requests gave a horrible type annotation example here [0].
IMO, at this point, when you evaluate a new Python library before adopting it, "having type annotations" should be as important as "having decent unit test coverage".
I mean this is pretty spot on IMO. I've worked with many languages, and have concluded that having a powerful type system catches soooo many bugs before you even try to run the code.
And they're usually "stupid" bugs too, forgetting to sanitize inputs etc. Even worse is when a language tries to be "smart", so you end up with "1" + 2 = "12" and no errors at all.
The last one is no longer possible in Python 3 and will throw an error. That's another thing I am glad they fixed.
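A quick illustration of the point above: in Python 3 the mixed-type concatenation raises immediately rather than producing a surprising result.

```python
# In Python 3, mixing str and int raises instead of silently coercing
# (contrast JavaScript, where "1" + 2 evaluates to "12").
try:
    "1" + 2
    outcome = "no error"
except TypeError:
    outcome = "TypeError"

print(outcome)  # TypeError
```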
The complex union is, as you say, more a sign of an overly flexible API than a problem with type hints; the type hints just bring the root issue to the surface. I mean, why would you accept either a mapping or a list of tuples as input? Just let the user call dict(...) on the tuple list first if they have it in that format. The documentation doesn't even mention that lists are OK for headers, only dicts: https://2.python-requests.org/en/master/api/#main-interface.
The file-tuple API with tuples of various lengths is perhaps valid and the most convenient way to implement such options, but it's still an exceptionally unusual API which requires exceptional type hints. It can be made slightly simpler, as chc demonstrated above.
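To make the trade-off concrete, here is a hypothetical sketch (function and type names are made up, not from requests) of how accepting both forms forces a union on the signature, versus accepting only a mapping and letting callers convert:

```python
from typing import Mapping, Sequence, Tuple, Union

# Accepting both forms forces a union type on the signature:
HeaderTypes = Union[Mapping[str, str], Sequence[Tuple[str, str]]]

def flexible(headers: HeaderTypes) -> int:
    # Every consumer of `headers` now has to branch on the shape.
    items = headers.items() if isinstance(headers, Mapping) else headers
    return len(list(items))

# Accepting only a mapping keeps the signature simple; callers holding
# a tuple list just convert with dict(...) first.
def simple(headers: Mapping[str, str]) -> int:
    return len(headers)

print(simple(dict([("Accept", "application/json")])))  # 1
```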
This is a really awesome library, thanks!
A friend wrote respx https://github.com/lundberg/respx which is a utility to mock HTTP requests made by httpx. It works similar to the requests mocking library responses https://github.com/getsentry/responses
https://vorpus.org/blog/why-im-not-collaborating-with-kennet...
On every single project I do, it's just a bunch of posting JSON and getting a response synchronously. Over and over.
In some high-level languages even BSD sockets (and many other POSIX functions) aren't in the standard library, and there are various wrappers to provide "ease of use" and integration with a language's runtime system; plenty of complexity (and alternatives) even at that point.
RFC 2616 (HTTP/1.1, 1999) may seem manageable, but it's much more than just posting data and getting a response, and IME many programmers working with HTTP aren't familiar with all of its functionality. Then add TLS with SNI, cookies, CORS, WebSocket protocol, various authentication methods, try to wrap it into a nice API in a given language and not introduce too many bugs, and it's rather far from trivial. But that's just HTTP/1.1 with common[ly expected] extensions.
Edit: Though I think it'd also be controversial to add support for particular higher-level protocols into standard libraries of general-purpose languages, even if it was easy to implement and to come up with a nice API.
But it's probably not, as it's underspecified and ambiguous, which is part of why it's been replaced as the HTTP/1.1 spec by RFCs 7230-7237 (2014).
- Should the library include its own CA store, or use the system's CA store? These kinds of libraries often include their own CA store (since it changes often), and httpx seems to use a third-party library (certifi) to handle that. This is hard to do in a standard library for a variety of reasons (users rarely update their Python installation, the system CA store is not always available or up to date, etc.).
- While the HTTP protocol itself is pretty stable, some parts of it are still changing over time: compression types (brotli is gaining traction these days, and we might get new compression types in the future), new HTTP headers being added, etc. Security issues also show up all the time. Users will want a tighter release schedule than Python's so they can get these fixes sooner. The situation is even worse for users stuck on a particular version of Python for some reason, since they would never get these updates at all.
The CA store should be a configurable option, and one of the supported options should be the system CA store.
> The user will want tighter release schedule than python's so they can get these stuff sooner.
Ruby is moving stdlib to default and bundled gems, which addresses this. There's no reason that “delivered with the interpreter” needs to mean “frozen with the interpreter”.
It's easier for a third-party package to come up with a better API, because it can start brand new. Also, when there's a radical change, it is easier for a new third-party package to take over. Httpx is an example of such evolution, although this time driven not by changes in HTTP but by changes in Python: it makes use of new functionality that's harder to retrofit into requests, mainly async support and type annotations.
It's not a hard rule, sometimes things do end up in the standard library.
> Package http provides HTTP client and server implementations.
Will need to check how it compares with aiohttp which is quite good and also has these.
Yes, it has to be. Requests is not as great as everyone thinks it is. Its API is simple, sure, but when you need more advanced features, like timeouts or proper exception handling (which you cannot really do in requests), it actually sucks (it doesn't even support HTTP/2, AFAIK).
httpx is a far superior library already!
Starlette and Uvicorn are both made by Encode https://github.com/encode and in my experience they consistently put out quality stuff.
Regarding database access, my recommendation might not be popular, but I really like asyncpg[1]. They basically dropped DBAPI and instead created their own API that functionally matches PostgreSQL, giving you full control and increasing performance.
As a result you probably won't be able to use ORMs (there's a SQLAlchemy driver, but as I understand it, it's hacky and you lose the performance benefits).
Personally, I compared my code with and without an ORM and there was no change in the number of lines. If you use PyCharm and configure it to connect to your database (I think that might be a Pro feature), it will automatically detect SQL in your strings, provide autocomplete (including table and column names), and provide highlighting, removing all the advantages that the ORM provided.
Not enough experience with async to comment.
> DB
peewee is good enough and more ergonomic compared to SQLA for a lot of use cases.
> formatter
black. To be clear it often produces truly horrendous code, but at least there’s no arguing and no fussing over options or details.
I've moved to Trio over asyncio (I did plain old async for a couple years and Trio makes a ton of sense)
Quart-trio over Flask (just to get a Trio-friendly flask-a-like server) - plain old aiohttp worked really well too. It takes a bit more roll-your-own work, but you get exactly what you want.
peewee over SQLAlchemy (less committed to this change, but peewee has been fine so far and is much more streamlined). I'm mostly just using SQLite. The async version of the ORM looks pretty new; I'm not using it yet.
also, sqlalchemy is an over-engineered system imo. i only go for it when i have no other choice. otherwise i use a database client directly.
Edit: Well, looks like it does use certifi. But my grumble still stands; I don't understand why everyone wants to mess with your certs.
People should be empowered to substitute cert stores, but the system store should be the default.
Looking forward to trying it out
Edit: maybe the homepage can include a very simple async example as well?
Also, in the past I ran into hard-to-debug issues when there were lots of requests. I'll check again with httpx soon.
I do look forward to httpx becoming the new “standard” though. Tom is a great developer, and his ecosystem of tools is going to have a really big impact on Python web dev over the coming years.
I like to see the usage of async inside of a main loop, where other things can be processed while waiting for the async response to come in.
Further, not all applications have the idea of an event loop, so async may not be needed, but it's useful to have the option.
I use async operations to multiplex operations in a queue. The Linux scheduler can handle the execution time slices for me; I'm not going to build a scheduler. The queue controller's role is to accept jobs and handle timeouts and results of each async operation.
The await keyword only blocks the execution of the coroutine in which it is used. It releases the event loop so that other coroutines can continue processing while the result of an awaitable is being fetched.
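A small sketch of that behavior: each await on a sleep suspends only its own coroutine, so the loop interleaves the two workers and the faster one finishes first.

```python
import asyncio

async def worker(name: str, delay: float, log: list) -> None:
    await asyncio.sleep(delay)  # yields control to the event loop here
    log.append(name)

async def main() -> list:
    log = []
    # Both coroutines run concurrently on the same loop.
    await asyncio.gather(worker("slow", 0.2, log), worker("fast", 0.1, log))
    return log

order = asyncio.run(main())
print(order)  # ['fast', 'slow']
```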
Question: is there any advantage of this over flask’s builtin werkzeug test client?
[1] https://www.python-httpx.org/advanced/#calling-into-python-w...
Based on similar experience with other tools, that possibly means that Httpx is great for simple testing but if you need to go deep it's better to use the framework provided. That's an assumption, though, so I'd love to hear more from others.
Compare

    with httpx.Client(app=app) as client:
        ...

in the httpx sample code with

    with app.test_client() as client:
        ...

for flask’s builtin test client, and the rest is basically the same. Now that’s a fair comparison, and neither is simpler than the other. I guess one advantage of httpx is that developers might generally be more familiar with the requests response object API than the werkzeug response object API.
Will definitely be checking out and potentially replacing requests and aiohttp.
https://emilydamstra.com/news/please-enough-dead-butterflies...
Great package though. Love the dual support for async and sync requests.
Well that's going to work wonders for async now isn't it?