Where is the fault in this chain? How can I, as a maintainer of a few servers, be sure my servers are secure without manually patching every package?
[1] http://packages.ubuntu.com/xenial/libcurl3 [2] http://oss-security.openwall.org/wiki/mailing-lists/distros
EDIT: changed "12 CVEs" to "at least 15 CVEs". The changelog doesn't have CVE numbers in the title for all of them.
So if the Ubuntu security team does its job properly then you shouldn't have a reason to worry.
(However given the number of security vulns these days it's often challenging for LTS distributions to backport all security fixes. There are already breakdowns of the LTS concept, e.g. sticking with latest upstream versions for some packages like chromium where backporting is not realistic.)
You can see the status of all known CVEs and which .deb updates patches them here: https://security-tracker.debian.org/tracker/source-package/c...
The best way to stay on top of things is to subscribe to your distro's security advisory mailing list, for example https://lists.debian.org/debian-security-announce/
You are trusting Ubuntu's judgment that the remaining 12 CVEs aren't that important. Ubuntu's security team is pretty good, but I don't think there is any distro that is extremely good. In part this is because a distro is on the hook for compatibility of all the software they ship, and expected to prioritize compatibility over security. Anything other than targeted security fixes can cause regressions.
* https://github.com/hyperium/hyper/
It's the de facto standard HTTP library (although it does more than just HTTP). E.g., if you have a PHP script that downloads some data from another web page, it very often does this with curl.
curl --verbose --location URL
or even curl --trace --location URL
provides the desired information.

Super tool, we use it all the time for file transfers.
Getting my outgoing ip address (to check net connectivity): curl ip.mydomain.net or (to force ipv4) curl -4 ip.mydomain.net
To check how a page redirects: curl -v example.com (shows headers)
Interacting with elasticsearch, mainly showing indexes and their health: curl localhost:9200/_cat/indices
That's what I use it for, I think, mainly.
Edit: also between php scripts
As a general rule of thumb, if you think everyone around you is independently doing something stupid, you should first pursue the hypothesis that it is your reasoning that is flawed and not the entire rest of the world's.
Fixed in 7.51.0 - November 2 2016
Changes:
nss: additional cipher suites are now accepted by CURLOPT_SSL_CIPHER_LIST
New option: CURLOPT_KEEP_SENDING_ON_ERROR
Bugfixes:
CVE-2016-8615: cookie injection for other servers
CVE-2016-8616: case insensitive password comparison
CVE-2016-8617: OOB write via unchecked multiplication
CVE-2016-8618: double-free in curl_maprintf
CVE-2016-8619: double-free in krb5 code
CVE-2016-8620: glob parser write/read out of bounds
CVE-2016-8621: curl_getdate read out of bounds
CVE-2016-8622: URL unescape heap overflow via integer truncation
CVE-2016-8623: Use-after-free via shared cookies
CVE-2016-8624: invalid URL parsing with '#'
CVE-2016-8625: IDNA 2003 makes curl use wrong host
openssl: fix per-thread memory leak using 1.0.1 or 1.0.2
http: accept "Transfer-Encoding: chunked" for HTTP/2 as well
LICENSE-MIXING.md: update with mbedTLS dual licensing
examples/imap-append: Set size of data to be uploaded
test2048: fix url
darwinssl: disable RC4 cipher-suite support
CURLOPT_PINNEDPUBLICKEY.3: fix the AVAILABILITY formatting
openssl: don't call CRYPTO_cleanup_all_ex_data
libressl: fix version output
easy: Reset all statistical session info in curl_easy_reset
curl_global_cleanup.3: don't unload the lib with sub threads running
dist: add CurlSymbolHiding.cmake to the tarball
docs: Remove that --proto is just used for initial retrieval
configure: Fixed builds with libssh2 in a custom location
curl.1: --trace supports % for sending to stderr!
cookies: same domain handling changed to match browser behavior
formpost: trying to attach a directory no longer crashes
CURLOPT_DEBUGFUNCTION.3: fixed unused argument warning
formpost: avoid silent snprintf() truncation
ftp: fix Curl_ftpsendf
mprintf: return error on too many arguments
smb: properly check incoming packet boundaries
GIT-INFO: remove the Mac 10.1-specific details
resolve: add error message when resolving using SIGALRM
cmake: add nghttp2 support
dist: remove PDF and HTML converted docs from the releases
configure: disable poll() in macOS builds
vtls: only re-use session-ids using the same scheme
pipelining: skip to-be-closed connections when pipelining
win: fix Universal Windows Platform build
curl: do not set CURLOPT_SSLENGINE to DEFAULT automatically
maketgz: make it support "only" generating version info
Curl_socket_check: add extra check to avoid integer overflow
gopher: properly return error for poll failures
curl: set INTERLEAVEDATA too
polarssl: clear thread array at init
polarssl: fix unaligned SSL session-id lock
polarssl: reduce #ifdef madness with a macro
curl_multi_add_handle: set timeouts in closure handles
configure: set min version flags for builds on mac
INSTALL: converted to markdown => INSTALL.md
curl_multi_remove_handle: fix a double-free
multi: fix infinite loop in curl_multi_cleanup()
nss: fix tight loop in non-blocking TLS handshake over proxy
mk-ca-bundle: Change URL retrieval to HTTPS-only by default
mbedtls: stop using deprecated include file
docs: fix req->data in multi-uv example
configure: Fix test syntax for monotonic clock_gettime
CURLMOPT_MAX_PIPELINE_LENGTH.3: Clarify it's not for HTTP/2

In this case, haxx.se is not that bad, but many news sites present so many ads, overlays, non-responsive UIs, dark UIs, etc. that most mobile browsers crash, and loading takes forever due to 10+ MB of ads on a 3G connection just to display 20 lines of information.