HTTP/1.1, HTTP/2, and HTTP/3 all allow a single connection to be reused for multiple requests. Doing so eliminates the overhead of establishing a new connection, including the TCP and TLS handshakes. While most modern web servers and clients use HTTP/2 or HTTP/3, where connection reuse is the default, many web crawlers still crawl over HTTP/1.1, which can waste considerable resources if keep-alive is not properly configured and enabled on the web server.
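In HTTP/1.1, persistent connections are the default and are controlled through the Connection header; some servers additionally announce their limits in a Keep-Alive response header. A rough sketch of such an exchange (the header values are illustrative, not from a real server):

```
GET / HTTP/1.1
Host: example.com

HTTP/1.1 200 OK
Connection: keep-alive
Keep-Alive: timeout=5, max=100
```

Here timeout is the number of seconds the server keeps an idle connection open, and max is the number of further requests it will accept on it.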
The following command uses curl to open an HTTP/1.1 connection. curl will request the same URL twice and print the request and response headers along with connection information.
curl --http1.1 -sIXGET -v https://example.com/ https://example.com/
The relevant lines can be filtered with grep. Note that curl writes the verbose connection information to stderr, so it has to be redirected to stdout first:

curl --http1.1 -sIXGET -v https://example.com/ https://example.com/ 2>&1 | grep -i connection
Note: Some web servers block requests carrying curl's default User-Agent. Use the -A option to set a different one:

curl --http1.1 -sIXGET -v -A "MyUserAgent" https://example.com/ https://example.com/ 2>&1 | grep -i connection
If keep-alive is supported, the output will contain something like:
* SSL connection using TLSv1.3 / TLS_AES_256_GCM_SHA384
* Connection #0 to host example.com left intact
* Re-using existing connection! (#0) with host example.com
* Closing connection 0
If keep-alive is not supported, the output will contain something like:
* SSL connection using TLSv1.3 / TLS_AES_256_GCM_SHA384
< Connection: close
* Closing connection 0
Connection: close
* SSL connection using TLSv1.3 / TLS_AES_256_GCM_SHA384
< Connection: close
* Closing connection 1
Connection: close
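Connection reuse can also be confirmed with curl's --write-out option: time_connect reports the time spent establishing the TCP connection, and it drops to zero for a transfer that reuses an existing connection. A quick sketch, again against example.com:

```shell
# Request the same URL twice over HTTP/1.1 and print the
# connection setup time for each of the two transfers.
curl --http1.1 -s -o /dev/null \
     -w 'connect time: %{time_connect}s\n' \
     https://example.com/ https://example.com/
```

If the connection is reused, the second line reports a connect time of 0.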
If keep-alive is not supported, I strongly suggest enabling it to avoid the overhead of creating a new connection for every single HTTP/1.1 request. Have a look at the configuration options of your web server and adjust them accordingly.
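As a sketch, these are the relevant directives for two common servers; the values shown are illustrative, not tuning recommendations. For nginx, where keep-alive is enabled by default:

```nginx
# nginx: keep-alive is on by default; these directives tune it.
keepalive_timeout  65;     # close idle connections after 65 seconds
keepalive_requests 1000;   # maximum requests served per connection
```

The Apache httpd equivalents:

```apacheconf
# Apache httpd keep-alive settings
KeepAlive On
KeepAliveTimeout 5
MaxKeepAliveRequests 100
```

Setting the timeout too high can tie up worker processes or connection slots on busy servers, so the idle timeout is usually kept short.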
Note: Most web servers can also log how many requests were served over the same connection. I suggest using those log settings to verify that keep-alive is working properly.
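As a sketch: Apache httpd's mod_log_config provides the %k format specifier (the number of keep-alive requests handled on the connection so far), and nginx exposes the $connection and $connection_requests variables. The log format names below are made up for illustration:

```apacheconf
# Apache httpd: append the keep-alive request count to each log line
LogFormat "%h %l %u %t \"%r\" %>s %b keepalive:%k" keepalive_combined
CustomLog logs/access_log keepalive_combined
```

```nginx
# nginx: log the connection id and how many requests it has served
log_format keepalive '$remote_addr "$request" $status '
                     'conn=$connection reqs=$connection_requests';
access_log /var/log/nginx/access.log keepalive;
```

A request counter greater than 1 for a given connection id confirms that clients are actually reusing connections.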