Operations people are often called upon to do low-level HTTP troubleshooting, and I often end up using tcpdump and tcptrace to break out and inspect HTTP sessions.

Install tcptrace on your local machine

apt-get install tcptrace

or for you Mac people

brew install tcptrace

Run tcpdump on your server

tcpdump -s 1500 -w /tmp/DUMP.pcap -c 5000 -i eth0 port 80 and host www.mague.com
switch      reason
-s          Sets the snap length, i.e. how many bytes of each packet to capture. The default is often too small and you lose data that you want for analysis.
-w          Write a pcap file to this location. I usually prefer to perform analysis on another host.
-c          Capture this many packets. Not necessary, but useful if you forget to stop the capture.
-i          Interface to capture on. lo is the loopback, and you can find interfaces by running ifconfig -a.
expression  Limit the capture to particular ports, hosts, or protocols. More info on filtering is in the pcap-filter man page; a couple of example expressions follow this table.
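
For example, to limit the capture to one client, or to exclude a noisy health checker (the addresses here are placeholders, so adjust for your environment):

tcpdump -s 1500 -w /tmp/DUMP.pcap -i eth0 'port 80 and host 203.0.113.7'
tcpdump -s 1500 -w /tmp/DUMP.pcap -i eth0 'port 80 and not host 10.0.0.5'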

Copy the dump file down to your local machine and break it out with tcptrace

mkdir -p ~/tmp/analysis
cd ~/tmp/analysis
scp remotehost:/tmp/DUMP.pcap .
tcptrace -n -xhttp DUMP.pcap
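
Here -n turns off name resolution and -xhttp runs tcptrace's HTTP analysis module. If you want a per-connection overview before digging into payloads, the long-format output is a useful first pass:

tcptrace -l -n DUMP.pcap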

Running tcptrace with -xhttp will create a bunch of files in your directory like so:

172.16.0.20_http.xpl contains information that you can plot using xplot.
http.times contains timestamps for when each piece of data was first fetched and when it completed.
For troubleshooting, however, we are interested in the *.dat files.

The request and response are in separate files with the names reversed.

For example, a2b_contents.dat is the request and b2a_contents.dat is the response.
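
The contents files are mostly plain text (headers plus whatever body came over the wire, which may be compressed), so you can eyeball a single exchange before grepping. The request line and headers sit at the top of the request file:

head -n 20 a2b_contents.dat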

Now you can go about finding errors with grep:

chris@gorilla:o ] ~/tmp/analysis 
$ grep --binary-files=text 404 *.dat
o2p_contents.dat:GET /throw/me/a/404/please HTTP/1.1
p2o_contents.dat:HTTP/1.1 404 Not Found
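
The same trick works for any status class. For example, to hunt for server errors across every response file:

grep --binary-files=text -E 'HTTP/1\.[01] 5[0-9][0-9]' *.dat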

This is also super useful if you want to use curl later to reproduce an issue, because now you can replay the request with all of the headers that were originally sent:

curl -v -H "Host: www.mague.com" \
    -H "Accept-Encoding: gzip,deflate,sdch" http://www.mague.com