HTTPie

The top-ranked Python open-source project on GitHub is called HTTPie, and its description reads: "HTTPie is a command line HTTP client, a user-friendly cURL replacement". So it covers much the same ground as cURL. On a previous project we had a Python script that drove cURL to HTTP POST JSON data stored on disk to a set of receivers, each of which then parsed, processed, and displayed it. Given how popular the project is, it ought to be pretty good to use.
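If so, that cURL wrapper script could probably shrink to a one-liner. A rough sketch of posting a JSON file from disk to a receiver with HTTPie; the receiver URL and data.json here are placeholders, not real endpoints:

[lihui@localhost ~]$ http POST http://receiver.example.com/ingest Content-Type:application/json < data.json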

1: Download a release from the tags

https://github.com/jakubroztocil/httpie/tree/0.9.2

2: Installation

The official docs say it can be installed directly with apt-get, yum, or pip, but I like to tinker, so I built it from source to see the dependency chain for myself.

[lihui@localhost httpie-0.9.2]$ sudo python setup.py install

The install got stuck on an online download of a dependency:

Installed /usr/lib/python2.6/site-packages/argparse-1.3.0-py2.6.egg
Searching for Pygments>=1.5
Reading http://pypi.python.org/simple/Pygments/
Best match: Pygments 2.0.2
Downloading https://pypi.python.org/packages/source/P/Pygments/Pygments-2.0.2.tar.gz#md5=238587a1370d62405edabd0794b3ec4a

The online download hung with no progress. Running wget on that URL directly from the server also failed; the box is reached over a VPN, so I'm not sure exactly what the network issue was. The same package downloaded fine on my local PC, so I copied it over and installed Pygments first.
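Roughly what that workaround looked like; the server hostname is a placeholder, and this assumes the local machine has wget and scp available:

# On the local PC, which can reach PyPI
wget https://pypi.python.org/packages/source/P/Pygments/Pygments-2.0.2.tar.gz
# Copy it to the server and unpack it there
scp Pygments-2.0.2.tar.gz lihui@server:~/
tar xzf Pygments-2.0.2.tar.gz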

[lihui@localhost Pygments-2.0.2]$ sudo python setup.py install

Once that was done, I ran the httpie install again to see whether any dependencies remained. One more showed up, but it was small and finished quickly:

Processing dependencies for httpie==0.9.2
Searching for requests>=2.3.0
Reading http://pypi.python.org/simple/requests/
Best match: requests 2.7.0
Downloading https://pypi.python.org/packages/source/r/requests/requests-2.7.0.tar.gz#md5=29b173fd5fa572ec0764d1fd7b527260
Processing requests-2.7.0.tar.gz
Running requests-2.7.0/setup.py -q bdist_egg --dist-dir /tmp/easy_install-oh9joO/requests-2.7.0/egg-dist-tmp-ddOi17
Adding requests 2.7.0 to easy-install.pth file

After installation, the binary can be found:

[lihui@localhost httpie-0.9.2]$ which http
/usr/bin/http
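In hindsight, a plain pip install would have resolved Pygments and requests automatically, and the installed version can be checked against the tag that was built, for example:

# Alternative install that pulls in the dependencies for you
[lihui@localhost ~]$ sudo pip install httpie
# Confirm which version ended up on the PATH
[lihui@localhost ~]$ http --version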

3: Trying it out

(1) Start tcpdump to capture the traffic

[lihui@localhost ~]$ sudo tcpdump -i eth0 port 80 -vv -w baidu.pcap

(2) Run httpie with a simple GET

[lihui@localhost httpie-0.9.2]$ http -f GET baidu.com
HTTP/1.1 200 OK
Accept-Ranges: bytes
Cache-Control: max-age=86400
Connection: Keep-Alive
Content-Length: 81
Content-Type: text/html
Date: Thu, 14 May 2015 15:34:46 GMT
ETag: "51-4b4c7d90"
Expires: Fri, 15 May 2015 15:34:46 GMT
Last-Modified: Tue, 12 Jan 2010 13:48:00 GMT
Server: Apache

<html>
<meta http-equiv="refresh" content="0;url=http://www.baidu.com/">
</html>

(3) Parse the captured packets; the Host and request method can be extracted

[lihui@localhost ~]$ tshark -r baidu.pcap -T fields -e http.host -e http.request.method | sed '/^\s*$/d'
baidu.com        GET
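If the request path is wanted as well, the same capture can be queried for more HTTP fields; http.request.uri is a standard Wireshark field name:

[lihui@localhost ~]$ tshark -r baidu.pcap -T fields -e http.host -e http.request.method -e http.request.uri | sed '/^\s*$/d'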

Besides that, the options seem to include JSON as well; here I passed a bogus argument just to dump the usage text:

[lihui@localhost httpie-0.9.2]$ http -dasdfs
usage: http [--json] [--form] [--pretty {all,colors,format,none}]
            [--style STYLE] [--print WHAT] [--verbose] [--headers] [--body]
            [--stream] [--output FILE] [--download] [--continue]
            [--session SESSION_NAME_OR_PATH]
            [--session-read-only SESSION_NAME_OR_PATH] [--auth USER[:PASS]]
            [--auth-type {basic,digest}] [--proxy PROTOCOL:PROXY_URL]
            [--follow] [--verify VERIFY] [--cert CERT] [--cert-key CERT_KEY]
            [--timeout SECONDS] [--check-status] [--ignore-stdin] [--help]
            [--version] [--traceback] [--debug]
            [METHOD] URL [REQUEST_ITEM [REQUEST_ITEM ...]]
http: error: too few arguments

Each of these is worth trying out to see what it does; a few examples:
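A few sketches of the more obvious ones, using httpbin.org as a scratch target; the endpoints and data fields are only illustrative:

# Send the request items as a JSON body (JSON is also the default)
[lihui@localhost ~]$ http --json POST httpbin.org/post name=lihui job=tester
# Send them as URL-encoded form data instead
[lihui@localhost ~]$ http --form POST httpbin.org/post name=lihui
# Print only the response headers
[lihui@localhost ~]$ http --headers GET httpbin.org/get
# Save the response body to a file, wget-style
[lihui@localhost ~]$ http --download httpbin.org/image/png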
