
HTTP/FTP client development libraries: libwww, libcurl, libfetch, and more

劳和雅
2023-12-01

Web scraping and FTP access are very common application needs today: search-engine crawlers, analysis programs, resource fetchers, web services, and so on all depend on them. Writing your own fetching library would be ideal, but development takes time, so using an existing open-source library is usually the better choice: first, the code is already well written and battle-tested; second, you can get up and running quickly; third, you can learn from the strengths of other people's code.

While idly browsing the web I came across these gems, and I'm sharing them here. The focus is on three libraries, libwww, libcurl, and libfetch; a number of other excellent libraries are briefly introduced at the end of the article.

 

【libwww】
Official site: http://www.w3.org/Library/
More information: http://www.w3.org/Library/User/
Platforms: Unix/Linux, Windows

Source material: http://9.douban.com/site/entry/15448100/ and http://zh.wikipedia.org/wiki/Libwww

Overview:
Libwww is a highly modular client-side web access API written in C that runs on Unix and Windows. It can be used for both large and small applications, including browsers/editors, robots, and batch tools. Pluggable modules provided with Libwww include complete HTTP/1.1 support with caching, pipelining, POST, Digest Authentication, deflate, and more. The purpose of libwww is to serve as a testbed for protocol experiments. Tim Berners-Lee created Libwww in November 1992 to demonstrate the potential of the Internet. Applications built on Libwww include the widely used command-line text browser Lynx and the Mosaic web browser. Libwww is now open source and has since moved under W3C management; because it is open source, anyone can contribute to it, which helps ensure it keeps improving and becomes ever more useful.

Hands-on experience:
Recently I needed to write some page-analysis code, something rather like a search engine's crawl → parse → store pipeline.

The library I have usually used for fetching pages is libcurl, the foundation of the common Unix command curl. curl has been called a "command-line browser": it is powerful and supports a comprehensive set of protocols. Unfortunately, libcurl is only a multi-protocol transfer library; it does no parsing.

After some searching I found the W3C's Libwww library, which is frighteningly powerful: it offers not only parsing but also a robot (that is, a crawler, or "internet walker") facility. Many programs have been built on Libwww, the best known probably being the text-mode browser Lynx. I was all but convinced this was exactly what I needed, and dove right in.

A full day later I could finally fetch pages and extract some information from the HTML, but going any further proved extremely difficult, because the library is just too complex. Its documentation is sparse and it is rarely discussed. The latest release, 5.3.2, dates from December 20, 2000. For something with that many years of history to have so little developer discussion is very unusual.

Eventually I found a comparison with Libwww in the libcurl FAQ. The selected reader mail there told me I was not the only one left dizzy by Libwww's complexity: I had spent only a day, while one correspondent spent an entire person-month going in circles, and only got anywhere after switching to curl. Granted, this is libcurl marketing itself, but the experience of those who failed before me restored my confidence in my own intelligence. Apparently the lack of discussion around this library is no accident...

Fine, I surrender too. libcurl has no HTML parsing, but that's all right; I'll find another way. However good a library this complex may be, I simply cannot put up with it any longer, and in any case what I need is nowhere near as elaborate as what Libwww offers.

It is easy to lose your way when programming: you see something that looks perfect and seems able to do everything, fall for it instantly, and in the end cannot actually enjoy it. More often it is the less mature libraries, each with its own small flaws, that combine into the real solution.

 

【libcurl】

Official site: http://curl.haxx.se/libcurl
Features: http://curl.haxx.se/docs/features.html
Platforms: Unix/Linux, Windows

Source material: http://blog.csdn.net/hwz119/archive/2007/04/29/1591920.aspx

 

libcurl is a free, open-source client-side URL transfer library supporting FTP, FTPS, TFTP, HTTP, HTTPS, GOPHER, TELNET, DICT, FILE, and LDAP. It is cross-platform (Windows, Unix, Linux, and others), thread-safe, supports IPv6, and is easy to use.


 

Download a stable release from http://curl.haxx.se/libcurl/ , taking care to select the right OS.

 

Building libcurl

The downloaded package contains source code and must be compiled.

Unzip the archive and enter the curl-7.14.0/lib directory (I downloaded version 7.14.0).

To build the Debug version, create a batch file such as buildDebug.bat with the following contents:

call "C:\Program Files\Microsoft Visual Studio\VC98\Bin\vcvars32.bat"
set CFG=debug-dll-ssl-dll-zlib-dll
set OPENSSL_PATH=E:\SSL\openssl-0.9.7e
set ZLIB_PATH=E:\zip\zlib123
nmake -f Makefile.vc6

 

Output: libcurld_imp.lib, libcurld.dll

 

To build the Release version, create a batch file BuildRelease.bat with the following contents:

call "C:\Program Files\Microsoft Visual Studio\VC98\Bin\vcvars32.bat"
set CFG=release-dll-ssl-dll-zlib-dll
set OPENSSL_PATH=E:\SSL\openssl-0.9.7e
set ZLIB_PATH=E:\zip\zlib123
nmake -f Makefile.vc6

 

Output: libcurl_imp.lib, libcurl.dll

 

The steps above build the libcurl DLL against the DLL versions of OpenSSL and zlib. If you don't have them, they can be downloaded from www.openssl.org or http://www.zlib.net/ .

To build other variants, look inside Makefile.vc6 and set the corresponding CFG value.

 

Commercial software may use libcurl as long as it includes libcurl's copyright notice.

 

Sample

 

#include <stdio.h>
#include "../curl-7.14.0/include/curl/curl.h"
#pragma comment(lib, "../curl-7.14.0/lib/libcurl_imp.lib")

int main(void)
{
  CURL *curl = curl_easy_init();
  if(curl) {
    CURLcode res;
    res = curl_easy_setopt(curl, CURLOPT_PROXY, "Test-pxy08:8080");
    res = curl_easy_setopt(curl, CURLOPT_PROXYTYPE, CURLPROXY_HTTP);
    res = curl_easy_setopt(curl, CURLOPT_URL, "http://www.vckbase.com");
    res = curl_easy_perform(curl);

    if(CURLE_OK == res) {
      char *ct;
      /* ask for the content-type */
      /* http://curl.haxx.se/libcurl/c/curl_easy_getinfo.html */
      res = curl_easy_getinfo(curl, CURLINFO_CONTENT_TYPE, &ct);

      if((CURLE_OK == res) && ct)
        printf("We received Content-Type: %s\n", ct);
    }

    /* always cleanup */
    curl_easy_cleanup(curl);
  }
  return 0;
}

 

 

【libfetch】
Official site: http://libfetch.darwinports.com/
More information: http://www.freebsd.org/cgi/man.cgi?query=fetch&sektion=3
Platform: BSD

Source material: http://bbs.chinaunix.net/viewthread.php?tid=105809

A few days ago a fellow on the FB board introduced CU's legendary pharaoh-level flood-posting master, and I joked that I would write a program to flood automatically. Last night I made a breakthrough and found a very nice library, which I'm introducing to all the big fish and little shrimp here. Please don't actually use it to flood the boards, though, or after the moderators cut me down my ghost will come to settle accounts!
This is a library I found in FreeBSD: libfetch, whose source lives in /usr/src/lib/libfetch. It wraps the HTTP and FTP protocols and provides some very easy-to-use functions. I only came across it yesterday and haven't studied it closely yet, but I tried the function that fetches a page over HTTP; an example follows:
#include <stdio.h>
#include <string.h>

#include "fetch.h"

const char *myurl = "http://qjlemon:aaa@192.169.0.1:8080/test.html";

int main(void)
{
        FILE *fp;
        char buf[1024];

        fp = fetchGetURL(myurl, "");
        if (!fp) {
                printf("error: %s\n", fetchLastErrString);
                return 1;
        }
        while (!feof(fp)) {
                memset(buf, 0, sizeof(buf));
                fgets(buf, sizeof(buf), fp);
                if (ferror(fp))
                        break;
                if (buf[0])
                        printf("%s", buf);
                else
                        break;
        }
        fclose(fp);
        return 0;
}

The key function here is fetchGetURL, which fetches a file according to the given URL: if the URL starts with http:// the function knows to fetch over HTTP, if it starts with ftp:// it fetches over FTP, and a username and password can be specified in the URL. On success it returns a FILE pointer, so the page contents can be read out just like an ordinary file. The library also provides other functions that give finer-grained control over the network operations. The most useful ones, of course, are the PUT functions; that's what you'd need for flood-posting. Ha!

 

【Other related HTTP/FTP client libraries】
Source: http://curl.haxx.se/libcurl/competitors.html

Free Software and Open Source projects have a long tradition of forks and duplicate efforts. We enjoy "doing it ourselves", no matter if someone else has done something very similar already.

Free/open libraries that cover parts of libcurl's features:

libcurl (MIT)

  • a highly portable and easy-to-use client-side URL transfer library, supporting FTP, FTPS, HTTP, HTTPS, SCP, SFTP, TELNET, DICT, FILE, TFTP and LDAP. libcurl also supports HTTPS certificates, HTTP POST, HTTP PUT, FTP uploading, kerberos, HTTP form based upload, proxies, cookies, user+password authentication, file transfer resume, http proxy tunnelling and more!

libghttp (LGPL)

  • Having a glance at libghttp (a GNOME http library), it looks as if it works rather similarly to libcurl (for http). There's no web page for this, and the person whose email is mentioned in the README of the latest release I found claims he has passed the leadership of the project to "eazel". Popular choice among GNOME projects.

libwww (W3C license), comparison with libcurl

  • More complex and harder to use than libcurl. Includes everything from multi-threading to HTML parsing. The most notable transfer-related feature that libcurl does not offer but libwww does, is caching.

libferit (GPL)

    • C++ library "for transferring files via http, ftp, gopher, proxy server". Based on 'snarf' 2.0.9-code (formerly known as libsnarf). Quote from freshmeat: 

"As the author of snarf, I have to say this frightens me. Snarf's networking system is far from robust and complete. It's probably full of bugs, and although it works for maybe 85% of all current situations, I wouldn't base a library on it."

neon (LGPL)

  • An HTTP and WebDAV client library, with a C interface. I've mainly heard and seen people use this with WebDAV as their main interest.

libsoup (LGPL), comparison with libcurl

  • Part of glib (GNOME). Supports: HTTP 1.1, Persistent connections, Asynchronous DNS and transfers, Connection cache, Redirects, Basic, Digest, NTLM authentication, SSL with OpenSSL or Mozilla NSS, Proxy support including SSL, SOCKS support, POST data. Probably not very portable. Lacks: cookie support, NTLM for proxies, GSS, gzip encoding, trailers in chunked responses and more.

mozilla netlib (MPL)

  • Handles URLs, protocols, transports for the Mozilla browser.

mozilla libxpnet (MPL)

  • Minimal download library targeted to be much smaller than the above mentioned netlib. HTTP and FTP support.

wget (GPL)

  • While not a library at all, I've been told that people sometimes extract the network code from it and base their own hacks from there.

libfetch (BSD)

  • Does HTTP and FTP transfers (both ways), supports file: URLs, and an API for URL parsing. The utility fetch that is built on libfetch is an integral part of the FreeBSD operating system.

HTTP Fetcher (LGPL)

  • "a small, robust, flexible library for downloading files via HTTP using the GET method."

http-tiny (Artistic License)

  • "a very small C library to make http queries (GET, HEAD, PUT, DELETE, etc.) easily portable and embeddable"

XMLHTTP Object also known as IXMLHTTPRequest (part of MSXML 3.0)

  • (Windows) Provides client-side protocol support for communication with HTTP servers. A client computer can use the XMLHTTP object to send an arbitrary HTTP request, receive the response, and have the Microsoft® XML Document Object Model (DOM) parse that response.

QHttp (GPL)

  • QHttp is a class in the Qt library from Troll Tech. Seems to be restricted to plain HTTP. Supports GET, POST and proxy. Asynchronous.

ftplib (GPL)

  • "a set of routines that implement the FTP protocol. They allow applications to create and access remote files through function calls instead of needing to fork and exec an interactive ftp client program."

ftplibpp (GPL)

  • A C++ library for "easy FTP client functionality. It features resuming of up- and downloads, FXP support, SSL/TLS encryption, and logging functionality."

GNU Common C++ library

  • Has a URLStream class. This C++ class allows you to download a file using HTTP. See demo/urlfetch.cpp in commoncpp2-1.3.19.tar.gz

HTTPClient (LGPL)

  • Java HTTP client library.

Jakarta Commons HttpClient (Apache License)

    • A Java HTTP client library written by the Jakarta project.

Reprinted from: https://www.cnblogs.com/lifan3a/articles/7479271.html
