1 /* NEVER EVER edit this manually, fix the mkhelp script instead! */
7 " Project ___| | | | _ \\| | \n"
8 " / __| | | | |_) | | \n"
9 " | (__| |_| | _ <| |___ \n"
10 " \\___|\\___/|_| \\_\\_____|\n"
"NAME\n"
"       curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE,\n"
13 " HTTP or HTTPS syntax.\n"
"SYNOPSIS\n"
"       curl [options] url\n"
"DESCRIPTION\n"
"       curl is a client to get documents/files from servers, using\n"
20 " any of the supported protocols. The command is designed to\n"
21 " work without user interaction or any kind of interactivity.\n"
23 " curl offers a busload of useful tricks like proxy support,\n"
24 " user authentication, ftp upload, HTTP post, SSL (https:)\n"
25 " connections, cookies, file transfer resume and more.\n"
"URL\n"
"       The URL syntax is protocol dependent. You'll find a detailed\n"
29 " description in RFC 2396.\n"
31 " You can specify multiple URLs or parts of URLs by writing\n"
32 " part sets within braces as in:\n"
34 " http://site.{one,two,three}.com\n"
36 " or you can get sequences of alphanumeric series by using []\n"
39 " ftp://ftp.numericals.com/file[1-100].txt\n"
"       ftp://ftp.numericals.com/file[001-100].txt (with\n"
"       leading zeros)\n"
42 " ftp://ftp.letters.com/file[a-z].txt\n"
44 " It is possible to specify up to 9 sets or series for a URL,\n"
45 " but no nesting is supported at the moment:\n"
"       http://www.any.org/archive[1996-1999]/\n"
"       volume[1-4]part{a,b,c,index}.html\n"
"OPTIONS\n"
"       -a/--append\n"
"       (FTP) When used in a ftp upload, this will tell curl to\n"
53 " append to the target file instead of overwriting it. If\n"
54 " the file doesn't exist, it will be created.\n"
56 " -A/--user-agent <agent string>\n"
57 " (HTTP) Specify the User-Agent string to send to the\n"
"       HTTP server. Some badly done CGIs fail if it's not set\n"
"       to \"Mozilla/4.0\". To encode blanks in the string,\n"
"       surround the string with single quote marks. This can\n"
61 " also be set with the -H/--header flag of course.\n"
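"       For example, to pretend to be a specific browser when\n"
"       fetching a page (the URL is a placeholder):\n"
"\n"
"       curl -A 'Mozilla/4.0 (compatible)' http://www.site.com/\n"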
62 " -b/--cookie <name=data>\n"
63 " (HTTP) Pass the data to the HTTP server as a cookie. It\n"
64 " is supposedly the data previously received from the\n"
65 " server in a \"Set-Cookie:\" line. The data should be in\n"
66 " the format \"NAME1=VALUE1; NAME2=VALUE2\".\n"
"       If no '=' character is used in the line, it is treated as\n"
69 " a filename to use to read previously stored cookie\n"
70 " lines from, which should be used in this session if\n"
71 " they match. Using this method also activates the\n"
72 " \"cookie parser\" which will make curl record incoming\n"
73 " cookies too, which may be handy if you're using this in\n"
74 " combination with the -L/--location option. The file\n"
75 " format of the file to read cookies from should be plain\n"
76 " HTTP headers or the netscape cookie file format.\n"
78 " NOTE that the file specified with -b/--cookie is only\n"
79 " used as input. No cookies will be stored in the file.\n"
80 " To store cookies, save the HTTP headers to a file using\n"
81 " -D/--dump-header!\n"
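"       For example, to send a cookie with a request (both the\n"
"       cookie and the URL are placeholders):\n"
"\n"
"       curl -b \"name=daniel\" http://www.site.com/\n"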
"       -B/--use-ascii\n"
"       (FTP/LDAP) Use ASCII transfer when getting an FTP file\n"
"       or LDAP info. For FTP, this can also be enforced by\n"
"       using a URL that ends with \";type=A\".\n"
"       -c/--continue\n"
"       Continue/Resume a previous file transfer. This\n"
"       instructs curl to continue appending data on the file\n"
"       where it was previously left, possibly because of a\n"
"       broken connection to the server. There must be a named\n"
"       physical file to append to for this to work. Note:\n"
"       Upload resume depends on a command named SIZE, which is\n"
"       not always present in all ftp servers! Upload resume is\n"
"       for FTP only. HTTP resume is only possible with\n"
"       HTTP/1.1 or later servers.\n"
99 " -C/--continue-at <offset>\n"
100 " Continue/Resume a previous file transfer at the given\n"
101 " offset. The given offset is the exact number of bytes\n"
102 " that will be skipped counted from the beginning of the\n"
"       source file before it is transferred to the destination.\n"
104 " If used with uploads, the ftp server command SIZE will\n"
105 " not be used by curl. Upload resume is for FTP only.\n"
"       HTTP resume is only possible with HTTP/1.1 or later\n"
"       servers.\n"
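"       For example, to skip the first 400 bytes of the source\n"
"       file (the URL is a placeholder):\n"
"\n"
"       curl -C 400 -o the.rest http://www.site.com/bigfile\n"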
109 " -d/--data <data>\n"
110 " (HTTP) Sends the specified data in a POST request to\n"
111 " the HTTP server. Note that the data is sent exactly as\n"
112 " specified with no extra processing. The data is\n"
113 " expected to be \"url-encoded\". This will cause curl to\n"
114 " pass the data to the server using the content-type\n"
115 " application/x-www-form-urlencoded. Compare to -F.\n"
117 " If you start the data with the letter @, the rest\n"
118 " should be a file name to read the data from, or - if\n"
"       you want curl to read the data from stdin. The\n"
"       contents of the file must already be url-encoded.\n"
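"       For example, to post two url-encoded fields (the URL\n"
"       is a placeholder):\n"
"\n"
"       curl -d \"name=daniel&phone=123456\" http://www.site.com/post.cgi\n"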
122 " -D/--dump-header <file>\n"
123 " (HTTP/FTP) Write the HTTP headers to this file. Write\n"
124 " the FTP file info to this file if -I/--head is used.\n"
126 " This option is handy to use when you want to store the\n"
127 " cookies that a HTTP site sends to you. The cookies\n"
"       could then be read in a second curl invocation by using the\n"
129 " -b/--cookie option!\n"
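"       For example, save the headers from one fetch and reuse\n"
"       the cookies in a second (the URLs are placeholders):\n"
"\n"
"       curl -D headers.txt http://www.site.com/\n"
"       curl -b headers.txt http://www.site.com/page.html\n"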
131 " -e/--referer <URL>\n"
132 " (HTTP) Sends the \"Referer Page\" information to the HTTP\n"
133 " server. Some badly done CGIs fail if it's not set. This\n"
134 " can also be set with the -H/--header flag of course.\n"
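"       For example, to claim you arrived from another page\n"
"       (both URLs are placeholders):\n"
"\n"
"       curl -e http://www.from.com/ http://www.site.com/\n"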
136 " -E/--cert <certificate[:password]>\n"
137 " (HTTPS) Tells curl to use the specified certificate\n"
138 " file when getting a file with HTTPS. The certificate\n"
139 " must be in PEM format. If the optional password isn't\n"
"       specified, it will be queried for on the terminal. Note\n"
"       that this certificate is the private key and the\n"
"       private certificate concatenated!\n"
"       -f/--fail\n"
"       (HTTP) Fail silently (no output at all) on server\n"
"       errors. This is mostly done to better enable scripts\n"
"       etc. to deal with failed attempts. In normal cases\n"
"       when a HTTP server fails to deliver a document, it\n"
"       returns a HTML document stating so (which often also\n"
"       describes why and more). This flag will prevent curl\n"
"       from outputting that and fail silently instead.\n"
154 " -F/--form <name=content>\n"
155 " (HTTP) This lets curl emulate a filled in form in which\n"
156 " a user has pressed the submit button. This causes curl\n"
157 " to POST data using the content-type multipart/form-data\n"
158 " according to RFC1867. This enables uploading of binary\n"
159 " files etc. To force the 'content' part to be read from\n"
"       a file, prefix the file name with an @ sign. Example,\n"
"       to send your password file to the server, where\n"
"       'password' is the name of the form-field to which\n"
"       /etc/passwd will be the input:\n"
165 " curl -F password=@/etc/passwd www.mypasswords.com\n"
"       To read the file's content from stdin instead of a file,\n"
167 " use - where the file name should've been.\n"
172 " -H/--header <header>\n"
173 " (HTTP) Extra header to use when getting a web page. You\n"
174 " may specify any number of extra headers. Note that if\n"
175 " you should add a custom header that has the same name\n"
"       as one of the internal ones curl would use, your\n"
"       externally set header will be used instead of the\n"
"       internal one. This allows you to make even trickier\n"
"       stuff than curl would normally do. You should not\n"
"       replace internally set headers without knowing\n"
"       perfectly well what you are doing.\n"
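"       For example, to add a custom header (the header name\n"
"       and URL are placeholders):\n"
"\n"
"       curl -H \"X-Extra-Info: yes\" http://www.site.com/\n"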
"       -i/--include\n"
"       (HTTP) Include the HTTP-header in the output. The HTTP-\n"
185 " header includes things like server-name, date of the\n"
186 " document, HTTP-version and more...\n"
"       -I/--head\n"
"       (HTTP/FTP) Fetch the HTTP-header only! HTTP-servers\n"
190 " feature the command HEAD which this uses to get nothing\n"
191 " but the header of a document. When used on a FTP file,\n"
192 " curl displays the file size only.\n"
194 " -K/--config <config file>\n"
195 " Specify which config file to read curl arguments from.\n"
196 " The config file is a text file in which command line\n"
197 " arguments can be written which then will be used as if\n"
198 " they were written on the actual command line. If the\n"
199 " first column of a config line is a '#' character, the\n"
200 " rest of the line will be treated as a comment.\n"
"       Specify the filename as '-' to make curl read the file\n"
"       from stdin.\n"
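"       For example, a config file could contain (hypothetical\n"
"       contents and URL):\n"
"\n"
"       # fetch quietly and save to a file\n"
"       -s\n"
"       -o saved.html\n"
"\n"
"       and be used like:\n"
"\n"
"       curl -K myconfig.txt http://www.site.com/\n"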
"       -l/--list-only\n"
"       (FTP) When listing an FTP directory, this switch forces\n"
207 " a name-only view. Especially useful if you want to\n"
208 " machine-parse the contents of an FTP directory since\n"
209 " the normal directory view doesn't use a standard look\n"
"       -L/--location\n"
"       (HTTP/HTTPS) If the server reports that the requested\n"
214 " page has a different location (indicated with the\n"
"       header line Location:) this flag will make curl\n"
"       reattempt the get on the new place. If used together\n"
"       with -i or -I, headers from all requested pages will be\n"
"       shown.\n"
220 " -m/--max-time <seconds>\n"
"       Maximum time in seconds that you allow the whole\n"
"       operation to take. This is useful for preventing your\n"
"       batch jobs from hanging for hours due to slow networks\n"
"       or links going down. This doesn't work properly in\n"
"       win32 systems.\n"
"       -M/--manual\n"
"       Manual. Display the huge help text.\n"
"       -n/--netrc\n"
"       Makes curl scan the .netrc file in the user's home\n"
"       directory for login name and password. This is\n"
"       typically used for ftp on unix. If used with http, curl\n"
234 " will enable user authentication. See netrc(5) for\n"
235 " details on the file format. Curl will not complain if\n"
"       that file doesn't have the right permissions (it should\n"
"       not be world nor group readable). The environment\n"
"       variable \"HOME\" is used to find the home directory.\n"
"       A quick and very simple example of how to set up a\n"
"       .netrc to allow curl to ftp to the machine\n"
"       host.domain.com with user name 'myself' and password\n"
"       'secret':\n"
"\n"
"       machine host.domain.com login myself password secret\n"
"       -N/--no-buffer\n"
"       Disables the buffering of the output stream. In normal\n"
"       work situations, curl will use a standard buffered\n"
"       output stream that will have the effect that it will\n"
"       output the data in chunks, not necessarily exactly when\n"
"       the data arrives. Using this option will disable that\n"
"       buffering.\n"
254 " -o/--output <file>\n"
255 " Write output to <file> instead of stdout. If you are\n"
256 " using {} or [] to fetch multiple documents, you can use\n"
257 " '#' followed by a number in the <file> specifier. That\n"
258 " variable will be replaced with the current string for\n"
259 " the URL being fetched. Like in:\n"
261 " curl http://{one,two}.site.com -o \"file_#1.txt\"\n"
263 " or use several variables like:\n"
265 " curl http://{site,host}.host[1-5].com -o \"#1_#2\"\n"
267 " -O/--remote-name\n"
268 " Write output to a local file named like the remote file\n"
269 " we get. (Only the file part of the remote file is used,\n"
270 " the path is cut off.)\n"
272 " -P/--ftpport <address>\n"
"       (FTP) Reverses the initiator/listener roles when\n"
"       connecting with ftp. This switch makes Curl use the\n"
"       PORT command instead of PASV. In practice, PORT tells\n"
"       the server to connect to the client's specified address\n"
"       and port, while PASV asks the server for an ip address\n"
"       and port to connect to. <address> should be one of:\n"
280 " interface i.e \"eth0\" to specify which interface's IP\n"
281 " address you want to use (Unix only)\n"
"       IP address i.e \"192.168.10.1\" to specify exact IP\n"
"       number to use\n"
286 " host name i.e \"my.host.domain\" to specify machine\n"
288 " - (any single-letter string) to make it pick\n"
289 " the machine's default\n"
"       -q     If used as the first parameter on the command line, the\n"
"       $HOME/.curlrc file will not be read and used as a\n"
"       config file.\n"
"       -Q/--quote <command>\n"
296 " (FTP) Send an arbitrary command to the remote FTP\n"
297 " server, by using the QUOTE command of the server. Not\n"
"       all servers support this command, and the set of QUOTE\n"
"       commands is server specific! Quote commands are sent\n"
300 " BEFORE the transfer is taking place. To make commands\n"
301 " take place after a successful transfer, prefix them\n"
"       with a dash '-'. You may specify any number of commands\n"
303 " to be run before and after the transfer. If the server\n"
304 " returns failure for one of the commands, the entire\n"
305 " operation will be aborted.\n"
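"       For example, to delete a file on the server after a\n"
"       successful transfer (host and file names are\n"
"       placeholders):\n"
"\n"
"       curl -Q '-DELE file.txt' ftp://ftp.site.com/file.txt\n"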
307 " -r/--range <range>\n"
308 " (HTTP/FTP) Retrieve a byte range (i.e a partial docu\n"
309 " ment) from a HTTP/1.1 or FTP server. Ranges can be\n"
310 " specified in a number of ways.\n"
312 " 0-499 specifies the first 500 bytes\n"
314 " 500-999 specifies the second 500 bytes\n"
316 " -500 specifies the last 500 bytes\n"
"       9500-     specifies the bytes from offset 9500 and\n"
"                 forward(H)\n"
321 " 0-0,-1 specifies the first and last byte only(*)(H)\n"
"       500-799   specifies 300 bytes from offset 500(H)\n"
"       100-199,500-599\n"
"                 specifies two separate 100 bytes ranges(*)(H)\n"
328 " (*) = NOTE that this will cause the server to reply with a\n"
329 " multipart response!\n"
331 " You should also be aware that many HTTP/1.1 servers do not\n"
332 " have this feature enabled, so that when you attempt to get a\n"
333 " range, you'll instead get the whole document.\n"
335 " FTP range downloads only support the simple syntax 'start-\n"
336 " stop' (optionally with one of the numbers omitted). It\n"
337 " depends on the non-RFC command SIZE.\n"
"       -s/--silent\n"
"       Silent mode. Don't show progress meter or error\n"
"       messages. Makes Curl mute.\n"
"       -S/--show-error\n"
"       When used with -s it makes curl show an error message\n"
"       if it fails.\n"
"       -t/--upload\n"
"       Transfer the stdin data to the specified file. Curl\n"
349 " will read everything from stdin until EOF and store\n"
350 " with the supplied name. If this is used on a http(s)\n"
351 " server, the PUT command will be used.\n"
353 " -T/--upload-file <file>\n"
354 " Like -t, but this transfers the specified local file.\n"
355 " If there is no file part in the specified URL, Curl\n"
356 " will append the local file name. NOTE that you must use\n"
357 " a trailing / on the last directory to really prove to\n"
358 " Curl that there is no file name or curl will think that\n"
359 " your last directory name is the remote file name to\n"
360 " use. That will most likely cause the upload operation\n"
361 " to fail. If this is used on a http(s) server, the PUT\n"
362 " command will be used.\n"
364 " -u/--user <user:password>\n"
365 " Specify user and password to use when fetching. See\n"
366 " README.curl for detailed examples of how to use this.\n"
"       If no password is specified, curl will ask for it\n"
"       interactively.\n"
370 " -U/--proxy-user <user:password>\n"
"       Specify user and password to use for Proxy\n"
"       authentication. If no password is specified, curl will\n"
"       ask for it interactively.\n"
"       -v/--verbose\n"
"       Makes the fetching more verbose/talkative. Mostly\n"
"       useful for debugging. Lines starting with '>' mean\n"
"       data sent by curl, '<' means data received by curl that\n"
"       is hidden in normal cases, and lines starting with '*'\n"
"       mean additional info provided by curl.\n"
"       -V/--version\n"
"       Displays the full version of curl, libcurl and other\n"
383 " 3rd party libraries linked with the executable.\n"
385 " -w/--write-out <format>\n"
"       Defines what to display after a completed and\n"
"       successful operation. The format is a string that may\n"
"       contain plain text mixed with any number of variables.\n"
"       The string can be specified as \"string\"; to get it\n"
"       read from a particular file you specify it \"@filename\"\n"
"       and to tell curl to read the format from stdin you\n"
"       write \"@-\".\n"
"       The variables present in the output format will be\n"
"       substituted by the value or text that curl thinks fit,\n"
"       as described below. All variables are specified like\n"
"       %{variable_name} and to output a normal % you just\n"
"       write them like %%. You can output a newline by using\n"
"       \\n, a carriage return with \\r and a tab space with \\t.\n"
400 " NOTE: The %-letter is a special letter in the\n"
401 " win32-environment, where all occurrences of % must be\n"
402 " doubled when using this option.\n"
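"       For example, to display the HTTP code and the total\n"
"       time after a fetch (the URL is a placeholder):\n"
"\n"
"       curl -w \"%{http_code} %{time_total}\\n\" http://www.site.com/\n"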
404 " Available variables are at this point:\n"
406 " url_effective The URL that was fetched last. This is\n"
407 " mostly meaningful if you've told curl to\n"
408 " follow location: headers.\n"
410 " http_code The numerical code that was found in the\n"
411 " last retrieved HTTP(S) page.\n"
413 " time_total The total time, in seconds, that the\n"
414 " full operation lasted. The time will be\n"
415 " displayed with millisecond resolution.\n"
"       time_namelookup\n"
"                      The time, in seconds, it took from the\n"
"                      start until the name resolving was\n"
"                      completed.\n"
422 " time_connect The time, in seconds, it took from the\n"
423 " start until the connect to the remote\n"
424 " host (or proxy) was completed.\n"
425 " time_pretransfer\n"
426 " The time, in seconds, it took from the\n"
427 " start until the file transfer is just\n"
428 " about to begin. This includes all pre-\n"
429 " transfer commands and negotiations that\n"
430 " are specific to the particular proto\n"
431 " col(s) involved.\n"
"       size_download  The total amount of bytes that were\n"
"                      downloaded.\n"
"       size_upload    The total amount of bytes that were\n"
"                      uploaded.\n"
439 " speed_download The average download speed that curl\n"
440 " measured for the complete download.\n"
"       speed_upload   The average upload speed that curl\n"
"                      measured for the complete upload.\n"
445 " -x/--proxy <proxyhost[:port]>\n"
"       Use specified proxy. If the port number is not\n"
"       specified, it is assumed to be port 1080.\n"
449 " -X/--request <command>\n"
"       (HTTP) Specifies a custom request to use when\n"
"       communicating with the HTTP server. The specified\n"
"       request will be used instead of the standard GET. Read\n"
"       the HTTP 1.1 specification for details and\n"
"       explanations.\n"
455 " (FTP) Specifies a custom FTP command to use instead of\n"
456 " LIST when doing file lists with ftp.\n"
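"       For example, to send a DELETE request instead of the\n"
"       standard GET (the URL is a placeholder):\n"
"\n"
"       curl -X DELETE http://www.site.com/resource\n"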
458 " -y/--speed-time <time>\n"
"       If a download is slower than speed-limit bytes per\n"
"       second during a speed-time period, the download gets\n"
"       aborted. If speed-time is used, the default speed-limit\n"
"       will be 1 unless set with -Y.\n"
464 " -Y/--speed-limit <speed>\n"
465 " If a download is slower than this given speed, in bytes\n"
466 " per second, for speed-time seconds it gets aborted.\n"
"       speed-time is set with -y and is 30 if not set.\n"
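"       For example, to abort a transfer that stays below 3000\n"
"       bytes per second for 60 seconds (the URL is a\n"
"       placeholder):\n"
"\n"
"       curl -Y 3000 -y 60 http://www.site.com/bigfile\n"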
469 " -z/--time-cond <date expression>\n"
470 " (HTTP) Request to get a file that has been modified\n"
471 " later than the given time and date, or one that has\n"
472 " been modified before that time. The date expression can\n"
473 " be all sorts of date strings or if it doesn't match any\n"
474 " internal ones, it tries to get the time from a given\n"
475 " file name instead! See the GNU date(1) man page for\n"
476 " date expression details.\n"
"       Start the date expression with a dash (-) to make it\n"
"       request a document that is older than the given\n"
"       date/time; the default is a document that is newer than\n"
"       the specified date/time.\n"
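"       For example, to fetch a file only if it is newer than\n"
"       a given date, or only if it is older (the URL is a\n"
"       placeholder):\n"
"\n"
"       curl -z \"Jan 10 1999\" http://www.site.com/\n"
"       curl -z \"-Jan 10 1999\" http://www.site.com/\n"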
"       -3/--sslv3\n"
"       (HTTPS) Forces curl to use SSL version 3 when\n"
"       negotiating with a remote SSL server.\n"
"       -2/--sslv2\n"
"       (HTTPS) Forces curl to use SSL version 2 when\n"
"       negotiating with a remote SSL server.\n"
490 " -#/--progress-bar\n"
491 " Make curl display progress information as a progress\n"
492 " bar instead of the default statistics.\n"
"       (FTP) Convert LF to CRLF in upload. Useful for MVS\n"
"       (OS/390).\n"
499 " Redirect all writes to stderr to the specified file\n"
500 " instead. If the file name is a plain '-', it is instead\n"
501 " written to stdout. This option has no point when you're\n"
502 " using a shell with decent redirecting capabilities.\n"
"FILES\n"
"       ~/.curlrc\n"
"       Default config file.\n"
"ENVIRONMENT\n"
"       HTTP_PROXY [protocol://]<host>[:port]\n"
510 " Sets proxy server to use for HTTP.\n"
512 " HTTPS_PROXY [protocol://]<host>[:port]\n"
513 " Sets proxy server to use for HTTPS.\n"
515 " FTP_PROXY [protocol://]<host>[:port]\n"
516 " Sets proxy server to use for FTP.\n"
518 " GOPHER_PROXY [protocol://]<host>[:port]\n"
519 " Sets proxy server to use for GOPHER.\n"
521 " ALL_PROXY [protocol://]<host>[:port]\n"
"       Sets proxy server to use if no protocol-specific proxy\n"
"       is set.\n"
525 " NO_PROXY <comma-separated list of hosts>\n"
526 " list of host names that shouldn't go through any proxy.\n"
"       If set to an asterisk '*' only, it matches all hosts.\n"
528 " COLUMNS <integer>\n"
529 " The width of the terminal. This variable only affects\n"
530 " curl when the --progress-bar option is used.\n"
"EXIT CODES\n"
"       There exist a number of different error codes with\n"
"       corresponding error messages that may appear during bad\n"
"       conditions. At the time of this writing, the exit codes\n"
"       are:\n"
537 " 1 Unsupported protocol. This build of curl has no support\n"
538 " for this protocol.\n"
540 " 2 Failed to initialize.\n"
542 " 3 URL malformat. The syntax was not correct.\n"
544 " 4 URL user malformatted. The user-part of the URL syntax\n"
545 " was not correct.\n"
"       5    Couldn't resolve proxy. The given proxy host could not\n"
"            be resolved.\n"
"       6    Couldn't resolve host. The given remote host was not\n"
"            resolved.\n"
553 " 7 Failed to connect to host.\n"
"       8    FTP weird server reply. The server sent data curl\n"
"            couldn't parse.\n"
558 " 9 FTP access denied. The server denied login.\n"
560 " 10 FTP user/password incorrect. Either one or both were\n"
561 " not accepted by the server.\n"
563 " 11 FTP weird PASS reply. Curl couldn't parse the reply\n"
564 " sent to the PASS request.\n"
566 " 12 FTP weird USER reply. Curl couldn't parse the reply\n"
567 " sent to the USER request.\n"
"       13   FTP weird PASV reply. Curl couldn't parse the reply\n"
570 " sent to the PASV request.\n"
"       14   FTP weird 227 format. Curl couldn't parse the 227-line\n"
573 " the server sent.\n"
575 " 15 FTP can't get host. Couldn't resolve the host IP we got\n"
576 " in the 227-line.\n"
578 " 16 FTP can't reconnect. Couldn't connect to the host we\n"
579 " got in the 227-line.\n"
580 " 17 FTP couldn't set binary. Couldn't change transfer\n"
581 " method to binary.\n"
"       18   Partial file. Only a part of the file was transferred.\n"
585 " 19 FTP couldn't RETR file. The RETR command failed.\n"
"       20   FTP write error. The transfer was reported bad by the\n"
"            server.\n"
"       21   FTP quote error. A quote command returned error from\n"
"            the server.\n"
593 " 22 HTTP not found. The requested page was not found. This\n"
594 " return code only appears if --fail is used.\n"
596 " 23 Write error. Curl couldn't write data to a local\n"
597 " filesystem or similar.\n"
599 " 24 Malformat user. User name badly specified.\n"
"       25   FTP couldn't STOR file. The server denied the STOR\n"
"            operation.\n"
604 " 26 Read error. Various reading problems.\n"
606 " 27 Out of memory. A memory allocation request failed.\n"
608 " 28 Operation timeout. The specified time-out period was\n"
609 " reached according to the conditions.\n"
"       29   FTP couldn't set ASCII. The server returned an unknown\n"
"            reply.\n"
614 " 30 FTP PORT failed. The PORT command failed.\n"
616 " 31 FTP couldn't use REST. The REST command failed.\n"
"       32   FTP couldn't use SIZE. The SIZE command failed. The\n"
"            command is an extension to the original FTP spec RFC\n"
"            959.\n"
622 " 33 HTTP range error. The range \"command\" didn't work.\n"
"       34   HTTP post error. Internal post-request generation\n"
"            error.\n"
627 " 35 SSL connect error. The SSL handshaking failed.\n"
629 " 36 FTP bad download resume. Couldn't continue an earlier\n"
630 " aborted download.\n"
"       37   FILE couldn't read file. Failed to open the file.\n"
"            Permissions?\n"
634 " 38 LDAP cannot bind. LDAP bind operation failed.\n"
636 " 39 LDAP search failed.\n"
638 " 40 Library not found. The LDAP library was not found.\n"
"       41   Function not found. A required LDAP function was not\n"
"            found.\n"
643 " XX There will appear more error codes here in future\n"
644 " releases. The existing ones are meant to never change.\n"
"BUGS\n"
"       If you do find any (or have other suggestions), mail Daniel\n"
648 " Stenberg <Daniel.Stenberg@haxx.nu>.\n"
650 "AUTHORS / CONTRIBUTORS\n"
651 " - Daniel Stenberg <Daniel.Stenberg@haxx.nu>\n"
652 " - Rafael Sagula <sagula@inf.ufrgs.br>\n"
653 " - Sampo Kellomaki <sampo@iki.fi>\n"
654 " - Linas Vepstas <linas@linas.org>\n"
655 " - Bjorn Reese <breese@mail1.stofanet.dk>\n"
656 " - Johan Anderson <johan@homemail.com>\n"
" - Kjell Ericson <Kjell.Ericson@haxx.nu>\n"
658 " - Troy Engel <tengel@sonic.net>\n"
659 " - Ryan Nelson <ryan@inch.com>\n"
660 " - Bjorn Stenberg <Bjorn.Stenberg@haxx.nu>\n"
661 " - Angus Mackay <amackay@gus.ml.org>\n"
662 " - Eric Young <eay@cryptsoft.com>\n"
663 " - Simon Dick <simond@totally.irrelevant.org>\n"
664 " - Oren Tirosh <oren@monty.hishome.net>\n"
665 " - Steven G. Johnson <stevenj@alum.mit.edu>\n"
666 " - Gilbert Ramirez Jr. <gram@verdict.uthscsa.edu>\n"
667 " - Andrés García <ornalux@redestb.es>\n"
668 " - Douglas E. Wegscheid <wegscd@whirlpool.com>\n"
669 " - Mark Butler <butlerm@xmission.com>\n"
670 " - Eric Thelin <eric@generation-i.com>\n"
671 " - Marc Boucher <marc@mbsi.ca>\n"
672 " - Greg Onufer <Greg.Onufer@Eng.Sun.COM>\n"
673 " - Doug Kaufman <dkaufman@rahul.net>\n"
674 " - David Eriksson <david@2good.com>\n"
675 " - Ralph Beckmann <rabe@uni-paderborn.de>\n"
676 " - T. Yamada <tai@imasy.or.jp>\n"
677 " - Lars J. Aas <larsa@sim.no>\n"
678 " - Jörn Hartroth <Joern.Hartroth@telekom.de>\n"
679 " - Matthew Clarke <clamat@van.maves.ca>\n"
680 " - Linus Nielsen <Linus.Nielsen@haxx.nu>\n"
681 " - Felix von Leitner <felix@convergence.de>\n"
682 " - Dan Zitter <dzitter@zitter.net>\n"
683 " - Jongki Suwandi <Jongki.Suwandi@eng.sun.com>\n"
684 " - Chris Maltby <chris@aurema.com>\n"
685 " - Ron Zapp <rzapper@yahoo.com>\n"
686 " - Paul Marquis <pmarquis@iname.com>\n"
687 " - Ellis Pritchard <ellis@citria.com>\n"
688 " - Damien Adant <dams@usa.net>\n"
689 " - Chris <cbayliss@csc.come>\n"
690 " - Marco G. Salvagno <mgs@whiz.cjb.net>\n"
"WWW\n"
"       http://curl.haxx.nu\n"
"FTP\n"
"       ftp://ftp.sunet.se/pub/www/utilities/curl/\n"
"SEE ALSO\n"
"       ftp(1), wget(1), snarf(1)\n"
"LATEST VERSION\n"
"  You always find news about what's going on as well as the latest versions\n"
704 " from the curl web pages, located at:\n"
706 " http://curl.haxx.nu\n"
"SIMPLE USAGE\n"
"  Get the main page from netscape's web-server:\n"
712 " curl http://www.netscape.com/\n"
714 " Get the root README file from funet's ftp-server:\n"
716 " curl ftp://ftp.funet.fi/README\n"
718 " Get a gopher document from funet's gopher server:\n"
720 " curl gopher://gopher.funet.fi\n"
722 " Get a web page from a server using port 8000:\n"
724 " curl http://www.weirdserver.com:8000/\n"
726 " Get a list of the root directory of an FTP site:\n"
728 " curl ftp://ftp.fts.frontec.se/\n"
730 " Get the definition of curl from a dictionary:\n"
732 " curl dict://dict.org/m:curl\n"
734 "DOWNLOAD TO A FILE\n"
736 " Get a web page and store in a local file:\n"
738 " curl -o thatpage.html http://www.netscape.com/\n"
"  Get a web page and store in a local file, make the local file get the name\n"
"  of the remote document (if no file name part is specified in the URL, this\n"
"  will fail):\n"
744 " curl -O http://www.netscape.com/index.html\n"
"USING PASSWORDS\n"
" FTP\n"
"  To ftp files using name+passwd, include them in the URL like:\n"
752 " curl ftp://name:passwd@machine.domain:port/full/path/to/file\n"
754 " or specify them with the -u flag like\n"
756 " curl -u name:passwd ftp://machine.domain:port/full/path/to/file\n"
" HTTP\n"
"  The HTTP URL doesn't support user and password in the URL string. Curl\n"
761 " does support that anyway to provide a ftp-style interface and thus you can\n"
762 " pick a file like:\n"
764 " curl http://name:passwd@machine.domain/full/path/to/file\n"
766 " or specify user and password separately like in\n"
768 " curl -u name:passwd http://machine.domain/full/path/to/file\n"
770 " NOTE! Since HTTP URLs don't support user and password, you can't use that\n"
771 " style when using Curl via a proxy. You _must_ use the -u style fetch\n"
772 " during such circumstances.\n"
" HTTPS\n"
"  Probably most commonly used with private certificates, as explained below.\n"
" GOPHER\n"
"  Curl features no password support for gopher.\n"
"PROXY\n"
"  Get an ftp file using a proxy named my-proxy that uses port 888:\n"
786 " curl -x my-proxy:888 ftp://ftp.leachsite.com/README\n"
788 " Get a file from a HTTP server that requires user and password, using the\n"
789 " same proxy as above:\n"
791 " curl -u user:passwd -x my-proxy:888 http://www.get.this/\n"
793 " Some proxies require special authentication. Specify by using -U as above:\n"
795 " curl -U user:passwd -x my-proxy:888 http://www.get.this/\n"
"  See also the environment variables Curl supports that offer further proxy\n"
"  control.\n"
"RANGES\n"
"  With HTTP 1.1 byte-ranges were introduced. Using this, a client can request\n"
803 " to get only one or more subparts of a specified document. Curl supports\n"
804 " this with the -r flag.\n"
806 " Get the first 100 bytes of a document:\n"
808 " curl -r 0-99 http://www.get.this/\n"
810 " Get the last 500 bytes of a document:\n"
812 " curl -r -500 http://www.get.this/\n"
"  Curl also supports simple ranges for FTP files. Then you can only\n"
815 " specify start and stop position.\n"
817 " Get the first 100 bytes of a document using FTP:\n"
"  curl -r 0-99 ftp://www.get.this/README\n"
"UPLOADING\n"
" FTP\n"
"  Upload all data on stdin to a specified ftp site:\n"
827 " curl -t ftp://ftp.upload.com/myfile\n"
829 " Upload data from a specified file, login with user and password:\n"
831 " curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile\n"
"  Upload a local file to the remote site, and use the local file name on\n"
"  the remote side as well:\n"
836 " curl -T uploadfile -u user:passwd ftp://ftp.upload.com/\n"
838 " Upload a local file to get appended to the remote file using ftp:\n"
840 " curl -T localfile -a ftp://ftp.upload.com/remotefile\n"
"  NOTE: Curl does not support ftp upload through a proxy! The reason for this\n"
"  is simply that proxies are seldom configured to allow this and that no\n"
"  author has supplied code that makes it possible!\n"
" HTTP\n"
"  Upload all data on stdin to a specified http site:\n"
850 " curl -t http://www.upload.com/myfile\n"
852 " Note that the http server must've been configured to accept PUT before this\n"
853 " can be done successfully.\n"
855 " For other ways to do http data upload, see the POST section below.\n"
"VERBOSE / DEBUG\n"
"  If curl fails where it isn't supposed to, if the servers don't let you\n"
860 " in, if you can't understand the responses: use the -v flag to get VERBOSE\n"
861 " fetching. Curl will output lots of info and all data it sends and\n"
862 " receives in order to let the user see all client-server interaction.\n"
864 " curl -v ftp://ftp.upload.com/\n"
866 "DETAILED INFORMATION\n"
868 " Different protocols provide different ways of getting detailed information\n"
869 " about specific files/documents. To get curl to show detailed information\n"
"  about a single file, you should use the -I/--head option. It displays all\n"
871 " available info on a single file for HTTP and FTP. The HTTP information is a\n"
872 " lot more extensive.\n"
874 " For HTTP, you can get the header information (the same as -I would show)\n"
875 " shown before the data by using -i/--include. Curl understands the\n"
876 " -D/--dump-header option when getting files from both FTP and HTTP, and it\n"
877 " will then store the headers in the specified file.\n"
879 " Store the HTTP headers in a separate file:\n"
881 " curl --dump-header headers.txt curl.haxx.nu\n"
883 " Note that headers stored in a separate file can be very useful at a later\n"
884 " time if you want curl to use cookies sent by the server. More about that in\n"
885 " the cookies section.\n"
"POST (HTTP)\n"
"  It's easy to post data using curl. This is done using the -d <data>\n"
890 " option. The post data must be urlencoded.\n"
892 " Post a simple \"name\" and \"phone\" guestbook.\n"
894 " curl -d \"name=Rafael%20Sagula&phone=3320780\" \\\n"
895 " http://www.where.com/guest.cgi\n"
897 " How to post a form with curl, lesson #1:\n"
899 " Dig out all the <input> tags in the form that you want to fill in. (There's\n"
900 " a perl program called formfind.pl on the curl site that helps with this).\n"
902 " If there's a \"normal\" post, you use -d to post. -d takes a full \"post\n"
903 " string\", which is in the format\n"
905 " <variable1>=<data1>&<variable2>=<data2>&...\n"
907 " The 'variable' names are the names set with \"name=\" in the <input> tags, and\n"
908 " the data is the contents you want to fill in for the inputs. The data *must*\n"
909 " be properly URL encoded. That means you replace space with + and that you\n"
910 " write weird letters with %XX where XX is the hexadecimal representation of\n"
911 " the letter's ASCII code.\n"
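The encoding rule above can be sketched as a small shell function (the function name and the sample strings are made up for illustration; curl does not do this encoding for you when you use -d):

```shell
# Minimal urlencoder sketch: spaces become '+', bytes outside the safe
# set become %XX where XX is the hex value of the character's code.
urlencode() {
  s=$1; out=""
  while [ -n "$s" ]; do
    c=${s%"${s#?}"}          # first character of the string
    s=${s#?}                 # the rest of the string
    case $c in
      [a-zA-Z0-9._~-]) out="$out$c" ;;
      " ")             out="$out+" ;;
      *)               out="$out$(printf '%%%02X' "'$c")" ;;
    esac
  done
  printf '%s' "$out"
}

urlencode "Rafael Sagula"    # -> Rafael+Sagula
```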
915 " (page located at http://www.formpost.com/getthis/)\n"
917 " <form action=\"post.cgi\" method=\"post\">\n"
918 " <input name=user size=10>\n"
919 " <input name=pass type=password size=10>\n"
920 " <input name=id type=hidden value=\"blablabla\">\n"
921 " <input name=ding value=\"submit\">\n"
924 " We want to enter user 'foobar' with password '12345'.\n"
926 " To post to this, you enter a curl command line like:\n"
928 " curl -d \"user=foobar&pass=12345&id=blablabla&ding=submit\" (continues)\n"
929 " http://www.formpost.com/getthis/post.cgi\n"
932 " While -d uses the application/x-www-form-urlencoded mime-type, generally\n"
933 " understood by CGIs and similar, curl also supports the more capable\n"
934 " multipart/form-data type. This latter type supports things like file upload.\n"
936 " -F accepts parameters like -F \"name=contents\". If you want the contents to\n"
937 " be read from a file, use <@filename> as contents. When specifying a file,\n"
938 " you can also specify which content type the file is, by appending\n"
939 " ';type=<mime type>' to the file name. You can also post contents of several\n"
940 " files in one field. That way, the field name 'coolfiles' can be used to\n"
941 " send three files with different content types, in a manner similar to:\n"
943 " curl -F \"coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html\" \\\n"
944 " http://www.post.com/postit.cgi\n"
946 " If content-type is not specified, curl will try to guess from the extension\n"
947 " (it only knows a few), or use the previously specified type (from an earlier\n"
948 " file if several files are specified in a list) or finally use the default\n"
949 " type 'text/plain'.\n"
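The guessing step described above can be pictured as a lookup of this shape (the mapping below is purely illustrative; curl's real internal table is small and may differ):

```shell
# Illustrative extension -> content-type guess, falling back to the
# default 'text/plain' as the manual describes.
guess_type() {
  case $1 in
    *.gif)        echo image/gif ;;
    *.jpg|*.jpeg) echo image/jpeg ;;
    *.html|*.htm) echo text/html ;;
    *)            echo text/plain ;;
  esac
}

guess_type fil1.gif     # -> image/gif
guess_type notes.xyz    # -> text/plain
```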
951 " Emulate a fill-in form with -F. Let's say you fill in three fields in a\n"
952 " form. One field is the name of a file to post, one field is your name and\n"
953 " one field is a file description. We want to post the file we have written\n"
954 " named \"cooltext.txt\". To let curl do the posting of this data instead of your\n"
955 " favourite browser, you have to check out the HTML of the form page to get to\n"
956 " know the names of the input fields. In our example, the input field names are\n"
957 " 'file', 'yourname' and 'filedescription'.\n"
959 " curl -F \"file=@cooltext.txt\" -F \"yourname=Daniel\" \\\n"
960 " -F \"filedescription=Cool text file with cool text inside\" \\\n"
961 " http://www.post.com/postit.cgi\n"
963 " So, to send two files in one post you can do it in two ways:\n"
965 " 1. Send multiple files in a single \"field\" with a single field name:\n"
967 " curl -F \"pictures=@dog.gif,cat.gif\" \n"
969 " 2. Send two fields with two field names: \n"
971 " curl -F \"docpicture=@dog.gif\" -F \"catpicture=@cat.gif\" \n"
975 " An HTTP request has the option to include information about which address\n"
976 " referred to the actual page, and curl allows the user to specify that\n"
977 " referrer on the command line. It is especially useful to fool or trick\n"
978 " stupid servers or CGI scripts that rely on that information being\n"
979 " available or containing certain data.\n"
981 " curl -e www.coolsite.com http://www.showme.com/\n"
985 " An HTTP request has the option to include information about the browser\n"
986 " that generated the request. Curl allows it to be specified on the command\n"
987 " line. It is especially useful to fool or trick stupid servers or CGI\n"
988 " scripts that only accept certain browsers.\n"
992 " curl -A 'Mozilla/3.0 (Win95; I)' http://www.nationsbank.com/\n"
994 " Other common strings:\n"
995 " 'Mozilla/3.0 (Win95; I)' Netscape Version 3 for Windows 95\n"
996 " 'Mozilla/3.04 (Win95; U)' Netscape Version 3 for Windows 95\n"
997 " 'Mozilla/2.02 (OS/2; U)' Netscape Version 2 for OS/2\n"
998 " 'Mozilla/4.04 [en] (X11; U; AIX 4.2; Nav)' NS for AIX\n"
999 " 'Mozilla/4.05 [en] (X11; U; Linux 2.0.32 i586)' NS for Linux\n"
1001 " Note that Internet Explorer tries hard to be compatible in every way:\n"
1002 " 'Mozilla/4.0 (compatible; MSIE 4.01; Windows 95)' MSIE for W95\n"
1004 " Mozilla is not the only possible User-Agent name:\n"
1005 " 'Konqueror/1.0' KDE File Manager desktop client\n"
1006 " 'Lynx/2.7.1 libwww-FM/2.14' Lynx command line browser\n"
1010 " Cookies are generally used by web servers to keep state information at the\n"
1011 " client's side. The server sets cookies by sending a response line in the\n"
1012 " headers that looks like 'Set-Cookie: <data>' where the data part then\n"
1013 " typically contains a set of NAME=VALUE pairs (separated by semicolons ';'\n"
1014 " like \"NAME1=VALUE1; NAME2=VALUE2;\"). The server can also specify for what\n"
1015 " path the \"cookie\" should be used (by specifying \"path=value\"), when the\n"
1016 " cookie should expire (\"expires=DATE\"), for what domain to use it\n"
1017 " (\"domain=NAME\") and if it should be used on secure connections only\n"
1018 " (\"secure\").\n"
1020 " If you've received a page from a server that contains a header like:\n"
1021 " Set-Cookie: sessionid=boo123; path=\"/foo\";\n"
1023 " it means the server wants that first pair passed on when we get anything in\n"
1024 " a path beginning with \"/foo\".\n"
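That NAME=VALUE part is exactly what -b wants. As a sketch (the file name 'headers.txt' and the site are made up here, and the header is faked instead of fetched), you could pull the pair out of a stored header with sed:

```shell
# Suppose 'headers.txt' was saved with --dump-header; here we fake one:
printf 'Set-Cookie: sessionid=boo123; path="/foo";\n' > headers.txt

# Extract the NAME=VALUE part (everything before the first ';'):
cookie=$(sed -n 's/^Set-Cookie: *\([^;]*\).*/\1/p' headers.txt | head -1)

echo "would run: curl -b \"$cookie\" http://www.sillypage.com/"
```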
1026 " Example, get a page that wants my name passed in a cookie:\n"
1028 " curl -b \"name=Daniel\" www.sillypage.com\n"
1030 " Curl also has the ability to use previously received cookies in following\n"
1031 " sessions. If you get cookies from a server and store them in a file in a\n"
1032 " manner similar to:\n"
1034 " curl --dump-header headers www.example.com\n"
1036 " ... you can then, in a second connection to that (or another) site, use the\n"
1037 " cookies from the 'headers' file like:\n"
1039 " curl -b headers www.example.com\n"
1041 " Note that by specifying -b you enable \"cookie awareness\" and with -L you\n"
1042 " can make curl follow a Location: header (which is often used in combination\n"
1043 " with cookies). So if a site sends cookies and a location, you can use a\n"
1044 " non-existing file to trigger the cookie awareness, like:\n"
1046 " curl -L -b empty-file www.example.com\n"
1048 " The file to read cookies from must be formatted using plain HTTP headers OR\n"
1049 " as netscape's cookie file. Curl will determine what kind it is based on\n"
1050 " the file contents.\n"
1054 " The progress meter exists to show a user that something actually is\n"
1055 " happening. The different fields in the output have the following meaning:\n"
1057 "   % Total    % Received % Xferd  Average Speed          Time             Curr.\n"
1058 "                                  Dload  Upload Total    Current  Left    Speed\n"
1059 "   0  151M    0   38608    0     0   9406      0  4:41:43  0:00:04  4:41:39  9287\n"
1061 " From left-to-right:\n"
1062 " % - percentage completed of the whole transfer\n"
1063 " Total - total size of the whole expected transfer\n"
1064 " % - percentage completed of the download\n"
1065 " Received - currently downloaded amount of bytes\n"
1066 " % - percentage completed of the upload\n"
1067 " Xferd - currently uploaded amount of bytes\n"
1069 " Dload - the average transfer speed of the download\n"
1071 " Upload - the average transfer speed of the upload\n"
1072 " Time Total - expected time to complete the operation\n"
1073 " Time Current - time passed since the invocation\n"
1074 " Time Left - expected time left to completion\n"
1075 " Curr.Speed - the average transfer speed over the last 5 seconds (the\n"
1076 " first 5 seconds of a transfer are of course based on less time)\n"
1078 " The -# option will display a totally different progress bar that doesn't\n"
1079 " need much explanation!\n"
1083 " Curl allows the user to set conditions regarding transfer speed that must\n"
1084 " be met to let the transfer keep going. By using the -y and -Y switches you\n"
1085 " can make curl abort transfers if the transfer speed doesn't exceed your\n"
1086 " given lowest limit for a specified time.\n"
1088 " To let curl abandon downloading this page if it is slower than 3000 bytes\n"
1089 " per second for 1 minute, run:\n"
1091 " curl -y 3000 -Y 60 www.far-away-site.com\n"
1093 " This can very well be used in combination with the overall time limit, so\n"
1094 " that the above operation must be completed as a whole within 30 minutes:\n"
1096 " curl -m 1800 -y 3000 -Y 60 www.far-away-site.com\n"
1100 " Curl automatically tries to read the .curlrc file (or _curlrc file on win32\n"
1101 " systems) from the user's home dir on startup. The config file should be\n"
1102 " made up with normal command line switches. Comments can be used within the\n"
1103 " file. If the first character on a line is a '#', the rest of the line\n"
1104 " is treated as a comment.\n"
1106 " Example, set default time out and proxy in a config file:\n"
1108 " # We want a 30 minute timeout:\n"
1109 " -m 1800\n"
1110 " # ... and we use a proxy for all accesses:\n"
1111 " -x proxy.our.domain.com:8080\n"
1113 " White spaces ARE significant at the end of lines, but all white spaces\n"
1114 " leading up to the first character of each line are ignored.\n"
1116 " Prevent curl from reading the default file by using -q as the first command\n"
1117 " line parameter, like:\n"
1119 " curl -q www.thatsite.com\n"
1121 " Force curl to get and display a local help page in case it is invoked\n"
1122 " without a URL, by making a config file similar to:\n"
1124 " # default url to get\n"
1125 " http://help.with.curl.com/curlhelp.html\n"
1127 " You can specify another config file to be read by using the -K/--config\n"
1128 " flag. If you set config file name to \"-\" it'll read the config from stdin,\n"
1129 " which can be handy if you want to hide options from being visible in\n"
1130 " process tables:\n"
1132 " echo \"-u user:passwd\" | curl -K - http://that.secret.site.com\n"
1136 " When using curl in your own very special programs, you may end up needing\n"
1137 " to pass on your own custom headers when getting a web page. You can do\n"
1138 " this by using the -H flag.\n"
1140 " Example, send the header \"X-you-and-me: yes\" to the server when getting\n"
1141 " a page:\n"
1143 " curl -H \"X-you-and-me: yes\" www.love.com\n"
1145 " This can also be useful in case you want curl to send a different text in\n"
1146 " a header than it normally does. The -H header you specify then replaces the\n"
1147 " header curl would normally send.\n"
1149 "FTP and PATH NAMES\n"
1151 " Do note that when getting files with the ftp:// URL, the given path is\n"
1152 " relative to the directory you enter. To get the file 'README' from your\n"
1153 " home directory at your ftp site, do:\n"
1155 " curl ftp://user:passwd@my.site.com/README\n"
1157 " But if you want the README file from the root directory of that very same\n"
1158 " site, you need to specify the absolute file name:\n"
1160 " curl ftp://user:passwd@my.site.com//README\n"
1162 " (I.e. with an extra slash in front of the file name.)\n"
1164 "FTP and firewalls\n"
1166 " The FTP protocol requires one of the involved parties to open a second\n"
1167 " connection as soon as data is about to get transferred. There are two ways\n"
1168 " to do this.\n"
1170 " The default way for curl is to issue the PASV command, which causes the\n"
1171 " server to open another port and await another connection performed by the\n"
1172 " client. This is good if the client is behind a firewall that doesn't allow\n"
1173 " incoming connections.\n"
1175 " curl ftp.download.com\n"
1177 " If the server, for example, is behind a firewall that doesn't allow\n"
1178 " connections on ports other than 21 (or if it just doesn't support the PASV\n"
1179 " command), the other way to do it is to use the PORT command and instruct\n"
1180 " the server to connect to the client on the given (as parameters to the\n"
1181 " PORT command) IP number and port.\n"
1183 " The -P flag to curl allows for different options. Your machine may have\n"
1184 " several IP-addresses and/or network interfaces and curl allows you to select\n"
1185 " which of them to use. The default address can also be used:\n"
1187 " curl -P - ftp.download.com\n"
1189 " Download with PORT but use the IP address of our 'le0' interface:\n"
1191 " curl -P le0 ftp.download.com\n"
1193 " Download with PORT but use 192.168.0.10 as our IP address to use:\n"
1195 " curl -P 192.168.0.10 ftp.download.com\n"
1199 " Secure HTTP requires SSL libraries to be installed and used when curl is\n"
1200 " built. If that is done, curl is capable of retrieving and posting documents\n"
1201 " using the HTTPS protocol.\n"
1205 " curl https://www.secure-site.com\n"
1207 " Curl is also capable of using your personal certificates to get/post files\n"
1208 " from sites that require valid certificates. The only drawback is that the\n"
1209 " certificate needs to be in PEM-format. PEM is a standard and open format to\n"
1210 " store certificates with, but it is not used by the most commonly used\n"
1211 " browsers (Netscape and MSIE both use the so-called PKCS#12 format). If you\n"
1212 " want curl to use the certificates you use with your (favourite) browser, you\n"
1213 " may need to download/compile a converter that can convert your browser's\n"
1214 " formatted certificates to PEM formatted ones. This kind of converter is\n"
1215 " included in recent versions of OpenSSL, and for older versions Dr Stephen\n"
1216 " N. Henson has written a patch for SSLeay that adds this functionality. You\n"
1217 " can get his patch (that requires an SSLeay installation) from his site at:\n"
1218 " http://www.drh-consultancy.demon.co.uk/\n"
1220 " Example on how to automatically retrieve a document using a certificate with\n"
1221 " a personal password:\n"
1223 " curl -E /path/to/cert.pem:password https://secure.site.com/\n"
1225 " If you neglect to specify the password on the command line, you will be\n"
1226 " prompted for the correct password before any data can be received.\n"
1228 " Many older SSL servers have problems with SSLv3 or TLS, which newer\n"
1229 " versions of OpenSSL etc use; therefore it is sometimes useful to specify\n"
1230 " what SSL version curl should use. Use -3 or -2 to specify that exact SSL\n"
1231 " version to use:\n"
1233 " curl -2 https://secure.site.com/\n"
1235 " Otherwise, curl will first attempt to use v3 and then v2.\n"
1237 "RESUMING FILE TRANSFERS\n"
1239 " To continue a file transfer where it was previously aborted, curl supports\n"
1240 " resume on http(s) downloads as well as ftp uploads and downloads.\n"
1242 " Continue downloading a document:\n"
1244 " curl -c -o file ftp://ftp.server.com/path/file\n"
1246 " Continue uploading a document(*1):\n"
1248 " curl -c -T file ftp://ftp.server.com/path/file\n"
1250 " Continue downloading a document from a web server(*2):\n"
1252 " curl -c -o file http://www.server.com/\n"
1254 " (*1) = This requires that the ftp server supports the non-standard command\n"
1255 " SIZE. If it doesn't, curl will say so.\n"
1257 " (*2) = This requires that the web server supports at least HTTP/1.1. If it\n"
1258 " doesn't, curl will say so.\n"
1262 " HTTP allows a client to specify a time condition for the document it\n"
1263 " requests: either If-Modified-Since or If-Unmodified-Since. Curl allows\n"
1264 " you to specify them with the -z/--time-cond flag.\n"
1266 " For example, you can easily make a download that only gets performed if the\n"
1267 " remote file is newer than a local copy. It would be made like:\n"
1269 " curl -z local.html http://remote.server.com/remote.html\n"
1271 " Or you can download a file only if the local file is newer than the remote\n"
1272 " one. Do this by prepending the date string with a '-', as in:\n"
1274 " curl -z -local.html http://remote.server.com/remote.html\n"
1276 " You can specify a \"free text\" date as condition. Tell curl to only download\n"
1277 " the file if it was updated since yesterday:\n"
1279 " curl -z yesterday http://remote.server.com/remote.html\n"
1281 " Curl will then accept a wide range of date formats. You can always make\n"
1282 " the date check the other way around by prepending it with a dash '-'.\n"
1288 " curl dict://dict.org/m:curl\n"
1289 " curl dict://dict.org/d:heisenbug:jargon\n"
1290 " curl dict://dict.org/d:daniel:web1913\n"
1292 " Aliases for 'm' are 'match' and 'find', and aliases for 'd' are 'define'\n"
1293 " and 'lookup'. For example,\n"
1295 " curl dict://dict.org/find:curl\n"
1297 " Commands that break the URL description of the RFC (but not the DICT\n"
1298 " protocol) are:\n"
1300 " curl dict://dict.org/show:db\n"
1301 " curl dict://dict.org/show:strat\n"
1303 " Authentication is still missing (but this is not required by the RFC)\n"
1307 " If you have installed the OpenLDAP library, curl can take advantage of it\n"
1308 " and offer ldap:// support.\n"
1310 " LDAP is a complex thing and writing an LDAP query is not an easy task. I do\n"
1311 " advise you to dig up the syntax description for that elsewhere; RFC 1959 if\n"
1312 " no other place is better.\n"
1314 " To show you an example, this is how I can get all people from my local LDAP\n"
1315 " server that have a certain sub-domain in their email address:\n"
1317 " curl -B \"ldap://ldap.frontec.se/o=frontec??sub?mail=*sth.frontec.se\"\n"
1319 " If I want the same info in HTML format, I can get it by not using the -B\n"
1320 " (enforce ASCII) flag.\n"
1322 "ENVIRONMENT VARIABLES\n"
1324 " Curl reads and understands the following environment variables:\n"
1326 " HTTP_PROXY, HTTPS_PROXY, FTP_PROXY, GOPHER_PROXY\n"
1328 " They should be set for protocol-specific proxies. A general proxy should\n"
1329 " be set with ALL_PROXY.\n"
1333 " A comma-separated list of host names that shouldn't go through any proxy is\n"
1334 " set in NO_PROXY (only an asterisk, '*' matches all hosts).\n"
1338 " If a tail substring of the domain-path for a host matches one of these\n"
1339 " strings, transactions with that node will not be proxied.\n"
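That tail-substring rule can be sketched as a small shell check (the function name and host names are made up for this sketch; this mimics the rule, it is not curl's actual code):

```shell
# Succeeds (exit 0) if $1 (the host) should bypass the proxy, given the
# comma-separated suffix list in $2, using the tail-substring rule.
no_proxy_match() {
  host=$1
  [ "$2" = "*" ] && return 0      # a single asterisk matches every host
  old_ifs=$IFS; IFS=','
  set -f                          # no filename globbing while splitting
  set -- $2
  set +f
  IFS=$old_ifs
  for suffix; do
    case $host in
      *"$suffix") return 0 ;;     # host ends with this suffix
    esac
  done
  return 1
}

no_proxy_match www.example.com "example.com,localhost" && echo direct
```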
1342 " The usage of the -x/--proxy flag overrides the environment variables.\n"
1346 " Unix introduced the .netrc concept a long time ago. It is a way for a user\n"
1347 " to specify name and password for commonly visited ftp sites in a file so\n"
1348 " that you don't have to type them in each time you visit those sites. You\n"
1349 " realize this is a big security risk if someone else gets hold of your\n"
1350 " passwords, so therefore most unix programs won't read this file unless it\n"
1351 " is only readable by yourself (curl doesn't care though).\n"
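To satisfy that only-readable-by-yourself check, the usual convention is mode 600 on the file (a sketch; the guard just avoids an error if you have no .netrc yet):

```shell
# Restrict ~/.netrc to its owner (read/write for you only); programs
# that check permissions expect something like -rw-------.
[ -f ~/.netrc ] && chmod 600 ~/.netrc
```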
1353 " Curl supports .netrc files if told so (using the -n/--netrc option). This is\n"
1354 " not restricted to only ftp, but curl can use it for all protocols where\n"
1355 " authentication is used.\n"
1357 " A very simple .netrc file could look something like:\n"
1359 " machine curl.haxx.nu login iamdaniel password mysecret\n"
1363 " To better allow script programmers to get to know about the progress of\n"
1364 " curl, the -w/--write-out option was introduced. Using this, you can specify\n"
1365 " what information from the previous transfer you want to extract.\n"
1367 " To display the amount of bytes downloaded together with some text and an\n"
1368 " ending newline:\n"
1370 " curl -w 'We downloaded %{size_download} bytes\\n' www.download.com\n"
1374 " We have an open mailing list to discuss curl, its development and things\n"
1375 " relevant to this.\n"
1377 " To subscribe, mail curl-request@contactor.se with \"subscribe <your email\n"
1378 " address>\" in the body.\n"
1380 " To post to the list, mail curl@contactor.se.\n"
1382 " To unsubscribe, mail curl-request@contactor.se with \"unsubscribe <your\n"
1383 " subscribed email address>\" in the body.\n"