[GNUnet-SVN] [gnurl] 48/116: curl: speed up handling of many URLs
From: gnunet
Subject: [GNUnet-SVN] [gnurl] 48/116: curl: speed up handling of many URLs
Date: Tue, 05 Dec 2017 14:51:18 +0100
This is an automated email from the git hooks/post-receive script.
ng0 pushed a commit to branch master
in repository gnurl.
commit ee8016b3de0b0e3e9a352a1123c5e6272848aa55
Author: Daniel Stenberg <address@hidden>
AuthorDate: Sat Nov 4 12:56:30 2017 +0100
curl: speed up handling of many URLs
By properly keeping track of the last entry in the list of URLs/uploads
to handle, curl now avoids many meaningless traversals of the list, which
speeds up many-URL handling *MASSIVELY* (several orders of magnitude on
100K URLs).
Added test 1291 to verify that it doesn't take ages - but the test suite
has no detection of a "too slow" command.
Reported-by: arainchik on github
Fixes #1959
Closes #2052
---
src/tool_cfgable.h | 1 +
src/tool_getparam.c | 20 +++++++++----------
tests/data/Makefile.inc | 2 +-
tests/data/test1291 | 51 +++++++++++++++++++++++++++++++++++++++++++++++++
4 files changed, 63 insertions(+), 11 deletions(-)
diff --git a/src/tool_cfgable.h b/src/tool_cfgable.h
index 23943fe7b..ddfc9bfce 100644
--- a/src/tool_cfgable.h
+++ b/src/tool_cfgable.h
@@ -114,6 +114,7 @@ struct OperationConfig {
struct getout *url_last; /* point to the last/current node */
struct getout *url_get; /* point to the node to fill in URL */
struct getout *url_out; /* point to the node to fill in outfile */
+ struct getout *url_ul; /* point to the node to fill in upload */
char *cipher_list;
char *proxy_cipher_list;
char *cert;
diff --git a/src/tool_getparam.c b/src/tool_getparam.c
index b65c45732..12e3abd55 100644
--- a/src/tool_getparam.c
+++ b/src/tool_getparam.c
@@ -787,7 +787,7 @@ ParameterError getparameter(const char *flag, /* f or -long-flag */
url = config->url_get;
else
/* there was no free node, create one! */
- url = new_getout(config);
+ config->url_get = url = new_getout(config);
if(!url)
return PARAM_NO_MEM;
@@ -1787,7 +1787,7 @@ ParameterError getparameter(const char *flag, /* f or -long-flag */
url = config->url_out;
else
/* there was no free node, create one! */
- url = new_getout(config);
+ config->url_out = url = new_getout(config);
if(!url)
return PARAM_NO_MEM;
@@ -1912,23 +1912,23 @@ ParameterError getparameter(const char *flag, /* f or -long-flag */
/* we are uploading */
{
struct getout *url;
- if(!config->url_out)
- config->url_out = config->url_list;
- if(config->url_out) {
+ if(!config->url_ul)
+ config->url_ul = config->url_list;
+ if(config->url_ul) {
/* there's a node here, if it already is filled-in continue to find
an "empty" node */
- while(config->url_out && (config->url_out->flags & GETOUT_UPLOAD))
- config->url_out = config->url_out->next;
+ while(config->url_ul && (config->url_ul->flags & GETOUT_UPLOAD))
+ config->url_ul = config->url_ul->next;
}
/* now there might or might not be an available node to fill in! */
- if(config->url_out)
+ if(config->url_ul)
/* existing node */
- url = config->url_out;
+ url = config->url_ul;
else
/* there was no free node, create one! */
- url = new_getout(config);
+ config->url_ul = url = new_getout(config);
if(!url)
return PARAM_NO_MEM;
diff --git a/tests/data/Makefile.inc b/tests/data/Makefile.inc
index 35c41a5c1..9104f34f5 100644
--- a/tests/data/Makefile.inc
+++ b/tests/data/Makefile.inc
@@ -137,7 +137,7 @@ test1252 test1253 test1254 test1255 test1256 test1257 test1258 test1259 \
test1260 test1261 test1262 \
\
test1280 test1281 test1282 test1283 test1284 test1285 test1286 test1287 \
-test1288 test1289 test1290 \
+test1288 test1289 test1290 test1291 \
test1298 test1299 \
test1300 test1301 test1302 test1303 test1304 test1305 test1306 test1307 \
test1308 test1309 test1310 test1311 test1312 test1313 test1314 test1315 \
diff --git a/tests/data/test1291 b/tests/data/test1291
new file mode 100644
index 000000000..12d65f3d8
--- /dev/null
+++ b/tests/data/test1291
@@ -0,0 +1,51 @@
+# This test case is primarily meant to verify that parsing and adding the 100K
+# files is a swift operation.
+#
+<testcase>
+<info>
+<keywords>
+HTTP
+HTTP PUT
+</keywords>
+</info>
+
+#
+# Server-side
+<reply>
+<data>
+</data>
+</reply>
+
+# Client-side
+<client>
+<server>
+none
+</server>
+<name>
+Attempt to upload 100K files but fail immediately
+</name>
+<command>
+-K log/cmd1291 --fail-early
+</command>
+<file name="log/upload-this">
+XXXXXXXx
+</file>
+# generate the config file
+<precheck>
+perl -e 'for(1 .. 100000) { printf("upload-file=log/upload-this\nurl=htttttp://non-existing-host.haxx.se/upload/1291\n", $_);}' > log/cmd1291;
+</precheck>
+</client>
+
+# Verify data after the test has been "shot"
+<verify>
+<errorcode>
+1
+</errorcode>
+
+# we disable valgrind here since it takes 40+ seconds even on a fairly snappy
+# machine
+<valgrind>
+disable
+</valgrind>
+</verify>
+</testcase>