[Emacs-diffs] /srv/bzr/emacs/trunk r107233: Fix previous URL doc change


From: Glenn Morris
Subject: [Emacs-diffs] /srv/bzr/emacs/trunk r107233: Fix previous URL doc change
Date: Fri, 10 Feb 2012 09:22:09 -0800
User-agent: Bazaar (2.3.1)

------------------------------------------------------------
revno: 107233
committer: Glenn Morris <address@hidden>
branch nick: trunk
timestamp: Fri 2012-02-10 09:22:09 -0800
message:
  Fix previous URL doc change
  
  * lisp/url/url-queue.el (url-queue-retrieve): Fix previous doc fix.
  
  * doc/misc/url.texi (Retrieving URLs): Update url-retrieve arguments.
  Mention url-queue-retrieve.
  
  * etc/NEWS: Related edit.
modified:
  doc/misc/url.texi
  etc/NEWS
  lisp/url/ChangeLog
  lisp/url/url-queue.el
=== modified file 'doc/misc/url.texi'
--- a/doc/misc/url.texi 2012-02-10 03:23:47 +0000
+++ b/doc/misc/url.texi 2012-02-10 17:22:09 +0000
@@ -216,10 +216,10 @@
 @vindex url-queue-parallel-processes
 @vindex url-queue-timeout
 @defun url-queue-retrieve url callback &optional cbargs silent no-cookies
-This acts like the @code{url-retrieve} function, but downloads in
-parallel.  The option @code{url-queue-parallel-processes} controls the
-number of concurrent processes, and the option @code{url-queue-timeout}
-sets a timeout in seconds.
+This acts like the @code{url-retrieve} function, but with limits on
+the degree of parallelism.  The option @code{url-queue-parallel-processes}
+controls the number of concurrent processes, and the option
+@code{url-queue-timeout} sets a timeout in seconds.
 @end defun
 
 @node Supported URL Types

=== modified file 'etc/NEWS'
--- a/etc/NEWS  2012-02-10 17:13:54 +0000
+++ b/etc/NEWS  2012-02-10 17:22:09 +0000
@@ -858,8 +858,9 @@
 remote machines that support SELinux.
 
 +++
-** New function, url-queue-retrieve, fetches URLs asynchronously like
-url-retrieve does, but in parallel.
+** New function, `url-queue-retrieve', which behaves like url-retrieve,
+but with limits (`url-queue-parallel-processes', `url-queue-timeout') on
+the degree of parallelism.
 
 ** VC and related modes
 

=== modified file 'lisp/url/ChangeLog'
--- a/lisp/url/ChangeLog        2012-02-10 11:36:42 +0000
+++ b/lisp/url/ChangeLog        2012-02-10 17:22:09 +0000
@@ -1,3 +1,7 @@
+2012-02-10  Glenn Morris  <address@hidden>
+
+       * url-queue.el (url-queue-retrieve): Fix previous doc fix.
+
 2012-02-10  Andreas Schwab  <address@hidden>
 
        * url-http.el (url-http-clean-headers): Return the number of

=== modified file 'lisp/url/url-queue.el'
--- a/lisp/url/url-queue.el     2012-02-10 03:23:47 +0000
+++ b/lisp/url/url-queue.el     2012-02-10 17:22:09 +0000
@@ -57,9 +57,9 @@
 (defun url-queue-retrieve (url callback &optional cbargs silent inhibit-cookies)
   "Retrieve URL asynchronously and call CALLBACK with CBARGS when finished.
 This is like `url-retrieve' (which see for details of the arguments),
-but downloads in parallel.  The variable `url-queue-parallel-processes'
-sets the number of concurrent processes.  The variable `url-queue-timeout'
-sets a timeout."
+but with limits on the degree of parallelism.  The variable
+`url-queue-parallel-processes' sets the number of concurrent processes.
+The variable `url-queue-timeout' sets a timeout."
   (setq url-queue
        (append url-queue
                (list (make-url-queue :url url


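For context, here is a minimal usage sketch of the interface documented above.
It is illustrative only: the URLs, the callback name, and the option values are
assumptions, not part of this commit; the callback convention (STATUS followed
by CBARGS, run in the buffer holding the response) is the one inherited from
`url-retrieve'.

;; Illustrative sketch only, not part of this commit; the URLs, the
;; callback name, and the option values below are assumptions.
(require 'url-queue)

;; Limit the degree of parallelism, per the documentation above.
(setq url-queue-parallel-processes 2   ; at most two concurrent fetches
      url-queue-timeout 10)            ; give up on a fetch after 10 seconds

(defun my-url-queue-callback (status label)
  "Report the fetch of LABEL; STATUS is as for `url-retrieve'."
  ;; As with `url-retrieve', the callback runs in the buffer holding the
  ;; response, with STATUS followed by the CBARGS given below.
  (message "Fetched %s: %d bytes, status %S" label (buffer-size) status)
  (kill-buffer (current-buffer)))

;; Queue several fetches; they proceed asynchronously, at most
;; `url-queue-parallel-processes' at a time.
(dolist (url '("http://www.gnu.org/" "http://www.gnu.org/software/emacs/"))
  (url-queue-retrieve url #'my-url-queue-callback (list url) t))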