
bug#20154: 25.0.50; json-encode-string is too slow for large strings

From: Dmitry Gutov
Date: Fri, 20 Mar 2015 16:26:07 +0200

A 300KB string takes about 0.5s to encode on my machine. Example:

(defvar s (apply #'concat (cl-loop for i from 1 to 30000
                                   collect "0123456789\n")))

(length (json-encode-string s))

For comparison, the built-in json module in my local Python
installation takes only about 2ms to encode the same string.
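The Python comparison can be sketched roughly as follows (using the stdlib json and timeit modules; the exact script is an assumption):

```python
import json
import timeit

# Build the same string: 30,000 copies of "0123456789\n" (330,000 chars).
s = "0123456789\n" * 30000

# json.dumps escapes the newlines and wraps the result in quotes,
# which is what json-encode-string does on the Emacs side.
encoded = json.dumps(s)

# Average a few runs; the stdlib encoder finishes in milliseconds.
elapsed = timeit.timeit(lambda: json.dumps(s), number=10) / 10
print(f"{elapsed * 1000:.2f} ms, encoded length {len(encoded)}")
```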

This is important for advanced code completion in general, because JSON
is a common transport format, and sending the contents of the current
buffer to the server is a typical way to avoid needlessly saving it
(and running the associated hooks, etc.).

And in this specific case, our JSON encoding speed is a bottleneck when
working with ycmd, the editor-agnostic code completion daemon extracted
from a popular Vim package.

I've tried reimplementing this function using `replace-regexp-in-string'
or `re-search-forward' with a temp buffer, to minimize the number of
concatenations and `json-encode-char' calls in the fast case (all
characters are ASCII), but as long as characters that need to be encoded
(such as newlines) occur throughout the string, the speed improvement is
nowhere near an acceptable level. Should the function be written in C?
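For reference, the temp-buffer variant I tried looked roughly like this (a sketch, not the exact code; `my-json-encode-string' is a name I'm using here for illustration):

;; Scan for the characters JSON requires us to escape and rewrite them
;; in place, so long runs of plain ASCII are copied without per-char
;; consing.
(defun my-json-encode-string (string)
  (with-temp-buffer
    (insert string)
    (goto-char (point-min))
    (while (re-search-forward "[\"\\\\[:cntrl:]]" nil t)
      (let ((c (char-before)))
        (delete-char -1)
        (insert (pcase c
                  (?\" "\\\"")
                  (?\\ "\\\\")
                  (?\n "\\n")
                  (?\t "\\t")
                  (_ (format "\\u%04x" c))))))
    (concat "\"" (buffer-string) "\"")))

Even so, with an escaped character every 11 characters (the newlines above), the per-match overhead still dominates.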

In GNU Emacs (x86_64-unknown-linux-gnu, GTK+ Version 3.12.2)
 of 2015-03-20 on axl
Repository revision: 8142fc97af742e083fb83e4d0470da59b123a467
Windowing system distributor `The X.Org Foundation', version 11.0.11601901
System Description:     Ubuntu 14.10
