From: Markus Armbruster
Subject: Re: [Qemu-devel] [PATCH 41/56] json: Nicer recovery from invalid leading zero
Date: Tue, 14 Aug 2018 10:24:07 +0200
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/26.1 (gnu/linux)

Eric Blake <address@hidden> writes:

> On 08/08/2018 07:03 AM, Markus Armbruster wrote:
>> For input 0123, the lexer produces the tokens
>>
>>      JSON_ERROR    01
>>      JSON_INTEGER  23
>>
>> Reporting an error is correct; 0123 is invalid according to RFC 7159.
>> But the error recovery isn't nice.
>>
>> Make the finite state machine eat digits before going into the error
>> state.  The lexer now produces
>>
>>      JSON_ERROR    0123
>>
>> Signed-off-by: Markus Armbruster <address@hidden>
>> ---
>>   qobject/json-lexer.c | 7 ++++++-
>>   1 file changed, 6 insertions(+), 1 deletion(-)
>>
>
>> @@ -158,10 +159,14 @@ static const uint8_t json_lexer[][256] =  {
>>       /* Zero */
>>       [IN_ZERO] = {
>>           TERMINAL(JSON_INTEGER),
>> -        ['0' ... '9'] = IN_ERROR,
>> +        ['0' ... '9'] = IN_BAD_ZERO,
>>           ['.'] = IN_MANTISSA,
>>       },
>>   +    [IN_BAD_ZERO] = {
>> +        ['0' ... '9'] = IN_BAD_ZERO,
>> +    },
>> +
>
> Should IN_BAD_ZERO also consume '.' and/or 'e' (after all, 01e2 is a
> valid C constant, but not a valid JSON literal)?  But I think your
> choice here is fine (again, add too much, and then the lexer has to
> track a lot of state; whereas this minimal addition catches the most
> obvious things with little effort).

My patch is of marginal value to begin with.  It improves error recovery
only for the "integer with redundant leading zero" case.  I guess that's
more common than "floating-point with redundant leading zero".

An obvious way to extend it to "number with redundant leading zero"
would be cloning the lexer states related to numbers.  Clean, but six
more states.  Meh.

Another way is to dumb down the lexer not to care about leading zero,
and catch it in parse_literal() instead.  Basically duplicates the lexer
state machine up to where leading zero is recognized.  Not much code,
but meh.

Yet another way is to have the lexer eat "digit salad" after redundant
leading zero:

    [IN_BAD_ZERO] = {
        ['0' ... '9'] = IN_BAD_ZERO,
        ['.'] = IN_BAD_ZERO,
        ['e'] = IN_BAD_ZERO,
        ['-'] = IN_BAD_ZERO,
        ['+'] = IN_BAD_ZERO,
    },

Eats even crap like 01e...  But then the same crap without leading zero
should also be eaten.  We'd have to add state transitions to IN_BAD_ZERO
to the six states related to numbers.  Perhaps clever use of gcc's range
initialization lets us do this compactly.  Otherwise, meh again.

Opinions?

I'd also be fine with dropping this patch.

> Reviewed-by: Eric Blake <address@hidden>

Thanks!


