
Streaming decode #73

@zuiderkwast


Hi!

Parsing a document incrementally, feeding the decoder chunks as they arrive over the socket as TCP packets, could speed up the total handling of a request.

We're receiving ~2 MB JSON requests over HTTP/2, where each document is interleaved with other requests on the same HTTP/2 connection. This makes the per-request latency even higher when there are many concurrent requests on the same connection.

An idea is to use an API similar to jsx's streaming decode API:

1> {incomplete, F} = jsx:decode(<<"[">>, [stream]).
{incomplete,#Fun<jsx_decoder.1.122947756>}
2> F(end_stream).  % can also be `F(end_json)`
** exception error: bad argument
3> {incomplete, G} = F(<<"]">>).
{incomplete,#Fun<jsx_decoder.1.122947756>}
4> G(end_stream).  % can also be `G(end_json)`
[]

We could use another representation too. Any preference? Would you be willing to accept a PR?
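To illustrate how such an API would be used in practice, here is a minimal sketch of a receive loop that feeds socket chunks into the decoder as they arrive and finalizes at end of stream. `recv_chunk/1` is a hypothetical helper standing in for whatever delivers the next HTTP/2 DATA frame payload; the `{incomplete, Fun}` shape follows the jsx-style session above.

```erlang
%% Sketch only: assumes a streaming decode API shaped like jsx's,
%% where every feed returns {incomplete, Fun} until finalized.
decode_stream(Socket) ->
    {incomplete, F} = jsx:decode(<<>>, [stream]),
    loop(Socket, F).

loop(Socket, F) ->
    %% recv_chunk/1 is a hypothetical helper returning the next
    %% body chunk for this request, or eof when the body is done.
    case recv_chunk(Socket) of
        {chunk, Bin} ->
            {incomplete, G} = F(Bin),
            loop(Socket, G);
        eof ->
            %% Finalize; returns the decoded term or raises if the
            %% document is incomplete, as in the session above.
            F(end_stream)
    end.
```

The point is that decoding work overlaps with network I/O, so by the time the last chunk arrives most of the document has already been parsed.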
