Does the getEntries function block once the response has been returned?

Does anyone have insight into whether the getEntries() function blocks once the response is returned to my code? We have a situation where we query quite a large JSON tree, and I think that once the data is returned it might be taking a long time to deserialize it into JS objects.

I tried to research this by querying the data using the basic http API and compared the time to the js SDK. The basic query took about a second. The SDK query took about 4 seconds. I don’t mind the 4 second query if the entire thing is non-blocking.

I don’t know how to determine if the SDK is blocking once it returns to my code. If it is I might try to fork it myself to try to improve my app’s concurrency.
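One way to check this without forking the SDK might be to watch for event-loop stalls around the call. Here is a rough sketch; `fetchEntries` is a stand-in I made up, not the real SDK, and it simulates an async fetch followed by a synchronous `JSON.parse` of a large payload, which is roughly what I suspect the SDK does:

```javascript
// Hypothetical stand-in for the SDK's getEntries: an async "fetch"
// followed by a synchronous parse of a large JSON payload.
function fetchEntries() {
  return new Promise((resolve) => {
    setImmediate(() => {
      const big = JSON.stringify({
        items: Array.from({ length: 200000 }, (_, i) => ({ id: i })),
      });
      // JSON.parse runs synchronously on the main thread, so a large
      // payload here will stall any timers that were due to fire.
      resolve(JSON.parse(big));
    });
  });
}

// A 10 ms heartbeat: if the event loop is blocked, ticks stop advancing.
let ticks = 0;
const heartbeat = setInterval(() => ticks++, 10);

fetchEntries().then((data) => {
  clearInterval(heartbeat);
  console.log(`items: ${data.items.length}, heartbeat ticks: ${ticks}`);
});
```

If the heartbeat count is much lower than the elapsed time would predict, the call spent that gap blocking the loop. The same heartbeat trick should work wrapped around the real getEntries call.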

Hi Dustin,
the JS SDK is promise-based, so it should not block, unless you are using async/await.

Hi Khaled,

Even though the call to Contentful is asynchronous, the work to deserialize the JSON to js objects is probably synchronous. I don’t see any code in your source that indicates that it does that work on a separate thread.

To your other point, “await” doesn’t actually block thread execution. It will still allow the Node.js thread to execute other requests and other actions. It only blocks the current execution flow. That’s why this is important to me. I would love it if the entire fetch and deserialization of results is truly async.
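A tiny sketch of what I mean by "await only blocks the current execution flow": while an async function is suspended at an await, other timers and callbacks still run on the event loop (the names here are just for illustration):

```javascript
// Demonstrates that "await" suspends only this async function;
// other queued work still runs while it waits.
const events = [];

async function waiter() {
  events.push("before await");
  await new Promise((resolve) => setTimeout(resolve, 50));
  events.push("after await");
}

// Scheduled for 10 ms, so it fires while waiter() is suspended.
setTimeout(() => events.push("other work"), 10);

waiter().then(() => console.log(events));
// → [ 'before await', 'other work', 'after await' ]
```

The "other work" callback runs in the middle, which is exactly the concurrency I'd lose if the SDK's deserialization step were a long synchronous block.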

I’m fairly sure that the work that the SDK does after the response is returned is blocking and quite slow. That’s why I mentioned that I compared the basic HTTP API with the js SDK, because the time difference is quite long.

I’m not 100% sure of what I’m saying above, but I’m fairly confident. There’s no magic in a promise that makes something run on another thread if the operation isn’t already asynchronous.

Thanks for your response.


Hi Dustin,

The SDK performs link resolution by default, and this can slow things down if the payload is large or complex.
Can you try loading fewer items per page using the limit and skip query parameters, and see if that helps?
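Something like the following, roughly. This is only a sketch of the paging pattern; `fetchPage` is a fake stand-in serving 250 items locally, where the real code would call something like `client.getEntries({ limit, skip })`:

```javascript
// Fake data source standing in for the Contentful API:
// returns pages of a 250-item collection, shaped like a CDA response
// ({ items, total, limit, skip }).
const TOTAL = 250;

function fetchPage(limit, skip) {
  const items = [];
  for (let i = skip; i < Math.min(skip + limit, TOTAL); i++) {
    items.push({ id: i });
  }
  return Promise.resolve({ items, total: TOTAL, limit, skip });
}

// Page through the collection instead of pulling one huge response.
async function fetchAll(pageSize = 100) {
  const all = [];
  for (let skip = 0; ; skip += pageSize) {
    const page = await fetchPage(pageSize, skip);
    all.push(...page.items);
    // Each await yields back to the event loop, so even if each
    // page's deserialization is synchronous, the stalls are short
    // and other requests can be served between pages.
    if (skip + page.items.length >= page.total) break;
  }
  return all;
}

fetchAll().then((all) => console.log(all.length)); // → 250
```

Even when you ultimately need every item, smaller pages break one long synchronous stall into several short ones.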


That makes a lot of sense and matches what I’m seeing. Our data set is really big; for new work we’re going to focus on getting it down to a reasonable size. I’ll look into limit and skip, but due to an unfortunate design I think we need all of that data to load our pages.

Thanks for your reply.