Response size too big

We’re getting a “Response size too big” error when fetching content entries, because the limit setting gets applied to every attribute on an entry. Some of our attributes link to other top-level entries, so the response ends up pulling in far too much data. Anyone else run into this kind of issue? If so, how did you solve it? Thx!

Hi @tj1,

The best way to overcome this kind of issue is to add parameters that paginate your responses and break them into smaller pieces. You can use the limit and skip parameters to reduce the number of items retrieved, and select to choose which fields are returned in the response.
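A minimal sketch of how those parameters fit together (Python; the helper name `page_params` and its defaults are mine, not part of the API — the dict it builds is what you’d send as the query string on a Content Delivery API request):

```python
def page_params(page: int, per_page: int = 5, fields=None):
    """Build the query parameters for one page of entries."""
    params = {
        "limit": per_page,        # max entries per response
        "skip": page * per_page,  # offset into the full result set
    }
    if fields:
        # select trims each returned entry down to the listed fields
        params["select"] = ",".join(fields)
    return params

# e.g. the third page, returning only IDs and titles:
params = page_params(2, per_page=5, fields=["sys.id", "fields.title"])
```

Note that when you select individual fields, the API also requires a content_type filter on the query.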

Hope this helps! :slightly_smiling_face:

@tj1 you can also use the include parameter to specify how many levels of linked references to include in the response. If you only care about the top-level entries, setting include=0 will limit the response to those entries alone.
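For illustration, a tiny sketch of the resulting query string (Python; `entries_query` is a hypothetical helper, and "page" is a made-up content type):

```python
from urllib.parse import urlencode

def entries_query(content_type: str, include: int = 0) -> str:
    """Query string for fetching entries; include=0 resolves no links."""
    return urlencode({
        "content_type": content_type,
        "include": include,  # levels of linked entries to resolve
    })

# entries_query("page") -> "content_type=page&include=0"
```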

Thanks for the info. Unfortunately, we’re already paging at 5 entries per page. And since we want the entire content tree for each top-level entry, we have to specify an include > 0, which ends up pulling in too much information about other top-level linked entries.

It does seem like we could instead pull the entries in two passes: first with include = n, omitting the linked-entries field via select; then a second pass with include = 0, selecting only the linked-entries field.
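That two-pass idea could be sketched roughly like this (Python; the field names `fields.title`, `fields.body`, and `fields.relatedEntries` are hypothetical stand-ins for your own scalar fields and the heavy links field — after both requests you’d merge the results by sys.id):

```python
def two_pass_params(content_type: str, depth: int = 3):
    """Build the two query-parameter sets for the two-pass fetch."""
    # Pass 1: resolve the content tree to `depth` levels, but omit
    # the links field so other top-level entries aren't pulled in.
    pass1 = {
        "content_type": content_type,
        "include": depth,
        "select": "sys,fields.title,fields.body",
    }
    # Pass 2: fetch only the links field, with no link resolution.
    pass2 = {
        "content_type": content_type,
        "include": 0,
        "select": "sys.id,fields.relatedEntries",
    }
    return pass1, pass2
```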