Algolia – splitting the text into smaller chunks
radeko (Participant) · 3 years, 7 months ago · #23798
Hi, while creating the index on Algolia, some of the posts can't be indexed (indexing stops) because there is a 100 KB limit per record. See the error here: https://pasteboard.co/JKlmC40.png (from https://www.amruta.org/el/1991/12/07/public-program-in-madras-7-dec-1991/).
Does the plugin support splitting the text into smaller chunks?
See https://www.algolia.com/doc/faq/basics/is-there-a-size-limit-for-my-index-records/
Thank you.

wpsolr (Keymaster) · 3 years, 7 months ago · #23802
We do not provide Algolia's "distinct" feature, as described at https://www.algolia.com/doc/guides/sending-and-managing-data/prepare-your-data/how-to/indexing-long-documents/
I am surprised the article you mentioned reaches 100 KB of text.
Do you also index custom fields or taxonomies with your articles that could add to the total payload?

radeko (Participant) · 3 years, 7 months ago · #23804
The text is not that long, but it is in Greek; the encoding (https://pasteboard.co/JKlmC40.png) makes it too big. Is it possible to ignore the error and continue indexing?
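For context on the size inflation: Algolia counts a record's size on its JSON representation, and Greek text grows quickly once encoded. A minimal PHP sketch (not plugin code; whether WPSOLR escapes non-ASCII characters when building the payload is an assumption here):

<?php
// Rough illustration: Greek letters expand well beyond one byte each
// once the record is JSON-encoded.
$greek = 'Δημόσιο πρόγραμμα';                              // 17 characters of Greek text

echo strlen($greek);                                       // ~33 bytes: UTF-8 needs 2 bytes per Greek letter
echo strlen(json_encode($greek));                          // ~99 bytes: default json_encode() escapes each letter as \uXXXX (6 bytes)
echo strlen(json_encode($greek, JSON_UNESCAPED_UNICODE));  // ~35 bytes: raw UTF-8 kept inside the JSON

So if the indexing pipeline escapes non-ASCII characters, a post that looks short on screen can cross 100 KB in its JSON form.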
radeko (Participant) · 3 years, 7 months ago · #23813
Hi, can I also put the post IDs here:
(2.2) Do not index items (post, pages, …)
I tried that, but it does not work; I still get:
(Algolia) “Record at the position 150 objectID=14397 is too big size=100796 bytes. Contact us if you need an extended quota”
Re-indexing and deleting the index does not help.
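Since the plugin does not provide record splitting, one possible workaround is to index the oversized post yourself using the approach from Algolia's indexing-long-documents guide: split the content into several records that share a post_id attribute and let the "distinct" setting return a single hit per post. A rough sketch, assuming the official algolia/algoliasearch-client-php package (v2/v3-style API) and standard WordPress helpers; the index name, credentials, and chunk size are placeholders:

<?php
require 'vendor/autoload.php';

use Algolia\AlgoliaSearch\SearchClient;

$client = SearchClient::create('YourAppID', 'YourAdminAPIKey');  // placeholder credentials
$index  = $client->initIndex('posts');                           // placeholder index name

// Collapse the chunks back into one logical post at query time.
$index->setSettings([
    'attributeForDistinct' => 'post_id',
    'distinct'             => true,
]);

$postId  = 14397;                                                // the post that is too large
$content = get_post_field('post_content', $postId);              // WordPress helper

// Split the content into word chunks safely below the 100 KB record limit.
$records = [];
foreach (array_chunk(preg_split('/\s+/u', $content), 500) as $i => $words) {
    $records[] = [
        'objectID' => $postId . '-' . $i,                        // unique per chunk
        'post_id'  => $postId,                                   // shared key used by "distinct"
        'content'  => implode(' ', $words),
    ];
}

$index->saveObjects($records);

Note that any custom splitting like this runs outside WPSOLR, so it would have to be re-applied whenever the post is updated or re-indexed.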