Algolia – splitting the text into smaller chunks
2 years, 8 months ago #23798
Hi, while creating the index on Algolia, some of the posts can't be indexed (indexing stops) because there is a limit of 100 KB per record. See the error here: https://pasteboard.co/JKlmC40.png (from https://www.amruta.org/el/1991/12/07/public-program-in-madras-7-dec-1991/)
Does the plugin support splitting the text into smaller chunks?
Thank you.

2 years, 8 months ago #23802
We do not provide Algolia's "distinct" feature, as described at https://www.algolia.com/doc/guides/sending-and-managing-data/prepare-your-data/how-to/indexing-long-documents/
I am surprised the article you mentioned reaches 100 KB of text.
Do you also index custom fields or taxonomies with your articles that could add to the total payload?

2 years, 8 months ago #23813
Hi, can I also put the post IDs here?
(2.2) Do not index items (post, pages, …)
I tried that, but it does not work; I am still getting:
(Algolia) “Record at the position 150 objectID=14397 is too big size=100796 bytes. Contact us if you need an extended quota”
Re-indexing and deleting the index does not help.
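As a side note, the long-document approach from the Algolia guide linked above (split each post into several records that share a common attribute, and let `distinct` deduplicate them at query time) could be sketched roughly as below. This is not something the plugin does; the attribute names, the 90 KB budget, and paragraph-based splitting are illustrative assumptions:

```python
import json

MAX_RECORD_BYTES = 100_000  # Algolia's per-record limit (~100 KB)


def split_post(post_id, title, content, budget=90_000):
    """Split one post into several records, each kept under a byte budget.

    All chunks share the same `post_id` attribute, so Algolia's `distinct`
    feature (with `attributeForDistinct` set to it) can collapse them into
    one hit at query time. Attribute names here are illustrative, not the
    plugin's actual schema.
    """
    paragraphs = content.split("\n\n")
    records, chunk, part = [], [], 0

    def flush():
        nonlocal chunk, part
        if not chunk:
            return
        records.append({
            "objectID": f"{post_id}-{part}",
            "post_id": post_id,  # shared attribute for `distinct`
            "title": title,
            "content": "\n\n".join(chunk),
        })
        chunk, part = [], part + 1

    for p in paragraphs:
        candidate = chunk + [p]
        # Measure the serialized record size, since the limit applies to
        # the whole JSON record, not just the raw text.
        size = len(json.dumps({
            "objectID": f"{post_id}-{part}",
            "post_id": post_id,
            "title": title,
            "content": "\n\n".join(candidate),
        }).encode("utf-8"))
        if size > budget and chunk:
            flush()
            chunk = [p]  # start the next chunk with this paragraph
        else:
            chunk = candidate  # a single oversized paragraph would need
            # further splitting (by sentence or word) in a real implementation
    flush()
    return records
```

A post like the one in the error above (objectID 14397, ~100,796 bytes) would then be sent as two or three smaller records instead of one oversized record that Algolia rejects.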