I got confirmation: the "real" problem with AI is the context window.
so basically the size of the input
It can't be exceeded. There's no way around it: the system itself limits it.
In this case, the feedback I was given was: automate the processing (file upload + execution of instructions) via a script plus a model's API:
Either it's text processing and you can go through RAG, i.e. load everything and let the LLM search in the RAG (a kind of contextual database),
or you write a script that makes one LLM call per XX lines.
In my case: on one side a taxonomy made up of categories and subcategories, on the other a CSV of 14,000 lines/products.
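For the second option (one LLM call per batch of lines), here is a minimal sketch of what the script could look like. Everything here is an assumption for illustration: the file names (`products.csv`, the taxonomy string), the batch size, and `classify_batch`, which is a placeholder where the real model API call (OpenAI, Mistral, etc.) would go.

```python
import csv

BATCH_SIZE = 50  # rows per call; tune so each prompt stays under the context limit


def load_batches(path, batch_size=BATCH_SIZE):
    """Yield (header, rows) tuples from the CSV, batch_size rows at a time."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)  # keep the header out of the batches
        batch = []
        for row in reader:
            batch.append(row)
            if len(batch) == batch_size:
                yield header, batch
                batch = []
        if batch:  # last partial batch
            yield header, batch


def classify_batch(taxonomy, header, rows):
    """Placeholder: build one prompt per batch; the real version sends it to a model API."""
    prompt = (
        "Assign each product below to one category/subcategory from this taxonomy:\n"
        f"{taxonomy}\n\n"
        + ",".join(header) + "\n"
        + "\n".join(",".join(r) for r in rows)
    )
    # response = client.chat.completions.create(...)  # hypothetical: actual API call
    return prompt  # in practice: return the model's parsed answer instead
```

With 14,000 products and 50 rows per batch, that is 280 calls, each with a prompt small enough to fit the context window alongside the taxonomy.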
If anyone sees a way to solve this case in Dust, I'm interested.