I got confirmation: the 'real' problem with AI here is the context window,
so basically the size of the input.
It can't be exceeded, there's no way around it, the limit is enforced by the system itself.
In this case, the feedback I was given is this: automate the processing (file upload + execution of the instructions) with a script plus a model's API:
Either it's text processing and you can go through RAG, i.e. load everything and then the LLM searches the RAG (a kind of contextual database),
or you write a script that makes one LLM call per batch of XX lines.
In my case: on one side, a taxonomy made up of categories and subcategories; on the other, a CSV of 14,000 lines/products to classify.
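The batching route could look roughly like this: split the 14,000 rows into chunks small enough that taxonomy + chunk fit in the context window, and send one request per chunk. A minimal sketch in Python, assuming an OpenAI-style chat-completion endpoint (the URL, model name, batch size, and prompt wording are all placeholders to adapt):

```python
import csv
import json
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"  # any chat-completion API works
BATCH_SIZE = 50  # the "XX lines" per call; tune so taxonomy + batch stay under the context window

def batches(rows, size=BATCH_SIZE):
    """Split the product rows into fixed-size chunks."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def build_prompt(taxonomy, batch):
    """One prompt per chunk: the full taxonomy plus that chunk's products."""
    products = "\n".join(";".join(row) for row in batch)
    return (
        "Classify each product below into exactly one category > subcategory "
        "from this taxonomy.\n\n"
        f"Taxonomy:\n{taxonomy}\n\n"
        f"Products (one per line):\n{products}\n\n"
        "Answer with one 'product -> category > subcategory' line per product."
    )

def classify_batch(api_key, model, taxonomy, batch):
    """Send one chunk to the model and return its raw text answer."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": build_prompt(taxonomy, batch)}],
    }
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Hypothetical file names and key; replace with your own.
    with open("products.csv", newline="") as f:
        rows = list(csv.reader(f))
    taxonomy = open("taxonomy.txt").read()
    for chunk in batches(rows):
        print(classify_batch("YOUR_API_KEY", "gpt-4o-mini", taxonomy, chunk))
```

At 50 lines per call, 14,000 rows is 280 requests, which is cheap to loop over; the main knob is BATCH_SIZE versus how big your taxonomy text is.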
If anyone sees a way to solve this case in Dust, I'm interested.