Why would the model be trained on outdated prices? I'm not talking about LLMs, but a separate model designed to parse visual information - specifically websites - and extract particular elements like prices. My comment about ChatGPT was in reference to the newer models which can relay visual information; I'm not suggesting that would be the right approach for training a new model.
The applications would be broader than just prices - this would allow you to scrape any human-readable website without needing to do bespoke development.
I am not sure that would work. You could train a model that analyzes data and then feed it the data you want to transform. The data wouldn't be training data then, but part of your request.
Like you can feed a book into GPT4/5 and then ask questions about it.
For what you describe you wouldn't really need AI, just a more or less fuzzy parser (like the scan-a-receipt, get-the-prices OCR apps). Unless I didn't get it.
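To make the "fuzzy parser" point concrete, here is a minimal sketch of what I mean: a plain regex pass over the page's visible text that pulls out price-looking strings, no model involved. The pattern and currency list are illustrative, not exhaustive, and `extract_prices` is just a name I made up for the example.

```python
# Minimal sketch of a "fuzzy" price extractor, assuming prices appear as
# plain text like "$12.99" or "15.00 USD" in the page's visible content.
# The regex and currency codes below are illustrative, not exhaustive.
import re

PRICE_RE = re.compile(
    r"""
    (?:[$€£]\s?\d{1,3}(?:[.,]\d{3})*(?:[.,]\d{2})?)              # $12.99, €1.299,00
    |
    (?:\d{1,3}(?:[.,]\d{3})*(?:[.,]\d{2})?\s?(?:USD|EUR|GBP))    # 15.00 USD
    """,
    re.VERBOSE,
)

def extract_prices(text: str) -> list[str]:
    """Return every price-looking substring found in the given text."""
    return PRICE_RE.findall(text)

if __name__ == "__main__":
    sample = "Widget A costs $12.99 today, down from 15.00 USD last week."
    print(extract_prices(sample))  # ['$12.99', '15.00 USD']
```

Something like this covers the narrow "get the prices" case; the trade-off is that every new site layout or currency format means tweaking the pattern, which is exactly the bespoke work a trained visual model would be trying to avoid.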