Yeah, it can easily do that if such a comparison service already exists. Then it's just yet another API it calls, or, more likely in this case, it just does a Bing search and summarizes the top results. But I haven't yet heard of such a service existing for groceries, so it would still need to be built.
I understand what you're saying. In theory it's possible, but in practice it's not just a matter of linearly improving LLMs until at some point they do it on their own. Any task that takes more than a few steps means the LLM's per-step errors compound.
The error rate would need to be orders of magnitude lower for that compounding not to blow up over many steps. The alternative is removing steps that the LLM needs to do, as I described. Some colleagues at my day job do basically nothing else now.
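To put rough numbers on the compounding point: if each step succeeds independently with probability p, a chain of n steps succeeds with probability p^n. The per-step rates below are assumed for illustration, not measurements, but they show why even a 95% per-step success rate collapses to roughly a third over 20 steps, while 99.9% mostly holds up.

```python
# Back-of-envelope sketch: end-to-end success of an n-step chain,
# assuming each step succeeds independently with probability p_step.
def chain_success(p_step: float, n_steps: int) -> float:
    return p_step ** n_steps

for p in (0.95, 0.99, 0.999):          # assumed per-step success rates
    for n in (5, 20, 50):              # chain lengths
        print(f"per-step {p:.1%}, {n:2d} steps -> "
              f"{chain_success(p, n):.1%} end-to-end")
```

Independence is an optimistic simplification (real agent errors often cascade), so the real picture is, if anything, worse.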