Add generic inference engine to allow dynamic selection by the user #1226
The generic inference engine lets users select the underlying inference engine dynamically via the inference_engine environment variable. The class also accepts a default parameter, which specifies the inference engine to use when the environment variable is not set.

For example, when the inference_engine variable is not set, the engine
defaults to ibm_gen_ai.llama_3_8b_instruct. However, if
inference_engine=ollama.llama2 is set, it switches to the Ollama model.
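The selection logic can be sketched roughly as follows. This is a minimal illustration, not the PR's actual implementation; the function name resolve_inference_engine and the constant DEFAULT_ENGINE are hypothetical, while the engine specs and the inference_engine variable name come from the description above.

```python
import os

# Hypothetical default, matching the behavior described above.
DEFAULT_ENGINE = "ibm_gen_ai.llama_3_8b_instruct"

def resolve_inference_engine(default: str = DEFAULT_ENGINE) -> str:
    """Return the engine spec from the inference_engine env var,
    falling back to the given default when the variable is unset."""
    return os.environ.get("inference_engine", default)

# With the variable unset, the IBM GenAI model is used:
print(resolve_inference_engine())  # → ibm_gen_ai.llama_3_8b_instruct

# Setting inference_engine=ollama.llama2 switches to the Ollama model:
os.environ["inference_engine"] = "ollama.llama2"
print(resolve_inference_engine())  # → ollama.llama2
```

The actual class would then instantiate the concrete engine named by this spec rather than just returning the string.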
This update allows lm-eval users to run UniTXT datasets with LLM-as-a-Judge (LLMaaJ) using any supported inference engine.