IHTSDO/llm-chain-entity-extraction

Using AI Chains for SNOMED CT entity extraction from free text clinical notes

LLM Chains allow us to divide a complex task into a Chain of smaller tasks, where the output of one prompt becomes the input of the next.

This is not significantly slower, despite requiring a greater number of calls to the LLM, because short outputs take proportionally less time to generate. Although the input prompt tokens must also be processed on each call, input tokens are cheaper to process than generated output tokens in the case of LLaMA 2.
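As a minimal sketch of the idea (with a stubbed-out call_llm standing in for a real OpenAI, Bard, or Llama client, and canned responses in place of model output), the output of one prompt becomes the input of the next:

```python
def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM call; returns canned, deterministic answers."""
    if prompt.startswith("List the clinical findings"):
        return "chest pain; shortness of breath"
    return "29857009 | Chest pain |; 267036007 | Dyspnea |"

def extract_entities(note: str) -> str:
    # Step 1: extract candidate clinical entities from the free text.
    findings = call_llm(f"List the clinical findings in this note: {note}")
    # Step 2: the output of step 1 is fed into the next prompt, which maps
    # each finding to a SNOMED CT concept.
    return call_llm(f"Map these findings to SNOMED CT concepts: {findings}")

print(extract_entities("Patient reports chest pain and shortness of breath."))
# → 29857009 | Chest pain |; 267036007 | Dyspnea |
```

Breaking the task into two focused prompts keeps each individual output short, which is what makes the chain competitive in wall-clock time.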

Chain design

[Figure: diagram of the prompt chain design]

How to run the Entity Extractor

The entity_extractor.py script extracts clinical entities from free text using LLMs. It uses the FHIR API to retrieve clinical data and the OpenAI, BARD, or Llama API to extract entities from the text.

Prerequisites

To run the entity_extractor.py script, you will need:

  • Python 3.10 or later.
  • The dependencies listed in requirements.txt.
  • An OpenAI API key (if using OpenAI models) or a local Llama-2 model file (if using Llama).

Usage

To use the entity_extractor.py script, follow these steps:

  • Clone the repository to your local machine.
  • Open the project in your preferred code editor.
  • Install any necessary dependencies by running pip install -r requirements.txt in your terminal.
  • If you need to use OpenAI models, save your OpenAI API key in a file named "openai.key".
  • Run the script by typing python entity_extractor.py --api <api> --model <model> in your terminal, where <api> is the name of the LLM API you want to use (openai, bard, or llama) and <model> is the model name for OpenAI or the path to the model file in the case of Llama-2.
  • View the output in the terminal or in the output pane of your code editor.
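The command-line interface described above can be sketched with argparse; the --api, --model, and --sentences flags are taken from the examples below, while the help strings and the short -a alias are assumptions:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Build a parser matching the flags used in the README examples."""
    parser = argparse.ArgumentParser(
        description="Extract clinical entities from free text clinical notes."
    )
    parser.add_argument("-a", "--api", choices=["openai", "bard", "llama"],
                        required=True, help="LLM API to use")
    parser.add_argument("--model", required=True,
                        help="Model name (OpenAI) or path to a model file (Llama-2)")
    parser.add_argument("--sentences",
                        help="Path to a file containing the clinical text")
    return parser

args = build_parser().parse_args(
    ["--api", "openai", "--model", "gpt-4o",
     "--sentences", "example_cases/clinical_text.txt"]
)
print(args.api, args.model)
# → openai gpt-4o
```

Passing an unknown value for --api (anything other than openai, bard, or llama) makes argparse exit with a usage error.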

Examples:

python3.10 entity_extractor.py --api llama --model /my-drive/models/llama2/llama-2-13b-chat/ggml-model-q4_0.bin --sentences=example_cases/clinical_text.txt

python3 entity_extractor.py --api openai --model gpt-4o  --sentences=example_cases/clinical_text.txt

