diff --git a/README.md b/README.md
index 15062dc2..0ed1ce6e 100644
--- a/README.md
+++ b/README.md
@@ -33,7 +33,7 @@ The approach begins with using static code analysis via the [Kantra](https://git

 ### Launch the Kai backend with sample data

-The quickest way to get running is to leverage sample data commited into the Kai repo along with the `podman compose up` workflow
+The quickest way to get running is to leverage sample data committed into the Kai repo along with the `podman compose up` workflow

 1. `git clone https://github.com/konveyor/kai.git`
 1. `cd kai`
diff --git a/docs/Getting_Started.md b/docs/Getting_Started.md
index 238cc1bb..eaf397fe 100644
--- a/docs/Getting_Started.md
+++ b/docs/Getting_Started.md
@@ -33,7 +33,7 @@ Steps:
 1. Optional Configuration changes _ok to skip and use the defaults if using cached responses_
    1. Make changes to `kai/config.toml` to select your desired provider and model
    1. Export `GENAI_KEY` or `OPENAI_API_KEY` as appropriate as per [docs/LLM_Selection.md](/docs/LLM_Selection.md)
-   1. Note: By default the `stable` image tag will be used by podman compose.yaml. If you want to run with an alternate tag you can export the environment varaible: `TAG="stable"` with any tag you would like to use.
+   1. Note: By default the `stable` image tag will be used by podman compose.yaml. If you want to run with an alternate tag, export the environment variable `TAG` set to the tag you would like to use, for example `TAG="stable"`.
 1. Run `podman compose up`. The first time this is run it will take several minutes to download images and to populate sample data.
    - After the first run the DB will be populated and subsequent starts will be much faster, as long as the kai_kai_db_data volume is not deleted.
    - To clean up all resources run `podman compose down && podman volume rm kai_kai_db_data`.
@@ -73,7 +73,7 @@ _Konveyor integration is still being developed and is not yet fully integrated._

 You may also run the Kai server from a python virtual environment to aid testing local changes without needing to build a container image.

-- See [docs/contrib/Dev_Environment.md](docs/contrib/Dev_Environment.md)
+- See [docs/contrib/Dev_Environment.md](/docs/contrib/Dev_Environment.md)

 ### Example CLI Script in Python
@@ -83,7 +83,7 @@ You may also run the Kai server from a python virtual environment to aid testing

 ### Extending the data Kai consumes

-- You may modify the analysis information Kai consumes via [docs/customApps.md](docs/customApps.md)
+- You may modify the analysis information Kai consumes via [docs/customApps.md](/docs/customApps.md)

 ### Misc notes with `podman compose`
diff --git a/docs/contrib/Tracing.md b/docs/contrib/Tracing.md
index 13ee3d4d..d2027703 100644
--- a/docs/contrib/Tracing.md
+++ b/docs/contrib/Tracing.md
@@ -40,7 +40,7 @@ Example of hierarchy:
 │   │   ├── prompt << The formatted prompt prior to sending to LLM >>
 │   │   └── prompt_vars.json << The prompt variables which are injected into the prompt template >>
 │   ├── params.json << Request parameters >>
-│   └── timing << Duration of a Succesful Request >>
+│   └── timing << Duration of a Successful Request >>
 └── src
     └── main
         ├── java
diff --git a/docs/design/solvedIncidentStore.md b/docs/design/solvedIncidentStore.md
index 8c6ad633..72e759f6 100644
--- a/docs/design/solvedIncidentStore.md
+++ b/docs/design/solvedIncidentStore.md
@@ -38,7 +38,7 @@ In the longer term, additional layers of processing will be implemented to ensur

 ### Long-term Data Storage

-Analysis reports are stored permantently. Solutions are stored in a separate portion of the database for use with Retrieval-Augmented Generation (RAG) prompts. Solutions are considered recomputable from the original reports. Permanent storage of analysis reports allows for:
+Analysis reports are stored permanently. Solutions are stored in a separate portion of the database for use with Retrieval-Augmented Generation (RAG) prompts. Solutions are considered recomputable from the original reports. Permanent storage of analysis reports allows for:

 - Future data extraction, if additional data is identified for extraction.
 - Using the data for fine-tuning models, providing flexibility for ongoing improvements.
diff --git a/docs/scenarios/demo.md b/docs/scenarios/demo.md
index d1ca0d46..37e263fe 100644
--- a/docs/scenarios/demo.md
+++ b/docs/scenarios/demo.md
@@ -23,7 +23,7 @@ In this step, we will configure the Kai IDE plugin within VSCode to showcase the

 ### Setup Kai VSCode IDE plugin

-- Follow along the steps listed in [here](https://github.com/konveyor-ecosystem/kai/tree/main/ide) to intergrate Kai IDE plugin with VSCode.
+- Follow the steps listed [here](https://github.com/konveyor-ecosystem/kai/tree/main/ide) to integrate the Kai IDE plugin with VSCode.

 - Before starting the Kai server, Select model `meta-llama/llama-3-70b-instruct` by uncommenting the following block in `kai/config.toml` file
@@ -101,7 +101,7 @@ Let's clone the Coolstore application, which we will be used demo the migration

 We will analyze the Coolstore application using the following migration targets to identify potential areas for improvement:

-- containterization
+- containerization
 - jakarta-ee
 - jakarta-ee8+
-- jakarata-ee9+
+- jakarta-ee9+
@@ -222,4 +222,4 @@ mvn clean compile package -Dquarkus.kubernetes.deploy=true

 ### Conclusion

-In this demo, we showcased the capability of Kai in facilitating various types of code migrations within the Coolstore application. By leveraging Kai's capabilities, organizations can expedite the modernization process. If you are intereted to learn more about our ongoing efforts and future plans, please reach out to us in the [slack channel](https://kubernetes.slack.com/archives/CR85S82A2)
+In this demo, we showcased how Kai facilitates various types of code migrations within the Coolstore application. By leveraging these capabilities, organizations can expedite the modernization process. If you are interested in learning more about our ongoing efforts and future plans, please reach out to us in the [slack channel](https://kubernetes.slack.com/archives/CR85S82A2).
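
For reference, the quickstart flow touched by the README and docs/Getting_Started.md hunks above boils down to the shell sketch below. This is a minimal illustration, not part of the diff: it assumes a POSIX shell with podman and its compose support installed, and the `GENAI_KEY` value is a placeholder you would replace per docs/LLM_Selection.md.

```bash
#!/usr/bin/env bash
# Minimal sketch of the Kai quickstart described in the docs above.
# Assumes podman with compose support is installed; values are illustrative.
set -euo pipefail

git clone https://github.com/konveyor/kai.git
cd kai

# Optional: select a provider/model in kai/config.toml, then export the
# matching credential (GENAI_KEY or OPENAI_API_KEY) per docs/LLM_Selection.md.
export GENAI_KEY="<your-key-here>"   # placeholder, not a real key

# Optional: override the default image tag used by compose.yaml.
export TAG="stable"

# First run downloads images and populates sample data; subsequent starts
# are much faster as long as the kai_kai_db_data volume is not deleted.
podman compose up

# Cleanup (run separately once you are done):
# podman compose down && podman volume rm kai_kai_db_data
```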