docs/fix links #6498
Merged (4 commits) on Jun 20, 2023
2 changes: 1 addition & 1 deletion docs/docs_skeleton/docs/get_started/introduction.mdx
@@ -54,7 +54,7 @@ Learn best practices for developing with LangChain.
LangChain is part of a rich ecosystem of tools that integrate with our framework and build on top of it. Check out our growing list of [integrations](/docs/ecosystem/integrations/) and [dependent repos](/docs/ecosystem/dependents.html).

### [Additional resources](/docs/additional_resources/)
-Our community is full of prolific developers, creative builders, and fantastic teachers. Check out [YouTube tutorials](/docs/ecosystem/youtube.html) for great tutorials from folks in the community, and [Gallery](https://github.com/kyrolabs/awesome-langchain) for a list of awesome LangChain projects, compiled by the folks at [KyroLabs](https://kyrolabs.com).
+Our community is full of prolific developers, creative builders, and fantastic teachers. Check out [YouTube tutorials](/docs/additional_resources/youtube.html) for great tutorials from folks in the community, and [Gallery](https://github.com/kyrolabs/awesome-langchain) for a list of awesome LangChain projects, compiled by the folks at [KyroLabs](https://kyrolabs.com).

<h3><span style={{color:"#2e8555"}}> Support </span></h3>

@@ -1,6 +1,6 @@
# Document QA

-Here we walk through how to use LangChain for question answering over a list of documents. Under the hood we'll be using our [Document chains](../document.html).
+Here we walk through how to use LangChain for question answering over a list of documents. Under the hood we'll be using our [Document chains](/docs/modules/chains/document/).

import Example from "@snippets/modules/chains/additional/question_answering.mdx"

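The "stuff" document-chain idea behind the Document QA page above can be illustrated with a dependency-free sketch; `fake_llm` and `stuff_qa` are invented stand-ins for this example, not LangChain APIs:

```python
import re

def fake_llm(prompt: str) -> str:
    """Toy stand-in for a model call: return the context line sharing
    the most words with the final (question) line of the prompt."""
    lines = [l for l in prompt.splitlines() if l.strip()]
    qwords = set(re.findall(r"\w+", lines[-1].lower()))
    return max(lines[:-1], key=lambda l: len(qwords & set(re.findall(r"\w+", l.lower()))))

def stuff_qa(docs: list[str], question: str) -> str:
    """The 'stuff' strategy: pack every document into one prompt, ask once."""
    prompt = "\n".join(docs) + f"\nQuestion: {question}"
    return fake_llm(prompt)

answer = stuff_qa(
    ["Paris is the capital of France.", "Berlin is the capital of Germany."],
    "What is the capital of France?",
)
```

A real document chain swaps `fake_llm` for an actual model and adds prompt templates, but the packing step is the same idea.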
4 changes: 2 additions & 2 deletions docs/docs_skeleton/docusaurus.config.js
@@ -23,8 +23,8 @@ const config = {
// For GitHub pages deployment, it is often '/<projectName>/'
baseUrl: "/",

-onBrokenLinks: "ignore",
-onBrokenMarkdownLinks: "ignore",
+onBrokenLinks: "warn",
+onBrokenMarkdownLinks: "throw",

plugins: [
() => ({
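The behavioral difference between the two new values in this config hunk can be sketched outside Docusaurus in plain Python ("warn" reports broken links but lets the build continue, while "throw" fails it; the function and names here are invented for illustration):

```python
import warnings

def check_links(links, known_pages, on_broken="warn"):
    """Toy link checker mirroring Docusaurus's "ignore"/"warn"/"throw" policies."""
    broken = sorted(set(links) - set(known_pages))
    if broken and on_broken == "throw":
        raise ValueError(f"broken links: {broken}")
    if broken and on_broken == "warn":
        warnings.warn(f"broken links: {broken}")
    return broken

# "warn" surfaces the problem without stopping the run:
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    broken = check_links(["/docs/a", "/docs/missing"], ["/docs/a"])
```

Switching `onBrokenMarkdownLinks` from "ignore" to "throw" is the stricter choice: once the links in this PR are fixed, any future regression fails the build instead of shipping silently.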
2 changes: 1 addition & 1 deletion docs/extras/ecosystem/integrations/awadb.md
@@ -18,4 +18,4 @@ whether for semantic search or example selection.
from langchain.vectorstores import AwaDB
```

-For a more detailed walkthrough of the AwaDB wrapper, see [this notebook](../modules/indexes/vectorstores/examples/awadb.ipynb)
+For a more detailed walkthrough of the AwaDB wrapper, see [here](/docs/modules/data_connection/vectorstores/integrations/awadb.html).
8 changes: 4 additions & 4 deletions docs/extras/ecosystem/integrations/databricks.md
@@ -12,12 +12,12 @@ Databricks embraces the LangChain ecosystem in various ways:

Databricks connector for the SQLDatabase Chain
----------------------------------------------
-You can connect to [Databricks runtimes](https://docs.databricks.com/runtime/index.html) and [Databricks SQL](https://www.databricks.com/product/databricks-sql) using the SQLDatabase wrapper of LangChain. See the notebook [Connect to Databricks](./databricks/databricks.html) for details.
+You can connect to [Databricks runtimes](https://docs.databricks.com/runtime/index.html) and [Databricks SQL](https://www.databricks.com/product/databricks-sql) using the SQLDatabase wrapper of LangChain. See the notebook [Connect to Databricks](/docs/ecosystem/integrations/databricks/databricks.html) for details.
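For orientation, a SQLAlchemy-style connection URI of the shape used by the databricks-sql-connector dialect can be assembled like this (the hostname, warehouse path, and token below are placeholders, and the commented LangChain call is the assumed usage, not executed here):

```python
def databricks_uri(host: str, http_path: str, token: str) -> str:
    """Build a `databricks://` SQLAlchemy-style URI for a SQL warehouse
    (assumed shape; consult the connector docs for your workspace)."""
    return f"databricks://token:{token}@{host}?http_path={http_path}"

uri = databricks_uri(
    host="adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace
    http_path="/sql/1.0/warehouses/abcdef1234567890",   # placeholder warehouse
    token="dapi-XXXX",                                  # placeholder token
)

# With real credentials you would then hand this to LangChain, e.g.:
# from langchain import SQLDatabase
# db = SQLDatabase.from_uri(uri)
```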

Databricks-managed MLflow integrates with LangChain
---------------------------------------------------

-MLflow is an open source platform to manage the ML lifecycle, including experimentation, reproducibility, deployment, and a central model registry. See the notebook [MLflow Callback Handler](./mlflow_tracking.ipynb) for details about MLflow's integration with LangChain.
+MLflow is an open source platform to manage the ML lifecycle, including experimentation, reproducibility, deployment, and a central model registry. See the notebook [MLflow Callback Handler](/docs/ecosystem/integrations/mlflow_tracking.ipynb) for details about MLflow's integration with LangChain.
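The callback-handler mechanism that the MLflow integration plugs into can be sketched generically (plain Python; `Tracker` and `run_llm` are invented for this sketch, not MLflow or LangChain APIs):

```python
class Tracker:
    """Stand-in for an experiment tracker that records callback events."""
    def __init__(self):
        self.events = []

    def log(self, name, payload):
        self.events.append((name, payload))

def run_llm(prompt, llm, callbacks=()):
    """Invoke `llm`, notifying every callback before and after the call."""
    for cb in callbacks:
        cb.log("llm_start", prompt)
    result = llm(prompt)
    for cb in callbacks:
        cb.log("llm_end", result)
    return result

tracker = Tracker()
out = run_llm("hello", llm=str.upper, callbacks=[tracker])
```

A real callback handler would forward these events to an MLflow run instead of a list, but the hook points are the same: the framework calls out at the start and end of each model invocation.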

Databricks provides a fully managed and hosted version of MLflow integrated with enterprise security features, high availability, and other Databricks workspace features such as experiment and run management and notebook revision capture. MLflow on Databricks offers an integrated experience for tracking and securing machine learning model training runs and running machine learning projects. See [MLflow guide](https://docs.databricks.com/mlflow/index.html) for more details.

@@ -26,11 +26,11 @@ Databricks-managed MLflow makes it more convenient to develop LangChain applicat
Databricks as an LLM provider
-----------------------------

-The notebook [Wrap Databricks endpoints as LLMs](../modules/models/llms/integrations/databricks.html) illustrates the method to wrap Databricks endpoints as LLMs in LangChain. It supports two types of endpoints: the serving endpoint, which is recommended for both production and development, and the cluster driver proxy app, which is recommended for interactive development.
+The notebook [Wrap Databricks endpoints as LLMs](/docs/modules/model_io/models/llms/integrations/databricks.html) illustrates the method to wrap Databricks endpoints as LLMs in LangChain. It supports two types of endpoints: the serving endpoint, which is recommended for both production and development, and the cluster driver proxy app, which is recommended for interactive development.

Databricks endpoints support Dolly, but are also great for hosting models like MPT-7B or any other models from the Hugging Face ecosystem. Databricks endpoints can also be used with proprietary models like OpenAI to provide a governance layer for enterprises.

Databricks Dolly
----------------

-Databricks’ Dolly is an instruction-following large language model trained on the Databricks machine learning platform that is licensed for commercial use. The model is available on Hugging Face Hub as databricks/dolly-v2-12b. See the notebook [Hugging Face Hub](../modules/models/llms/integrations/huggingface_hub.html) for instructions to access it through the Hugging Face Hub integration with LangChain.
+Databricks’ Dolly is an instruction-following large language model trained on the Databricks machine learning platform that is licensed for commercial use. The model is available on Hugging Face Hub as databricks/dolly-v2-12b. See the notebook [Hugging Face Hub](/docs/modules/model_io/models/llms/integrations/huggingface_hub.html) for instructions to access it through the Hugging Face Hub integration with LangChain.
2 changes: 1 addition & 1 deletion docs/extras/ecosystem/integrations/google_search.mdx
@@ -29,4 +29,4 @@ from langchain.agents import load_tools
tools = load_tools(["google-search"])
```

-For more information on this, see [this page](/docs/modules/agents/tools/getting_started.md)
+For more information on tools, see [this page](/docs/modules/agents/tools/).
2 changes: 1 addition & 1 deletion docs/extras/ecosystem/integrations/google_serper.mdx
@@ -70,4 +70,4 @@ from langchain.agents import load_tools
tools = load_tools(["google-serper"])
```

-For more information on this, see [this page](/docs/modules/agents/tools/getting_started.md)
+For more information on tools, see [this page](/docs/modules/agents/tools/).
2 changes: 1 addition & 1 deletion docs/extras/ecosystem/integrations/huggingface.mdx
@@ -66,4 +66,4 @@ For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_

The Hugging Face Hub has lots of great [datasets](https://huggingface.co/datasets) that can be used to evaluate your LLM chains.

-For a detailed walkthrough of how to use them to do so, see [this notebook](../use_cases/evaluation/huggingface_datasets.html)
+For a detailed walkthrough of how to use them to do so, see [this notebook](/docs/use_cases/evaluation/huggingface_datasets.html)
2 changes: 1 addition & 1 deletion docs/extras/ecosystem/integrations/openweathermap.mdx
@@ -41,4 +41,4 @@ from langchain.agents import load_tools
tools = load_tools(["openweathermap-api"])
```

-For more information on this, see [this page](/docs/modules/agents/tools/getting_started.md)
+For more information on tools, see [this page](/docs/modules/agents/tools/).
2 changes: 1 addition & 1 deletion docs/extras/ecosystem/integrations/promptlayer.mdx
@@ -40,7 +40,7 @@ for res in llm_results.generations:
```
You can use the PromptLayer request ID to add a prompt, score, or other metadata to your request. [Read more about it here](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).

-This LLM is identical to the [OpenAI LLM](./openai.md), except that
+This LLM is identical to the [OpenAI](/docs/ecosystem/integrations/openai.html) LLM, except that
- all your requests will be logged to your PromptLayer account
- you can add `pl_tags` when instantializing to tag your requests on PromptLayer
- you can add `return_pl_id` when instantializing to return a PromptLayer request id to use [while tracking requests](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).
2 changes: 1 addition & 1 deletion docs/extras/ecosystem/integrations/searx.mdx
@@ -87,4 +87,4 @@ arxiv_tool = SearxSearchResults(name="Arxiv", wrapper=wrapper,
})
```

-For more information on tools, see [this page](../modules/agents/tools/getting_started.md)
+For more information on tools, see [this page](/docs/modules/agents/tools/).
4 changes: 2 additions & 2 deletions docs/extras/ecosystem/integrations/vectara/index.mdx
@@ -54,7 +54,7 @@ The results are returned as a list of relevant documents, and a relevance score


For a more detailed examples of using the Vectara wrapper, see one of these two sample notebooks:
-* [Chat Over Documents with Vectara](./vectara/vectara_chat.html)
-* [Vectara Text Generation](./vectara/vectara_text_generation.html)
+* [Chat Over Documents with Vectara](./vectara_chat.html)
+* [Vectara Text Generation](./vectara_text_generation.html)
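The "list of relevant documents plus relevance score" shape mentioned in this hunk's context can be imitated with a tiny cosine-similarity sketch (plain Python; the vectors are made up and no Vectara call is involved):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

docs = {"doc_a": [1.0, 0.0], "doc_b": [0.6, 0.8]}
query = [1.0, 0.0]

# Highest-scoring documents first, each paired with its relevance score.
results = sorted(((cosine(query, vec), name) for name, vec in docs.items()), reverse=True)
```

A managed vector store does the embedding and indexing for you, but the returned structure is the same: documents ranked by a similarity score against the query.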


2 changes: 1 addition & 1 deletion docs/extras/ecosystem/integrations/wolfram_alpha.mdx
@@ -36,4 +36,4 @@ from langchain.agents import load_tools
tools = load_tools(["wolfram-alpha"])
```

-For more information on this, see [this page](/docs/modules/agents/tools/getting_started.md)
+For more information on tools, see [this page](/docs/modules/agents/tools/).
4 changes: 2 additions & 2 deletions docs/extras/guides/evaluation/openapi_eval.ipynb
@@ -7,7 +7,7 @@
"source": [
"# Evaluating an OpenAPI Chain\n",
"\n",
-"This notebook goes over ways to semantically evaluate an [OpenAPI Chain](openapi.html), which calls an endpoint defined by the OpenAPI specification using purely natural language."
+"This notebook goes over ways to semantically evaluate an [OpenAPI Chain](/docs/modules/chains/additiona/openapi.html), which calls an endpoint defined by the OpenAPI specification using purely natural language."
]
},
{
@@ -967,7 +967,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.9.1"
+"version": "3.11.3"
}
},
"nbformat": 4,