
justfile #3

Open · wants to merge 7 commits into base: main
Changes from 1 commit
172 changes: 172 additions & 0 deletions .ipynb_checkpoints/openaichat-checkpoint.ipynb
@@ -0,0 +1,172 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "e49f1e0d",
"metadata": {},
"source": [
"# OpenAIChat\n",
"\n",
"OpenAI also has a [chat model](https://platform.openai.com/docs/guides/chat) you can use. The interface is very similar to the normal OpenAI model."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "522686de",
"metadata": {},
"outputs": [],
"source": [
"from langchain.llms import OpenAIChat\n",
"from langchain import PromptTemplate, LLMChain"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "62e0dbc3",
"metadata": {},
"outputs": [],
"source": [
"llm = OpenAIChat(temperature=0)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "fbb043e6",
"metadata": {},
"outputs": [],
"source": [
"template = \"\"\"Question: {question}\"\"\"\n",
"\n",
"prompt = PromptTemplate(template=template, input_variables=[\"question\"])"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "3f945b76",
"metadata": {},
"outputs": [],
"source": [
"llm_chain = LLMChain(prompt=prompt, llm=llm)"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "25260808",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'\\n\\nAs an AI language model, I cannot have a worst or do anything. However, according to the second law of thermodynamics, entropy can never be reversed in a closed system. Entropy always increases over time, leading to a state of maximum disorder or randomness. While it is possible to decrease entropy in a localized system, the overall entropy of the universe will continue to increase.'"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"question = \"Can entropy be reversed?\"\n",
"\n",
"llm_chain.run(question)"
]
},
{
"cell_type": "markdown",
"id": "75a05b79",
"metadata": {},
"source": [
"## Prefix Messages\n",
"\n",
"OpenAI Chat also supports the idea of [prefix messages](https://platform.openai.com/docs/guides/chat/chat-vs-completions), i.e. messages that appear before the user input. These can be used as system messages to give the LLM more context and purpose."
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "c27a1501",
"metadata": {},
"outputs": [],
"source": [
"prefix_messages = [{\"role\": \"system\", \"content\": \"You are a helpful assistant that is very good at problem solving who thinks step by step.\"}]"
]
},
{
"cell_type": "code",
"execution_count": 20,
"id": "e46a914e",
"metadata": {},
"outputs": [],
"source": [
"llm = OpenAIChat(temperature=0, prefix_messages=prefix_messages)"
]
},
{
"cell_type": "code",
"execution_count": 21,
"id": "d683d9f2",
"metadata": {},
"outputs": [],
"source": [
"llm_chain = LLMChain(prompt=prompt, llm=llm)"
]
},
{
"cell_type": "code",
"execution_count": 22,
"id": "6f5b8e78",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'Step 1: Justin Bieber was born on March 1, 1994.\\nStep 2: The Super Bowl is played in February of each year.\\nStep 3: Therefore, the Super Bowl that was played in the year Justin Bieber was born was Super Bowl XXVIII, which was played on January 30, 1994.\\nStep 4: The Dallas Cowboys won Super Bowl XXVIII by defeating the Buffalo Bills with a score of 30-13.\\nStep 5: Therefore, the Dallas Cowboys were the NFL team that won the Super Bowl in the year Justin Bieber was born.'"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"question = \"What NFL team won the Super Bowl in the year Justin Bieber was born?\"\n",
"\n",
"llm_chain.run(question)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "44c0330d",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
14 changes: 14 additions & 0 deletions README.md
@@ -0,0 +1,14 @@
# Getting started
## Using `just`
`just shell`
## Manually
0. Clone this repo
1. Install [nix](https://nixos.org/download.html#nix-install-macos)
2. Run `poetry update`
3. Run `poetry shell` inside `agent/agent`
4. ??? (e.g. `python <...>`)
5. PROFIT!11!!!
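
The numbered steps above amount to roughly the following shell session. This is a sketch: the clone URL and the final entry point are placeholders, since the repo URL and the exact script to run are not specified here.

```shell
# Manual setup sketch; assumes nix and poetry are already installed and on PATH.
git clone <this-repo-url> poet   # step 0: clone (URL is a placeholder)
cd poet/agent/agent
poetry update                    # step 2: resolve and install dependencies
poetry shell                     # step 3: enter the project virtualenv
# python <...>                   # step 4: run the agent entry point of your choice
```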
## Appendix
### Reproducibility
`nix` and `poetry` allow [packaging for serverless execution](https://github.com/bananaml/serverless-template), with system, Python package, and other application dependencies (e.g. secrets) reliably derived for each individual `agent` runtime environment.

18 changes: 9 additions & 9 deletions agent/agent.py
@@ -19,13 +19,13 @@



documents = PagedPDFSplitter("/Users/barton/Lab/poet/agent/vdf.pdf").load_data()
# documents = PagedPDFSplitter("/Users/barton/Lab/poet/agent/vdf.pdf").load_data()

index = GPTSimpleVectorIndex(documents)
index.save_to_disk('index.json')
# index = GPTSimpleVectorIndex(documents)
# index.save_to_disk('index.json') # TODO: use Zulip UUID here!


pages = loader.load_and_split()
# pages = loader.load_and_split()



@@ -42,11 +42,11 @@
llm = OpenAI(temperature=0.42, model="text-davinci-003")
memory = ConversationSummaryMemory(memory_key="chat_history", llm=llm)

# notagent = initialize_agent(tools,
# llm=llm,
# agent="conversational-react-description",
# verbose=True,
# memory=memory)
notagent = initialize_agent(tools,
llm=llm,
agent="conversational-react-description",
verbose=True,
memory=memory)



12 changes: 12 additions & 0 deletions justfile
@@ -0,0 +1,12 @@
#!/usr/bin/env -S just --justfile

update:
poetry update

nix:
nix-shell

shell: nix update

gmi:
echo "only GMI in retrospect!"
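
For reference, in a `justfile` the names after the colon in `shell: nix update` are dependencies, so `just shell` runs the `nix` and `update` recipes in order before its own (empty) body. A typical session, assuming `just` is installed, might look like:

```shell
just update   # runs `poetry update`
just nix      # enters `nix-shell`
just shell    # runs the `nix` and `update` recipes, then finishes (empty body)
```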