PR-Codex init #4

Open · wants to merge 20 commits into codex-base

Conversation

@jjranalli (Contributor) commented Mar 16, 2023

This is PR-Codex in action


PR-Codex overview

This PR adds a new feature to the project: a GitHub app that explains and summarizes PR code diffs. It includes a new github/route.ts file and updates several existing files, including README.md, Homepage.tsx, DefaultHead.tsx, AppLayout.tsx, Footer.tsx, and Navbar.tsx.

Detailed summary

  • Adds new github/route.ts file
  • Updates README.md
  • Updates Homepage.tsx
  • Updates DefaultHead.tsx
  • Updates AppLayout.tsx
  • Updates Footer.tsx
  • Updates Navbar.tsx

The following files were skipped due to too many changes: lib/summarizePullRequest.ts

✨ Ask PR-Codex anything about this PR by commenting with /codex {your question}

@vercel (bot) commented Mar 16, 2023

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name      Status    Updated (UTC)
pr-codex  ✅ Ready   Apr 18, 2023 11:27am

@decentralizedlabs deleted comments from the pr-codex bot (Apr 16, 2023)
Review thread on README.md: outdated, resolved
Comment on lines 28 to 29
<stop stopColor="#fde047" />
<stop offset={1} stopColor="#FF80B5" />

@jjranalli (Contributor, Author) commented:

/ask-codex what colors are these stopColor values?

@pr-codex (bot) replied:

what colors are these stopColor values?

@jjranalli The stopColor values are #fde047 and #FF80B5.

@jjranalli (Contributor, Author) commented:

/ask-codex suggest hexcodes values for red and blue, to replace the current stopColor values

@pr-codex (bot) replied:

suggest hexcodes values for red and blue, to replace the current stopColor values

@jjranalli I cannot suggest specific hexcodes without knowing the desired color scheme. However, here are some common hexcodes for red and blue that you could use:

  • Red: #FF0000, #FF5733, #DC143C, #B22222
  • Blue: #0000FF, #1E90FF, #4169E1, #6495ED

Feel free to choose the ones that match your design preferences.
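
For illustration only, here is a minimal sketch of how one of the suggested pairs could replace the current stopColor values. Only the two stop elements come from this PR; the component name, gradient id, and surrounding SVG markup are assumptions.

import * as React from "react"

// Sketch: swaps the original yellow/pink stops for a suggested red/blue pair.
export const RedBlueGradient = () => (
  <svg width={0} height={0} aria-hidden="true">
    <defs>
      <linearGradient id="red-blue-gradient">
        <stop stopColor="#FF0000" /> {/* was #fde047 */}
        <stop offset={1} stopColor="#0000FF" /> {/* was #FF80B5 */}
      </linearGradient>
    </defs>
  </svg>
)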

@jjranalli (Contributor, Author) commented:

/codex how does the logic of generating replies from comments work?

@pr-codex (bot) commented Apr 17, 2023

how does the logic of generating replies from comments work?

@jjranalli When a comment is created on a pull request with the text /ask-codex, the replyIssueComment function is triggered. It first extracts the relevant information from the payload, such as the repository, issue, sender, and comment. Then it gets the diff content using the getCodeDiff function, which retrieves the diff of the pull request using the GitHub API.

If there are changes in the diff, generateChatGpt is called with a system prompt and the code diff as messages. generateChatGpt uses OpenAI's chat completions API (gpt-3.5-turbo) to generate a response to the question asked in the comment.

Finally, the response is formatted as a comment and posted back to the pull request as a reply to the original comment. In addition, if there are skipped files or if the code diff exceeds the maximum character limit, this information is appended to the response.
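
The flow described above maps roughly onto the sketch below. This is not the PR's actual github/route.ts: the names replyIssueComment, getCodeDiff, and generateChatGpt come from this PR, but the octokit client, authentication, webhook payload shape, command parsing, and system prompt are all assumptions.

import { Octokit } from "@octokit/rest"
import { generateChatGpt } from "./generateChatGpt" // assumed import path

// A real GitHub App would authenticate with an installation token.
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN })

// Roughly what getCodeDiff does: fetch the pull request diff via the GitHub API.
const getCodeDiff = async (owner: string, repo: string, pullNumber: number) => {
  const { data } = await octokit.rest.pulls.get({
    owner,
    repo,
    pull_number: pullNumber,
    mediaType: { format: "diff" }
  })
  return data as unknown as string
}

// Roughly what replyIssueComment does: extract the question, build the messages,
// generate an answer, and post it back on the pull request as a comment.
export const replyIssueComment = async (payload: any) => {
  const { repository, issue, comment } = payload
  const question = comment.body.replace(/^\/(ask-)?codex/, "").trim()

  const diff = await getCodeDiff(
    repository.owner.login,
    repository.name,
    issue.number
  )
  if (!diff) return

  const answer = await generateChatGpt([
    { role: "system", content: "You answer questions about a pull request diff." },
    { role: "user", content: `${question}\n\n${diff}` }
  ])

  await octokit.rest.issues.createComment({
    owner: repository.owner.login,
    repo: repository.name,
    issue_number: issue.number,
    body: `> ${question}\n\n@${comment.user.login} ${answer}`
  })
}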

Comment on lines +4 to +40
export const generateChatGpt = async (
  messages: ChatCompletionRequestMessage[]
) => {
  const DECODER = new TextDecoder()
  let text = ""

  try {
    const stream = await OpenAI(
      "chat",
      {
        model: "gpt-3.5-turbo",
        temperature: 0.7,
        messages
      },
      { apiKey: process.env.OPENAI_API_KEY }
    )

    for await (const chunk of yieldStream(stream)) {
      try {
        const decoded: string = DECODER.decode(chunk)

        if (decoded === undefined)
          throw new Error(
            "No choices in response. Decoded response: " +
              JSON.stringify(decoded)
          )

        text += decoded
      } catch (err) {
        console.error(err)
      }
    }
  } catch (err) {
    console.error(err)
  }

  return text

@jjranalli (Contributor, Author) commented:

/codex what does this do?

@pr-codex (bot) replied:

what does this do?

@jjranalli This is a TypeScript module that exports a named async function, generateChatGpt. It takes an array of ChatCompletionRequestMessage objects as input and returns a string. The function uses the OpenAI library to generate a chat response with the gpt-3.5-turbo model at a temperature of 0.7, then concatenates the chunks of data returned by the OpenAI stream into a single string and returns it.
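
As a usage sketch (the import path, prompts, and diff snippet below are made up for illustration; only generateChatGpt and its message-array signature come from this PR):

import { ChatCompletionRequestMessage } from "openai"
import { generateChatGpt } from "../lib/generateChatGpt" // hypothetical path

const main = async () => {
  const messages: ChatCompletionRequestMessage[] = [
    { role: "system", content: "Summarize pull request diffs concisely." },
    { role: "user", content: "diff --git a/README.md b/README.md ..." }
  ]

  // Streams the completion chunks and resolves to the concatenated text.
  const summary = await generateChatGpt(messages)
  console.log(summary)
}

main()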
