Val Town Code Search

API Access

You can access search results via the JSON API by adding `format=json` to your query:

https://codesearch.val.run/?q=openai&page=118&format=json
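For example, a minimal TypeScript sketch of fetching a page of results (the response schema is not documented here, so it is simply logged):

// Sketch: request search results as JSON and log them.
const res = await fetch("https://codesearch.val.run/?q=openai&page=1&format=json");
if (!res.ok) throw new Error(`Search request failed: ${res.status}`);
const results = await res.json();
console.log(results);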

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=openai

Returns an array of strings in the format "username" or "username/projectName".
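A quick sketch of calling it, assuming the endpoint returns the JSON array directly:

// Sketch: fetch typeahead suggestions and split them by the documented format.
const res = await fetch("https://codesearch.val.run/typeahead?q=openai");
const suggestions: string[] = await res.json();
const users = suggestions.filter((s) => !s.includes("/"));    // "username"
const projects = suggestions.filter((s) => s.includes("/"));  // "username/projectName"
console.log({ users, projects });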

Found 1609 results for "openai" (470ms)

VALLE • README.md • 1 match

@pomdtr • Updated 9 months ago

Fork it and authenticate with your Val Town API token as the password. Needs an `OPENAI_API_KEY` env var to be set.

WARNING: pollutes your homepage with lots of temporary vals!!

VALLE • main.tsx • 3 matches

@pomdtr • Updated 9 months ago

import { Hono } from "npm:hono@3";
import _ from "npm:lodash@4";
import OpenAI from "npm:openai";
import { renderToString } from "npm:react-dom/server";

  const contextWindow: any = await valleGetValsContextWindow(model);
  const openai = new OpenAI();
  const stream = await openai.chat.completions.create({
    model,
    stream: true,

VALLE • README.md • 1 match

@janpaul123 • Updated 9 months ago

* Fork this val to your own profile.
* Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in `tempValsParentFolderId`.
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API token as the password to log in.

valTownChatGPT2 • main.tsx • 3 matches

@janpaul123 • Updated 9 months ago

import ValTown from "npm:@valtown/sdk";
import { Hono } from "npm:hono@3";
import OpenAI from "npm:openai";
import { renderToString } from "npm:react-dom/server";

  );

  const openai = new OpenAI();
  const stream = await openai.chat.completions.create({
    model,
    stream: true,

ReadmeWriter • README.md • 3 matches

@willthereader • Updated 10 months ago

# Val Town AI Readme Writer

This val provides a class `ReadmeWriter` for generating readmes for vals with OpenAI. It can both draft readmes and update them directly.

PRs welcome! See **Todos** below for some ideas I have.

- `model` (optional): The model to be used for generating the readme. Defaults to "gpt-3.5-turbo".
- `apiKey` (optional): An OpenAI API key. Defaults to `Deno.env.get("OPENAI_API_KEY")`.

#### Methods

## Todos
- [ ] Additional options to pass to the OpenAI model
- [ ] Ability to pass more instructions to the prompt to modify how the readme is constructed
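A hypothetical usage sketch based only on the options listed above; the import path is assumed from the val's author and name, and no generation method is called because the Methods section is truncated in this excerpt:

// Sketch: construct the writer with the documented (optional) options.
import { ReadmeWriter } from "https://esm.town/v/willthereader/ReadmeWriter"; // assumed path

const writer = new ReadmeWriter({
  model: "gpt-3.5-turbo",                 // optional; documented default
  apiKey: Deno.env.get("OPENAI_API_KEY"), // optional; documented default source
});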

ReadmeWriter • main.tsx • 9 matches

@willthereader • Updated 10 months ago

import { type WriterOptions } from "https://esm.town/v/nbbaier/WriterOptions";
import { fetch } from "https://esm.town/v/std/fetch?v=4";
import OpenAI, { type ClientOptions } from "npm:openai";

export class ReadmeWriter {
  model: string;
  openai: OpenAI;
  apiKey: string;
  valtownKey: string;

  constructor(options: WriterOptions) {
    const { model, ...openaiOptions } = options;
    this.model = model ? model : "gpt-3.5-turbo";
    this.openai = new OpenAI(openaiOptions);
    this.valtownKey = Deno.env.get("valtown");
  }

  }

  private async performOpenAICall(prompt: string) {
    try {
      const response = await this.openai.chat.completions.create({
        messages: [{ role: "system", content: prompt }],
        model: this.model,

      if (!response.choices || response.choices.length === 0) {
        throw new Error("No response from OpenAI");
      }

      if (!readme) {
        throw new Error("No readme returned by OpenAI. Try again.");
      }

    const { id, code } = await this.getVal(username, valName);
    const prompt = this.createPrompt(code, userPrompt);
    const readme = await this.performOpenAICall(prompt);
    return { id, readme };
  }

getValsContextWindow • main.tsx • 4 matches

@janpaul123 • Updated 10 months ago

  },
  {
    prompt: "Write a val that uses OpenAI",
    code: `import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  "messages": [
    { "role": "user", "content": "Say hello in a creative way" },

valwriter • main.tsx • 5 matches

@janpaul123 • Updated 10 months ago

import { basicAuth } from "https://esm.town/v/pomdtr/basicAuth?v=62";
import { fetchText } from "https://esm.town/v/stevekrouse/fetchText";
import { chat } from "https://esm.town/v/stevekrouse/openai";
import cronstrue from "npm:cronstrue";
import { Hono } from "npm:hono@3";

await email({ subject: "Subject line", text: "Body of message" });

// OpenAI
import { OpenAI } from "https://esm.town/v/std/openai";
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

easyAQI • README.md • 1 match

@rishabhparikh • Updated 10 months ago

6. Uses EPA's ranking to classify the severity of the score (i.e. "Unhealthy for Sensitive Groups")

It uses blob storage to cache the OpenAI location id for your location string, to skip a couple of steps the next time.

## Example usage
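The blob-storage caching mentioned above is not part of this excerpt; a hypothetical sketch of that pattern with Val Town's std/blob helpers (the key scheme and lookup helper are assumptions, not this val's code) could look like:

import { blob } from "https://esm.town/v/std/blob";

// Hypothetical stand-in for the val's real lookup step (e.g. an OpenAI call).
async function lookUpLocationId(location: string): Promise<string> {
  return `id-for-${location}`;
}

// Cache pattern: return the stored id if present, otherwise look it up once
// and save it for next time.
export async function getLocationId(location: string): Promise<string> {
  const key = `location-id-${location}`; // assumed key scheme
  const cached = await blob.getJSON(key);
  if (typeof cached === "string") return cached;

  const id = await lookUpLocationId(location);
  await blob.setJSON(key, id);
  return id;
}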

openAiProxy • main.tsx • 4 matches

@ashryanio • Updated 10 months ago

import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();

/**

/**
 * Gets the response from the OpenAI language model.
 * @param {string} prompt - The prompt for the language model.
 * @returns {Promise<string>} - The response from the language model.
 */
async function getLlmResponse(prompt: string) {
  const completion = await openai.chat.completions.create({
    "messages": [
      { "role": "user", "content": prompt },

translateToEnglishWithOpenAI • 1 file match

@shlmt • Updated 2 days ago

testOpenAI • 1 file match

@stevekrouse • Updated 4 days ago
lost1991
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function(req: Request): Promise<Response> {
  if (req.method === "OPTIONS") {
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": "*",