Val Town Code Search

API Access

You can access search results via the JSON API by adding format=json to your query:

https://codesearch.val.run/?q=openai&page=111&format=json

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=openai

Returns an array of strings in the format "username" or "username/projectName".
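
For example, a small Deno/TypeScript sketch that queries both endpoints. The typeahead response is an array of strings as described above; the exact shape of the full search JSON is not documented on this page, so it is simply logged, and the search base path is taken from the example URL above:

```ts
// Sketch: call the code search JSON API and the typeahead endpoint.
const query = "openai";

// Full search results as JSON (base path assumed from the example above)
const searchRes = await fetch(
  `https://codesearch.val.run/?q=${encodeURIComponent(query)}&format=json`,
);
console.log(await searchRes.json());

// Typeahead suggestions: array of "username" or "username/projectName" strings
const typeaheadRes = await fetch(
  `https://codesearch.val.run/typeahead?q=${encodeURIComponent(query)}`,
);
const suggestions: string[] = await typeaheadRes.json();
console.log(suggestions);
```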

Found 1594 results for "openai" (1742ms)

VALLErun • main.tsx • 2 matches

@janpaul123 • Updated 9 months ago
9 import { sleep } from "https://esm.town/v/stevekrouse/sleep?v=1";
10 import { anthropic } from "npm:@ai-sdk/anthropic";
11 import { openai } from "npm:@ai-sdk/openai";
12 import ValTown from "npm:@valtown/sdk";
13 import { StreamingTextResponse, streamText } from "npm:ai";
1104 let vercelModel;
1105 if (model.includes("gpt")) {
1106 vercelModel = openai(model);
1107 } else {
1108 vercelModel = anthropic(model);

valleGetValsContextWindow • main.tsx • 4 matches

@janpaul123 • Updated 9 months ago
174 },
175 {
176 prompt: "Write a val that uses OpenAI",
177 code: `import { OpenAI } from "https://esm.town/v/std/openai";
178
179 const openai = new OpenAI();
180 const completion = await openai.chat.completions.create({
181 "messages": [
182 { "role": "user", "content": "Say hello in a creative way" },

bedtimeStoryMaker • README.md • 2 matches

@dthyresson • Updated 9 months ago
13 * and activity (befriends aliens, goes to the doctor, rides a rollercoaster, bakes a cake for friends)
14
15 It uses OpenAI to write a children's bedtime story
16
17 * title
21 for a "fantastical story about a green whale who rides the bus" or the "spooky story about the tomato fox who explores a cave".
22
23 Then, using the summary, OpenAI generates another prompt to describe the instructions to generate a children's story book image.
24
25 That's sent to Fal to generate an image.
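
The excerpt above outlines a two-step text pipeline before the image call. As a rough, hypothetical sketch (not dthyresson's actual code), that flow could look like the following using Val Town's std/openai client, which other results on this page also import; the prompts and model name are illustrative assumptions, and the Fal call is omitted:

```ts
// Hypothetical sketch of the flow described in the README above.
// Assumptions: Val Town's std/openai client (seen in other results on this
// page), an illustrative model name, and made-up prompts; the Fal image
// call is left as a comment.
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();

// 1. OpenAI writes the children's bedtime story from the chosen title,
//    adjective, animal, color, and activity.
const story = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{
    role: "user",
    content:
      "Write a children's bedtime story: a fantastical story about a green whale who rides the bus.",
  }],
});
const storyText = story.choices[0].message.content ?? "";

// 2. Using the story summary, OpenAI generates instructions for a
//    children's story book illustration.
const imagePrompt = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{
    role: "user",
    content: `Describe a children's story book illustration for this story:\n\n${storyText}`,
  }],
});

// 3. The resulting prompt would then be sent to Fal to generate the image
//    (Fal client call omitted in this sketch).
console.log(imagePrompt.choices[0].message.content);
```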

askAI • main.tsx • 5 matches

@DFB • Updated 9 months ago
1 import { OpenAI } from "https://deno.land/x/openai@v4.54.0/mod.ts";
2
3 const apiKey = Deno.env.get("OPENAI_API_KEY");
4 const openai = new OpenAI({ apiKey });
5
6 export async function askAI(msg: string) {
9 model: "gpt-4o-mini",
10 max_tokens: 3000,
11 } satisfies OpenAI.ChatCompletionCreateParamsNonStreaming;
12
13 const chat = await openai.chat.completions.create(cfg);
14
15 return chat.choices?.[0].message.content;

movieMashup • README.md • 1 match

@dthyresson • Updated 9 months ago
3 It's Blade Runner meets Pretty in Pink.
4
5 OpenAI generated the movie mashup title, tagline, and treatments.
6
7 Fal generated the movie posters.

VALLE • README.md • 1 match

@oijoijcoiejoijce • Updated 9 months ago
6 * Fork this val to your own profile.
7 * Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in `tempValsParentFolderId`.
8 * If you want to use OpenAI models, you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-variables).
9 * If you want to use Anthropic models, you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environment-variables).
10 * Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API token as the password to log in.

VALLErun • main.tsx • 2 matches

@roadlabs • Updated 9 months ago
8 import { sleep } from "https://esm.town/v/stevekrouse/sleep?v=1";
9 import { anthropic } from "npm:@ai-sdk/anthropic";
10 import { openai } from "npm:@ai-sdk/openai";
11 import ValTown from "npm:@valtown/sdk";
12 import { StreamingTextResponse, streamText } from "npm:ai";
359 let vercelModel;
360 if (model.includes("gpt")) {
361 vercelModel = openai(model);
362 } else {
363 vercelModel = anthropic(model);

azureCheetah • main.tsx • 2 matches

@tmcw • Updated 9 months ago
9 import { sleep } from "https://esm.town/v/stevekrouse/sleep?v=1";
10 import { anthropic } from "npm:@ai-sdk/anthropic";
11 import { openai } from "npm:@ai-sdk/openai";
12 import ValTown from "npm:@valtown/sdk";
13 import { StreamingTextResponse, streamText } from "npm:ai";
354 let vercelModel;
355 if (model.startsWith("gpt")) {
356 vercelModel = openai(model);
357 } else {
358 vercelModel = anthropic(model);

VALLErun • main.tsx • 2 matches

@tmcw • Updated 9 months ago
9 import { sleep } from "https://esm.town/v/stevekrouse/sleep?v=1";
10 import { anthropic } from "npm:@ai-sdk/anthropic";
11 import { openai } from "npm:@ai-sdk/openai";
12 import ValTown from "npm:@valtown/sdk";
13 import { StreamingTextResponse, streamText } from "npm:ai";
354 let vercelModel;
355 if (model.startsWith("gpt")) {
356 vercelModel = openai(model);
357 } else {
358 vercelModel = anthropic(model);

VALLE • README.md • 1 match

@tmcw • Updated 9 months ago
6 * Fork this val to your own profile.
7 * Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in `tempValsParentFolderId`.
8 * If you want to use OpenAI models, you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-variables).
9 * If you want to use Anthropic models, you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environment-variables).
10 * Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API token as the password to log in.

translateToEnglishWithOpenAI • 1 file match

@shlmt • Updated 14 hours ago

testOpenAI • 1 file match

@stevekrouse • Updated 2 days ago
lost1991
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function(req: Request): Promise<Response> {
  if (req.method === "OPTIONS") {
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": "*",