Val Town Code Search

API Access

You can access search results via the JSON API by adding format=json to your query:

https://codesearch.val.run/?q=openai&page=112&format=json
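For example, a minimal sketch of fetching one page of results from TypeScript; the response shape is not documented on this page, so the payload is just logged for inspection:

// Fetch one page of search results for "openai" as JSON.
// The page number here is only an example; inspect the payload before
// relying on specific fields, since its shape is not documented here.
const res = await fetch("https://codesearch.val.run/?q=openai&page=1&format=json");
if (!res.ok) throw new Error(`search request failed: ${res.status}`);
const data = await res.json();
console.log(data);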

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=openai

Returns an array of strings in the format "username" or "username/projectName".
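A similar sketch for typeahead, assuming the endpoint returns the plain JSON array of strings described above:

// Fetch typeahead suggestions for a query prefix.
const res = await fetch("https://codesearch.val.run/typeahead?q=openai");
const suggestions: string[] = await res.json();
// Each entry is either "username" or "username/projectName".
for (const s of suggestions) console.log(s);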

Found 1612 results for "openai" (716ms)

import { OpenAI } from "https://esm.town/v/std/openai";
import { OpenAIToolSet } from "npm:composio-core";

const COMPOSIO_API_KEY = Deno.env.get("COMPOSIO_API_KEY"); // Getting the API key from the environment
const toolset = new OpenAIToolSet({ apiKey: COMPOSIO_API_KEY });

// Creating an authentication function for the user
// ...
  const instruction = "Star a repo ComposioHQ/composio on GitHub";

  const client = new OpenAI();
  const response = await client.chat.completions.create({
    model: "gpt-4-turbo",
# Using OpenAI Assistant API, Composio to Star a GitHub Repo
This is example code that uses Composio to star a GitHub repository with an AI agent built on the OpenAI API.

## Goal
Enable OpenAI assistants to perform tasks like starring a repository on GitHub via natural language commands.

## Tools

easyAQI README.md (1 match)

@bcongdon • Updated 9 months ago
6. Uses EPA's ranking to classify the severity of the score (i.e. "Unhealthy for Sensitive Groups")

It uses blob storage to cache the openai location id for your location string, to skip a couple of steps the next time (a rough sketch of this pattern follows the excerpt).

## Example usage
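The caching step that README mentions could look roughly like this, assuming Val Town's std/blob helpers; the cache key and the injected lookup function are hypothetical, not taken from the val:

import { blob } from "https://esm.town/v/std/blob";

// Cache a looked-up location id under the raw location string so that
// repeat requests for the same location can skip the lookup step.
async function getCachedLocationId(
  location: string,
  lookup: (loc: string) => Promise<string>, // hypothetical lookup, e.g. a geocoding call
): Promise<string> {
  const key = `locationId:${location}`; // hypothetical cache key
  const cached = await blob.getJSON(key);
  if (typeof cached === "string") return cached;
  const id = await lookup(location);
  await blob.setJSON(key, id);
  return id;
}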

tenseRoseTiglon main.tsx (3 matches)

@MichaelNollox • Updated 9 months ago
import { extractValInfo } from "https://esm.town/v/pomdtr/extractValInfo";

import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();

const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

tenseRoseTiglon README.md (1 match)

@MichaelNollox • Updated 9 months ago
* Fork this val to your own profile.
* Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in `tempValsParentFolderId`.
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API token as the password to log in.

VALLE README.md (1 match)

@MichaelNollox • Updated 9 months ago
* Fork this val to your own profile.
* Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in `tempValsParentFolderId`.
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API token as the password to log in.

motionlessPurpleBat main.tsx (3 matches)

@carts • Updated 9 months ago
import { OpenAI } from "https://esm.town/v/std/openai?v=4";

const prompt = "Tell me a dad joke. Format the response as JSON with 'setup' and 'punchline' keys.";
// ...
export default async function dailyDadJoke(req: Request): Response {
  const openai = new OpenAI();

  const resp = await openai.chat.completions.create({
    messages: [
      { role: "user", content: prompt },

valleBlogV0 main.tsx (3 matches)

@janpaul123 • Updated 9 months ago
import { passwordAuth } from "https://esm.town/v/pomdtr/password_auth?v=84";
import { verifyToken } from "https://esm.town/v/pomdtr/verifyToken?v=1";
import { openai } from "npm:@ai-sdk/openai";
import ValTown from "npm:@valtown/sdk";
import { streamText } from "npm:ai";
// ...
  const stream = await streamText({
    model: openai("gpt-4o", {
      baseURL: "https://std-openaiproxy.web.val.run/v1",
      apiKey: Deno.env.get("valtown"),
    } as any),

VALLErun main.tsx (2 matches)

@janpaul123 • Updated 9 months ago
import { sleep } from "https://esm.town/v/stevekrouse/sleep?v=1";
import { anthropic } from "npm:@ai-sdk/anthropic";
import { openai } from "npm:@ai-sdk/openai";
import ValTown from "npm:@valtown/sdk";
import { StreamingTextResponse, streamText } from "npm:ai";
// ...
  let vercelModel;
  if (model.includes("gpt")) {
    vercelModel = openai(model);
  } else {
    vercelModel = anthropic(model);

valleGetValsContextWindow main.tsx (4 matches)

@janpaul123 • Updated 9 months ago
  },
  {
    prompt: "Write a val that uses OpenAI",
    code: `import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  "messages": [
    { "role": "user", "content": "Say hello in a creative way" },

translateToEnglishWithOpenAI (1 file match)

@shlmt • Updated 3 days ago

testOpenAI (1 file match)

@stevekrouse • Updated 5 days ago
lost1991
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function(req: Request): Promise<Response> {
  if (req.method === "OPTIONS") {
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": "*",