Val Town Code Search

API Access

You can access search results via the JSON API by adding format=json to your query:

https://codesearch.val.run/...?q=openai&page=177&format=json

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=openai

Returns an array of strings in the format "username" or "username/projectName".
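
For example, a quick sketch of calling both endpoints from TypeScript. The path of the search endpoint and the shape of its JSON response are assumptions (they aren't documented here); only the typeahead response shape is specified above.

// Sketch only: assumes the search endpoint is served at the site root and
// leaves its response typed as `unknown`, since the schema isn't documented above.
const q = encodeURIComponent("openai");

const searchRes = await fetch(`https://codesearch.val.run/?q=${q}&page=1&format=json`);
const searchJson: unknown = await searchRes.json();
console.log(searchJson);

// Typeahead: documented above as an array of "username" or "username/projectName" strings.
const typeaheadRes = await fetch(`https://codesearch.val.run/typeahead?q=${q}`);
const suggestions: string[] = await typeaheadRes.json();
console.log(suggestions);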

Found 2274 results for "openai" (2916ms)

MuxAITranscript • README.md • 1 match

@decepulis•Updated 10 months ago
- Mux Access token details (`MUX_TOKEN_ID`, `MUX_TOKEN_SECRET`). This endpoint requires an existing Mux asset that's ready with an audio-only static rendition associated with it. You can run [this val](https://www.val.town/v/mux/createDubbingTestAsset) to create a new one for testing.
- AssemblyAI API key (`ASSEMBLYAI_API_KEY`). Get it [from their dashboard here](https://www.assemblyai.com/app/account)
- OpenAI API key (`OPENAI_API_KEY`). Get it [from their dashboard here](https://platform.openai.com/api-keys)

Make a POST request to the Val's endpoint with the following body, replacing the values with your own asset ID and the list of speakers. Speakers are listed in order of appearance.
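
The snippet cuts off before the body itself, so the following is only a hypothetical sketch: the endpoint URL and the assetId/speakers field names are assumptions, not the val's documented schema.

// Hypothetical sketch: the URL and body field names (assetId, speakers) are assumptions.
const res = await fetch("https://<your-username>-muxaitranscript.web.val.run", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    assetId: "YOUR_MUX_ASSET_ID",         // the ready Mux asset mentioned above
    speakers: ["Speaker 1", "Speaker 2"],  // listed in order of appearance
  }),
});
console.log(await res.json());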
import { OpenAI } from "https://esm.town/v/std/openai";
import { OpenAIToolSet } from "npm:composio-core";

const COMPOSIO_API_KEY = Deno.env.get("COMPOSIO_API_KEY"); // Getting the API key from the environment
const toolset = new OpenAIToolSet({ apiKey: COMPOSIO_API_KEY });

// Creating an authentication function for the user

  const instruction = "Star a repo ComposioHQ/composio on GitHub";

  const client = new OpenAI();
  const response = await client.chat.completions.create({
    model: "gpt-4-turbo",
# Using OpenAI Assistant API, Composio to Star a GitHub Repo
This is example code that uses Composio to star a GitHub repository by creating an AI agent with the OpenAI API.

## Goal
Enable OpenAI assistants to perform tasks like starring a repository on GitHub via natural language commands.

## Tools

easyAQI • README.md • 1 match

@bcongdon•Updated 10 months ago
6. Uses EPA's ranking to classify the severity of the score (i.e. "Unhealthy for Sensitive Groups")

It uses blob storage to cache the openai location id for your location string to skip a couple of steps the next time (a sketch of this caching pattern follows below).

## Example usage
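
A minimal sketch of that caching pattern, assuming Val Town's std/blob helpers; the key scheme and the lookupLocationId helper are hypothetical, since the snippet doesn't show how the val actually resolves the ID.

import { blob } from "https://esm.town/v/std/blob";

// Hypothetical stand-in for however the val resolves a location string to an ID.
async function lookupLocationId(location: string): Promise<string> {
  // ...call the upstream API here...
  return "12345";
}

// Check the blob cache first; only do the lookup (and cache the result) on a miss.
async function getCachedLocationId(location: string): Promise<string> {
  const key = `locationId:${location}`; // hypothetical key scheme
  const cached = await blob.getJSON(key);
  if (typeof cached === "string") return cached;
  const id = await lookupLocationId(location);
  await blob.setJSON(key, id);
  return id;
}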

tenseRoseTiglon • main.tsx • 3 matches

@MichaelNollox•Updated 10 months ago
import { extractValInfo } from "https://esm.town/v/pomdtr/extractValInfo";

import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();

const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

tenseRoseTiglon • README.md • 1 match

@MichaelNollox•Updated 10 months ago
* Fork this val to your own profile.
* Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in `tempValsParentFolderId`.
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API token as the password to log in.

VALLE • README.md • 1 match

@MichaelNollox•Updated 10 months ago
* Fork this val to your own profile.
* Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in `tempValsParentFolderId`.
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environment-variables).
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API token as the password to log in.

motionlessPurpleBat • main.tsx • 3 matches

@carts•Updated 10 months ago
import { OpenAI } from "https://esm.town/v/std/openai?v=4";

const prompt = "Tell me a dad joke. Format the response as JSON with 'setup' and 'punchline' keys.";

export default async function dailyDadJoke(req: Request): Promise<Response> {
  const openai = new OpenAI();

  const resp = await openai.chat.completions.create({
    messages: [
      { role: "user", content: prompt },

valleBlogV0 • main.tsx • 3 matches

@janpaul123•Updated 10 months ago
import { passwordAuth } from "https://esm.town/v/pomdtr/password_auth?v=84";
import { verifyToken } from "https://esm.town/v/pomdtr/verifyToken?v=1";
import { openai } from "npm:@ai-sdk/openai";
import ValTown from "npm:@valtown/sdk";
import { streamText } from "npm:ai";

  const stream = await streamText({
    model: openai("gpt-4o", {
      baseURL: "https://std-openaiproxy.web.val.run/v1",
      apiKey: Deno.env.get("valtown"),
    } as any),

VALLErun • main.tsx • 2 matches

@janpaul123•Updated 10 months ago
import { sleep } from "https://esm.town/v/stevekrouse/sleep?v=1";
import { anthropic } from "npm:@ai-sdk/anthropic";
import { openai } from "npm:@ai-sdk/openai";
import ValTown from "npm:@valtown/sdk";
import { StreamingTextResponse, streamText } from "npm:ai";

  let vercelModel;
  if (model.includes("gpt")) {
    vercelModel = openai(model);
  } else {
    vercelModel = anthropic(model);

openai-client • 4 file matches

@cricks_unmixed4u•Updated 10 hours ago

openai_enrichment • 6 file matches

@stevekrouse•Updated 2 weeks ago
kwhinnery_openai
reconsumeralization
import { OpenAI } from "https://esm.town/v/std/openai";
import { sqlite } from "https://esm.town/v/stevekrouse/sqlite";

/**
 * Practical Implementation of Collective Content Intelligence
 * Bridging advanced AI with collaborative content creation
 */
exp