Val Town Code Search

API Access

You can access search results via the JSON API by adding format=json to your query:

https://codesearch.val.run/$1?q=openai&page=154&format=json

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=openai

The endpoint returns an array of strings, each in the format "username" or "username/projectName".
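As a sketch, both endpoints can be called from TypeScript like this. The helper names are illustrative, and the search path is assumed to be the site root (the `$1` in the example URL above looks like a template placeholder); only the string-array shape of the typeahead response is documented.

```typescript
// Build a search URL with JSON output enabled.
function searchUrl(query: string, page = 1): string {
  const u = new URL("https://codesearch.val.run/");
  u.searchParams.set("q", query);
  u.searchParams.set("page", String(page));
  u.searchParams.set("format", "json");
  return u.toString();
}

// Typeahead returns an array of "username" or "username/projectName" strings.
async function typeahead(query: string): Promise<string[]> {
  const res = await fetch(
    `https://codesearch.val.run/typeahead?q=${encodeURIComponent(query)}`,
  );
  return await res.json();
}
```

The search response schema is not documented above, so inspect the JSON before relying on specific fields.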

Found 2147 results for "openai" (727ms)

handsomeMagentaStoat/README.md (13 matches)

@wangqiao1234 · Updated 7 months ago
1# OpenAI - [Docs ↗](https://docs.val.town/std/openai)
2
3Use OpenAI's chat completion API with [`std/openai`](https://www.val.town/v/std/openai). This integration enables access to OpenAI's language models without needing to acquire API keys.
4
5For free Val Town users, [all calls are sent to `gpt-4o-mini`](https://www.val.town/v/std/openaiproxy?v=12#L85).
6
7## Basic Usage
8
9```ts title="Example" val
10import { OpenAI } from "https://esm.town/v/std/openai";
11
12const openai = new OpenAI();
13
14const completion = await openai.chat.completions.create({
15 messages: [
16 { role: "user", content: "Say hello in a creative way" },
58## Limits
59
60While our wrapper simplifies the integration of OpenAI, there are a few limitations to keep in mind:
61
62* **Usage Quota**: We limit each user to 10 requests per minute.
65If these limits are too low, let us know! You can also get around the limitation by using your own keys:
66
671. Create your own API key on [OpenAI's website](https://platform.openai.com/api-keys)
682. Create an [environment variable](https://www.val.town/settings/environment-variables?adding=true) named `OPENAI_API_KEY`
693. Use the `OpenAI` client from `npm:openai`:
70
71```ts title="Example" val
72import { OpenAI } from "npm:openai";
73
74const openai = new OpenAI();
75```
76
77
78[📝 Edit docs](https://github.com/val-town/val-town-docs/edit/main/src/content/docs/std/openai.mdx)

handsomeMagentaStoat/main.tsx (10 matches)

@wangqiao1234 · Updated 7 months ago
1import { type ClientOptions, OpenAI as RawOpenAI } from "npm:openai";
2
3/**
4 * API Client for interfacing with the OpenAI API. Uses Val Town credentials.
5 */
6export class OpenAI {
7 private rawOpenAIClient: RawOpenAI;
8
9 /**
10 * API Client for interfacing with the OpenAI API. Uses Val Town credentials.
11 *
12 * @param {number} [opts.timeout=10 minutes] - The maximum amount of time (in milliseconds) the client will wait for a response before timing out.
19 */
20 constructor(options: Omit<ClientOptions, "baseURL" | "apiKey" | "organization"> = {}) {
21 this.rawOpenAIClient = new RawOpenAI({
22 ...options,
23 baseURL: "https://std-openaiproxy.web.val.run/v1",
24 apiKey: Deno.env.get("valtown"),
25 organization: null,
28
29 get chat() {
30 return this.rawOpenAIClient.chat;
31 }
32
33 readonly beta = {
34 get chat(): RawOpenAI["beta"]["chat"] {
35 return this.rawOpenAIClient.beta.chat;
36 }
37 }

ai/main.tsx (17 matches)

@yawnxyz · Updated 7 months ago
2import { Hono } from "npm:hono@3";
3import { cors } from "npm:hono/cors";
4import { createOpenAI } from "npm:@ai-sdk/openai";
5import { createAnthropic } from "npm:@ai-sdk/anthropic@0.0.48";
6import { google, createGoogleGenerativeAI } from 'npm:@ai-sdk/google';
30});
31
32const openai = createOpenAI({
33 // apiKey = Deno.env.get("OPENAI_API_KEY");
34 apiKey: Deno.env.get("OPENAI_API_KEY_COVERSHEET")
35});
36
37
38const groq = createOpenAI({
39 baseURL: 'https://api.groq.com/openai/v1',
40 apiKey: Deno.env.get("GROQ_API_KEY"),
41});
42
43const perplexity = createOpenAI({
44 apiKey: Deno.env.get("PERPLEXITY_API_KEY") ?? '',
45 baseURL: 'https://api.perplexity.ai/',
57 this.memories = options.memories || [];
58 this.messages = options.messages || [];
59 this.defaultProvider = options.provider || 'openai';
60 this.defaultModel = options.model;
61 this.defaultMaxTokens = options.maxTokens;
122 let result;
123 switch (provider) {
124 case 'openai':
125 result = await this.generateOpenAIResponse({ model, prompt, maxTokens, temperature, streaming, schema, system, messages, tools, ...additionalSettings });
126 break;
127 case 'anthropic':
171 }
172
173 async generateOpenAIResponse({ model, prompt, maxTokens, temperature, streaming, schema, system, messages, tools, embed, value, dimensions, user, ...additionalSettings }) {
174 const modelId = model || 'gpt-3.5-turbo';
175
176 if (embed) {
177 let result = await this.generateOpenAIEmbedding({ model, value, dimensions, user });
178 // console.log('embed!', result)
179 return result
181
182 const options = {
183 model: openai(modelId),
184 system,
185 temperature,
235 }
236
237 async generateOpenAIEmbedding({ model, value, dimensions, user }) {
238 const modelId = model || 'text-embedding-3-large';
239 const options = {
240 model: openai.embedding(modelId, {
241 dimensions,
242 user,
491
492app.get('/generate', async (c) => {
493 const provider = c.req.query('provider') || 'openai';
494 const model = c.req.query('model');
495 const prompt = c.req.query('prompt');
523 console.log("post/generate", { mode: 'post/generate', prompt, provider, model });
524 const response = await modelProvider.gen({
525 provider: provider || 'openai',
526 model,
527 prompt,

VALLE/README.md (1 match)

@kate · Updated 7 months ago
6* Fork this val to your own profile.
7* Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in `tempValsParentFolderId`.
8* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-variables).
9* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environment-variables).
10* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API token as the password to log in.

GDI_AITranslator/README.md (1 match)

@rozek_at_hft · Updated 7 months ago
8It contains a simple web page which allows users to enter some german (or other
9non-english) text and send it to a preconfigured server. That server translates
10the text with the help of OpenAI and sends the result back to this app where it
11is finally presented to the user.
12
80
81 const API_CONFIG = {
82 url: "https://willthereader-openaidefiner.web.val.run",
83 method: "POST",
84 mode: "cors",

healthtech4africa/main.tsx (3 matches)

@thomaskangah · Updated 8 months ago
159
160export default async function server(request: Request): Promise<Response> {
161 const { OpenAI } = await import("https://esm.town/v/std/openai");
162 const { sqlite } = await import("https://esm.town/v/stevekrouse/sqlite");
163 const openai = new OpenAI();
164
165 const SCHEMA_VERSION = 2;
230 }
231
232 const completion = await openai.chat.completions.create({
233 messages: [
234 { role: "system", content: systemMessage },

email_channel/README.md (1 match)

@campsite · Updated 8 months ago
3This val creates an email address that posts forwarded emails to a [Campsite](https://campsite.com) channel.
4
5It uses GPT-4 to extract a readable version of the forwarded email from the raw body. If you don't want to use GPT-4, omit the `OPENAI_API_KEY` and the raw body will be included in the post. Other providers are available via [Vercel's AI SDK](https://sdk.vercel.ai/docs/introduction#model-providers).
6
7For help with creating integrations, check out the [Campsite API docs](https://app.campsite.com/campsite/p/notes/campsite-api-docs-l07d7gm5n5rm). You'll need to create an integration and get an API key.

weatherGPT/main.tsx (3 matches)

@kevinforrestconnors · Updated 8 months ago
1import { email } from "https://esm.town/v/std/email?v=11";
2import { OpenAI } from "npm:openai";
3
4let location = "brooklyn ny";
8).then(r => r.json());
9
10const openai = new OpenAI();
11let chatCompletion = await openai.chat.completions.create({
12 messages: [{
13 role: "user",

GDI_AIChatCompletionService/main.tsx (3 matches)

@rozek · Updated 8 months ago
1import { OpenAI } from "https://esm.town/v/std/openai";
2
3export default async function (req: Request): Promise<Response> {
19 }
20
21 const openai = new OpenAI();
22 const completion = await openai.chat.completions.create({
23 model: "gpt-4o-mini",
24 messages: [

openai-client (1 file match)

@cricks_unmixed4u · Updated 3 days ago

openai_enrichment (6 file matches)

@stevekrouse · Updated 4 days ago
reconsumeralization
import { OpenAI } from "https://esm.town/v/std/openai";
import { sqlite } from "https://esm.town/v/stevekrouse/sqlite";

/**
 * Practical Implementation of Collective Content Intelligence
 * Bridging advanced AI with collaborative content creation
 */
exp
kwhinnery_openai