Val Town Code Search

API Access

You can access search results via the JSON API by adding format=json to your query:

https://codesearch.val.run/?q=openai&page=119&format=json
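
For example, a page of results can be fetched from TypeScript like this (a minimal sketch; the shape of the JSON payload is not documented here, so it is just logged as-is):

```ts
// Minimal sketch: request a page of search results as JSON.
// The response schema is not documented in this excerpt, so the payload is logged unparsed.
const res = await fetch("https://codesearch.val.run/?q=openai&page=1&format=json");
if (!res.ok) throw new Error(`Search request failed: ${res.status}`);
const payload = await res.json();
console.log(payload);
```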

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=openai

Returns an array of strings in the format "username" or "username/projectName".
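
A minimal sketch of calling the typeahead endpoint (the string-array return type is taken from the description above):

```ts
// Fetch typeahead suggestions; the endpoint returns an array of strings
// such as "username" or "username/projectName".
const res = await fetch("https://codesearch.val.run/typeahead?q=openai");
const suggestions: string[] = await res.json();
console.log(suggestions);
```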

Found 1610 results for "openai" (532ms)

openAiProxy main.tsx (4 matches)

@ashryanio · Updated 10 months ago
1  import { OpenAI } from "https://esm.town/v/std/openai";
2
3  const openai = new OpenAI();
4
5  /**
58
59 /**
60  * Gets the response from the OpenAI language model.
61  * @param {string} prompt - The prompt for the language model.
62  * @returns {Promise<string>} - The response from the language model.
63  */
64 async function getLlmResponse(prompt: string) {
65   const completion = await openai.chat.completions.create({
66     "messages": [
67       { "role": "user", "content": prompt },

openAiProxy README.md (5 matches)

@ashryanio · Updated 10 months ago
1  # openAiProxy
2
3  ## Overview
4
5  This val is a proxy server that interacts with the OpenAI API to generate responses based on prompts in the request body. The function handles incoming HTTP POST requests, processes the prompt, and returns a response generated by the LLM.
6
7  ## Prerequisites
8
9  - Server-side: (Optional) An active OpenAI API key
10 - Client-side: Something that can make POST requests (browser code, Postman, cURL, another Val, etc)
11
25
26 ```sh
27 curl -X POST https://ashryanio-openaiproxy.web.val.run -H "Content-Type: application/json" -d '{"prompt": "Hello, OpenAI!"}'
28 ```
29
31
32 - **Content-Type**: application/json
33 - **Body**: JSON object containing the response from the OpenAI language model.
34
35 #### Example Response
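
For reference, a TypeScript client call equivalent to the curl example above might look like this (a minimal sketch; the response body is whatever JSON the val returns):

```ts
// Sketch: call the openAiProxy val with a JSON body containing the prompt,
// mirroring the curl example above.
const res = await fetch("https://ashryanio-openaiproxy.web.val.run", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ prompt: "Hello, OpenAI!" }),
});
const data = await res.json();
console.log(data);
```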

valTownChatGPT README.md (1 match)

@mttlws · Updated 10 months ago
1 # ChatGPT Implemented in Val Town
2
3 Demonstrates how to use assistants and threads with the OpenAI SDK and how to stream the response with Server-Sent Events.
4
5 <p align=center>

valTownChatGPT main.tsx (8 matches)

@mttlws · Updated 10 months ago
1   /** @jsxImportSource https://esm.sh/react */
2   import OpenAI from "npm:openai";
3   import { renderToString } from "npm:react-dom/server";
4
5   // This uses my personal API key; you'll need to provide your own if
6   // you fork this. We'll be adding support to the std/openai lib soon!
7   const openai = new OpenAI();
8   import { Hono } from "npm:hono@3";
9
38  });
39
40  // Set up the SSE connection and stream back the response. OpenAI handles determining
41  // which message is the correct response based on what was last read from the
42  // thread. This is likely vulnerable to race conditions.
58  const app = new Hono();
59  app.get("/", async (c) => {
60    const thread = await openai.beta.threads.create();
61    const assistant = await openai.beta.assistants.create({
62      name: "",
63      instructions:
114 app.post("/post-message", async (c) => {
115   let message = await c.req.text();
116   await openai.beta.threads.messages.create(
117     c.req.query("threadId"),
118     { role: "user", content: message },
132   ));
133 };
134 const run = openai.beta.threads.runs.stream(threadId, {
135   assistant_id: assistantId,
136   // Make sure we only display messages we haven't seen yet.
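
Reduced to a hedged sketch, the streaming pattern in the excerpt looks roughly like this; the `streamRun` wrapper, its arguments, and the `textDelta`/`finalRun` helpers from npm:openai's assistant-stream API are assumptions layered on top of the `runs.stream(...)` call shown above:

```ts
import OpenAI from "npm:openai";

const openai = new OpenAI(); // assumes OPENAI_API_KEY is set in the environment

// Sketch: stream an assistant run and forward its text deltas to the client
// as Server-Sent Events, roughly the approach the excerpt above takes.
export function streamRun(threadId: string, assistantId: string): Response {
  const body = new ReadableStream({
    async start(controller) {
      const encoder = new TextEncoder();
      const run = openai.beta.threads.runs.stream(threadId, { assistant_id: assistantId });
      run.on("textDelta", (delta) => {
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(delta.value ?? "")}\n\n`));
      });
      await run.finalRun(); // resolves once the run has completed
      controller.close();
    },
  });
  return new Response(body, { headers: { "Content-Type": "text/event-stream" } });
}
```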

browserbase_google_concerts main.tsx (3 matches)

@stevekrouse · Updated 10 months ago
1  import puppeteer from "https://deno.land/x/puppeteer@16.2.0/mod.ts";
2  import { OpenAI } from "https://esm.town/v/std/openai?v=4";
3  import { Browserbase } from "npm:@browserbasehq/sdk";
4
31
32 // ask chat gpt for list of concert dates
33 const openai = new OpenAI();
34
35 const completion = await openai.chat.completions.create({
36   messages: [
37     { role: "system", content: "Return concert dates as JSON array. No code fences." },
databaseRunner main.tsx (1 match)

@nicosuave · Updated 10 months ago
89 console.log(`Created ${fileResponses.length} ${gzip ? "gzip" : ""} files`);
90
91 return new Response(JSON.stringify({ openaiFileResponse: fileResponses }), {
92 status: 200,
93 headers: { "Content-Type": "application/json" },

OpenAI README.md (13 matches)

@wangqiao1234 · Updated 10 months ago
1  # OpenAI - [Docs ↗](https://docs.val.town/std/openai)
2
3  Use OpenAI's chat completion API with [`std/openai`](https://www.val.town/v/std/openai). This integration enables access to OpenAI's language models without needing to acquire API keys.
4
5  For free Val Town users, [all calls are sent to `gpt-3.5-turbo`](https://www.val.town/v/std/openaiproxy?v=5#L69).
6
7  Streaming is not yet supported. Upvote the [HTTP response streaming feature request](https://github.com/val-town/val-town-product/discussions/14) if you need it!
10
11 ```ts title="Example" val
12 import { OpenAI } from "https://esm.town/v/std/openai";
13
14 const openai = new OpenAI();
15
16 const completion = await openai.chat.completions.create({
17   messages: [
18     { role: "user", content: "Say hello in a creative way" },
27 ## Limits
28
29 While our wrapper simplifies the integration of OpenAI, there are a few limitations to keep in mind:
30
31 * **Usage Quota**: We limit each user to 10 requests per minute.
34 If these limits are too low, let us know! You can also get around the limitation by using your own keys:
35
36 1. Create your own API key on [OpenAI's website](https://platform.openai.com/api-keys)
37 2. Create an [environment variable](https://www.val.town/settings/environment-variables?adding=true) named `OPENAI_API_KEY`
38 3. Use the `OpenAI` client from `npm:openai`:
39
40 ```ts title="Example" val
41 import { OpenAI } from "npm:openai";
42
43 const openai = new OpenAI();
44 ```
45
46
47 [📝 Edit docs](https://github.com/val-town/val-town-docs/edit/main/src/content/docs/std/openai.mdx)
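
Pieced together, the excerpted example amounts to something like this (a sketch following the linked docs; on the free tier the call is routed to gpt-3.5-turbo regardless of the `model` named here):

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();

// Sketch of a complete chat completion call via std/openai.
const completion = await openai.chat.completions.create({
  messages: [{ role: "user", content: "Say hello in a creative way" }],
  model: "gpt-4",
  max_tokens: 30,
});

console.log(completion.choices[0].message.content);
```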

valTownChatGPT README.md (1 match)

@simonw · Updated 10 months ago
1 # ChatGPT Implemented in Val Town
2
3 Demonstrates how to use assistants and threads with the OpenAI SDK and how to stream the response with Server-Sent Events.
4
5 <p align=center>

valTownChatGPT main.tsx (8 matches)

@simonw · Updated 10 months ago
1   /** @jsxImportSource https://esm.sh/react */
2   import OpenAI from "npm:openai";
3   import { renderToString } from "npm:react-dom/server";
4
5   // This uses my personal API key; you'll need to provide your own if
6   // you fork this. We'll be adding support to the std/openai lib soon!
7   const openai = new OpenAI();
8   import { Hono } from "npm:hono@3";
9
38  });
39
40  // Set up the SSE connection and stream back the response. OpenAI handles determining
41  // which message is the correct response based on what was last read from the
42  // thread. This is likely vulnerable to race conditions.
58  const app = new Hono();
59  app.get("/", async (c) => {
60    const thread = await openai.beta.threads.create();
61    const assistant = await openai.beta.assistants.create({
62      name: "",
63      instructions:
114 app.post("/post-message", async (c) => {
115   let message = await c.req.text();
116   await openai.beta.threads.messages.create(
117     c.req.query("threadId"),
118     { role: "user", content: message },
132   ));
133 };
134 const run = openai.beta.threads.runs.stream(threadId, {
135   assistant_id: assistantId,
136   // Make sure we only display messages we haven't seen yet.

4  import Instructor from "npm:@instructor-ai/instructor";
5  import Jimp from "npm:jimp";
6  import OpenAI from "npm:openai";
7  import { z } from "npm:zod";
8
17 const TODOIST_API_KEY = process.env.TODOIST_API_KEY;
18 const HABITIFY_API_KEY = process.env.HABITIFY_API_KEY;
19 const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
20 const DEF_TIMEZONE = "America/Los_Angeles"; // Get your timezone from here: https://stackoverflow.com/a/54500197
21
49 const todoistapi = new TodoistApi(TODOIST_API_KEY);
50
51 const oai = new OpenAI({
52   apiKey: OPENAI_API_KEY ?? undefined,
53 });
54

translateToEnglishWithOpenAI (1 file match)

@shlmt · Updated 2 days ago

testOpenAI (1 file match)

@stevekrouse · Updated 4 days ago
lost1991
import { OpenAI } from "https://esm.town/v/std/openai"; export default async function(req: Request): Promise<Response> { if (req.method === "OPTIONS") { return new Response(null, { headers: { "Access-Control-Allow-Origin": "*",
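
The truncated preview above is answering a CORS preflight before doing any OpenAI work; a self-contained sketch of that pattern for an HTTP val (the header values and prompt handling are illustrative, not recovered from the truncated code):

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

// Sketch of a CORS-friendly HTTP val: answer the OPTIONS preflight, then
// forward the prompt to std/openai. Header values here are illustrative.
export default async function (req: Request): Promise<Response> {
  const cors = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "POST, OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type",
  };
  if (req.method === "OPTIONS") {
    return new Response(null, { headers: cors });
  }
  const { prompt } = await req.json();
  const openai = new OpenAI();
  const completion = await openai.chat.completions.create({
    messages: [{ role: "user", content: prompt }],
    model: "gpt-4",
    max_tokens: 150,
  });
  return Response.json(
    { response: completion.choices[0].message.content },
    { headers: cors },
  );
}
```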