Val Town Code Search

API Access

You can access search results via the JSON API by adding format=json to your query:

https://codesearch.val.run/?q=openai&page=102&format=json
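For example, a minimal fetch sketch (the exact shape of the JSON response isn't documented here, so the code just logs it):

```ts
// Minimal sketch: fetch one page of search results as JSON.
// The response schema isn't documented on this page, so it is logged as-is.
const res = await fetch("https://codesearch.val.run/?q=openai&page=1&format=json");
if (!res.ok) throw new Error(`Search request failed: ${res.status}`);
const data = await res.json();
console.log(data);
```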

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=openai

Returns an array of strings in the format "username" or "username/projectName".
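A minimal sketch of fetching suggestions, assuming the endpoint returns a plain JSON array of strings as described:

```ts
// Fetch typeahead suggestions for a query prefix.
const res = await fetch("https://codesearch.val.run/typeahead?q=openai");
const suggestions: string[] = await res.json(); // e.g. "stevekrouse" or "stevekrouse/testOpenAI"
console.log(suggestions);
```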

Found 1610 results for "openai" (504ms)

healthtech4africa main.tsx 3 matches

@thomaskangah • Updated 7 months ago
159
160export default async function server(request: Request): Promise<Response> {
161 const { OpenAI } = await import("https://esm.town/v/std/openai");
162 const { sqlite } = await import("https://esm.town/v/stevekrouse/sqlite");
163 const openai = new OpenAI();
164
165 const SCHEMA_VERSION = 2;
230 }
231
232 const completion = await openai.chat.completions.create({
233 messages: [
234 { role: "system", content: systemMessage },

email_channel README.md 1 match

@campsite • Updated 7 months ago
3This val creates an email address that posts forwarded emails to a [Campsite](https://campsite.com) channel.
4
5It uses GPT-4 to extract a readable version of the forwarded email from the raw body. If you don't want to use GPT-4, omit the `OPENAI_API_KEY` and the raw body will be included in the post. Other providers are available via [Vercel's AI SDK](https://sdk.vercel.ai/docs/introduction#model-providers).
6
7For help with creating integrations, check out the [Campsite API docs](https://app.campsite.com/campsite/p/notes/campsite-api-docs-l07d7gm5n5rm). You'll need to create an integration and get an API key.

weatherGPT main.tsx 3 matches

@kevinforrestconnors • Updated 7 months ago
1import { email } from "https://esm.town/v/std/email?v=11";
2import { OpenAI } from "npm:openai";
3
4let location = "brooklyn ny";
8).then(r => r.json());
9
10const openai = new OpenAI();
11let chatCompletion = await openai.chat.completions.create({
12 messages: [{
13 role: "user",

GDI_AIChatCompletionService main.tsx 3 matches

@rozek • Updated 7 months ago
1import { OpenAI } from "https://esm.town/v/std/openai";
2
3export default async function (req: Request): Promise<Response> {
19 }
20
21 const openai = new OpenAI();
22 const completion = await openai.chat.completions.create({
23 model: "gpt-4o-mini",
24 messages: [

GDI_AIChatCompletionService README.md 1 match

@rozek • Updated 7 months ago
8It contains a simple HTTP endpoint which expects a POST request with a JSON
9structure containing the properties "SystemMessage" and "UserMessage". These
10messages are then used to run an OpenAI chat completion and produce an "assistant
11message" which is sent back to the client as plain text.
12
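Based on that description, calling such an endpoint might look like the sketch below; the URL is a placeholder and the property names follow the README.

```ts
// Placeholder URL - substitute the val's actual HTTP endpoint.
const endpoint = "https://example-gdi-aichatcompletionservice.val.run";

const res = await fetch(endpoint, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    SystemMessage: "You are a helpful assistant.",
    UserMessage: "Say hello in one sentence.",
  }),
});
console.log(await res.text()); // assistant message returned as plain text, per the README
```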

GDI_AITranslator README.md 1 match

@rozek • Updated 7 months ago
8It contains a simple web page which allows users to enter some German (or other
9non-English) text and send it to a preconfigured server. That server translates
10the text with the help of OpenAI and sends the result back to this app, where it
11is finally presented to the user.
12

GDI_AITranslatorService main.tsx 5 matches

@rozek • Updated 7 months ago
1import { OpenAI } from "https://esm.town/v/std/openai";
2
3export default async function (req: Request): Promise<Response> {
16 }
17
18 // Initialize OpenAI
19 const openai = new OpenAI();
20
21 // Translate the text using OpenAI
22 const completion = await openai.chat.completions.create({
23 messages: [
24 { role: "system", content: "You are a German to English translator. Translate the following text to English:" },

GDI_AITranslatorService README.md 1 match

@rozek • Updated 7 months ago
7
8It contains a simple HTTP endpoint which expects a POST request with a text
9body. That text is translated to English with the help of OpenAI and sent back
10to the client.
11
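Per that description, a rough sketch of calling the translator service (placeholder URL; the German sample text is only illustrative):

```ts
// Placeholder URL - substitute the val's actual HTTP endpoint.
const endpoint = "https://example-gdi-aitranslatorservice.val.run";

const res = await fetch(endpoint, {
  method: "POST",
  body: "Guten Morgen, wie geht es dir?", // German text to translate
});
console.log(await res.text()); // English translation, per the README
```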

multiUserChatwithLLM main.tsx 3 matches

@trob • Updated 7 months ago
128
129export default async function server(request: Request): Promise<Response> {
130 const { OpenAI } = await import("https://esm.town/v/std/openai");
131 const { sqlite } = await import("https://esm.town/v/stevekrouse/sqlite");
132 const openai = new OpenAI();
133 const SCHEMA_VERSION = 2;
134 const KEY = "multiUserChatwithLLM";
178 messages.push({ role: "user", content: `${username}: ${message}` });
179
180 const completion = await openai.chat.completions.create({
181 messages,
182 model: "gpt-4o-mini",

weatherGPT main.tsx 3 matches

@tnordby • Updated 7 months ago
1import { email } from "https://esm.town/v/std/email?v=11";
2import { OpenAI } from "npm:openai";
3
4let location = "brooklyn ny";
8).then(r => r.json());
9
10const openai = new OpenAI();
11let chatCompletion = await openai.chat.completions.create({
12 messages: [{
13 role: "user",

translateToEnglishWithOpenAI 1 file match

@shlmt • Updated 2 days ago

testOpenAI 1 file match

@stevekrouse • Updated 4 days ago
lost1991
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function(req: Request): Promise<Response> {
  if (req.method === "OPTIONS") {
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": "*",