Val Town Code Search

API Access

You can access search results via the JSON API by adding format=json to your query URL:

https://codesearch.val.run/$2?q=openai&page=15&format=json
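
For example, the JSON results can be fetched from a script. This is a minimal sketch: the "/search" path is an assumption standing in for the placeholder in the URL above, and the shape of the JSON response is not documented here.

```ts
// Minimal sketch: fetch one page of code-search results as JSON.
// NOTE: the "/search" path and the response shape are assumptions;
// substitute the real path from your own query URL.
const url = "https://codesearch.val.run/search?q=openai&page=1&format=json";

const res = await fetch(url);
if (!res.ok) throw new Error(`Search request failed: ${res.status}`);

// Log the parsed body rather than assuming specific field names.
const results = await res.json();
console.log(results);
```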

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=openai

Returns an array of strings, each in the form "username" or "username/projectName".
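
A minimal sketch of consuming the typeahead endpoint (only the array-of-strings shape described above is assumed):

```ts
// Fetch typeahead suggestions for a query prefix.
const res = await fetch("https://codesearch.val.run/typeahead?q=openai");
if (!res.ok) throw new Error(`Typeahead request failed: ${res.status}`);

// Each entry is either "username" or "username/projectName".
const suggestions: string[] = await res.json();
for (const suggestion of suggestions) {
  console.log(suggestion);
}
```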

Found 2240 results for "openai" (2831 ms)

stevensDemo/.cursorrules (4 matches)

@sysbot • Updated 1 week ago
Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table.

### OpenAI
```ts
import { OpenAI } from "https://esm.town/v/std/openai";
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

thirdTimer/val-town.mdc (4 matches)

@nbbaier • Updated 1 week ago
Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table.

### OpenAI

```ts
import { OpenAI } from "https://esm.town/v/std/openai";
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

realty/main.tsx (10 matches)

@legal • Updated 1 week ago
// --- Main Request Handler (Server Code - MODIFIED) ---
export default async function(req: Request) {
  const { OpenAI } = await import("https://esm.town/v/std/openai");
  const { fetch } = await import("https://esm.town/v/std/fetch");
  // PDFExtract is kept if you want to add document features later, but not primary for this use case.
  // ...
  const action = url.searchParams.get("action"); // New: "loanAssumptionInfo"
  const sourceUrl = import.meta.url.replace("esm.town", "val.town");
  const openai = new OpenAI();
  const MAX_TEXT_LENGTH_ANALYSIS = 10000; // Reduced as input is smaller now

  // ...
  }

  // callOpenAI function (same as original, but uses gpt-4o by default)
  async function callOpenAI(
    openaiInstance: OpenAI,
    systemPrompt: string,
    userMessage: string, // For this app, userMessage to AI might be empty if all info is in systemPrompt
    // ...
  ): Promise<object | string> {
    // ... (implementation from original)
    log.push({ agent: agentName, type: "step", message: `Calling OpenAI model ${model}...` });
    try {
      const response = await openaiInstance.chat.completions.create({
        model: model,
        messages: [{ role: "system", content: systemPrompt }, { role: "user", content: userMessage }],
        // ...
      .replace("%%USER_NAME%%", analysisInput.userName);
    // The %%INPUT_SOURCE_DESCRIPTION%% and %%LEGAL_TASK_QUERY%% are not in the new prompt in this direct way.
    // The userMessage to openAI can be kept minimal or empty as the system prompt is rich.
    const userMessageForAI = ""; // Or analysisInput.documentText if you want to provide more context there.

    const analysisAgentName = "HomeAdvantage AI";
    const aiResponse = await callOpenAI(
      openai,
      finalSystemPrompt,
      userMessageForAI,

docs/main.tsx (10 matches)

@legal • Updated 1 week ago
}
export default async function(req: Request) {
  const { OpenAI } = await import("https://esm.town/v/std/openai");
  const { fetch } = await import("https://esm.town/v/std/fetch");
  const { PDFExtract, PDFExtractOptions } = await import("npm:pdf.js-extract");
  // ...
  const action = url.searchParams.get("action");
  const sourceUrl = import.meta.url.replace("esm.town", "val.town");
  const openai = new OpenAI();
  const MAX_TEXT_LENGTH_SUGGESTION = 20000;
  const MAX_TEXT_LENGTH_ANALYSIS = 30000;
  // ...
    }
  }
  async function callOpenAI(
    openaiInstance: OpenAI,
    systemPrompt: string,
    userMessage: string,
    // ...
    agentName: string,
  ): Promise<object | string> {
    log.push({ agent: agentName, type: "step", message: `Calling OpenAI model ${model}...` });
    try {
      const response = await openaiInstance.chat.completions.create({
        model: model,
        messages: [{ role: "system", content: systemPrompt }, { role: "user", content: userMessage }],
        // ...
  );
  const suggestionAgentName = "Task Suggestion AI";
  const suggestionsResponse = await callOpenAI(
    openai,
    suggestionPrompt,
    "",
    // ...
  );
  const analysisAgentName = "Legal Analysis AI";
  const aiResponse = await callOpenAI(
    openai,
    finalSystemPrompt,
    documentTextToAnalyze,

legl/main.tsx (12 matches)

@join • Updated 1 week ago
 * Legal AI Document Analysis (Single Val Version with PDF Upload & Dashboard Style)
 * Ingests documents (URL, Text, PDF Upload), takes a user-defined legal task query,
 * and uses a Legal AI Agent (via OpenAI) to analyze the content.
 * The Legal AI Agent outputs a standardized JSON structure.
 * Uses 'npm:pdf.js-extract' for direct PDF text extraction within the Val.
// ...
export default async function(req: Request) {
  // --- Dynamic Imports (Inside Handler) ---
  const { OpenAI } = await import("https://esm.town/v/std/openai");
  const { z } = await import("npm:zod"); // For potential future robust input validation on server
  const { fetch } = await import("https://esm.town/v/std/fetch");
  // ...
  }

  // --- Helper Function: Call OpenAI API ---
  async function callOpenAI(
    openai: OpenAI,
    systemPrompt: string,
    userMessage: string, // This will be the document text
    // ...
  ): Promise<object | string> { // Return type can be object if JSON, or string if error
    try {
      const response = await openai.chat.completions.create({
        model: model,
        messages: [{ role: "system", content: systemPrompt }, { role: "user", content: userMessage }],
        // ...
        return JSON.parse(content);
      } catch (parseError) {
        console.error("OpenAI JSON Parse Error:", parseError, "Raw Content:", content);
        // Return the raw content if parsing fails, for debugging, wrapped as an error object
        return {
          // ...
      }
    } catch (error) {
      console.error(`OpenAI API call failed. Error:`, error);
      let errorMessage = "Error communicating with AI model.";
      if (error.message) { errorMessage += ` Details: ${error.message}`; }
      // ...
    log: LogEntry[],
  ): Promise<{ finalResult?: LegalAIResponse | object; log: LogEntry[] }> {
    const openai = new OpenAI(); // Assumes OPENAI_API_KEY is in environment

    log.push({ agent: "System", type: "step", message: "Starting Legal AI analysis workflow." });
    // ...
    finalSystemPrompt = finalSystemPrompt.replace("%%INPUT_SOURCE_DESCRIPTION%%", input.inputSourceDescription);

    const aiResponse = await callOpenAI(openai, finalSystemPrompt, truncatedText, "gpt-4o", true); // Expect JSON

    if (typeof aiResponse === "object" && (aiResponse as any).error) {
      // Error from callOpenAI helper (API or parsing)
      log.push({
        agent: "Legal AI Agent",
      // ...
    log.push({ agent: "System", type: "final", message: "Workflow finished successfully." });
    return { finalResult: aiResponse as LegalAIResponse, log };
  } else { // Should not happen if expectJson is true and no error object from callOpenAI
    log.push({
      agent: "Legal AI Agent",

Townie/system_prompt.txt (4 matches)

@PrincessJossy • Updated 1 week ago
Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table.

### OpenAI

```ts
import { OpenAI } from "https://esm.town/v/std/openai";
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

Townie/.cursorrules (4 matches)

@PrincessJossy • Updated 1 week ago
Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table.

### OpenAI

```ts
import { OpenAI } from "https://esm.town/v/std/openai";
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

Lawyers/main.tsx (12 matches)

@Get • Updated 1 week ago
 * Legal AI Document Analysis (Single Val Version with PDF Upload & Dashboard Style)
 * Ingests documents (URL, Text, PDF Upload), takes a user-defined legal task query,
 * and uses a Legal AI Agent (via OpenAI) to analyze the content.
 * The Legal AI Agent outputs a standardized JSON structure.
 * Uses 'npm:pdf.js-extract' for direct PDF text extraction within the Val.
// ...
export default async function(req: Request) {
  // --- Dynamic Imports (Inside Handler) ---
  const { OpenAI } = await import("https://esm.town/v/std/openai");
  const { z } = await import("npm:zod"); // For potential future robust input validation on server
  const { fetch } = await import("https://esm.town/v/std/fetch");
  // ...
  }

  // --- Helper Function: Call OpenAI API ---
  async function callOpenAI(
    openai: OpenAI,
    systemPrompt: string,
    userMessage: string, // This will be the document text
    // ...
  ): Promise<object | string> { // Return type can be object if JSON, or string if error
    try {
      const response = await openai.chat.completions.create({
        model: model,
        messages: [{ role: "system", content: systemPrompt }, { role: "user", content: userMessage }],
        // ...
        return JSON.parse(content);
      } catch (parseError) {
        console.error("OpenAI JSON Parse Error:", parseError, "Raw Content:", content);
        // Return the raw content if parsing fails, for debugging, wrapped as an error object
        return {
          // ...
      }
    } catch (error) {
      console.error(`OpenAI API call failed. Error:`, error);
      let errorMessage = "Error communicating with AI model.";
      if (error.message) { errorMessage += ` Details: ${error.message}`; }
      // ...
    log: LogEntry[],
  ): Promise<{ finalResult?: LegalAIResponse | object; log: LogEntry[] }> {
    const openai = new OpenAI(); // Assumes OPENAI_API_KEY is in environment

    log.push({ agent: "System", type: "step", message: "Starting Legal AI analysis workflow." });
    // ...
    finalSystemPrompt = finalSystemPrompt.replace("%%INPUT_SOURCE_DESCRIPTION%%", input.inputSourceDescription);

    const aiResponse = await callOpenAI(openai, finalSystemPrompt, truncatedText, "gpt-4o", true); // Expect JSON

    if (typeof aiResponse === "object" && (aiResponse as any).error) {
      // Error from callOpenAI helper (API or parsing)
      log.push({
        agent: "Legal AI Agent",
      // ...
    log.push({ agent: "System", type: "final", message: "Workflow finished successfully." });
    return { finalResult: aiResponse as LegalAIResponse, log };
  } else { // Should not happen if expectJson is true and no error object from callOpenAI
    log.push({
      agent: "Legal AI Agent",

legl/main.tsx (12 matches)

@Get • Updated 1 week ago
 * Legal AI Document Analysis (Single Val Version with PDF Upload & Dashboard Style)
 * Ingests documents (URL, Text, PDF Upload), takes a user-defined legal task query,
 * and uses a Legal AI Agent (via OpenAI) to analyze the content.
 * The Legal AI Agent outputs a standardized JSON structure.
 * Uses 'npm:pdf.js-extract' for direct PDF text extraction within the Val.
// ...
export default async function(req: Request) {
  // --- Dynamic Imports (Inside Handler) ---
  const { OpenAI } = await import("https://esm.town/v/std/openai");
  const { z } = await import("npm:zod"); // For potential future robust input validation on server
  const { fetch } = await import("https://esm.town/v/std/fetch");
  // ...
  }

  // --- Helper Function: Call OpenAI API ---
  async function callOpenAI(
    openai: OpenAI,
    systemPrompt: string,
    userMessage: string, // This will be the document text
    // ...
  ): Promise<object | string> { // Return type can be object if JSON, or string if error
    try {
      const response = await openai.chat.completions.create({
        model: model,
        messages: [{ role: "system", content: systemPrompt }, { role: "user", content: userMessage }],
        // ...
        return JSON.parse(content);
      } catch (parseError) {
        console.error("OpenAI JSON Parse Error:", parseError, "Raw Content:", content);
        // Return the raw content if parsing fails, for debugging, wrapped as an error object
        return {
          // ...
      }
    } catch (error) {
      console.error(`OpenAI API call failed. Error:`, error);
      let errorMessage = "Error communicating with AI model.";
      if (error.message) { errorMessage += ` Details: ${error.message}`; }
      // ...
    log: LogEntry[],
  ): Promise<{ finalResult?: LegalAIResponse | object; log: LogEntry[] }> {
    const openai = new OpenAI(); // Assumes OPENAI_API_KEY is in environment

    log.push({ agent: "System", type: "step", message: "Starting Legal AI analysis workflow." });
    // ...
    finalSystemPrompt = finalSystemPrompt.replace("%%INPUT_SOURCE_DESCRIPTION%%", input.inputSourceDescription);

    const aiResponse = await callOpenAI(openai, finalSystemPrompt, truncatedText, "gpt-4o", true); // Expect JSON

    if (typeof aiResponse === "object" && (aiResponse as any).error) {
      // Error from callOpenAI helper (API or parsing)
      log.push({
        agent: "Legal AI Agent",
      // ...
    log.push({ agent: "System", type: "final", message: "Workflow finished successfully." });
    return { finalResult: aiResponse as LegalAIResponse, log };
  } else { // Should not happen if expectJson is true and no error object from callOpenAI
    log.push({
      agent: "Legal AI Agent",

blog/2025-06-03-newsletter-25.md (1 match)

@valdottown • Updated 1 week ago
- [**We at Val Town**](https://www.val.town/u/valdottown) made [door](https://www.val.town/x/valdottown/door) for Val Town office guests to open the door through their phone, like [this](https://www.val.town/x/valdottown/door/pull/ef854848-34b0-11f0-9887-9e149126039e)!
- [**Joey Hiller**](https://www.val.town/u/jhiller) made [ValTown-Package-Tracker](https://www.val.town/x/jhiller/ValTown-Package-Tracker) to track a package of sensors he was sending from Oakland to us in Downtown Brooklyn!
- [**dinavinter**](https://www.val.town/u/dinavinter) made [slack](https://www.val.town/x/dinavinter/slack), a Slack bot that uses OpenAI to generate responses.
- [**prashamtrivedi**](https://www.val.town/u/prashamtrivedi) made [hn-remote-ts-genai-jobs](https://www.val.town/x/prashamtrivedi/hn-remote-ts-genai-jobs), to track Hacker News' Who's Hiring filter for remote TypeScript + GenAI-related jobs.
- [**dcm31**](https://www.val.town/u/dcm31) made [rotrank](https://www.val.town/x/dcm31/rotrank), an ELO ranking system for popular Italian Brainrot characters.

openai-client (1 file match)

@cricks_unmixed4u • Updated 1 week ago

openai_enrichment (6 file matches)

@stevekrouse • Updated 1 week ago
kwhinnery_openai
reconsumeralization
import { OpenAI } from "https://esm.town/v/std/openai"; import { sqlite } from "https://esm.town/v/stevekrouse/sqlite"; /** * Practical Implementation of Collective Content Intelligence * Bridging advanced AI with collaborative content creation */ exp