Val Town Code Search

API Access

You can access search results via the JSON API by adding format=json to your query:

https://codesearch.val.run/...?q=openai&page=14&format=json

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=openai

Returns an array of strings in the format "username" or "username/projectName".
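As a sketch, a client might build these request URLs with the standard URL API. The endpoint paths and parameter names (q, page, format) are taken from the examples above; everything else here is illustrative, and the fetch calls are left commented out since they require network access:

```typescript
// Build request URLs for the code search JSON API and the /typeahead
// endpoint, following the examples above.
const base = "https://codesearch.val.run";

// Full-text search results as JSON: append format=json to the query.
const search = new URL(base);
search.searchParams.set("q", "openai");
search.searchParams.set("page", "14");
search.searchParams.set("format", "json");

// Typeahead suggestions: returns an array of "username" or
// "username/projectName" strings.
const typeahead = new URL("/typeahead", base);
typeahead.searchParams.set("q", "openai");

// To actually fetch (requires network access):
// const results = await (await fetch(search)).json();
// const suggestions: string[] = await (await fetch(typeahead)).json();
```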

Found 2240 results for "openai" (3641ms)

Contextual · README.md · 1 match

@c15r · Updated 1 week ago
### Core Cognitive Tools
- **AI-Enhanced Thought Forking**: Automatically generate parallel explorations using OpenAI
- **Goal Tracking**: Create, update, and monitor goals with hierarchical structure
- **Task Management**: Break down goals into actionable tasks with state tracking

osdata.json · 5 matches

@dinavinter · Updated 1 week ago
"projects": [
  {
    "raw": "# bolt.diy\r\n\r\n### Repository URL\r\n\r\nhttps://github.com/stackblitz-labs/bolt.diy\r\n\r\n### Project Description\r\n\r\nbolt.diy is an open-source platform that enables users to prompt, run, edit, and deploy full-stack web applications directly in the browser using various Large Language Models (LLMs). It supports integration with multiple LLM providers, including OpenAI, Anthropic, HuggingFace, and more. The platform offers features like an integrated terminal, code versioning, and the ability to attach images to prompts, facilitating a seamless AI-powered development experience. \r\n\r\n\r\n### Potential Contribution Areas\r\n\r\nAgent Collaboration: Enhance bolt.diy to support MCP tool access and A2A-based agent collaboration, fostering better interoperability between SAP agents and bolt.diy projects.\r\nSAP Integration: Develop adapters or plugins to integrate bolt.diy with SAP's App Router or Cloud Foundry, enabling seamless deployment of applications within the SAP ecosystem.\r\n\r\n### Estimated Time Commitment\r\n\r\nMedium (4-8 hours/week)\r\n\r\n### Required Skills\r\n\r\nTypeScript\r\n\r\n### License\r\n\r\nMIT\r\n\r\n### Additional Information\r\n\r\n_No response_",
    "id": 22,
    "title": "bolt.diy",
    "projectName": "",
    "repositoryUrl": "https://github.com/stackblitz-labs/bolt.diy",
    "description": "bolt.diy is an open-source platform that enables users to prompt, run, edit, and deploy full-stack web applications directly in the browser using various Large Language Models (LLMs). It supports integration with multiple LLM providers, including OpenAI, Anthropic, HuggingFace, and more. The platform offers features like an integrated terminal, code versioning, and the ability to attach images to prompts, facilitating a seamless AI-powered development experience.",
    "contribution": "Agent Collaboration: Enhance bolt.diy to support MCP tool access and A2A-based agent collaboration, fostering better interoperability between SAP agents and bolt.diy projects.\r\nSAP Integration: Develop adapters or plugins to integrate bolt.diy with SAP's App Router or Cloud Foundry, enabling seamless deployment of applications within the SAP ecosystem.",
    "timeCommitment": "Medium (4-8 hours/week)",
  },
  {
    "raw": "### Project Name\n\nLLM Proxy Server for SAP AI Core\n\n### Repository URL\n\nhttps://github.com/sap-samples/llm-proxy-sap-ai-core\n\n### Project Description\n\nThe LLM Proxy Server for SAP AI Core is an open-source project designed to serve as a lightweight, performant, and extensible intermediary between SAP AI Core and a variety of large language model (LLM) backends. Inspired by solutions like LightLLM, this proxy server standardizes interactions with different model providers (OpenAI, Anthropic, HuggingFace, etc.), handles rate limiting and caching, and simplifies integration for enterprise use cases. Its purpose is to provide a scalable, multi-tenant LLM gateway that plugs seamlessly into the SAP AI Core inference pipeline.\n\n### Potential Contribution Areas\n\n- Model Adapter Layer: Implement new adapters to support additional LLM providers or fine-tuned models hosted on SAP AI Core.\r\n- Request Routing & Optimization: Improve routing logic, caching strategies, and load balancing between LLM providers.\r\n- Security & Multi-Tenancy: Enhance authentication, logging, and quota management to support secure, tenant-aware deployments.\r\n- SAP Integration: Develop SDK components or API contracts to enable easy consumption of the proxy within SAP BTP-based applications.\n\n### Estimated Time Commitment\n\nMedium (4-8 hours/week)\n\n### Required Skills\n\n_No response_\n\n### License\n\n_No response_\n\n### Additional Information\n\n_No response_",
    "id": 23,
    "title": "LLM Proxy Server for SAP AI Core",
    "projectName": "LLM Proxy Server for SAP AI Core",
    "repositoryUrl": "https://github.com/sap-samples/llm-proxy-sap-ai-core",
    "description": "The LLM Proxy Server for SAP AI Core is an open-source project designed to serve as a lightweight, performant, and extensible intermediary between SAP AI Core and a variety of large language model (LLM) backends. Inspired by solutions like LightLLM, this proxy server standardizes interactions with different model providers (OpenAI, Anthropic, HuggingFace, etc.), handles rate limiting and caching, and simplifies integration for enterprise use cases. Its purpose is to provide a scalable, multi-tenant LLM gateway that plugs seamlessly into the SAP AI Core inference pipeline.",
    "contribution": "- Model Adapter Layer: Implement new adapters to support additional LLM providers or fine-tuned models hosted on SAP AI Core.\r\n- Request Routing & Optimization: Improve routing logic, caching strategies, and load balancing between LLM providers.\r\n- Security & Multi-Tenancy: Enhance authentication, logging, and quota management to support secure, tenant-aware deployments.\r\n- SAP Integration: Develop SDK components or API contracts to enable easy consumption of the proxy within SAP BTP-based applications.",
    "timeCommitment": "Medium (4-8 hours/week)",
  },
  {
    "raw": "# Open WebUI \r\n ---\r\n \r\n ## 🔗 Repository URL \r\n https://github.com/open-webui/open-webui\r\n \r\n ## 🧠 Project Description \r\n Local web UI for interacting with LLMs like OpenAI, Ollama, LM Studio, and others.\r\n \r\n ## 🧩 Interoperability: Agent & Tool Protocol Fit \r\n Could integrate MCP to allow access to external tool APIs and A2A for backend AI agent interaction.\r\n \r\n ## 🛠️ How AI Guild Can Contribute \r\n - Add SAP API Hub wrapper as MCP server\r\n - Build agent-to-agent pipeline that connects support tooling to dev tools\r\n \r\n ## 🕒 Estimated Time Commitment \r\n Medium (4-8 hours/week)\r\n\r\n \r\n ## 🧪 Required Skills \r\n TypeScript, Python, LLM APIs, Web UI Development\r\n \r\n ## ⚖️ License \r\n \r\n \r\n ---\r\n \r\n ## 💬 Additional Information \r\n \r\n \r\n 📎 Linked source: `Open WebUI.json`\r\n ",
    "id": 9,
    "title": "Open WebUI",
import { discordWebhook } from "https://esm.town/v/stevekrouse/discordWebhook";
import { Octokit } from "npm:octokit";
import { OpenAI } from "https://esm.town/v/std/openai";

// Environment variables, set them in the left sidebar
 */
async function generateUserFocusedSummary(fullCommitMessage: string, commitType: string): Promise<string> {
  const openai = new OpenAI();

  const prompt = `You are writing release notes for a developer tool. Based on this commit message, write a concise one-liner.

  try {
    const completion = await openai.chat.completions.create({
      messages: [{ role: "user", content: prompt }],
      model: "gpt-4o-mini",

Townie-09 · system_prompt.txt · 4 matches

@jxnblk · Updated 1 week ago
Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table.

### OpenAI

```ts
import { OpenAI } from "https://esm.town/v/std/openai";
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

Townie-09 · .cursorrules · 4 matches

@jxnblk · Updated 1 week ago
Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table.

### OpenAI

```ts
import { OpenAI } from "https://esm.town/v/std/openai";
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

autonomous-val · README.md · 1 match

@charmaine · Updated 1 week ago
Configure the following variables in your environment:
- `AGENT_API_KEY` (This is a secure token that you choose to secure the agent.tsx POST endpoint)
- `OPENAI_API_KEY` (An OpenAI API Key)
- `EXA_API_KEY` (Optional, though needed if you use the web search tool)

autonomous-val · agent.tsx · 2 matches

@charmaine · Updated 1 week ago
import { anthropic } from "npm:@ai-sdk/anthropic";
import { openai } from "npm:@ai-sdk/openai";
import { generateText, streamText } from "npm:ai";
import { getSystemPrompt } from "./prompt.tsx";
  const maxSteps = 10;

  const model = Deno.env.get("ANTHROPIC_API_KEY") ? anthropic("claude-3-7-sonnet-latest") : openai("gpt-4.1");

  const options = {

markdown-embed · .cursorrules · 4 matches

@stevekrouse · Updated 1 week ago
Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table.

### OpenAI

```ts
import { OpenAI } from "https://esm.town/v/std/openai";
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },

openai-client · main.tsx · 9 matches

@cricks_unmixed4u · Updated 1 week ago
import { OpenAI } from "https://esm.sh/openai@4.85.1";
import { sqlite } from "https://esm.town/v/std/sqlite";
};

interface ChatOpenAI {
  invoke(messages: Message[]): Promise<string>;
}

export function ChatOpenAI(model: string): ChatOpenAI {
  const openai = new OpenAI();

  return {
    invoke: async (messages: Message[]): Promise<string> => {
      const completion = await openai.chat.completions.create({
        messages: messages.map(message => ({
          role: message.role as "user" | "assistant" | "system",
}

// Decorator for ChatOpenAI that will eventually add rate limiting
export function GlobalRateLimitedChatOpenAI(model: string, requestsPerSecond: number): ChatOpenAI {
  const openAi = ChatOpenAI(model);

  const rateLimiter = new GlobalRateLimiter(requestsPerSecond);
      await rateLimiter.check();

      return openAi.invoke(messages);
    },
  };

openai-client · 1 file match

@cricks_unmixed4u · Updated 1 week ago

openai_enrichment · 6 file matches

@stevekrouse · Updated 1 week ago