Val Town Code Search

API Access

You can access search results via the JSON API by adding format=json to your query:

https://codesearch.val.run/?q=database&page=173&format=json

For typeahead suggestions, use the /typeahead endpoint:

https://codesearch.val.run/typeahead?q=database

Returns an array of strings in the format "username" or "username/projectName".
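For example, a minimal TypeScript sketch of calling both endpoints (the helper names are illustrative and the shape of the search response is an assumption; only the typeahead response format is documented above):

```ts
// Sketch of querying the code search JSON API and the typeahead endpoint.
const BASE = "https://codesearch.val.run";

async function searchJson(query: string, page = 1): Promise<unknown> {
  // The search response shape is not documented above, so it is left as unknown.
  const url = `${BASE}/?q=${encodeURIComponent(query)}&page=${page}&format=json`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Search failed: ${res.status}`);
  return res.json();
}

async function typeahead(query: string): Promise<string[]> {
  // Documented to return strings like "username" or "username/projectName".
  const res = await fetch(`${BASE}/typeahead?q=${encodeURIComponent(query)}`);
  if (!res.ok) throw new Error(`Typeahead failed: ${res.status}`);
  return res.json();
}

// Usage:
// const results = await searchJson("database");
// const suggestions = await typeahead("data");
```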

Found 7054 results for "database" (3482ms)

Townie / think.ts (1 match)

@prubeandoAl • Updated 1 week ago
export const thinkTool = tool({
  description:
    "Use the tool to think about something. It will not obtain new information or change the database, but just append the thought to the log. Use it when complex reasoning or some cache memory is needed.",
  parameters: z.object({
    thought: z.string().describe("A thought to think about."),

Townie / system_prompt.txt (2 matches)

@prubeandoAl • Updated 1 week ago
```
├── backend/
│   ├── database/
│   │   ├── migrations.ts   # Schema definitions
│   │   ├── queries.ts      # DB query functions
```

### Database Patterns
- Run migrations on startup or comment out for performance
- Change table names when modifying schemas rather than altering
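As an illustration of the pattern this excerpt describes, here is a minimal sketch of running migrations on startup, assuming Val Town's standard SQLite library; it is not the actual Townie migrations.ts, and the table name is hypothetical:

```ts
// Sketch of the "run migrations on startup" pattern, assuming std/sqlite.
import { sqlite } from "https://esm.town/v/std/sqlite";

// Using a versioned table name (posts_v2) instead of ALTERing the old one is
// the "change table names when modifying schemas" pattern from the excerpt.
const TABLE_NAME = "posts_v2";

export async function runMigrations() {
  await sqlite.execute(`
    CREATE TABLE IF NOT EXISTS ${TABLE_NAME} (
      id INTEGER PRIMARY KEY AUTOINCREMENT,
      title TEXT NOT NULL,
      created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
  `);
}

// Call once on startup; comment out later if cold-start latency matters.
// await runMigrations();
```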

Townie / send-message.ts (1 match)

@prubeandoAl • Updated 1 week ago
  overLimit,
  startTrackingUsage,
} from "../database/queries.tsx";
import { makeChangeValTypeTool, makeFetchTool, makeTextEditorTool } from "../tools/index.ts";
import fileWithLinesNumbers from "../utils/fileWithLinesNumbers.ts";

Townie / queries.tsx (1 match)

@prubeandoAl • Updated 1 week ago
import { INFERENCE_CALLS_TABLE, USAGE_TABLE } from "./schema.tsx";

// Eventually we'll have a user database,
// but in the meantime, we can cache user info in memory
const userIdCache: { [key: string]: any } = {};

Townie / .cursorrules (2 matches)

@prubeandoAl • Updated 1 week ago
```
├── backend/
│   ├── database/
│   │   ├── migrations.ts   # Schema definitions
│   │   ├── queries.ts      # DB query functions
```

### Database Patterns
- Run migrations on startup or comment out for performance
- Change table names when modifying schemas rather than altering

notion-2-bluesky / README.md (7 matches)

@nucky • Updated 1 week ago
# N2B Scheduler - Notion to Bluesky Auto Poster

An automated scheduler that reads posts from a Notion database and publishes them to Bluesky at scheduled times.

See it in action: https://bsky.app/profile/nucky.bsky.social

## Features

- 📅 Schedule Bluesky posts from Notion database
- 🤖 Automatic posting via cron job (every 15 minutes)
- 📊 Status tracking (Draft → Scheduled → Posted)

## Setup

### 1. Notion Database Setup

Your database needs these fields:
- **Title** (Title field) - Post title/identifier
- **Text** (Rich Text field) - The post content

Set these in your Val Town environment:
- `NOTION_TOKEN` - Your Notion integration token
- `NOTION_DATABASE_ID` - Your database ID
- `BLUESKY_HANDLE` - Your Bluesky handle (e.g., user.bsky.social)
- `BLUESKY_APP_PASSWORD` - Your Bluesky app password

- Read content
- Update content
- Read database structure

## Files

## Usage

1. Create a new row in your Notion database
2. Set the Text field with your post content
3. **Optional**: Upload images to the Media field (supports JPG, PNG, GIF, WebP)
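A hedged sketch of the flow this README describes (read Scheduled rows from Notion, post them to Bluesky, mark them Posted), assuming the `@notionhq/client` and `@atproto/api` packages and the Title/Text/Status fields listed above; property names and the cron entry point are assumptions, not the project's actual code:

```ts
// Sketch of a scheduled Notion-to-Bluesky poster for a Val Town cron val.
import { Client } from "npm:@notionhq/client";
import { BskyAgent } from "npm:@atproto/api";

export default async function cron() {
  const notion = new Client({ auth: Deno.env.get("NOTION_TOKEN") });
  const agent = new BskyAgent({ service: "https://bsky.social" });
  await agent.login({
    identifier: Deno.env.get("BLUESKY_HANDLE")!,
    password: Deno.env.get("BLUESKY_APP_PASSWORD")!,
  });

  // Fetch rows whose Status is "Scheduled" (property names are assumptions).
  const { results } = await notion.databases.query({
    database_id: Deno.env.get("NOTION_DATABASE_ID")!,
    filter: { property: "Status", select: { equals: "Scheduled" } },
  });

  for (const page of results as any[]) {
    const text = page.properties.Text?.rich_text?.[0]?.plain_text ?? "";
    if (!text) continue;
    await agent.post({ text });
    // Mark the row as Posted so the next cron run skips it.
    await notion.pages.update({
      page_id: page.id,
      properties: { Status: { select: { name: "Posted" } } },
    });
  }
}
```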

scrape-hws / reddit-scraper.ts (2 matches)

@wxw • Updated 1 week ago
  /**
   * Check if a post already exists in the database
   */
  async postExists(redditId: string): Promise<boolean> {

  /**
   * Save a post to the database
   */
  async savePost(post: RedditPost): Promise<boolean> {
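Based on the signatures in this excerpt, here is a sketch of how the duplicate check and save might look with `@supabase/supabase-js` and a `posts` table keyed by `reddit_id`; the table, column, and environment-variable names are assumptions, not the project's actual code:

```ts
// Sketch of duplicate detection and insert against a Supabase posts table.
import { createClient } from "npm:@supabase/supabase-js";

const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_ANON_KEY")!,
);

interface RedditPost {
  redditId: string;
  title: string;
}

async function postExists(redditId: string): Promise<boolean> {
  // maybeSingle() returns null data when no row matches, without throwing.
  const { data, error } = await supabase
    .from("posts")
    .select("id")
    .eq("reddit_id", redditId)
    .maybeSingle();
  if (error) throw error;
  return data !== null;
}

async function savePost(post: RedditPost): Promise<boolean> {
  if (await postExists(post.redditId)) return false; // duplicate detection
  const { error } = await supabase
    .from("posts")
    .insert({ reddit_id: post.redditId, title: post.title });
  return !error;
}
```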

scrape-hws / README.md (7 matches)

@wxw • Updated 1 week ago
# Reddit /r/hardwareswap Scraper

A Val Town application that scrapes posts from Reddit's /r/hardwareswap subreddit and stores them in a Supabase database.

## Features

- 🔄 Automated scraping of /r/hardwareswap posts using Reddit's official OAuth API
- 🗄️ Stores posts in Supabase PostgreSQL database
- 🚫 Duplicate detection to avoid storing the same post twice
- 📊 Detailed logging and statistics

1. Create a new project in [Supabase](https://supabase.com)
2. Go to the SQL Editor in your Supabase dashboard
3. Copy and paste the contents of `database-schema.sql` and run it
4. Go to Settings > API to get your project URL and anon key

- Every 15 minutes: `*/15 * * * *`

## Database Schema

The `posts` table contains:
- `title`: Post title
- `created_at`: When the post was created on Reddit
- `updated_at`: When the record was last updated in our database

## Usage

1. Authenticate with Reddit using OAuth client credentials
2. Fetch the latest 25 posts from /r/hardwareswap
3. Check for duplicates in the database
4. Save new posts to Supabase
5. Log statistics about the scraping session

1. **Missing environment variables**: Ensure all required Reddit and Supabase credentials are set
2. **Database connection errors**: Verify your Supabase credentials and that the table exists
3. **Reddit OAuth errors**: Check your Reddit app credentials and ensure the app type is "script"
4. **Rate limiting**: Reddit may temporarily block requests if rate limits are exceeded
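A sketch of the scraping flow listed under Usage in the excerpt above, assuming a Reddit script-type app with client-credentials OAuth and the `postExists`/`savePost` helpers sketched earlier; endpoint limits, env-variable names, and the User-Agent string are assumptions, not the project's actual code:

```ts
// Sketch: authenticate with Reddit, fetch newest posts, skip duplicates, save, log stats.
// postExists/savePost refer to the Supabase helpers sketched earlier in this section.
declare function postExists(redditId: string): Promise<boolean>;
declare function savePost(post: { redditId: string; title: string }): Promise<boolean>;

async function getRedditToken(): Promise<string> {
  const creds = btoa(
    `${Deno.env.get("REDDIT_CLIENT_ID")}:${Deno.env.get("REDDIT_CLIENT_SECRET")}`,
  );
  const res = await fetch("https://www.reddit.com/api/v1/access_token", {
    method: "POST",
    headers: {
      Authorization: `Basic ${creds}`,
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: "grant_type=client_credentials",
  });
  const { access_token } = await res.json();
  return access_token;
}

export async function scrape() {
  const token = await getRedditToken();
  const res = await fetch("https://oauth.reddit.com/r/hardwareswap/new?limit=25", {
    headers: { Authorization: `Bearer ${token}`, "User-Agent": "val-town-scraper" },
  });
  const listing = await res.json();

  let saved = 0;
  for (const child of listing.data.children) {
    const post = { redditId: child.data.id, title: child.data.title };
    if (await postExists(post.redditId)) continue; // duplicate detection
    if (await savePost(post)) saved++;
  }
  console.log(`Saved ${saved} of ${listing.data.children.length} fetched posts`);
}
```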

scrape-hws / scraper-api.ts (1 match)

@wxw • Updated 1 week ago
      <div class="endpoint">
        <div class="method">GET /stats</div>
        <div>Get scraping statistics and database info</div>
      </div>

scrape-hws / database-schema.sql (1 match)

@wxw • Updated 1 week ago
-- Supabase Database Schema for Reddit Scraper
-- Run this SQL in your Supabase SQL editor to create the posts table

bookmarksDatabase

@s3thi • Updated 3 months ago

sqLiteDatabase (1 file match)

@ideofunk • Updated 6 months ago