Townie/system_prompt.txt (2 matches)
```
├── backend/
│   ├── database/
│   │   ├── migrations.ts  # Schema definitions
│   │   ├── queries.ts     # DB query functions
```
### Database Patterns
- Run migrations on startup, or comment them out for performance
- Change table names when modifying schemas rather than altering them in place
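A minimal sketch of what this rename-instead-of-alter pattern can look like; the table and column names below are illustrative assumptions, not the project's actual schema:

```ts
// Sketch: each schema change bumps a version suffix instead of ALTER TABLE.
const SCHEMA_VERSION = 2;

function versionedTable(base: string, version: number = SCHEMA_VERSION): string {
  return `${base}_v${version}`;
}

const USAGE_TABLE = versionedTable("usage"); // e.g. "usage_v2"

// Run on startup; CREATE TABLE IF NOT EXISTS is idempotent, and the call
// can be commented out once the schema is stable to skip the startup cost.
function createUsageTableSQL(): string {
  return `CREATE TABLE IF NOT EXISTS ${USAGE_TABLE} (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  user_id TEXT NOT NULL,
  tokens INTEGER NOT NULL DEFAULT 0
)`;
}
```

Because the old table is left untouched, a bad migration never corrupts existing data; you cut over by bumping `SCHEMA_VERSION`.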
Townie/send-message.ts (1 match)
```ts
  overLimit,
  startTrackingUsage,
} from "../database/queries.tsx";
import { makeChangeValTypeTool, makeFetchTool, makeTextEditorTool } from "../tools/index.ts";
import fileWithLinesNumbers from "../utils/fileWithLinesNumbers.ts";
```
Townie/queries.tsx (1 match)
```tsx
import { INFERENCE_CALLS_TABLE, USAGE_TABLE } from "./schema.tsx";

// Eventually we'll have a user database,
// but in the meantime, we can cache user info in memory
const userIdCache: { [key: string]: any } = {};
```
Townie/.cursorrules (2 matches)
```
├── backend/
│   ├── database/
│   │   ├── migrations.ts  # Schema definitions
│   │   ├── queries.ts     # DB query functions
```
### Database Patterns
- Run migrations on startup, or comment them out for performance
- Change table names when modifying schemas rather than altering them in place
notion-2-bluesky/README.md (7 matches)
# N2B Scheduler - Notion to Bluesky Auto Poster

An automated scheduler that reads posts from a Notion database and publishes them to Bluesky at scheduled times.

See it in action: https://bsky.app/profile/nucky.bsky.social
## Features

- Schedule Bluesky posts from a Notion database
- Automatic posting via cron job (every 15 minutes)
- Status tracking (Draft → Scheduled → Posted)
## Setup

### 1. Notion Database Setup

Your database needs these fields:
- **Title** (Title field) - Post title/identifier
- **Text** (Rich Text field) - The post content
Set these in your Val Town environment:
- `NOTION_TOKEN` - Your Notion integration token
- `NOTION_DATABASE_ID` - Your database ID
- `BLUESKY_HANDLE` - Your Bluesky handle (e.g., user.bsky.social)
- `BLUESKY_APP_PASSWORD` - Your Bluesky app password
- Read content
- Update content
- Read database structure

## Files
## Usage

1. Create a new row in your Notion database
2. Set the Text field with your post content
3. **Optional**: Upload images to the Media field (supports JPG, PNG, GIF, WebP)
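The 15-minute cron then picks up rows that are due. A sketch of that selection logic, assuming scheduling is driven by a publish-time field plus the Status values above (the field and type names here are illustrative, not the actual Notion schema):

```ts
interface ScheduledPost {
  title: string;
  text: string;
  status: "Draft" | "Scheduled" | "Posted";
  publishAt: Date;
}

// Rows a cron run should publish: Scheduled, and their publish time has passed.
function duePosts(rows: ScheduledPost[], now: Date): ScheduledPost[] {
  return rows.filter((r) => r.status === "Scheduled" && r.publishAt <= now);
}
```

After posting, each published row's status would be flipped to Posted so the next run skips it.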
scrape-hws/reddit-scraper.ts (2 matches)
```ts
/**
 * Check if a post already exists in the database
 */
async postExists(redditId: string): Promise<boolean> {
```
```ts
/**
 * Save a post to the database
 */
async savePost(post: RedditPost): Promise<boolean> {
```
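Around these two methods, duplicate detection reduces to filtering fetched posts against IDs already stored. A minimal sketch (the `RedditPost` shape here is trimmed to what dedup needs and is an assumption):

```ts
interface RedditPost {
  redditId: string;
  title: string;
}

// Keep only posts whose reddit_id is not yet in the database; the caller
// would load existingIds via postExists or a bulk SELECT before saving.
function newPosts(fetched: RedditPost[], existingIds: Set<string>): RedditPost[] {
  return fetched.filter((p) => !existingIds.has(p.redditId));
}
```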
scrape-hws/README.md (7 matches)
# Reddit /r/hardwareswap Scraper

A Val Town application that scrapes posts from Reddit's /r/hardwareswap subreddit and stores them in a Supabase database.

## Features

- Automated scraping of /r/hardwareswap posts using Reddit's official OAuth API
- Stores posts in a Supabase PostgreSQL database
- Duplicate detection to avoid storing the same post twice
- Detailed logging and statistics
1. Create a new project in [Supabase](https://supabase.com)
2. Go to the SQL Editor in your Supabase dashboard
3. Copy and paste the contents of `database-schema.sql` and run it
4. Go to Settings > API to get your project URL and anon key
- Every 15 minutes: `*/15 * * * *`

## Database Schema

The `posts` table contains:
- `title`: Post title
- `created_at`: When the post was created on Reddit
- `updated_at`: When the record was last updated in our database
## Usage

1. Authenticate with Reddit using OAuth client credentials
2. Fetch the latest 25 posts from /r/hardwareswap
3. Check for duplicates in the database
4. Save new posts to Supabase
5. Log statistics about the scraping session
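Step 1 amounts to a client-credentials token request against Reddit's token endpoint. Building the request is pure and sketched below; sending it is a plain `fetch`. Treat this as an illustration, not the scraper's actual code:

```ts
// Build (but don't send) the OAuth token request for an app-only grant.
// Reddit's token endpoint takes HTTP Basic auth with the app's id/secret.
function buildTokenRequest(clientId: string, clientSecret: string) {
  return {
    url: "https://www.reddit.com/api/v1/access_token",
    method: "POST" as const,
    headers: {
      Authorization: `Basic ${btoa(`${clientId}:${clientSecret}`)}`,
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: "grant_type=client_credentials",
  };
}

// const req = buildTokenRequest(id, secret);
// const res = await fetch(req.url, { method: req.method, headers: req.headers, body: req.body });
```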
1. **Missing environment variables**: Ensure all required Reddit and Supabase credentials are set
2. **Database connection errors**: Verify your Supabase credentials and that the table exists
3. **Reddit OAuth errors**: Check your Reddit app credentials and ensure the app type is "script"
4. **Rate limiting**: Reddit may temporarily block requests if rate limits are exceeded
scrape-hws/scraper-api.ts (1 match)
```html
<div class="endpoint">
  <div class="method">GET /stats</div>
  <div>Get scraping statistics and database info</div>
</div>
```
scrape-hws/database-schema.sql

```sql
-- Supabase Database Schema for Reddit Scraper
-- Run this SQL in your Supabase SQL editor to create the posts table
```