Sub-path 6 of 6
Final Mastery Project
Ship the one project that proves it.
Pick a multi-source data product, build it end-to-end, deploy it, and document it. Five suggested capstones: price intelligence, jobs analytics, real estate, public data, or a SERP rank tracker.
~4 weeks part-time · 1 project
Lessons
- 6.1 (expert)
Pick One Project, Ship It Publicly
How the capstone works, what counts as done, and how to pick between the five project options without overthinking it.
- 6.2 (expert)
Project A: Multi-Source Price Intelligence Platform
Track prices daily across Catalog108, two external e-commerce sites, and Google Shopping. Output: a normalised price-history database plus a public dashboard.
Lab: /deals/live
- 6.3 (expert)
Project B: Job Market Analytics Service
Aggregate jobs across Catalog108 /jobs, five external boards, and Google Jobs (via SERP API). Dedupe, normalise, and ship a dashboard that surfaces market trends.
Lab: /jobs
- 6.4 (expert)
Project C: Real-Estate Intelligence Dashboard
Aggregate property listings from 10+ public sources, normalise the messy schema, and surface market insights for one city or region.
- 6.5 (expert)
Project D: Open Public-Data Aggregator
Pick an underserved public dataset and build the canonical free version. The most flexible capstone option, and the one most likely to outlive the curriculum.
- 6.6 (expert)
Project E: SERP Rank-Tracking SaaS
Build a small SaaS that tracks keyword rankings across Google, Bing, and Brave, including AI Overviews and competitor analysis. The most commercial-feeling capstone.
Lab: /search
- 6.7 (expert)
Shipping, and What Comes Next
How to publish your capstone, and what to do in the first 90 days after you ship so the project compounds into a job, a freelance pipeline, or a paying SaaS.
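The dedupe step in Project B is the one piece most people underestimate, so here is a minimal sketch. It collapses postings scraped from multiple boards onto one normalised key; the field names ("title", "company") are assumptions about your scraper's output, not a prescribed schema.

```python
# Sketch of cross-board job deduplication, assuming each posting is a
# dict with at least "title" and "company" keys.

def norm(s: str) -> str:
    """Lowercase and collapse whitespace so near-identical strings match."""
    return " ".join(s.lower().split())

def dedupe(postings: list[dict]) -> list[dict]:
    """Keep the first posting seen for each normalised (title, company) key."""
    seen: set[tuple[str, str]] = set()
    unique = []
    for p in postings:
        key = (norm(p["title"]), norm(p["company"]))
        if key not in seen:
            seen.add(key)
            unique.append(p)
    return unique
```

A real pipeline will want a fuzzier key (stemmed titles, location buckets), but an exact normalised key like this catches the bulk of duplicates and is a sane first pass.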
Every lesson has a hands-on lab target on Catalog108, our first-party scraping practice sandbox. Each lab page has a /grade endpoint that returns pass/fail on your scraper output.
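Submitting to a lab's /grade endpoint could look like the sketch below. The base URL, JSON payload shape, and response schema are assumptions for illustration; check the lab page for the real contract.

```python
# Hypothetical client for a Catalog108 lab's /grade endpoint.
import json
import urllib.request

BASE = "https://catalog108.example"  # assumed host, not the real one

def grade_url(lab_path: str) -> str:
    """Build the grading URL for a lab page path like '/jobs'."""
    return f"{BASE}{lab_path}/grade"

def submit(lab_path: str, rows: list[dict]) -> dict:
    """POST scraped rows as JSON and return the parsed pass/fail verdict."""
    req = urllib.request.Request(
        grade_url(lab_path),
        data=json.dumps({"rows": rows}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Wiring this into your scraper's CI means every push re-grades your output, which is the habit the capstone is trying to build.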