- Forked from coreyhaines31/marketingskills v1.1.0 (MIT license)
- Removed 4 SaaS-only skills (churn-prevention, paywall-upgrade-cro, onboarding-cro, signup-flow-cro)
- Reworked 2 skills (popup-cro → hvac-estimate-popups, revops → hvac-lead-ops)
- Adapted all 28 retained skills with HVAC industry context and Compendium integration
- Created 10 new HVAC-specific skills:
  - hvac-content-from-data (flagship DB integration)
  - hvac-seasonal-campaign (demand cycle marketing)
  - hvac-review-management (GBP review strategy)
  - hvac-video-repurpose (long-form → social)
  - hvac-technical-content (audience-calibrated writing)
  - hvac-brand-voice (trade authenticity guide)
  - hvac-contractor-website-audit (discovery & analysis)
  - hvac-contractor-website-package (marketing package assembly)
  - hvac-compliance-claims (EPA/rebate/safety claim checking)
  - hvac-content-qc (fact-check & citation gate)
- Renamed product-marketing-context → hvac-marketing-context (global)
- Created COMPENDIUM_INTEGRATION.md (shared integration contract)
- Added Compendium wrapper tools (search, scrape, classify)
- Added compendium capability tags to YAML frontmatter
- Updated README, AGENTS.md, CLAUDE.md, VERSIONS.md, marketplace.json
- All 38 skills pass validate-skills.sh
- Zero dangling references to removed/renamed skills

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
#!/bin/bash
# Compendium Scrape Router wrapper
# Usage: compendium-scrape.sh <url> [--metadata]

set -euo pipefail

SCRAPE_ROUTER="${COMPENDIUM_SCRAPE_URL:-http://192.168.10.249:30098}"
URL="${1:?Usage: compendium-scrape.sh <url> [--metadata]}"
EXTRACT_METADATA="false"

if [[ "${2:-}" == "--metadata" ]]; then
  EXTRACT_METADATA="true"
fi

# Health check
if ! curl -sf "${SCRAPE_ROUTER}/health" > /dev/null 2>&1; then
  echo "ERROR: Scrape Router unavailable at ${SCRAPE_ROUTER}" >&2
  echo "Fallback: Use Claude Code WebFetch instead" >&2
  exit 1
fi

curl -sf -X POST "${SCRAPE_ROUTER}/scrape" \
  -H "Content-Type: application/json" \
  -d "{\"url\": \"${URL}\", \"extract_text\": true, \"extract_metadata\": ${EXTRACT_METADATA}}" \
  | python3 -m json.tool
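One fragile spot in the wrapper: the `-d` payload is built by interpolating `${URL}` directly into a JSON string, which produces invalid JSON if the URL contains a double quote or backslash. A minimal sketch of a safer construction, reusing the script's existing python3 dependency (the example URL and variable names are illustrative, not from the repo):

```shell
#!/bin/bash
set -euo pipefail

# Hypothetical URL containing a double quote, to show the escaping problem.
URL='https://example.com/page?q="hvac"'

# Let json.dumps handle all escaping instead of interpolating into a string.
PAYLOAD=$(python3 -c 'import json, sys; print(json.dumps(
    {"url": sys.argv[1], "extract_text": True, "extract_metadata": False}))' "$URL")

echo "$PAYLOAD"
```

The resulting `$PAYLOAD` could then be passed to curl as `-d "$PAYLOAD"` in place of the hand-built string.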