Agent Skills: Wget URL Reader

Fetch data from URLs. Use when asked to download content, fetch remote files, or read web data.

Category: Uncategorized
ID: beshkenadze/claude-skills-marketplace/wget-reader

Install this agent skill to your local environment:

pnpm dlx add-skill https://github.com/beshkenadze/claude-skills-marketplace/tree/HEAD/skills/utility/wget-reader

Skill Files


skills/utility/wget-reader/SKILL.md

Skill Metadata

Name
wget-reader
Description
Fetch data from URLs. Use when asked to download content, fetch remote files, or read web data.

Wget URL Reader

Overview

Fetches content from URLs using the wget command-line tool. Supports downloading files, reading web pages, and retrieving API responses.

Instructions

  1. When user provides a URL to read or fetch:

    • Validate the URL format
    • Use wget with appropriate flags based on content type
  2. For reading content to stdout (display):

    wget -qO- "<URL>"
    
  3. For downloading files:

    wget -O "<filename>" "<URL>"
    
  4. For JSON API responses:

    wget -qO- --header="Accept: application/json" "<URL>"
    
  5. Common wget flags:

    • -q: Quiet mode (no progress output)
    • -O-: Output to stdout
    • -O <file>: Output to specific file
    • --header: Add custom HTTP header
    • --timeout=<seconds>: Set timeout
    • --tries=<n>: Number of retries
    • --user-agent=<agent>: Set user agent
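The numbered steps above can be sketched as a small wrapper that validates the URL's format before calling wget. This is a minimal illustration only; `fetch_url` is a hypothetical helper name, not part of the skill.

```shell
#!/bin/sh
# Sketch: validate the URL shape, then fetch quietly with a timeout.
fetch_url() {
  url="$1"
  # Basic format check: require an http(s) scheme before invoking wget
  case "$url" in
    http://*|https://*) ;;
    *) echo "invalid URL: $url" >&2; return 1 ;;
  esac
  # Quote the URL and use quiet mode, a timeout, and limited retries
  wget -qO- --timeout=15 --tries=2 "$url"
}
```

Usage: `fetch_url "https://example.com"` prints the page body to stdout, while a malformed argument fails fast without a network request.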

Examples

Example: Read webpage content

Input: "Read the content from https://example.com"

Command:

wget -qO- "https://example.com"

Example: Download a file

Input: "Download the file from https://example.com/data.json"

Command:

wget -O "data.json" "https://example.com/data.json"

Example: Fetch API with headers

Input: "Fetch JSON from https://api.example.com/data"

Command:

wget -qO- --header="Accept: application/json" "https://api.example.com/data"

Example: Download with timeout and retries

Input: "Download with 30 second timeout"

Command:

wget --timeout=30 --tries=3 -O "output.txt" "<URL>"

Guidelines

Do

  • Always quote URLs to handle special characters
  • Use -q flag to suppress progress bars in scripts
  • Add --timeout for unreliable endpoints
  • Respect robots.txt and rate limits

Don't

  • Use --no-check-certificate unless necessary
  • Fetch URLs without validating format first
  • Ignore HTTP error codes in responses
  • Store credentials in command history
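The last point, not ignoring HTTP errors, can be honored through wget's exit status: wget exits 0 on success, 4 on a network failure, and 8 when the server returns an error such as 404. A minimal sketch, where `fetch_or_report` is a hypothetical helper name used only for illustration:

```shell
#!/bin/sh
# Sketch: treat wget's exit status as the error signal instead of ignoring it.
fetch_or_report() {
  url="$1"
  if body=$(wget -qO- --timeout=10 --tries=1 "$url"); then
    printf '%s\n' "$body"
  else
    status=$?   # wget's exit code: 4 = network failure, 8 = server error
    echo "wget failed (exit $status) for $url" >&2
    return "$status"
  fi
}
```

Propagating the exit code lets calling scripts distinguish a dead network from a server-side error rather than silently accepting an empty response.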