26 Unix tools. One binary. Zero dependencies.
vrk mcp - expose all 26 tools to any AI agent
curl vrk.sh/install.sh | sh - ready in 5 seconds

The missing coreutils
for the agent era.

zero silent failures  •  composable by default  •  one static CLI binary

vrk grab https://api.example.com/docs \
  | vrk tok --check 8000 \
  | vrk prompt --system "extract the breaking changes" \
  | vrk validate --schema changes.json \
  | vrk kv set changelog
curl -fsSL vrk.sh/install.sh | sh

try it now - no API key needed

~ vrk
# Count tokens in a document
cat README.md | vrk tok
# → 342

# Mask secrets before they leak
echo "key: sk-ant-api-abc123xyz" | vrk mask
# → key: [REDACTED]

before

import anthropic, tiktoken, json, re, jsonschema

client = anthropic.Anthropic()
with open("article.md") as f:
    text = f.read()

# Count tokens - abort if too long
enc = tiktoken.encoding_for_model("gpt-4")
if len(enc.encode(text)) > 8000:
    raise ValueError("over token budget")

# Redact secrets before sending
text = re.sub(r"sk-[a-zA-Z0-9]{48}", "[REDACTED]", text)

# Call the API
resp = client.messages.create(
    model="claude-sonnet-4-6", max_tokens=1024,
    messages=[{"role": "user", "content": text}]
)
result = resp.content[0].text

# Validate the output
schema = json.load(open("schema.json"))
jsonschema.validate(json.loads(result), schema)

after

~ vrk
cat article.md \
  | vrk tok --check 8000 \
  | vrk mask \
  | vrk prompt --system 'Summarize' \
  | vrk validate --schema schema.json

every vrk tool: stdin in, stdout out, exit 0/1/2

over token limit?   exit 1   ← pipeline stops
bad JSON schema?    exit 1   ← nothing downstream runs
wrong flags?        exit 2   ← usage error, never silent

no more pipelines that look like they worked.
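Because every tool exits nonzero on failure, bash's `set -o pipefail` turns any failing stage into a failure of the whole pipeline. A minimal sketch of that behavior, using `false` as a stand-in for a vrk stage that exits 1 (so it runs without vrk installed):

```shell
set -o pipefail              # bash/zsh: pipeline fails if any stage fails
echo "input" | false | cat   # `false` stands in for a stage exiting 1
echo "pipeline exit: $?"
# → pipeline exit: 1
```

Without `pipefail`, only the last stage's exit status counts, and a mid-pipeline failure would pass silently.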

batch jobs at scale

~ vrk
find docs/ -name '*.md' -exec cat {} + \
  | vrk chunk --size 4000 \
  | vrk throttle --rate 60/m \
  | vrk prompt --field text --model gpt-4o --json

chunked. paced. structured output. safe to rerun.

Install

brew tap vrksh/vrksh
brew install vrk
go install github.com/vrksh/vrksh@latest
curl -sSL https://vrk.sh/install.sh | sh

Core

vrk tok - start here

Count tokens. Gate pipelines before they fail.

cat system-prompt.txt | vrk tok --check 8000
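Since `--check` signals an over-budget input with exit 1, a script can branch on the gate directly. A sketch using a hypothetical byte-count stand-in for `vrk tok --check` (bytes instead of tokens, so it runs without vrk installed):

```shell
# Hypothetical stand-in: succeed only if stdin is at most N bytes
# (vrk tok --check N applies the same idea to token counts).
within_budget() { [ "$(wc -c | tr -d ' ')" -le "$1" ]; }

if echo "short prompt" | within_budget 8000; then
  echo "under budget: safe to send"
fi
# → under budget: safe to send
```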
vrk prompt

Send a prompt to an LLM - Anthropic/OpenAI, with --schema, --retry, --explain.

cat article.md | vrk prompt --system 'Summarize the key findings in 3 bullet points'
vrk chunk

Token-aware text splitter - emits JSONL chunks within a token budget.

cat contract.pdf.txt | vrk chunk --size 4000 --overlap 200
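The JSONL shape makes chunks easy to fan out with ordinary line tools: one object per line. A sketch over a hypothetical two-chunk stream (the `text` field name is an assumption about chunk's output, matching the `--field text` flag used above):

```shell
# Hypothetical two-chunk JSONL stream, one chunk object per line
printf '%s\n' '{"text":"part one"}' '{"text":"part two"}' \
  | while IFS= read -r chunk; do
      printf 'chunk: %s\n' "$chunk"
    done
# → chunk: {"text":"part one"}
# → chunk: {"text":"part two"}
```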

Pipeline

+ 17 utilities: jwt, epoch, uuid, and more →

Works with Claude Code and Cursor via MCP. See agent integration →