
Tags: developer, coding · User risk: low

Go HTTP Benchmarking Tool Builder


PROMPT

Create a high-performance HTTP benchmarking tool in Go. Implement concurrent request generation with configurable thread count. Add detailed statistics including latency, throughput, and error rates. Include support for HTTP/1.1, HTTP/2, and HTTP/3. Implement custom header and cookie management. Add request templating for dynamic content. Include response validation with regex and status code checking. Implement TLS configuration with certificate validation options. Add load profile configuration with ramp-up and steady-state phases. Include detailed reporting with percentiles and histograms. Implement distributed testing mode for high-load scenarios.

EXPECTED OUTPUT

Format
code

SUCCESS CRITERIA

  • Implement concurrent request generation with configurable thread count
  • Add detailed statistics including latency, throughput, and error rates
  • Include support for HTTP/1.1, HTTP/2, and HTTP/3
  • Implement custom header and cookie management
  • Add request templating for dynamic content
  • Include response validation with regex and status code checking
  • Implement TLS configuration with certificate validation options
  • Add load profile configuration with ramp-up and steady-state phases
  • Include detailed reporting with percentiles and histograms
  • Implement distributed testing mode for high-load scenarios

FAILURE MODES

  • May implement only basic features, omitting advanced ones such as HTTP/3 or distributed mode
  • Could fail to achieve high performance due to lack of optimization guidance
  • Might overlook concurrency safety or resource management
  • May produce incomplete code without full feature integration

CAVEATS

Missing context
  • Go version
  • Configuration input format (CLI, YAML, etc.)
  • Output report format (console, JSON, CSV, HTML?)
  • Usage examples
  • Performance benchmarks or scale targets
Ambiguities
  • Unclear how configurations (e.g., thread count, headers) are provided (CLI flags, config file?)
  • 'Request templating for dynamic content' lacks syntax or variable definition
  • Distributed testing mode mechanism unspecified (e.g., master-worker, coordination protocol)

QUALITY

OVERALL
0.70
CLARITY
0.90
SPECIFICITY
0.75
REUSABILITY
0.30
COMPLETENESS
0.65

IMPROVEMENT SUGGESTIONS

  • Add 'Use a CLI interface with flags for all configurations, supporting YAML config files for complex setups.'
  • Specify 'Request templating uses Go templates with variables like {{.timestamp}}, {{.counter}}.'
  • Detail 'Distributed mode: master node coordinates via gRPC with worker nodes reporting metrics.'
  • Include 'Output reports as JSON and console with ASCII histograms; export to Prometheus format.'
  • Provide sample command-line usage and config file example.

USAGE

Copy the prompt above and paste it into your AI of choice — Claude, ChatGPT, Gemini, or anywhere else you're working. Replace any placeholder sections with your own context, then ask for the output.
