# Public Exposure

*100 points total*
Identifies sensitive information leaking through response headers, error pages, robots.txt, and page source — internal IPs, stack traces, source maps, and debug artifacts.
## Checks
| Check | Weight | What it measures |
|---|---|---|
| Information Leakage Headers (`exposure_information_leakage_headers`) | 30 pts | Scans response headers for internal IP addresses, hostnames, and backend identifiers. |
| Response Content Leakage (`exposure_response_content_leakage`) | 30 pts | Analyzes page source for source maps, internal URLs, commented-out code, and debug artifacts. |
| Error Handling Quality (`exposure_error_handling_quality`) | 25 pts | Probes for custom error pages versus default server errors, stack traces, and database errors. |
| Robots.txt Analysis (`exposure_robots_txt_analysis`) | 15 pts | Reviews robots.txt for entries that reveal sensitive infrastructure paths. |
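The four weighted checks sum to the category's 100 points. As a rough illustration of how per-check results could roll up into a category score, here is a minimal sketch; the check IDs and weights come from the table above, but the partial-credit multipliers for warn are an assumption, not RedScore's actual scoring:

```python
# Hypothetical roll-up of per-check results into the 100-point category score.
# Check IDs and weights come from the table above; the pass/warn/fail
# credit multipliers are assumed for illustration only.
WEIGHTS = {
    "exposure_information_leakage_headers": 30,
    "exposure_response_content_leakage": 30,
    "exposure_error_handling_quality": 25,
    "exposure_robots_txt_analysis": 15,
}
CREDIT = {"pass": 1.0, "warn": 0.5, "fail": 0.0}  # assumed partial credit

def category_score(results: dict) -> float:
    """results maps check id -> 'pass' | 'warn' | 'fail'."""
    return sum(WEIGHTS[check] * CREDIT[status] for check, status in results.items())

score = category_score({
    "exposure_information_leakage_headers": "pass",   # 30
    "exposure_response_content_leakage": "warn",      # 15
    "exposure_error_handling_quality": "pass",        # 25
    "exposure_robots_txt_analysis": "fail",           # 0
})
# score == 70.0
```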
## Pass / Warn / Fail Logic

- **Information Leakage Headers**: pass if no internal info in headers; warn on hostname exposure; fail on internal IP exposure.
- **Response Content Leakage**: pass if clean; warn on minor leaks; fail on source maps or internal references.
- **Error Handling Quality**: pass if custom error pages; warn on default pages; fail on stack traces or DB errors.
- **Robots.txt Analysis**: pass if minimal/clean; warn on sensitive paths; fail on extensive disclosure.
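To make the ladder concrete, the header check's pass/warn/fail logic could be expressed as follows. This is an illustrative sketch, not RedScore's code; the header list is a small assumed sample:

```python
import ipaddress
import re

# Assumed sample of headers that commonly carry backend hostnames.
HOSTNAME_HEADERS = ("x-backend-server", "x-upstream", "via", "x-served-by")

def classify_header_leakage(headers: dict) -> str:
    """Return 'pass', 'warn', or 'fail' for the header-leakage check (sketch)."""
    found_hostname = False
    for name, value in headers.items():
        # Internal (private-range) IP anywhere in a header value -> fail.
        for token in re.findall(r"[\w.\-]+", value):
            try:
                ip = ipaddress.ip_address(token)
            except ValueError:
                continue
            if ip.is_private:
                return "fail"
        # Backend/proxy identifier header present -> hostname exposure.
        if name.lower() in HOSTNAME_HEADERS and value:
            found_hostname = True
    return "warn" if found_hostname else "pass"
```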
## Findings & How to Fix Them
These are the specific findings RedScore may report for this category, along with remediation guidance.
`internal_ip_in_headers`: Configure your reverse proxy or load balancer to strip internal addressing headers (`X-Backend-Server`, `X-Upstream`, `Via`) before responses reach the internet.
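If the proxy layer can't be changed, the same stripping can be done in the application itself. A minimal WSGI middleware sketch, using the header names from the finding above (the middleware is illustrative, not a required approach):

```python
# Illustrative WSGI middleware that drops internal addressing headers
# before a response leaves the application.
STRIP = {"x-backend-server", "x-upstream", "via"}

class StripInternalHeaders:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        def filtered_start(status, headers, exc_info=None):
            safe = [(n, v) for n, v in headers if n.lower() not in STRIP]
            return start_response(status, safe, exc_info)
        return self.app(environ, filtered_start)
```

Stripping at the edge (proxy/load balancer) is still preferable, since it also covers responses served by components other than this application.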
`stack_trace_exposed`: Ensure all application error handlers catch exceptions and return generic error pages in production. Disable debug mode and verbose error output.

`database_error_exposed`: Wrap all database operations in error handlers that log details server-side and return generic messages to the client.
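One way to verify fixes for both findings above is to scan error responses for tell-tale signatures. A hedged sketch; the signature list is a small assumed sample, not RedScore's actual probe set:

```python
import re

# Small assumed sample of stack-trace and database-error signatures.
LEAK_PATTERNS = {
    "stack_trace": re.compile(
        r"Traceback \(most recent call last\)"      # Python traceback
        r"|at [\w.$]+\([\w.]+\.java:\d+\)"          # Java stack frame
    ),
    "database_error": re.compile(
        r"SQLSTATE\["                               # PDO/ANSI SQL error
        r"|ORA-\d{5}"                               # Oracle error code
        r"|You have an error in your SQL syntax"    # MySQL error text
    ),
}

def scan_error_body(body: str) -> list[str]:
    """Return the leak categories whose signatures appear in the response body."""
    return [kind for kind, pat in LEAK_PATTERNS.items() if pat.search(body)]
```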
`internal_hostname_in_headers`: Strip or sanitize proxy and backend headers in your edge/load balancer configuration.

`robots_extensive_disclosure`: Significantly reduce your robots.txt footprint. Sensitive resources should be protected by authentication and network controls, not just excluded from search indexing.
`source_maps_exposed`: Remove source map references from production builds. Configure your build tool (webpack, Vite, esbuild) to either skip source map generation for production or restrict source map access to authenticated debugging sessions.
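A `sourceMappingURL` reference at the end of a shipped bundle is straightforward to detect before deploy. A minimal sketch of that check; the regex is an assumption covering the two common comment forms:

```python
import re

# Matches //# sourceMappingURL=... (JS) and /*# sourceMappingURL=... */ (CSS).
SOURCE_MAP_RE = re.compile(r"(?://|/\*)#\s*sourceMappingURL=([^\s*]+)")

def find_source_maps(bundle_text: str) -> list[str]:
    """Return source map URLs referenced from a JS/CSS bundle."""
    return SOURCE_MAP_RE.findall(bundle_text)
```

Running this over build output in CI catches a map reference that slipped past the build configuration.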
`internal_refs_in_source`: Audit page templates and application configuration for hardcoded internal addresses. Use environment-specific configuration to ensure production pages reference only public endpoints.

`insecure_form_action`: Update all form action URLs to use HTTPS. Ensure your application generates HTTPS URLs in all contexts when running behind TLS termination.

`default_error_page`: Configure custom error pages that don't reveal server identity or internal details.
`robots_sensitive_paths`: Review your robots.txt for entries that reveal sensitive infrastructure. Consider whether disallowed paths need to exist at all, or whether they should be protected by authentication rather than obscurity.
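Both robots.txt findings can be approximated with a simple review pass over the file. A sketch, with an assumed (not exhaustive) list of sensitive path fragments and an assumed threshold for what counts as "extensive":

```python
# Assumed sample of path fragments that suggest infrastructure disclosure,
# and an assumed threshold for flagging a robots.txt as "extensive".
SENSITIVE_FRAGMENTS = ("admin", "backup", "internal", ".git", "staging", "db")
EXTENSIVE_THRESHOLD = 20

def review_robots(robots_txt: str) -> dict:
    """Summarize Disallow entries and flag ones hinting at sensitive paths."""
    disallows = [
        line.split(":", 1)[1].strip()
        for line in robots_txt.splitlines()
        if line.lower().startswith("disallow:")
    ]
    sensitive = [
        path for path in disallows
        if any(frag in path.lower() for frag in SENSITIVE_FRAGMENTS)
    ]
    return {
        "disallow_count": len(disallows),
        "sensitive_paths": sensitive,
        "extensive": len(disallows) >= EXTENSIVE_THRESHOLD,
    }
```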
`internal_urls_in_comments`: Strip HTML comments from production builds using your build tool's minification settings, or audit comments for internal references before deployment.

`commented_code_blocks`: Remove commented-out code from production templates. Use version control to preserve old code instead of commenting it out.
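If comments can't be stripped wholesale, the comment-related findings above can be caught pre-deploy with a simple audit pass over rendered pages. A sketch; the internal-reference pattern (private IP ranges plus `.internal`/`.local` hosts) is an assumption:

```python
import re

COMMENT_RE = re.compile(r"<!--(.*?)-->", re.DOTALL)
# Assumed pattern for internal references: common private IP prefixes
# and .internal / .local hostnames.
INTERNAL_RE = re.compile(r"\b(?:10|192\.168)\.\d+\.\d+|\.(?:internal|local)\b")

def audit_comments(html: str) -> list[str]:
    """Return HTML comments that mention internal hosts or addresses."""
    return [c.strip() for c in COMMENT_RE.findall(html)
            if INTERNAL_RE.search(c)]
```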
`no_error_response_observed_in_probes`: No error responses were observed during normal probing, so error-page hygiene was not directly tested. This is not the same as verifying custom error handling under failure conditions.