Description
Multiple nested directories hide the real flag page. Mirror the site or manually enumerate `/secret/hidden/superhidden/` until you discover the correct HTML.
Setup
Recursively download the site (`wget -m http://saturn.picoctf.net:53932/`) or probe the directories manually.
Follow the hints (`/secret`, `/secret/hidden`, `/secret/hidden/superhidden`) until you land on the flag page.
`wget -m http://saturn.picoctf.net:53932/`
`curl -s http://saturn.picoctf.net:53932/secret/hidden/superhidden/ | grep -oE "picoCTF\{.*?\}" --color=none`
Solution
- Step 1: Mirror or crawl. Running `wget -m` pulls down every referenced directory so you can explore offline. Alternatively, use `curl` to fetch each nested folder live.
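The manual approach can be sketched as a small script that builds each progressively deeper URL from the hints. The host and port below come from this challenge instance and will differ per deployment:

```python
# Build the candidate URLs for the nested hint directories.
# Host/port are from this challenge instance; adjust for yours.
BASE = "http://saturn.picoctf.net:53932"
HINTS = ["secret", "hidden", "superhidden"]

def candidate_urls(base, segments):
    """Return each progressively deeper directory URL."""
    urls, path = [], ""
    for seg in segments:
        path += "/" + seg
        urls.append(base + path + "/")
    return urls

for url in candidate_urls(BASE, HINTS):
    print(url)  # fetch each with curl/requests and inspect the HTML
```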
Learn more
Directory enumeration is the process of discovering hidden or unlisted paths on a web server. Servers often host files in directories that aren't linked from any page - they're accessible if you know the path, but invisible to normal browsing. This is called security through obscurity, and it's not a valid security control.
`wget -m` (mirror mode) is equivalent to `wget -r -N -l inf --no-remove-listing`: it recursively downloads everything, preserves timestamps, and follows all links it finds. This is ideal for capturing the full structure of a static site.

In professional web application testing, directory brute-forcing with tools like gobuster, ffuf, or dirbuster uses wordlists containing thousands of common path names (`admin`, `backup`, `secret`, `api`, etc.) to systematically probe for unlisted endpoints.

Another quick recon step is checking `robots.txt` (e.g., `http://target.com/robots.txt`) and `sitemap.xml`. robots.txt tells search engine crawlers which paths not to index, and ironically the `Disallow` entries are often a direct map of the most interesting directories on the site. Real-world examples include admin panels, staging environments, and internal API endpoints accidentally disclosed in production robots.txt files.

Web archive services like the Wayback Machine (web.archive.org) are also valuable: even if a developer removed a sensitive directory, old snapshots may reveal its existence and contents. During bug bounty reconnaissance, combining current directory brute-forcing with historical archive inspection significantly broadens the attack surface you can identify.
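A toy version of that wordlist probing, with the HTTP call abstracted behind a callable so the loop itself is clear. The fetcher and wordlist here are placeholders, not a real tool's behavior:

```python
# Sketch of wordlist-based directory brute-forcing.
# `fetch_status` stands in for an HTTP request returning a status code
# (e.g. via urllib.request); real tools probe thousands of words concurrently.
def brute_force(fetch_status, base, wordlist):
    """Return the wordlist entries the server answers with 200 OK."""
    return [w for w in wordlist if fetch_status(f"{base}/{w}/") == 200]

# Demo with a fake server that only serves /secret/:
fake = lambda url: 200 if url.endswith("/secret/") else 404
print(brute_force(fake, "http://target", ["admin", "backup", "secret", "api"]))
# ['secret']
```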
- Step 2: Extract the flag. Once you reach `/secret/hidden/superhidden/`, grep the returned HTML for `picoCTF{...}`.
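The grep step can equally be done in Python; a minimal sketch (the sample page body below is made up, not the real challenge page):

```python
import re

def extract_flag(html):
    """Return the first picoCTF{...} token in a page body, or None."""
    m = re.search(r"picoCTF\{.*?\}", html)
    return m.group(0) if m else None

# Hypothetical example body:
print(extract_flag("<h1>Flag: picoCTF{example_flag}</h1>"))  # picoCTF{example_flag}
```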
Learn more
The layered nesting (`/secret/hidden/superhidden/`) illustrates how developers sometimes try to hide pages by using obscure directory names rather than implementing proper authentication. Each layer provides no additional security: if you know the path exists, HTTP will serve it to anyone.

HTTP directory listing is another related vulnerability: when a web server is configured to show the contents of a directory (because there's no `index.html`), it automatically exposes all files in that path to any visitor. Always check for open directory listings during web recon - they're a fast way to discover interesting files.

Proper access control means requiring authentication (login, API key, session token) before serving sensitive resources - not relying on paths being unknown. Authenticated endpoints return 401 or 403 to unauthenticated requests regardless of whether the path is guessed or linked.
When checking for directory listing, look for the telltale "Index of /" title in the page response. A quick `curl -s http://target/ | grep -i "index of"` identifies open listings automatically. Automated tools like `nikto` also flag open directory listings as part of their standard scan output.

This challenge is a good reminder that the OWASP category Security Misconfiguration (A05 in the OWASP Top 10 2021) covers a wide range of issues including open directory listings, unnecessary features enabled, default credentials, and verbose error messages. Misconfiguration is consistently one of the most prevalent web vulnerabilities because it requires active hardening - the insecure state is often the out-of-the-box default.
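That "Index of /" check is easy to automate. A small heuristic sketch; the sample HTML mimics an Apache/nginx autoindex page and is not from the challenge server:

```python
def looks_like_open_listing(html):
    """Heuristic: Apache/nginx autoindex pages carry an 'Index of /...' title."""
    return "index of /" in html.lower()

# Typical autoindex fragment vs. a normal page:
print(looks_like_open_listing("<title>Index of /secret</title>"))  # True
print(looks_like_open_listing("<title>Welcome home</title>"))      # False
```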
Flag
picoCTF{succ3ss_@h3n1c@10n_51b2...}
robots.txt or site mirroring often reveals “hidden” directories with ease.