Humanoid: Traboda Web CTF Write-up

Aravind Rajesh
3 min read · Jun 29, 2021

CHALLENGE DESCRIPTION

Author has some secret data Hidden without getting exposed to public. Can you dig it out?

HINT
Login page

We are given a hint “REP”…

Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users.

In practice, robots.txt files indicate whether certain user agents (web-crawling software) can or cannot crawl parts of a website. These crawl instructions are specified by “disallowing” or “allowing” the behavior of certain (or all) user agents.
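For instance, a minimal robots.txt that blocks every crawler from one folder might look like this (an illustrative example only, not the challenge's actual file):

```
User-agent: *
Disallow: /private/
```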

To the challenge URL, we append /robots.txt.

Note: the challenge URL may vary from instance to instance, so make the following changes to your own given challenge URL.
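If you prefer doing this from a script instead of the browser, a minimal sketch looks like the following; the CHALLENGE_URL value is a placeholder for your own instance, not the real address.

```python
import requests

# Placeholder: substitute the challenge URL of your own instance.
CHALLENGE_URL = "http://your-challenge-instance"

# Request the robots.txt file that sits at the web root.
response = requests.get(f"{CHALLENGE_URL}/robots.txt")
print(response.status_code)
print(response.text)  # the Disallow/Allow rules show up here
```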

Disallow: The command used to tell a user agent not to crawl a particular URL. Only one “Disallow:” line is allowed for each URL.

Allow (Only applicable for Googlebot): The command to tell Googlebot it can access a page or subfolder even though its parent page or subfolder may be disallowed.
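As an illustration (again, not the challenge's actual file), a parent folder can be disallowed while Googlebot is still allowed into a single page inside it:

```
User-agent: Googlebot
Disallow: /private/
Allow: /private/public-page.html
```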

Here, crawlers are told not to crawl the page /starbenboorbenbarben.html.

Maybe our flag is inside this HTML file, so let's look into it. That needs only a small change to the challenge URL: append /starbenboorbenbarben.html to it.
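The same step as a quick script, continuing the sketch above; CHALLENGE_URL is still a placeholder, and the flag{...} pattern in the regex is only an assumption about the flag format, so adjust it to whatever the platform actually uses.

```python
import re
import requests

# Placeholder: use your own challenge instance.
CHALLENGE_URL = "http://your-challenge-instance"

# Fetch the page that robots.txt asked crawlers to stay away from.
response = requests.get(f"{CHALLENGE_URL}/starbenboorbenbarben.html")
print(response.text)

# Grep for something flag-shaped (flag format is assumed here).
match = re.search(r"flag\{[^}]+\}", response.text, re.IGNORECASE)
if match:
    print("Possible flag:", match.group(0))
```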

Doing this, we get our 🏁.

[Screenshot: the challenge flag]

About Traboda…

Traboda is an end-to-end cybersecurity learning platform with more than 400 CTF challenges spanning various categories of cybersecurity. Level up your skills through an immersive, gamified, hands-on learning experience.

