HTB Academy: Network Enumeration with NMAP

I’d really appreciate a nudge with the following question:

Section:
Nmap Scripting Engine

Question:
“Use NSE and its scripts to find the flag that one of the services contain and submit it as the answer”

Hint:
Web servers are among the most attacked services because they are made accessible to users and present a high attack potential.

What I’ve done:
Well, I’ve enumerated both ports 80 and 31337, running the ‘default’ scripts as well as --script with each of the other categories (discovery, version, vuln, etc.), and none of them turns up the flag…

…I’d really appreciate a pointer if at all possible.


I had issues connecting to the machine the other night because the VM wouldn’t load (I got a red-box error from HTB). Now some of the ports that were previously open are closed, even after respawning the target a few times. I’m wondering if there’s an issue with it…

Got it now :::facepalm:::

overthinking


I had the same issue

Apologies if OT.
Have you been able to complete “Firewall and IDS/IPS Evasion - Hard Lab”?
I found the service, but can’t fingerprint the version without causing the firewall to close the connection

@MoeSyzslak could you drop a hint here? I’m stuck on the same one :smile:


I never managed to complete the hard lab… ?

Hey, did anyone solve this question? I’m stuck on it too. Can someone give me a hint?

You should look into a file that is revealed by one of the scans proposed in the section.


Thank you very much - saved my sanity today!


Can anybody help me with this? I have run all the scans, and scanned port 80 with the http-enum script, but it doesn’t give me a flag or anything. I’m stuck, please help.

So I found /robots.txt on port 80, but I can’t figure out how to read it to find the flag. I’m on page 7 of the Nmap module. Please help.

You can access robots.txt, which tells search engines what should be indexed and what should not.

For example, if you know the location of robots.txt for a website, you can access it directly via your browser:

Open a browser and go to http://10.x.x.x/robots.txt
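If you prefer the command line, a minimal sketch looks like this (the IP is a placeholder for your spawned target):

```shell
# Hypothetical sketch: fetch robots.txt, then list the paths the site asks
# crawlers to skip -- those are often worth browsing to manually.
curl -s http://10.129.x.x/robots.txt -o robots.txt
grep -i '^Disallow' robots.txt | awk '{print $2}'
```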


Nmap is used as part of the enumeration and recon phase, so when you see robots.txt, you should access it and read it. robots.txt can give you some insight into the structure of the website you are targeting, which makes it a valuable source during recon and enumeration.

The purpose of robots.txt is to tell search engine crawlers which parts of the site should be indexed in the search engine’s backend databases and which should not.
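For illustration, a typical robots.txt looks something like this (the paths here are made up):

```
User-agent: *
Disallow: /admin/
Disallow: /backup/
```

Each Disallow line names a path the site owner does not want crawled; for an attacker, those are exactly the paths worth visiting.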


Thank you - this didn’t seem like the point of the exercise.


Thanks for your answer, but… can you help me? What option did you use to find robots.txt?
I used sudo nmap 10.129.2.49 -p- -sV -sC --script vuln

Thanks again.

Hint: also look for robots.txt!

Use http-enum:

sudo nmap <target IP> -p <port 1>,<port 2>,… --script discovery,exploit,vuln,version

Will generate the output you seek:

80/tcp open http
| http-enum:
|_ /robots.txt: Robots file
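To reproduce that with just the one script, a minimal sketch (placeholder IP; adjust the port to your target) is:

```shell
# Hypothetical: run http-enum against port 80, save the normal output,
# then pull any lines mentioning robots.txt from the results.
sudo nmap -p 80 --script http-enum -oN http-enum.txt 10.129.x.x
grep -i 'robots.txt' http-enum.txt
```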