This Week's Top Stories About Information Gathering Tools and Techniques Used by Cyber Security and Ethical Hackers



Today I am going to tell you about the information gathering tools and techniques used by hackers and cyber security professionals.

Guys, this article is for educational purposes only.

Guys, if you need it, then click here: Top 30 Ethical Hacking Tools of All Time Which Hackers Use to Hack.

Come on guys, then let's begin the article.

"The more information you have about the target, the greater the chance of successful exploitation."

Information gathering is the first phase of hacking. 

In this phase, we gather as much information as possible regarding the target's online presence, which in turn reveals useful information about the target itself. 

The required information will depend on whether we are doing a network pentest or a web application pentest. 

In the case of a network pentest, our main goal would be to gather information on the network. 

Similarly, in a web application pentest, our main goal would be to gather information about the application itself. 

In this article, we will discuss numerous methods of real-world information intelligence.

In general, all information gathering techniques can be classified into two main categories:

1. Active information gathering.

2. Passive information gathering.

Let me explain each of these.

Active Information Gathering.

Guys, in active information gathering, we directly engage with the target: for example, gathering information about which ports are open on a particular target, what services are running on them, and what operating system the machines are using. 

However, active information gathering techniques can be very noisy on the other end. 

They are easily detected by IDSes, IPSes, and firewalls, and they generate logs of your presence, so they are sometimes not recommended.
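As a minimal sketch of the active approach (not a full scanner, and only for hosts you are authorized to test), the snippet below checks which TCP ports accept a connection. It opens its own throwaway listener on localhost so the demo is self-contained; real tools like Nmap do far more.

```python
# Minimal sketch of an active port scan at the socket level:
# attempt TCP connects and report which ports accept.
import socket

def scan(host, ports, timeout=0.5):
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Demo against a listener we open ourselves on localhost.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]
print(scan("127.0.0.1", [port]))  # the port we just opened is reported open
listener.close()
```

Every one of those connection attempts shows up in the target's logs, which is exactly why this counts as noisy, active reconnaissance.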

Passive Information Gathering.

In passive information gathering, we do not directly engage with the target. Instead, we use search engines, social media, and other websites to gather information about the target. 

This method is recommended since it does not generate any log of presence on the target system. 

A common example would be to use LinkedIn, Facebook, and other social networks to gather information about the employees and their interests. 

This would be very useful when we perform phishing, keylogging, browser exploitation, and other client-side attacks on the employees. 

Sources of Information Gathering.

There are many sources of information; the most important ones are as follows:

Social media websites
Search engines
Press releases
People search
Job sites

So let’s discuss some of these sources in detail along with some tools of the trade.

Copying Websites Locally.

Many tools can be used to copy websites locally; however, one of the most comprehensive is HTTrack. 

It can be used to investigate the website further. 

For example, let’s assume that the file permissions of a configuration file are not set properly. 

The configuration might reveal some confidential information, for example, a username and password, about the target.

Information Gathering with Whois.

As I mentioned earlier, our goal in the information gathering phase is to gather as much information as possible about the target. 

Whois holds a huge database that contains information regarding almost every website on the web. The most common pieces of information are "who owns the website" and "the e-mail of the owner," which can be used to perform social engineering attacks.
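Under the hood, WHOIS is just a plain-text query over TCP port 43. The sketch below is a minimal illustration, not a full client: the server name is IANA's public WHOIS server, and the field names in the sample record are assumptions for demonstration; real registries run their own servers with their own formats.

```python
# Minimal WHOIS sketch: the protocol is a one-line query on TCP 43.
import socket

def whois_query(domain, server="whois.iana.org"):
    # Network call; real TLD registries use their own WHOIS servers.
    with socket.create_connection((server, 43), timeout=10) as s:
        s.sendall((domain + "\r\n").encode())
        chunks = []
        while chunk := s.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks).decode(errors="replace")

def extract_field(record, field):
    """Pull a 'Field: value' line out of a WHOIS record."""
    for line in record.splitlines():
        if line.strip().lower().startswith(field.lower() + ":"):
            return line.split(":", 1)[1].strip()
    return None

# Offline demo with a made-up sample record.
sample = "Domain Name: EXAMPLE.COM\nRegistrant Email: admin@example.com"
print(extract_field(sample, "Registrant Email"))  # admin@example.com
```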

Finding Other Websites Hosted on the Same Server.

This matters because an attacker can use a single vulnerable website in order to compromise every website on the same server.

However, for now, we will just discuss the method of finding the domains hosted on the same server.

The method is called reverse IP lookup. A reverse IP lookup service lets you query a webserver's address to detect all other websites present on the same server; all you need to enter is the domain name.
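The full reverse mapping needs a lookup service's database, but the underlying idea can be sketched locally: two domains that resolve to the same address may be sharing a server. The localhost example below is only for safe, offline demonstration.

```python
# Reverse IP lookup idea in miniature: compare resolved addresses.
# A real reverse IP service enumerates ALL domains behind one IP,
# which requires its database, not just DNS resolution.
import socket

def same_server(domain_a, domain_b):
    return socket.gethostbyname(domain_a) == socket.gethostbyname(domain_b)

print(same_server("localhost", "localhost"))  # True
```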

Tracing the Location.

Guys, you need to know the IP address of the webserver in order to trace its exact location. 

There are several methods to figure it out. We will use the simplest one, that is, the ping command. 

The ping command sends ICMP echo requests to check whether the website is up. 

It’s used for network troubleshooting purposes.


Traceroute is a very popular utility available in both Windows and Linux. 

It is used for network orientation. 

By network orientation, I don’t mean scanning a host for open ports or scanning for services running on a port. 

It means figuring out how the network topology, firewalls, load balancers, control points, etc. are implemented on the network.

A traceroute uses the TTL (Time To Live) field of the IP header: it sends packets with incrementally increasing TTL values in order to map each hop between it and the target. 

The time to live value decreases every time the packet reaches a hop on the network (router to server is one hop); when it reaches zero, that router sends back an ICMP "time exceeded" message, revealing its address.
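The TTL mechanism can be sketched in Python. Reading the ICMP "time exceeded" replies requires a raw socket (and usually root), so this sketch only shows the sending half: setting an increasing TTL on a probe socket. Port 33434 is the conventional starting port UDP traceroutes use.

```python
# Sending half of a traceroute: probes with increasing IP TTL values.
# Each router decrements the TTL; at zero it replies with ICMP
# "time exceeded" (reading those replies needs a raw socket/root).
import socket

def make_probe(ttl):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_TTL, ttl)
    return s

for hop in range(1, 4):   # first three hops
    probe = make_probe(hop)
    # probe.sendto(b"", ("target.example", 33434)) would send the probe
    probe.close()
```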

There are three different types of traceroutes:

1. ICMP traceroute. (which is used in Windows by default)
2. TCP traceroute.
3. UDP traceroute.

Let me explain the types of traceroutes step by step.

ICMP Traceroute.

Microsoft Windows by default uses ICMP traceroute; however, after a few hops, you may get a timeout, which indicates that there might be a device like an IDS or firewall blocking ICMP echo requests.

TCP Traceroute.

Many devices are configured to block ICMP traceroutes. This is where we try TCP or UDP traceroutes, also known as layer 4 traceroutes. 

TCP traceroute is by default available in BackTrack.

UDP Traceroute.

Linux also has a traceroute utility, but unlike Windows, it uses the UDP protocol by default.

In Windows, the command to start a traceroute is "tracert". In Linux, it's "traceroute".

There are two other tools worth mentioning. 


NeoTrace is a very fine GUI-based tool for mapping out a network.


Cheops-ng is another remarkable tool for tracing and fingerprinting a network.

Enumerating and Fingerprinting the Webservers.

For successful target enumeration, it’s necessary for us to figure out what webserver is running at the back end. 

In this section, we will look at both active and passive information gathering methods. 

As a reminder, in active information gathering, we directly interact with the target; in passive information gathering, we do not interact with the target but instead use the information available on the web to obtain details about it.

Intercepting a Response.

The first thing you should probably try is to send an HTTP request to a webserver and intercept the response. 

The HTTP response headers of many websites readily reveal the webserver version. To send requests and intercept responses, you can use a web proxy such as Burp Suite, Paros, or WebScarab.
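As a sketch of the idea, assuming nothing about any particular target, the snippet below reads the Server header from an HTTP response. It spins up a throwaway Python webserver locally so it has something safe to fingerprint.

```python
# Webserver fingerprinting in miniature: the Server header of an
# HTTP response usually names the server software and version.
import http.client
import http.server
import threading

def server_banner(host, port):
    conn = http.client.HTTPConnection(host, port, timeout=5)
    conn.request("HEAD", "/")
    banner = conn.getresponse().getheader("Server")
    conn.close()
    return banner

# Throwaway local server so the demo has something to fingerprint.
httpd = http.server.HTTPServer(("127.0.0.1", 0),
                               http.server.SimpleHTTPRequestHandler)
threading.Thread(target=httpd.serve_forever, daemon=True).start()
print(server_banner("127.0.0.1", httpd.server_address[1]))
httpd.shutdown()
```

Against a real target you would point `server_banner` at its hostname and port 80 or 443, which is the active (and loggable) approach described above; keep in mind the next section explains why this banner can be faked.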

Acunetix Vulnerability Scanner.

Acunetix vulnerability scanner also has an excellent webserver fingerprinting feature, and a free edition is available from the Acunetix website.

For security reasons, many websites fake the server banner in order to trick newbies into thinking that the target is using a vulnerable webserver. 

Acunetix has the capability to detect fake server banners.


Our active information gathering section would not be complete without introducing a tool that originally shipped with BackTrack and is now available in standard Debian package sources. 

WhatWeb is an all-in-one package for performing active footprinting on a website.

It has more than 900 plug-ins capable of identifying server versions, e-mail addresses, and SQL errors.


Netcraft contains a huge online database with useful information on websites and can be used for passive reconnaissance against the target. 

It is also capable of fingerprinting the webservers.

Google Hacking.

Google searches can be more than a treasure for a pentester when used effectively. 

With Google searches, an attacker may be able to gather some very interesting information, including passwords, on the target. 

Google has developed a few search operators in order to improve targeted searches, such as site:, inurl:, intitle:, and filetype:. 
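For illustration, here are a few widely known operators combined into dorks (target.com is a placeholder domain, not a real target):

```
site:target.com filetype:pdf        PDF documents on the target's domain
inurl:admin site:target.com         pages with "admin" in the URL
intitle:"index of" site:target.com  exposed directory listings
```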

Xcode Exploit Scanner.

Xcode exploit scanner is an automated tool that uses some common Google dorks to scan for vulnerabilities such as SQL injection (SQLi) and XSS.

File Analysis.

Analyzing the target's files could also reveal some interesting information, such as metadata (data about data) about a particular target.


FOCA is a very effective tool that is capable of analyzing files without downloading them. 

It can search for a wide variety of file extensions across the three big search engines (Google, Yahoo, and Bing). 

It’s also capable of finding some vulnerabilities such as directory listing and DNS cache snooping.

Harvesting E-Mail Lists.

Gathering information about e-mails from employees of an organization can give the attacker a very broad attack vector against the target. 

This method can be classified under passive reconnaissance since we are not engaging with the target in any way but would be using search engines to gather a list of e-mails. 

These e-mail lists and usernames could be used later for social engineering attacks and other brute force attacks. 

 It’s quite a tedious job to gather e-mails one by one with Google. 

Luckily, we have lots of built-in tools in BackTrack, kali and more that can take care of this. 

One of those tools is TheHarvester, written in Python. 

The way it works is that it searches publicly available data to gather e-mail addresses belonging to the target.
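Under the hood, tools like theHarvester ultimately extract e-mail patterns from publicly available pages. A minimal sketch of that extraction step follows; the regex is a simplification, as real e-mail address grammar is more complex.

```python
# Harvesting sketch: pull e-mail-shaped strings out of page text.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def harvest_emails(text):
    # Deduplicate and sort for a stable list.
    return sorted(set(EMAIL_RE.findall(text)))

page = "Contact alice@example.com or bob@example.com for support."
print(harvest_emails(page))  # ['alice@example.com', 'bob@example.com']
```

A real tool runs this over many search engine results pages rather than one string, which is what makes it passive reconnaissance: the target's own servers are never touched.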

Gathering Wordlist from a Target Website.

After we have gathered e-mail lists from search engines, it would be really useful for us to gather a list of words that we would use for brute-forcing purposes. 

CeWL is another excellent tool, included in BackTrack and Kali Linux, which enables you to gather a list of words from the target website; these words can later be used for brute-forcing the e-mail addresses we found earlier. Another tool, with capabilities similar to Fierce, can determine subdomains. 

It has a built-in internal wordlist as well as the capability to scan with your own custom wordlist. 

It can also perform zone transfers.
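The word-gathering step CeWL performs can be sketched as follows: pull words above a minimum length out of page text and order them by frequency. This is a simplified illustration for a single string, not CeWL's actual algorithm, which crawls whole sites.

```python
# CeWL-style wordlist sketch: words of a minimum length,
# most frequent first, for use as brute-force candidates.
from collections import Counter
import re

def build_wordlist(text, min_len=4):
    words = re.findall(r"[A-Za-z]{%d,}" % min_len, text)
    counts = Counter(w.lower() for w in words)
    return [w for w, _ in counts.most_common()]

print(build_wordlist("Acme Acme security portal login portal"))
# ['acme', 'portal', 'security', 'login']
```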


There is also a website that returns a decent list of subdomains, prioritizing the most important ones, those that get the most traffic. If you need to save time, you can also try WolframAlpha.



Guys, if you need detailed information on any tool, just ask in the comment box and I will write a detailed article on that tool.



