Hacking Vasco translator through binary SMS

Recently I was asked to configure an internet browser on a thing called Vasco Translator Premium 7″. The device looks exactly like many of the low-end Android tablets from China, and it happens to be one. The problem is that it was locked down so that only one application, the translator, could be used. It has some minor extra functions like a camera, and it seems that was a mistake of its authors: they used the default Android apps for the Camera and Gallery (and forgot to lock the send-message button in the latter 🙂 ).

First I have to highlight that this is not a full unlock or root of the device, but the ease of the process suggests that rooting the thing should not be too difficult either. Our goal here is simply to open up the web browser, because the software beneath the shitty overlay seems to be ordinary Android with all the usual apps in place. And how useful is a tablet without an internet browser?

Prerequisites

The tool we will use to do the trick is the good ol’ WAP Push protocol. For some reason this dinosaur has been revived in newer Android devices and is supported out of the box. The goal is to send such a message to our locked device and, since the Gallery app mentioned above allows escaping to the Messaging app, read it there. This is probably the hardest part of the process, and it may require buying some additional hardware (if you have access to a service that you know sends WAP Push, you can use it and skip this part).

And that hardware is a GSM modem. It is quite possible that you already have one that can be used; the thing we need is the ability to send SMS through AT commands. Many Android phones allow that, at least if they are rooted, and LTE/3G USB modems can probably do it too (I have not checked that personally). Since the procedure for getting access to the AT interface is completely different for every device out there, I have to leave you alone with that part. After some time you will probably end up in minicom or a similar program with parameters like 115200/8N1 or 9600/8N1. In my case (Android with a Qualcomm processor) the device is /dev/smd11 and the parameters are 115200/8N1. Now you can type AT to check whether the device you found is really an AT modem (it should respond with OK) and AT+CSCA? to check your SMS center address. You should be able to recognize it, or Google it to check that it really belongs to your mobile operator.
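For illustration, a successful check could look more or less like this (the SMS center number here is made up; yours will differ):

AT
OK
AT+CSCA?
+CSCA: "+37201230000",145
OK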

Crafting SMS

Now that we have all the tools, we can start crafting the SMS. I will omit many details here, since even a general description of the PDU format would take a whole article and the complete one is more than 100 pages long. The only part you need to know about is the destination address, which will be the phone number of your device. The trailer of the message is the WAP Push payload, whose description would need another 100 or so pages, so we will skip it as well. As a remark, there is a pair of programs called wbxml2xml/xml2wbxml that allows reading and writing such messages. In our case we want to force the device to visit google.com, so this will be the address of the WAP bookmark.
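For reference, the document that ends up compiled to WBXML is a WAP Service Indication. The one carried in the payload used later in this post corresponds roughly to the following XML (reconstructed by hand from the binary, so treat it as a sketch rather than the exact input):

<?xml version="1.0"?>
<!DOCTYPE si PUBLIC "-//WAPFORUM//DTD SI 1.0//EN" "http://www.wapforum.org/DTD/si.dtd">
<si>
  <indication href="http://google.com" action="signal-high">Google</indication>
</si>

Feeding a file like this to xml2wbxml should give you the binary blob that becomes the [WBXML] part of the message.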

Ok, so in the example PDU (broken down field by field a bit further below), the parts we are interested in are [dest_addr] and [dest_len]. The first encodes the telephone number “+37201234567” (note the lack of the ‘+’ sign in the encoding), the second its length as a number of digits (0x0b == 11). The number of your device should be placed here, and then you can move on to the next section.

Or you can try customizing the payload. The important part here is marked as [WBXML] and can be crafted with the program mentioned before. After changing it, you need to adjust the [ud_len] value to the number of bytes in the user data (the bytes following the length field).
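For the curious, here is my rough breakdown of the example PDU used in the next section (field boundaries as I understand them; double-check against the 3GPP spec before relying on this):

00               - SMSC info length (0 = use the SIM's default SMS center)
41               - first octet: SMS-SUBMIT with a user data header present
00               - message reference (0 = let the modem assign one)
0b               - [dest_len]: 11 digits
91               - type of address: international number
7302214365f7     - [dest_addr]: "37201234567" with nibbles swapped in pairs
00 04            - protocol identifier, data coding scheme (8-bit data)
2a               - [ud_len]: 42 bytes of user data follow
0605040b8423f0   - UDH: WAP Push port addressing (destination 2948, source 9200)
ea0601ae         - WSP: transaction ID, Push PDU, content type application/vnd.wap.sic
03056a00...      - [WBXML]: the encoded Service Indication ("http://google.com", "Google")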

Sending SMS

Since we already have a modem, we need to type an AT command to initiate message sending. But before that, we need to ensure that we are in PDU mode. Type AT+CMGF? and, if the value is anything other than zero, set it with AT+CMGF=0. Now start sending with:

AT+CMGS=55

where 55 is the length of the PDU in bytes, not counting the SMSC header (the single byte at the beginning). The modem should respond with a > prompt, where the SMS can be typed.

0041000b917302214365f700042a0605040b8423f0ea0601ae03056a0045c60c03676f6f676c652e636f6d00080103476f6f676c65000101

And after that press CTRL-Z (^Z) in your terminal, which sends the SUB (substitute) character to the modem. It is important not to type any other characters in between, like spaces or ENTER. After about a second you should see that the sending was successful and no error was returned.
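Putting it all together, the whole exchange with the modem should look roughly like this (the reference number returned in +CMGS is just an example):

AT+CMGF=0
OK
AT+CMGS=55
> 0041000b917302214365f700042a0605040b8423f0ea0601ae03056a0045c60c03676f6f676c652e636f6d00080103476f6f676c65000101<Ctrl-Z>
+CMGS: 12
OK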

Receiving and opening

Now, if your translator is turned on, you should hear that a new message was received, although nothing appears on the screen. That is OK. The rest of the procedure is shown in the video below:

Postscriptum

After another few minutes of playing with the device I found another method of opening the browser, and it is way faster than what was described above. But the first one was much more entertaining for me, and it shows one of the many places where serious bugs can be found – forgotten technologies that are still implemented and possibly used, but whose details are largely unknown to the general public.

You can see the other method in the video below; it is probably the one you want to use.


Understanding JCOP: memory dump

Some time ago I was struggling with a JCOP smart card. The one I received, as it turned out, was not pre-personalized, which means some interesting features (like setting the encryption keys and PIN) were still unlocked. Because the documentation and all the usual helpers (StackOverflow) were not very useful (well, OK, there was no publicly available documentation at all), I started a very deep search on Google, which ended in full success: I was able to dump the whole memory available during pre-personalization.

Since it is not something that can be found online, here is a screenshot of it, colored a bit with the help of my HDCB program. Without documentation it might not be very useful, but in some emergency situation maybe somebody will need it.

JCOP memory dump made at the very beginning of pre-personalization

A small explanation: the first address I was able to read was 0xC000F0; the first address that returned a read error after the configuration area was 0xC09600. I know that, despite the lack of privileges, some data is placed there.

There are three configurations: cold start (0xC00123–0xC00145), warm start (0xC00146–0xC00168) and contactless (0xC00169 to at least 0xC0016F). A description of the coding of the individual fields is outside the scope of this article; I hope to describe them in the future.

Next time I will try to describe the process of pre-personalization, that is, making a non-pre-personalized card (easy to get from the usual sources of cheap electronics) able to receive and run applets.


A few words on the everything daemon – systemd and an overridden power management problem

For a few weeks I had a strange problem with power management on my laptop. It is a quite old Asus, really nicely supported by the Linux community, so a low-level problem was very unlikely. The problem was that, whenever the lid of the laptop was closed, the system would suspend, ignoring Plasma's power management settings. This was really annoying, especially when I left some background task running, closed the lid to save power with the screen turned off, and a few minutes later realized that the computer had gone to sleep just after I left it. Unfortunately I did not have much time to investigate the reason.

Until today. It seems that systemd has just started managing another part of my computer – power management – and was kind enough to ignore the fact that the desktop environment was already managing it, setting up its own configuration instead. The effect was that it intercepted the lid-close event and suspended the computer anyway.

So, if you have encountered a similar problem, below are the details of what should be changed to turn that feature off.

  1. Open /etc/systemd/logind.conf with your favorite editor as root.
  2. Add the following lines to the [Login] section:
HandlePowerKey=ignore
HandleSuspendKey=suspend
HandleHibernateKey=hibernate
HandleLidSwitch=ignore
  3. Save the file and restart logind with:
systemctl restart systemd-logind.service
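If you want to make sure the daemon restarted cleanly and picked up the new configuration, a quick check is:

systemctl status systemd-logind.service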

Now you should have gotten rid of systemd's attempt to be everything you need on a Linux box (at least this time).


Wget with SSL/TLS support for Android

wget dependency tree

Lately I tried to download a file from a website to my Android smartphone. A simple thing, yeah? Well, not really. Unfortunately, mobile browser developers have removed many features from their mobile distributions. One of them is the ability to download an arbitrary page to disk as-is. Instead (this is the case at least with Mozilla's product) they force a “Download as PDF” feature. I had a bit of luck, because the file I was trying to download was an MP4 movie, which is downloadable, maybe not in an intuitive way, but it is. But before I found that feature hidden in the player's context menu, I tried another solution – wget. Since I am a great fan of terminals, I have busybox installed on my phone. Those of you who know what busybox is will know that it is a set of lightweight variants of the most standard UNIX tools. So, if they are lightweight, some functionality had to be cut, right? In the case of my busybox's wget, they cut HTTPS support. And today it is more likely to encounter a site that is HTTPS-only than one that is HTTP-only, at least when talking about popular sites. So I had to get my own build of wget that is not so constrained.

Not to bore you too much, here you can find the binary distribution of what I managed to compile. It was compiled for the ARMv7 platform using NDK r12b and API level 24 (Nougat), so it will probably not work on most current Android phones; but if you are reading this later, it probably works on your device or is even outdated already. If you are interested in recompiling the binaries yourself, you can find a detailed how-to in the rest of this article.

Dependencies

Before compiling wget itself, you have to have a whole bunch of its dependencies. But first, of course, you need the Android compiler. It is distributed as part of the NDK and I won't describe its installation here. The sources of every program compiled here can be grabbed from their official sites (list at the end of this post). The only exception is libtasn1, which required a few hacks to make it compile with Android's bionic libc. Its source, ported to Android, can be obtained from my github repo.

Let’s start with the programs that do not depend on anything. For all projects the procedure is more or less the same and can be described with this simplified bash script:

tar -zxvf program-1.00.tar.gz
mkdir build
mkdir install
cd build
CC=arm-linux-androideabi-gcc AR=arm-linux-androideabi-ar RANLIB=arm-linux-androideabi-ranlib CFLAGS=-pie \
    ../program-1.00/configure --host=arm-linux --prefix=/data/local/root
make
make install DESTDIR=$(dirname `pwd`)/install/
cd ../install
tar -zcvf program.tar.gz *

gmp, libidn and libffi

For these three programs, the procedure above should work without any modification.

nettle

Since nettle depends on gmp, it has to be configured with the paths to the gmp libraries and headers in its CFLAGS and LDFLAGS variables. They should look like this:

CFLAGS="-pie -I`pwd`/../../gmp/install/data/local/root/include"
LDFLAGS="-L`pwd`/../../gmp/install/data/local/root/lib"

when invoking the configure script.
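For reference, the whole nettle invocation could look more or less like the following (the version number is only an example; p11-kit below follows the same pattern with its own paths):

CC=arm-linux-androideabi-gcc AR=arm-linux-androideabi-ar RANLIB=arm-linux-androideabi-ranlib \
    CFLAGS="-pie -I`pwd`/../../gmp/install/data/local/root/include" \
    LDFLAGS="-L`pwd`/../../gmp/install/data/local/root/lib" \
    ../nettle-3.2/configure --host=arm-linux --prefix=/data/local/root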

libtasn1

This was the hardest part for me, but it should go smoothly now. The script below should do the work correctly:

git clone https://github.com/v3l0c1r4pt0r/android_external_libtasn1.git libtasn1
mkdir build
mkdir install
cd build
CC=arm-linux-androideabi-gcc AR=arm-linux-androideabi-ar RANLIB=arm-linux-androideabi-ranlib CFLAGS=-pie \
    ../libtasn1/configure --host=arm-linux --prefix=/data/local/root --disable-doc
make
make install DESTDIR=$(dirname `pwd`)/install/
cd ../install
tar -zcvf libtasn1.tar.gz *

p11-kit

This is the last dependency of gnutls, which is in turn the only, but very important, dependency of wget. Just pointing it at libtasn1 and libffi should do the job:

CFLAGS="-pie -I`pwd`/../../libtasn1/install/data/local/root/include"
LDFLAGS="-L`pwd`/../../libtasn1/install/data/local/root/lib -L`pwd`/../../libffi/install/data/local/root/lib"

Notice that libffi provides no headers we need, so we add it just to LDFLAGS here!

gnutls

This one was more complicated than the rest. As I mentioned above, it is very important to wget's functionality; wget's dependency on it could probably be turned off, but we would not have TLS support then. When compiling it I had some problems that seemed serious: there were a few errors while making it, so I had to call make twice, and even then it failed. Despite that, it seems to work after make install, which obviously failed too. In my case the following script did the job:

mkdir build
mkdir install
cd build
CC=arm-linux-androideabi-gcc AR=arm-linux-androideabi-ar RANLIB=arm-linux-androideabi-ranlib \
    CFLAGS="-pie -I`pwd`/../../gmp/install/data/local/root/include -I`pwd`/../../nettle/install/data/local/root/include -I`pwd`/../../libtasn1/install/data/local/root/include -I`pwd`/../../libidn/install/data/local/root/include -I`pwd`/../../p11-kit/install/data/local/root/include" \
    LDFLAGS="-L`pwd`/../../gmp/install/data/local/root/lib -L`pwd`/../../nettle/install/data/local/root/lib -L`pwd`/../../libtasn1/install/data/local/root/lib -L`pwd`/../../libidn/install/data/local/root/lib -L`pwd`/../../p11-kit/install/data/local/root/lib" \
    ../gnutls-3.4.9/configure --host=arm-linux --prefix=/data/local/root --disable-cxx --disable-tools
make || make
make install DESTDIR=$(dirname `pwd`)/install/ || true
cd ../install
tar -zcvf gnutls.tar.gz *

Compilation

Since we should now have all the dependencies compiled, we can try compiling wget itself. The procedure here is the same as with the dependencies; we just have to pass the path to gnutls, and then the standard configure, make, make install should work. However, if your NDK installation is fairly new and you have not hacked on it before, you most likely do not have the <sys/fcntl.h> header, and make will complain about that. Luckily, Android itself has this header present, but for unknown reasons it is kept directly in the include directory. To make wget, and any other program that uses it, compile, you can just point the “sys/” instance at <fcntl.h> with a symlink, or do something like this:

echo "#include <fcntl.h>" > $TOOLCHAIN/sysroot/usr/include/sys/fcntl.h

where $TOOLCHAIN/sysroot is the path where your headers are placed. Depending on the tutorial you used to set up the toolchain, it may have a different structure.
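The symlink variant mentioned above would be something along these lines:

ln -s ../fcntl.h $TOOLCHAIN/sysroot/usr/include/sys/fcntl.h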

Installation

All the commands presented above assume that you have your custom-compiled binaries in “/data/local/root”. I made it that way to have a clear separation between the default and busybox binaries and my own. If you want them somewhere else, you should pass that path to the configure scripts of all the programs you are compiling. After successful compilation of all the tools, I made a single tarball containing the whole compilation output (the link to this file was placed above). Its content can be installed onto Android by typing

tar -zxvf wget-with-deps.tar.gz -C/

using adb shell or a terminal emulator.
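After that, a quick sanity check from the phone's shell could look like this (assuming wget was linked against the shared libraries installed in the same prefix, hence the LD_LIBRARY_PATH):

export PATH=/data/local/root/bin:$PATH
export LD_LIBRARY_PATH=/data/local/root/lib:$LD_LIBRARY_PATH
wget https://www.google.com/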

Sources

Below you can find links to the sources of all programs needed to follow this tutorial.


Simple extraction of content from HTTP request

When writing a script for extracting data exported to XML from Burp, I was forced to use one very dirty hack to extract the body of an HTTP message. It seems that there is no easy way of getting just the data, especially if the transferred content is a binary file, e.g. an image or an executable. If it is simply text, it could probably be done with some perl regexp wizardry, but a cleaner way would be nice either way. The hack I mentioned was creating a netcat server, piping the whole captured message into it, and then downloading it with the help of wget. It works that way and takes only two straightforward lines of code, but it is still a hack. You can see it here.
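Roughly, the trick looked like this (the port and file names are made up for illustration, and it assumes the captured message is a complete HTTP response):

nc -l -p 8080 < response.raw &
wget -q -O body.bin http://localhost:8080/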

To avoid being forced into hacks like that in the future, I planned to make a very simple utility program that would just strip the HTTP header and print the rest of the input to stdout. Well, I decided to do this just after finishing the mentioned script, so about half a year ago. Now I have found some time for it, and the result is a very small utility written in C that does just that; it also parses the header itself and provides an easy way for the user to work with the values. The latter is done by spawning a child process of the default shell with the header values set in its environment. It may not be the nicest way to do this, but it was the fastest to implement and should be fairly easy to work with. If you know a better way that results in similar behavior, I will be grateful if you leave me a comment here.


Method for intercepting (lot of) files from website using burp

Burp exporting requests to file

From time to time everyone needs to download a bunch of files from some website. Sometimes there is a single index where links to every file can be found; sometimes there is not. Analyzing the website and/or figuring out how the links are created (especially if they look like http://some-cdn.io/directory/file-generator?param=5acae975-7784-e511-9412-b8ca3a5db7a1&ws=b0c491db-ae8b-e011-969d-0030487d8897&uid=66dba4ec-bb13-e211-a76f-f04da23e67f6&switch=1) could take months, and success is not guaranteed. If this happens, downloading the files manually is the only way to do it. But manual download can be optimized too.

Burp way

Burp is a sort of swiss army knife of the penetration tester. Its main function is intercepting HTTP(S) traffic through a built-in proxy. It can decrypt the traffic between any website, or even an Android app, and a third-party server. So you could configure your web browser to use the Burp proxy and simply view every file you want to save. This would be an ideal solution for backing up any image gallery, including facebook galleries that can be viewed only by logged-in users and use weird links to facebook CDNs. But there is one problem: Burp does not allow exporting many files at once (or at least its free license does not). To be exact, it uses its own format that stores the HTTP request, the response and a lot of metadata we simply do not need – and what we need is a directory filled with images, right?

Solution

To get around the problem, I have written a simple bash script that extracts plain data from that exported file. It takes the XML file exported by Burp and unpacks the plain responses, each to a separate file. Usage is very simple.

  1. First, export the files you want to save from Burp’s Target tab by selecting them, clicking Save selected items and saving the file as whatever.xml.
  2. Then you just have to start the script with
    ./xtract-burp.sh whatever.xml

    optionally appending the desired file extension as a second parameter.

Note that the files will be named with an iterator starting at 1 and counting up, in the same order Burp exported them.

As usual, the repo is available on github.


HDCB – a new way of analysing binary files under Linux

As any observer of my projects will have spotted, most of the biggest projects I do involve binary file analysis, and currently I am working on another one that requires it. Unfortunately such analysis is not an easy task, and anything that eases or speeds it up is appreciated. Of course there are some tools that make it a bit easier: one of them is hexdump, and even IDA Pro can help a bit. Despite them, I always felt that something was missing. When creating the xSDM and delz utils, I was using hexdump output pasted into a LibreOffice document and marking different structure members with different colors. It worked, but selecting a 100-byte buffer line by line was just a waste of precious time.

SDC file analyzed by HDCB script

Solution

So why not automate the whole process? What we really need here is just hexdump output and a terminal emulator with color support. And that's why I made HDCB – HexDump Coloring Book. Basically it is just an extension of the bash scripting language. The goal was to create a simple script that hides as much of its internals from the end user as possible and allows the user to just start it from his shell using the good old ./scriptname.ext and that's it. HDCB is disguised as if it were a standalone scripting language: it uses a shebang, known from bash or python scripts, to let the user's shell know which interpreter to use (#!/usr/bin/env hdcb). Python programmers should recognize the usage of the env binary.

In fact it is just a simple extension to the bash language, so users are still able to use any feature available in bash. The main extensions are two commands: one (define) for defining variables and the other (use) for declaring a field or array of the defined type. Such scripts should be started with just one argument – the file that is to be hexdumped and analyzed.

Internals

The bash scripts are just a kind of cover for the real program. The core of the program is the colour utility. It takes an unlimited number of parameters, grouped in fours; they are, in order: the offset of the byte being colored, the length of the field, and the background and foreground colors. Hexdump output (in fact only hexdump -C or hexdump -Cv is supported) is provided on standard input, and the program colors it according to the rules provided as arguments. This architecture allows a clever hacker to build the cover mentioned above in virtually any programming language.

Downloads, documentation and more

As usual, the program is available on my Github profile. The sources are provided under the GPLv3 license, so you are free to contribute to the project, and you are strongly encouraged to do so or to propose new functions. The program is meant to be expanded according to my future needs, but I will try to implement any good idea. The whole documentation, installation instructions and the like are also available on Github.


Airlive WN-151ARM UART pinout & root access

Airlive WN-151ARM pinout

For the curious ones, here is the pinout of the serial connection. As you can see, the UART pins are at the J4 header (pin 4 should be labeled, and pin 1 is the square one).

J4 header
Num. Function
1 VCC
2 RX
3 TX
4 GND

Edit: Oh, and one more thing: the goldpin header you see in the picture was soldered on by me, so do not be surprised if you have to hold the wires in place all the time during transmission.

Root access

There is also a possibility to gain root access without removing the cover and possibly voiding the warranty. You have to connect to the router's AP and enter

http://192.168.1.254/system_command.htm

into your browser (panel authentication required). Now you can execute any command you want with root privileges! So let's type

/usr/sbin/utelnetd -d &

into the Console command field and press the Execute button. If everything went well, you should now be able to connect to your router using telnet on its default TCP port 23. After that you should see the BusyBox banner and a command prompt.
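For example, from a machine connected to the router's network:

telnet 192.168.1.254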

It is worth noting that this hidden console cannot be accessed by an unauthorized person, so only the router administrator can use it (in theory; in practice there are surely a lot of routers using default credentials, and the security of the httpd binary is unknown).


SDC file format description – Errata

Last year I published a program for decrypting Microsoft Dreamspark's SDC files. Soon after that I wrote an article about the SDC file format and its analysis. Now it's time to complete the description with the newest data.

This article would not have been written if not for the contribution of GitHub user @halorhhr, who spotted a multi-file SDC container and let me know on the project's page. Thanks!

When writing that post a year ago, I had no idea what a multi-file container really looks like. My suspicions could not be confirmed then, because it seemed that these files were simply not used in the wild. A few days ago the situation changed: I got a working sample of a multi-file container, so I was able to start analyzing it.

Real container format

SDC files with different signatures

After a quick analysis, I knew that my suspicions were wrong. The filename length and the encrypted filename string are not part of a file description; in fact they are placed after the descriptions, and the filename field is a concatenation of all filenames (each including its trailing null byte). So, to sum up, the filename of the n-th element starts at file[n].filename_offset and ends like any other C-style string.

The whole header structure is like the sample header on the right. Note that all headers besides the 0xb3 one have already been decrypted for readability. In a real header the only unencrypted field is the header size at the very beginning of the file. The 0xb3 sample has an unencrypted header, and the header size is not present in the file; however, the file name is encrypted in some way I have not figured out yet. The encryption method is blowfish-compat (the difference between it and blowfish is the ciphertext endianness). The filenames are encrypted once again on top of that.

After the header, all other data is XORed using the key from the EDV string and then deflated, so before reading it you have to inflate and XOR it again. The format of the data in the 0xb3 version is still unknown; however, analysis of the compressed and file sizes hints that it may be stored the same way. It is important to note that, depending on the file signature, a different configuration of the inflater may be needed. It is now known that files older than the 0xd1 header, which appears to be the newest (because only this one supports files greater than 4 GiB), need to have the inflater initialized with

inflateInit2_(&stream,-15,ZLIB_VERSION,(int)sizeof(z_stream));

or equivalent.

Unknowns

This errata does not contain all the information needed to support all variations of SDC files. Besides the unknowns I mentioned above, there is another variation that uses the 0xc4 signature, of which I have no sample. The only trace of its existence is a condition in the SDM code, so I cannot write support for that type of file. There is also the possibility that multi-file containers with 0xb5 or 0xb3 signatures exist. That type of file seems to have appeared only lately, but it probably existed in the past as well. Having no samples of them, I cannot verify that xSDM handles them properly.

So if you have a sample of any of the variations mentioned here, just send it to me at my email address: v3l0c1r4pt0r at gmail dot com, or, if you suppose that may be illegal in your country, just send me an SDX link or any other hint on how I can find them.

Other way?

A few days ago, after I started writing this post, Github user @adiantek let me know in the issues that there is a method to bypass SDM in the Dreamspark download process. To download a plain, unencrypted file you just have to replace ‘dfc=1‘ with ‘sdm=0‘ in the link Dreamspark provides in the SDX file. If it is true that this works for every file Dreamspark provides, my xSDM project would now be obsolete. However, because Microsoft's intentions in creating this backdoor (it seems to have been created just for debugging) are unknown, I will continue to support the project and fix any future bugs I become aware of. But it now seems that this project will become just a proof of concept for curious hackers and will slowly die.

Nevertheless, if you have something that might help me or anyone else interested in the SDC format in the future, just let me know somehow, so that it will be available somewhere on the internet.


imeitool – validate, generate and find out information about IMEI numbers easily

I just pushed imeitool to my Github. imeitool is a small utility that can perform a few useful operations on IMEI numbers. It can check whether a given number is a valid IMEI, find information in its databases about a given IMEI or TAC (Type Allocation Code, usually the first 8 digits of an IMEI), and generate a fake IMEI based on conditions provided by the user.

The reason this program was created should appear on the blog soon. Anyone can contribute to imeitool's database, so if you want to help, more info can be found in the project's README file.
