I built a small Java program that hides messages in an image using the least-significant-bit method. It works fine when the input is a jpg file; the output may be png or jpg. When the input is a png, though, the result looks very strange.
Here are the original image and the resulting image, respectively:
import java.awt.Point;
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.awt.image.Raster;
import java.io.File;
import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.net.MalformedURLException;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import javax.imageio.ImageIO;

public abstract class Builder {

    public static void leastSignificantBitEncryption(String imageSource, String message, String newPath) {
        BufferedImage image = returnImage(imageSource);
        //prepare variables
        String[] messageBinString = null;
        String[] pixelBinString = null;
        final byte[] messageBin = message.getBytes(StandardCharsets.UTF_8);
        final byte[] pixelsBin = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
        //convert message and image to binary string array
        try {
            messageBinString = stringToBinaryStrings(messageBin);
            pixelBinString = stringToBinaryStrings(pixelsBin);
        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        }
        String[] messageBinStringCut = splitIn2Bit(messageBinString); //split message binary into 2 bit strings
        String[] pixelBinStringNew = pixelBinString.clone(); //insert 2 bit strings in last 2 bits of bytes from bitmap
        insert2Bit(messageBinStringCut, pixelBinStringNew);
        byte[] pixelsBinNew = stringArrayToByteArray(pixelBinStringNew); //convert string array to byte array
        try { //create new image out of bitmap
            int w = image.getWidth();
            int h = image.getHeight();
            BufferedImage imageNew = new BufferedImage(w, h, BufferedImage.TYPE_3BYTE_BGR);
            imageNew.setData(Raster.createRaster(imageNew.getSampleModel(), new DataBufferByte(pixelsBinNew, pixelsBinNew.length), new Point()));
            File imageFile = new File(newPath);
            ImageIO.write(imageNew, "png", imageFile);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private static String[] stringToBinaryStrings(byte[] messageBin) throws UnsupportedEncodingException {
        String[] bytes = new String[messageBin.length];
        int i = 0;
        for (byte b : messageBin) {
            bytes[i] = String.format("%8s", Integer.toBinaryString(b & 0xFF)).replace(' ', '0');
            i++;
        }
        return bytes;
    }

    private static String binaryStringsToString(String[] messageBin) throws UnsupportedEncodingException {
        StringBuilder stringBuilder = new StringBuilder();
        int i = 0;
        while (messageBin[i] != null) {
            stringBuilder.append((char) Integer.parseInt(messageBin[i], 2));
            i++;
        }
        return stringBuilder.toString();
    }

    private static BufferedImage returnImage(String imageSource) {
        try {
            try {
                return ImageIO.read(new URL(imageSource));
            } catch (MalformedURLException e) {
                return ImageIO.read(new File(imageSource));
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
            return null;
        }
    }

    private static byte[] stringArrayToByteArray(String[] stringArray) {
        byte[] byteArray = new byte[stringArray.length];
        for (int i = 0; i < stringArray.length; i++) {
            byteArray[i] = (byte) Integer.parseInt(stringArray[i], 2);
        }
        return byteArray;
    }

    private static String[] splitIn2Bit(String[] inputArray) {
        String[] outputArray = new String[inputArray.length * 4];
        for (int i = 0; i < outputArray.length; i += 4) {
            String[] splitByte = inputArray[i / 4].split("(?<=\\G..)");
            outputArray[i] = splitByte[0];
            outputArray[i + 1] = splitByte[1];
            outputArray[i + 2] = splitByte[2];
            outputArray[i + 3] = splitByte[3];
        }
        return outputArray;
    }

    private static String[] insert2Bit(String[] twoBitArray, String[] insertArray) {
        for (int i = 0; i < twoBitArray.length; i++) {
            insertArray[i] = insertArray[i].substring(0, 6) + twoBitArray[i];
        }
        return insertArray;
    }
}
Also, the test class:
public class Test {
    public static void main(String[] args) {
        Builder.leastSignificantBitEncryption("IMAGEPATH OR URL", "MESSAGE", "PATH FOR IMAGE CONTAINING MESSAGE");
        Builder.leastSignificantBitDecryption("PATH OF IMAGE CONTAINING MESSAGE", "PATH FOR TXT CONTAINING OUTPUT");
    }
}
The error originates from the fact that the png image has an extra channel for transparency. System.out.println(pixelsBin.length);
prints 338355 for the jpg and 451140 for the png, which is exactly the 3:4 ratio you would expect from one extra byte per pixel.
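If you want to verify what ImageIO actually handed you, you can query the BufferedImage before grabbing its data buffer. A minimal standalone check (the class name and path are just placeholders) could look like this:
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class ChannelCheck {
    public static void main(String[] args) throws IOException {
        // Placeholder path, same style as in the Test class above
        BufferedImage image = ImageIO.read(new File("IMAGEPATH"));
        // Typically TYPE_3BYTE_BGR for a jpg and TYPE_4BYTE_ABGR for a png with transparency
        System.out.println("image type:      " + image.getType());
        System.out.println("has alpha:       " + image.getColorModel().hasAlpha());
        System.out.println("bytes per pixel: " + image.getRaster().getNumDataElements());
    }
}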
The simplest solution would be to create the appropriate imageNew
depending on the file format. For example:
int w = image.getWidth();
int h = image.getHeight();
BufferedImage imageNew = null;
if (imageSource.matches(".*jpg$")) {
    imageNew = new BufferedImage(w, h, BufferedImage.TYPE_3BYTE_BGR);
} else if (imageSource.matches(".*png$")) {
    imageNew = new BufferedImage(w, h, BufferedImage.TYPE_4BYTE_ABGR);
} else {
    // whatever
}
imageNew.setData(Raster.createRaster(imageNew.getSampleModel(), new DataBufferByte(pixelsBinNew, pixelsBinNew.length), new Point()));
However, you have to be aware that the message is not embedded in the same pixels and channels for both types. The byte array of a 3-channel image (no transparency) goes like this:
first-pixel-BLUE, first-pixel-GREEN, first-pixel-RED, second-pixel-BLUE, etc.
while for a 4-channel image it goes:
first-pixel-ALPHA, first-pixel-BLUE, first-pixel-GREEN, first-pixel-RED, second-pixel-ALPHA, etc.
If you care about that detail, you might be interested in removing the alpha channel from the png first, so you're always working with 3-channel images.
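One way to do that (a sketch only; the helper name is my own, and white is an arbitrary choice of background for formerly transparent pixels) is to redraw whatever returnImage(...) loads into a TYPE_3BYTE_BGR buffer before encoding:
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class AlphaStripper {

    // Redraws any image into a 3-byte BGR buffer, dropping the alpha channel.
    public static BufferedImage dropAlpha(BufferedImage source) {
        BufferedImage copy = new BufferedImage(source.getWidth(), source.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
        Graphics2D g = copy.createGraphics();
        g.setColor(Color.WHITE);                              // background for formerly transparent areas
        g.fillRect(0, 0, copy.getWidth(), copy.getHeight());
        g.drawImage(source, 0, 0, null);                      // alpha is composited away here
        g.dispose();
        return copy;
    }
}
If you call such a helper right after returnImage in leastSignificantBitEncryption, the rest of the method can keep assuming 3 bytes per pixel and the DataBufferByte cast still works.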