I’m trying to convert a char array to a long integer and a long integer to a char array, but the conversion fails at value 128.
public long asciiToLong(char[] in) {
    CharBuffer charBuffer = CharBuffer.wrap(in);
    ByteBuffer byteBuffer = StandardCharsets.US_ASCII.encode(charBuffer);
    byte[] byteArray = new byte[byteBuffer.remaining()];
    byteBuffer.get(byteArray);
    // wrap the encoded bytes directly; going through new String(...).getBytes()
    // would re-encode them with the platform default charset
    return ByteBuffer.wrap(byteArray).getLong();
}
public char[] longToAscii(long in) {
    ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
    buffer.putLong(in);
    byte[] b = buffer.array();
    CharBuffer charBuffer = StandardCharsets.US_ASCII.decode(ByteBuffer.wrap(b));
    char[] result = new char[charBuffer.remaining()];
    charBuffer.get(result);
    return result;
}
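To narrow it down, I tried encoding just the single char 128 through US_ASCII, and that alone already produces 63 (the code of '?'), so the substitution seems to happen during the encode step. A minimal sketch of that check:

```java
import java.nio.CharBuffer;
import java.nio.charset.StandardCharsets;

public class AsciiReplacementDemo {
    public static void main(String[] args) {
        // '\u0080' (128) has no US-ASCII mapping; Charset.encode silently
        // substitutes the encoder's default replacement byte, '?' (63).
        byte b = StandardCharsets.US_ASCII
                .encode(CharBuffer.wrap(new char[] { '\u0080' }))
                .get();
        System.out.println(b); // prints 63
    }
}
```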
When I test it with the following code, it fails at 128.
for (long l = 0; l < 100000; l++) {
    char[] ascii = longToAscii(l);
    System.out.printf("%d -> ", l);
    long l2 = asciiToLong(ascii);
    System.out.printf("%d%n", l2); // was System.err.printf("%dn", ...): wrong stream and a typo in the format string
    if (l != l2) {
        throw new RuntimeException();
    }
}
Output:
0 -> 0
1 -> 1
2 -> 2
3 -> 3
// console output truncated..
126 -> 126
127 -> 127
128 -> 63 // fails
Changing the charset used to decode the long fails at the same position, with a different value.
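For reference, if I swap in ISO_8859_1 for both directions (assuming the goal is only a lossless long ↔ char[] round trip, not genuinely ASCII output), every value survives, since ISO-8859-1 maps each byte 0x00–0xFF to the char with the same code point and back. A sketch with hypothetical method names `longToChars`/`charsToLong`:

```java
import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.StandardCharsets;

public class LongCharsRoundTrip {

    // ISO-8859-1 maps every byte one-to-one to the char with the same value,
    // so no replacement character can ever be substituted.
    static char[] longToChars(long in) {
        ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
        buffer.putLong(in);
        buffer.flip(); // make the 8 written bytes readable for decode()
        CharBuffer charBuffer = StandardCharsets.ISO_8859_1.decode(buffer);
        char[] result = new char[charBuffer.remaining()];
        charBuffer.get(result);
        return result;
    }

    static long charsToLong(char[] in) {
        // encode() returns a buffer already positioned for reading
        ByteBuffer byteBuffer = StandardCharsets.ISO_8859_1.encode(CharBuffer.wrap(in));
        return byteBuffer.getLong();
    }

    public static void main(String[] args) {
        for (long l = 0; l < 100000; l++) {
            long back = charsToLong(longToChars(l));
            if (l != back) {
                throw new AssertionError(l + " -> " + back);
            }
        }
        System.out.println("all values round-tripped");
    }
}
```

This works but I don't understand why US_ASCII breaks at exactly 128.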