I need help figuring out why my code is returning wrong values. I'm writing my own UUID class in the Dart programming language. When setting the values, I can observe that everything in the byte array is in fact correct; it only goes wrong when I read the long (or unsigned long) back from the array. The return value is negative, which I initially thought might have to do with a sign bit, so I switched over to the unsigned read. This did not work.

I'll link the relevant code, and paste snippets here after the explanation. I am aiming for functional parity with Java, since my library also implements NBT (Named Binary Tag), and I wanted a way to read a UUID and write it back there as well. This is a general-purpose library for all of my coding projects that use Dart. Here's the code.
https://git.zontreck.com/AriasCreations/LibAC-dart/src/branch/feat/7/test/uuid_test.dart#L55-L61
```dart
test("Test v3 implementation", () {
  var expected =
      "3e1b8c8a-efab-381b-ab57-4764c45b0889"; // Minecraft offline UUID : zontreck
  var ID3 = UUID.generate(3, parameters: ["OfflinePlayer:zontreck", ""]);
  expect(ID3.toString(), expected);
});
```
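For comparison, here is the Java reference behavior the test appears to target (assumption: Minecraft offline-mode UUIDs are derived via `UUID.nameUUIDFromBytes`, which is the commonly documented derivation):

```java
import java.nio.charset.StandardCharsets;
import java.util.UUID;

public class OfflineUuidDemo {
    public static void main(String[] args) {
        // nameUUIDFromBytes computes the MD5-based (version 3) UUID
        UUID id = UUID.nameUUIDFromBytes(
                "OfflinePlayer:zontreck".getBytes(StandardCharsets.UTF_8));
        System.out.println(id); // expected per the test above:
                                // 3e1b8c8a-efab-381b-ab57-4764c45b0889

        // Both halves are plain signed longs. The least significant half is
        // negative here: the RFC 4122 variant bit (0x80 in byte 8) is also
        // the sign bit of the 64-bit long.
        System.out.println(id.getMostSignificantBits());
        System.out.println(id.getLeastSignificantBits());
    }
}
```

Note that Java itself happily carries a negative `getLeastSignificantBits()` value and still prints the UUID correctly.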
https://git.zontreck.com/AriasCreations/LibAC-dart/src/branch/feat/7/lib/utils/uuid/UUID.dart#L74-L133
```dart
if (params.length != 2)
  throw Exception("UUID v3 requires two parameters, [namespace,name]");

String namespace = params[0] as String;
String name = params[1] as String;

ByteLayer layer = ByteLayer();
if (!namespace.isEmpty) {
  final namespaceBytes = utf8.encode(namespace);
  layer.writeBytes(namespaceBytes);
}
if (!name.isEmpty) {
  final nameBytes = utf8.encode(name);
  layer.writeBytes(nameBytes);
}

var bytes = md5.convert(List.from(layer.bytes)).bytes;
layer.clear();
layer.writeBytes(bytes);

layer.unsetSetBit(6, 0x0F, 0x30);
print(
    "Existing bit at position 8: ${layer.getBit(8).toRadixString(2)}:${layer.getBit(8)}");
layer.unsetSetBit(8, 0x3F, 0x80);
print(
    "New bit at position 8: ${layer.getBit(8).toRadixString(2)}:${layer.getBit(8)}");

layer.resetPosition();
var msb = layer.readUnsignedLong();
var lsb = layer.readUnsignedLong(); // <----- This is where the problem happens

if (msb < 0) print("Most significant bit is negative! ${msb}");
if (lsb < 0) print("Least significant bit is negative! ${lsb}"); // <-- LSB is negative

print("Forming UUID using MSB-LSB - ${msb} - ${lsb}");
return UUID(msb, lsb);
}
```
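For reference, the two `unsetSetBit` calls look like the standard RFC 4122 masking (assuming `unsetSetBit(index, keepMask, setMask)` does `bytes[index] = (bytes[index] & keepMask) | setMask`). In Java that would read as follows; the byte values here are invented purely for illustration:

```java
public class Rfc4122MaskDemo {
    public static void main(String[] args) {
        // Hypothetical MD5 output bytes at indices 6 and 8
        int b6 = 0xEF;
        int b8 = 0x57;

        // Force the version nibble to 3 (name-based, MD5)
        int versioned = (b6 & 0x0F) | 0x30;
        // Force the variant bits to 10xxxxxx (RFC 4122)
        int varianted = (b8 & 0x3F) | 0x80;

        System.out.printf("%02x %02x%n", versioned, varianted); // 3f 97
    }
}
```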
https://git.zontreck.com/AriasCreations/LibAC-dart/src/branch/feat/7/lib/nbt/Stream.dart#L260-L265
```dart
int readUnsignedLong() {
  final value =
      _byteBuffer.buffer.asByteData().getUint64(_position, Endian.big);
  _position += 8;
  return value;
}
```
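The negative value is consistent with 64-bit two's-complement semantics: on the Dart VM, `int` is a signed 64-bit integer, so `getUint64` cannot actually represent values at or above 2^63, and any 8-byte sequence whose first bit is set comes back negative even though the underlying bits are intact. Java's `ByteBuffer.getLong` behaves the same way, which a short sketch can show (the byte values are taken from the lower half of the expected UUID):

```java
import java.nio.ByteBuffer;

public class SignedLongDemo {
    public static void main(String[] args) {
        // The last 8 bytes of the expected UUID: ab57-4764c45b0889
        byte[] lower = {
            (byte) 0xAB, 0x57, 0x47, 0x64, (byte) 0xC4, 0x5B, 0x08, (byte) 0x89
        };

        long lsb = ByteBuffer.wrap(lower).getLong();
        System.out.println(lsb); // negative: the top bit of 0xAB is the sign bit

        // The bits are intact; an unsigned rendering recovers the hex digits
        System.out.println(Long.toHexString(lsb)); // ab574764c45b0889
    }
}
```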
I've tried switching between signed and unsigned reads, inspecting the byte array in debug mode, and writing the long back to the array to check that the bytes are still correct (they are). I'm just not sure why I am getting a negative number here. Forcing the number to be positive doesn't work.
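One detail that may matter for parity: `java.util.UUID.toString()` never looks at the sign at all; it effectively formats the two halves as unsigned hex fields, so a negative long still prints correctly. A rough sketch of that formatting (field widths per RFC 4122, using the expected UUID's bit pattern):

```java
public class UuidFormatDemo {
    public static void main(String[] args) {
        long msb = 0x3e1b8c8aefab381bL;
        long lsb = 0xab574764c45b0889L; // negative as a signed long

        // Unsigned shifts (>>>) and masks ignore the sign entirely
        String s = String.format("%08x-%04x-%04x-%04x-%012x",
                msb >>> 32,
                (msb >>> 16) & 0xFFFF,
                msb & 0xFFFF,
                (lsb >>> 48) & 0xFFFF,
                lsb & 0xFFFFFFFFFFFFL);

        System.out.println(s); // 3e1b8c8a-efab-381b-ab57-4764c45b0889
    }
}
```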
QUESTION: How do I get this to work properly, and what am I doing wrong?