I am trying to build an AR measuring app. The user will be able to measure distances using AR (WebXR). I have figured out the distance-measuring part and it is working fine: currently I use hit testing to mark points and then measure the distance between those points.
function render(_: DOMHighResTimeStamp, frame?: XRFrame) {
  if (!frame) return;

  const referenceSpace = renderer.xr.getReferenceSpace();
  const session = renderer.xr.getSession();

  // Request the hit-test source once per session.
  if (!hitTestSourceRequested) {
    session.requestReferenceSpace('viewer').then((viewerSpace) => {
      session.requestHitTestSource({space: viewerSpace}).then((source) => {
        hitTestSource = source;
      });
    });
    session.requestReferenceSpace('local').then(edgeDetection.setReferenceSpace.bind(edgeDetection));
    session.addEventListener('end', () => {
      hitTestSourceRequested = false;
      hitTestSource = null;
    });
    hitTestSourceRequested = true;
  }

  // Place the reticle on the first hit-test result, if any.
  if (hitTestSource) {
    const hitTestResults = frame.getHitTestResults(hitTestSource);
    if (hitTestResults.length) {
      const hit = hitTestResults[0];
      reticle.visible = true;
      reticle.matrix.fromArray(hit.getPose(referenceSpace)!.transform.matrix);
    } else {
      reticle.visible = false;
    }
    activeShape?.updateShape(reticle.matrix);
  }

  // Keep the HTML measurement labels anchored to their 3D points.
  labels.forEach((label) => {
    const pos = toScreenPosition(label.point, renderer.xr.getCamera());
    label.div.style.transform = `translate(-50%, -50%) translate(${pos.x}px, ${pos.y}px)`;
  });

  edgeDetection.setupGLBinding(session);
  edgeDetection.detectEdges(frame); // fire-and-forget; results are consumed elsewhere
  renderer.render(scene, camera);
}
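The distance measurement itself is not shown above; a minimal sketch of what it boils down to, assuming the marked points are stored as THREE.Vector3 taken from the reticle's world position:

// Sketch: distance between two points marked with the reticle.
function measureDistance(reticleMatrix: THREE.Matrix4, points: THREE.Vector3[]): number | null {
  // Record the reticle's current world position as a measurement point.
  const point = new THREE.Vector3().setFromMatrixPosition(reticleMatrix);
  points.push(point);
  // Once two points are marked, the distance is just their separation (in meters).
  return points.length >= 2 ? points[points.length - 2].distanceTo(point) : null;
}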
Now I want to add a feature where the user will be able to measure distances by detecting contours of real-world objects. For example, if there is a rope the user would like to measure, they can just point at it, and the contour of the rope will be detected and highlighted automatically.
I have also figured out the contour-detection part: using WebXR raw camera access I grab the camera frame, then run it through OpenCV to get the edges.
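The frame grab looks roughly like this (XRWebGLBinding.getCameraImage and view.camera come from Chrome's raw-camera-access incubation; readPixelsToImageData is a hypothetical helper standing in for my readback code):

// Sketch: grabbing the camera frame via WebXR raw camera access.
// Assumes 'camera-access' was requested as a session feature.
// readPixelsToImageData is a hypothetical helper that blits the texture
// into a framebuffer and reads it back into an ImageData via gl.readPixels().
function grabCameraFrame(
  session: XRSession,
  frame: XRFrame,
  gl: WebGL2RenderingContext,
  referenceSpace: XRReferenceSpace
): ImageData | null {
  const pose = frame.getViewerPose(referenceSpace);
  if (!pose) return null;
  const binding = new XRWebGLBinding(session, gl);
  for (const view of pose.views) {
    const camera = (view as any).camera; // XRCamera, only set on camera-enabled views
    if (camera) {
      const texture = binding.getCameraImage(camera); // WebGLTexture with the camera pixels
      return readPixelsToImageData(gl, texture, camera.width, camera.height);
    }
  }
  return null;
}

The OpenCV processing then looks like this: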
private async processImage(imageData: ImageData) {
  await this.cvLoaded;

  // Grayscale -> binary threshold -> contours.
  const src = cv.matFromImageData(imageData);
  cv.cvtColor(src, this.gray, cv.COLOR_RGBA2GRAY);
  src.delete();
  cv.threshold(this.gray, this.edges, 100, 255, cv.THRESH_BINARY);
  cv.findContours(this.edges, this.contours, this.hierarchy, cv.RETR_LIST, cv.CHAIN_APPROX_SIMPLE);

  const minPerimeter = 50; // skip tiny contours; adjust as needed

  // Convert each surviving contour's 2D image points to 3D vectors.
  const contours3D: Array<Array<THREE.Vector3>> = [];
  for (let i = 0; i < this.contours.size(); i++) {
    const contour = this.contours.get(i);
    const perimeter = cv.arcLength(contour, true);
    if (perimeter >= minPerimeter) {
      const pts = contour.data32S; // flat [x0, y0, x1, y1, ...] in image pixels
      const points3D: Array<THREE.Vector3> = [];
      for (let j = 0; j < pts.length; j += 2) {
        points3D.push(this.convertTo3D(pts[j], pts[j + 1]));
      }
      contours3D.push(points3D);
    }
    contour.delete();
  }
  return contours3D;
}
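Highlighting a converted contour is then just a matter of turning its points into a line; roughly:

// Sketch: highlight a detected contour as a line in the scene.
function highlightContour(scene: THREE.Scene, points3D: THREE.Vector3[]): THREE.Line {
  const geometry = new THREE.BufferGeometry().setFromPoints(points3D);
  const material = new THREE.LineBasicMaterial({color: 0x00ff00});
  const line = new THREE.Line(geometry, material);
  scene.add(line);
  return line;
}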
All this is working fine and I am getting the edges. The problem is that I don't know how to convert the contour coordinates (x, y) to AR-space coordinates (x, y, z). This is what I have tried, but it is not working (the resulting coordinates are wrong):
private convertTo3D(x: number, y: number) {
  // Map pixel coordinates to normalized device coordinates (NDC).
  const vector = new THREE.Vector3(
    (x / window.innerWidth) * 2 - 1,
    -(y / window.innerHeight) * 2 + 1,
    0 // fixed NDC depth, so every point lands at the same distance from the camera
  );
  vector.unproject(this.camera);
  return vector;
}
What is the correct way to convert the coordinates here? And is there a better way to measure edges in AR than what I am doing?