Abstract

Augmented reality (AR) enhances the user's perception of the real environment by superimposing computer-generated virtual images that provide additional visual information complementing the real-world view. AR systems are rapidly gaining popularity in manufacturing fields such as training, maintenance, assembly, and robot programming. In many AR applications, the virtual environment, which is not physically present, must be precisely aligned with the physical environment so that human users perceive the virtual augmentation correctly in the context of their real surroundings. The process of achieving this accurate alignment is known as calibration. In several robotics applications using AR, we observed misalignment of the visual representation within the designated workspace, which can degrade the accuracy of the robot's operations during a task. Building on previous research on AR-assisted robot programming systems, this work investigates the sources of misalignment errors and presents a simple and efficient calibration procedure that reduces misalignment in general video see-through AR systems. Accurately superimposing virtual information onto the real environment requires identifying the sources and propagation of errors. In this work, we outline the linear transformation and projection of each point from virtual world space to virtual screen coordinates. An offline calibration method is introduced to determine the offset matrix from the head-mounted display (HMD) to the camera, and experiments are conducted to validate the improvement achieved by the calibration.
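As an illustrative sketch of such a point-transformation chain (written with generic pinhole-camera symbols, not the paper's own notation), a virtual world-space point can be mapped to screen coordinates as

\[
\tilde{p}_{\mathrm{screen}} \;\propto\; K \,\bigl[\, R_{\mathrm{off}} \;\big|\; t_{\mathrm{off}} \,\bigr]\; T_{\mathrm{HMD}\leftarrow\mathrm{world}}\; \tilde{P}_{\mathrm{world}},
\]

where \(\tilde{P}_{\mathrm{world}}\) is the point in homogeneous world coordinates, \(T_{\mathrm{HMD}\leftarrow\mathrm{world}}\) is the tracked pose mapping world coordinates into the HMD frame, \(\bigl[R_{\mathrm{off}} \mid t_{\mathrm{off}}\bigr]\) is the HMD-to-camera offset of the kind estimated by an offline calibration, and \(K\) is the camera intrinsic matrix; \(\propto\) denotes equality up to the homogeneous scale. In a video see-through system, errors in the tracked pose, the estimated offset, or the intrinsics propagate through this chain and appear on screen as misalignment between the virtual overlay and the real scene.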
