GOAR: GIS-ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT
4th International Conference on Communications, Mobility, and Computing (CMC2012), Guilin, China
TOMOHIRO FUKUDA, TIAN ZHANG, AYAKO SHIMIZU, MASAHARU TAGUCHI, LEI SUN and NOBUYOSHI YABUKI
Division of Sustainable Energy and Environmental Engineering,
Graduate School of Engineering, Osaka University, Japan
Outline
1. Introduction
2. System Development
1. Development Environment of a System
2. System Flow
3. Verification of System
1. Consideration of allowable residual error
2. Accuracy of geometric consistency with a video image and 3DCG
4. Conclusion
1.1 Motivation (1)
In recent years, the need for landscape simulation has been growing. Review meetings on a future landscape are now held on the planned construction site as well as in a conference room.
It is difficult for stakeholders to form a concrete image of a three-dimensional object that does not yet exist. Landscape visualization methods using Computer Graphics (CG) and Virtual Reality (VR) have therefore been developed.
However, these methods require much time and expense to build a 3D model. Moreover, since VR shown on a planned construction site is not consistent with the real space, reviewers cannot get an immersive experience.
Figures: A landscape study on site; VR capture of Kobe City
1.1 Motivation (2)
In this research, the authors focus on Augmented Reality (AR), which can superimpose 3DCG on an actual landscape captured with a video camera. When AR is used, the landscape assessment object is shown within the present surroundings, so a drastic reduction in the time and expense of 3DCG modeling of those surroundings can be expected.
Smartphones are now widely available on the market.
Figures: Sekai Camera Web (http://sekaicamera.com/); Smartphone market in Japan
Evolution of Mobile Landscape AR
Figure: development timeline, from an image sketch (2005) onward
1.2 Previous Study (1)
In AR, realizing geometric consistency between a video image of the actual landscape and CG is an important issue. Two main approaches have been used:
1. Use of physical sensors such as GPS (Global Positioning System) and a gyroscope. To realize highly precise geometric consistency, expensive special hardware is required.
1.2 Previous Study (2)
2. Use of an artificial marker. Since the marker must always be visible to the AR camera, the user's range of movement is limited. Moreover, a large marker is needed to realize high precision.
Figure: artificial marker
Yabuki, N., et al. (2011). An invisible height evaluation system for building height regulation to preserve good landscapes using augmented reality. Automation in Construction, 20(3), 228-235.
1.3 Aim
In this research, the GOAR (GIS Oriented Mobile AR) system is developed. It realizes geometric consistency by using GIS to obtain position data, instead of GPS whose location accuracy is low, together with the gyroscope and video camera mounted in a smartphone.
A low-cost AR system with high flexibility thus becomes realizable.
Figure: virtual object for landscape simulation
Outline
1. Introduction
2. System Development
1. Development Environment of a System
2. System Flow
3. Verification of System
1. Consideration of allowable residual error
2. Accuracy of geometric consistency with a video image and 3DCG
4. Conclusion
2.1 Development Environment of a System
Standard-spec smartphone: GALAPAGOS 003SH (Softbank Mobile Corp.)
Development language: OpenGL ES (Ver. 2.0), Java (Ver. 1.6)
Development environment: Eclipse Galileo (Ver. 3.5)
Location estimation technology: GIS, including the Google Maps API and a Digital Elevation Model (DEM) with a 10 m mesh size
Spec of 003SH:
OS: Android™ 2.2
CPU: Qualcomm® MSM8255 Snapdragon® 1 GHz
Memory: ROM 1 GB, RAM 512 MB
Weight: approx. 140 g
Size: approx. W62 × H121 × D12 mm
Display size: 3.8 inch
Resolution: 480 × 800 pixels
Figure: the 003SH and its video camera
2.2 System Flow (1)
Calibration of the video camera using the Android NDK and OpenCV.
While the CG model is rendered by an ideal perspective projection, the image from the video camera contains lens distortion, so the camera is calibrated before superposition.
Figure: distortion calibration
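As an illustration of how the calibrated camera parameters can be carried over to the CG virtual camera, the minimal Java sketch below builds an OpenGL ES projection matrix whose frustum matches a pinhole camera, using android.opengl.Matrix.frustumM. This is not the authors' implementation; the intrinsic parameters (fx, fy, cx, cy) are placeholders that would come from the OpenCV calibration step.

    import android.opengl.Matrix;

    /** Builds a CG projection matrix that matches a calibrated pinhole camera. */
    public final class ProjectionFromCalibration {

        /**
         * fx, fy        : focal lengths in pixels (from camera calibration)
         * cx, cy        : principal point in pixels (from camera calibration)
         * width, height : size of the video image in pixels
         * near, far     : clipping planes in metres
         */
        public static float[] build(float fx, float fy, float cx, float cy,
                                    float width, float height,
                                    float near, float far) {
            // Extent of the image plane projected onto the near clipping plane.
            float left   = -cx * near / fx;
            float right  = (width - cx) * near / fx;
            float bottom = -(height - cy) * near / fy;
            float top    = cy * near / fy;

            float[] projection = new float[16];
            Matrix.frustumM(projection, 0, left, right, bottom, top, near, far);
            return projection;
        }
    }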
Definition of landscape assessment 3DCG model
Selection of 3DCG model
Calibration of a video camera
Activation of AR system
Starting of Google Maps
Position information acquisition
Activation of gyroscope
Angle information acquisition
Definition of position and angle information on CG virtual camera
Superposition to live video image and 3DCG model
Display of AR image
Activation of video camera
Capture of live video image
Save of AR image
Input of DEM
2.2 System Flow (2)
Definition of the landscape assessment 3DCG model:
▶ 3DCG model: geometry, texture, unit
▶ 3DCG model allocation file: 3DCG model name, file name, position data (longitude, latitude, orthometric height), rotation angle, and zone number of the plane rectangular coordinate system
▶ 3DCG model arrangement information file: the number of 3DCG model allocation files and the name of each
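For illustration, a minimal Java sketch of reading such an allocation file follows. The comma-separated layout, the example line, and the class name ModelAllocation are assumptions made for this sketch; the actual file format of the developed system is not given on the slide.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.Reader;
    import java.util.ArrayList;
    import java.util.List;

    /** One entry of the (hypothetical) 3DCG model allocation file. */
    public final class ModelAllocation {
        public final String modelName;
        public final String fileName;
        public final double longitude;          // degrees
        public final double latitude;           // degrees
        public final double orthometricHeight;  // metres
        public final double rotationDeg;        // rotation angle of the model
        public final int zoneNumber;            // plane rectangular coordinate zone

        public ModelAllocation(String modelName, String fileName, double longitude,
                               double latitude, double orthometricHeight,
                               double rotationDeg, int zoneNumber) {
            this.modelName = modelName;
            this.fileName = fileName;
            this.longitude = longitude;
            this.latitude = latitude;
            this.orthometricHeight = orthometricHeight;
            this.rotationDeg = rotationDeg;
            this.zoneNumber = zoneNumber;
        }

        /** Parses lines such as: "GSE_East,gse_east.obj,135.520751,34.823027,60.15,0.0,6" */
        public static List<ModelAllocation> read(Reader in) throws IOException {
            List<ModelAllocation> entries = new ArrayList<ModelAllocation>();
            BufferedReader reader = new BufferedReader(in);
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.trim().isEmpty()) continue;
                String[] f = line.split(",");
                entries.add(new ModelAllocation(f[0].trim(), f[1].trim(),
                        Double.parseDouble(f[2]), Double.parseDouble(f[3]),
                        Double.parseDouble(f[4]), Double.parseDouble(f[5]),
                        Integer.parseInt(f[6].trim())));
            }
            return entries;
        }
    }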
2.2 System Flow (3)
Figure: GUI of the developed system
2.2 System Flow (4)
Figure: coordinate system of the developed AR system (yaw, pitch, roll)
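A minimal Java sketch of how the step "Definition of position and angle information on CG virtual camera" could be realized with android.opengl.Matrix, given the acquired yaw, pitch and roll angles and the viewpoint position. The rotation order and sign conventions here are illustrative assumptions, not necessarily those of the developed system.

    import android.opengl.Matrix;

    /** CG virtual camera defined by the acquired position and angle information. */
    public final class VirtualCamera {

        /**
         * yawDeg   : rotation about the vertical axis (heading)
         * pitchDeg : rotation about the horizontal side axis
         * rollDeg  : rotation about the viewing axis
         * ex,ey,ez : camera position in the local CG coordinate system (m)
         */
        public static float[] viewMatrix(float yawDeg, float pitchDeg, float rollDeg,
                                         float ex, float ey, float ez) {
            float[] view = new float[16];
            Matrix.setIdentityM(view, 0);
            // Apply the inverse camera transform: undo roll, pitch and yaw,
            // then move the world by the negative eye position.
            Matrix.rotateM(view, 0, -rollDeg,  0f, 0f, 1f);
            Matrix.rotateM(view, 0, -pitchDeg, 1f, 0f, 0f);
            Matrix.rotateM(view, 0, -yawDeg,   0f, 1f, 0f);
            Matrix.translateM(view, 0, -ex, -ey, -ez);
            return view;
        }
    }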
2.2 System Flow (5)
1. The user taps the current location on Google Maps.
2. The position data (longitude, latitude) of the current location is obtained.
3. The altitude is derived from the position data (longitude, latitude) and the DEM (a sketch of this step follows below).
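A minimal Java sketch of step 3, assuming the 10 m mesh DEM is held as a regular grid and that the altitude between the four surrounding mesh vertices is obtained by bilinear interpolation (the slides only state that interpolation is linear). The grid layout and field names are hypothetical.

    /** Interpolates an altitude from a regular-grid DEM at a tapped position. */
    public final class DemInterpolator {
        private final double[][] altitude; // altitude[row][col] in metres
        private final double originLon;    // longitude of grid origin (degrees)
        private final double originLat;    // latitude of grid origin (degrees)
        private final double cellLon;      // grid spacing in longitude (degrees)
        private final double cellLat;      // grid spacing in latitude (degrees)

        public DemInterpolator(double[][] altitude, double originLon, double originLat,
                               double cellLon, double cellLat) {
            this.altitude = altitude;
            this.originLon = originLon;
            this.originLat = originLat;
            this.cellLon = cellLon;
            this.cellLat = cellLat;
        }

        /** Returns the interpolated altitude (m) at the tapped longitude/latitude. */
        public double altitudeAt(double lon, double lat) {
            double gx = (lon - originLon) / cellLon;   // fractional column index
            double gy = (lat - originLat) / cellLat;   // fractional row index
            int c = (int) Math.floor(gx);
            int r = (int) Math.floor(gy);
            double fx = gx - c;
            double fy = gy - r;
            // Bilinear blend of the four surrounding mesh vertices.
            double a00 = altitude[r][c];
            double a01 = altitude[r][c + 1];
            double a10 = altitude[r + 1][c];
            double a11 = altitude[r + 1][c + 1];
            double south = a00 * (1 - fx) + a01 * fx;
            double north = a10 * (1 - fx) + a11 * fx;
            return south * (1 - fy) + north * fy;
        }
    }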
2.2 System Flow (6)
Evolution of Mobile Landscape AR
Outline
1. Introduction
2. System Development
1. Development Environment of a System
2. System Flow
3. Verification of System
1. Consideration of allowable residual error
2. Accuracy of geometric consistency with a video image and 3DCG
4. Conclusion
3.1 Consideration of allowable residual error
A residual error of position (longitude, latitude) arises from the gap between the position a user taps on Google Maps and the actual position.
At the maximum zoom level of Google Maps, the 78 mm wide screen corresponds to 123 m in real space. That is, 1 mm on the screen is about 1.6 m in real space.
On the other hand, since a tap is made with a finger, a residual error of up to the width of the finger can occur. Because finger width differs between individuals, it was set to 5 mm in this research.
Therefore, taking the map scale and the finger-width error into consideration, the allowable error when specifying latitude and longitude is set to less than 8 m.
Figure: 1 mm on the screen = 1.6 m in real space; 5 mm (width of a finger) = 8 m in real space
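As a check, the 8 m figure follows directly from the numbers above:

\[
\frac{123\ \mathrm{m}}{78\ \mathrm{mm}} \approx 1.58\ \mathrm{m/mm},
\qquad
5\ \mathrm{mm} \times 1.58\ \mathrm{m/mm} \approx 7.9\ \mathrm{m} < 8\ \mathrm{m}
\]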
Moreover, regarding the residual error of altitude, a difference from reality is expected because the 10 m mesh DEM cannot reflect altitude changes that occurred after the model was created, and because the altitude between mesh vertices is linearly interpolated.
3.2 Accuracy of geometric consistency with a video image and 3DCG (1)
Experimental Methodology
▶ The parameters for realizing geometric consistency are:
  ▶ Position: latitude, longitude, altitude (obtained by GIS)
  ▶ Angle: yaw, pitch, roll (obtained by the gyroscope)
▶ The accuracy of geometric consistency is determined by the combination of the residual errors of these parameters.
▶ A known building and a known viewpoint are set up.
▶ In each experiment, only certain parameters are acquired dynamically from a device; the remaining parameters are fixed to their known values.
▶ The residual error between the live video image and the CG is calculated at the same points.
3.2 Accuracy of geometric consistency with a video image and 3DCG (2)
Known Building Target
▶ GSE Common East Building at Osaka University Suita Campus
▶ W 29.6 m, D 29.0 m, H 67.0 m
▶ Latitude, longitude, orthometric height: 34.823026944, 135.520751389, 60.15
Figures: photo, drawing, and outlined 3D model of the building
3.2 Accuracy of geometric consistency with a video image and 3DCG (3)
Known Viewpoint Place
▶ The No.14-563 reference point. The distance from the reference point to the center of the building was 203 m.
▶ The AR system was mounted on a tripod at a height of 1.5 m.
▶ Latitude, longitude, altitude of the reference point: 34.82145699, 135.519612, 53.1
Figures: measuring points of the residual error (A, B, C, D) on the building target; map from the viewpoint (No.14-563 reference point) to the building target (203 m); reference point on the 10 m mesh DEM (maximum altitude 53.5 m, minimum altitude 51.0 m, altitude of the reference point 53.1 m)
3.2 Accuracy of geometric consistency with a video image and 3DCG (4)
Parameter Settings of the Experiments
(S: static value = known value; D: dynamic value = value acquired from a device)
Experiment | Position information of CG virtual camera: Latitude / Longitude / Altitude | Angle information of CG virtual camera: yaw / pitch / roll
No.1 | S / S / S | S / S / S
No.2 | D (GIS) / D (GIS) / D (GIS) | S / S / S
No.3 | D (GIS) / D (GIS) / D (GIS) | D / D / D
No.4 | D (GPS) / D (GPS) / D (GPS) | D / D / D
1) T. Fukuda, T. Zhang, A. Shimizu, M. Taguchi, L. Sun, N. Yabuki, "SOAR: Sensor Oriented Mobile Augmented Reality for Urban Landscape Assessment", Proceedings of the 17th International Conference on Computer Aided Architectural Design Research in Asia (CAADRIA), pp. 387-396, 2012.
Calculation Procedure of Residual Error
1. Pixel error: the difference between the live video image and the CG model in the horizontal and vertical directions at each of the four measured points, in pixels (Δx, Δy).
Figure: calculation of the residual error between the live video image and the CG model (Δx, Δy)
2. Distance error: from the acquired values (Δx, Δy), the difference in the horizontal and vertical directions is converted into metres with formulas (1) and (2) (ΔX, ΔY):

ΔX = (W / x) × Δx   (1)
ΔY = (H / y) × Δy   (2)

W: actual width of the object (m); H: actual height of the object (m); x: width of the 3DCG model on the AR image (px); y: height of the 3DCG model on the AR image (px)
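A minimal Java sketch of formulas (1) and (2), with variable names following the definitions above:

    /** Conversion of the measured pixel error into a distance error (metres). */
    public final class ResidualError {

        /** Formula (1): W = actual width of the object (m),
         *  x = width of the 3DCG model on the AR image (px), dx = pixel error. */
        public static double horizontal(double W, double x, double dx) {
            return (W / x) * dx;
        }

        /** Formula (2): H = actual height of the object (m),
         *  y = height of the 3DCG model on the AR image (px), dy = pixel error. */
        public static double vertical(double H, double y, double dy) {
            return (H / y) * dy;
        }
    }

The "(0.12 m/pixel)" noted on the result slides below is presumably this per-pixel scale factor (W / x, respectively H / y) for the target building.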
Results: No.1 (Latitude S, Longitude S, Altitude S; yaw S, pitch S, roll S)
Figures: AR image; pixel error and distance error charts (Max./Min./Mean, 0.12 m/pixel)
Results: No.2 (Latitude D (GIS), Longitude D (GIS), Altitude D (GIS); yaw S, pitch S, roll S)
Figures: AR image; pixel error and distance error charts (Max./Min./Mean, 0.12 m/pixel)
Results: No.3 (Latitude D (GIS), Longitude D (GIS), Altitude D (GIS); yaw D, pitch D, roll D)
Figures: AR image; pixel error and distance error charts (Max./Min./Mean, 0.12 m/pixel)
Results: No.4 (Latitude D (GPS), Longitude D (GPS), Altitude D (GPS); yaw D, pitch D, roll D)
Figures: AR image; pixel error and distance error charts (Max./Min./Mean, 0.12 m/pixel)
3.2 Accuracy of geometric consistency with a video image and 3DCG
Allowable residual error of longitude and latitude: 8 m at the maximum.
In the result of No.3, the maximum residual error was 6.5 m and the mean distance error was 2.2 m, smaller than anticipated.
Comparing the mean distance error of No.3 with that of No.4: horizontally it was 0.7 m larger, vertically it was 5 m smaller.
The proposed GIS technique therefore obtains position data with higher accuracy than GPS, especially in the vertical direction.
Figure: distance error (Max./Min./Mean) of experiments No.1–No.4
Outline
1. Introduction
2. System Development
1. Development Environment of a System
2. System Flow
3. Verification of System
1. Consideration of allowable residual error
2. Accuracy of geometric consistency with a video image and 3DCG
4. Conclusion
4.1 Conclusion
The developed AR system realizes geometric consistency using GIS and the gyroscope built into the smartphone. Therefore, a user can operate it easily, and it can be described as a system with high flexibility.
In the GOAR system, a residual error of longitude and latitude is expected because the user specifies the current position on Google Maps, and a residual error of altitude is expected because a 10 m mesh DEM is used. In the experiment, the maximum residual error of longitude and latitude was 6.5 m and the mean distance error was 2.2 m; the maximum residual error of altitude was 2.6 m and the mean distance error was 1.3 m. All results were smaller than assumed.
Consequently, the proposed GOAR system was evaluated as feasible and effective.
4.2 Future Work
Future work should attempt to reduce the residual error included in the dynamic values acquired with the gyroscope.
It is also necessary to verify the residual error for objects more than 200 m away, as well as the usability of the system.
Thank you for your attention!
E-mail: [email protected]
Twitter: fukudatweet
Facebook: Tomohiro Fukuda
LinkedIn: Tomohiro Fukuda