
SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT


This slide deck was presented at CAADRIA 2012 (The 17th International Conference on Computer-Aided Architectural Design Research in Asia). Abstract. This research presents the development of a sensor-oriented mobile AR system for urban landscape assessment that realizes geometric consistency using the GPS, gyroscope and video camera mounted in a smartphone. A low-cost AR system with high flexibility is realized. Consistency between the viewing angles of the video camera and the CG virtual camera, and geometric consistency between the video image and 3DCG, are verified. In conclusion, the proposed system was evaluated as feasible and effective.


Page 1: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

SOAR SENSOR ORIENTED MOBILE

AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

CAADRIA2012, Chennai, India

TOMOHIRO FUKUDA, TIAN ZHANG, AYAKO SHIMIZU, MASAHARU TAGUCHI, LEI SUN and NOBUYOSHI YABUKI

Division of Sustainable Energy and Environmental Engineering,

Graduate School of Engineering, Osaka University, Japan

Page 2: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Contents

1. Introduction

2. System Development

1. Development Environment of a System

2. System Flow

3. Verification of System

1. Consistency with the viewing angle of a video camera and CG virtual camera

2. Accuracy of geometric consistency with a video image and 3DCG

4. Conclusion



Page 4: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

1.1 Motivation -1 1. Introduction

In recent years, the need for landscape simulation has been growing. Review meetings on a future landscape are now held on the planned construction site as well as in a conference room.

It is difficult for stakeholders to concretely imagine a three-dimensional scene that does not yet exist. Landscape visualization methods using Computer Graphics (CG) and Virtual Reality (VR) have been developed.

However, these methods require much time and expense to build a 3D model. Moreover, since VR is not consistent with real space when used on a planned construction site, reviewers cannot get an immersive experience.

[Photos: a landscape study on site; VR capture of Kobe city]

Page 5: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

1.1 Motivation -2 1. Introduction

In this research, the authors focus on Augmented Reality (AR), which can superimpose 3DCG onto an actual landscape acquired with a video camera. When AR is used, the landscape assessment object is shown embedded in the present surroundings. Thereby, a drastic reduction of the time and expense involved in 3DCG modeling of the present surroundings can be expected.

Smartphones are now widely available on the market.

Sekai Camera Web http://sekaicamera.com/

Smartphone Market in Japan

Page 6: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Evolution of Mobile Landscape AR

©2012 Tomohiro Fukuda, Osaka-U

[Timeline figure: Image Sketch (2005), system of 2006]

1.2 Previous Study 1. Introduction

In AR, realizing geometric consistency between a video image of the actual landscape and CG is essential.

1. Use of physical sensors such as GPS (Global Positioning System) and a gyroscope. To realize highly precise geometric consistency, expensive special hardware is required.

Page 7: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

1.2 Previous Study 1. Introduction


Yabuki, N., et al.: 2011, An invisible height evaluation system for building height regulation to preserve good landscapes using augmented reality, Automation in Construction, 20(3), 228-235. [Figure: artificial marker]

2. Use of an artificial marker. Since the artificial marker must always be visible to the AR camera, the user's range of movement is limited. Moreover, realizing high precision requires a large marker.

Page 8: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

1.3 Aim 1. Introduction

In this research, SOAR (Sensor Oriented Mobile AR), a system which realizes geometric consistency using the GPS, gyroscope and video camera mounted in a smartphone, is developed.

A low-cost AR system with high flexibility is realized.


Page 10: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

2.1 Development Environment of a System

Standard-spec smartphone: GALAPAGOS 003SH (Softbank Mobile Corp.)

Development languages: OpenGL ES (Ver. 2.0), Java (Ver. 1.6)

Development environment: Eclipse Galileo (Ver. 3.5)

Location estimation technology: A-GPS (Assisted Global Positioning System)

Spec of 003SH: OS Android™ 2.2; CPU Qualcomm® MSM8255 Snapdragon® 1 GHz; Memory ROM 1 GB, RAM 512 MB; Weight ≈140 g; Size ≈W62×H121×D12 mm; Display 3.8 inch, 480×800 pixels

[Photo: the 003SH and its video camera]

2. System Development

Page 11: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

2.2 System Flow -1 2. System Development

System flow: Definition of landscape assessment 3DCG model → Selection of 3DCG model → Calibration of the video camera → Activation of AR system → Activation of GPS / gyroscope / video camera → Acquisition of position information, angle information, and live video image → Definition of position and angle of the CG virtual camera → Superposition of live video image and 3DCG model → Display of AR image → Save of AR image

While the CG virtual camera renders an ideal perspective projection, the physical video camera introduces lens distortion. The video camera is therefore calibrated using Android NDK and OpenCV.
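The calibration step above can be sketched with the standard radial (Brown) distortion model; the coefficients below are illustrative, not the values actually estimated for the 003SH camera:

```java
// Sketch of the radial (Brown) lens distortion model that camera
// calibration estimates and then inverts. The coefficients k1, k2 are
// illustrative placeholders, not the 003SH's calibrated values.
public class RadialDistortion {
    // Map an ideal normalized image point (x, y) to its distorted position.
    public static double[] distort(double x, double y, double k1, double k2) {
        double r2 = x * x + y * y;                 // squared radius from center
        double f = 1.0 + k1 * r2 + k2 * r2 * r2;   // radial scaling factor
        return new double[] { x * f, y * f };
    }
}
```

Calibration estimates such coefficients (plus the intrinsic matrix) from images of a known pattern; each live frame is then undistorted so that it matches the CG camera's ideal perspective projection.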

Page 12: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

2.2 System Flow -2 2. System Development


3DCG model: geometry, texture, unit.

3DCG model allocation file: 3DCG model name, file name, position data (longitude, latitude, orthometric height), rotation angle (degrees), and zone number of the plane rectangular coordinate system.

3DCG model arrangement information file: number of 3DCG model allocation files and their names.
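One allocation record could be read as follows; the paper lists the fields but not the concrete file syntax, so a comma-separated line is assumed here purely for illustration:

```java
// Hypothetical reader for one 3DCG model allocation record. The field list
// follows the slide; the CSV layout is an assumption for illustration.
public class ModelAllocation {
    public final String modelName, fileName;
    public final double longitudeDeg, latitudeDeg, orthometricHeightM, rotationDeg;
    public final int zoneNumber; // zone of the plane rectangular coordinate system

    public ModelAllocation(String line) {
        String[] f = line.split(",");
        modelName = f[0].trim();
        fileName = f[1].trim();
        longitudeDeg = Double.parseDouble(f[2].trim());
        latitudeDeg = Double.parseDouble(f[3].trim());
        orthometricHeightM = Double.parseDouble(f[4].trim());
        rotationDeg = Double.parseDouble(f[5].trim());
        zoneNumber = Integer.parseInt(f[6].trim());
    }
}
```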

Page 13: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

2.2 System Flow -3 2. System Development


GUI of the Developed System

Page 14: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

2.2 System Flow -4 2. System Development


[Figure: coordinate system of the developed AR system (yaw, pitch, roll)]
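The yaw, pitch and roll angles define the orientation of the CG virtual camera. A minimal sketch of composing the rotation follows; the axis assignment and multiplication order are assumptions for illustration, not taken from the paper:

```java
// Sketch: build the CG virtual camera's rotation from yaw, pitch, roll.
// Axis assignment (yaw about vertical Y, pitch about X, roll about Z) and
// the composition order are illustrative assumptions.
public class CameraRotation {
    static double[][] rotY(double a) { // yaw, about the vertical axis
        double c = Math.cos(a), s = Math.sin(a);
        return new double[][] { { c, 0, s }, { 0, 1, 0 }, { -s, 0, c } };
    }
    static double[][] rotX(double a) { // pitch
        double c = Math.cos(a), s = Math.sin(a);
        return new double[][] { { 1, 0, 0 }, { 0, c, -s }, { 0, s, c } };
    }
    static double[][] rotZ(double a) { // roll
        double c = Math.cos(a), s = Math.sin(a);
        return new double[][] { { c, -s, 0 }, { s, c, 0 }, { 0, 0, 1 } };
    }
    static double[][] mul(double[][] a, double[][] b) { // 3x3 matrix product
        double[][] r = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    r[i][j] += a[i][k] * b[k][j];
        return r;
    }
    // Apply roll first, then pitch, then yaw.
    public static double[][] orientation(double yaw, double pitch, double roll) {
        return mul(rotY(yaw), mul(rotX(pitch), rotZ(roll)));
    }
}
```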

Page 15: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

2.2 System Flow -4 2. System Development


Position data is converted into plane rectangular coordinates (x, y). Orthometric height is obtained by subtracting the geoid height from the ellipsoidal height.

The yaw angle from the sensor references magnetic north. To use a true-north value in the AR system, the local magnetic declination is acquired and corrected for.
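The two corrections above can be sketched as follows; the geoid height and declination arguments are illustrative placeholders, as real values come from survey and geomagnetic data:

```java
// Sketch of the height and heading corrections described above.
public class GeoCorrections {
    // Orthometric height = ellipsoidal height - geoid height.
    public static double orthometricHeight(double ellipsoidalM, double geoidM) {
        return ellipsoidalM - geoidM;
    }
    // Convert a magnetic-north yaw to true north by applying the signed
    // local magnetic declination, wrapping the result into [0, 360).
    public static double trueYawDeg(double magneticYawDeg, double declinationDeg) {
        double yaw = (magneticYawDeg + declinationDeg) % 360.0;
        return yaw < 0 ? yaw + 360.0 : yaw;
    }
}
```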

Page 16: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

2.2 System Flow -5 2. System Development


Page 17: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Evolution of Mobile Landscape AR

[Timeline figure: system of 2011]


Page 19: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Known Building Target ▶ GSE Common East Building at Osaka University Suita Campus

▶ W29.6 m, D29.0 m, H67.0 m


3.2 Accuracy of geometric consistency with a video image and 3DCG

[Figures: photo, drawing, and outlined 3D model of the building, annotated 29.6 m (W) × 28.95 m (D) × 64.8 m (H). Model reference point: latitude 34.823026944, longitude 135.520751389, orthometric height 60.15 m]

3. Verification of System

Page 20: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Known Viewpoint Place ▶ No.14-563 reference point. The distance from the reference point to the center of the building was 203 m.

▶ The AR system was installed on a tripod at a height of 1.5 m.

[Map: viewpoint (No.14-563 reference point), building target 203 m away, and measuring points A, B, C, D of residual error]

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

Viewpoint: latitude 34.82145699, longitude 135.519612, orthometric height 53.1 m

Page 21: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Parameter Settings of Eight Experiments


Experiment Position Information of

CG Virtual Camera Angle Information of CG Virtual Camera

Latitude Longitude Altitude yaw pitch roll

No.1 S S S S S S

No.2 D D D D D D

No.3 D S S S S S

No.4 S D S S S S

No.5 S S D S S S

No.6 S S S D S S

No.7 S S S S D S

No.8 S S S S S D

Parameter Settings (S: Static Value = Known value, D: Dynamic Value = Acquired value from a device )

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

Page 22: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Calculation Procedure of Residual Error

1. Pixel Error: the difference between the live image and the CG model at each of the four measuring points, measured in pixels in the horizontal and vertical directions (Δx, Δy).

[Figure: calculation of residual error (Δx, Δy) between the live video image and the CG model]

3.2 Accuracy of geometric consistency with a video image and 3DCG

2. Distance Error: from the acquired values (Δx, Δy), the differences in the horizontal and vertical directions are converted to metres (ΔX, ΔY) by formulas (1) and (2):

ΔX = Δx × (W / x)  (1)

ΔY = Δy × (H / y)  (2)

W: actual width of the object (m); H: actual height of the object (m); x: width of the 3DCG model on the AR image (px); y: height of the 3DCG model on the AR image (px)
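Formulas (1) and (2) simply scale the pixel error by the object's real size per on-screen pixel. A minimal sketch:

```java
// Distance error from pixel error, per formulas (1) and (2):
//   deltaX = dx * (W / x),  deltaY = dy * (H / y)
public class ResidualError {
    public static double distanceErrorX(double dxPx, double widthM, double widthPx) {
        return dxPx * (widthM / widthPx);   // metres per horizontal pixel
    }
    public static double distanceErrorY(double dyPx, double heightM, double heightPx) {
        return dyPx * (heightM / heightPx); // metres per vertical pixel
    }
}
```

At the reported scale of 0.12 m/pixel, a 1.5-pixel error corresponds to 1.5 × 0.12 = 0.18 m.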

3. Verification of System

Page 23: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Results: No.1

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

[AR image; charts of pixel error (px) and distance error (m; Max./Min./Mean) for experiments No.1 to No.8. Scale: 0.12 m/pixel]

Parameters of No.1: Latitude S, Longitude S, Altitude S, yaw S, pitch S, roll S

Page 24: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Results: No.2

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

[AR image; charts of pixel error (px) and distance error (m; Max./Min./Mean) for experiments No.1 to No.8. Scale: 0.12 m/pixel]

Parameters of No.2: Latitude D, Longitude D, Altitude D, yaw D, pitch D, roll D

Page 25: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Results: No.3

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

[AR image; charts of pixel error (px) and distance error (m; Max./Min./Mean) for experiments No.1 to No.8. Scale: 0.12 m/pixel]

Parameters of No.3: Latitude D, Longitude S, Altitude S, yaw S, pitch S, roll S

Page 26: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Results: No.4

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

[AR image; charts of pixel error (px) and distance error (m; Max./Min./Mean) for experiments No.1 to No.8. Scale: 0.12 m/pixel]

Parameters of No.4: Latitude S, Longitude D, Altitude S, yaw S, pitch S, roll S

Page 27: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Results: No.5

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

[AR image; charts of pixel error (px) and distance error (m; Max./Min./Mean) for experiments No.1 to No.8. Scale: 0.12 m/pixel]

Parameters of No.5: Latitude S, Longitude S, Altitude D, yaw S, pitch S, roll S

Page 28: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Results: No.6

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

[AR image; charts of pixel error (px) and distance error (m; Max./Min./Mean) for experiments No.1 to No.8. Scale: 0.12 m/pixel]

Parameters of No.6: Latitude S, Longitude S, Altitude S, yaw D, pitch S, roll S

Page 29: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Results: No.7

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

[AR image; charts of pixel error (px) and distance error (m; Max./Min./Mean) for experiments No.1 to No.8. Scale: 0.12 m/pixel]

Parameters of No.7: Latitude S, Longitude S, Altitude S, yaw S, pitch D, roll S

Page 30: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Results: No.8

3.2 Accuracy of geometric consistency with a video image and 3DCG

3. Verification of System

[AR image; charts of pixel error (px) and distance error (m; Max./Min./Mean) for experiments No.1 to No.8. Scale: 0.12 m/pixel]

Parameters of No.8: Latitude S, Longitude S, Altitude S, yaw S, pitch S, roll D

Page 31: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

3.2 Accuracy of geometric consistency with a video image and 3DCG

Results of No.1 (all known static values used)

▶ Pixel error: less than 1.5 pixels

▶ Mean distance error: less than 0.15 m

▶ The accuracy of the AR system was found to be high: an object 200 m away can be evaluated when known static values are used.

Results of No.2 (general SOAR)

▶ Pixel error: less than 20 pixels (horizontal), 55 pixels (vertical)

▶ Mean distance error: less than 2.3 m (horizontal), 6.3 m (vertical)

Results through No.2-8

▶ Although No.2 uses all dynamic inputs, its X residual error is smaller than that of No.6. This results from the X residual error of No.6 (a positive angular deviation) and that of No.4 (a negative deviation) offsetting each other in No.2.

3. Verification of System

[AR images of No.1 and No.2; chart of distance error (Max./Min./Mean) for experiments No.1 to No.8]


Page 33: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

4.1 Conclusion

When known values were used for the position and angle of the CG virtual camera at a distance of 200 m, the pixel error was less than 1.5 pixels and the mean distance error was less than 0.15 m. This is within the tolerance of landscape assessment.

4. Conclusion

When dynamic values acquired from GPS and the gyroscope were used, the pixel error was less than 20 pixels horizontally and 55 pixels vertically, and the mean distance error was less than 2.3 m horizontally and 6.3 m vertically.

The developed SOAR achieves geometric consistency using the GPS and gyroscope with which the smartphone is equipped.

Page 34: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

4.2 Future Work

Future work should attempt to reduce the residual error in the dynamic values acquired from GPS and the gyroscope.

It is also necessary to verify the residual error for objects more than 200 m away, as well as the usability of the system.

4. Conclusion

Page 35: SOAR: SENSOR ORIENTED MOBILE AUGMENTED REALITY FOR URBAN LANDSCAPE ASSESSMENT

Thank you for your attention!

E-mail: [email protected]

Twitter: fukudatweet

Facebook: Tomohiro Fukuda

LinkedIn: Tomohiro Fukuda