Leveraging Augmented Reality for Highway Construction

TECHBRIEF

FHWA Publication No.: FHWA-HRT-20-037
FHWA Contact: Hoda Azari, HRDI-20, (202) 493-3064, hoda.azari@dot.gov
Researchers: Kevin Gilson; Jagannath Mallela, Ph.D.; Paul Goodrum, Ph.D.
This document is a technical summary of the Federal Highway Administration report, Leveraging Augmented Reality for Highway Construction (FHWA-HRT-20-038).

INTRODUCTION

Challenges in highway construction management and field operations include the lack of integrated information that can be obtained in real time, gaps between planned solutions and practical implementations, a lack of established quality assurance (QA) practices, and ineffective project communications.

As design and construction workflows based on three-dimensional (3D) models become more common on highway projects, the Federal Highway Administration (FHWA) is promoting these and other innovations through its Every Day Counts initiative and Building Information Modeling (BIM) efforts. This increased use of 3D model–based design and construction workflows as well as the rapid advancement in computer interface design and hardware make augmented reality (AR) a tool for overcoming such challenges.

AR is an immersive technology that combines visual computer-generated information with the real environment in real time. It enhances user perception of reality and enriches information content, helping project managers and engineers deliver projects faster, more safely, and with greater accuracy and efficiency because errors can be caught before construction and design and construction details can potentially be improved. They may also be able to use the tools for training, construction inspection, and stakeholder outreach.

AR enhances a scene while a user maintains a sense of presence in the real world. AR can augment traditional two-dimensional drawings with digital 3D images to help improve construction inspection and review, QA, worker safety, training, and project management. A close companion to AR is virtual reality (VR). VR typically consists of a fully immersive environment in which a person’s senses are under the control of a display system, usually through a headset. Although VR applications are still being developed, their use in the review of design alternatives and stakeholder communication shows strong potential, especially in collaborative environments.

Research Synopsis

The FHWA study “Leveraging Augmented Reality for Highway Construction” documents current AR technologies and applications focused on the state of the practice in highway design, construction, and inspection. The study included a literature review (with a desk scan to document AR use in construction) and interviews with researchers and vendors. Two workshops that involved technology and application developers, State departments of transportation, contractors, consultants, and other practitioners were also held.

The final task of the FHWA study involved the development of five use-case examples of potential highway construction activities that could be enhanced using AR. These use-cases were based on workshop outcomes and were then refined using research results on the AR state of the practice (documented in chapter 4 of the report). Each use-case was outlined in a narrative format with a description of the AR technology and workflow that would be used.

Research Objectives

A key study objective was to identify the availability, accessibility, and reliability of AR for construction inspection and review, QA, training, and project management. Other study objectives included documenting methodologies for managing data flows and integrating AR into highway agency design and planning workflows—in particular, how AR workflows are compatible with BIM workflows.

A final study objective was to identify current AR activities that could illuminate and promote AR in the construction management of highway infrastructure. The study identified emerging AR technologies being used in the field of construction and presented best practices and lessons learned from the use of AR in other industries.

DEFINITION OF AR

Although AR and VR are often discussed in tandem, they are different. A widely accepted definition of AR states that it must combine real and virtual, be interactive in real time, and be registered in 3D space (Azuma 1997). This definition does not require any specific output device, nor does it limit AR to visual media. AR provides an intuitive and direct connection between the physical world and digital information (Azuma 1997). AR is a general term for several display technologies capable of overlaying (or combining) alphanumeric, symbolic, and graphical information with a user’s real-world view. This technology provides aligned, correlated, stabilized, contextual, and intelligent information that augments the user’s understanding of his or her real-world view. By comparison, VR completely replaces the visual world experienced by the user (Azuma 1997).

Another term, mixed reality (MR), is becoming more common in product literature. Milgram and Kishino (1994) developed a continuum to define the relationship between AR, VR, and MR, which is illustrated in figure 1.

With full reality at one end and full VR at the other end, the region in between is MR. AR is a subset of MR and lies near the reality end of the spectrum. A significant portion of the AR experience is based in the real world, augmented by computer-generated contextual data. The term augmented virtuality refers to the inverse, a mostly virtual experience that can be augmented with images or video from the real world. For construction applications, perception of the real world via AR with the associated virtual data and imagery are a powerful combination (Milgram and Kishino 1994).

OVERVIEW OF AR TECHNOLOGIES

General AR system architecture can be described in terms of the type of sensory input device, the tracking device and software, the type of computing device, the form of media representation, and the data input mechanism.

AR Display Technologies

Two basic display types are found in AR systems—handheld devices, such as smartphones or tablets, and head-mounted display (HMD) devices, such as headsets or glasses. The primary differences are how the devices display imagery to users and how they track their position relative to the real world.

Handheld Devices

Smartphone and tablet devices are typically video see-through (VST) displays that use back-facing cameras to capture video of the real-world environment and display that image on the front screen. With these displays, the device needs to be held close to eye level and at arm’s length to capture the widest field of view (FOV), which can be difficult over long periods and challenging in a construction site environment.

Handheld devices typically use satellite positioning (global navigation satellite system [GNSS]) tracking to determine the initial user location within a few meters and then use inertial sensing to calculate position and update the view as the device moves through space. Some vendors have demonstrated prototype applications that optically track the imagery in the video feed to support registration with the real-world view and track movement. Many commercially available AR applications use a marker-based positioning tool in which a target is placed in the real world and viewed by the video feed to register the real-world position relative to the virtual model.

HMDs

HMDs are typically optical see-through (OST) displays in which the view out from the device into the real world is overlaid in front by a computer-generated image. These devices offer more immersive and realistic experiences than handheld devices because the real-world view is direct and the virtual view is typically stereoscopic. The scene changes as the user moves his or her head. Most HMDs use a combination of inertial and visual tracking and are good at tracking the real-world scene once the user’s position is established.

Like handheld devices, HMD devices rely on GNSS localization or markers in the field to determine the user’s initial position. Some AR application tools allow 3D models of elements from the real world to be ported to the device and used to register the device view to the real-world elements. These virtual model elements are displayed over the real-world view and then lined up with their identical counterparts in the real world (Aukstakalnis 2017).
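
For illustration only, the following sketch shows one way such registration could be computed once a user has matched a few model points to their real-world (surveyed) counterparts. The point coordinates and function name are hypothetical, and this is a generic rigid-alignment (Kabsch) approach rather than the method of any particular AR product.

```python
import numpy as np

def register_model_to_world(model_pts, world_pts):
    """Estimate the rigid transform (rotation R, translation t) that best
    aligns model points to their matched real-world points (Kabsch method)."""
    model_pts = np.asarray(model_pts, dtype=float)
    world_pts = np.asarray(world_pts, dtype=float)

    # Center both point sets on their centroids.
    mc, wc = model_pts.mean(axis=0), world_pts.mean(axis=0)
    H = (model_pts - mc).T @ (world_pts - wc)

    # Solve for the rotation via SVD; guard against reflections.
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:
        Vt[-1, :] *= -1
        R = Vt.T @ U.T

    t = wc - R @ mc
    return R, t

# Hypothetical example: three abutment corners picked in the model and in the field.
model = [(0, 0, 0), (10, 0, 0), (10, 5, 0)]
field = [(102.3, 57.1, 12.0), (112.2, 58.5, 12.1), (111.5, 63.4, 12.1)]
R, t = register_model_to_world(model, field)
print(R, t)
```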

Display Characteristics

Ocularity and Stereoscopy
Monocular devices present images to one eye. A monocular HMD can be used for AR, but these types of devices are not as immersive as stereoscopic devices. Google Glass® is an example of a monocular display. Perhaps the best use for monocular devices is text display or graphical annotation directly associated with real-world objects; this type of application could have use in training and inspection scenarios. Binocular displays present separate images to each eye, resulting in a stereoscopic effect and a stronger sense of realism and immersion. The trade-off is that separate images must be rendered for each eye, thus increasing system performance requirements.

Stereoscopy and Perception of Scale
Human vision is binocular and relies on disparities between the images seen by each eye to perceive depth and the true scale of objects. Because AR and VR systems present images to each eye with the same geometry a user’s eyes would see in the real world, the user gets an immersive and realistic view that renders the depth and scale of objects much like a real-world view (Schmalstieg and Hollerer 2016).
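
As a simplified illustration of how a stereoscopic renderer produces this effect, the sketch below offsets a single virtual camera left and right by half an assumed interpupillary distance before projecting a point with a basic pinhole model; the 63-mm spacing and focal length are illustrative values, not specifications of any device.

```python
import numpy as np

IPD = 0.063          # assumed interpupillary distance in meters (~63 mm)
FOCAL = 800.0        # assumed focal length in pixels for a simple pinhole model

def project(point_cam, eye_offset_x):
    """Project a 3D point (camera coordinates, meters) for one eye.
    The eye is shifted sideways by eye_offset_x before projection."""
    x, y, z = point_cam
    x -= eye_offset_x                 # express the point in the eye's frame
    return (FOCAL * x / z, FOCAL * y / z)

point = (0.2, 0.0, 2.0)               # a virtual object 2 m in front of the user
left  = project(point, -IPD / 2)      # left-eye image coordinates
right = project(point, +IPD / 2)      # right-eye image coordinates

disparity = left[0] - right[0]        # horizontal disparity drives depth perception
print(left, right, disparity)         # nearer objects produce larger disparity
```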

Device Tracking—Sensing Movement Within the Real-World Environment

AR devices must track a user’s position and orientation in the real world in real time. This tracking process is a significant challenge in AR design. Sensors in AR systems help track the location and orientation of the user’s device and the location of real objects and markers in the environment via real-world coordinates. Several technologies can track position and orientation, including sensors that monitor GNSS and wireless networks; inertial sensors; optical sensors, including video and infrared sensors; simultaneous localization and mapping (SLAM) processors; and sensors that perform 3D scanning of environments on the fly. For outdoor environments, hybrid systems that combine inertial sensors and optical tracking technology are the most robust and establish the most accurate positioning for AR applications.
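
The following simplified, one-dimensional sketch illustrates why hybrid tracking is robust: inertial dead reckoning provides smooth, high-rate updates but drifts over time, while periodic optical or GNSS fixes pull the estimate back toward the true position. The complementary-filter blend and its gain value are generic illustrations, not the algorithm used by any specific device.

```python
def fuse_position(prev_est, inertial_velocity, dt, optical_fix=None, gain=0.2):
    """One update step of a simple complementary filter (1D for clarity).

    prev_est          -- previous fused position estimate
    inertial_velocity -- velocity from the inertial sensors (drifts over time)
    dt                -- time step in seconds
    optical_fix       -- absolute position from optical/GNSS tracking, if available
    gain              -- how strongly an available fix corrects the estimate
    """
    # Predict: dead-reckon forward with the inertial measurement.
    predicted = prev_est + inertial_velocity * dt

    # Correct: when an optical or GNSS fix arrives, blend it in.
    if optical_fix is not None:
        predicted += gain * (optical_fix - predicted)
    return predicted

# Example: inertial updates every step, an optical fix every fifth step.
est = 0.0
for step in range(1, 11):
    fix = step * 0.10 if step % 5 == 0 else None   # true position grows 0.10 m/step
    est = fuse_position(est, inertial_velocity=0.11, dt=1.0, optical_fix=fix)
    print(step, round(est, 3))                     # drift is periodically corrected
```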

Construction sites require high tracking accuracy, long-range tracking, high bandwidth, and the ability to process large amounts of data. Outdoor challenges include sensitivity to static and dynamic errors and less control over the environment. Construction sites change often, which may limit or even eliminate the ability to place static sensors or markers onsite.

User-Controlled Positioning and Navigation

User navigation and interaction vary widely between AR devices and applications. Handheld devices typically support swiping and touch controls. Many HMD device manufacturers allow users to navigate and control virtual elements with hand and finger motions; devices such as Intel® RealSense™ and Leap Motion® natively support these gestural commands and controls. Most systems support interaction with menus and virtual objects through a technique called “gaze and dwell”: a reticle in the scene shows where the user’s view is focused, and when the view is held on an object or menu for a certain amount of time (the dwell), the object is selected or the menu command is processed.
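
A minimal sketch of this gaze-and-dwell logic is shown below; the 1.5-second dwell threshold and the menu-item name are assumptions made for the example rather than values from any shipping AR platform.

```python
import time

DWELL_SECONDS = 1.5   # assumed time the gaze must rest on a target to select it

class GazeDwellSelector:
    """Track what the reticle is pointing at and fire a selection after a dwell."""

    def __init__(self, dwell_seconds=DWELL_SECONDS):
        self.dwell_seconds = dwell_seconds
        self.current_target = None
        self.gaze_start = None

    def update(self, target_under_reticle, now=None):
        """Call every frame with whatever object the reticle currently covers.
        Returns the selected object once the dwell time is reached, else None."""
        now = time.monotonic() if now is None else now

        if target_under_reticle != self.current_target:
            # Gaze moved to a new target (or to empty space): restart the timer.
            self.current_target = target_under_reticle
            self.gaze_start = now
            return None

        if (self.current_target is not None
                and now - self.gaze_start >= self.dwell_seconds):
            selected = self.current_target
            self.current_target, self.gaze_start = None, None  # reset after firing
            return selected
        return None

# Example: holding the gaze on a virtual menu item long enough selects it.
selector = GazeDwellSelector()
print(selector.update("menu:confirm_pour", now=0.0))   # None (dwell just started)
print(selector.update("menu:confirm_pour", now=1.6))   # "menu:confirm_pour"
```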

The types of controls available in construction-focused AR applications will be a critical aspect of implementation and adoption, affecting user safety and convenience in a complex construction site environment.

CHALLENGES FOR AR SYSTEMS IN CONSTRUCTION SITE ENVIRONMENTS

As noted, construction sites, especially highway construction sites, are particularly challenging for AR systems. Highway elements are typically smooth, flat, or gently curved and contain little fine detail, making it more difficult for 3D modeling systems to represent these objects so that they are easily recognized by the user and the AR system. As a result, these elements will require more forethought and preparation for their use in AR systems.

Several technological challenges apply to all AR applications. Specific challenges related to construction applications are described in the following sections.

Performance and Portability

AR systems require significant processing power to concurrently support tracking processes and the real-time display of the virtual 3D model. To be truly useful in a construction site environment, AR devices must stand alone and be portable, which means processing for tracking and display must be onboard the device. Some systems are tethered to a separate wearable computing device that reduces the necessary weight on an HMD. Larger 3D models, more accurate tracking, and increased display quality require more processing power. As AR systems evolve, there will be a trade-off between system performance and the size, weight, and comfort of the AR device.

Portability of the 3D virtual model assets is also a challenge because 3D project models can be quite large, depending on the level of detail and the area covered. The AR device must include enough onboard storage for the model assets or have access to a remote connectivity solution to allow the model to be streamed.

Display Brightness

A key challenge with HMDs is the brightness of the virtual image that is overlaid onto the real-world view.

In an OST AR device, the quality of the virtual image directly depends on the brightness of the real-world environment. Bright outdoor scenes are the most difficult for the display systems to match. Because typical highway construction sites are outdoors and bright, the quality and usefulness of most current AR HMD devices are limited.

Handheld devices have VST displays, and they control the real-world display on the screen. The video display and virtual displays are better matched in overall brightness, providing higher quality and greater realism in the displayed scene. In the near term, handheld devices likely provide better opportunities for the use of AR at outdoor construction locations.

FOV

A limiting factor of current devices is the available FOV presented to the user. In handheld devices, the display screen size and video feed limit the FOV. If the FOV of the video feed does not closely match the FOV displayed on the screen, there is a disconnect with the real-world view. In addition, the user must pan around with the device to view a large area, which could be prohibitive over long periods.

Currently available HMD devices are also limited in the FOV of the overlaid virtual model view that can be displayed to the user. In most current devices, approximately 60 degrees of virtual view is displayed. To view a large area, users must pan their heads back and forth to fill in the scene, which can be distracting and tiring over a long period.

As the ability of devices to display larger FOVs improves, performance requirements for tracking and virtual model processing by the systems will increase.

Occlusion

AR systems easily display virtual models on top or in front of the real-world image. In complex construction site environments, virtual elements in the model may need to be behind real-world elements. The occlusion, or masking, of the hidden elements presents a complex problem for AR, both in the calculation of the occlusion and in the display. When the occlusion is ignored or poorly displayed, the realism and immersive quality of the displayed scene are significantly affected. Commercially available systems do not currently support occlusion rendering.
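
Conceptually, correct occlusion requires knowing, for each pixel, whether the real-world surface or the virtual surface is closer to the viewer. The per-pixel depth comparison below is a simplified sketch of that idea, using made-up sample data; a production renderer would perform this test on the GPU against a sensed depth map.

```python
import numpy as np

def composite_with_occlusion(virtual_rgb, virtual_depth, real_depth):
    """Show a virtual pixel only where the virtual surface is nearer than
    the sensed real-world surface; elsewhere the real world stays visible.

    virtual_rgb   -- (H, W, 3) rendered colors of the virtual model
    virtual_depth -- (H, W) distance of the virtual surface from the viewer (m)
    real_depth    -- (H, W) sensed distance of the real scene (m), e.g., from SLAM
    """
    visible = virtual_depth < real_depth                     # virtual is in front
    overlay = np.where(visible[..., None], virtual_rgb, 0)   # masked virtual layer
    return overlay, visible

# Tiny 1x3 example: a virtual pipe at 4 m, real surfaces at 3 m, 5 m, and 6 m.
virtual_rgb = np.array([[[255, 0, 0], [255, 0, 0], [255, 0, 0]]])
virtual_depth = np.array([[4.0, 4.0, 4.0]])
real_depth = np.array([[3.0, 5.0, 6.0]])

overlay, visible = composite_with_occlusion(virtual_rgb, virtual_depth, real_depth)
print(visible)   # [[False  True  True]] -- the first pixel is hidden by the real wall
```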

Safety

Safety issues on a construction site are of particular concern. Handheld devices are held in front of the user, require the use of at least one hand, and can block visibility. HMDs can limit the user’s peripheral view and block site sounds. Like any display device, some user attention will be focused on the device and not entirely on the surroundings. It is anticipated that these devices could be designed to recognize safety issues and risks for the user because they would know the user’s precise location on the site.

AR INFORMATION DISPLAY OPTIONS

AR can display several distinct categories of information with unique applications to a user on a construction site.

Display What Is Not Yet Constructed

Incorporating AR into a BIM workflow allows users to see 3D design models in the real-world context, providing them with the following opportunities:

  • Compare design alternatives in context.
  • Check relationships between existing and future elements.
  • Monitor site logistics and equipment movements.
  • Preview complex installation procedures.
  • Illustrate construction methods and sequencing.
  • Participate in onsite training.
  • Check traffic management plans and temporary structures.

Display What Was Intended to Be Constructed

AR can overlay and compare 3D design models (design intent) onto what was constructed in the field. Potential opportunities from this capability include the following:

  • Inspect and validate a site.
  • Monitor code, standards, and compliance.
  • Check quantities and work progress.
  • Provide inspection training opportunities.
  • Check traffic management plans and temporary structures.

Display What Is Hidden from View

AR allows the 3D display of existing elements that are not visible to the user in the real world. The challenge is to represent foreground objects as transparent or cut open so the user can see behind them. This capability would allow the user to visualize existing elements, such as buried utilities or structural components, that are obstructed from the current view.

Display Abstract Information Aligned with Real-World Context

AR can align and display abstract information that typically would only be available in a drawing plan view or virtual 3D model. These elements can be aligned and scaled to match visible real-world elements such as the following:

  • Alignment information, easements, site boundaries, and right-of-way boundaries.
  • Environmental boundaries, such as flood levels or sea-level rise data.
  • Sensitive areas, such as archeological and historic sites.
  • Potential work-zone hazards.
  • Metadata tagged to real-world objects.

AR could also help display other types of remote information, including video feeds from another user or documentation, instructions, or user guides associated with real-world objects or activities. Abstract information for user safety could be displayed through AR—the system could monitor and display unsafe areas and risks or guide users safely through a construction site.

AR-ENABLING SOFTWARE AND APPLICATIONS

Several system-level platforms are designed for the development of AR applications on mobile devices and headsets. Each major operating system and Web browser developer has created a software development kit for AR. Many of these kits have built-in support for real-time game development platforms, such as Unreal 4™ and Unity™. WebAR™ is a JavaScript™ library of browser-capable AR tools that allow the creation of simple AR interactivity in a Web environment.

AR applications for construction will likely leverage the display of annotated and graphical information that enhances the understanding of real-world objects or 3D design and construction models when that information is overlaid accurately in the real-world environment. Many current AR applications are already moving in this direction, and most are based on existing workflows and tools that support BIM applications for design and construction. These computer-aided drafting and design (CADD) and BIM tools are a logical starting point for AR tool development. It is reasonable to assume that many AR applications will eventually become extensions of current BIM and 3D model workflows rather than discrete processes.

Virtually all major vendors of 3D design applications, such as Autodesk, Bentley, and Trimble, have implemented tools and workflows for deploying 3D models to mobile devices in the field, and most support those devices’ built-in georeferencing capabilities.

Most platforms for 3D design applications support the review and collection of data in the field through mobile devices. The data are then synchronized with project models and data in the office. Most of these vendors have also enabled deployment of these 3D models to one or more of the AR HMD devices on the market through these mobile platforms. Although most of the tools offer ways to optimize models and model display on mobile devices with more limited processing power and storage, this optimization will be even more important for AR devices and applications that require enough graphics performance to support real-time stereoscopic rendering of the 3D models. Current limitations of most available AR HMDs include processing power, memory, data storage, and connectivity.

This challenge of bringing large 3D models to the work site is similar to that currently faced by agencies and contractors transitioning to BIM- and 3D model–based workflows. These workflows are now being used more in construction and will certainly be helpful in the adoption of AR tools for construction.

AR-READY DEVICES AND APPLICATIONS

Applications that currently support AR viewing on mobile devices could be used to display 3D design data in a construction site environment. One vendor has demonstrated a solution that includes a 3D modeling application, a platform for model deployment to mobile devices, and a construction site–ready mobile device that can access a model through the platform and register and track the model and viewer onsite in real time. This device is shown in figure 2.

AR-Ready HMD

A few HMD devices have been used in demonstrations and prototype applications specifically for onsite AR display of 3D model data. Two of the more commonly used devices are the Microsoft HoloLens and the DAQRI® Smart Helmet™ and Smart Glasses™. Figure 3 shows a user viewing a virtual model through such a device.

AR-Ready Software Applications

Although no construction management–specific AR applications were found, several applications have recently been released that focus on the architectural design market. For some time, 3D viewing–specific applications intended to deploy 3D design models for viewing on different display devices have been on the market. Many of these vendors are now targeting commercially available AR devices. This trend will likely continue. As these applications and supported devices become more stable and portable, construction site–specific tools will likely emerge.

Mobile Device–Based AR Viewing Applications

Several off-the-shelf AR applications allow the user to display a 3D model over the live video feed of a mobile device. The application is given a target image—either graphical (e.g., a plan or rendering) or a photograph. The image is printed out, and when the application on the mobile device recognizes the target image, the 3D model is displayed in alignment as defined in the application. The model view is locked to the position of the target image in the video feed (figure 4). Many of these applications import standard 3D model formats from the major 3D design vendors.
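
Conceptually, once the tracker reports the target image’s pose each frame, locking the model view is a matter of composing that pose with a fixed model offset defined in the application. The short sketch below illustrates this with hypothetical 4x4 transforms; it is not the internal logic of any particular application.

```python
import numpy as np

def model_pose_in_camera(target_pose_cam, model_offset_from_target):
    """Lock the 3D model to a detected target image.

    target_pose_cam          -- 4x4 pose of the printed target in camera space,
                                as reported by the marker tracker each frame
    model_offset_from_target -- fixed 4x4 transform placing the model relative
                                to the target, defined once in the application
    """
    return target_pose_cam @ model_offset_from_target

# Example: the target is 2 m ahead of the camera; the model sits 0.5 m above it.
target_pose = np.eye(4); target_pose[2, 3] = 2.0
model_offset = np.eye(4); model_offset[1, 3] = 0.5

print(model_pose_in_camera(target_pose, model_offset))
```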

AR STATE OF THE PRACTICE

Few AR implementations specifically for highway construction were found; however, several research studies and product prototype development activities were identified that were closely related to highway construction and inspection. These activities represent workflows and technology applications that would be an important aspect of the development of highway construction–specific applications.

Some key examples of AR applications from this research include the following:

  • Simplified, and sometimes relatively complex, 3D model data were deployed to an AR device for display in the field. This process is currently unique to the device and system platform being used. The 3D model data flows required are similar to existing BIM-to-Field workflows being developed by 3D modeling and CADD software vendors. These workflows have been developed for the most common mobile device platforms. For specific headset devices, the workflow is commonly unique to that device (Jahangiri, Keane, and Tang 2017).
  • Autodesk BIM 360 Docs™ is an established platform for sharing data in the cloud and is extensively used on collaborative design and construction projects. At Autodesk University 2018, Chen et al. discussed the results of a collaboration between DAQRI (a developer of AR headset devices), Autodesk, and McCarthy Building Companies, Inc. (a contractor) that leverages the BIM 360 Docs platform to support an AR application for validating design information in the field. The application was developed using the Autodesk Forge™ system, which is used to create data flows and interfaces within the 360 platforms, and accomplished several things that will be essential to AR workflows. The application used the DAQRI Smart Glasses wearable device (Chen et al. 2018).
  • In an application presented by Los Alamos County, users measured areas of newly installed concrete pads to calculate contractor payment amounts. The users captured points with the AR device using a cursor, which was placed over real-world points and captured using finger gestures. Those points and the surface of the existing condition were captured and scanned by the sensors on the HoloLens. Data were processed later to interpret the 3D locations of the points and the areas of the surfaces enclosed (Moreu et al. 2018).
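
As an illustration of the kind of post-processing described in the last item, the following sketch computes the area enclosed by captured corner points by fitting a plane through them and applying the shoelace formula to the projected polygon. The corner coordinates are invented for the example, and this is a generic approach rather than the workflow used in the cited study.

```python
import numpy as np

def enclosed_area(points_3d):
    """Area of a (roughly planar) polygon defined by 3D points captured in order.

    Fits a best-fit plane through the points, projects them onto it, and applies
    the shoelace formula to the resulting 2D polygon.
    """
    pts = np.asarray(points_3d, dtype=float)
    centered = pts - pts.mean(axis=0)

    # Principal directions of the point cloud; the first two span the plane.
    _, _, vt = np.linalg.svd(centered)
    xy = centered @ vt[:2].T                     # 2D coordinates in the plane

    # Shoelace formula on the projected polygon.
    x, y = xy[:, 0], xy[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

# Example: four corners of a concrete pad captured with the AR device (meters).
corners = [(0.00, 0.00, 0.01), (6.02, 0.03, 0.00),
           (6.00, 3.98, 0.02), (0.01, 4.01, 0.00)]
print(round(enclosed_area(corners), 2), "square meters")   # ~24 m^2
```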

AR IN TRANSPORTATION CONSTRUCTION WORKSHOPS

The AR in Transportation Construction workshops educated participants on the current capabilities of AR technologies and related applications in MR, identified potential new applications for AR in transportation construction, and prioritized future applications of AR in the industry. The workshops offered interactive experiences for the participants to explore the capabilities of a range of AR devices and to engage in meaningful dialogue about future applications.

Workshop participants ranked the use-cases in terms of potential impact on transportation construction and feasibility of development. Final rankings include the following:

  • AR support of Right-of-Way Acquisition: Provide project visualization to property owners to better understand project impact and design options.
  • Visual Variances: Visually tag schedule and quality variances in the field to ensure all parties view the same issues.
  • Inspector’s Toolkit: Use AR to verify proper installation by providing the appropriate information and format for the inspector in the field.
  • NextGen Training and Certification: Use AR-supported training and certification for construction inspection.
  • Automated Inspection: Use AR to support automated code and standard compliance check of installed items through machine learning.

KEY RESEARCH FINDINGS

The FHWA study “Leveraging Augmented Reality for Highway Construction” resulted in some key takeaways regarding AR and its application:

  • AR hardware, applications, and workflows are rapidly changing and improving. During this research, several new AR technologies were released, and existing technologies and devices have advanced and dramatically improved. Conversely, several devices documented early in the study have disappeared—an indication of the industry’s instability and rapidly changing nature.
  • A few devices in the study focused on the architecture, engineering, and construction market, especially for interior architectural design and construction. The HMD devices with success in this market are mobile and offer robust support of commonly used 3D modeling platforms.
  • Most of the current AR devices were reported to have good tracking capabilities once registered in the real-world environment. Most devices and applications achieve good tracking results by using markers (or target images) in known locations. Once localized, devices track user movement and view with image, 3D, and inertial motion tracking.
  • Challenges with AR visualization were found in outdoor and unstructured open-area environments. Tracking sensors require adequate detail, which highway construction sites may lack, to efficiently track the real world.
  • Challenges with HMD devices include visual display performance, durability in a construction site environment, onboard digital storage capacity, and access to onsite wireless communication. Safety is also an ongoing concern.
  • Tablet devices are making more rapid advances in the highway construction field because they have fewer display and viewing limitations. The trade-off is less accurate tracking technology and, thus, less precise registration of virtual to real-world imagery. One vendor focuses on this market by integrating a tablet viewing device with special hardware and software to provide higher precision tracking and better registration of virtual 3D data to the real-world view.
  • A unique application of AR headset devices is the ability to leverage the 3D scanning hardware used for AR tracking to capture existing 3D data in the field. The precision of the data depends on several factors, including available site survey information and the scanning precision of the sensor hardware. On a typical highway construction site with good survey targets and a robust 3D model–based workflow, this capability will become an important aspect of AR implementation, especially for site inspections.
  • Workflows that support AR devices for this industry are similar to 3D model–based workflows and BIM processes. Data flow and 3D model management challenges are similar to current BIM workflow challenges.
