Project Leader: Dr. Wei Yan, Mattia Flabiano III AIA/Page Southerland Page Professor, Department of Architecture, Presidential Impact Fellow, Texas A&M University
Dr. Wei Yan, Mattia Flabiano III AIA/Page Southerland Page Professor, Department of Architecture, Presidential Impact Fellow, Texas A&M University
Dr. Dezhen Song, Professor and Associate Department Head, Department of Computer Science and Engineering, firstname.lastname@example.org
Dr. Manish Dixit, Assistant Professor, Department of Construction Science, email@example.com
Dr. Mark Clayton, William M. Peña Professor, Department of Architecture, Director of the CRS Center for Leadership & Management in the Design & Construction Industry,
Dr. Francis Quek, Professor, Department of Visualization, Director of the Texas A&M Institute for Technology-Infused Learning (TITIL), firstname.lastname@example.org
Units Represented: Architecture, Computer Science and Engineering, Construction Science, Visualization
Schools Represented: Architecture and Engineering
Description: Empowered by Artificial Intelligence (AI), Augmented Reality (AR) is expected to significantly enhance humans’ ability to understand their living and working environments, visualize invisible information in those environments, and accomplish complex making and learning tasks. The project team will research and develop AI-powered AR technologies through a case study of LEGO assembly, in which virtual bricks are superimposed on physical LEGO models to guide the assembly process. The team’s prior work is demonstrated with a LEGO Arc de Triomphe built entirely with AR guidance (https://youtu.be/7JDW_lDv7FU).
Major assembly, manufacturing, construction, and maintenance projects are increasingly complex. Automation and robotics are replacing many manual, repetitive, and standardized tasks. Still, more than 95% of jobs consist of activities that require human labor [McKinsey&Company 2017]. Assembly tasks can take advantage of human-machine interfaces (HMIs) that allow human operators to collect data and to monitor, program, and control the system. However, traditional HMIs cannot effectively contextualize and interact with future workflows that now include both physical and digital work information [Immerman 2019]. As a major field within HMI, Human-Computer Interaction (HCI) research has begun to integrate Augmented Reality (AR) into assembly workflows. AR superimposes digital images on the real-world view of human users, putting the answers right where the questions are [Google 2018], and may greatly benefit manufacturing, building construction, and part assembly by human workers. Artificial Intelligence (AI) has the potential to significantly empower AR and advance HCI in assembly tasks. The proposed project aims to research and develop innovative AI-powered AR for advancing HCI in assembly tasks.
The goals for the project include the following:
- Advancing HCI in assembly tasks through research, prototyping, and demonstration of AI-powered Augmented Reality
- Advancing students’ learning of creativity and STEM through the project
The PIs and graduate and undergraduate students from Architecture, Construction Science, and Computer Science will, as the Project Team, research and develop a prototype of AI-powered AR and demonstrate it through a case study of LEGO Architecture assembly that is sufficiently complex and intriguing to provide an interdisciplinary learning experience for the students. The activities are planned based on the main functionalities and challenges of the system, as well as the team’s prior work on AR applied to LEGO construction.
1) AI for AR model registration
For assembly guided by AR instructions, high accuracy (low error in the mean localization) and high precision (low variance in localization) of virtual-real model registration are required to reduce costly errors in construction. A comprehensive review of AR for assembly points out that accuracy and latency are the two critical issues [Wang et al. 2016]. User-based evaluations by Tang et al. (2003) support the proposition that AR systems improve assembly task performance significantly, with the limitations of tracking and calibration techniques being the biggest obstacles. The project team will research, develop, and report findings on the following AR registration methods: edge-based localization as a flexible method for model registration, and 3D point cloud SLAM (Simultaneous Localization and Mapping) for estimating the physical model’s 6-degree-of-freedom (6-DoF) pose. Both methods will be investigated with cutting-edge Deep Learning technology.
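To make the accuracy/precision requirement concrete, registration quality for a candidate 6-DoF pose is commonly measured by reprojection error: model points are projected into the image under the estimated pose and compared with their observed 2D locations. The following is a minimal numpy sketch of that metric, not the project's actual implementation; the function names, the pinhole-camera parameters (`focal`, `center`), and the example values are illustrative assumptions.

```python
import numpy as np

def project_points(points, rotation, translation, focal, center):
    """Project 3D model points into the image with a simple pinhole camera.

    points: (N, 3) model points; rotation: (3, 3); translation: (3,).
    """
    cam = points @ rotation.T + translation          # model frame -> camera frame
    uv = focal * cam[:, :2] / cam[:, 2:3] + center   # perspective projection to pixels
    return uv

def registration_error(model_pts, observed_uv, rotation, translation,
                       focal=800.0, center=np.array([320.0, 240.0])):
    """Mean reprojection error in pixels for a candidate 6-DoF pose.

    The mean corresponds to registration accuracy; tracking the variance of
    this value over frames would correspond to registration precision.
    """
    uv = project_points(model_pts, rotation, translation, focal, center)
    return float(np.mean(np.linalg.norm(uv - observed_uv, axis=1)))
```

A learned registration method (edge-based or SLAM-based) would output the `rotation`/`translation` pair; this metric then quantifies how well the virtual model overlays the physical one.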
2) AI for object recognition and hand detection
To enable efficient assembly part finding, automatic detection of the completion of assembly steps, and detection of assembly errors, Deep Learning CNN (Convolutional Neural Network)-based object detection will be developed. For the case study with LEGO brick assembly, multi-view renderings of digital bricks will be created as training image data. The system will help the user find the correct LEGO bricks for assembly, and will help detect whether an assembly step is complete and whether it contains errors, e.g., missing bricks or bricks in the wrong location or orientation. To enhance users’ hand-eye coordination and provide a realistic, immersive AR experience, the project will enable "grasping virtual objects with real hands" through hand detection and hand-brick occlusion, both implemented with CNNs.
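Once the detector reports the pose of each physical brick, checking an assembly step reduces to comparing detections against the digital model's expected placements within tolerances. The sketch below illustrates that comparison logic only; the function name `check_step`, the brick-pose representation (position plus yaw angle), and the tolerance values are assumptions for illustration, not the project's actual design.

```python
import numpy as np

def check_step(expected, detected, pos_tol=2.0, rot_tol=10.0):
    """Compare detected brick poses against the digital model for one step.

    expected / detected: dicts mapping brick_id -> (position (x, y, z), yaw in degrees).
    pos_tol: max allowed position offset; rot_tol: max allowed yaw offset (degrees).
    Returns (missing brick ids, list of (brick_id, error description)).
    """
    missing, errors = [], []
    for brick_id, (pos, yaw) in expected.items():
        if brick_id not in detected:
            missing.append(brick_id)          # brick not placed yet
            continue
        d_pos, d_yaw = detected[brick_id]
        if np.linalg.norm(np.asarray(d_pos) - np.asarray(pos)) > pos_tol:
            errors.append((brick_id, "wrong location"))
        # wrap yaw difference into [-180, 180) before comparing
        elif abs((d_yaw - yaw + 180) % 360 - 180) > rot_tol:
            errors.append((brick_id, "wrong orientation"))
    return missing, errors
```

In the envisioned pipeline, the CNN detector would supply the `detected` dict per video frame, and an empty result from this check would mark the step as complete.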
In the project, students will learn prototyping as a research method and computer programming for basic AR and Deep Learning methods. They will learn how to synthesize training data, set up Deep Learning model parameters, run the training process, and evaluate the outcomes.
The anticipated outcomes from this project include AR and AI apps on mobile iOS devices, demonstration videos, a project website, and publications about research findings. The project team’s prior AR-based LEGO construction prototype was the basis for two grant applications:
1. National Science Foundation (NSF): Brick by Brick: Augmented Reality-based Making & Gaming for Advancing Informal STEM Learning (Pending)
2. Texas A&M Presidential Transformational Teaching Grants (PTTG): Brick By Brick: Augmented Reality-Based Making & Gaming For Teaching Creativity and STEM (Awarded)
The above two proposals focused on learning and teaching of creativity and STEM, and did not include the comprehensive AI functionalities, which are the focus and the innovative components of this Innovation [X] proposal. Based on the data collection and findings from the Innovation [X] project, the team plans to actively apply for additional NSF and other external grants.
Benefit to Students
A total of 12 students (3 graduate and 9 undergraduate) from Architecture, Construction Science, and Computer Science will, as the major participants of the project, research, develop, and demonstrate a prototype of AI-powered AR. Specifically, students will learn prototyping as a research method, important skills in design and assembly, AR and AI methods, and computer programming. They will learn how to synthesize training data, set up Deep Learning model parameters, run the training process, and evaluate the outcomes. Students will gain experience working on an interdisciplinary project, and graduate students will also gain experience leading group tasks and collaborating across groups.
Students will co-author publications with the faculty leaders, attend conferences to present their research and development, and build networks with other students and researchers at those conferences. Experience at the SXSW Conference, expected through participation in the Innovation Awards – VR, AR & MR programs, will enable students to better understand and learn the broad topics of creativity and STEM applied to the humanities.
Will part of the grant be used to pay undergraduate participants?
Will part of the grant be used to pay graduate student participants?
Will the project require travel?
Additional Information Supplied:
Three (3) undergraduate student workers will be paid with hourly salaries during the research and development phase. Six (6) other undergraduate students will be paid hourly participation fees in the evaluation phase as users. Three (3) graduate student workers will be paid with hourly salaries during the research and development phase.