
Showing papers in "Journal of Computing in Civil Engineering in 2015"


Journal ArticleDOI
TL;DR: In this article, a new automated approach is presented for recognition of physical progress based on two emerging sources of information: (1) unordered daily construction photo collections, which are currently collected at almost no cost on all construction sites; and (2) building information models (BIMs), which are increasingly becoming binding components of architecture/engineering/construction contracts.
Abstract: Accurate and efficient tracking, analysis and visualization of as-built (actual) status of buildings under construction are critical components of a successful project monitoring. Such information directly supports control decision-making and if automated, can significantly impact management of a project. This paper presents a new automated approach for recognition of physical progress based on two emerging sources of information: (1) unordered daily construction photo collections, which are currently collected at almost no cost on all construction sites; and (2) building information models (BIMs), which are increasingly turning into binding components of architecture/engineering/construction contracts. First, given a set of unordered and uncalibrated site photographs, an approach based on structure-from-motion, multiview stereo, and voxel coloring and labeling algorithms is presented that calibrates cameras, photorealistically reconstructs a dense as-built point cloud model in four dimensions (th...
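The voxel labeling step can be pictured with a toy occupancy check (illustrative only; the function name, grid size, and threshold are assumptions, and the paper's actual pipeline first calibrates cameras and reconstructs a dense point cloud): expected BIM voxels are marked "built" when enough as-built points fall inside them.

```python
from collections import Counter
import numpy as np

def progress_from_voxels(as_built_pts, expected_voxels, voxel=0.5, min_pts=2):
    """Toy version of the voxel labeling step: an expected BIM voxel counts
    as built if at least min_pts reconstructed points fall inside it."""
    keys = [tuple(k) for k in np.floor(as_built_pts / voxel).astype(int)]
    counts = Counter(keys)
    built = {v for v in expected_voxels if counts[v] >= min_pts}
    return len(built) / len(expected_voxels)

# Expected voxels for a small wall; points reconstructed for only half of it
expected = {(x, 0, z) for x in range(4) for z in range(2)}
rng = np.random.default_rng(7)
pts = rng.uniform([0, 0, 0], [1.0, 0.5, 1.0], (200, 3))  # covers x < 1 m only
print(progress_from_voxels(pts, expected))  # fraction of wall voxels built
```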

283 citations


Journal ArticleDOI
TL;DR: The focus of the study presented in this paper is to investigate how much improvement in assembly productivity and performance can be achieved by lowering cognitive workload via AR.
Abstract: Current practice utilizes two-dimensional (2D) drawings as the main visualization means to guide assembly. As an emerging technology, augmented reality (AR) integrates three-dimensional (3D) images of virtual objects into a real-world workspace. The insertion of digitalized information into the real-world workspace using AR can provide workers with the means to implement correct assembly procedures with improved accuracy and reduced errors. The limited available research concerning the applications of AR visualization means in assembly highlights the need for a structured methodology of addressing cognitive and usability issues for the application potentials of AR technology to be fully realized. Thus, the focus of the study presented in this paper is to investigate how much improvement in assembly productivity and performance can be achieved by lowering cognitive workload via AR. The AR system was developed in collaboration with Woodside Energy Ltd. following research project ECHO. Evaluation of ...

122 citations


Journal ArticleDOI
Ying Wang1, Hong Hao
TL;DR: Both numerical and experimental verification results confirm that the proposed CS-based damage identification scheme is a promising tool for structural health monitoring; the work is also one of the first few applications of this advanced signal processing technique to structural engineering.
Abstract: Civil infrastructures are critical to every nation, due to their substantial investment, long service period, and enormous negative impacts after failure. However, they inevitably deteriorate during their service lives. Therefore, methods capable of assessing conditions and identifying damage in a structure in a timely and accurate manner have drawn increasing attention. Recently, compressive sensing (CS), a significant breakthrough in signal processing, has been proposed to capture and represent compressible signals at a rate significantly below the traditional Nyquist rate. Due to its sound theoretical background and notable influence, this methodology has been successfully applied in many research areas. In order to explore its application in structural damage identification, a new CS-based damage identification scheme is proposed in this paper, by regarding damage identification problems as pattern classification problems. The time domain structural responses are transferred to the frequency domain as sparse representation, and then the numerically simulated data under various damage scenarios are used to train a feature matrix as input information. This matrix can be used for damage identification through an optimization process. This will be one of the first few applications of this advanced technique to structural engineering areas. In order to demonstrate its effectiveness, numerical simulation results on a complex pipe–soil interaction model are used to train the parameters and then to identify the simulated pipe degradation damage and free-spanning damage. To further demonstrate the method, vibration tests of a steel pipe laid on the ground are carried out. The measured acceleration time histories are used for damage identification. Both numerical and experimental verification results confirm that the proposed damage identification scheme will be a promising tool for structural health monitoring.
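The classification idea behind the scheme can be sketched as follows (a simplified stand-in, not the authors' implementation): frequency-domain training responses form a feature matrix, and a test response is assigned to the damage class whose training columns reconstruct it with the smallest residual.

```python
import numpy as np

def classify_by_residual(D, labels, y):
    """Assign y to the damage class whose training columns in the feature
    matrix D reconstruct it with the smallest least-squares residual.
    D: (n_features, n_train), labels: length n_train, y: (n_features,)."""
    best_label, best_res = None, np.inf
    for lab in set(labels):
        cols = D[:, [i for i, l in enumerate(labels) if l == lab]]
        coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
        res = np.linalg.norm(y - cols @ coef)
        if res < best_res:
            best_label, best_res = lab, res
    return best_label

# Toy frequency-domain features for two damage scenarios
rng = np.random.default_rng(0)
intact = rng.normal(0.0, 0.1, (8, 5)) + np.array([[1, 0, 0, 0, 0, 0, 0, 0]]).T
damaged = rng.normal(0.0, 0.1, (8, 5)) + np.array([[0, 0, 0, 1, 0, 0, 0, 0]]).T
D = np.hstack([intact, damaged])
labels = ["intact"] * 5 + ["damaged"] * 5
probe = np.array([0, 0, 0, 1.0, 0, 0, 0, 0]) + rng.normal(0, 0.05, 8)
print(classify_by_residual(D, labels, probe))
```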

106 citations


Journal ArticleDOI
TL;DR: The results indicated that NF-GMDH models could provide more accurate predictions than those obtained using model tree and traditional equations.
Abstract: In this paper, the neuro-fuzzy based group method of data handling (NF-GMDH) as an adaptive learning network was used to predict the scour process at pile groups due to waves. The NF-GMDH network was developed using the particle swarm optimization (PSO) algorithm and gravitational search algorithm (GSA). Effective parameters on the scour depth include sediment size, geometric property, pile spacing, arrangement of pile group, and wave characteristics upstream of group piles. Seven dimensionless parameters were obtained to define a functional relationship between input and output variables. Published data were compiled from the literature for the scour depth modeling due to waves. The efficiency of training stages for both NF-GMDH-PSO and NF-GMDH-GSA models were investigated. The results indicated that NF-GMDH models could provide more accurate predictions than those obtained using model tree and traditional equations.
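The PSO algorithm used to tune the network can be illustrated with a minimal generic loop (parameter values and the toy objective are assumptions, not the paper's setup):

```python
import numpy as np

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimization: each particle is pulled toward
    its personal best and the swarm's global best position."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, f(g)

# Toy objective: minimum at (3, 3)
best, val = pso(lambda p: float(np.sum((p - 3.0) ** 2)), dim=2)
print(best, val)
```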

99 citations


Journal ArticleDOI
TL;DR: The proposed information transformation (ITr) methodology utilizes a rule-based, semantic natural language processing (NLP) approach; a set of semantic mapping rules and conflict resolution (CoR) rules is used to enable the automation of the transformation process.
Abstract: To fully automate regulatory compliance checking of construction projects, regulatory requirements need to be automatically extracted from various construction regulatory documents and then transformed into a formalized format that enables automated reasoning. To address this need, the authors propose an approach for automatically extracting information from construction regulatory textual documents and transforming them into logic clauses that could be directly used for automated reasoning. This paper focuses on presenting the proposed information transformation (ITr) methodology and the corresponding algorithms. The proposed ITr methodology utilizes a rule-based, semantic natural language processing (NLP) approach. A set of semantic mapping (SeM) rules and conflict resolution (CoR) rules are used to enable the automation of the transformation process. Several syntactic text features (captured using NLP techniques) and semantic text features (captured using an ontology) are used in the SeM and Co...
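A single semantic-mapping rule of the kind described can be sketched with a regular expression that turns a requirement sentence into a Prolog-style logic clause (the pattern and predicate names here are illustrative, not the paper's actual rule set):

```python
import re

# One illustrative semantic-mapping rule (not the paper's actual rules):
# "<subject> shall be at least <number> <unit>" -> compliant(S) :- value(S, V), V >= N.
RULE = re.compile(
    r"(?P<subject>[\w\s]+?) shall (?:be|have a .*? of) at least "
    r"(?P<value>[\d.]+)\s*(?P<unit>\w+)", re.IGNORECASE)

def to_logic_clause(sentence):
    """Map a quantitative requirement sentence to a logic clause."""
    m = RULE.search(sentence)
    if not m:
        return None  # fallback / conflict-resolution rules would apply here
    subj = m.group("subject").strip().lower().replace(" ", "_")
    return f"compliant({subj}) :- value({subj}, V), V >= {m.group('value')}."

clause = to_logic_clause("The corridor width shall be at least 1.2 m")
print(clause)
```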

90 citations


Journal ArticleDOI
TL;DR: This study presents a new technique that can simultaneously localize and quantify spalling defects on concrete surfaces using a terrestrial laser scanner and develops a defect classifier to automatically diagnose whether the investigated surface region is damaged, where the defect is located, and how large it is.
Abstract: During construction and maintenance of concrete structures, it is important to achieve and preserve good surface quality of their components. The current quality assessment for concrete surfaces, however, heavily relies on manual inspection, which is time demanding and costly. This study presents a new technique that can simultaneously localize and quantify spalling defects on concrete surfaces using a terrestrial laser scanner. Defect-sensitive features, which have complementary properties to each other, are developed and combined for improved localization and quantification of spalling defects. A defect classifier is developed to automatically diagnose whether the investigated surface region is damaged, where the defect is located, and how large it is. Numerical simulations and experiments are conducted to demonstrate the effectiveness of the proposed defect-detection technique. Furthermore, a parametric study with varying scan parameters is performed for optimal detection performance. The resul...
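One plausible defect-sensitive feature is the point-to-surface distance after fitting a reference plane; the sketch below (with illustrative grid size and threshold, not the authors' classifier) flags grid cells whose mean deviation exceeds a tolerance:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a point cloud: centroid plus unit normal
    (direction of smallest variance from the SVD)."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    return c, vt[-1]

def spalling_mask(points, grid=0.1, thresh=0.01):
    """Rasterize XY into cells and flag cells whose mean |distance to the
    fitted plane| exceeds thresh (a stand-in for the paper's classifier)."""
    c, n = fit_plane(points)
    d = np.abs((points - c) @ n)
    keys = np.floor(points[:, :2] / grid).astype(int)
    flagged = set()
    for key in {tuple(k) for k in keys}:
        sel = (keys == key).all(axis=1)
        if d[sel].mean() > thresh:
            flagged.add(key)
    return flagged

# Synthetic scan: flat 1 m x 1 m surface with a 2-cm-deep spall near (0.75, 0.75)
rng = np.random.default_rng(2)
xy = rng.uniform(0, 1, (4000, 2))
z = rng.normal(0, 0.001, 4000)
in_spall = (np.abs(xy[:, 0] - 0.75) < 0.05) & (np.abs(xy[:, 1] - 0.75) < 0.05)
z[in_spall] -= 0.02
pts = np.column_stack([xy, z])
print(sorted(spalling_mask(pts)))  # the one damaged cell
```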

81 citations


Journal ArticleDOI
TL;DR: In this paper, the feasibility of measuring the operational efficiency of equipment using low-cost accelerometers was examined, and several classifiers using these features were tested to classify equipment operation into engine-off, idling, and working modes.
Abstract: Monitoring the operational efficiency of construction equipment offers great opportunities to enhance not only the productivity but also the environmental performance of construction operations. However, existing enabling technologies still suffer from a lack of economic feasibility, as well as technological compatibility with equipment fleets that are outdated or that consist of diverse manufacturers’ models. In this context, this paper examines the feasibility of measuring the operational efficiency of equipment using low-cost accelerometers. Acceleration data in three axes were collected from a real-world operation of excavators that performed various duty cycles. Multiple features were calculated from acceleration data, and several classifiers using these features were tested to classify equipment operation into engine-off, idling, and working modes. An accuracy of over 93% was obtained in the classification of excavators’ operation. This result has demonstrated that the application of low-cos...
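The feature-and-classifier pipeline can be sketched with simple magnitude statistics and a nearest-centroid rule (the features, synthetic data, and classifier here are assumptions; the paper evaluates several classifiers):

```python
import numpy as np

def features(window):
    """Simple per-window features from a 3-axis acceleration window (n, 3):
    mean magnitude, magnitude spread, and mean frame-to-frame change."""
    mag = np.linalg.norm(window, axis=1)
    return np.array([mag.mean(), mag.std(), np.abs(np.diff(mag)).mean()])

def nearest_centroid(train, test_feat):
    """Classify by distance to each mode's mean feature vector."""
    cents = {m: np.mean([features(w) for w in ws], axis=0)
             for m, ws in train.items()}
    return min(cents, key=lambda m: np.linalg.norm(cents[m] - test_feat))

rng = np.random.default_rng(3)
def synth(level, n=50):
    """Synthetic windows: gravity plus vibration of a given intensity."""
    return [np.array([0, 0, 9.81]) + rng.normal(0, level, (100, 3))
            for _ in range(n)]

train = {"engine-off": synth(0.01), "idling": synth(0.3), "working": synth(1.5)}
probe = np.array([0, 0, 9.81]) + rng.normal(0, 1.4, (100, 3))
print(nearest_centroid(train, features(probe)))
```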

71 citations


Journal ArticleDOI
TL;DR: The study focuses on automated data processing that converts BVH motion data obtained from vision-based approaches into data usable by existing biomechanical analysis tools.
Abstract: Work-related musculoskeletal disorders (WMSDs) are one of the major health issues that workers frequently experience due to awkward postures or forceful exertions during construction tasks. Among available job analysis methods, biomechanical models have been widely applied to assess musculoskeletal risks that may contribute to the development of WMSDs based on motion data during occupational tasks. Recently, with the advent of vision-based motion capture approaches, it has become possible to collect the motion data required for biomechanical analysis under real conditions. However, vision-based motion capture approaches have not been applied to biomechanical analysis because of compatibility issues in body models of the motion data and computerized biomechanical analysis tools. To address this issue, automated data processing is focused on to convert motion data into available data in existing biomechanical analysis tools, given the BVH motion data from vision-based approaches. To examine the feas...

71 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a method to automatically extract 3D points corresponding to as-built pipelines that occupy large areas of industrial plants from laser-scanned data, which consists of the following steps: preprocessing, segmentation of the 3D point cloud, feature extraction based on curvature computation, and pipeline classification.
Abstract: There has been a growing demand for the three-dimensional (3D) reconstruction of as-built pipelines. The as-built 3D pipeline reconstruction process consists of the measurement of an industrial plant, identification of pipelines, and generation of 3D models of the pipelines. Although measurement is now efficiently performed using laser-scanning technology, and in spite of significant progress in 3D pipeline model generation, the identification of pipelines from large and complex sets of laser-scanned data continues to pose a challenge. The aim of this study is to propose a method to automatically extract 3D points corresponding to as-built pipelines that occupy large areas of industrial plants from laser-scanned data. The proposed extraction method consists of the following steps: preprocessing, segmentation of the 3D point cloud, feature extraction based on curvature computation, and pipeline classification. An experiment was performed at an operating industrial plant to validate the proposed method. The experimental result revealed that the proposed method can indeed contribute to the automation of as-built 3D pipeline reconstruction.
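The curvature-computation step can be approximated with a standard surface-variation measure, the smallest-eigenvalue fraction of the local covariance (an assumption about the feature, not the paper's exact formulation); it separates planar patches from curved pipe surfaces:

```python
import numpy as np

def surface_variation(points, idx, k=30):
    """Curvature proxy: smallest-eigenvalue fraction of the covariance of
    the k nearest neighbors (about 0 on a plane, larger on curved surfaces)."""
    d = np.linalg.norm(points - points[idx], axis=1)
    nbrs = points[np.argsort(d)[:k]]
    cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
    ev = np.sort(np.linalg.eigvalsh(cov))
    return ev[0] / ev.sum()

rng = np.random.default_rng(4)
# Planar patch
plane = np.column_stack([rng.uniform(0, 1, (500, 2)), np.zeros(500)])
# Pipe: cylinder of radius 0.1 m, length 1 m
t, z = rng.uniform(0, 2 * np.pi, 500), rng.uniform(0, 1, 500)
pipe = np.column_stack([0.1 * np.cos(t), 0.1 * np.sin(t), z])
print(surface_variation(plane, 0), surface_variation(pipe, 0))
```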

68 citations


Journal ArticleDOI
TL;DR: This paper proposes a new algorithm for semantic segmentation and recognition of highway assets using video frames collected from a car-mounted camera and a Semantic Texton Forest classifier.
Abstract: Efficient data collection of high-quantity and low-cost highway assets such as road signs, traffic signals, light poles, and guardrails is a critical element to the operation, maintenance, and preservation of transportation infrastructure systems. Despite its importance, current practice of highway asset data collection is time-consuming, subjective, and potentially unsafe. The high volume of the data that needs to be collected can also negatively impact the quality of the analysis. To address these limitations, this paper proposes a new algorithm for semantic segmentation and recognition of highway assets using video frames collected from a car-mounted camera. The proposed set of algorithms (1) takes the captured frames and using a pipeline of structure from motion and multiview stereo reconstructs a three-dimensional (3D) point cloud model of the highway and surrounding assets; (2) using a Semantic Texton Forest classifier, each geo-registered two-dimensional (2D) video frame at the pixel-level ...

61 citations


Journal ArticleDOI
TL;DR: A space discretization–based simulation framework is proposed to address critical issues of heterogeneous traffic, including discrete lane changes in the case of lane-based traffic and modeling of continuous lateral movements.
Abstract: Vehicles in homogeneous traffic follow lane-based movement and can be conveniently modeled using car-following and lane-changing models. The former deals with longitudinal movement behavior, while the latter deals with lateral movement behavior. However, typical heterogeneous traffic is characterized by the presence of multiple vehicle types and non-lane-based movement. Because of the off-centered positions of the vehicles, the following driver is not necessarily influenced by a single leader. Additionally, the following behavior of the subject vehicle depends on the type of the front vehicle. Unlike discrete lane changes in the case of lane-based traffic, heterogeneous traffic streams require modeling of continuous lateral movements. Hence, the existing driver behavioral models may not be able to represent the heterogeneous traffic behavior accurately enough. To address these critical issues of heterogeneous traffic, a space discretization–based simulation framework is proposed. The lane is divid...
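Space discretization of the lane can be sketched as follows (strip width and vehicle dimensions are illustrative): each vehicle occupies a set of lateral strips, so continuous lateral movement becomes a strip-by-strip shift.

```python
def occupied_strips(center, veh_width, lane_width=3.5, strip=0.5):
    """Lateral strip indices covered by a vehicle whose center sits `center`
    metres from the road edge. Narrow vehicles (e.g., motorcycles) occupy
    few strips; lateral movement becomes a discrete strip-by-strip shift."""
    n_strips = int(lane_width / strip)
    lo = max(0, int((center - veh_width / 2) / strip))
    hi = min(n_strips - 1, int((center + veh_width / 2 - 1e-9) / strip))
    return list(range(lo, hi + 1))

print(occupied_strips(1.0, 0.6))   # motorcycle straddling two strips
print(occupied_strips(1.75, 1.8))  # car covering five strips
```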

Journal ArticleDOI
TL;DR: In this paper, the authors developed a systematic methodology and computer system for an optimal construction schedule simulation that minimizes overlapping activities for the enhancement of a project's operational performance, based on identifying overlapping activities, applying fuzzy theory, and analyzing risk levels for schedule overlap issues.
Abstract: As building information modeling (BIM) systems continue to be widely adopted, there is an increasing demand for an active construction schedule management system with more advanced decision-making capabilities. For example, if overlapping between construction activities is significant, the performance of construction operations for the corresponding activities may deteriorate. Thus, a viable construction schedule should be formulated in order to minimize overlapping of proximate construction activities. An active system can be regarded as one solution to this issue. The purpose of this study is to develop a systematic methodology and computer system for an optimal construction schedule simulation that minimizes overlapping activities for the enhancement of a project’s operational performance. This study centers on identifying overlapping activities, applying fuzzy theory, and analyzing risk levels for schedule overlap issues. In addition, genetic algorithm (GA) theory is adopted for the mini...
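The quantity a GA would minimize can be sketched as the total pairwise overlap between activities (a bare-bones fitness; the paper additionally applies fuzzy theory and risk levels):

```python
def total_overlap(schedule):
    """Sum of pairwise overlap durations between activities.
    schedule: dict name -> (start_day, end_day). A GA searching over start
    days would use this (plus risk weights) as a fitness to minimize."""
    acts = list(schedule.items())
    overlap = 0
    for i in range(len(acts)):
        for j in range(i + 1, len(acts)):
            (_, (s1, e1)), (_, (s2, e2)) = acts[i], acts[j]
            overlap += max(0, min(e1, e2) - max(s1, s2))
    return overlap

plan_a = {"formwork": (0, 10), "rebar": (8, 18), "pour": (16, 20)}
plan_b = {"formwork": (0, 10), "rebar": (10, 18), "pour": (18, 22)}
print(total_overlap(plan_a), total_overlap(plan_b))  # plan_b removes overlap
```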

Journal ArticleDOI
TL;DR: This paper describes how finite-element software can be extended to include the modeling of structures under fire load, an approach that has the added advantage of being cheap and easily accessible.
Abstract: Computational modeling of structures subjected to extreme static and dynamic loads (such as snow, wind, impact, and earthquake) using finite-element software is part of mainstream structural engineering curricula in universities (at least at graduate level), and many experts can be found in industry who routinely undertake such analyses. However, only a handful of institutions around the world teach structural response to fire (at any level) and only a few of the top consulting engineers in the world truly specialize in this niche area. Among the reasons for this are the lack of cheap and easily accessible software to carry out such analyses and the highly tedious nature of modeling the full (often coupled) sequence of a realistic fire scenario, heat transfer to the structure, and structural response (currently impossible using a single software package). The authors in this paper describe how finite-element software can be extended to include the modeling of structures under fire load. The added advantage of...

Journal ArticleDOI
TL;DR: Existing ways of determining the global scale of built infrastructure point cloud data (PCD) are not simple, cost effective, or general enough to be considered practical for reconstructing both indoor and outdoor scenes; this paper proposes a novel method for automatically calculating the absolute scale of such PCD.
Abstract: The global scale of point cloud data (PCD) generated through monocular photography and videogrammetry is unknown and can be calculated using at least one known dimension of the scene. Measuring one or more dimensions for this purpose induces a manual step in the three-dimensional reconstruction process; this increases the effort and reduces the speed of reconstructing scenes, and induces substantial human error in the process due to the high level of measurement accuracy needed. Other ways of measuring such dimensions are based on acquiring additional information by either using extra sensors or specific classes of objects existing in the scene; it was found that these solutions are not simple, cost effective, or general enough to be considered practical for reconstructing both indoor and outdoor built infrastructure scenes. To address the issue, this paper proposes a novel method for automatically calculating the absolute scale of built infrastructure PCD. A premeasured cube for outdoor scenes an...
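Once one real-world dimension is known (e.g., an edge of the premeasured cube), applying the absolute scale is a one-line computation; the sketch below uses invented coordinates:

```python
import numpy as np

def apply_absolute_scale(pcd, p1, p2, true_length):
    """Scale a unitless point cloud so the reconstructed distance between
    two premeasured reference points (e.g., cube edge endpoints) matches
    its known real-world length."""
    s = true_length / np.linalg.norm(p2 - p1)
    return s * pcd, s

# Toy reconstruction where a 0.5 m cube edge came out 2.0 units long
cloud = np.array([[0.0, 0, 0], [2.0, 0, 0], [4.0, 4.0, 0]])
scaled, s = apply_absolute_scale(cloud, cloud[0], cloud[1], true_length=0.5)
print(s, scaled[2])
```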

Journal ArticleDOI
TL;DR: The degree to which the technique of kriging can be useful in forecasting AADT depends highly on an understanding of the decision-making variables, the relationship between the variables, and the practical limitations of the various kriging techniques and variogram models.
Abstract: Transportation planning requires the use of accurate traffic data to produce estimates of traffic volume predictions over time and space. The annual average daily traffic (AADT) data is an important component of transportation design, operation, policy analysis, and planning. The use of traffic volume forecasting models for the characterization, analysis, and estimation of transportation data has proven to be a useful method for reducing high costs, overcoming spatial constraints, and limiting the errors associated with data collection and analysis in transportation planning. The geostatistical kriging technique is a viable method for modeling and forecasting AADT. The degree to which the technique of kriging can be useful in forecasting AADT depends highly on an understanding of the decision-making variables, the relationship between the variables, and the practical limitations of the various kriging techniques and variogram models. This paper applied three different linear kriging techniques [si...
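Ordinary kriging, one of the linear techniques mentioned, can be sketched as solving a small linear system for unbiased weights under an assumed variogram (the spherical model and its parameters below are illustrative):

```python
import numpy as np

def ordinary_kriging(xy, values, target, sill=1.0, rng_param=10.0):
    """Ordinary kriging with a spherical variogram: solve for weights that
    sum to 1 (unbiasedness) and minimize the estimation variance."""
    def gamma(h):
        h = np.minimum(h, rng_param)
        return sill * (1.5 * h / rng_param - 0.5 * (h / rng_param) ** 3)
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0                       # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - target, axis=1))
    w = np.linalg.solve(A, b)[:n]
    return w @ values

# Toy AADT counts at four stations, estimated at the midpoint
xy = np.array([[0.0, 0], [4.0, 0], [0.0, 4], [4.0, 4]])
aadt = np.array([1000.0, 1400, 1200, 1600])
est = ordinary_kriging(xy, aadt, np.array([2.0, 2.0]))
print(est)  # symmetric layout -> equal weights -> the mean
```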

Journal ArticleDOI
TL;DR: A new schema is described, termed the graph data model (GDM), which can be used to employ semantic information, to extract, analyze, and present the topological relationships among 3D objects in 3D space, and to perform topological queries faster.
Abstract: The adoption of building information modeling (BIM) in construction has led to greater integration of architecture, engineering, construction/facility management (AEC/FM) stakeholders at the project design stage; the result being the incorporation of new complex tasks into construction applications. However, conventional two-dimensional (2D) and nonsemantic three-dimensional (3D) models cannot handle the topological analysis of 3D objects that is required by BIM, especially with regard to building elements. This article describes a new schema, termed the graph data model (GDM) that can be used to employ semantic information, to extract, analyze, and present the topological relationships among 3D objects in 3D space, and to perform topological queries faster. This GDM uses weighted graph principles for simplicity and incorporates an industry foundation classes (IFC)-based algorithm for automatic deduction of topological relationships. A prototype of GDM is implemented in a C# platform and verified ...
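The core idea of storing topological relationships in a queryable graph can be sketched minimally (illustrative; the paper's GDM is weighted and deduces relationships automatically from IFC data):

```python
class GDM:
    """Minimal graph of topological relationships between building elements."""
    def __init__(self):
        self.adj = {}

    def relate(self, a, b, relation):
        """Store a relationship (e.g., 'touches'); kept symmetric here for
        simplicity, unlike directed relations such as 'contains' in practice."""
        self.adj.setdefault(a, {})[b] = relation
        self.adj.setdefault(b, {})[a] = relation

    def query(self, a, relation):
        """All elements holding the given relationship with element a."""
        return sorted(e for e, r in self.adj.get(a, {}).items() if r == relation)

g = GDM()
g.relate("wall_1", "slab_1", "touches")
g.relate("wall_1", "door_1", "contains")
g.relate("wall_2", "slab_1", "touches")
print(g.query("slab_1", "touches"))
```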

Journal ArticleDOI
TL;DR: In this article, the conditional probabilities for mode shift were used to model the O&M process of wind farms. But the model did not properly include the knowledge available and that may result in non-optimal strategies for management of the farm.
Abstract: Wind energy is a key renewable source, yet wind farms have relatively high cost compared with many traditional energy sources. Among the life cycle costs of wind farms, operation and maintenance (O&M) accounts for 25–30%, and an efficient strategy for management of turbines can significantly reduce the O&M cost. Wind turbines are subject to fatigue-induced degradation and need periodic inspections and repairs, which are usually performed through semiannual scheduled maintenance. However, better maintenance can be achieved by flexible policies based on prior knowledge of the degradation process and on data collected in the field by sensors and visual inspections. Traditional methods to model the O&M process, such as Markov decision processes (MDPs) and partially observable MDPs (POMDPs), have limitations that do not allow the model to properly include the knowledge available and that may result in nonoptimal strategies for management of the farm. Specifically, the conditional probabilities for mode...
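As background, the MDP formulation the paper improves upon can be illustrated with value iteration on a toy degradation model (the states, transition probabilities, and costs below are invented):

```python
import numpy as np

# Toy turbine-degradation MDP: states 0 (good) .. 3 (failed).
P = np.array([
    [[0.9, 0.1, 0.0, 0.0],   # action 0: wait (stochastic degradation)
     [0.0, 0.8, 0.2, 0.0],
     [0.0, 0.0, 0.7, 0.3],
     [0.0, 0.0, 0.0, 1.0]],
    [[1.0, 0.0, 0.0, 0.0],   # action 1: repair (back to good)
     [1.0, 0.0, 0.0, 0.0],
     [1.0, 0.0, 0.0, 0.0],
     [1.0, 0.0, 0.0, 0.0]],
])
cost = np.array([[0.0, 0.5, 4.0, 20.0],   # wait: O&M / production loss
                 [6.0, 6.0, 6.0, 15.0]])  # repair cost (failed: replacement)

def value_iteration(P, cost, gamma=0.9, tol=1e-8):
    """Classic value iteration minimizing discounted expected cost."""
    V = np.zeros(P.shape[1])
    while True:
        Q = cost + gamma * P @ V        # Q[a, s]
        V_new = Q.min(axis=0)
        if np.abs(V_new - V).max() < tol:
            return V_new, Q.argmin(axis=0)
        V = V_new

V, policy = value_iteration(P, cost)
print(policy)  # condition-based threshold: repair once degradation is severe
```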

Journal ArticleDOI
TL;DR: This research proposed a visualized environment to facilitate the discussion process of construction projects that includes a stationary display called BIM Table for displaying public information and for collaboration among disciplines, and multiple mobile devices for showing private information.
Abstract: Discussion is critical in identifying, predicting, and resolving potential problems in the field of construction. This process relies heavily on oral communication with the assistance of construction drawings, schedules, and other related documents. Because most construction projects include multiple working phases and involve multiple parties, it is difficult for participants to clearly grasp the whole picture of a construction site and to make accurate predictions about future activities. In this research, the authors proposed a visualized environment to facilitate the discussion process. It includes a stationary display called BIM Table for displaying public information and for collaboration among disciplines, and multiple mobile devices for showing private information. The authors employed augmented reality technologies to connect the BIM Table and the mobile devices as well as the public and private information. The authors named this discussion environment augmented reality and multiscreen (...

Journal ArticleDOI
Sio-Song Ieng1
TL;DR: In this paper, the authors presented an algorithm that estimates the influence line (IL) of a bridge using data collected when trucks pass over the sensors installed in the bridge and tested with data collected from the Millau Viaduct in France using a bridge weigh-in-motion (B-WIM) device.
Abstract: This paper presents an algorithm that estimates the influence line (IL) of a bridge using data collected when trucks pass over the sensors installed in the bridge. The algorithm is tested with data collected from the Millau Viaduct in France using a bridge weigh-in-motion (B-WIM) device. The algorithm uses maximum likelihood estimation (MLE) and is compared with an earlier algorithm; the new algorithm is more robust because it takes many signals into account when estimating the IL.
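The estimation can be posed as a linear least-squares problem, which coincides with MLE under Gaussian noise: each measured signal is a superposition of the IL shifted by known axle offsets and scaled by known axle weights. A toy reconstruction (synthetic data, not the Millau measurements):

```python
import numpy as np

def estimate_il(signals, axle_info, n):
    """Least-squares influence line from several truck passes.
    Each sample y[k] = sum_a W_a * IL[k - offset_a]; stacking all passes
    gives a linear system in the n IL ordinates (Gaussian-noise MLE)."""
    rows, rhs = [], []
    for y, axles in zip(signals, axle_info):
        for k, yk in enumerate(y):
            row = np.zeros(n)
            for weight, off in axles:
                if 0 <= k - off < n:
                    row[k - off] += weight
            rows.append(row)
            rhs.append(yk)
    il, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return il

# Synthetic: triangular true IL, two trucks with known axle weights/offsets
true_il = np.array([0.0, 0.5, 1.0, 0.5, 0.0])
def simulate(axles, length=12):
    y = np.zeros(length)
    for w, off in axles:
        for i, v in enumerate(true_il):
            y[off + i] += w * v
    return y

trucks = [[(10.0, 0), (8.0, 3)], [(12.0, 1), (6.0, 5)]]
signals = [simulate(a) for a in trucks]
est = estimate_il(signals, trucks, n=5)
print(np.round(est, 3))  # recovers the true IL from noiseless data
```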

Journal ArticleDOI
TL;DR: In this article, a shuffled frog-leaping model is proposed to solve complex time-cost-resource optimization problems in construction project planning, which considers the simultaneous optimization of three important objective functions in project planning.
Abstract: Time-cost-resources usage variation trade-off analysis is one of the most challenging tasks of construction project planners. Project planners face complicated multivariate, time-cost-resource optimization (TCRO) problems, which require simultaneous minimization of total project duration and total project cost, while considering issues related to optimal resource allocation and leveling. The authors present a shuffled frog-leaping model to solve complex TCRO problems in construction project planning. The proposed model is different from existing optimization models in construction project planning. The proposed shuffled frog-leaping model considers the simultaneous optimization of three important objective functions in project planning: (1) minimizing total project duration, (2) total project cost, and (3) total variation of resource allocation. This model finds optimal Pareto fronts of project planning solutions in the three-dimensional space of total project duration, total project cost, and tot...
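The leap-and-shuffle mechanics of the shuffled frog-leaping algorithm can be sketched on a single toy objective (the paper's model is multi-objective and finds Pareto fronts; this shows only the population dynamics, with assumed parameter values):

```python
import numpy as np

def sfla(f, dim, n_frogs=30, n_memeplexes=5, iters=200, seed=5):
    """Minimal shuffled frog-leaping: sort frogs, deal them into memeplexes,
    move each memeplex's worst frog toward its best, then reshuffle."""
    rng = np.random.default_rng(seed)
    frogs = rng.uniform(-5, 5, (n_frogs, dim))
    for _ in range(iters):
        order = np.argsort([f(x) for x in frogs])
        frogs = frogs[order]                           # best first
        for m in range(n_memeplexes):
            idx = np.arange(m, n_frogs, n_memeplexes)  # deal like cards
            best, worst = idx[0], idx[-1]
            step = rng.random(dim) * (frogs[best] - frogs[worst])
            cand = frogs[worst] + step
            if f(cand) < f(frogs[worst]):
                frogs[worst] = cand
            else:                                      # failed leap: reset
                frogs[worst] = rng.uniform(-5, 5, dim)
    top = min(frogs, key=f)
    return top, f(top)

x, val = sfla(lambda p: float(np.sum(p ** 2)), dim=2)
print(x, val)  # converges toward the origin
```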

Journal ArticleDOI
TL;DR: In this article, a Monte Carlo simulation-based algorithm was used to optimize work-rest schedule in a hot and humid environment, which may maximize the direct-work rates and minimize the health hazard due to heat stress to the workers concerned.
Abstract: Having established a Monte Carlo simulation-based algorithm to optimize work–rest schedule in a hot and humid environment, this paper attempts to develop the algorithm and identify an optimal work pattern, which may maximize the direct-work rates and minimize the health hazard due to heat stress to the workers concerned. Traditionally, construction workers in Hong Kong start work at 8:00 a.m. and finish work at 6:00 p.m., having one hour lunch break between 12:00 p.m. and 1:00 p.m., and an additional break of 30 min at 3:15 p.m. Construction workers can beat the heat by starting earlier to avoid some extreme conditions, which may occur at certain times of a day. By maintaining the current practice of 9-h working duration for a day, 21 additional work patterns with different start and finish times were proposed and evaluated by the developed optimization algorithm. An optimized schedule (direct-work rate of 87.8%) of working from 7:30 a.m. to 12:00 p.m. with a 20 min break at 9:40 a.m., having lunc...

Journal ArticleDOI
TL;DR: In this article, a general framework and software system to support automated analysis of sewer inspection closed-circuit television (CCTV) videos is discussed; the proposed system aims primarily to support the off-site review and quality control process of the videos and to enable efficient reevaluation of archived CCTV videos to extract historical sewer condition data.
Abstract: This paper discusses the development of a general framework and software system to support automated analysis of sewer inspection closed-circuit television (CCTV) videos. The proposed system aims primarily to support the off-site review and quality control process of the videos and to enable efficient reevaluation of archived CCTV videos to extract historical sewer condition data. Automated analysis of sewer CCTV videos poses several challenges including the nonuniformity of camera motion and illumination conditions inside the sewer. The paper presents a novel algorithm for optical flow-based camera motion tracking to automatically identify, locate, and extract a limited set of video segments, called regions of interest (ROI), that likely include defects, thus reducing the time and computational requirements needed for video processing. The proposed algorithm attempts to recover the operator actions during the inspection session, which would enable determining the location and relative severity of...
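The ROI idea, that operators slow or stop the camera near defects, can be sketched by thresholding an estimated per-frame camera speed (here a made-up speed series stands in for the optical-flow output):

```python
import numpy as np

def regions_of_interest(speed, stop_thresh=0.2, min_len=3):
    """Frame ranges where estimated camera speed (e.g., from optical flow)
    stays below a threshold for several frames - likely defect stops."""
    rois, start = [], None
    for i, s in enumerate(speed):
        if s < stop_thresh and start is None:
            start = i
        elif s >= stop_thresh and start is not None:
            if i - start >= min_len:
                rois.append((start, i))
            start = None
    if start is not None and len(speed) - start >= min_len:
        rois.append((start, len(speed)))
    return rois

# Toy per-frame forward speed: the crawler pauses twice during the run
speed = np.array([1, 1, 1, 0.1, 0.05, 0.1, 0.1, 1, 1, 0.1, 1, 0.05, 0.1, 0.1])
print(regions_of_interest(speed))
```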

Journal ArticleDOI
TL;DR: In this paper, a stochastic, time-dependent integer program with recursive functions is proposed for the problem of assessing a rail-based freight transportation system's resilience to disaster events, and two solution methods are presented, both employing a decomposition approach that eliminates the need for recursive computations.
Abstract: A stochastic, time-dependent integer program with recursive functions is proposed for the problem of assessing a rail-based freight transportation system’s resilience to disaster events. This work adds to this notion of resilience by explicitly considering that only limited resources will be available to support recovery activities, and their simultaneous implementation assumed in the prior work may not be possible. That is, the order in which recovery actions are taken can greatly affect gains achieved in capacity recovery over time. By developing an optimal schedule for a set of chosen recovery actions for each potential disaster scenario, the proposed model provides a more accurate depiction of the system’s resilience to disaster. Two solution methods are presented, both employing a decomposition approach that eliminates the need for recursive computations. The first is an exact decomposition with branch-and-cut methodology, and the second is a hybrid genetic algorithm that evaluates each chromosome’s fitness based on optimal objective values to the time-dependent maximum flow subproblem. Algorithm performance is assessed on a test network.
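The effect of ordering recovery actions under limited resources can be illustrated with a greedy single-crew schedule that prioritizes capacity gain per unit repair time (a simple stand-in for the paper's exact decomposition and hybrid GA, with invented action data):

```python
def recovery_schedule(actions):
    """Greedy ordering of recovery actions under a single-crew constraint:
    highest capacity gain per unit duration first, so capacity is restored
    as early as possible. Returns (action, completion_time) pairs."""
    order = sorted(actions, key=lambda a: a["gain"] / a["duration"],
                   reverse=True)
    t, schedule = 0.0, []
    for act in order:
        t += act["duration"]
        schedule.append((act["name"], t))
    return schedule

actions = [
    {"name": "bridge_repair", "duration": 4.0, "gain": 40.0},
    {"name": "track_clearing", "duration": 1.0, "gain": 20.0},
    {"name": "signal_fix", "duration": 2.0, "gain": 10.0},
]
print(recovery_schedule(actions))
```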

Journal ArticleDOI
TL;DR: In this paper, an automated geographic information system (GIS) method using post-event point cloud data collected by terrestrial scanners and preevent aerial images was used to calculate the percentage of roof and wall damage and estimate wind speeds at an individual building scale.
Abstract: There are more than 1,000 tornadoes in the United States each year, yet engineers do not typically design for tornadoes because of insufficient information about wind loads. Collecting building-level damage data in the aftermath of tornadoes can improve the understanding of tornado winds, but these data are difficult to collect because of safety, time, and access constraints. This study presents and tests an automated geographic information system (GIS) method using postevent point cloud data collected by terrestrial scanners and preevent aerial images to calculate the percentage of roof and wall damage and estimate wind speeds at an individual building scale. Simulations determined that for typical point cloud density (>25 points/m2), a GIS raster cell size of 40–50 cm resulted in less than 10% error in damaged roof and wall detection. Data collected after recent tornadoes were used to correlate wind speed estimates and the percent of detected damage. The developed method estimated wind speeds f...
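The raster damage computation can be sketched as counting laser returns per GIS cell and flagging sparse cells as missing surface (cell size and point density below follow the abstract's ballpark figures, but the data are synthetic):

```python
import numpy as np

def roof_damage_percent(points, extent, cell=0.5, min_pts=3):
    """Rasterize post-event roof points into a GIS-style grid; cells with
    too few laser returns are counted as missing (damaged) roof surface."""
    (x0, x1), (y0, y1) = extent
    nx, ny = int((x1 - x0) / cell), int((y1 - y0) / cell)
    counts = np.zeros((nx, ny), dtype=int)
    ix = ((points[:, 0] - x0) / cell).astype(int).clip(0, nx - 1)
    iy = ((points[:, 1] - y0) / cell).astype(int).clip(0, ny - 1)
    np.add.at(counts, (ix, iy), 1)
    return 100.0 * (counts < min_pts).mean()

# Synthetic 10 m x 10 m roof at ~30 points/m^2, with one quadrant torn off
rng = np.random.default_rng(6)
pts = rng.uniform(0, 10, (3000, 2))
pts = pts[~((pts[:, 0] > 5) & (pts[:, 1] > 5))]  # remove the damaged quadrant
dmg = roof_damage_percent(pts, ((0, 10), (0, 10)))
print(round(dmg, 1))  # roughly a quarter of cells flagged
```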

Journal ArticleDOI
TL;DR: This study proposes an automated retrieval system for past accident cases that can automatically search for and provide (as a push system) similar accident cases.
Abstract: The repetitive occurrence of similar accidents is one of the most prevalent characteristics of construction accidents. Similar accident cases provide direct information for determining the risks of scheduled activities and for planning safety countermeasures. Moreover, understanding these cases gives laborers the chance to avoid, or prepare for, accidents expected in their workspaces. Researchers have developed many systems to retrieve and use past accident cases. Although these systems have clearly defined target users, most were built on ad hoc retrieval methods, which can be inconvenient for users. To overcome these limitations, this study proposes an automated retrieval system for past accident cases that can automatically search for and provide (as a push system) similar accident cases. The retrieval system extracts building information modeling (BIM) objects and composes a q...
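The push-style retrieval described above can be sketched as a ranking of stored cases against a query derived from BIM objects. This is only an illustrative sketch: Jaccard overlap of keyword sets stands in for the paper's similarity measure, and the function name and keyword sets are invented.

```python
def retrieve_similar_cases(query_keywords, case_db, top_k=3):
    """Rank past accident cases by keyword overlap with a BIM-derived query.

    query_keywords : set of terms extracted from scheduled BIM objects
    case_db        : {case_id: set of keywords describing the accident}
    Jaccard similarity stands in for the paper's retrieval measure.
    """
    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0
    ranked = sorted(case_db.items(),
                    key=lambda kv: jaccard(query_keywords, kv[1]),
                    reverse=True)
    return [cid for cid, _ in ranked[:top_k]]
```

A push system would run such a query automatically whenever an activity is scheduled, rather than waiting for a worker to search.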

Journal ArticleDOI
TL;DR: In this paper, a process management framework for multisensory data fusion for the purpose of tracking the progress of construction activity is presented, which facilitates the required type of data fusion at any given point in the construction progress, reliably and efficiently.
Abstract: This paper presents a process management framework for multisensory data fusion for the purpose of tracking the progress of construction activity. The developed framework facilitates the required type of data fusion at any given point in the construction progress, reliably and efficiently. Data are acquired from high-frequency automated technologies such as three-dimensional (3D) imaging and ultrawideband (UWB) positioning, in addition to foreman reports, schedule information, and other information sources. The results of validation through a detailed field implementation project show that the developed framework for fusing volumetric, positioning, and project control data can successfully address the challenges associated with fusing multisensory data by tracking activities rather than objects, a feature that offers superior capability, efficiency, and accuracy over the length of the project. Other contributions of this research include the development of fusion processes that are performed at hi...

Journal ArticleDOI
TL;DR: In this article, a recently developed meta-heuristic algorithm, called colliding bodies optimization (CBO), was introduced for optimal design of truss structure with dynamic constraints, where each agent solution is considered as a massed object or body.
Abstract: This paper introduces a recently developed meta-heuristic algorithm, colliding bodies optimization (CBO), and its utility for the optimal design of truss structures with dynamic constraints. The idea of the CBO is derived from one-dimensional collisions between bodies, in which each agent solution is considered a massed object or body. Optimization of structures with dynamic frequency constraints, whose search space is nonconvex and contains numerous local optima, is a suitable field for testing algorithms. Numerical results demonstrate the effectiveness of this meta-heuristic algorithm in this field of optimization. Comparative studies illustrate the superiority of the CBO algorithm over those previously reported in the literature. A parametric study is also conducted to investigate the effect of some parameters on the optimal weight of the structures.
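The collision idea can be sketched from the standard CBO formulation: bodies are sorted by objective value, the better half stays stationary, each worse body collides with its stationary pair, and post-collision velocities follow momentum conservation with a coefficient of restitution that decays over iterations. The minimal sketch below targets an unconstrained test function, not the paper's frequency-constrained trusses; the population size, iteration count, and bounds are arbitrary assumptions.

```python
import random

def cbo(f, dim, n=30, iters=300, lo=-5.0, hi=5.0, seed=0):
    """Minimal colliding bodies optimization (minimization); assumes f > 0."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    best, best_val = None, float("inf")
    for it in range(iters):
        fit = [f(x) for x in X]
        order = sorted(range(n), key=lambda i: fit[i])
        X = [X[i] for i in order]                 # best bodies first
        fit = [fit[i] for i in order]
        if fit[0] < best_val:
            best, best_val = list(X[0]), fit[0]
        mass = [1.0 / (v + 1e-12) for v in fit]   # better bodies are heavier
        eps = 1.0 - it / iters                    # restitution decays to zero
        half = n // 2
        newX = [list(x) for x in X]
        for k in range(half):
            i, j = k, k + half                    # stationary / moving pair
            mi, mj = mass[i], mass[j]
            for d in range(dim):
                v = X[i][d] - X[j][d]             # moving body's pre-impact velocity
                v_mov = (mj - eps * mi) * v / (mi + mj)
                v_sta = (mj + eps * mj) * v / (mi + mj)
                newX[i][d] = X[i][d] + rng.random() * v_sta
                newX[j][d] = X[i][d] + rng.random() * v_mov
        X = [[min(hi, max(lo, xi)) for xi in x] for x in newX]
    return best, best_val
```

A truss application would replace `f` with a penalized structural weight evaluated through a frequency analysis, which is where the dynamic constraints of the paper enter.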

Journal ArticleDOI
TL;DR: The use of text classifiers for automatically classifying documents according to their corresponding group of semantically related documents is evaluated; the highest classification accuracy was achieved by a Rocchio classifier and a kNN classifier when dimensionality reduction and tf-idf weighting were applied.
Abstract: Organizing construction project documents based on semantic similarities offers several advantages over traditional metadata criteria, including facilitating document retrieval and enhancing knowledge reuse. In this study, the use of text classifiers for automatically classifying documents according to their corresponding group of semantically related documents is evaluated. Supporting documents of claims were used as representations of document discourses. The evaluation was performed under varying general conditions (such as dimensionality level and weighting method) to assess the effect of such conditions on performance, and varying classifier-specific parameters. The highest performance in terms of classification accuracy was achieved by a Rocchio classifier and a kNN classifier with the application of dimensionality reduction and using the tf-idf weighting method. A combined classifier approach was also evaluated in which the classification outcome is based on a majority vote strategy between...
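The best-performing combination reported above (tf-idf weighting with a kNN classifier) can be sketched in a few functions. This is a generic illustration of the technique, not the paper's system: the token lists, labels, and sublinear tf weighting scheme are assumptions, and no dimensionality reduction is applied.

```python
import math
from collections import Counter

def vectorize(tokens, df, n):
    """tf-idf weights: (1 + log tf) * log(N / df); drops terms found in every doc."""
    tf = Counter(tokens)
    return {t: (1 + math.log(c)) * math.log(n / df[t])
            for t, c in tf.items() if t in df and df[t] < n}

def fit_tfidf(docs):
    """docs: {doc_id: token list} -> (vectors, document frequencies, N)."""
    n = len(docs)
    df = Counter(t for toks in docs.values() for t in set(toks))
    return {d: vectorize(toks, df, n) for d, toks in docs.items()}, df, n

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_classify(query_tokens, docs, labels, k=3):
    """Majority vote among the k training documents nearest to the query."""
    vecs, df, n = fit_tfidf(docs)
    q = vectorize(query_tokens, df, n)
    nearest = sorted(vecs, key=lambda d: cosine(q, vecs[d]), reverse=True)[:k]
    return Counter(labels[d] for d in nearest).most_common(1)[0][0]
```

A Rocchio classifier would instead average the vectors of each class into a centroid and assign the query to the nearest centroid.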

Journal ArticleDOI
TL;DR: As discussed in this paper, in addition to inherent degradation with time, geologic hazards such as coastal erosion, landslides, and seismic activity constantly threaten public infrastructure, such as road and rail networks.
Abstract: In addition to inherent degradation with time, geologic hazards such as coastal erosion, landslides, and seismic activity constantly threaten public infrastructure. Repeat surveys using ter...

Journal ArticleDOI
TL;DR: A novel intelligent back-analysis method combining fuzzy systems, the imperialistic competitive algorithm, and numerical analysis is proposed; among the tuned fuzzy models, the particle swarm optimization-tuned model proved the most accurate.
Abstract: Tunnels are often designed from uncertain geotechnical data. To reduce these uncertainties, back analysis is commonly used to re-estimate the assumed parameters. This paper presents a novel, intelligent back-analysis method combining fuzzy systems, the imperialistic competitive algorithm, and numerical analysis. The proposed methodology comprises three phases. First, a database from a real case study and numerical analysis are used to develop the training and testing data of the study. In the second phase, the nonlinear relationship of two sets of parameters, including geomechanical parameters of the soil mass and the zone stress conditions, with surface settlement is investigated by three fuzzy models. These models are tuned by three methods: particle swarm optimization, the imperialistic competitive algorithm, and an integration of nearest-neighborhood clustering with gradient descent training. In the last phase, the imperialistic competitive algorithm is employed a second time to implement the back-analysis procedure in the three tuned fuzzy models. Finally, the models are verified by running the numerical analysis on the back-analysis results and comparing the outcomes with the measured settlement values. The results identify the particle swarm optimization-tuned fuzzy model as the most accurate intelligent model.
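The back-analysis idea, fit model parameters so that predicted surface settlements match measurements, can be sketched with a particle swarm optimizer, the tuning method the paper found most accurate. Everything below is an assumption for illustration: the Gaussian settlement trough stands in for the paper's fuzzy/numerical forward model, and the synthetic "measurements" are generated from known parameters so the optimizer's recovery can be checked.

```python
import math, random

def trough(params, x):
    """Illustrative forward model, a Gaussian settlement trough:
    s(x) = s_max * exp(-x^2 / (2 i^2)), with trough width i."""
    s_max, i = params
    return s_max * math.exp(-x * x / (2.0 * i * i))

def misfit(params, xs, measured):
    """Sum of squared errors between predicted and measured settlements."""
    return sum((trough(params, x) - m) ** 2 for x, m in zip(xs, measured))

def pso(obj, bounds, n=30, iters=300, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Standard global-best particle swarm optimization (minimization)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [list(p) for p in pos]
    pval = [obj(p) for p in pos]
    g = min(range(n), key=lambda i: pval[i])
    gbest, gval = list(pbest[g]), pval[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = obj(pos[i])
            if v < pval[i]:
                pbest[i], pval[i] = list(pos[i]), v
                if v < gval:
                    gbest, gval = list(pos[i]), v
    return gbest, gval

# synthetic "measurements" from known parameters (s_max = 20 mm, i = 8 m)
xs = [-20, -10, -5, 0, 5, 10, 20]
measured = [trough((20.0, 8.0), x) for x in xs]
params, err = pso(lambda p: misfit(p, xs, measured),
                  bounds=[(1.0, 50.0), (1.0, 20.0)])
```

In the paper's setting, the objective would instead measure the mismatch between the tuned fuzzy model's settlement prediction and field measurements, with the geomechanical parameters as decision variables.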