19 results for "Hu, Wei"
Search Results
2. Physical modelling of group behaviour of stone column foundations
- Author
- Hu, Wei
- Subjects
- 624, TA Engineering (General). Civil engineering (General)
- Published
- 1995
3. Decarbonization of Engineered Cementitious Composites (ECC)
- Author
- Hu, Wei-Hsiu
- Subjects
- Engineered Cementitious Composites (ECC), Carbon Sequestration, Sustainable Low-carbon Materials, Industrial Waste Materials (IWMs), Embodied Carbon Footprint
- Abstract
Mitigating CO2 emissions has emerged as one of the most critical global challenges. The concrete industry accounts for approximately 8% of global CO2 emissions, due primarily to the large carbon footprint of ordinary Portland cement production. Concrete's brittle nature necessitates early infrastructure reconstruction and rehabilitation, leading to high operational embodied carbon emissions throughout its service life. Engineered cementitious composites (ECC) have demonstrated a capacity to enhance structural fatigue resistance and reduce CO2 emissions during the use phase through their high tensile performance and crack width control capability. However, ECC's high cement content and use of synthetic fiber incur substantial economic and environmental costs. Therefore, there is an urgent need to address ECC's high embodied carbon footprint during the production phase if it is to be used as a sustainable alternative to traditional concrete. The goals of this doctoral research encompass the development of strategies to decarbonize ECC while maintaining its unique ductile performance, and the demonstration of its economic and environmental competitiveness compared to regular concrete. Three major approaches are proposed in this research: carbon sequestration through carbonation curing, the use of industrial waste materials (IWMs), and the employment of localized materials. The impacts of carbonation curing on ECC, such as changes to mechanical and micromechanical properties, are investigated. With the incorporation of IWMs, a low-carbon, sustainable WPE-ECC is designed by replacing virgin polyethylene fiber with waste polyethylene (WPE) fiber from waste marine fishing nets. The low-carbon ECC's mechanical properties, including compressive, tensile, and flexural strength as well as ductility, are examined.
Considering the increasing cost and limited availability of commonly used IWMs such as fly ash and manufactured silica sand, a case study of the Kingdom of Saudi Arabia examines replacing these materials with locally available alternatives, namely volcanic ash and desert sand, to mitigate the embodied carbon and cost associated with long-distance material transportation. A localized self-stressing ECC is developed and optimized to mitigate the challenges posed by the alternative materials and to ensure a sufficient working time window and mechanical performance. The reductions in embodied carbon footprint and cost for each of these three approaches are quantified and compared to conventional concrete materials. Results indicate that carbonation curing significantly improves fatigue life and reduces the midspan deflection of ECC. CO2-cured ECC exhibits approximately 20% CO2 uptake per unit cement mass. Carbonation curing increases ECC's flexural strength and promotes effective crack width control, resulting in reduced post-fatigue crack width. The positive impact of carbonation curing on the fatigue behavior of ECC can simultaneously lower the embodied and operational carbon of ECC structural members during service. In the case of the IWM method, the findings suggest that carbonation-cured WPE-reinforced ECC has only 50% of the CO2 footprint and 67% of the cost of conventional concrete. Meanwhile, this low-carbon ECC maintains at least 4 MPa tensile strength and 6% tensile ductility, demonstrating the feasibility of developing environmentally friendly construction materials without compromising high performance for civil infrastructure applications. Similarly, the localized self-stressing ECC exhibits mechanical performance comparable to other ECC grades, showing the feasibility of replacing fly ash and silica sand with locally available materials and resulting in a low-carbon ECC with promising implications for practical construction applications.
This research provides three distinct approaches for ECC decarbonization that can be integrated with one another, offering a potential pathway into the construction industry that urgently needs to be decarbonized.
- Published
- 2023
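The headline figures in the abstract above (roughly 20% CO2 uptake per unit cement mass, and a WPE-ECC at about 50% of the CO2 footprint and 67% of the cost of conventional concrete) lend themselves to a back-of-the-envelope sketch. The baseline values below are assumed placeholders, not data from the thesis.

```python
# Back-of-the-envelope use of the ratios reported in the abstract.
# BASELINE_* values are assumed placeholders, not figures from the thesis.

BASELINE_CO2_KG_PER_M3 = 300.0  # assumed footprint of conventional concrete
BASELINE_COST_PER_M3 = 100.0    # assumed cost of conventional concrete

def wpe_ecc_estimates(baseline_co2, baseline_cost):
    """Apply the reported ~50% CO2 / ~67% cost ratios for WPE-ECC."""
    return 0.50 * baseline_co2, 0.67 * baseline_cost

def carbonation_uptake(cement_mass_kg):
    """Reported ~20% CO2 uptake per unit cement mass under carbonation curing."""
    return 0.20 * cement_mass_kg

co2, cost = wpe_ecc_estimates(BASELINE_CO2_KG_PER_M3, BASELINE_COST_PER_M3)
print(round(co2, 1), round(cost, 1))        # 150.0 67.0
print(round(carbonation_uptake(400.0), 1))  # 80.0
```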
4. The Use of QTL In Hebrew Aphorism
- Author
- Hu, Wei-Hua, primary
- Full Text
- View/download PDF
5. Threshold dynamics : analysis and applications
- Author
- Hu, Wei, primary
- Full Text
- View/download PDF
6. Experimental search for high Curie temperature piezoelectric ceramics with combinatorial approaches
- Author
- Hu, Wei, primary
- Full Text
- View/download PDF
7. Graph signal processing for compression and restoration of piecewise smooth images
- Author
- Hu, Wei, primary
- Full Text
- View/download PDF
8. Data-driven metallurgical design for high strength low alloy (HSLA) steel
- Author
- Hu, Wei, primary
- Full Text
- View/download PDF
9. Iowa wind resource assessment and analysis
- Author
- Hu, Wei, primary
- Full Text
- View/download PDF
10. Urban Recreational Ecosystem Services Investigation based on Social Media Images
- Author
- Hu, Wei
- Subjects
- social media, ecosystem, urban
- Abstract
Recreational ecosystem services (RES) are understood as the benefits that people derive from landscapes and natural environments through recreational activities. Growing social media datasets have helped RES studies overcome the limitations in spatial and temporal coverage of traditional survey-based approaches. Related RES research using social media, such as photo-sharing platforms, has primarily focused on natural and ecological areas outside cities at regional or national scales and has used geotagged photographs as reliable proxies for empirical access rates. The urban dimension of RES is under-explored, and the potential information about environmental composition and user preferences contained in photos is overlooked. Using data retrieved from the photo-sharing platform Flickr, we explore the potential role of computer vision (CV) in understanding RES related to environmental composition and human activities. We then assess RES for the urban outdoor environment of Ann Arbor. Specifically, by manually validating recognition results for 1,500 Flickr photographs, we evaluate whether scene recognition algorithms and models pre-trained with three different labeling systems on a standard CV dataset can be applied to tackle complex visual tasks in realistic urban scenarios. In contrast to their consistently outstanding performance on standard CV datasets, we find substantial changes in the models' performance in recognizing physical environmental composition and human activities, depending on the semantic scale the model uses for labeling. Using the recognition results, we further study people's preferences for environmental composition and outdoor activities and their associations, then detect popular RES places for different recreational usages in Ann Arbor at high spatial resolution. This article concludes on the feasibility of applying pre-trained CV models to urban RES studies.
Time and resources permitting, future studies should consider combining information from multiple sources for a more accurate evaluation of RES characteristics, which can then be integrated with decision making, planning, and management to enhance city planning and human well-being.
- Published
- 2022
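The manual-validation step described in the abstract above (checking model-predicted scene labels against human annotations for 1,500 photographs) reduces to per-label precision. This is a minimal, hypothetical sketch of that computation, not the study's actual pipeline; the labels are invented.

```python
# Per-label precision of predicted scene labels against manual annotations.
# Labels and photo data here are hypothetical.

from collections import defaultdict

def per_label_precision(predictions, annotations):
    """precision[label] = correct predictions of label / total predictions of label."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for pred, truth in zip(predictions, annotations):
        totals[pred] += 1
        if pred == truth:
            hits[pred] += 1
    return {label: hits[label] / totals[label] for label in totals}

preds = ["park", "park", "street", "lake"]
truth = ["park", "street", "street", "lake"]
print(per_label_precision(preds, truth))  # {'park': 0.5, 'street': 1.0, 'lake': 1.0}
```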
11. Mirror Color Symmetry Breaking in Twin Higgs Model
- Author
- Hu, Wei
- Abstract
Many conventional approaches to the hierarchy problem necessitate colored top partners around the TeV scale, in tension with bounds from direct searches. The Mirror Twin Higgs (MTH) model addresses this by positing top partners that are neutral under the Standard Model (SM) gauge group. The SM Higgs emerges as a pseudo-Nambu-Goldstone boson (pNGB) from a spontaneously broken accidental global symmetry. A crucial ingredient is a $\mathbb{Z}_2$ mirror symmetry that exchanges SM fields with partner fields with equal couplings, removing the quadratic UV sensitivity. However, an exact mirror symmetry is in conflict with Higgs coupling measurements, so the $\mathbb{Z}_2$ must be broken to achieve a viable model. In this thesis, we describe a new dynamical approach. Starting from an exact $\mathbb{Z}_2$, we introduce an additional colored scalar field in the visible sector along with its twin partner field. Given a suitable potential, the mirror-sector colored scalar field obtains a vacuum expectation value and spontaneously breaks both the twin color gauge and $\mathbb{Z}_2$ symmetries. Meanwhile, dramatic differences between the twin and visible sectors arise in the residual unbroken gauge symmetries, strong confinement scales, and particle spectra. Assuming a single colored scalar in the triplet, sextet, or octet representation, we describe five minimal possibilities. In several cases there is a residual color symmetry, either $SU(2)_c$ or $SO(3)_c$, featuring a low confinement scale relative to $\Lambda_{\rm QCD}$. Furthermore, there can be one or more unbroken abelian gauge symmetries. Couplings between the colored scalar and matter are also allowed, providing a new source of twin fermion masses. This implies a fraternal-like scenario by lifting the first- and second-generation twin fermions.
A variety of correlated visible-sector effects can be probed through precision measurements and collider searches, arising from baryon and lepton number violation, flavor-changing processes, CP violation, electroweak measurements, Higgs couplings, and direct searches at the LHC. This opens up new possibilities for a viable twin Higgs cosmology, with interesting implications for dark sector physics.
- Published
- 2021
12. Statistical Learning for High-dimensional Imaging Data Analysis
- Author
- Hu, Wei
- Subjects
- Statistics
- Abstract
The past two decades have witnessed tremendous advancement in medical imaging techniques. The explosive growth of high-dimensional imaging data brings new challenges to statisticians. Machine learning has opened new horizons in a variety of tasks, including image recognition and restoration, personalized medicine, medical image analysis, and many others. However, machine learning systems remain mostly black boxes despite widespread adoption. Understanding the statistical properties of, and the predictions behind, black-box models is crucial, as it can help to interpret the analysis results. This dissertation is dedicated to the development of new statistical learning methods for image data analysis and to new insights into the behavior of black-box predictive models. We start by proposing a novel linear discriminant analysis approach for the classification of high-dimensional matrix-valued data that commonly arises in imaging studies. Motivated by the equivalence of conventional linear discriminant analysis and ordinary least squares, we consider an efficient nuclear norm penalized regression that encourages a low-rank structure. Theoretical properties, including a non-asymptotic risk bound and a rank consistency result, are established. Simulation studies and an application to electroencephalography data show the superior performance of the proposed method over existing approaches. Next, we propose a novel nonparametric matrix response regression model to characterize the association between 2D image outcomes and predictors such as time and patient information. Our estimation procedure can be formulated as a nuclear norm regularization problem, which can capture the underlying low-rank structures of the dynamic 2D images. We develop an efficient algorithm to solve the optimization problem and introduce a Bayesian information criterion for our model to select the tuning parameters. Asymptotic theories, including the risk bound and rank consistency, are derived.
We evaluate the empirical performance of this method using numerical simulations and real data applications from a calcium imaging study and an electroencephalography study. Finally, we propose to trace the predictions of a black-box model back to the training data through a representer theorem calibrated on a continuous, low-dimensional latent space, making the model more transparent. We show that, for a given test point and a certain class, the pre-activation prediction value can be decomposed into a sum of representer values, where each representer value corresponds to the importance of a training point to the model prediction. These representer values give users a deeper understanding of how the training points lead the machine learning system to its prediction. We further elaborate our method through theoretical studies, numerical experiments, and applications such as model debugging.
- Published
- 2019
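Nuclear norm penalized problems like the two models described in the abstract above are commonly solved with proximal gradient methods whose key step is singular-value soft-thresholding. The sketch below shows that generic operator, not the dissertation's own implementation.

```python
# Singular-value soft-thresholding: the proximal operator of tau * ||.||_*,
# the workhorse step in nuclear norm penalized estimation.

import numpy as np

def svd_soft_threshold(B, tau):
    """Shrink the singular values of B by tau, zeroing the small ones."""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

B = np.diag([3.0, 1.0, 0.2])
low_rank = svd_soft_threshold(B, 0.5)   # singular values 3, 1, 0.2 -> 2.5, 0.5, 0
print(np.linalg.matrix_rank(low_rank))  # 2
```

The shrinkage is what induces the low-rank structure: any singular value below the threshold is set exactly to zero.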
13. Evaluation of Intelligent Compaction Technology in Asphalt Pavement Construction and Laboratory Compaction
- Author
- Hu, Wei
- Subjects
- Compaction Meter Value, Geostatistical analysis, Semivariogram, Intelligent compaction, Asphalt vibratory compactor, Asphalt pavement
- Abstract
While it has been used successfully for soil compaction for many years, intelligent compaction (IC) technology is still relatively new to asphalt pavement construction. The potential of using the intelligent compaction meter value (ICMV) for evaluating the compaction of asphalt pavements has been hindered by the fact that ICMV can be affected by many factors, including not only roller operation parameters but also the temperature of the asphalt layer and the underlying support. Therefore, further research is necessary to improve the application of IC to asphalt compaction. In this study, the feasibility of IC for asphalt compaction was evaluated from many aspects, and on that basis, an IC technology for evaluating asphalt mixture compaction in the laboratory was also developed. One field project for soil compaction was constructed using IC technology, and a strong and stable linear relationship between ICMV and deflection could be identified when the water content of the soil was consistent. After that, more field projects for asphalt compaction were constructed using the IC asphalt roller. The density of the asphalt, the most critical parameter for asphalt layers, was measured along with other parameters and correlated with the ICMVs. Various factors, such as asphalt temperature and the underlying support, were considered to improve the correlation between density and ICMV. Based upon the results of the correlation analyses, three IC parameters were recommended for evaluating the compaction quality of resurfacing projects. In addition, geostatistical analyses were performed to evaluate the spatial uniformity of compaction, and a cost-benefit analysis was included to demonstrate the economic benefits of IC technology. Based on the test results of the field projects, the IC indices were further utilized to quantify laboratory vibratory compaction of paving materials.
The compaction process in the laboratory was monitored with accelerometers. Using the discrete-time Fourier transform, the data recorded during compaction were analyzed to evaluate the compactability of the paving materials and to correlate laboratory results with field compaction.
- Published
- 2018
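The accelerometer analysis described above rests on frequency-domain decomposition of the compaction signal. A minimal sketch using a discrete Fourier transform follows; the signal and the 30 Hz drum frequency are synthetic examples, not values from the thesis.

```python
# Estimate the dominant vibration frequency of a compaction signal with a
# discrete Fourier transform. The 30 Hz signal below is synthetic.

import numpy as np

def dominant_frequency(signal, sample_rate_hz):
    """Frequency bin with the largest spectral magnitude (DC excluded)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    return freqs[np.argmax(spectrum[1:]) + 1]

fs = 1000.0                          # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)      # one second of samples
drum = np.sin(2 * np.pi * 30.0 * t)  # simulated 30 Hz drum vibration
print(dominant_frequency(drum, fs))  # 30.0
```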
14. Synthesizing New Dielectric Elastomers for Actuation
- Author
- Hu, Wei
- Subjects
- Materials Science
- Abstract
Dielectric elastomers can be actuated by an electric field in response to electrostatic forces. Compared with other electrical actuation technologies, the advantages of dielectric elastomer actuators include light weight, good compliance, large actuation strain, high energy density, quiet operation, and low cost. As the active part of an actuator device, the dielectric elastomer material plays a central role. However, the most popular elastomers used for dielectric actuation are commercial products designed for other applications, and their confidential formulations make it difficult to understand the actuation mechanism and further improve performance. Therefore, the development of new dielectric elastomers from the molecular level is of great importance. One subject of this dissertation is the synthesis, from the molecular level, of a group of new dielectric elastomers that demonstrate high actuation strains. These dielectric elastomers are polyacrylate formulations with n-butyl acrylate as the base monomer, formed through ultraviolet polymerization. The influence of acrylic acid in the formulation on the mechanical and dielectric properties is investigated. The optimal formulation demonstrates an area actuation strain of 186%, a dielectric strength of 222 MV/m, and an energy density as high as 1.4 MJ/m³. As the dielectric constant of a dielectric elastomer plays a significant role in its actuation performance, one focus of this dissertation is improving the dielectric constant by utilizing nanocomposites. Aluminum nanoparticles with a self-passivated oxide shell are used as conductive fillers to increase the dielectric constant of a polyacrylate elastomer while retaining a high dielectric strength. With the addition of 4 vol% Al nanoparticles, the nanocomposite has a dielectric constant as high as 8.4, with a maximum actuation strain of 56%, a dielectric strength of 140 MV/m, and a maximum actuation pressure of 1.5 MPa.
Another focus of this dissertation is the innovation of a dielectric elastomer with tunable stiffness. This novel elastomer contains furan-maleimide Diels-Alder adduct moieties as dynamic bonds. The moduli of these elastomers can be tuned reversibly and incrementally by modulating their crosslinking densities via thermal treatments at moderate temperatures. Capacitive sensors and actuators that can work in multiple modes were fabricated using the new materials.
- Published
- 2015
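The actuation figures reported above can be cross-checked against the standard Maxwell stress relation p = ε0·εr·E² for dielectric elastomer films. Plugging in the abstract's dielectric constant (8.4) and dielectric strength (140 MV/m) recovers roughly the stated 1.5 MPa maximum actuation pressure; this is a consistency check, not the dissertation's own calculation.

```python
# Maxwell stress for a dielectric elastomer film: p = eps0 * eps_r * E^2.
# Values below are the ones quoted in the abstract (eps_r = 8.4, E = 140 MV/m).

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def maxwell_pressure(eps_r, e_field_v_per_m):
    """Electrostatic actuation pressure on a dielectric elastomer film."""
    return EPS0 * eps_r * e_field_v_per_m ** 2

p_mpa = maxwell_pressure(8.4, 140e6) / 1e6
print(round(p_mpa, 2))  # 1.46 -- consistent with the reported 1.5 MPa maximum
```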
15. Using Citation Influence and Social Network Analysis to Predict Software Defects
- Author
- Hu, Wei
- Subjects
- Citation influence, Software defects, Social network analysis
- Abstract
Resource constraints, e.g., a lack of time and human resources, are a major issue in software testing practice. In short, testers have limited time to test software systems. Therefore, managers are expected to spend more resources on software components that are likely to contain many defects. To help managers make better decisions about selective testing, it is beneficial to identify defect-prone software components before the actual testing. In this thesis, we propose a model for software defect prediction. The proposed model combines the topological properties of the software dependency network and the textual information in source code to predict defect-prone software components. We evaluate our model on data from the Eclipse, Netbeans, and Gnome projects at different levels of granularity. The evaluation results are encouraging, showing that our model achieves higher prediction accuracy than prior work.
- Published
- 2013
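The model described above combines dependency-network topology with textual information from source code. As a minimal illustration of one such topological feature, the sketch below computes in/out degrees over a hypothetical component dependency graph; the component names and edges are invented.

```python
# In/out degree of each component in a dependency graph: one simple
# topological feature of the kind such models draw on. Edges are invented.

from collections import defaultdict

def degree_features(dependencies):
    """dependencies: list of (src, dst) edges; returns {node: (out_deg, in_deg)}."""
    out_deg = defaultdict(int)
    in_deg = defaultdict(int)
    for src, dst in dependencies:
        out_deg[src] += 1
        in_deg[dst] += 1
    nodes = set(out_deg) | set(in_deg)
    return {n: (out_deg[n], in_deg[n]) for n in nodes}

edges = [("ui", "core"), ("net", "core"), ("core", "util")]
features = degree_features(edges)
print(features["core"])  # (1, 2) -- depends on one component, depended on by two
```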
16. Sufficiency-based Filtering of Invariants for Sequential Equivalence Checking
- Author
- Hu, Wei
- Subjects
- Invariant filtering, Assume and Verify, Boolean Satisfiability (SAT), Sequential Equivalence Checking (SEC)
- Abstract
Verification, as opposed to Testing and Post-Silicon Validation, is a critical step in Integrated Circuit (IC) design, answering the question "Are we designing the right function?" before the chips are manufactured. One of the core areas of Verification is Equivalence Checking (EC), which is a special yet independent case of Model Checking (MC). Equivalence Checking aims to prove that two circuits, when fed with the same inputs, produce exactly the same outputs. There are broadly two ways to conduct Equivalence Checking: simulation and Formal Equivalence Checking. Simulation requires one to try out different input combinations and observe whether the two circuits produce the same output. Obviously, since it is not possible to enumerate all combinations of different inputs, completeness cannot be guaranteed. On the other hand, Formal Equivalence Checking can achieve 100% confidence. As the number of gates, and in particular the number of flip-flops, in circuits has grown tremendously in recent years, the problem of Formal Equivalence Checking has become much harder. A recent evaluation of a general-case Formal Equivalence Checking engine [1] shows that about 15% of industrial designs cannot be verified after a typical sequential synthesis flow. As a result, a lot of attention has been drawn to Formal Equivalence Checking, both academically and industrially. For years, Combinational Equivalence Checking (CEC) has been the pervasive framework for Formal Equivalence Checking (FEC) in industry. However, due to the limitation of being able to verify circuits only with 1:1 flip-flop pairing, a pure CEC-based methodology requires a full regression of the verification process, meaning that performing sequential optimizations like retiming or FSM re-encoding becomes somewhat of a bottleneck in the design cycle [2]. Therefore, a more powerful framework — Sequential Equivalence Checking (SEC) — has been gradually adopted in industry.
In this thesis, we target Sequential Equivalence Checking by finding an efficient yet powerful group of relationships (invariants) among the signals of the two circuits being compared. In order to achieve a high success rate on some extremely hard-to-verify circuits, we are interested in both two-node and multi-node (up to 4-node) invariants, among both flip-flops and internal signals. For large circuits, there can be too many potential invariants, requiring much time to prove; however, we observed that a large portion of them may not even contribute to equivalence checking. Moreover, equivalence checking can be helped significantly if there exists a method to check whether a subset of potential invariants would be sufficient (e.g., whether two-node invariants are enough or multi-node invariants are also needed) prior to the verification step. Therefore, we propose two sufficiency-based approaches to identify useful invariants among the initial potential invariants for SEC. Experimental results show that our approaches can either demonstrate the insufficiency of the invariants or select a small portion of them that successfully proves the equivalence property. Our approaches are largely case-independent and flexible: they can be applied to circuits produced with different synthesis techniques and combined with other techniques.
- Published
- 2011
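Candidate invariants such as the two-node relationships mentioned above are commonly mined from simulation traces before being proved formally. The sketch below flags signal pairs that never disagree across simulated vectors as equivalence candidates; it is a generic illustration of that mining step, not the thesis's filtering algorithm.

```python
# Mine candidate two-node equivalence invariants from simulation traces:
# signal pairs that never disagree across the simulated vectors.
# Signal names and trace values are invented.

from itertools import combinations

def equivalence_candidates(traces):
    """traces: {signal_name: list of 0/1 values over simulation cycles}."""
    return [
        (a, b)
        for a, b in combinations(sorted(traces), 2)
        if traces[a] == traces[b]
    ]

traces = {"s1": [0, 1, 1, 0], "s2": [0, 1, 1, 0], "s3": [1, 1, 0, 0]}
print(equivalence_candidates(traces))  # [('s1', 's2')]
```

Simulation can only rule candidates out, not prove them, which is why such candidates are then handed to a formal (e.g., SAT-based) engine.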
17. VALUATION OF NON-TRANSFERABLE AND NON-HEDGEABLE CONTINGENT CLAIMS AND AN EXECUTIVE STOCK OPTIONS IMPLEMENTATION
- Author
- Hu, Wei
- Subjects
- Stochastic discount factor, ESO, Constrained portfolio optimization, Credit risk, Reload option
- Abstract
As traditional contingent claims valuation methods do not apply to non-transferable and non-hedgeable contingent claims, the recent proliferation of such claims creates the need for new valuation methods. Further, the essential role of executive stock options (henceforth ESOs) in the current economic system makes their valuation a necessity for optimally allocating resources and incentivizing executives. In this thesis, I offer a novel method of valuing non-transferable, non-hedgeable (henceforth NTNH) contingent claims and then implement this method in pricing ESOs. I find that NTNH constraints break the local co-linearity caused by including contingent claims in portfolio optimization problems. Thus, I am able to translate portfolio optimization problems that include contingent claims into optimization problems over primary assets only, by replicating the contingent claims using primary assets. I integrate the NTNH constraints into a single rectangular constraint, under which solving the portfolio optimization problem identifies the pricing stochastic discount factor. I then use this stochastic discount factor to price the NTNH contingent claims and implement the method in pricing ESOs. I investigate both block exercise and continuous partial exercise, and derive the first-order conditions with respect to optimal exercise rates for the continuous partial exercise case. The priced assets could also be pensions, human capital, real estate, etc. I also address the valuation of NTNH contingent claims subject to default. I extend the above model by introducing defaultable primary assets that help replicate defaultable contingent claims. Again, I derive a stochastic discount factor to price the defaultable NTNH contingent claims, and I implement the valuation method in pricing ESOs with job termination.
Finally, I apply the NTNH contingent claims valuation method to reload option pricing, again by replication, solving portfolio optimization problems, and identifying the appropriate stochastic discount factor. I start in an unconstrained setting and find that, as the frequency of reload increases, the optimal reload policy evolves from an optimal stopping time into a barrier hitting time. The barrier is the historical high price, and the number of replicating shares converges. When the mature-stock-for-strike convention is added to the reload option exercise policy, if the vesting periods for stock and option are optimally chosen, then the option quality measure, the incentive per unit of deadweight cost, increases.
- Published
- 2011
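Once a stochastic discount factor m has been identified, a claim with payoff X prices as E[m·X]. The sketch below evaluates that expectation by simple Monte Carlo averaging over sampled states; all numbers are hypothetical and the sketch is an illustration of the pricing identity, not the thesis's method.

```python
# Price a claim as the sample average of m_i * x_i, where m is a stochastic
# discount factor and x the claim's payoff. All numbers are hypothetical.

def sdf_price(discount_factors, payoffs):
    """Monte Carlo estimate of E[m * X] over sampled states."""
    n = len(discount_factors)
    return sum(m * x for m, x in zip(discount_factors, payoffs)) / n

m = [0.9, 0.95, 1.0, 1.05]  # sampled stochastic discount factor values
x = [10.0, 0.0, 5.0, 2.0]   # claim payoffs in the same sampled states
print(round(sdf_price(m, x), 3))  # 4.025
```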
18. Spatial Control of Gene Delivery on Bioengineered Scaffolds for Tissue Regeneration.
- Author
- Hu, Wei-Wen
- Subjects
- Tissue Engineering, Adenovirus, Gene Therapy, Bioconjugation, Surface Modification, Biomaterial Scaffolds
- Abstract
Different gene delivery systems were developed in this dissertation to promote tissue regeneration through regenerative in vivo gene therapy. A local virus delivery method was developed using a lyophilized adenovirus formulation to restrict viral vector delivery to in and around biomaterials. This strategy may reduce the dispersion of virus, avoiding unwanted systemic infection, and decrease the viral concentration within scaffolds. We also determined that virus bioactivity can be preserved during long-term storage using this method, which allows freeze-dried adenoviruses to be incorporated with biomaterials as a pre-made construct to be used at the time of surgery. This delivery method has been applied to successfully repair not only critical-sized craniofacial defects but also osteonecrosis caused by radiation therapy. To enhance the spatial control of gene delivery, two different strategies were established to effectively bind viral vectors on scaffold surfaces: avidin-biotin and antibody-antigen interactions were used to mediate virus immobilization. By binding viral vectors to biomaterials, only cells that adhered and proliferated on scaffolds would be transduced to express bioactive signals. Furthermore, a wax masking technique was introduced to confine the bioconjugation to defined regions of biomaterials, spatially controlling transgene expression. In order to broadly apply the immobilized gene delivery methods to different biomaterial scaffolds, chemical vapor deposition (CVD) polymerization was utilized to functionalize the surfaces of an inert biomaterial, poly-ε-caprolactone (PCL), for immobilization of cell-signaling viruses. This surface modification could be performed on both 2-D and 3-D structures. Through these controlled gene delivery systems, bioactive factors may be precisely expressed to engineer distinct tissue interfaces.
- Published
- 2009
19. Cross-Cultural Impact and Learning Needs for Expatriate Hotel Employees in Taiwan Lodging Industry
- Author
- Hu, Wei-Tang
- Published
- 2002
Discovery Service for Jio Institute Digital Library