
Showing papers on "Heap (data structure) published in 2021"


Journal ArticleDOI
TL;DR: A new, improved optimization algorithm based on the heap-based optimizer (HBO) is proposed to estimate the unknown parameters of PEMFC models, using an objective function that minimizes the error between the measured and estimated data.

32 citations


Journal ArticleDOI
26 Aug 2021
TL;DR: The experimental results demonstrate that the proposed hybrid HBJSA outperforms the standard HBA and JSA and other reported techniques when handling the CHP economic dispatch.
Abstract: This paper proposes a hybrid algorithm that combines two prominent nature-inspired meta-heuristic strategies to solve the combined heat and power (CHP) economic dispatch. To this end, an innovative hybrid heap-based and jellyfish search algorithm (HBJSA) is developed to enhance the performance of two recent algorithms: the heap-based algorithm (HBA) and the jellyfish search algorithm (JSA). The proposed hybrid HBJSA seeks to make use of the explorative features of HBA and the exploitative features of JSA to overcome some of the problems found in their standard forms. The proposed hybrid HBJSA, HBA, and JSA are validated and statistically compared by attempting to solve a real-world optimization issue of the CHP economic dispatch. It aims to satisfy the power and heat demands and minimize the whole fuel cost (WFC) of the power and heat generation units. Additionally, a series of operational and electrical constraints, such as the non-convex feasible operating regions of CHP units and the valve-point effects of power-only plants, are considered in solving such a problem. The proposed hybrid HBJSA, HBA, and JSA are employed on two medium systems, the 24-unit and 48-unit systems, and two large systems, the 84-unit and 96-unit systems. The experimental results demonstrate that the proposed hybrid HBJSA outperforms the standard HBA and JSA and other reported techniques when handling the CHP economic dispatch. Moreover, comparative analyses are carried out to demonstrate the suggested HBJSA's strong stability and robustness in determining the lowest minimum, average, and maximum WFC values compared to the HBA and JSA.
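The abstract describes the hybridization only at a conceptual level (HBA supplying exploration, JSA supplying exploitation). A minimal Python sketch of one way such a phase-switching hybrid could be organized is shown below; the toy objective, the half-way switching rule, and both move rules are illustrative assumptions, not the authors' HBJSA.

```python
import random

# Toy objective: a stand-in for the whole-fuel-cost (WFC) function; the real
# CHP dispatch objective, constraints, and feasible regions are not modeled here.
def cost(x):
    return sum((xi - 3.0) ** 2 for xi in x)

def hybrid_search(dim=4, pop=20, iters=200, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(-10, 10) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=cost)
    for t in range(iters):
        explore = t < iters // 2           # assumed switch: explore first, exploit later
        for i, x in enumerate(X):
            if explore:                    # HBA-like global move (illustrative)
                cand = [xi + rng.uniform(-1, 1) for xi in x]
            else:                          # JSA-like move toward the current best (illustrative)
                cand = [xi + rng.random() * (b - xi) for xi, b in zip(x, best)]
            if cost(cand) < cost(x):       # greedy acceptance
                X[i] = cand
        best = min(X + [best], key=cost)
    return best, cost(best)

if __name__ == "__main__":
    sol, val = hybrid_search()
    print("best cost ~", round(val, 6))
```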

31 citations


Journal ArticleDOI
TL;DR: In this paper, a heap optimization algorithm (HOA) is used to solve the optimal power flow (OPF) problem in electricity networks, where the goal is to optimize the fuel cost of the conventional generators under the system limitations.
Abstract: This paper presents a novel endeavor to use the heap optimization algorithm (HOA) to solve the optimal power flow (OPF) problem in electricity networks. The key objective is to optimize the fuel cost of the conventional generators under the system limitations. Various scenarios are studied at a later stage, considering the addition of PV panels and/or a wind farm with changing load curves during a typical day. The active output power of the generators is selected as the OPF problem search space. The HOA is employed to find the best solution of the fitness function and provide the corresponding best solutions. The modeling of the heap-based optimizer (HBO) depends on three levels: the relation between the subordinates and the boss, the relation between same-level employees, and the employee's own contribution. The validity of the proposed algorithm is tested on a variety of electric grids: the IEEE 30-bus, IEEE 57-bus, and IEEE 118-bus networks. These networks are simulated under various scenarios. Real load curves are considered in this study to achieve a practical outcome. The simulation outcomes are evaluated and tested. The results indicate that the implemented HOA-based OPF methodology is flexible and applicable compared with that achieved by using the genetic algorithm.
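The three "levels" mentioned here come from the corporate-hierarchy metaphor behind the heap-based optimizer. The short Python sketch below illustrates that idea with a simplified position update (a 3-ary heap-style parent chosen by fitness rank, a random same-level colleague, and the employee's own position); it is a stand-in under those assumptions, not the exact published HBO equations.

```python
import random

def sphere(x):                      # placeholder objective; the paper minimizes fuel cost
    return sum(v * v for v in x)

def hbo_like_step(x, parent, colleague, rng, lam=1.0):
    """One illustrative position update mixing the three HBO 'levels':
    boss (parent), same-level colleague, and the employee's own position.
    Simplified stand-in, not the exact published HBO update rule."""
    new = []
    for xi, pi, ci in zip(x, parent, colleague):
        r1, r2 = rng.random(), rng.random()
        new.append(xi + lam * (r1 * (pi - xi) + r2 * (ci - xi)))
    return new

if __name__ == "__main__":
    rng = random.Random(0)
    dim, pop, iters = 5, 16, 300
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    X.sort(key=sphere)                      # index 0 plays the role of the 'boss'
    for _ in range(iters):
        for i in range(1, pop):
            parent = X[(i - 1) // 3]        # 3-ary heap-style parent (assumed arity)
            colleague = X[rng.randrange(1, pop)]
            cand = hbo_like_step(X[i], parent, colleague, rng)
            if sphere(cand) < sphere(X[i]):
                X[i] = cand
        X.sort(key=sphere)                  # re-rank the 'hierarchy' by fitness
    print("best value ~", round(sphere(X[0]), 8))
```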

24 citations


Journal ArticleDOI
TL;DR: In this article, an improved heap-based optimizer (IHBO) is proposed to improve the performance of a recently published technique called the heap-based optimizer (HBO), and two algorithms based on the original HBO and IHBO are developed for solving the ORPD problem.
Abstract: Optimal reactive power dispatch (ORPD) in a typical power system is a complicated multi-objective optimization problem. The proper modeling of the multi-objective optimization problem has a significant impact on system operation and control. In this paper, an improved heap-based optimizer (IHBO) is proposed to improve the performance of a recently published technique called the heap-based optimizer (HBO). In addition, two algorithms based on the original HBO and IHBO are developed for solving the ORPD problem. A Pareto-front approach is utilized in the proposed ORPD algorithm with the aim of solving two or three objective functions simultaneously. The performance of HBO is improved by utilizing chaotic sequences with the aim of improving its global search capability and avoiding getting stuck in a local optimum. Both the original HBO and the proposed IHBO are applied to determine the optimal settings of the generator voltages, shunt capacitor reactive power, and transformer tap settings. This study therefore aims to minimize three objective functions, the real power loss, total voltage deviation (TVD), and voltage stability index (VSI), while satisfying different operational constraints. The effectiveness of the IHBO is tested on three test systems: the IEEE 30-bus, IEEE 57-bus, and IEEE 118-bus systems. The results of the proposed IHBO are compared with recently published algorithms in the literature. The simulation results prove the superiority and robustness of IHBO in solving the ORPD problem.
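The abstract does not name the chaotic map used to generate these sequences; the logistic map below is shown only as a common example of how chaotic values in (0, 1) can replace uniform random numbers inside a metaheuristic.

```python
def logistic_chaos(n, x0=0.7, r=4.0):
    """Generate n values in (0, 1) from the logistic map x <- r*x*(1-x).
    The paper does not state which chaotic map IHBO uses; the logistic map
    is only a common example of a chaotic sequence that can replace uniform
    random draws to diversify the search."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        seq.append(x)
    return seq

if __name__ == "__main__":
    print(logistic_chaos(5))
```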

22 citations


Journal ArticleDOI
TL;DR: In this paper, an improved A-Star algorithm based on terrain data is proposed to overcome the limitation of poor processing times for long-distance off-road path planning, and an optimized search strategy was designed that does not check whether the destination is reached in the initial stage of searching for the global optimal path, thus improving execution efficiency.
Abstract: To overcome the limitation of poor processing times for long-distance off-road path planning, an improved A-Star algorithm based on terrain data is proposed in this study. The improved A-Star algorithm for long-distance off-road path planning tasks was developed to identify a feasible path between the start and destination based on a terrain data map generated using a digital elevation model. This study optimized the algorithm in two aspects: the data structure and the retrieval strategy. First, a hybrid data structure combining a minimum heap and a 2D array greatly reduces the time complexity of the algorithm. Second, an optimized search strategy was designed that does not check whether the destination is reached in the initial stage of searching for the global optimal path, thus improving execution efficiency. To evaluate the efficiency of the proposed algorithm, three different off-road path planning tasks were examined, covering short-, medium-, and long-distance path planning. Each group of tasks corresponded to three different off-road vehicles, and nine groups of experiments were conducted. The experimental results show that the processing efficiency of the proposed algorithm is significantly better than that of the conventional A-Star algorithm. Compared with the conventional A-Star algorithm, the path planning efficiency of the improved A-Star algorithm was accelerated by at least 4.6 times, and the maximum acceleration reached 550 times for long-distance off-road path planning. The simulation results show that the efficiency of long-distance off-road path planning was greatly improved by using the improved algorithm.
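For reference, the baseline the paper improves on is standard A* driven by a binary min-heap open list. The Python sketch below shows that baseline on a small terrain-cost grid; the grid, the Manhattan heuristic, and the cost model are illustrative assumptions, and none of the paper's improvements (the 2D-array hybrid or the delayed goal check) are reproduced.

```python
import heapq

def a_star(grid, start, goal):
    """Plain A* on a 2D cost grid with Python's binary min-heap (heapq)
    as the open set. Moving into a cell costs that cell's terrain value."""
    rows, cols = len(grid), len(grid[0])

    def h(p):                                    # Manhattan-distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_heap = [(h(start), 0.0, start)]         # entries: (f, g, node)
    best_g = {start: 0.0}
    came_from = {}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:                         # reconstruct path on arrival
            path = [node]
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return list(reversed(path)), g
        if g > best_g.get(node, float("inf")):
            continue                             # stale heap entry, skip it
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                ng = g + grid[nr][nc]
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = node
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None, float("inf")

if __name__ == "__main__":
    terrain = [[1, 1, 5, 1],
               [1, 9, 5, 1],
               [1, 1, 1, 1]]
    path, total = a_star(terrain, (0, 0), (2, 3))
    print(path, total)
```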

17 citations


Journal ArticleDOI
15 Dec 2021-Energy
TL;DR: EHBO is faster and more accurate than the classical HBO in achieving the global optimum as well as in balancing exploration and exploitation abilities, and is a promising optimization method to deal with uncertain parameters of SGUs with different technologies.

16 citations


Journal ArticleDOI
30 Aug 2021-Energies
TL;DR: The results obtained show that the proposed QOHBO-PF technique requires less computation time, further enhances reliability in the presence of PVG, and can provide multiple PF solutions that can be utilized for voltage stability analysis.
Abstract: Load flow analysis is an essential tool for the reliable planning and operation of interconnected power systems. The constant increase in power demand, together with the increased intermittency in power generation due to renewable energy sources without a proportionate augmentation of transmission system infrastructure, has driven power systems to function nearer to their limits. Though a power flow (PF) solution may exist in such circumstances, the traditional Newton–Raphson based PF techniques may fail due to computational difficulties owing to the singularity of the Jacobian matrix under critical conditions, and they face difficulties in solving ill-conditioned systems. To address these problems and to assess the impact of large-scale photovoltaic generator (PVG) integration in power systems on power flow studies, a derivative-free quasi-oppositional heap-based optimization (QOHBO) technique is proposed in the present paper. In the proposed approach, the concept of quasi-oppositional learning is applied to the heap-based optimizer (HBO) to enhance the convergence speed. The efficacy and effectiveness of the proposed QOHBO-PF technique are verified by applying it to standard IEEE and ill-conditioned systems. The robustness of the algorithm is validated under maximum loadability limits and high R/X ratios, comparing the results with other well-known methods suggested in the literature. The results thus obtained show that the proposed QOHBO-PF technique requires less computation time, further enhances reliability in the presence of PVG, and can provide multiple PF solutions that can be utilized for voltage stability analysis.
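Quasi-oppositional learning is the generic ingredient here: alongside each candidate solution, a quasi-opposite point is evaluated and the better of the two is kept. The Python sketch below shows the usual construction of a quasi-opposite point (drawn uniformly between the interval centre and the mirror point a + b − x); it reflects the general technique under that assumption rather than QOHBO's exact implementation.

```python
import random

def quasi_opposite(x, a, b, rng=random):
    """Quasi-oppositional point for a decision vector x with per-dimension
    bounds [a, b]: sample uniformly between the interval centre (a+b)/2 and
    the opposite point a + b - x. A typical scheme evaluates both x and its
    quasi-opposite and keeps the better one to speed up convergence."""
    qo = []
    for xi, ai, bi in zip(x, a, b):
        centre = (ai + bi) / 2.0
        opposite = ai + bi - xi
        lo, hi = (centre, opposite) if centre <= opposite else (opposite, centre)
        qo.append(rng.uniform(lo, hi))
    return qo

if __name__ == "__main__":
    rng = random.Random(3)
    x = [0.9, -2.0, 4.5]
    print(quasi_opposite(x, a=[-5, -5, -5], b=[5, 5, 5], rng=rng))
```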

15 citations


Journal ArticleDOI
TL;DR: Heap bioleaching is a microbial technology that catalyzes the decomposition of ore without grinding, as discussed by the authors; the crushed ore is stacked on the liner, and the microbial solution flows through the heap from...
Abstract: Heap bioleaching is a microbial technology that catalyzes the decomposition of ore without grinding. The crushed ore is stacked on the liner, and the microbial solution flows through the heap from ...

14 citations


Journal ArticleDOI
TL;DR: In this article, the authors derived an expression for the angle of repose, i.e., the angle between the sloping side of a heap of particles and the horizontal, which is one of the most important observables characterizing the packing and flowability of a granular material.
Abstract: The angle of repose—i.e., the angle θr between the sloping side of a heap of particles and the horizontal—provides one of the most important observables characterizing the packing and flowability of a granular material. However, this angle is determined by still poorly understood particle-scale processes, as the interactions between particles in the heap cause resistance to rolling and sliding under the action of gravity. A theoretical expression that predicts θr as a function of particle size and gravity would have impact in the engineering, environmental, and planetary sciences. Here we present such an expression, which we have derived from particle-based numerical simulations that account for both sliding and rolling resistance, as well as for nonbonded attractive particle–particle interactions (van der Waals). Our expression is simple and reproduces the angle of repose of experimental conical heaps as a function of particle size, as well as θr obtained from our simulations with gravity from 0.06 to 100 times that of Earth. Furthermore, we find that heaps undergo a transition from conical to irregular shape when the cohesive to gravitational force ratio exceeds a critical value, thus providing a proxy for particle-scale interactions from heap morphology.

13 citations


Book ChapterDOI
18 Jul 2021
TL;DR: Go, as discussed by the authors, is an imperative language targeting concurrent and distributed systems that offers structural subtyping and lightweight concurrency through goroutines with message-passing communication; this combination of features poses interesting challenges for static verification, most prominently the combination of a mutable heap and advanced concurrency primitives.
Abstract: Go is an increasingly-popular systems programming language targeting, especially, concurrent and distributed systems. Go differentiates itself from other imperative languages by offering structural subtyping and lightweight concurrency through goroutines with message-passing communication. This combination of features poses interesting challenges for static verification, most prominently the combination of a mutable heap and advanced concurrency primitives.

12 citations


Journal ArticleDOI
TL;DR: The study results show that the proposed optimization algorithm of IHBO is better than the artificial electric field algorithm, the grey wolf optimizer, Harris hawks optimization, and the original HBO algorithm.
Abstract: The hybrid microgrid system is considered one of the best solutions for many problems, such as the electricity problem in regions without electricity, and for minimizing pollution and the depletion of fossil sources. This study aims to propose and implement a new algorithm called the improved heap-based optimizer (IHBO). The objective is to minimize the microgrid system cost by reducing the net present cost while respecting the reliability, power availability, and renewable fraction factors of the microgrid system. The results show that the PV/diesel/battery hybrid renewable energy system (HRES) gives the best solution, with a net present cost of MAD 120463, equivalent to an energy cost of MAD 0.1384/kWh. The reliability is about 3.89%, the renewable fraction is about 95%, and the power availability is near 99%. The optimal sizing comprises 167.3864 m2 of PV area, equivalent to 44.2582 kW, and 3.8860 kW of diesel capacity. The study results show that the proposed IHBO optimization algorithm is better than the artificial electric field algorithm, the grey wolf optimizer, Harris hawks optimization, and the original HBO algorithm. A comparison of the net present cost under different fuel prices is carried out, in which it is observed that the net present cost is reduced even though the quantity used is modest.

Journal ArticleDOI
TL;DR: In this article, the column bioleaching of copper flotation tailings was comparatively investigated using the layered heap construction method (LM), the agglomerate heap construction method (AM), and the pellets-sintering heap construction method (PM).

DOI
01 Jan 2021
TL;DR: Only publication metadata is available for this entry: received 28.05.2021, accepted 22.06.2021, and published online 22.06.2021.
Abstract: Received 28.05.2021; Accepted 22.06.2021; Published online 22.06.2021; Corresponding author: Juri Olt, jyri.olt@emu.ee

Journal ArticleDOI
TL;DR: In this article, 3D photogrammetry with an area-weighted average slope calculation on a triangular mesh is proposed to measure the angle of repose (AoR) of granular materials.

Book ChapterDOI
01 Jan 2021
TL;DR: In 2015, Imogen Heap, Grammy Award winner and British female singer and songwriter, sold her MP3-format music works directly to users, using an e-wallet to receive the digital currency Ethereum via blockchain technology, as mentioned in this paper.
Abstract: In 2015, Imogen Heap, Grammy Award winner and British female singer and songwriter, sold her MP3-format music works directly to users, using an e-wallet to receive the digital currency Ethereum via blockchain technology.

Journal ArticleDOI
TL;DR: In this paper, a security method to protect IoT system operation from memory heap penetration and address modification attacks is proposed; it prevents directed attacks by encrypting the objects of garbage collection at run time.
Abstract: The extensive networking of devices and the large amount of data generated by the Internet of Things (IoT) have brought security issues to the attention of researchers. Java is the most common platform for embedded applications such as IoT, Wireless Sensor Networks (WSN), Near Field Communication (NFC), and Radio Frequency Identification (RFID). Object-oriented programming languages such as Java, Swift, PHP, and C++ use garbage collection after any object run, which creates a security loophole for attacks such as Next Memory Address Occupation (NMAO), memory replay, and Learning Tasks Behaviors (LTB). The security risk increases in IoT when an attack extends beyond the target device to the surrounding connected devices. Inappropriate or wrong operations cause energy loss and increased costs. In this paper, a security method to protect IoT system operation from memory heap penetration and address modification attacks is proposed. The proposed method prevents directed attacks by encrypting the objects of garbage collection at run time. To form a unique signature mechanism, a Cryptographic Hash Function (CHF), which employs a specific one-way hash algorithm, is used. The proposed framework uses L-function based ECC and a one-time key (OTK) to secure the memory heap. Our method is used with an open system, where the effect on the operating system is not considered. The proposed method proved to be powerful and efficient, and it can help in achieving higher levels of security across several IoT applications by enabling better detection of malicious attacks.
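The abstract names the ingredients (a one-way hash, L-function-based ECC, and a one-time key) but not the exact scheme. As a rough illustration of the hash-plus-one-time-key idea only, the Python sketch below tags a serialized in-memory object with an HMAC-SHA256 under a fresh key and detects later modification; the ECC component and the run-time garbage-collection encryption are not reproduced, and all names in the sketch are hypothetical.

```python
import hashlib, hmac, os, pickle

def sign_object(obj, otk):
    """Illustrative integrity tag: HMAC-SHA256 of the serialized object under a
    one-time key (OTK). The paper's full scheme also involves L-function-based
    ECC, which is not reproduced here."""
    return hmac.new(otk, pickle.dumps(obj), hashlib.sha256).digest()

def verify_object(obj, otk, tag):
    # Constant-time comparison of the recomputed tag with the stored one.
    return hmac.compare_digest(sign_object(obj, otk), tag)

if __name__ == "__main__":
    otk = os.urandom(32)                 # fresh key per object lifetime
    record = {"sensor": "rfid-17", "value": 42}
    tag = sign_object(record, otk)
    record["value"] = 1337               # simulate an in-memory modification attack
    print("intact?", verify_object(record, otk, tag))   # -> False
```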

Proceedings ArticleDOI
22 Jun 2021
TL;DR: MaPHeA as mentioned in this paper is a profile-guided heap allocation framework for both HPC and embedded systems that improves application performance by guiding and applying the optimized allocation of dynamically allocated heap objects with very low profiling overhead.
Abstract: Hardware performance monitoring units (PMUs) are a standard feature in modern microprocessors for high-performance computing (HPC) and embedded systems, by providing a rich set of microarchitectural event samplers. Recently, many profile-guided optimization (PGO) frameworks have exploited them to feature much lower profiling overhead than conventional instrumentation-based frameworks. However, existing PGO frameworks mostly focus on optimizing the layout of binaries and do not utilize rich information provided by the PMU about data access behaviors over the memory hierarchy. Thus, we propose MaPHeA, a lightweight Memory hierarchy-aware Profile-guided Heap Allocation framework applicable to both HPC and embedded systems. MaPHeA improves application performance by guiding and applying the optimized allocation of dynamically allocated heap objects with very low profiling overhead and without additional user intervention. To demonstrate the effectiveness of MaPHeA, we apply it to optimizing heap object allocation in an emerging DRAM-NVM heterogeneous memory system (HMS), and to selective huge-page utilization. In an HMS, by identifying and placing frequently accessed heap objects to the fast DRAM region, MaPHeA improves the performance of memory-intensive graph-processing and Redis workloads by 56.0% on average over the default configuration that uses DRAM as a hardware-managed cache of slow NVM. Also, by identifying large heap objects that cause frequent TLB misses and allocating them to huge pages, MaPHeA increases the performance of read and update operations of Redis by 10.6% over the transparent huge-page implementation of Linux.
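MaPHeA itself instruments C/C++ heaps using PMU samples, and its real API is not shown in the abstract. The toy Python sketch below only illustrates the general profile-guided idea, routing objects from "hot" allocation sites to a fast pool and the rest to a slow pool; the profile source, the threshold, and every name in it are hypothetical.

```python
# Conceptual illustration only: route allocations to a 'fast' or 'slow' pool
# based on a profiled access count per allocation site. This is not MaPHeA's
# API; all names and the threshold below are hypothetical.
from collections import defaultdict

PROFILE = defaultdict(int)        # allocation site -> sampled access count
FAST_POOL, SLOW_POOL = [], []
HOT_THRESHOLD = 1000              # assumed cut-off separating hot and cold sites

def record_access(site, samples=1):
    PROFILE[site] += samples      # stands in for PMU-based sampling from a prior run

def allocate(site, payload):
    # Hot sites go to the fast pool (e.g. DRAM region / huge pages in the paper).
    pool = FAST_POOL if PROFILE[site] >= HOT_THRESHOLD else SLOW_POOL
    pool.append(payload)
    return payload

if __name__ == "__main__":
    record_access("graph_edges", samples=5000)   # hot site from a previous run
    record_access("log_buffer", samples=10)      # cold site
    allocate("graph_edges", bytearray(1 << 20))
    allocate("log_buffer", bytearray(1 << 20))
    print(len(FAST_POOL), "fast object(s),", len(SLOW_POOL), "slow object(s)")
```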

Journal ArticleDOI
TL;DR: In this article, the authors argue that fragmentalism does not offer a new, satisfactory realistic account of the quantum state, at least along the lines proposed by Simon, and raise the question about whether there are some other viable forms of quantum fragmentalisms.
Abstract: Fragmentalism was originally introduced as a new A-theory of time. It was further refined and discussed, and different developments of the original insight have been proposed. In a celebrated paper, Jonathan Simon contends that fragmentalism delivers a new realist account of the quantum state—which he calls conservative realism—according to which: (i) the quantum state is a complete description of a physical system, (ii) the quantum (superposition) state is grounded in its terms, and (iii) the superposition terms are themselves grounded in local goings-on about the system in question. We will argue that fragmentalism, at least along the lines proposed by Simon, does not offer a new, satisfactory realistic account of the quantum state. This raises the question about whether there are some other viable forms of quantum fragmentalism.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a mean-based sorting algorithm that sorts integer/non-integer data by creating independent quasi-sorted subarrays of approximately equal length.
Abstract: Computer and communication systems and networks deal with many cases that require rearrangement of data in either descending or ascending order. This operation is called sorting, and the purpose of an efficient sorting algorithm is to reduce the computational complexity and the time taken to perform the comparison, swapping, and assignment operations. In this article, we propose an efficient mean-based sorting algorithm that sorts integer/non-integer data by creating independent quasi-sorted subarrays of approximately equal length. It gradually finds sorted data and checks whether the elements are partially sorted or have similar values. The elapsed time, the number of divisions and swaps, and the difference between the locations of the sorted and unsorted data in different samples demonstrate the superiority of the proposed algorithm over the Merge, Quick, Heap, and conventional mean-based sorts for both integer and non-integer large data sets that are random or partially/entirely sorted. Numerical analyses indicate that the mean-based pivot is appropriate for making subarrays with approximately similar lengths. The complexity study also shows that the proposed mean-based sorting algorithm has the same memory complexity as Quick-sort and a better best-case time complexity than the Merge, Heap, and Quick sorts. Its worst-case time complexity is similar to that of the Merge and Heap sorts and much better than that of Quick-sort, while all of these algorithms have identical average-case complexity. In addition to finding incrementally (or decrementally) sorted data part by part before reaching the end, it can be implemented with parallel processing of the sections running at the same time, faster than the other conventional algorithms, because the independent subarrays have similar lengths.
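The core device is the mean-valued pivot. A minimal Python sketch of that idea is given below: partition around the arithmetic mean and recurse, which tends to yield similarly sized subarrays. It omits the paper's additional machinery (early detection of sorted runs, handling of quasi-sorted pieces, parallel execution), so it illustrates only the pivot choice, not the published algorithm.

```python
def mean_sort(a):
    """Simplified mean-pivot sort: partition around the arithmetic mean and
    recurse. Works for integer and non-integer data alike."""
    if len(a) <= 1:
        return a
    m = sum(a) / len(a)
    lo = [x for x in a if x < m]
    hi = [x for x in a if x > m]
    eq = [x for x in a if x == m]        # elements equal to the mean need no recursion
    if not lo and not hi:                # all elements identical: already sorted
        return a[:]
    return mean_sort(lo) + eq + mean_sort(hi)

if __name__ == "__main__":
    data = [7.5, 2, 9, 2, -4, 11, 3.25, 0]
    print(mean_sort(data))
```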

Journal ArticleDOI
TL;DR: Heaps are algebraic structures endowed with para-associative ternary operations, bijectively exemplified by groups via the operation (x, y, z) ↦ xy⁻¹z, as discussed in this paper.
Abstract: Heaps are algebraic structures endowed with para-associative ternary operations, bijectively exemplified by groups via the operation (x, y, z) ↦ xy⁻¹z. They are also ternary self-distribut...
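For a concrete instance, any group gives a heap via this operation; in the additive group of integers it reads (x, y, z) ↦ x − y + z. The short Python check below verifies the para-associativity law on that example; the particular form of the law in the comments follows the standard heap axioms, which the truncated abstract does not spell out.

```python
def ternary(x, y, z):
    """Heap operation induced by the group of integers under addition:
    (x, y, z) -> x * y^{-1} * z becomes x - y + z in additive notation."""
    return x - y + z

def para_associative(a, b, c, d, e):
    """Check the para-associativity law
        [[a,b,c],d,e] = [a,[d,c,b],e] = [a,b,[c,d,e]]
    for the additive example above (standard heap axioms assumed)."""
    left = ternary(ternary(a, b, c), d, e)
    mid = ternary(a, ternary(d, c, b), e)
    right = ternary(a, b, ternary(c, d, e))
    return left == mid == right

if __name__ == "__main__":
    import itertools
    # Exhaustive check over a small range of integers; prints True.
    print(all(para_associative(*t) for t in itertools.product(range(-3, 4), repeat=5)))
```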

Book ChapterDOI
18 Jul 2021
TL;DR: In this article, the authors develop machine-checked verifications of the full functional correctness of C implementations of the eponymous graph algorithms of Dijkstra, Kruskal, and Prim, extending Wang et al.'s CertiGraph platform.
Abstract: We develop machine-checked verifications of the full functional correctness of C implementations of the eponymous graph algorithms of Dijkstra, Kruskal, and Prim. We extend Wang et al.’s CertiGraph platform to reason about labels on edges, undirected graphs, and common spatial representations of edge-labeled graphs such as adjacency matrices and edge lists. We certify binary heaps, including Floyd’s bottom-up heap construction, heapsort, and increase/decrease priority.
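The binary-heap part of that verification effort covers Floyd's bottom-up construction and heapsort. The Python sketch below shows those two routines in their textbook form, purely as a reminder of what is being certified; the paper's verified artifacts are C implementations with machine-checked proofs, not this code, and the increase/decrease-priority operations are omitted here.

```python
def sift_down(a, i, n):
    """Restore the max-heap property for the subtree rooted at index i,
    assuming both children already root valid heaps (heap occupies a[:n])."""
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def build_heap(a):
    """Floyd's bottom-up construction: sift down every internal node,
    starting from the last one. Runs in O(n) overall."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)

def heapsort(a):
    build_heap(a)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]      # move current maximum into its final place
        sift_down(a, 0, end)

if __name__ == "__main__":
    xs = [9, 4, 7, 1, 0, 8, 5, 2]
    heapsort(xs)
    print(xs)   # [0, 1, 2, 4, 5, 7, 8, 9]
```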

Journal ArticleDOI
01 Jun 2021
TL;DR: This robotic excavation system achieves the world’s first autonomous completion of free-form embankments with high accuracy in a robotic landscape system that tightly connects survey, design and fabrication to exchange information in real-time during fabrication.
Abstract: Automating earth-moving tasks has the potential to resolve labour shortage, allow for unseen designs and foster sustainability through using on-site materials. In this interdisciplinary project involving robotics and landscape architecture, we combine our previous work on autonomous excavation of free-form shapes, dynamic landscape design and terrain modelling tools into a robotic landscape system. It tightly connects survey, design and fabrication to exchange information in real-time during fabrication. We purposely built a LiDAR survey drone for tight integration. The design environment contains terrain modelling tools to balance cut and fill volumes for material-neutral, on-site construction. Its parametric nature allows it to adapt the geometry to changing site conditions during fabrication. Our autonomous walking excavator is used to create these free-form shapes in natural granular material. We propose an excavation planner for free-form embankments that computes the next excavation location and, subsequently, the location where the excavated soil should be dumped. This robotic excavation system achieves the world's first autonomous completion of free-form embankments with high accuracy. A 20 m long S-shaped embankment and a two-faced embankment with a corner were created with roughly 0.03–0.05 m average error.

Journal ArticleDOI
01 Feb 2021
Abstract: The article deals with the problem of separating a combed heap of winter wheat with an experimental working unit consisting of a segregator and a sieve. In order to expand the range of information on the qualitative side of the functioning of the working unit, it is suggested to introduce an additional assessment parameter, the impurity separation efficiency coefficient. Experimental studies of the technological process of the working unit were carried out using the mathematical theory of experimental design, where the response function was represented by the functional dependence of the change in the impurity separation efficiency coefficient on the specific feed of the combed heap, the oscillation frequency of the working unit, and the diameter of the sieve openings. The Box-Behnken design was selected for the experimental studies. Verification of the significance of the obtained coefficients according to Student's t-test showed that all the coefficients were significant. The adequacy of the model was assessed according to Fisher's test. As a result of the calculations, it was established that the model was adequate and could be used for further research.

Proceedings ArticleDOI
29 Jul 2021
TL;DR: In this article, the authors built a BIM big data storage framework based on the IFC standard, analyzed the inefficiency of the Huffman compression algorithm when constructing a binary tree, and proposed a construction process based on heap sorting.
Abstract: With the wide application of BIM, the massive BIM data generated have brought severe challenges to the storage, processing, and transmission of models over limited-bandwidth networks. In order to achieve smooth browsing and querying of BIM data on a shared platform, this paper builds a BIM big data storage framework based on the IFC standard, analyzes the inefficiency of the Huffman compression algorithm when constructing a binary tree, and proposes a construction process based on heap sorting. The proposed new algorithm was tested through experiments, and the results show that the efficiency of the Huffman compression algorithm is improved when the amount of data is relatively large. In order to realize fast queries over the compressed data, an ElasticSearch+Hadoop server cluster is proposed.
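For context, the baseline being improved is the classic Huffman construction driven by a priority queue. The Python sketch below shows that standard min-heap version (repeatedly merging the two lowest-frequency nodes); it is a generic illustration only and does not reproduce the paper's heap-sort-based construction or its IFC/BIM pipeline.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Standard Huffman construction driven by a binary min-heap (heapq):
    repeatedly merge the two lowest-frequency nodes into a subtree."""
    freq = Counter(text)
    if not freq:
        return {}
    # Heap entries: (frequency, integer tie-breaker, symbol-or-subtree).
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate single-symbol input
        return {heap[0][2]: "0"}
    tick = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tick, (left, right)))
        tick += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: (left, right) pair
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: an input symbol
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

if __name__ == "__main__":
    print(huffman_codes("BIM data compression with heaps"))
```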

Journal ArticleDOI
TL;DR: The helminth egg analysis platform (HEAP), as described in this paper, was developed as a user-friendly microscopic helminth egg identification and quantification platform to assist medical technicians during parasite infection examination.
Abstract: Background Millions of people throughout the world suffer from parasite infections. Traditionally, technicians perform parasite examination by manual eye inspection of microscopic specimens. However, manual operation has limitations that hinder the ability to obtain precise egg counts and cause inefficient identification of infecting parasites in co-infections. The technician requirements for handling a large number of microscopic examinations in countries with limited medical resources are substantial. We developed the helminth egg analysis platform (HEAP) as a user-friendly microscopic helminth egg identification and quantification platform to assist medical technicians during parasite infection examination. Methods Multiple deep learning strategies, including SSD (Single Shot MultiBox Detector), U-net, and Faster R-CNN (Faster Region-based Convolutional Neural Network), are integrated to identify eggs in the same specimen, allowing users to choose the best predictions. An image binning and egg-in-edge algorithm based on pixel density detection was developed to increase performance. Computers with different operating systems can be gathered to lower the computation time using our easy-to-deploy software architecture. Results A user-friendly interface is provided to substantially increase the efficiency of manual validation. To adapt to low-cost computers, we architected a distributed computing structure with high flexibility. Conclusions HEAP serves not only as a prediction service provider but also as a parasitic egg database of microscopic helminth egg images, labeling data, and pretrained models. All images and labeling resources are free and accessible at http://heap.cgu.edu.tw . HEAP can also be an ideal education and training resource for helminth egg examination.

Journal ArticleDOI
24 Jan 2021-Minerals
TL;DR: In this paper, the authors describe a method that is based on the implementation and setup of a mechatronic system that can recognize and detect, through thermal analysis, the zones where heap leaching piles may become locally saturated.
Abstract: This manuscript describes a method that is based on the implementation and setup of a mechatronic system that can recognize and detect, through thermal analysis, the zones where heap leaching piles may become locally saturated. Such a condition could trigger the potential of liquefaction, generating local or general collapse in the pile. In order to reduce this potential danger, and therefore achieve full stability in the pile, the irrigation system must be properly controlled; for instance, in potentially saturated zones, the irrigation flow can be reduced or eliminated until the saturation has disappeared. The mechatronic system consists of a hexacopter, equipped with a thermal infrared camera mounted on its structure and pointing down to the ground, which is used to obtain the temperature information of the heat transfer between the heap pile and the environment. Such information is very useful, as the level of saturated zones can then be traced. The communication between the operator of the irrigation system and the mechatronic system is based on a radio-frequency link, in which geo-referenced images are transmitted.

Journal ArticleDOI
13 Apr 2021
TL;DR: In this paper, it was shown that the Baer–Kaplansky Theorem can be extended to all abelian groups provided that the rings of endomorphisms of groups are replaced by trusses of endomorphisms of the corresponding heaps.
Abstract: It is shown that the Baer–Kaplansky Theorem can be extended to all abelian groups provided that the rings of endomorphisms of groups are replaced by trusses of endomorphisms of corresponding heaps....

DOI
09 Apr 2021
TL;DR: The purpose of this research is to develop and analyze an application presenting the cultural potential of Gunungkidul, based on progressive web apps, which can be used without installing an application and can run on all operating systems.
Abstract: Gunungkidul has various cultural potentials that make it a tourist destination. To make it easier for tourists to visit and obtain tourist destination information, several researchers have developed mobile application-based information systems. However, mobile applications have several drawbacks, such as that the user must install the application and that it can only be used on specific operating systems. The purpose of this research is to develop and analyze an application presenting the cultural potential of Gunungkidul, based on progressive web apps, which can be used without installing an application and can run on all operating systems. The application development method uses Scrum, with the Ionic Framework as the framework for the application. The performance analysis method was tested on runtime performance (loading, scripting, rendering, painting, system) and memory usage (min JS Heap and max JS Heap) using Chrome Developer Tools over 30 tests. The results of the development show that there are 7 features in the application, namely (1) Peta; (2) Geoheritage; (3) Daerah; (4) Cagar Budaya; (5) Kuliner; (6) Seni Adat Tradisi; (7) Agenda. The runtime performance and memory usage test results show the following average values: (1) Loading: 33.60 ms; (2) Scripting: 1069.20 ms; (3) Rendering: 84.90 ms; (4) Painting: 22.33; (5) System: 429.67 ms; (6) Min JS Heap: 8.05 MB; and (7) Max JS Heap: 19.51 MB.

Book ChapterDOI
06 Sep 2021
TL;DR: ComplexityParser as mentioned in this paper is a static complexity analyzer for Java programs providing the first implementation of a tier-based typing discipline, and it uses antlr to generate a parse tree on which it performs an efficient type inference: linear in the input size, provided that the method arity is bounded by some constant.
Abstract: ComplexityParser is a static complexity analyzer for Java programs providing the first implementation of a tier-based typing discipline. The input is a file containing Java classes. If the main method can be typed then, provided the program terminates, it is guaranteed to do so in polynomial time and hence also to have polynomially bounded heap and stack sizes. The application uses antlr to generate a parse tree on which it performs an efficient type inference: linear in the input size, provided that the method arity is bounded by some constant.

Journal ArticleDOI
TL;DR: In this article, the authors identify a temperature that could facilitate the composting of green waste despite its high lignocellulose content, whose recalcitrance is a technical challenge obstructing the process.
Abstract: Owing to its high lignocellulose content, the recalcitrance of green waste is a technical challenge obstructing the composting process. This study aimed to identify a temperature that could facilit...