Institution
Center for Discrete Mathematics and Theoretical Computer Science
Facility • Piscataway, New Jersey, United States
About: The Center for Discrete Mathematics and Theoretical Computer Science is a research facility based in Piscataway, New Jersey, United States. It is known for research contributions in the topics of local search (optimization) and optimization problems. The organization has 140 authors who have published 175 publications receiving 2,345 citations.
Topics: Local search (optimization), Optimization problem, Very-large-scale integration, Auxiliary function, Nonlinear programming
Papers published on a yearly basis
Papers
01 Nov 2018
TL;DR: In this paper, a modified simulated annealing (MSA) algorithm is presented with a new cost function for fixed-outline floorplanning, and a two-step strategy is used to improve the efficiency of the algorithm and balance the main function and secondary function.
Abstract: Floorplanning is an indispensable step in the very large scale integration (VLSI) design flow. Fixed-outline floorplanning is a challenging problem because it imposes the additional constraint that the floorplan must meet both width and height criteria. In this paper, a modified simulated annealing (MSA) algorithm with a new cost function is presented for this problem. In our algorithm, a two-step strategy is used to improve the efficiency of the algorithm and balance the main function and secondary function. In addition, a local search strategy is employed to ensure that a feasible solution can be found. Experimental results indicate that the success rate on each benchmark is 100% for different aspect ratios.
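The core loop of simulated annealing is easy to sketch. The following is a minimal generic sketch, not the paper's MSA: the cost function, the fixed-outline penalty weight, and the toy 1-D search space are illustrative stand-ins for a real floorplan representation.

```python
import math
import random

def simulated_annealing(initial, cost, neighbor,
                        t0=100.0, t_min=1e-3, alpha=0.95, iters=50):
    """Generic simulated annealing: accept worse moves with
    probability exp(-delta/T), cooling geometrically by alpha."""
    current, best = initial, initial
    t = t0
    while t > t_min:
        for _ in range(iters):
            cand = neighbor(current)
            delta = cost(cand) - cost(current)
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current = cand
                if cost(current) < cost(best):
                    best = current
        t *= alpha
    return best

# Toy 1-D stand-in for a floorplanning cost: a smooth "wirelength"
# term plus a penalty for leaving a fixed outline (here, |x| > 2).
def cost(x):
    outline_penalty = max(0.0, abs(x) - 2.0) ** 2
    return (x - 1.0) ** 2 + 10.0 * outline_penalty

random.seed(0)
result = simulated_annealing(5.0, cost,
                             lambda x: x + random.uniform(-0.5, 0.5))
```

The penalty weight (10.0 here) plays the role of balancing the main objective against the fixed-outline constraint; the paper's two-step strategy addresses that balance more systematically.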
4 citations
01 Jan 2016
TL;DR: In this paper, it was shown that unless k-Clique can be solved in n^{o(k)} time, k-EventSet has no polynomial time algorithm when k = omega(log^2(n)), and that, assuming the Exponential Time Hypothesis, there is no poly(n) time algorithm for Gap-k-VectorSum when k = omega((log log n)^{c_0}).
Abstract: This work investigates the hardness of computing sparse solutions to systems of linear equations over F_2. Consider the k-EventSet problem: given a homogeneous system of linear equations over F_2 on n variables, decide if there exists a nonzero solution of Hamming weight at most k (i.e., a k-sparse solution). While there is a simple O(n^{k/2})-time algorithm for it, establishing fixed parameter intractability for k-EventSet has been a notorious open problem. Towards this goal, we show that unless k-Clique can be solved in n^{o(k)} time, k-EventSet has no polynomial time algorithm when k = omega(log^2(n)).
Our work also shows that the non-homogeneous generalization of the problem - which we call k-VectorSum - is W[1]-hard on instances where the number of equations is O(k*log(n)), improving on previous reductions which produced Omega(n) equations. We use the hardness of k-VectorSum as a starting point to prove the result for k-EventSet, and additionally strengthen the former to show the hardness of approximately learning k-juntas. In particular, we prove that given a system of O(exp(O(k))*log(n)) linear equations, it is W[1]-hard to decide if there is a k-sparse linear form satisfying all the equations or any function on at most k variables (a k-junta) satisfies at most a (1/2 + epsilon)-fraction of the equations, for any constant epsilon > 0. In the setting of computational learning, this shows hardness of approximate non-proper learning of k-parities.
In a similar vein, we use the hardness of k-EventSet to show that for any constant d, unless k-Clique can be solved in n^{o(k)} time, there is no poly(m,n)*2^{o(sqrt{k})} time algorithm to decide whether a given set of m points in F_2^n satisfies: (i) there exists a non-trivial k-sparse homogeneous linear form evaluating to 0 on all the points, or (ii) any non-trivial degree-d polynomial P supported on at most k variables evaluates to zero on approximately a Pr_{F_2^n}[P(z) = 0] fraction of the points, i.e., P is fooled by the set of points.
Lastly, we study approximation in the sparsity of the solution. Let the Gap-k-VectorSum problem be: given an instance of k-VectorSum of size n, decide if there exists a k-sparse solution, or every solution has sparsity at least k' = (1+delta_0)k. Assuming the Exponential Time Hypothesis, we show that for some constants c_0, delta_0 > 0 there is no poly(n) time algorithm for Gap-k-VectorSum when k = omega((log log n)^{c_0}).
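To make the k-EventSet problem concrete, here is a brute-force sketch (the function name and representation are ours, not the paper's): it enumerates candidate supports of weight at most k and checks every equation over F_2, giving the naive O(n^k) baseline that the O(n^{k/2}) meet-in-the-middle algorithm mentioned in the abstract improves on.

```python
from itertools import combinations

def has_k_sparse_solution(A, k):
    """Decide whether the homogeneous system A x = 0 over F_2 has a
    nonzero solution of Hamming weight at most k.
    A is a list of rows, each a list of 0/1 coefficients.
    Naive enumeration over supports: O(n^k) equation checks."""
    n = len(A[0])
    for w in range(1, k + 1):
        for support in combinations(range(n), w):
            # x has 1s exactly on `support`; verify each equation mod 2.
            if all(sum(row[j] for j in support) % 2 == 0 for row in A):
                return True
    return False

# x1 + x2 = 0 and x2 + x3 = 0 over F_2: the sparsest nonzero
# solution is x = (1,1,1), of weight 3.
A = [[1, 1, 0], [0, 1, 1]]
print(has_k_sparse_solution(A, 2))  # no solution of weight <= 2
print(has_k_sparse_solution(A, 3))  # (1,1,1) satisfies both equations
```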
4 citations
TL;DR: This paper studies redundant via insertion and guiding template assignment for DSA with MP at the postrouting stage and proposes a graph-based solution framework, which saves 58%, 82%, and 96% runtime compared with the state-of-the-art work for SP, DP, and TP, respectively.
Abstract: Inserting redundant vias is necessary for improving via yield in circuit designs. Block copolymer directed self-assembly (DSA) is an emerging and promising lithography technology for the manufacture of vias and redundant vias, in which guiding templates are used to enhance the resolution. Considering manufacturability of the via layer, multiple patterning (MP) lithography is also needed in advanced designs. In this paper, we study the redundant via insertion and guiding template assignment for DSA with MP problem at the postrouting stage. We propose a graph-methodology-based solution framework. First, by analyzing the structure of guiding templates, we propose a new solution expression by introducing the concept of multiplet to discard redundant solutions; then, honoring the compact solution expression, we construct a conflict graph on the grid model. Second, we formulate the problem with single patterning (SP) as a constrained maximum-weight independent set problem, for which a fast linear interpolation algorithm is introduced to obtain a locally optimal solution. To avoid undesirable local optima, we propose an effective initial solution generation method. Our framework is general and is further extended to solve the problem with double patterning (DP) or triple patterning (TP) in a two-stage manner. Experimental results validate the efficiency and effectiveness of our method. Specifically, compared with the state-of-the-art work for the problem with SP, DP, and TP, our method saves 58%, 82%, and 96% runtime, respectively.
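The SP formulation above is a constrained maximum-weight independent set on a conflict graph. As a rough illustration only (not the paper's algorithm), a simple greedy heuristic over such a graph looks like this; the via candidates, weights, and conflict edges are hypothetical:

```python
def greedy_mwis(weights, conflicts):
    """Greedy heuristic for maximum-weight independent set: scan
    vertices by decreasing weight and keep a vertex only if it
    conflicts with nothing already chosen.
    `conflicts` maps each vertex to the set of vertices it
    conflicts with (e.g., overlapping guiding templates)."""
    chosen = set()
    for v in sorted(weights, key=weights.get, reverse=True):
        if conflicts.get(v, set()).isdisjoint(chosen):
            chosen.add(v)
    return chosen

# Hypothetical via candidates a-d; edges mark template conflicts.
weights = {"a": 3, "b": 2, "c": 2, "d": 1}
conflicts = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}, "d": set()}
print(sorted(greedy_mwis(weights, conflicts)))  # ['a', 'c', 'd']
```

A greedy pass like this can get stuck in poor local optima, which is why the paper pairs its local-search step with a dedicated initial-solution generation method.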
4 citations
01 May 2018
TL;DR: It is suggested that researchers should follow a protocol of functional abstraction, considering which features of the natural algorithm provide the efficiency/effectiveness in the real world, and then use those abstracted features as design components to build purposeful, tailored (perhaps even optimized) solutions.
Abstract: Although bio-inspired designs for cybersecurity have yielded many elegant solutions to challenging problems, the vast majority of these efforts have been ad hoc analogies between the natural and human-designed systems. We propose to improve on the current approach of searching through the vast diversity of existing natural algorithms for one most closely resembling each new cybersecurity challenge, and then trying to replicate it in a designed cyber setting. Instead, we suggest that researchers should follow a protocol of functional abstraction, considering which features of the natural algorithm provide the efficiency/effectiveness in the real world, and then use those abstracted features as design components to build purposeful, tailored (perhaps even optimized) solutions. Here, we demonstrate how this can work by considering a case study employing this method. We design an extension of an existing (and ad hoc-created) algorithm, DIAMoND, for application beyond its originally intended solution space (detection of Distributed Denial of Service attacks in simple networks) to function on multilayer networks. We show how this protocol provides insights that might be harder or take longer to discover by direct analogy-building alone; in this case, we see that differential weighting of shared information by the providing network layer is likely to be effective.
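As a toy illustration of the closing observation, differential weighting of shared information by the providing network layer can be sketched as a weighted aggregation of neighbor "concern" reports; all names, weights, and the threshold below are hypothetical, not drawn from DIAMoND:

```python
def aggregate_concern(reports, layer_weights, threshold=1.0):
    """Toy sketch: each report is (layer, concern in [0,1]).
    Weight each report by the trust placed in the network layer
    that provided it, and raise an alarm if the weighted sum
    crosses the threshold. Weights and threshold are illustrative."""
    score = sum(layer_weights.get(layer, 0.0) * concern
                for layer, concern in reports)
    return score >= threshold

# Reports arriving over two layers; "transport" is weighted higher.
reports = [("transport", 0.6), ("application", 0.8), ("transport", 0.5)]
weights = {"transport": 1.0, "application": 0.4}
print(aggregate_concern(reports, weights))  # 0.6 + 0.32 + 0.5 = 1.42 >= 1.0
```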
4 citations
TL;DR: In this paper, it was shown that the theorems of Luce and Marley about combining conjoint measurement on a product with extensive measurement still hold when extensive measurement on one or both of the components is replaced with qualitative probability measurement; the results for the mixed case (qualitative probability measurement on one component, extensive measurement on the other) are applied to extend the Savage axioms of statistical decision theory to the case of additive utility functions.
Abstract: Extensions of the measurement theory literature regarding laws of exchange are obtained. It is shown that the theorems of Luce and Marley about combining conjoint measurement on a product $A_1 \times A_2$ with extensive measurement on $A_1$ and $A_2$ (the theorems give axioms for the conjoint measures to be power functions of the extensive measures) still hold if extensive measurement on one or both of the components is replaced with qualitative probability measurement. The results in the case where one component has qualitative probability measurement and one has extensive measurement are applied to give an extension of the Savage axioms of statistical decision theory to the case of additive utility functions. They are also applied to the foundations of integration and to the measurement of the community noise exposure level. The case of qualitative probability measurement on both components is applied to time preference and solves a problem posed by Jamison (1970).
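Schematically (our notation, not the paper's), the Luce-Marley-style conclusion says that a conjoint representation $u$ on $A_1 \times A_2$ relates to extensive measures $\psi_1, \psi_2$ through power functions:

```latex
u(a_1, a_2) \;=\; \alpha \, \psi_1(a_1)^{\beta_1} \, \psi_2(a_2)^{\beta_2},
\qquad \alpha > 0, \; \beta_1, \beta_2 \neq 0 .
```

Taking logarithms turns this multiplicative form into an additive representation, which is the connection the exchange-law axioms exploit.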
3 citations
Authors
Showing all 148 results
| Name | H-index | Papers | Citations |
| --- | --- | --- | --- |
| Aravind Srinivasan | 60 | 266 | 13711 |
| Ding-Zhu Du | 52 | 421 | 13489 |
| Elena N. Naumova | 47 | 232 | 8593 |
| Rebecca N. Wright | 37 | 113 | 4722 |
| Boris Mirkin | 35 | 178 | 6722 |
| Mona Singh | 32 | 91 | 5451 |
| Fred S. Roberts | 32 | 181 | 5286 |
| Tanya Y. Berger-Wolf | 31 | 135 | 3624 |
| Rephael Wenger | 26 | 67 | 1900 |
| Marios Mavronicolas | 26 | 151 | 2880 |
| Seoung Bum Kim | 26 | 165 | 2260 |
| M. Montaz Ali | 26 | 101 | 3093 |
| Lazaros K. Gallos | 24 | 69 | 4770 |
| Myong K. Jeong | 24 | 95 | 1955 |
| Nina H. Fefferman | 23 | 107 | 2362 |