
Showing papers from "Stevens Institute of Technology" published in 2007


Journal ArticleDOI
TL;DR: This paper studies two problems in secondary spectrum access with a minimum signal-to-interference-plus-noise ratio (quality-of-service, QoS) guarantee under an interference temperature constraint, and proposes a centralized reduced-complexity search algorithm to find the optimal solution.
Abstract: Spectrum is one of the most precious radio resources. With the increasing demand for wireless communication, efficiently using the spectrum resource has become an essential issue. With the Federal Communications Commission's (FCC) spectrum policy reform, secondary spectrum sharing has gained increasing interest. One of the policy reforms introduces the concept of an interference temperature - the total allowable interference in a spectral band. This means that secondary users can use different transmit powers as long as the sum of these powers is less than the interference threshold. In this paper, we study two problems in secondary spectrum access with a minimum signal-to-interference-plus-noise ratio (quality-of-service, QoS) guarantee under an interference temperature constraint. First, when all the secondary links can be supported, a nonlinear optimization problem with the objective of maximizing the total transmission rate of the secondary users is formulated. The nonlinear optimization is solved efficiently using geometric programming techniques. The second problem we address is that, when not all the secondary links can be supported with their QoS requirements, it is desirable to have the spectrum access opportunity be proportional to user priority when users belong to different priority classes. In this context, we formulate an operator problem which takes these priority issues into consideration. To solve this problem, we first propose a centralized reduced-complexity search algorithm to find the optimal solution. Then, in order to solve the problem distributively, we define a secondary spectrum sharing potential game. The Nash equilibria of this potential game are investigated, and the efficiency of the Nash equilibrium solutions is characterized. It is shown that distributed sequential play and an algorithm based on stochastic learning attain the equilibrium solutions. Finally, the performance is examined through simulations.

461 citations
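The structure of the first problem (maximize total rate subject to a cap on summed interference) can be illustrated with a simpler cousin of the paper's geometric program: classic water-filling over parallel channels, where the interference-temperature limit plays the role of the total power budget. The gains and budget below are made-up values, and the paper's actual formulation additionally constrains per-link SINR; this is only a sketch of the power-budget idea.

```python
import numpy as np

def waterfill(gains, p_total, noise=1.0):
    """Allocate powers maximizing sum(log2(1 + g_i * p_i / noise))
    subject to sum(p_i) <= p_total (the interference-temperature budget),
    via bisection on the water level mu; p_i = max(mu - noise/g_i, 0)."""
    gains = np.asarray(gains, dtype=float)
    lo, hi = 0.0, p_total + noise / gains.min() + 1.0
    for _ in range(100):                 # bisect until the budget is met
        mu = 0.5 * (lo + hi)
        p = np.maximum(mu - noise / gains, 0.0)
        if p.sum() > p_total:
            hi = mu
        else:
            lo = mu
    return np.maximum(lo - noise / gains, 0.0)

# Three hypothetical secondary links; the strongest channel gets the most power
p = waterfill([2.0, 1.0, 0.5], p_total=3.0)
```

The stronger a link's channel gain, the less noise-normalized "depth" it must fill, so it receives more power; the interference budget is met with equality when all links are active.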


Journal ArticleDOI
TL;DR: In this paper, a quantum-dot-based single-photon source with a measured single-photon emission rate of 4.0 MHz (31 MHz into the first lens, with an extraction efficiency of 38%) due to the suppression of exciton dark states was demonstrated.
Abstract: Optoelectronic devices that provide non-classical light states on demand have a broad range of applications in quantum information science [1], including quantum-key-distribution systems [2], quantum lithography [3] and quantum computing [4]. Single-photon sources [5,6] in particular have been demonstrated to outperform key distribution based on attenuated classical laser pulses [7]. Implementations based on individual molecules [8], nitrogen vacancy centres [9] or dopant atoms [10] are rather inefficient owing to low emission rates, rapid saturation and the lack of mature cavity technology. Promising single-photon-source designs combine high-quality microcavities [11] with quantum dots as active emitters [12]. So far, the highest measured single-photon rates are ∼200 kHz using etched micropillars [13,14]. Here, we demonstrate a quantum-dot-based single-photon source with a measured single-photon emission rate of 4.0 MHz (31 MHz into the first lens, with an extraction efficiency of 38%) due to the suppression of exciton dark states. Furthermore, our microcavity design provides mechanical stability, and voltage-controlled tuning of the emitter/mode resonance and of the polarization state.

370 citations


Journal ArticleDOI
TL;DR: In this paper, the authors discuss organizational unlearning based on the organizational change and memory literature enhancing the organizational learning and change scholarship, and argue that unlearning is conceptualized as organizational memory eliminating, and is operationalized as changing beliefs and routines covariates in organizations.
Abstract: Purpose – Organizational learning and unlearning is a popular and important topic in business as well as academia. Even though there is a plethora of studies on organizational learning, surprisingly little is known about the conceptualization and operationalization of organizational unlearning. The purpose of this paper is to discuss organizational unlearning based on the organizational change and memory literature enhancing the organizational learning and change scholarship.Design/methodology/approach – It is argued that unlearning is conceptualized as organizational memory eliminating, and is operationalized as changing beliefs and routines covariates in organizations. This is followed with a discussion of unlearning types, specifically, reinventive, formative, operative and adjustive, which are contingent on the environmental conditions. Finally, future research suggestions are proposed to leverage understanding on unlearning in the literature.Findings – Shows that organizations first need to unlearn e...

278 citations


Journal ArticleDOI
TL;DR: A model for testing the relative effectiveness of engineering laboratories in education that takes account of the interface to the equipment, the discussions students have among themselves, and other factors is presented.
Abstract: Economic pressures on universities and the emergence of new technologies have spurred the creation of new systems for delivering engineering laboratories in education, in particular simulations and remote-access laboratory systems. Advocates of simulation argue that physical labs needlessly consume university space and students' time. However, proponents of hands-on laboratories argue that student engineers should be exposed to real environments. Remote laboratories have appeared as a third option. These laboratories are similar to simulation techniques in that they require minimal space and time, because the experiments can be rapidly configured and run over the Internet. But unlike simulations, they provide real data. Studying the relative effectiveness of these modes of delivering student laboratories is complex, for the underlying technology of the laboratory is just one of many possible factors that could influence effectiveness. For example, the interface to the equipment may be of importance, as might the discussions students have among themselves. This paper presents a model for testing the relative effectiveness of engineering laboratories in education that takes account of these and other factors. The results are presented for an assessment study comparing versions of remote labs versus hands-on labs in a junior-level mechanical engineering course on machine dynamics and mechanisms. The results suggest that students learned lab content information equally well from both types of laboratories, and that they have a realistic understanding and appreciation of the practical advantages of remote laboratories.

278 citations


Posted Content
TL;DR: In this article, a step-by-step guide is presented that leads managers through the four components of the technological base: 1. technological assets, 2. organizational assets, 3. external assets, and 4. project management.
Abstract: A common mistake in assessing an organization's technological base is narrowing the review to matters of technical competence. Managers need a framework for assessing the much broader question of how well their organizations are positioned to derive competitive advantage from technology. A step-by-step guide is presented that leads managers through the 4 components of the technological base: 1. technological assets, 2. organizational assets, 3. external assets, and 4. project management. Case studies of organizations in the defense industry illustrate how 2 companies' strategies for moving into a new business were shaped by the strengths and weaknesses of their respective technological bases. Of the 4 dimensions, it is usually the organizational assets that prove to be the limiting element. A hierarchy was found among the organizational assets: 1. skills, 2. procedures, 3. structure, 4. strategy, and 5. culture.

233 citations


Journal ArticleDOI
TL;DR: Differences in lab formats led to changes in group functions across the plan-experiment-analyze process: For example, students did less face-to-face work when engaged in remote or simulated laboratories, as opposed to hands-on laboratories.
Abstract: Laboratories play a crucial role in the education of future scientists and engineers, yet there is disagreement among science and engineering educators about whether and which types of technology-enabled labs should be used. This debate could be advanced by large-scale randomized studies addressing the critical issue of whether remotely operated or simulation-based labs are as effective as the traditional hands-on lab format. The present article describes the results of a large-scale (N = 306) study comparing learning outcomes and student preferences for several different lab formats in an undergraduate engineering course. The lab formats that were evaluated included traditional hands-on labs, remotely operated labs, and simulations. Learning outcomes were assessed by a test of the specific concepts taught in each lab. These knowledge scores were as high or higher (depending on topic) after performing remote and simulated laboratories versus performing hands-on laboratories. In their responses to survey items, many students saw advantages to technology-enabled lab formats in terms of such attributes as convenience and reliability, but still expressed preference for hands-on labs. Also, differences in lab formats led to changes in group functions across the plan-experiment-analyze process: For example, students did less face-to-face work when engaged in remote or simulated laboratories, as opposed to hands-on laboratories.

226 citations


Journal ArticleDOI
TL;DR: The price dynamics in a competitive market consisting of spectrum-agile network service providers and users are explored, where multiple self-interested spectrum providers operating with different technologies and costs compete for potential customers.
Abstract: We explore the price dynamics in a competitive market consisting of spectrum-agile network service providers and users. Here, multiple self-interested spectrum providers operating with different technologies and costs compete for potential customers. Different buyers or consumers may evaluate the same seller differently depending on their applications, operating technologies and locations. Two buyer populations, the quality-sensitive and the price-sensitive, are investigated, and the resulting collective price dynamics are studied using a combination of analysis and simulations. Various scenarios are considered regarding the nature and accuracy of the information available to the sellers. A myopically optimal strategy is studied when full information is available, while a stochastic-learning-based strategy is considered when the information is limited. Cooperating groups may be formed among the sellers, which will in turn influence the group profit for those participants. A free-riding phenomenon is observed under certain circumstances.

208 citations
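The limited-information case above relies on stochastic learning. A minimal single-seller sketch (not the paper's multi-seller scheme) is an epsilon-greedy rule: the seller observes only noisy demand, keeps a running average profit per candidate price, and mostly plays the current best while occasionally exploring. The demand function and price set are invented for illustration.

```python
import random

def learn_price(mean_demand, prices, rounds=3000, eps=0.1, seed=1):
    """Epsilon-greedy stochastic learning: track the average profit of
    each candidate price under noisy demand, exploit the best estimate,
    and explore with probability eps."""
    rng = random.Random(seed)
    pulls = [0] * len(prices)
    avg = [0.0] * len(prices)
    for t in range(rounds):
        if t < len(prices):
            i = t                                   # try every price once
        elif rng.random() < eps:
            i = rng.randrange(len(prices))          # explore
        else:
            i = max(range(len(prices)), key=lambda j: avg[j])  # exploit
        demand = max(0.0, mean_demand(prices[i]) + rng.gauss(0, 1))
        profit = prices[i] * demand
        pulls[i] += 1
        avg[i] += (profit - avg[i]) / pulls[i]      # running mean
    return prices[max(range(len(prices)), key=lambda j: avg[j])]

# Price-sensitive buyers with linear mean demand 10 - p:
# expected profit p * (10 - p) peaks at p = 5
p_star = learn_price(lambda p: 10 - p, prices=[2, 5, 8])
```

In the paper's setting, each seller runs such a rule simultaneously, which is what produces the collective price dynamics and the free-riding behavior.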


Journal ArticleDOI
TL;DR: As discussed by the authors, project management is one of the fastest growing disciplines in organizations today, yet the statistics of project success ironically suggest that most projects still fail.
Abstract: Project management is one of the fastest growing disciplines in organizations today. However, ironically, the statistics of project success suggest that most projects still fail and many projects ...

201 citations


Posted Content
TL;DR: In this paper, the authors investigated the nomological relations among team improvisation and unlearning, new product success, and environmental turbulence, and contributed to the literature on NPD team learning, and on team flexibility under turbulent conditions.
Abstract: Team learning is vital for organizations in order to compete in fast-paced environments. However, the ways learning can be effective in such environments warrant research, especially for teams developing new products under rapidly changing technological and market conditions. Interestingly, recent new product development (NPD) literature demonstrates the essential role of improvisation (i.e., planning and executing any action simultaneously) and unlearning (i.e., changes in team beliefs and project routines) for effective learning and performing under turbulent conditions. However, the combined effect of team improvisation and unlearning on new product success (NPS) has largely been ignored. This paper investigates the nomological relations among team improvisation and unlearning, new product success, and environmental turbulence, and contributes to the literature on NPD team learning and on team flexibility under turbulent conditions. By examining 197 new-product-development projects, we found that (1) environmental turbulence positively affects team unlearning, (2) team unlearning concurrently stimulates team improvisation, and (3) team improvisation positively impacts new product success by utilizing/implementing new knowledge acquired through unlearning and improvisation. We further discuss the theoretical and managerial implications of our conclusions.

172 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigated the nomological relations among team improvisation and unlearning, new product success, and environmental turbulence, and contributed to the literature on NPD team learning, and on team flexibility under turbulent conditions.
Abstract: Team learning is vital for organizations in order to compete in fast-paced environments. However, the ways learning can be effective in such environments warrant research, especially for teams developing new products under rapidly changing technological and market conditions. Interestingly, recent new product development (NPD) literature demonstrates the essential role of improvisation (i.e., planning and executing any action simultaneously) and unlearning (i.e., changes in team beliefs and project routines) for effective learning and performing under turbulent conditions. However, the combined effect of team improvisation and unlearning on new product success (NPS) has largely been ignored. This paper investigates the nomological relations among team improvisation and unlearning, new product success, and environmental turbulence, and contributes to the literature on NPD team learning and on team flexibility under turbulent conditions. By examining 197 new-product-development projects, we found that (1) environmental turbulence positively affects team unlearning, (2) team unlearning concurrently stimulates team improvisation, and (3) team improvisation positively impacts new product success by utilizing/implementing new knowledge acquired through unlearning and improvisation. We further discuss the theoretical and managerial implications of our conclusions.

169 citations


Journal ArticleDOI
TL;DR: The success and sustainability of applying phosphate as a BMP in firing range soils remain questionable; given the costs of phosphate treatment, the use of biogenic phosphate sources, such as bone meal, may be a more environmentally sustainable approach toward this end.

Journal ArticleDOI
TL;DR: It may, however, be possible to go even further and design 'pseudo-cell' nanofactories that work with molecules already in the body to fight disease.
Abstract: Nanotechnology is having a major impact on medicine and the treatment of disease, notably in imaging and targeted drug delivery. It may, however, be possible to go even further and design 'pseudo-cell' nanofactories that work with molecules already in the body to fight disease.

Journal ArticleDOI
17 Aug 2007-Langmuir
TL;DR: Positively charged silver nanoparticles demonstrate superior SERS activity over negatively charged citrate-reduced Ag nanoparticles for the detection of thiocyanate and perchlorate ions, and are promising candidates for the sensing and detection of a variety of negatively charged analytes in aqueous solutions using surface-enhanced Raman spectroscopy (SERS).
Abstract: Branched polyethyleneimine (BPEI) and 4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid (HEPES) were used collaboratively to reduce silver nitrate under UV irradiation for the synthesis of positively charged silver nanoparticles. The effects of molar ratio of the ingredients and the molecular weight of BPEI on the particle size and distribution were investigated. The mechanism for the reduction of Ag+ ions in the BPEI/HEPES mixtures entails oxidative cleavage of BPEI chains that results in the formation of positively charged BPEI fragments enriched with amide groups as well as in the production of formaldehyde, which serves as a reducing agent for Ag+ ions. The resultant silver nanoparticles are positively charged due to protonation of surface amino groups. Importantly, these positively charged Ag nanoparticles demonstrate superior SERS activity over negatively charged citrate reduced Ag nanoparticles for the detection of thiocyanate and perchlorate ions; therefore, they are promising candidates for sens...

Proceedings ArticleDOI
10 Apr 2007
TL;DR: In this paper, the authors address the problems of automatically planning autonomous underwater vehicle (AUV) paths which best exploit complex current data, from computational estuarine model forecasts, while also avoiding obstacles.
Abstract: This paper addresses the problems of automatically planning autonomous underwater vehicle (AUV) paths which best exploit complex current data, from computational estuarine model forecasts, while also avoiding obstacles. In particular we examine the possibilities for a novel type of AUV mission deployment in fast flowing tidal river regions which experience bi-directional current flow. These environments are interesting in that, by choosing an appropriate path in space and time, an AUV may both bypass adverse currents which are too fast to be overcome by the vehicle's motors and also exploit favorable currents to achieve far greater speeds than the motors could otherwise provide, while substantially saving energy. The AUV can "ride" currents both up and down the river, enabling extended monitoring of otherwise energy-exhausting, fast flow environments. The paper discusses suitable path parameterizations, cost functions and optimization techniques which enable optimal AUV paths to be efficiently generated. These paths take maximum advantage of the river currents in order to minimize energy expenditure, journey time and other cost parameters. The resulting path planner can automatically suggest useful alternative mission start and end times and locations to those specified by the user. Examples are presented for navigation in a simple simulation of the fast flowing Hudson River waters around Manhattan.
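A minimal sketch of current-aware planning (not the authors' planner): time-optimal Dijkstra on a grid with a static current field, where moving with the flow is faster than against it and an adverse current stronger than the vehicle's top speed makes an edge impassable. The grid size, `v_max`, and the uniform current below are illustrative; the paper's setting additionally handles time-varying tidal currents and richer cost terms.

```python
import heapq

def plan_path(current, start, goal, v_max=1.0, nx=10, ny=10):
    """Shortest-time grid path. `current(x, y)` returns the (cx, cy) flow
    vector at a cell; effective speed along a move is v_max plus the flow
    component in the move direction, and each cell takes 1/speed to cross."""
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    dist = {start: 0.0}
    parent = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        x, y = node
        cx, cy = current(x, y)
        for dx, dy in moves:
            nbr = (x + dx, y + dy)
            if not (0 <= nbr[0] < nx and 0 <= nbr[1] < ny):
                continue
            speed = v_max + cx * dx + cy * dy   # flow component along the move
            if speed <= 0:                      # adverse current too strong
                continue
            nd = d + 1.0 / speed                # time to cross one cell
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                parent[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = parent[node]
    path.append(start)
    return path[::-1], dist[goal]

# Uniform current flowing in +x: riding downstream is cheap
path, t = plan_path(lambda x, y: (0.5, 0.0), (0, 0), (9, 0))
```

With a 0.5-unit downstream current and v_max = 1, the straight downstream route covers each cell at speed 1.5, so the 9-cell trip takes 6 time units; going back upstream would cost four times as much per cell.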

Posted Content
TL;DR: This article extended the analysis to include a broad range of concurrent disclosure contained in earnings press releases: financial disclosure captured as accounting ratios; and verbal components of disclosure, both content and style, which are captured using elementary computer-based content analysis.
Abstract: Similar to a classic event study, this study examines market reaction to firms' earnings announcements. This study extends the examination to include a broad range of concurrent disclosure contained in earnings press releases: financial disclosure captured as accounting ratios; and verbal components of disclosure, both content and style, which are captured using elementary computer-based content analysis. Extending the analysis to such a broad range of concurrent disclosures requires a methodology designed to utilize a large number of predictor variables, and predictive data mining algorithms are specifically designed to do so. Therefore, this study employs a widely used data-mining algorithm - classification and regression trees (CART). Results of the study show that inclusion of predictor variables capturing verbal content and writing style of earnings-press releases results in more accurate predictions of market response.
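The core step that CART repeats recursively is an exhaustive search for the split minimizing the weighted Gini impurity of the two children. The sketch below shows that single step on invented data (a hypothetical "earnings-surprise" feature predicting market reaction); the study itself uses a full CART implementation over accounting ratios and text-derived features.

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(xs, ys):
    """Find the threshold on one predictor minimizing the weighted Gini
    impurity of the two children -- the split-selection core of CART."""
    best = (float("inf"), None)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[0]:
            best = (score, t)
    return best

# Hypothetical feature: earnings-surprise ratio vs. market reaction
surprise = [-0.3, -0.1, 0.05, 0.2, 0.4, 0.6]
reaction = ["down", "down", "down", "up", "up", "up"]
score, threshold = best_split(surprise, reaction)
```

Here the split at 0.05 separates the classes perfectly, driving the weighted impurity to zero; a full tree grows by applying this search recursively to each child node.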

Journal ArticleDOI
TL;DR: A procedure to culture microorganisms below the freezing point on solid media, with ethanol as the sole carbon source and without using artificial antifreezes, yielded isolates able to grow only transiently, for 3 weeks after cooling, with measurable respiratory and biosynthetic activity.

Journal ArticleDOI
TL;DR: It is shown that the differential cooperative DAF and DDF schemes achieve cooperative diversity and outperform the conventional noncooperative differential modulation.
Abstract: This paper examines differential binary modulation for wireless networks that utilize wireless relays to seek cooperative diversity and improved performance. Two differential cooperative transmission schemes, referred to as differential amplify-and-forward (DAF) and differential decode-and-forward (DDF), respectively, are introduced. These schemes require no channel state information at any node in the system. A set of analytical results pertaining to the probability density function of the instantaneous signal-to-noise ratio, average bit error rate, outage probability, and diversity order of the proposed schemes in Rayleigh fading channels are obtained. The analytical results are confirmed by numerical simulations. It is shown that the differential cooperative DAF and DDF schemes achieve cooperative diversity and outperform the conventional noncooperative differential modulation.
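The "no channel state information" property comes from differential modulation itself. The sketch below shows plain differential BPSK over a single link (the paper's DAF/DDF schemes add relays on top of this): the receiver decodes from the phase difference of consecutive samples, so an unknown channel phase cancels out. The channel and noise values are illustrative.

```python
import numpy as np

def diff_encode(bits):
    """Differential BPSK: each transmitted symbol is the previous symbol
    times +/-1, so information lives in phase *differences*."""
    s = [1.0]
    for b in bits:
        s.append(s[-1] * (1.0 if b == 0 else -1.0))
    return np.array(s)

def diff_decode(r):
    """Noncoherent detection: compare consecutive received samples.
    The unknown channel h cancels in r[k] * conj(r[k-1]) as long as the
    channel varies slowly over two symbol periods."""
    d = np.real(r[1:] * np.conj(r[:-1]))
    return (d < 0).astype(int)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 200)
h = np.exp(1j * 0.7)                      # unknown unit-gain channel phase
noise = 0.05 * (rng.normal(size=201) + 1j * rng.normal(size=201))
r = h * diff_encode(bits) + noise
decoded = diff_decode(r)
```

At this high SNR the decoder recovers every bit despite never estimating `h`; the cooperative schemes in the paper extend exactly this trick to relayed links.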

Journal ArticleDOI
TL;DR: In this paper, a vision-based self-calibration method for a serial robot manipulator, which only requires a ground-truth scale in the reference frame, is proposed.
Abstract: Unlike the traditional robot calibration methods, which need external expensive calibration apparatus and elaborate setups to measure the 3D feature points in the reference frame, a vision-based self-calibration method for a serial robot manipulator, which only requires a ground-truth scale in the reference frame, is proposed in this paper. The proposed algorithm assumes that the camera is rigidly attached to the robot end-effector, which makes it possible to obtain the pose of the manipulator with the pose of the camera. By designing a manipulator movement trajectory, the camera poses can be estimated up to a scale factor at each configuration with the factorization method, where a nonlinear least-square algorithm is applied to improve its robustness. An efficient approach is proposed to estimate this scale factor. The great advantage of this self-calibration method is that only image sequences of a calibration object and a ground-truth length are needed, which makes the robot calibration procedure more autonomous in a dynamic manufacturing environment. Simulations and experimental studies on a PUMA 560 robot reveal the convenience and effectiveness of the proposed robot self-calibration approach.
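The final step described above — fixing the unknown scale of an up-to-scale reconstruction using a ground-truth length — has a simple least-squares form. The values below are hypothetical; the paper embeds this step inside a full factorization-based pose estimation pipeline.

```python
import numpy as np

def estimate_scale(recon_lengths, true_lengths):
    """Least-squares scale factor s minimizing sum((s * r_i - t_i)^2),
    mapping up-to-scale reconstructed lengths onto metric ground truth.
    Closed form: s = (r . t) / (r . r)."""
    r = np.asarray(recon_lengths, dtype=float)
    t = np.asarray(true_lengths, dtype=float)
    return float(r @ t / (r @ r))

# One ground-truth length suffices in principle; several average out noise.
s = estimate_scale([0.52, 1.01, 1.49], [1.04, 2.02, 2.98])
```

Once `s` is known, every camera translation from the factorization can be converted to metric units, which is what makes a single ground-truth length enough to calibrate the manipulator.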

Journal ArticleDOI
TL;DR: In this article, EOFs were used to identify the dominant modes of longshore shoreline variability at Duck, North Carolina, the Gold Coast, Australia, and at several locations within the Columbia River Littoral Cell in the US Pacific Northwest.

Journal ArticleDOI
TL;DR: The new approach proposed herein is based on a finite difference formalism, i.e., the transmission line matrix (TLM), which provides an exact solution for the linear system whilst significantly reducing the computational complexity when compared with the time domain approach.
Abstract: The multiresolution frequency domain parflow (MR-FDPF) approach is applied to radio wave propagation in indoor environments. This method allows for a better understanding of indoor propagation and hence greatly assists the development of WiFi-like network planning tools. The efficiency of such wireless design tools is strongly impacted by the quality of the coverage predictions, which have to be estimated with a limited computational load. The usual approaches are based either on empirical modeling relying on measurement campaigns or on geometrical optics leading to ray-tracing. While the former approach suffers from a lack of accuracy, the latter needs to balance accuracy with computational load requirements. The new approach proposed herein is based on a finite difference formalism, i.e., the transmission line matrix (TLM). Once the problem is developed in the frequency domain, the linear system thus obtained is solved in two steps: a pre-processing step which consists of an adaptive MR (multigrid) pre-conditioning and a propagation step. The first step computes a MR data structure represented as a binary tree. In the second step the coverage of a point source is obtained by up-and-down propagating through the binary tree. This approach provides an exact solution for the linear system whilst significantly reducing the computational complexity when compared with the time domain approach.

Journal ArticleDOI
TL;DR: In this article, a multiscale stochastic finite element method (MsSFEM) is developed to resolve scale-coupling problems in the context of scale-bridging multi-scale shape functions.

Journal ArticleDOI
TL;DR: In this paper, hydrogen peroxide was safely produced by the direct combination of hydrogen and oxygen in a micro-reactor operating in the explosive regime, and the reaction was shown to be free of mass transfer limitations at the conditions of the kinetic experiments.


Journal ArticleDOI
TL;DR: A fly ash-based stabilization/solidification (S/S) technique was investigated using field soil samples contaminated with arsenic (As) and lead (Pb) and a semi-dynamic leaching test was used to evaluate the effectiveness of the S/S treatment.

Journal ArticleDOI
TL;DR: The first complete theoretical convergence analysis for the iterative extensions of the Sturm/Triggs algorithm is given, showing that the simplest extension, SIESTA, converges to nonsense results, and that CIESTA gives a reliable way of initializing other algorithms such as bundle adjustment.
Abstract: We give the first complete theoretical convergence analysis for the iterative extensions of the Sturm/Triggs algorithm. We show that the simplest extension, SIESTA, converges to nonsense results. Another proposed extension has similar problems, and experiments with "balanced" iterations show that they can fail to converge or become unstable. We present CIESTA, an algorithm that avoids these problems. It is identical to SIESTA except for one simple extra computation. Under weak assumptions, we prove that CIESTA iteratively decreases an error and approaches fixed points. With one more assumption, we prove it converges uniquely. Our results imply that CIESTA gives a reliable way of initializing other algorithms such as bundle adjustment. A descent method such as Gauss-Newton can be used to minimize the CIESTA error, combining quadratic convergence with the advantage of minimizing in the projective depths. Experiments show that CIESTA performs better than other iterations.
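One half of a Sturm/Triggs-style iteration is easy to sketch: project the depth-scaled measurement matrix onto the nearest rank-4 matrix via SVD truncation. The other half (re-estimating the projective depths) is where SIESTA-style iterations can misbehave and where CIESTA adds its correction; that part is not shown here. The matrix dimensions below are illustrative.

```python
import numpy as np

def rank4_project(W):
    """Closest rank-4 approximation of the scaled measurement matrix W
    (Eckart-Young): keep the four largest singular triplets. Under the
    projective camera model, a correctly scaled W is exactly rank 4."""
    u, s, vt = np.linalg.svd(W, full_matrices=False)
    return (u[:, :4] * s[:4]) @ vt[:4]

# A noiseless rank-4 matrix is a fixed point of the projection:
# 3 cameras (9 rows) observing 12 points, built as a rank-4 product
rng = np.random.default_rng(7)
W = rng.normal(size=(9, 4)) @ rng.normal(size=(4, 12))
W4 = rank4_project(W)
```

With noisy data the projection changes W, the depths are re-estimated from the factorization, and the two steps alternate; the paper's analysis concerns whether (and to what) this alternation converges.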

Journal ArticleDOI
TL;DR: This article reviews the evolution of these concepts in the business literature and provides comprehensive definitions, conceptual models, and examples to help clarify and distinguish the concepts so that failures of communication can be avoided.
Abstract: Core competence, distinctive competence, and competitive advantage are 3 of the most important business concepts that managers, researchers, and educators rely on for decision making, pedagogy, and research. However, little attention has been paid to defining these concepts. As a result, they have become buzzwords that are used so frequently that their meanings are often taken for granted but are not fully understood. In this article, the author reviews the evolution of these concepts in business literature and provides comprehensive definitions, conceptual models, and examples to help clarify and distinguish the concepts so that failures of communication can be avoided.

Journal ArticleDOI
TL;DR: Experimental results for different system complexities show that these new CIM can effectively estimate the criticality of components with respect to multi-state system reliability and show that the CIM-based heuristic can be used as a fast and effective technique to guide system reliability improvements.

Journal ArticleDOI
TL;DR: This work describes the unique attributes of the mobile ad hoc wireless networks (MAWN) and how the classical analysis of network reliability can be adjusted to model and analyze this type of network.

Journal ArticleDOI
TL;DR: The results from the bench scale study indicated that a calcium polysulfide dosage twice the molar stoichiometric requirement (2x) proved effective in meeting the New Jersey Department of Environmental Protection (NJDEP) total Cr(VI) and the Environmental Protection Agency (EPA) Toxicity Characteristic Leaching Procedure (TCLP) regulatory standards.

Journal ArticleDOI
TL;DR: In this article, a packed-bed micro-reactor was used for the hydrogenation of o-nitroanisole to o-anisidine, a reaction relevant to the pharmaceutical and fine chemicals industries, with the aim of investigating the reactor performance and the kinetics of the reaction.