Author

Cyrus Bazeghi

Bio: Cyrus Bazeghi is an academic researcher from the University of California, Santa Cruz. The author has contributed to research in topics: Ocean color & Buoy. The author has an h-index of 3 and has co-authored 7 publications receiving 49 citations.

Papers
Proceedings ArticleDOI
12 Nov 2005
TL;DR: In this article, the authors introduce a methodology to measure and estimate processor design effort called μComplexity, which consists of three main parts: a procedure to account for the contributions of the different components in the design, accurate statistical regression of experimental measures using a nonlinear mixed-effects model, and a productivity adjustment to account for the productivities of different teams.
Abstract: Microprocessor design complexity is growing rapidly. As a result, current development costs for top-of-the-line processors are staggering, and are doubling every 4 years. As we design ever larger and more complex processors, it is becoming increasingly difficult to estimate how much time it will take to design and verify them. To compound this problem, processor design cost estimation still does not have a quantitative approach. Although designing a processor is very resource consuming, there is little work measuring, understanding, and estimating the effort required. To address this problem, this paper introduces μComplexity, a methodology to measure and estimate processor design effort. μComplexity consists of three main parts, namely a procedure to account for the contributions of the different components in the design, accurate statistical regression of experimental measures using a nonlinear mixed-effects model, and a productivity adjustment to account for the productivities of different teams. We use μComplexity to evaluate a series of design effort estimators on several processor designs. Our analysis shows that the number of lines of HDL code, the sum of the fan-ins of the logic cones in the design, and a linear combination of the two metrics are good design effort estimators. On the other hand, power, area, frequency, number of flip-flops, and number of standard cells are poor estimators of design effort. We also show that productivity adjustments are necessary to produce accurate estimations.
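
As a rough, hedged illustration of the kind of regression μComplexity performs (the paper's actual model, data, and coefficients are not reproduced here), the sketch below fits per-team productivity multipliers on top of a linear combination of HDL line count and fan-in sum; all numbers are invented.

```python
# Illustrative sketch only: fitting effort = team_productivity * (a*HDL_lines
# + b*fanin_sum), the shape of estimator the abstract describes. Values are
# hypothetical, not the paper's data.
import numpy as np
from scipy.optimize import least_squares

hdl    = np.array([12e3, 30e3, 8e3, 45e3, 20e3])     # lines of HDL per block
fanin  = np.array([50e3, 110e3, 30e3, 170e3, 80e3])  # sum of logic-cone fan-ins
team   = np.array([0, 0, 1, 1, 1])                   # which team designed it
effort = np.array([14.0, 35.0, 6.0, 33.0, 15.0])     # observed person-months

def residuals(p):
    a, b = p[0], p[1]
    productivity = p[2 + team]       # one multiplier per team
    return productivity * (a * hdl + b * fanin) - effort

fit = least_squares(residuals, x0=np.array([1e-3, 1e-4, 1.0, 1.0]))
a, b, prod_team0, prod_team1 = fit.x
print(f"a={a:.2e}  b={b:.2e}  productivities: {prod_team0:.2f}, {prod_team1:.2f}")
```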

25 citations

Proceedings ArticleDOI
23 Jul 2007
TL;DR: This paper reports the design, prototype construction and initial testing of a small minibuoy aimed at use in a coordinated, wireless networked array of buoys for near-surface ocean sensing, filling the gap between larger ocean surface vessels and/or moored buoys and subsurface gliders.
Abstract: We report the design, prototype construction and initial testing of a small minibuoy that is aimed at use in a coordinated, wireless networked array of buoys for near-surface ocean sensing. This vehicle is designed to fill the gap between larger ocean surface vessels and/or moored buoys and subsurface gliders. The size and cost are low enough that these versatile sensor platforms can be deployed easily and in quantity. Since these minibuoys are mobile, they can keep station in currents as large as 25 cm/s or move as an adaptive, coordinated sensor array for high resolution in both time and space. The buoy is about 74 cm (29 in) long, 41 cm (16 in) wide (max) and weighs about 14.5 kg (32 lbs); hence, it can be deployed easily from small craft. Deployment times are about 1 to 2 days or more, and longer with solar power. The buoy structure is fiberglass and PVC with two 2 W DC motors. Control is done with GPS and magnetic heading sensors and a PID scheme to maintain course. Communication is via a 900 MHz system with a range of 1 to 2 km and plans for a longer range HF/VHF or satellite system. The initial sensor system is designed for ocean hyperspectral observations as surface truth for airborne system calibration and validation and other ocean color applications. Acoustic, wave, air and water temperature sensors as well as GPS are included. The Mark I prototype has been successfully tested in a pool with manual control.
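
The abstract mentions a PID scheme driven by GPS and magnetic heading sensors. A minimal sketch of such a heading controller with differential thrust is shown below; the gains, interfaces, and 0-to-1 motor commands are assumptions, not details from the paper.

```python
# Minimal heading-hold PID sketch; gains and interfaces are hypothetical.
def wrap_deg(err):
    """Wrap a heading error into [-180, 180) degrees."""
    return (err + 180.0) % 360.0 - 180.0

class HeadingPID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def update(self, desired_deg, measured_deg, dt):
        err = wrap_deg(desired_deg - measured_deg)
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = HeadingPID(kp=0.02, ki=0.001, kd=0.01)       # illustrative gains
steer = pid.update(desired_deg=90.0, measured_deg=78.0, dt=0.1)
left, right = 0.5 + steer, 0.5 - steer             # differential thrust, 0..1
```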

12 citations

Proceedings ArticleDOI
22 Feb 2009
TL;DR: This evaluation shows how measurements of an Altera Cyclone II FPGA can be used to derive variability models for several 90 nm commercial designs such as the Sun Niagara and Intel Pentium D.
Abstract: The focus of this paper is to measure and qualify high-level process variation models by measuring variability on FPGAs. Measurements are done with high spatial resolution and demonstrate how the high-resolution data matches two industry test cases. The benefit of such an approach is that several inexpensive FPGAs, which are normally at the leading edge of process technology compared to ASICs, obviate the need to fabricate many custom test chips. Specifically, our evaluation shows how measurements of an Altera Cyclone II FPGA can be used to derive variability models for several 90 nm commercial designs such as the Sun Niagara and Intel Pentium D. Even though the FPGAs and commercial processors are produced by different fabs (TSMC, TI, and Intel, respectively), we find the FPGAs to be very useful for predicting variation in the commercial processors.
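
The abstract does not spell out how the variability models are derived, but a common way to build one from gridded on-die measurements is to separate a smooth systematic spatial trend from a random residual. The sketch below performs that decomposition on synthetic data; nothing in it comes from the paper's measurements.

```python
# Hedged sketch: split a synthetic on-die frequency map into systematic
# (low-order 2-D polynomial trend) and random components.
import numpy as np

rng = np.random.default_rng(0)
nx, ny = 16, 16
x, y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
# Synthetic "measured" frequencies: smooth gradient plus random variation.
freq = 300 + 10 * x + 6 * y - 8 * x * y + rng.normal(0, 1.5, (ny, nx))

# Least-squares fit of the systematic component.
A = np.column_stack([np.ones(x.size), x.ravel(), y.ravel(),
                     (x * y).ravel(), (x ** 2).ravel(), (y ** 2).ravel()])
coef, *_ = np.linalg.lstsq(A, freq.ravel(), rcond=None)
systematic = (A @ coef).reshape(ny, nx)
random_part = freq - systematic
print("sigma_random / mean frequency =", random_part.std() / freq.mean())
```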

7 citations

Proceedings ArticleDOI
12 Dec 2007
TL;DR: This paper introduces μPCBComplexity, a methodology to measure and estimate PCB (printed circuit board) design effort, and uses it to evaluate a series of design effort estimators for twelve PCB designs.
Abstract: System design complexity is growing rapidly. As a result, current development costs are constantly increasing. It is becoming increasingly difficult to estimate how much time it will take to design and verify these designs, which are getting denser and increasingly more complex. To compound this problem, circuit design cost estimation still does not have a quantitative approach. Although designing a system is very resource consuming, there is little work invested in measuring, understanding, and estimating the effort required. To address part of the current shortcomings, this paper introduces μPCBComplexity, a methodology to measure and estimate PCB (printed circuit board) design effort. PCBs are the central component of many systems and require large amounts of resources to properly design and verify. μPCBComplexity consists of two main parts: a procedure to account for the contributions of the different elements in the design, and a non-linear statistical regression of experimental measures in order to determine a good design effort metric. We use μPCBComplexity to evaluate a series of design effort estimators for twelve PCB designs. By using the proposed μPCBComplexity metric, designers can estimate PCB design effort.
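
For illustration only, the sketch below fits an assumed power law, effort = a * metric^b, to invented PCB data; the paper's actual metric and regression form are not reproduced here.

```python
# Hypothetical non-linear regression of design effort against a PCB metric.
import numpy as np
from scipy.optimize import curve_fit

metric = np.array([120.0, 300.0, 80.0, 450.0, 200.0, 650.0])  # made-up metric
effort = np.array([3.0, 7.5, 2.1, 10.2, 5.0, 13.8])           # person-weeks

popt, _ = curve_fit(lambda m, a, b: a * m ** b, metric, effort, p0=[0.1, 1.0])
print("a=%.3f  b=%.3f" % tuple(popt))
```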

2 citations

Proceedings ArticleDOI
18 Jun 2007
TL;DR: In this article, the authors report the design, prototype construction and initial testing of a small minibuoy aimed at use in a coordinated, wireless networked array of buoys for ocean sensing, both above and below the surface.
Abstract: We report the design, prototype construction and initial testing of a small minibuoy that is aimed at use in a coordinated, wireless networked array of buoys for ocean sensing, both above and below the surface. This vehicle is designed to fill the gap between larger ocean surface vessels and/or moored buoys and subsurface gliders. The size and cost are low enough that these versatile sensor platforms can be deployed easily and in quantity. Since these minibuoys are mobile, they can keep station in currents as large as 25 cm/s or move as an adaptive, coordinated sensor array for high resolution in both time and space. The buoy is about 29 inches (74 cm) long, 16 inches (41 cm) wide (maximum) and weighs about 32 pounds (14.5 kg); hence, it can be deployed easily from small craft. Deployment times are estimated to be one or two days or more, depending on propulsion requirements and battery pack, and longer with solar power. The buoy structure is fiberglass and PVC with two 2 W electric motors. Control is effected by GPS and magnetic heading sensors and a PID scheme to maintain heading and required speed. Communication is via a 900 MHz system with a range of 1 to 2 km, with plans for a longer range HF/VHF or satellite system. The initial sensor system is designed for ocean hyperspectral observations as surface truth for airborne system calibration and validation as well as other ocean color applications. Acoustic, wave, air and water temperature sensors as well as GPS are included. The Mark I prototype has been successfully tested in a pool with manual control of movement.
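
Complementing the heading PID sketched earlier, station-keeping also needs a desired heading computed from GPS fixes. The sketch below uses the standard initial great-circle bearing formula; it is generic navigation math, not code from the paper, and the coordinates are arbitrary.

```python
# Initial great-circle bearing from the buoy's GPS fix to its station point.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

desired_heading = bearing_deg(36.9500, -122.0300, 36.9505, -122.0295)
```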

2 citations


Cited by
Journal ArticleDOI
11 Sep 2014-Sensors
TL;DR: A comprehensive review of the state-of-the-art technologies in the field of marine environment monitoring using wireless sensor networks (WSNs), covering related projects, systems, techniques, approaches and algorithms.
Abstract: With the rapid development of society and the economy, an increasing number of human activities have gradually destroyed the marine environment. Marine environment monitoring is a vital problem and has increasingly attracted a great deal of research and development attention. During the past decade, various marine environment monitoring systems have been developed. The traditional marine environment monitoring system using an oceanographic research vessel is expensive and time-consuming and has a low resolution both in time and space. Wireless Sensor Networks (WSNs) have recently been considered as potentially promising alternatives for monitoring marine environments since they have a number of advantages such as unmanned operation, easy deployment, real-time monitoring, and relatively low cost. This paper provides a comprehensive review of the state-of-the-art technologies in the field of marine environment monitoring using wireless sensor networks. It first describes application areas, a common architecture of WSN-based oceanographic monitoring systems, a general architecture of an oceanographic sensor node, sensing parameters and sensors, and wireless communication technologies. Then, it presents a detailed review of some related projects, systems, techniques, approaches and algorithms. It also discusses challenges and opportunities in the research, development, and deployment of wireless sensor networks for marine environment monitoring.

310 citations

01 Jan 1981
TL;DR: In this article, the authors provide an overview of economic analysis techniques and their applicability to software engineering and management, including the major estimation techniques available, the state of the art in algorithmic cost models, and the outstanding research issues in software cost estimation.
Abstract: This paper summarizes the current state of the art and recent trends in software engineering economics. It provides an overview of economic analysis techniques and their applicability to software engineering and management. It surveys the field of software cost estimation, including the major estimation techniques available, the state of the art in algorithmic cost models, and the outstanding research issues in software cost estimation.
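
A concrete example of the algorithmic cost models this survey covers is Boehm's Basic COCOMO. The sketch below uses the published organic-mode coefficients; the project size is an arbitrary example.

```python
# Basic COCOMO, organic mode (Boehm, 1981): effort and schedule from size.
def basic_cocomo_organic(kloc):
    effort_pm = 2.4 * kloc ** 1.05         # person-months
    schedule_mo = 2.5 * effort_pm ** 0.38  # calendar months
    return effort_pm, schedule_mo

effort, months = basic_cocomo_organic(32)  # a hypothetical 32 KLOC project
print(f"{effort:.1f} person-months over {months:.1f} months")
```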

283 citations

Journal ArticleDOI
TL;DR: In this paper, a review of renewable power sources for remote environmental monitoring is presented, in which the authors evaluate the challenges and potential of renewable energy sources and discuss how to use them to generate reliable power.

179 citations

Journal ArticleDOI
10 Apr 2019-Sensors
TL;DR: A review of the application of the Internet of Things (IoT) in the field of marine environment monitoring is presented, including the potential application of IoT and Big Data in marine environment protection.
Abstract: Marine environment monitoring has attracted more and more attention due to the growing concern about climate change. During the past couple of decades, advanced information and communication technologies have been applied to the development of various marine environment monitoring systems. Among others, the Internet of Things (IoT) has been playing an important role in this area. This paper presents a review of the application of the Internet of Things in the field of marine environment monitoring. New technologies including advanced Big Data analytics and their applications in this area are briefly reviewed. It also discusses key research challenges and opportunities in this area, including the potential application of IoT and Big Data in marine environment protection.

111 citations

Proceedings ArticleDOI
01 Dec 2007
TL;DR: This paper introduces an optimization technique to improve the efficiency of complex processors: using a new metric (μUtilization), the designer can identify infrequently-used functionality which contributes little to performance and then systematically "prune" it from the design.
Abstract: Design complexity is rapidly becoming a limiting factor in the design of modern, high-performance microprocessors. This paper introduces an optimization technique to improve the efficiency of complex processors. Using a new metric (μUtilization), the designer can identify infrequently-used functionality which contributes little to performance and then systematically "prune" it from the design. For cases in which architectural pruning may affect design correctness, previously proposed techniques can be applied to guarantee forward progress. To explore the benefits of architectural pruning, we study a candidate Optimistic-Checker Tandem architecture, which combines a complex Alpha EV6-like out-of-order Optimistic core (with some of the underutilized functionality pruned from its design) with a non-pruned EV5-like in-order Checker core. Our results show that by removing 3% of infrequently used functionality from the optimistic core, an increase in frequency of 25% can be realized. Taking into account the replay overhead triggered by the removed functionality, the Tandem system is still able to achieve a 12% overall speedup.
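
As a toy illustration of the pruning idea (the feature names, counts, and threshold below are invented, not taken from the paper), ranking structures by a utilization metric and flagging rarely used ones might look like this:

```python
# Toy utilization ranking: flag features exercised in <1% of cycles.
cycles = 1_000_000
feature_uses = {                  # hypothetical event counts from simulation
    "int_alu":        950_000,
    "load_queue":     640_000,
    "fp_divider":         900,
    "rare_addr_mode":     120,
}

utilization = {f: n / cycles for f, n in feature_uses.items()}
threshold = 0.01                  # illustrative pruning threshold
candidates = sorted((f for f, u in utilization.items() if u < threshold),
                    key=utilization.get)
print("pruning candidates:", candidates)
```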

109 citations