
Showing papers by "Polytechnic University of Milan published in 1991"


Book
17 Aug 1991
TL;DR: This chapter discusses Conceptual Design, Logical Design, and Design Tools for Database Design, as well as Joint Data and Functional Analysis, and Improving the Quality of a Database Schema.
Abstract: I. CONCEPTUAL DATABASE DESIGN. 1. An Introduction to Database Design. 2. Data Modeling Concepts. 3. Methodologies for Conceptual Design. 4. View Design. 5. View Integration. 6. Improving the Quality of a Database Schema. 7. Schema Documentation and Maintenance. II. FUNCTIONAL ANALYSIS FOR DATABASE DESIGN. 1. Functional Analysis Using the Dataflow Model. 2. Joint Data and Functional Analysis. 3. Case Study. III. LOGICAL DESIGN AND DESIGN TOOLS. 1. High-Level Logical Design Using the Entity-Relationship Model. 2. Logical Design for the Relational Model. 3. Logical Design for the Network Model. 4. Logical Design for the Hierarchical Model. 5. Database Design Tools. Index.

1,018 citations


Journal ArticleDOI
TL;DR: In this paper, the focusing of synthetic-aperture-radar (SAR) data using migration techniques quite similar to those used in geophysics is treated; the algorithm works in the ω-kₓ domain.
Abstract: The focusing of synthetic-aperture-radar (SAR) data using migration techniques quite similar to those used in geophysics is treated. The algorithm presented works in the ω-kₓ domain. Because time delays can be easily accommodated with phase shifts that increase linearly with ω, range migration poses no problem. The algorithm is described in plane geometry first, where range migration and phase history can be exactly matched. The effects of the sphericity of the Earth, of the Earth's rotation, and of the satellite trajectory curvature are taken into account, showing that the theoretically achievable spatial resolution is well within the requirements of present day and near future SAR missions. Terrestrial swaths as wide as 100 km can be focused simultaneously with no serious degradation. The algorithm has been tested with synthetic data, with Seasat-A data, and with airplane data (NASA-AIR). The experimental results fully support the theoretical analysis.
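The plane-geometry core of an ω-kₓ focuser can be sketched in a few lines. The NumPy fragment below is our own minimal illustration, not the authors' implementation: a 2-D FFT, a bulk focusing-phase multiply, and a Stolt-style change of variables by interpolation. Every parameter name (fs, prf, v, fc, r0) is an assumption made for the sketch.

```python
# Minimal omega-k SAR focusing sketch (plane geometry, illustration only).
import numpy as np

def wk_focus(raw, fs, prf, v, fc, r0, c=3e8):
    """raw: complex baseband data, shape (azimuth, range). fs: range sampling
    rate, prf: pulse repetition frequency, v: platform speed, fc: carrier,
    r0: closest-approach range -- all hypothetical parameters."""
    na, nr = raw.shape
    D = np.fft.fftshift(np.fft.fft2(raw), axes=1)        # (k_x, omega) domain
    kx = 2 * np.pi * np.fft.fftfreq(na, d=v / prf)       # azimuth wavenumber
    w = 2 * np.pi * (fc + np.fft.fftshift(np.fft.fftfreq(nr, d=1 / fs)))
    k = 2 * w / c                                        # two-way wavenumber (increasing)
    out = np.empty_like(D)
    for i in range(na):
        kz = np.sqrt(np.maximum(k**2 - kx[i]**2, 0.0))
        D[i] *= np.exp(1j * (kz - k) * r0)               # bulk focusing phase
        # Stolt mapping: the value on the uniform k_z grid is read off at
        # k = sqrt(k_z^2 + k_x^2) (linear interpolation along range frequency).
        out[i] = np.interp(np.sqrt(k**2 + kx[i]**2), k, D[i], left=0, right=0)
    return np.fft.ifft2(np.fft.ifftshift(out, axes=1))
```

The entire range-migration correction is absorbed by the change of variables, which is why time delays reduce to phase shifts linear in ω.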

695 citations


Journal ArticleDOI
TL;DR: In this article, the authors summarize the studies devoted to elucidating the different crystal structures and crystal morphologies of isotactic polypropylene, and the different conditions favouring one or the other of its polymorphs.

455 citations


Journal ArticleDOI
TL;DR: A high-level Petri net formalism, environment/relationship (ER) nets, which can be used to specify control, function, and timing issues, is introduced; time can be modeled via ER nets by providing a suitable axiomatization.
Abstract: The authors introduce a high-level Petri net formalism, environment/relationship (ER) nets, which can be used to specify control, function, and timing issues. In particular, they discuss how time can be modeled via ER nets by providing a suitable axiomatization. They use ER nets to define a time notation that is shown to generalize most time Petri-net-based formalisms which appeared in the literature. They discuss how ER nets can be used in a specification support environment for a time-critical system and, in particular, the kind of analysis supported.
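As a toy rendering of the idea (ours, not the authors' axiomatization): tokens carry environments, i.e. variable-to-value mappings; time is just a distinguished variable chronos; and a transition is a relation over environments whose firings must respect a monotonicity axiom on timestamps. The Python names below (fire, chronos) are invented for the sketch.

```python
# Toy ER-net-style transition: tokens are environments (dicts); the firing
# relation is split into a predicate (enabling) and an action (production).
def fire(tokens, predicate, action):
    """Consume `tokens` if enabled; enforce a simple time axiom: produced
    timestamps never precede the latest consumed one."""
    assert predicate(tokens), "transition not enabled"
    produced = action(tokens)
    t_in = max(tok["chronos"] for tok in tokens)
    assert all(tok["chronos"] >= t_in for tok in produced), "time axiom violated"
    return produced

# Example: a service that completes a job 4 time units after taking it.
out = fire([{"job": 7, "chronos": 10.0}],
           predicate=lambda ts: True,
           action=lambda ts: [{"job": ts[0]["job"],
                               "chronos": ts[0]["chronos"] + 4.0}])
print(out)  # [{'job': 7, 'chronos': 14.0}]
```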

356 citations


Journal ArticleDOI
01 Jul 1991
TL;DR: Constrained receding-horizon predictive control (CRHPC) as mentioned in this paper optimizes a quadratic function over a costing horizon subject to the condition that the output matches the reference value over a further constraint range.
Abstract: Constrained receding-horizon predictive control (CRHPC) is intended for demanding control applications where conventional predictive control designs can fail. The idea behind CRHPC is to optimise a quadratic function over a 'costing horizon' subject to the condition that the output matches the reference value over a further constraint range. Theorems show that the method stabilises general linear plants (e.g. unstable, nonminimum-phase, dead-time). Simulation studies demonstrate good behaviour with even nearly unobservable systems (where generalised predictive control is ineffective) and that control-costing is a particularly effective tuning parameter.
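Stripped of plant details, each CRHPC step is an equality-constrained quadratic program. The sketch below is our illustration, not the paper's notation: the matrix names G and f and the horizon split are assumptions. It solves the QP through the KKT system and applies only the first move, in receding-horizon fashion.

```python
# CRHPC-flavoured move computation: quadratic cost over the costing horizon,
# exact output-matching constraints over the final m predicted samples.
import numpy as np

def crhpc_move(G, f, r, lam, m):
    """G: prediction matrix (N + m rows, Nu input moves), f: free response,
    r: reference over all N + m predicted samples; last m rows constrained."""
    Gy, fy, ry = G[:-m], f[:-m], r[:-m]          # costed range
    Gc, fc, rc = G[-m:], f[-m:], r[-m:]          # constrained range
    nu = G.shape[1]
    H = Gy.T @ Gy + lam * np.eye(nu)             # Hessian (control weighting lam)
    g = Gy.T @ (ry - fy)
    # KKT system for: min u'Hu/2 - g'u  subject to  Gc u = rc - fc
    K = np.block([[H, Gc.T], [Gc, np.zeros((m, m))]])
    u = np.linalg.solve(K, np.concatenate([g, rc - fc]))[:nu]
    return u[0]                                  # apply the first move only
```

The terminal equality constraints are what give the stability theorems their bite; the control weighting lam is the tuning knob the simulation studies single out.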

330 citations


Journal ArticleDOI
TL;DR: In this article, a technique for measuring the release of minority carriers emitted from deep levels in avalanche photodiodes (APDs) at operating conditions is discussed; it can be useful in tailoring gettering processes for APDs and in studies of traps at high electric fields.
Abstract: A technique for measuring the release of minority carriers emitted from deep levels in avalanche photodiodes (APDs) at operating conditions is discussed. The method, time-correlated carrier counting (TCCC), is very sensitive and accurate. Densities of filled traps were measured down to 10⁹ cm⁻³ and lifetimes in the nanosecond range. This technique can be useful in tailoring gettering processes for APDs and in studies of traps at high electric fields.

280 citations


Proceedings ArticleDOI
19 Jun 1991
TL;DR: In this paper, a tool based on mathematical group theory for the synthesis of new parallel-structure robots is presented; by the kinematic principle of intersection of displacement subgroups, a family of 3-degree-of-freedom robots for pure spatial translation movements is conceived.
Abstract: Presents a tool, based on mathematical group theory, for the synthesis of new parallel-structure robots. By the kinematic principle of intersection of displacement subgroups, a family of 3-degree-of-freedom robots for pure spatial translation movements is conceived. One of the many possible implementations is also given as an example.

233 citations



Journal ArticleDOI
TL;DR: A framework useful for designing a performance measurement system which is consistent with time‐based principles and can support managers both in strategic and in operating decisions is suggested.
Abstract: An increasing number of firms are planning to become “time‐based companies”, that is to consider time as the main issue of their manufacturing strategy. However, such change in attitude, in order to be effective, must be supported by a performance measurement system focused on time. This article suggests a framework useful for designing a performance measurement system which is consistent with time‐based principles and can support managers both in strategic and in operating decisions. The framework takes into account the different ways through which a company can use time to create a competitive advantage and considers the main activities that are critical for achieving such results. Hence, a “minimum set of measures”, consistent with the information requirements of each company, is determined.

153 citations


Proceedings ArticleDOI
01 Sep 1991
TL;DR: HDM, as presented in this paper, is a design model for hypertext applications; an HDM design can be translated, either manually or through a compiler, into a node-and-link model "a la Dexter" and targeted on standard hypertext implementation tools.
Abstract: We present the latest developments of HDM, a design model for hypertext applications. The basic features of HDM are the representation of applications through several design primitives: typed entities composed of hierarchies of components; different perspectives for each component; units corresponding to component-perspective pairs; bodies representing the actual content of the units; structural links, binding together components or sub-entities of the same entity; typed application links, interconnecting components belonging to different entities; and a specific browsing semantics based on anchors, as a way to activate many different link types from within a unit. The development of HDM is part of the HYTEA project, carried on by a European consortium, aiming at the development of a set of authoring tools for an "engineered" development of hypertext-hypermedia applications. A HYTEA application is made by an HDM schema and an HDM hyperbase (i.e., a set of instances). The basic HDM has already been shown to be translatable, either manually or through a compiler, into a node-and-link model ("a la Dexter model"); the translated application can be targeted on several implementation tools (i.e., standard hypertext tools already available on the market). HDM has already been used to develop a small number of applications, and to describe preexisting applications. These experiments have shown the need for improvements that are discussed in the paper: aggregate entities; sharing of components; is-a relationships and inheritance between entity types; sharing of bodies; structured access and "guided tours"; use of active media (animations and video clips).
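One way to make the primitive inventory concrete is a plain data-structure reading. The Python dataclasses below are our own hypothetical rendering of the vocabulary in the abstract (entities, components, perspectives, units, bodies, links), not the HYTEA tools' schema.

```python
# Hypothetical data-structure reading of the HDM primitives (illustration).
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Unit:
    perspective: str                     # e.g. "text", "video"
    body: str                            # the actual content of the unit

@dataclass
class Component:
    name: str
    units: Dict[str, Unit] = field(default_factory=dict)       # one per perspective
    children: List["Component"] = field(default_factory=list)  # structural links

@dataclass
class Entity:
    entity_type: str                     # typed entities
    root: Component                      # hierarchy of components
    app_links: List[Tuple[str, "Entity"]] = field(default_factory=list)  # typed links
```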

137 citations


Journal ArticleDOI
TL;DR: Theoretical and experimental analyses of the electromagnetic transient following the out-of-phase synchronization of a three-phase five-limb step-up transformer are presented in this article.
Abstract: Theoretical and experimental analyses are presented of the electromagnetic transient following the out-of-phase synchronization of a three-phase five-limb step-up transformer; this means an abnormal condition where the angle between phasors representing the generated voltages and those representing the power network voltages at the instant of closure of the connecting circuit breaker is not near zero, as normal, but may be as much as 180 degrees (phase opposition). When this happens, the peak values of the transient currents in the windings of the transformer might be sensibly higher than those of the failure currents estimated in a conventional way; in addition, they correspond to unbalanced magnetomotive forces (MMF) in the primary and secondary windings of each phase of the machine. The currents and fluxes during the transient are computed by a nonconventional circuital nonlinear model of the transformer simulated by the electromagnetic transient program (EMTP). The results of an experimental validation made on a specially built 100 kVA three-phase five-limb transformer are also reported.

Book
01 Jan 1991
TL;DR: This book provides selective, in-depth coverage of the fundamentals of software engineering by stressing principles and methods through rigorous formal and informal approaches to enable readers to respond to the rapid changes in technology that are common today.
Abstract: From the Publisher: This book provides selective, in-depth coverage of the fundamentals of software engineering by stressing principles and methods through rigorous formal and informal approaches. In contrast to other books which are based on the lifecycle model of software development, the authors emphasize identifying and applying fundamental principles that are applicable throughout the software lifecycle. This emphasis enables readers to respond to the rapid changes in technology that are common today. Principles and techniques are emphasized rather than specific tools—users learn why particular techniques should or should not be used. Understanding the principles and techniques on which tools are based makes mastering a variety of specific tools easier. The authors discuss principles such as design, specification, verification, production, management and tools. Now coverage includes: more detailed analysis and explanation of object-oriented techniques; the use of Unified Modeling Language (UML); requirements analysis and software architecture; Model checking—a technique that provides automatic support to the human activity of software verification; GQM—used to evaluate software quality and help improve the software process; Z specification language. For software engineers.

Book ChapterDOI
01 Jan 1991
TL;DR: The theoretical and empirical literature on the relationships between space and technological change is immense and scattered along several directions: the theory of innovation diffusion; the spatial preconditions for (and obstacles to) innovation (presence of human capital, availability of producer services, urban environment, industrial structure); and the characteristics of innovative environments (valleys, corridors, routes, parks; the 'Third Italy' phenomenon; the 'milieux innovateurs' of the new GREMI approach).
Abstract: The theoretical and empirical literature on the relationships between space and technological change is literally immense, and scattered along different directions that may be listed tentatively as follows:
- the theory of innovation diffusion
- the spatial geography of R&D
- the spatial preconditions for (and obstacles to) innovation: presence of human capital, availability of producer services, 'urban' environment, industrial structure
- the characteristics of innovative environments: valleys, corridors, routes, parks; the 'Third Italy' phenomenon; the 'milieux innovateurs' of the new GREMI approach (see below)
- the regional differentials in productivity growth
- the effects of technological change on regional development
- the effects of technological change on urban development
- the spatial effects of specific technologies: industrial automation, information technologies, telecommunications,…

Journal ArticleDOI
TL;DR: In this paper, the performance of a large-area silicon drift detector (∼4 × 4 cm²) designed for high-resolution tracking in the experiment UA6 at the CERN p̄p collider is investigated.
Abstract: This report presents results on the performance of a large-area silicon drift detector (∼4 × 4 cm²), which has been designed for use as a high-resolution tracking device in the experiment UA6 at the CERN p̄p collider. We give here the basic characteristics of the design, and report the first experimental results. The influence, on the detector's performance, of the adopted design criteria and of the quality of the semiconductors has been experimentally determined and is discussed. Results of the first drift-time calibration using an on-board device for charge injection are also given.

Proceedings ArticleDOI
11 Jun 1991
TL;DR: Two different approaches to nonlinearity simplification in neural nets are presented: a first solution yielding a very simple architecture, but involving discontinuous functions, and a second solution, slightly more complex, but based on a continuous function.
Abstract: Two different approaches to nonlinearity simplification in neural nets are presented. Both solutions are based on approximation of the sigmoidal mapper often used in neural networks (extensions are being considered to allow approximation of a more general class of functions). In particular, a first solution yielding a very simple architecture, but involving discontinuous functions, is presented; a second solution, slightly more complex, but based on a continuous function, is then presented. This second solution has been successfully used in conjunction with the classical generalized delta rule algorithm.
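The paper's exact approximations are not reproduced here, but the two flavours it contrasts are easy to illustrate: a discontinuous (staircase) replacement, cheap in hardware, versus a continuous piecewise-linear one that still cooperates with gradient-style rules such as the generalized delta rule. The functions below are generic stand-ins of our own.

```python
# Generic illustrations of discontinuous vs. continuous sigmoid replacements.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def staircase(x, levels=8, span=4.0):
    """Discontinuous: quantise the sigmoid output to a few levels."""
    return np.round(sigmoid(np.clip(x, -span, span)) * levels) / levels

def piecewise_linear(x, span=4.0):
    """Continuous: a clamped ramp; differentiable almost everywhere, so it
    still works with the generalized delta rule."""
    return np.clip(0.5 + x / (2.0 * span), 0.0, 1.0)

x = np.linspace(-6, 6, 25)
print(np.max(np.abs(sigmoid(x) - staircase(x))),        # worst-case gaps
      np.max(np.abs(sigmoid(x) - piecewise_linear(x))))
```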

Journal ArticleDOI
TL;DR: In this paper, a new identification method, based on standard recursive least squares, was proposed for the estimation of time delay in linear sampled systems. It relies on the fact that if the continuous-time model has a delay or anticipation shorter than one sampling time, then a real negative zero arises in the corresponding sampled system; by inspection of the phase contribution of this zero, the value of the delay is recursively updated.
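The fact the method exploits is easy to check numerically. In the sketch below (our construction, not the paper's algorithm), a first-order lag with a fractional input delay tau < T is discretized under zero-order hold; a real negative zero appears, and inverting its location recovers tau exactly in the noiseless case.

```python
# Fractional delay -> real negative zero in the sampled model (illustration).
import numpy as np

a, T, tau = 1.0, 0.1, 0.03           # lag rate, sampling time, true delay
p = np.exp(-a * T)                    # discrete pole
c0 = 1 - np.exp(-a * (T - tau))       # ZOH coefficient of u(k)
c1 = np.exp(-a * (T - tau)) - p       # ZOH coefficient of u(k-1)
z0 = -c1 / c0                         # the real negative zero
# Invert the zero location back into a delay estimate:
q = (p - z0) / (1 - z0)
tau_hat = T + np.log(q) / a
print(z0, tau_hat)                    # tau_hat == 0.03 in the noise-free case
```

In the identification setting, c0 and c1 would come from a recursive-least-squares fit of the sampled model rather than from the closed-form expressions used here.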

Journal ArticleDOI
TL;DR: The rules which allow the transformation of a general E3VPC expression to a Canonical Form, which can be manipulated using traditional two-valued predicate calculus, are given; in this way, problems like equivalence analysis of SQL queries are completely solved.
Abstract: The semantics of SQL queries is formally defined by stating a set of rules that determine a syntax-driven translation of an SQL query to a formal model. The target model, called Extended Three Valued Predicate Calculus (E3VPC), is largely based on a set of well-known mathematical concepts. The rules which allow the transformation of a general E3VPC expression to a Canonical Form, which can be manipulated using traditional, two-valued predicate calculus are also given; in this way, problems like equivalence analysis of SQL queries are completely solved. Finally, the fact that reasoning about the equivalence of SQL queries using two-valued predicate calculus, without taking care of the real SQL semantics can lead to errors is shown, and the reasons for this are analyzed.
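The pitfall the last sentence points at comes from SQL's three-valued logic, where comparisons with NULL evaluate to unknown. Below is a minimal truth-table sketch, with None standing for unknown; it is our own illustration, not the E3VPC formalism itself.

```python
# SQL-style three-valued connectives, with None as "unknown".
def and3(p, q):
    if p is False or q is False: return False
    if p is None or q is None:   return None
    return True

def or3(p, q):
    if p is True or q is True:   return True
    if p is None or q is None:   return None
    return False

def not3(p):
    return None if p is None else (not p)

# With x NULL, both (x = 1) and (x <> 1) are unknown, so their disjunction
# is unknown, not true: the law of excluded middle fails, and a rewrite that
# assumes it does not preserve SQL query equivalence.
eq, neq = None, None
print(or3(eq, neq), not3(None), and3(True, None))   # -> None None None
```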

Journal ArticleDOI
TL;DR: The required bandwidth per source in a finite-buffer multiplexer to achieve a given grade of service (GOS) in ATM networks is obtained, for bursty as well as variable-bit-rate (VBR) traffic.
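For a feel of what "required bandwidth per source" means, a much cruder, bufferless stand-in for the paper's finite-buffer analysis can be written down: size the link so that the probability of the aggregate rate of N on-off sources exceeding capacity stays below the target GOS, under a Gaussian approximation. Everything below (function name, parameters, numbers) is illustrative.

```python
# Bufferless Gaussian sizing for N homogeneous on-off sources (illustration).
from statistics import NormalDist

def bandwidth_per_source(n, peak, activity, gos):
    """peak: source peak rate (bit/s); activity: fraction of time on;
    gos: target overflow probability."""
    mean = n * peak * activity
    std = (n * peak**2 * activity * (1 - activity)) ** 0.5
    capacity = mean + NormalDist().inv_cdf(1 - gos) * std
    return capacity / n

print(bandwidth_per_source(n=100, peak=10e6, activity=0.1, gos=1e-6))
```

A finite buffer absorbs part of the burstiness, so per-source figures from a finite-buffer analysis sit between the mean rate and this bufferless bound.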

Journal ArticleDOI
TL;DR: Passive and active detection techniques have been employed in order to measure neutron fluence rates and corresponding exposure rates around medical electron accelerators operating at energies well above the neutron binding energies of the structural materials.
Abstract: Passive and active detection techniques have been employed in order to measure neutron fluence rates and corresponding exposure rates around medical electron accelerators operating at energies well above the neutron binding energies of the structural materials. In these conditions a fast neutron flux emerges from the treatment head, both in the direct photon flux and from the shielded region; it is partly absorbed and partly scattered by the walls, eventually establishing a nearly uniform thermal and epithermal flux in the room. Both direct and scattered flux contribute to the dose to the patient. A smaller neutron dose rate can also be found outside the treatment room, where the therapy staff works. Passive detectors, of moderation type, have been employed in the treatment room and ³He active detectors in the external zones. For the treatment room the activation data were compared with results of Monte Carlo simulation of the neutron transport in the room. Technical features of the two measures are briefly presented and results obtained around three different types of accelerators are reported. At the higher beam energies, i.e., 25 MV, a neutron dose of 0.36 Sv was estimated in the treatment field in addition to a therapeutic x-ray dose of 50 Gy. At lower energies or out of the treatment field the neutron dose drops significantly. In the external zones the dose rates everywhere are below the recommended limits and normally very low, the highest values being recorded in positions very close to the access door of the treatment room.

Journal ArticleDOI
TL;DR: The ability of this technique to identify and quantify individual fluorophores within granules may provide an important insight into the origin and development of lipofuscin within the retinal pigment epithelium and ultimately into the mechanisms of age‐related retinal diseases.
Abstract: The photophysical properties of purified populations of melanin and lipofuscin granules from human retinal pigment epithelium, and their changes with donor age, have been investigated using high-sensitivity time-resolved fluorescence spectroscopy techniques with picosecond gating capabilities. The overall fluorescence intensity of both melanin and lipofuscin granules clearly increased with increasing donor age, the increase being most marked for melanin. In all granule populations the fluorescence decays were multiexponential with subnanosecond and nanosecond decay components. The resultant time-integrated and time-gated spectra also exhibited marked age-variations for each type of granule. Young melanin showed spectral patterns similar to those of bovine melanin, while a yellow-orange fluorescence band appeared in melanin samples from older age groups. Lipofuscin granules exhibited a blue, a yellow and an orange band whose relative amounts were age-related. The results demonstrate the potential of time-resolved techniques for discriminating fluorophores in vitro and in situ, and have confirmed results previously obtained using extraction techniques. Furthermore, the ability of this technique to identify and quantify individual fluorophores within granules may provide an important insight into the origin and development of lipofuscin within the retinal pigment epithelium and ultimately into the mechanisms of age-related retinal diseases.

Journal ArticleDOI
TL;DR: An approximation model for the computation of the electric fields produced in the brain tissues by magnetic stimulation is presented and it is predicted that one of the major drawbacks of the technique can be partially overcome by more effective coil positioning and/or assembly.
Abstract: An approximation model for the computation of the electric fields produced in the brain tissues by magnetic stimulation is presented. Results are given in terms of induced electric field and current density caused by coils of different radii and locations. Nontraditional coil locations and assemblies are also considered (multicoil arrangements). Model simulations show that a good control of the excitation spread can be achieved by proper positioning of the coil. It is also predicted that one of the major drawbacks of the technique (i.e., the poor ability to concentrate the current spread into a small brain area) can be partially overcome by more effective coil positioning and/or assembly. Some comparisons are made among the results obtained from electric and magnetic stimulation. This is thought to be helpful in the design of experiments aimed at understanding the relative role of different brain structures responsible for the motor response.
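The primary (free-space) term of such a model is straightforward: the coil's changing current induces E = -dA/dt, with A the magnetostatic vector potential of the loop. The sketch below computes only this primary term for a single circular coil; tissue boundaries and secondary charge effects, which the authors' model must handle, are omitted, and all numbers are illustrative.

```python
# Primary induced E-field of a circular coil, E = -dA/dt (illustration only).
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability (H/m)

def primary_E(point, coil_radius, dIdt, segments=360):
    """E (V/m) at `point` (x, y, z in metres) for a coil in the z = 0 plane
    centred at the origin; dIdt is the current slew rate (A/s)."""
    theta = np.linspace(0.0, 2 * np.pi, segments, endpoint=False)
    pts = np.stack([coil_radius * np.cos(theta),
                    coil_radius * np.sin(theta),
                    np.zeros_like(theta)], axis=1)
    dl = np.roll(pts, -1, axis=0) - pts            # segment vectors
    mid = pts + dl / 2                             # segment midpoints
    dist = np.linalg.norm(np.asarray(point) - mid, axis=1, keepdims=True)
    A_per_I = MU0 / (4 * np.pi) * np.sum(dl / dist, axis=0)
    return -A_per_I * dIdt                         # E = -dA/dt

# 5 cm coil, 1e8 A/s slew, field point 3 cm off-axis and 2 cm deep:
print(primary_E([0.03, 0.0, 0.02], coil_radius=0.05, dIdt=1e8))
```

On the coil axis the primary field vanishes by symmetry, which is one reason coil positioning matters so much for focality.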

Journal ArticleDOI
TL;DR: In this article, the authors propose to decompose the measured field into monochromatic plane waves of appropriate amplitudes, wavelengths and propagation directions; the backpropagation of these plane waves, and their recombination at the time when the sounding pulse was emitted, produces a map of the backscattered electromagnetic field.
Abstract: Synthetic aperture radar (SAR) data focusing has been traditionally performed using matched filter techniques. However, downward continuation techniques can produce, with a great computational efficiency, results that only the most sophisticated conventional techniques can achieve. The basic idea is to decompose the measured field into monochromatic plane waves of appropriate amplitudes, wavelengths and propagation directions. The backpropagation of these plane waves, and their recombination at the time when the sounding pulse was emitted, can produce a map of the backscattered electromagnetic field. In order to obtain correct focusing of synthetic aperture radar raw data, both the geometrical and the transmission parameters of the system should be known as precisely as possible. The transmission parameters are generally known very precisely, whereas the geometrical ones (i.e. sensor-target relative position, satellite velocity and attitude, etc.) can be derived from the ephemerides of the satellite.
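In symbols (notation ours, a standard statement of the downward-continuation idea): the field recorded along the aperture is decomposed into plane waves, continued down to depth z, and imaged at the emission time t = 0; the factor 4 in the dispersion relation reflects the two-way travel path.

```latex
\[
  P(x, z; \omega) \;=\; \frac{1}{2\pi}\int
      \tilde{P}(k_x, 0; \omega)\;
      e^{-\mathrm{j}\,k_z z}\, e^{\,\mathrm{j}\,k_x x}\,\mathrm{d}k_x,
  \qquad
  k_z \;=\; \sqrt{\frac{4\omega^2}{c^2} - k_x^2},
\]
\[
  \mathrm{img}(x, z) \;=\; \frac{1}{2\pi}\int P(x, z; \omega)\,\mathrm{d}\omega .
\]
```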

Journal ArticleDOI
TL;DR: A technique, based on symbolic execution, and an environment supporting the specialization of generalized software components are described; specialization transforms a generalized software component into a more specific and more efficient component, improving software reuse.
Abstract: A technique and an environment supporting the specialization of generalized software components are described. The technique is based on symbolic execution. It allows one to transform a generalized software component into a more specific and more efficient component. Specialization is proposed as a technique that improves software reuse. The idea is that a library of generalized components exists and the environment supports a designer in customizing a generalized component when the need arises for reusing it under more restricted conditions. It is also justified as a reengineering technique that helps optimize a program during maintenance. Specialization is supported by an interactive environment that provides several transformation tools: a symbolic executor/simplifier, an optimizer, and a loop refolder. The conceptual basis for these transformation techniques is described, examples of their application are given, and how they cooperate in a prototype environment for the Ada programming language is outlined.
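The flavour of the transformation is easy to show, even though the paper works on Ada components with a symbolic executor rather than the hand-specialized Python toy below (our own example): fixing one input of a generalized component lets loops and tests unfold away, leaving a more specific, more efficient component.

```python
# Generalized component: square-and-multiply power for any n >= 0.
def power_general(x, n):
    result = 1
    while n > 0:
        if n % 2:
            result *= x
        x *= x
        n //= 2
    return result

# The component a specializer could derive for the restricted case n = 3:
# the loop and the parity tests have been executed away at transformation time.
def power_cubed(x):
    return x * (x * x)

assert power_general(5, 3) == power_cubed(5) == 125
```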

Journal ArticleDOI
TL;DR: In this article, the output stabilization problem for discrete-time linear periodic systems is solved by resorting to a time-invariant reformulation, and the stabilizing controller is constituted by a control law and an asymptotic state predictor.
Abstract: The output stabilization problem for discrete-time linear periodic systems is solved. Both the state-feedback control law and the state-predictor are based on a suitable time-invariant state-sampled reformulation associated with a periodic system. Preliminary concepts of periodic system theory are briefly recalled. In particular, the structural properties of a linear discrete-time periodic system are properly related to those of a time-invariant system associated with it. By resorting to such a time-invariant reformulation, the output stabilization problem via pole placement is solved. The stabilizing controller is constituted by a control law and an asymptotic state predictor, both of which are shown to be periodic.
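The core of the time-invariant reformulation can be sketched in a few lines: sampling the state once per period turns the periodic dynamics into a time-invariant system governed by the monodromy matrix, so stability (and pole placement) can be read off its eigenvalues, the characteristic multipliers. A minimal sketch, with an invented period-2 example:

```python
# Lifting a discrete-time periodic system x(k+1) = A(k) x(k) over one period.
import numpy as np

def monodromy(A_seq):
    """A_seq: state matrices A(0), ..., A(T-1) over one period T."""
    Psi = np.eye(A_seq[0].shape[0])
    for Ak in A_seq:
        Psi = Ak @ Psi               # Psi = A(T-1) ... A(1) A(0)
    return Psi

A = [np.array([[0.0, 1.0], [-0.5, 0.8]]),    # A(0)  (illustrative numbers)
     np.array([[0.3, 0.0], [0.1, 0.9]])]     # A(1)
# The periodic system is asymptotically stable iff all characteristic
# multipliers lie strictly inside the unit disc:
print(np.abs(np.linalg.eigvals(monodromy(A))))
```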

Book ChapterDOI
16 Dec 1991
TL;DR: G-Log is introduced, a declarative graphical query language which combines the expressive power of logic, the modelling power of object-orientedness and the representation power of graphs; it is proved that G-Log is a graphical equivalent of the first-order predicate calculus.
Abstract: In this paper we introduce G-Log, a declarative graphical query language which combines the expressive power of logic, the modelling power of object-orientedness and the representation power of graphs. As in the case of Prolog, G-Log may be used in a totally declarative way, as well as in a "more procedural" way. Furthermore, it provides an intuitive and flexible graphical tool for non-expert database users. We prove that G-Log is a graphical equivalent of the first-order predicate calculus. Finally, we study its features as a non-deterministic language and compare it with other existing non-deterministic languages.

Journal ArticleDOI
TL;DR: In this article, the results of an experimental research on the drilling of aramid fiber reinforced plastics are reported, where a drilling machine has been instrumented in order to detect the thrust force and the torque as a function of the depth of the hole.
Abstract: The results of an experimental research on the drilling of aramid fibre reinforced plastics are reported. A drilling machine has been instrumented in order to detect the thrust force and the torque as a function of the depth of the hole. The signals of the transducers were acquired by an analog to digital converter card inserted in a personal computer. The apparatus enabled analysis of the cutting mechanism of commercially available twist drills intended for the drilling of aramid fibre reinforced plastics. Relationships have been evidenced between the trends of the thrust force and the torque with regard to the tool bit geometry and the composite structure, for different feed rates and for different composite constructions. The thrust force shows irregular courses which are discussed in terms of non-uniform distribution of the thrust force along the tool cutting edges and of poor interlaminar strength of composites. A continuous decrease of the thrust force mean value has also been found which has been attributed to the heat build-up at the cutting front. The torque is strongly influenced by friction at the lands of the twist drill.

Proceedings ArticleDOI
03 Jun 1991
TL;DR: In this paper, the application of the ω-k technique to spot-mode SAR focusing is analyzed, where the variation of the antenna beam pointing angle is controlled by the system, and a trivial but efficient solution for the spectral unfolding is proposed.
Abstract: In this paper the application of the ω-k technique to spot-mode SAR focusing is analyzed. Spot-light mode SAR can be regarded as a special case of a time-varying Doppler centroid system. In this case, the variation of the antenna beam pointing angle is controlled by the system. During a relatively long time, the antenna beam is pointed to the same ground position and the Doppler centroid varies almost linearly with the platform position. As for the general time-varying situation, spectral folding will be generated by the limited sampling frequency. In the spot-mode case, a trivial but efficient solution for the spectral unfolding is proposed. Results obtained with simulated spot-light data are presented.

Journal ArticleDOI
TL;DR: In this article, it was shown that the steady Boltzmann equation in a slab [0,a] has solutionsx→μ¯¯ x ∼ 0,a such that the ingoing boundary measuresμ0∣{ξ>0} andμワンα∣ {ξ<0} can be prescribed a priori, and the collision kernel is truncated such that particles with small component of the velocity have a reduced collision rate.
Abstract: It is shown that the steady Boltzmann equation in a slab [0,a] has solutionsx→μ x such that the ingoing boundary measuresμ 0∣{ξ>0} andμ α∣{ξ<0} can be prescribed a priori. The collision kernel is truncated such that particles with smallx-component of the velocity have a reduced collision rate.
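In symbols (notation ours), the boundary-value problem reads:

```latex
\[
  \xi\,\partial_x f(x,\xi) \;=\; Q(f,f)(x,\xi), \qquad x \in [0,a],
\]
\[
  f(0,\xi) = f_0(\xi)\ \ (\xi > 0), \qquad f(a,\xi) = f_a(\xi)\ \ (\xi < 0),
\]
```

with the collision operator Q truncated so that particles whose velocity component ξ along the slab axis is small collide at a reduced rate.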

Journal ArticleDOI
TL;DR: In this paper, the trasport properties of water through polyether-polyurethane membranes were evaluated by absorption and permeability measurements by FTIR, and the state of water and the active absorption centres of the polymer were studied by the FTIR.

Book ChapterDOI
21 Oct 1991
TL;DR: The rationale of ASTRAL's design is discussed, it is shown how the language builds on previous language experiments, and the specification style is illustrated by a case study taken from telephony.
Abstract: ASTRAL is a formal specification language for real-time systems. This paper discusses the rationale of ASTRAL's design and shows how the language builds on previous language experiments. ASTRAL is intended to support formal software development; therefore, the language itself has been formally defined. ASTRAL's specification style is illustrated by discussing a case study taken from telephony.