
Showing papers by "Polytechnic University of Milan published in 1999"


Journal ArticleDOI
TL;DR: In the 21st century, manufacturing companies must possess new types of manufacturing systems that are cost-effective and highly responsive to market changes; such systems are the cornerstones of the new manufacturing paradigm.

1,706 citations


Journal ArticleDOI
TL;DR: In this paper, a three-dimensional finite deformation cohesive element and a class of irreversible cohesive laws are proposed to track dynamic growing cracks in a drop-weight dynamic fracture test.
Abstract: We develop a three-dimensional finite-deformation cohesive element and a class of irreversible cohesive laws which enable the accurate and efficient tracking of dynamically growing cracks. The cohesive element governs the separation of the crack flanks in accordance with an irreversible cohesive law, eventually leading to the formation of free surfaces, and is compatible with a conventional finite element discretization of the bulk material. The versatility and predictive ability of the method are demonstrated through the simulation of a drop-weight dynamic fracture test similar to those reported by Zehnder and Rosakis [1]. The ability of the method to approximate the experimentally observed crack-tip trajectory is particularly noteworthy. Copyright © 1999 John Wiley & Sons, Ltd.
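As an illustration, a linearly softening traction-separation envelope with unloading to the origin is a common member of this class of irreversible cohesive laws (a generic sketch, not necessarily the exact envelope used in the paper):

```latex
% Generic irreversible cohesive law (illustrative):
% traction t across the crack flanks vs. opening \delta,
% with memory of the largest opening reached, \delta_{\max}.
t(\delta) =
\begin{cases}
\sigma_c\left(1 - \dfrac{\delta}{\delta_c}\right), & \delta = \delta_{\max}\ \text{(loading)},\\[6pt]
t(\delta_{\max})\,\dfrac{\delta}{\delta_{\max}}, & \delta < \delta_{\max}\ \text{(unloading/reloading)},
\end{cases}
```

Here σ_c is the cohesive strength and δ_c the critical opening at which the traction vanishes and a free surface forms; irreversibility enters through the monotonically increasing internal variable δ_max.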

1,375 citations


Proceedings ArticleDOI
TL;DR: Presents results obtained using 45 ERS SAR images gathered over the Italian town of Camaiore, within a time span of more than 6 years and a normal-baseline range of more than 2000 m.
Abstract: Differential SAR interferometry measurements provide a unique tool for low-cost, large-coverage surface deformation monitoring. Limitations are essentially due to temporal decorrelation and atmospheric inhomogeneities. Though temporal decorrelation and atmospheric disturbances strongly affect interferogram quality, reliable deformation measurements can be obtained in a multi-image framework on a small subset of image pixels, corresponding to stable areas. These points, hereafter called Permanent Scatterers, can be used as a 'natural GPS network' to monitor terrain motion, analyzing the phase history of each one. In this paper, results obtained using 45 ERS SAR images gathered over the Italian town of Camaiore (within a time span of more than 6 years and a normal-baseline range of more than 2000 m) are presented. The area is of high geophysical interest because it is known to be unstable. A subterranean cavity collapsed in October 1995, causing the ruin of several houses. Time series analysis of the phase values showed the presence of precursors three months before the collapse. © 1999 SPIE, The International Society for Optical Engineering.
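For orientation, the conversion from a differential interferometric phase change at a Permanent Scatterer to line-of-sight displacement can be sketched as follows; the ERS C-band wavelength value and the sign convention are assumptions, not taken from the paper:

```python
import math

# ERS C-band wavelength (~5.66 cm); an assumption, not stated in the abstract.
WAVELENGTH_M = 0.0566

def phase_to_los_displacement(delta_phi_rad: float) -> float:
    """Convert a differential interferometric phase change (radians)
    into line-of-sight displacement (metres).

    A two-way path change of one wavelength produces a 4*pi phase shift,
    hence the lambda/(4*pi) scaling.
    """
    return -WAVELENGTH_M / (4.0 * math.pi) * delta_phi_rad

# One full fringe (2*pi of phase) corresponds to lambda/2 ~ 2.83 cm of motion.
fringe = phase_to_los_displacement(-2.0 * math.pi)
```

Tracking this quantity across the 45 acquisitions, pixel by pixel, is what turns stable scatterers into a deformation time series.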

1,320 citations


Proceedings ArticleDOI
28 Jun 1999
TL;DR: In this article, the Permanent Scatterers (PS) were used as a "natural GPS network" to monitor terrain motion, analysing the phase history of each one, and results obtained using 34 ERS SAR images gathered over the Italian city of Ancona are presented.
Abstract: Differential SAR interferometry (DInSAR) is a unique tool for low-cost, large-coverage surface deformation monitoring. As is well known, the technique involves interferometric phase comparison of SAR images gathered at different times and has the potential to provide millimetric accuracy. Though temporal decorrelation and atmospheric inhomogeneities strongly affect interferogram quality, reliable deformation measurements can be obtained in a multi-image framework on a small subset of image pixels, corresponding to stable areas. These points, hereafter called Permanent Scatterers (PS), can be used as a "natural GPS network" to monitor terrain motion, analysing the phase history of each one. In this paper, results obtained using 34 ERS SAR images gathered over the Italian city of Ancona are presented.

517 citations


Journal ArticleDOI
01 Oct 1999
TL;DR: This paper describes a specification-based method for constructing a suite of test sequences, where a test sequence is a sequence of inputs and outputs for testing a software implementation.
Abstract: Recently, many formal methods, such as the SCR (Software Cost Reduction) requirements method, have been proposed for improving the quality of software specifications. Although improved specifications are valuable, the ultimate objective of software development is to produce software that satisfies its requirements. To evaluate the correctness of a software implementation, one can apply black-box testing to determine whether the implementation, given a sequence of system inputs, produces the correct system outputs. This paper describes a specification-based method for constructing a suite of test sequences, where a test sequence is a sequence of inputs and outputs for testing a software implementation. The test sequences are derived from a tabular SCR requirements specification containing diverse data types, i.e., integer, boolean, and enumerated types. From the functions defined in the SCR specification, the method forms a collection of predicates called branches, which “cover” all possible software behaviors described by the specification. Based on these predicates, the method then derives a suite of test sequences by using a model checker's ability to construct counterexamples. The paper presents the results of applying our method to four specifications, including a sizable component of a contractor specification of a real system.
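The paper's core idea, using a model checker's counterexamples to turn branch predicates into test sequences, can be caricatured on a toy mode table; the specification and the exhaustive search below are hypothetical stand-ins for the SCR tabular spec and the model checker, not the actual toolset:

```python
from itertools import product

# Toy SCR-style mode table with one input "x" in {0, 1, 2}.
# Hypothetical spec, only to illustrate the counterexample idea.
def step(mode, x):
    if mode == "off" and x == 2:
        return "on"
    if mode == "on" and x == 0:
        return "off"
    return mode

# Branch predicates the test suite must cover.
branches = [
    lambda m, x: m == "off" and x == 2,   # turn-on transition
    lambda m, x: m == "on" and x == 0,    # turn-off transition
    lambda m, x: m == "on" and x == 1,    # on, no change
]

def find_test_sequence(branch, max_len=4):
    """Exhaustively search for an input sequence whose last step
    exercises `branch` -- a stand-in for a model checker producing a
    counterexample to the claim that the branch is unreachable."""
    for n in range(1, max_len + 1):
        for seq in product([0, 1, 2], repeat=n):
            mode = "off"
            for x in seq[:-1]:
                mode = step(mode, x)
            if branch(mode, seq[-1]):
                return list(seq)
    return None

suite = [find_test_sequence(b) for b in branches]
```

Each returned input sequence, paired with the outputs `step` produces along the way, is one test sequence in the suite.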

398 citations


Journal ArticleDOI
TL;DR: This paper investigates the current situation of Web development tools, both in the commercial and research fields, by identifying and characterizing different categories of solutions, evaluating their adequacy to the requirements of Web application development, highlighting open problems, and outlining possible future trends.
Abstract: The exponential growth and capillary diffusion of the Web are nurturing a novel generation of applications, characterized by a direct business-to-customer relationship. The development of such applications is a hybrid between traditional IS development and Hypermedia authoring, and challenges the existing tools and approaches for software production. This paper investigates the current situation of Web development tools, both in the commercial and research fields, by identifying and characterizing different categories of solutions, evaluating their adequacy to the requirements of Web application development, highlighting open problems, and outlining possible future trends.

397 citations


Journal ArticleDOI
TL;DR: It is shown that XCS's generalization mechanism is effective, but that the conditions under which it works must be clearly understood, and the compactness of the representation evolved by XCS is limited by the number of instances of each generalization actually present in the environment.
Abstract: The XCS classifier system represents a major advance in learning classifier systems research because (1) it has a sound and accurate generalization mechanism, and (2) its learning mechanism is based on Q-learning, a recognized learning technique. In taking XCS beyond its very first environments and parameter settings, we show that, in certain difficult sequential (“animat”) environments, performance is poor. We suggest that this occurs because in the chosen environments, some conditions for proper functioning of the generalization mechanism do not hold, resulting in overly general classifiers that cause reduced performance. We hypothesize that one such condition is a lack of sufficiently wide exploration of the environment during learning. We show that if XCS is forced to explore its environment more completely, performance improves dramatically. We propose a technique, based on Sutton's Dyna concept, through which wider exploration would occur naturally. Separately, we demonstrate that the compactness of the representation evolved by XCS is limited by the number of instances of each generalization actually present in the environment. The paper shows that XCS's generalization mechanism is effective, but that the conditions under which it works must be clearly understood.
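For context, the Q-learning-flavoured part of XCS mentioned above is a Widrow-Hoff update of each classifier's payoff prediction toward the target r + γ·maxP; the learning-rate and discount values below are conventional XCS defaults, assumed rather than taken from this paper:

```python
# Widrow-Hoff payoff update used in XCS-style classifier systems:
# each classifier's prediction p moves toward the Q-learning target
# P = reward + gamma * max_next. beta and gamma values are assumptions.
def xcs_prediction_update(p, reward, max_next_prediction, beta=0.2, gamma=0.71):
    target = reward + gamma * max_next_prediction
    return p + beta * (target - p)

p = 0.0
for _ in range(100):          # repeated identical experience
    p = xcs_prediction_update(p, reward=1000.0, max_next_prediction=0.0)
# p converges toward the target payoff of 1000.0
```

Under-exploration of the environment means some classifiers rarely receive such updates, which is one way overly general classifiers can persist.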

358 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a resource-based view of an SME's sustainable competitive advantage and propose an approach to strategy analysis based on such a view, which can be used to support strategic analysis and management in SMEs.
Abstract: Few articles have been published that specifically deal with how to support strategic analysis and management in small and medium-sized enterprises (SMEs). In the last decade, however, the literature on strategic management has paid considerable attention to the resource-based theory, which seems to fit well the needs of owners and executives of SMEs. The objective of this article is twofold: (i) to present a resource-based view of an SME's sustainable competitive advantage; (ii) to propose an approach to strategy analysis based on such a view.

350 citations


Journal ArticleDOI
TL;DR: In this paper, the authors suggest a contingent framework to support SMEs in the analysis of the drivers of green product innovation and in the choice of a proper R&D strategy that explicitly accounts for the eco-efficiency of product technologies.
Abstract: The growing social and regulatory concern for the environment is leading an increasing number of companies to consider 'green' issues as a major source of strategic change. In particular, this trend has major and complex implications for the technological strategy of a company and for its product innovations. Indeed, most authors acknowledge that eco-efficiency will be one of the major challenges for R&D practice and theory in the next decade. Unfortunately, studies usually focus on large corporations. There is a debate as to whether this factor will affect R&D practices and product innovation in small and medium enterprises (SMEs). A superficial glimpse at the problem could lead one to think that SMEs will not be major green innovators, especially as far as product technologies are concerned, and that they will simply try to comply with environmental regulations (mainly on production processes). This paper shows that 'green' product innovation may occur and may also have strategic implications in SMEs. Starting from the analysis of four selected case studies and using a Precursors Events methodology, this paper illustrates why 'green' product innovation cannot be considered a marginal issue for most SMEs, even for those that are not directly affected by environmental regulations. Hence, the paper suggests a contingent framework to support SMEs in the analysis of the drivers of 'green' product innovation and in the choice of a proper R&D strategy that explicitly accounts for the eco-efficiency of product technologies.

343 citations


Journal ArticleDOI
TL;DR: In this paper, V2O5-MoO3/TiO2 catalysts are considered; the presence of electronic interactions between the TiO2-supported V and Mo oxides is also apparent.

329 citations


Journal ArticleDOI
TL;DR: The impact that features have on different phases of the life cycle is discussed, some ideas on how these phases can be improved by fully exploiting the concept of feature are provided, and topics for a research agenda in feature engineering are suggested.

Journal ArticleDOI
TL;DR: A wavelet domain approach is proposed, which provides a valuable tool not only for DEMs combination (improving accuracy), but for data evaluation and selection, since the phase error power is estimated for each interferogram.
Abstract: Multibaseline synthetic aperture radar (SAR) interferometry can be exploited successfully for high-quality digital elevation model (DEM) reconstruction, provided that both noise and atmospheric effects are taken into account. A weighted combination of many uncorrelated topographic profiles strongly reduces the impact of phase artifacts on the final DEM. The key issue is weights selection. In the present article a wavelet domain approach is proposed. Taking advantage of the particular frequency trend of the atmospheric distortion, it is possible to estimate, directly from the data, noise and atmospheric distortion power for each interferogram. The available DEMs are then combined by means of a weighted average, carried out in the wavelet domain. This new approach provides a valuable tool not only for DEMs combination (improving accuracy), but for data evaluation and selection, since the phase error power is estimated for each interferogram. Results obtained using simulated and real data (ERS-1/2 TANDEM data of a test area around the Etna volcano, Sicily) are presented.
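Stripped of the wavelet machinery, the final combination step is an inverse-variance weighted average; the per-pixel sketch below illustrates that step only, with made-up numbers:

```python
# Inverse-variance weighted average of several DEM estimates of the same
# pixel -- the kind of weighted combination the paper performs in the
# wavelet domain once the noise + atmospheric error power of each
# interferogram has been estimated. Values are illustrative.
def combine_dems(heights, error_powers):
    """heights[i]: elevation estimate from interferogram i (m);
    error_powers[i]: estimated phase-error power for interferogram i."""
    weights = [1.0 / s for s in error_powers]
    total = sum(weights)
    return sum(w * h for w, h in zip(weights, heights)) / total

# A noisy DEM (error power 9) pulls the estimate far less than a
# clean one (error power 1): the result lands near the clean estimate.
combined = combine_dems([100.0, 109.0], [1.0, 9.0])
```

The same estimated error powers also support data evaluation and selection: interferograms with large phase-error power can simply be down-weighted or discarded.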

Journal ArticleDOI
17 May 1999
TL;DR: This paper introduces XML-GL, a graphical query language for XML documents that is inspired by G-log, a general purpose, logic-based language for querying structured and semi-structured data.
Abstract: The growing acceptance of XML as a standard for semi-structured documents on the Web opens up challenging opportunities for Web query languages. In this paper we introduce XML-GL, a graphical query language for XML documents. The use of a visual formalism for representing both the content of XML documents (and of their DTDs) and the syntax and semantics of queries enables an intuitive expression of queries, even when they are rather complex. XML-GL is inspired by G-log, a general purpose, logic-based language for querying structured and semi-structured data. The paper presents the basic capabilities of XML-GL through a sequence of examples of increasing complexity.

Posted Content
TL;DR: Presents a comparison of five representative query languages for XML, highlighting their common features and differences.
Abstract: XML is becoming the most relevant new standard for data representation and exchange on the WWW. Novel languages for extracting and restructuring the XML content have been proposed, some in the tradition of database query languages (i.e. SQL, OQL), others more closely inspired by XML. No standard for an XML query language has yet been decided, but the discussion is ongoing within the World Wide Web Consortium and within many academic institutions and Internet-related major companies. We present a comparison of five representative query languages for XML, highlighting their common features and differences.

Journal ArticleDOI
TL;DR: A set of measures for cohesion and coupling are defined, which satisfy a previously published set of mathematical properties that are necessary for any such measures to be valid, and their relationship to fault-proneness on three large scale projects is investigated to provide empirical support for their practical significance and usefulness.
Abstract: The availability of significant measures in the early phases of the software development life-cycle allows for better management of the later phases, and more effective quality assessment when quality can be more easily affected by preventive or corrective actions. We introduce and compare various high-level design measures for object-based software systems. The measures are derived based on an experimental goal, identifying fault-prone software parts, and several experimental hypotheses arising from the development of Ada systems for Flight Dynamics Software at the NASA Goddard Space Flight Center (NASA/GSFC). Specifically, we define a set of measures for cohesion and coupling, which satisfy a previously published set of mathematical properties that are necessary for any such measures to be valid. We then investigate the measures' relationship to fault-proneness on three large scale projects, to provide empirical support for their practical significance and usefulness.

Journal ArticleDOI
TL;DR: In this paper, a valence force field based on Hückel's theory has been developed, which makes it possible to establish a close correlation between the phonons of graphite and the normal modes of small polycyclic aromatic hydrocarbons (such as coronene and hexabenzocoronene).
Abstract: A valence force field based on Hückel's theory has been developed, which allows us to establish a close correlation between phonons of graphite and the normal modes of small polycyclic aromatic hydrocarbons (such as coronene and hexabenzocoronene). The results show that in these systems two kinds of motions dominate the Raman spectrum: the Я mode and the "breathing" A mode. These modes are the equivalent, in a finite domain, of the E2g phonon of graphite at the Γ point and the A′ phonon at the K point of the first Brillouin zone. This study provides a useful basis for the understanding of the Raman spectra of any material containing sp² carbon domains.

Book ChapterDOI
TL;DR: In this article, the expanding ring test of Grady and Benson (1983) is taken as a convenient yet challenging validation problem for assessing the fidelity of cohesive models in situations involving ductile dynamical fracture.
Abstract: The expanding ring test of Grady and Benson (1983) is taken as a convenient yet challenging validation problem for assessing the fidelity of cohesive models in situations involving ductile dynamical fracture. Attention has been restricted to 1100-0 aluminum samples. Fracture has been modelled by recourse to an irreversible cohesive law embedded into cohesive elements. The finite element model is three-dimensional and fully Lagrangian. In order to limit the extent of deformation-induced distortion, we resort to continuous adaptive remeshing. The cohesive behavior of the material is assumed to be rate independent and, consequently, all rate effects predicted by the calculations are due to inertia and the rate dependency in plastic deformation. The numerical simulations are revealed to be highly predictive of a number of observed features, including: the number of dominant and arrested necks; the fragmentation patterns; the dependence of the number of fragments and the fracture strain on the expansion speed; and the distribution of fragment sizes at fixed expansion speed.

Journal ArticleDOI
TL;DR: In this paper, three local criteria for the identification of vortices are analyzed and discussed; they are based on the analysis of invariants of the velocity gradient tensor ∇u or invariants of the tensor S² + Ω², where S and Ω are the symmetric and antisymmetric parts of ∇u.
Abstract: Three proposed local criteria for the identification of vortices are analyzed and discussed; they are based on the analysis of invariants of the velocity gradient tensor ∇u or invariants of the tensor S² + Ω², where S and Ω are the symmetric and antisymmetric parts of ∇u. Moreover, a tentative non-local procedure is proposed, which takes advantage of the observation that vortices tend to be made up of the same fluid particles; this leads to the definition of a Galilean invariant quantity, which can be computed and used to identify vortical structures. Three analytical flow fields are used for a comparative evaluation of both local and non-local criteria, which allows a deeper understanding of the physical meaning of the considered techniques.
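One widely used member of this family of local criteria (often called the Q-criterion) flags points where rotation dominates strain; a minimal pure-Python sketch for a 2-D velocity gradient, with illustrative analytical fields:

```python
# Q-criterion sketch: Q = 0.5 * (||Omega||^2 - ||S||^2), with S and Omega
# the symmetric and antisymmetric parts of the velocity gradient tensor.
# Q > 0 flags rotation-dominated (vortical) regions.
def q_criterion(grad_u):
    """grad_u: 2x2 velocity gradient [[du/dx, du/dy], [dv/dx, dv/dy]]."""
    n = len(grad_u)
    S = [[0.5 * (grad_u[i][j] + grad_u[j][i]) for j in range(n)] for i in range(n)]
    O = [[0.5 * (grad_u[i][j] - grad_u[j][i]) for j in range(n)] for i in range(n)]
    frob2 = lambda M: sum(M[i][j] ** 2 for i in range(n) for j in range(n))
    return 0.5 * (frob2(O) - frob2(S))

# Solid-body rotation u = (-y, x): pure rotation, Q > 0.
q_rot = q_criterion([[0.0, -1.0], [1.0, 0.0]])
# Simple shear u = (y, 0): rotation and strain balance, Q = 0.
q_shear = q_criterion([[0.0, 1.0], [0.0, 0.0]])
```

The two test fields show the criterion's intent: a vortex core scores positive, a shear layer does not.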

Journal ArticleDOI
TL;DR: The results showed that both the perceptual estimation and the hand shaping while grasping the disc were similarly influenced by the illusion, and the stronger the perceptual illusion, the greater the effect on the grip scaling.
Abstract: In the present study, we investigated the effects of the Titchener circles illusion in perception and action. In this illusion, two identical discs can be perceived as being different in size when one is surrounded by an annulus of smaller circles and the other is surrounded by an annulus of larger circles. This classic size-contrast illusion, known as Ebbinghaus or Titchener Circles Illusion, has a strong perceptual effect. By contrast, it has recently been demonstrated that when subjects are required to pick up one of the discs, their grip aperture during reaching is largely appropriate to the size of the target. This result has been considered as evidence of a clear dissociation between visual perception and visuomotor behaviour in the intact human brain. In this study, we suggest and investigate an alternative explanation for these results. We argue that, in a previous study, while perception was subjected to the simultaneous influence of the large and small circles displays, in the grasping task only the annulus of circles surrounding the target object was influential. We tested this hypothesis by requiring 18 subjects to perceptually estimate and grasp a disc centred in a single annulus of Titchener circles. The results showed that both the perceptual estimation and the hand shaping while grasping the disc were similarly influenced by the illusion. Moreover, the stronger the perceptual illusion, the greater the effect on the grip scaling. We discuss the results as evidence of an interaction between the functional pathways for perception and action in the intact human brain.

Journal ArticleDOI
TL;DR: The perfluorocarbon-hydrocarbon self-assembly allows the resolution of racemic 1,2-dibromohexafluoropropane and results in enantiopure and infinite supramolecular helices.
Abstract: Halogen bonds, attractive intermolecular interactions between perfluoroalkyl bromides and bromide ions, are present in cocrystals of (-)-sparteinium hydrobromide (1) and (S)-1,2-dibromohexafluoropropane (2; shown schematically), and result in enantiopure and infinite supramolecular helices. The perfluorocarbon-hydrocarbon self-assembly allows the resolution of racemic 2.

Journal ArticleDOI
TL;DR: In this article, the authors investigated early development practices in 18 Italian and Swedish companies, operating in the vehicle, helicopter, and white-goods industries, and identified four possible approaches to manage the early phases (detailed, selective, comprehensive, and postponed), where anticipation and reaction have different balances.

Journal ArticleDOI
TL;DR: Histomorphometric analysis of samples inserted in the cancellous bone of the distal femoral epiphysis of Sprague-Dawley rats yielded affinity index (AI%) data confirming the osteoconductive properties of non-anodized acid-etched Ti, and indicated that hydroxyapatite allowed higher bone-to-implant contact than Ti alone.

Journal ArticleDOI
TL;DR: In this paper, the reactivity in the selective catalytic reduction (SCR) reaction and the redox behavior of V2O5-MoO3/TiO2 catalysts were investigated by means of the temperature-programmed reduction (TPR)/reaction technique and compared with those of binary V2O5/TiO2 and MoO3/TiO2 catalysts having the same metal oxide loading; the results indicate that the simultaneous presence of V and Mo enhances the catalyst's redox properties, and thus its reactivity.
Abstract: The reactivity in the selective catalytic reduction (SCR) reaction and the redox behavior of V2O5–MoO3/TiO2 catalysts were investigated by means of the temperature-programmed reduction (TPR)/reaction technique, and compared with those of binary V2O5/TiO2 and MoO3/TiO2 catalysts having the same metal oxide loading. It was found that the ternary catalysts are more active in the SCR reaction at low temperatures compared to the corresponding binary samples: hence the 'temperature window' of the reaction is widened and shifted towards lower temperatures. Transient reactivity data provide clear evidence in favor of the hypothesis of a redox mechanism for the SCR reaction and point out that the ternary catalysts are more easily reduced and reoxidized than the corresponding binary samples: this indicates that the simultaneous presence of V and Mo enhances the catalyst redox properties, and thus its reactivity. Such conclusions are also in line with the results of the characterization studies pointing out the existence of electronic interactions involving the V and Mo surface oxide species. The overall picture closely resembles the one obtained in the case of the analogous V2O5–WO3/TiO2 system and indicates that the effects of the addition of WO3 and MoO3 to V2O5/TiO2 are similar, both oxides acting as 'chemical' promoters besides playing a 'structural' function as well.

Journal ArticleDOI
TL;DR: In this article, a multivariate mathematical and statistical design and modelling has been used for the process optimisation has led to the development of a multiivariate mathematical model which describes the oxidation process with high predictive capacity, an increase of the vanillin yield from 4.1% of the current procedure to 7.2% is obtained.

Journal ArticleDOI
01 Jun 1999 · Extremes
TL;DR: In this article, it was shown that small cracks, defects and nonmetallic inclusions having the same value of the square root of the projected area, √area, have an identical influence on the fatigue limit regardless of their different stress concentration factors.
Abstract: The method explained in this paper for the quantitative evaluation of the fatigue limit of materials containing defects is based on the experimental evidence that inhomogeneities and micro-notches can be treated like cracks. First, the basic concept of the √area parameter model is explained, introducing the various data obtained by the first author's group over the last 15 years. Evidence is shown that small cracks, defects and nonmetallic inclusions having the same value of the square root of the projected area, √area, have an identical influence on the fatigue limit regardless of different stress concentration factors. Various applications of these concepts to various defect types and microstructural inhomogeneities are shown. Since the estimation of fatigue strength is related to the estimation of the size of the maximum defects occurring in a piece, the methods for searching for defects and the quality control of materials with respect to inclusion or defect rating, as well as their statistical implications, are discussed.
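The √area parameter model is commonly quoted as a closed-form fatigue-limit estimate for a small surface defect; the constants below are the widely cited values (σ_w in MPa, HV in kgf/mm², √area in µm) and should be read as assumptions, not as this paper's exact fit:

```python
# Root-area parameter model for the fatigue limit of a material
# containing a small surface defect, as commonly quoted:
#   sigma_w = 1.43 * (HV + 120) / (sqrt_area)^(1/6)
# sigma_w in MPa, HV in kgf/mm^2, sqrt_area in micrometres.
def fatigue_limit_surface_defect(hv, sqrt_area_um):
    return 1.43 * (hv + 120.0) / sqrt_area_um ** (1.0 / 6.0)

# The defect enters only through sqrt(area): two defects of different
# shape but equal sqrt(area) get the same predicted fatigue limit.
sigma_w = fatigue_limit_surface_defect(hv=400.0, sqrt_area_um=100.0)
```

The weak one-sixth-power dependence on √area is why quality control focuses on the statistics of the *maximum* defect size in a piece rather than on defect shape.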

Journal ArticleDOI
TL;DR: In this paper, the authors developed complementary integrodifferential equations for second conditional moments of head and flux, which serve as measures of predictive uncertainty, and obtained recursive closure approximations for both the first and second conditional moment equations through expansion in powers of a small parameter σY representing the standard estimation error of ln K(x).
Abstract: We consider the effect of measuring randomly varying hydraulic conductivities K(x) on one's ability to predict numerically, without resorting to either Monte Carlo simulation or upscaling, steady state flow in bounded domains driven by random source and boundary terms. Our aim is to allow optimum unbiased prediction of hydraulic heads h(x) and fluxes q(x) by means of their ensemble moments, 〈h(x)〉c and 〈q(x)〉c, respectively, conditioned on measurements of K(x). These predictors have been shown by Neuman and Orr [1993a] to satisfy exactly an integrodifferential conditional mean flow equation in which 〈q(x)〉c is nonlocal and non-Darcian. Here we develop complementary integrodifferential equations for second conditional moments of head and flux which serve as measures of predictive uncertainty; obtain recursive closure approximations for both the first and second conditional moment equations through expansion in powers of a small parameter σY which represents the standard estimation error of ln K(x); and show how to solve these equations to first order in σY² by finite elements on a rectangular grid in two dimensions. In the special case where one treats K(x) as if it was locally homogeneous and mean flow as if it was locally uniform, one obtains a localized Darcian approximation 〈q(x)〉c ≈ −Kc(x)∇〈h(x)〉c in which Kc(x) is a space-dependent conditional hydraulic conductivity tensor. This leads to the traditional deterministic, Darcian steady state flow equation which, however, acquires a nontraditional meaning in that its parameters and state variables are data dependent and therefore inherently nonunique. It further explains why parameter estimates obtained by traditional inverse methods tend to vary as one modifies the database. Localized equations yield no information about predictive uncertainty.
Our stochastic derivation of these otherwise standard deterministic flow equations makes clear that uncertainty measures associated with estimates of head and flux, obtained by traditional inverse methods, are generally smaller (often considerably so) than measures of corresponding predictive uncertainty, which can be assessed only by means of stochastic models such as ours. We present a detailed comparison between finite element solutions of nonlocal and localized moment equations and Monte Carlo simulations under superimposed mean-uniform and convergent flow regimes in two dimensions. Paper 1 presents the theory and computational approach, and paper 2 [Guadagnini and Neuman, this issue] describes unconditional and conditional computational results.
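In the localized special case, the problem collapses to the familiar deterministic Darcian flow equation d/dx(Kc(x) dh/dx) = 0; a minimal 1-D finite-difference illustration of that limiting equation (illustrative conductivities and boundary heads, not the paper's 2-D finite element setup):

```python
# Steady 1-D Darcian flow d/dx( K(x) dh/dx ) = 0 on a column of cells
# with fixed heads at both ends -- the "localized" deterministic limit.
def solve_steady_head(K, h_left, h_right):
    """K: conductivity of each cell; returns heads at interior cell
    interfaces. Flux continuity q = -K dh/dx across each interface gives
    a tridiagonal system, solved here by simple Gauss-Seidel sweeps."""
    n = len(K) - 1                    # number of interior interfaces
    h = [h_left + (h_right - h_left) * (i + 1) / (n + 1) for i in range(n)]
    for _ in range(5000):
        for i in range(n):
            left = h_left if i == 0 else h[i - 1]
            right = h_right if i == n - 1 else h[i + 1]
            h[i] = (K[i] * left + K[i + 1] * right) / (K[i] + K[i + 1])
    return h

# Two cells, K = 1 and K = 3: the interface head is 2.5, so most of the
# head drop occurs across the low-conductivity cell, and the flux
# 1*(10 - 2.5) = 3*(2.5 - 0) = 7.5 is continuous.
heads = solve_steady_head([1.0, 3.0], h_left=10.0, h_right=0.0)
```

As the paper stresses, such localized solutions say nothing about predictive uncertainty; that information lives in the moment equations.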

Proceedings ArticleDOI
27 Jun 1999
TL;DR: In this article, a new maximum power point tracker (MPPT) approach is presented that minimizes the drawback caused by the (generally neglected) intrinsic capacitance of the photovoltaic array, giving the possibility of operation with a large degree of freedom, independent of the converter topology and of the photovoltaic power generator, electrical network and technology.
Abstract: The present trend for commercial telecommunication and scientific satellites is the utilization of standard platforms, characterized by a high level of flexibility and reduced nonrecurring costs. One of the areas where flexibility is mandatory is the electrical primary power subsystem, due to the impact on solar array configuration and dimensions and on the power conditioning unit. Use of the maximum power point tracker (MPPT) concept allows optimization of the above mentioned subsystem, maximizing the power transfer from the photovoltaic generator. The purpose of the paper is to present a new MPPT approach that allows minimization of the drawback caused by the intrinsic capacitance of the photovoltaic array (generally neglected), giving the possibility of operation with a large degree of freedom, independent of the converter topology and of the photovoltaic power generator, electrical network and technology.
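For contrast with the paper's capacitance-aware method (which is not reproduced here), the textbook perturb-and-observe MPPT baseline looks like this; the P(V) curve is a toy stand-in for a real array characteristic:

```python
# Generic perturb-and-observe MPPT loop -- the textbook baseline, NOT
# the paper's new approach. The PV curve below is a toy stand-in.
def pv_power(v):
    return max(0.0, v * (5.0 - 0.05 * v * v))   # toy P(V), peak near V ~ 5.77

def perturb_and_observe(v0=1.0, dv=0.1, steps=200):
    v, p = v0, pv_power(v0)
    direction = 1.0
    for _ in range(steps):
        v_new = v + direction * dv
        p_new = pv_power(v_new)
        if p_new < p:                 # power dropped: reverse the perturbation
            direction = -direction
        v, p = v_new, p_new
    return v, p

# The operating voltage climbs the P(V) curve, then oscillates around
# the maximum power point within about +/- dv.
v_mp, p_mp = perturb_and_observe()
```

The residual oscillation around the peak is one of the drawbacks that motivates more refined trackers such as the one proposed in the paper.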

Journal ArticleDOI
TL;DR: In this paper, the authors apply various estimation procedures to synthetic periodic time series in order to verify the performance of each estimation method and to determine which estimators should be used when periodicity may be present.

Journal ArticleDOI
TL;DR: Reassesses the principal theses on the evolution of production models in support of the fifth thesis, i.e. the emergence of a new paradigm, and presents the formulation of that paradigm termed "strategically flexible production".
Abstract: The evolution of production models raises a number of questions on the changes which are taking place, on the continuity of or break with consolidated models, and on whether new production paradigms are emerging. Traces back this broad and multi‐faceted debate to five theses which summarise the principal interpretative approaches: the emergence of lean production as the dominant model; the indeterminacy of production models and the unpredictability of their evolution; the existence of a number of different models which are strongly dependent on context; the asymptotic convergence over time of different models on a single point of reference which is not lean production, while the latter will decline or be revised; and the emergence of a new unifying paradigm which leaves room for and even requires specific variations and adaptations. Reassesses these positions in the light of the life‐cycle of management models and in support of the fifth thesis, i.e. the emergence of a new paradigm. Among the different possible formulations of the new paradigm, that termed “strategically flexible production” is presented.

Journal ArticleDOI
TL;DR: In this paper, the decolorization and mineralization of some azo and anthraquinone dyes by photoactivated hydrogen peroxide have been studied, and a simple kinetic model that adequately describes the process is proposed; pH does not significantly influence the process in the range from 3 to 9.