
Showing papers by "Altran published in 2014"


Book ChapterDOI
13 Jul 2014
TL;DR: This paper presents a retrospective of the authors' experiences with applying theorem proving to the verification of SPARK programs, both in terms of projects and the technical evolution of the language and tools over the years.
Abstract: This paper presents a retrospective of our experiences with applying theorem proving to the verification of SPARK programs, both in terms of projects and the technical evolution of the language and tools over the years.

35 citations


Journal ArticleDOI
TL;DR: In this article, the authors take into account the varying optical coefficients in the energy balance equations and obtain significant differences regarding electricity production between the optical-thermal approach and the classical thermal model.

33 citations


Proceedings Article
20 Jul 2014
TL;DR: This paper focuses on the mining and analysis of social networks between course units or training providers, and proposes a two-step clustering approach for partitioning educational processes according to key performance indicators.
Abstract: Educational Process Mining constitutes a new opportunity to better understand students' learning habits and finely analyze the complete set of educational processes. In this paper, we investigate further the potential and challenges of Process Mining in the field of professional training. Firstly, we focus on the mining and analysis of social networks between course units or training providers. Secondly, we propose a two-step clustering approach for partitioning educational processes according to key performance indicators. Keywords: Process Mining; Educational Data Mining; Curriculum Mining; Key Performance Indicators; ProM.

28 citations
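The two-step idea — first reduce each educational process instance to key performance indicators, then cluster instances on them — can be sketched in Python. This is a minimal sketch only: the event log, the KPI choice, and the tiny 1-D 2-means below are invented stand-ins, not the authors' ProM-based pipeline.

```python
from statistics import mean

# Hypothetical event log: each educational process instance (trace) is a list
# of (activity, duration_in_days) pairs; names and numbers are invented.
traces = {
    "t1": [("enrol", 1), ("module_A", 10), ("exam", 1)],
    "t2": [("enrol", 1), ("module_A", 12), ("exam", 1)],
    "t3": [("enrol", 2), ("module_A", 30), ("retake", 15), ("exam", 1)],
    "t4": [("enrol", 1), ("module_A", 28), ("retake", 20), ("exam", 2)],
}

# Step 1: reduce each instance to a key performance indicator
# (here: total throughput time in days).
kpi = {name: sum(d for _, d in trace) for name, trace in traces.items()}

# Step 2: partition instances with a tiny 1-D 2-means on the KPI values.
def two_means(values, iters=20):
    centers = [min(values.values()), max(values.values())]
    for _ in range(iters):
        groups = ([], [])
        for name, v in values.items():
            groups[abs(v - centers[1]) < abs(v - centers[0])].append(name)
        centers = [mean(values[n] for n in g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return groups

fast, slow = two_means(kpi)   # low-KPI vs high-KPI process clusters
```

In a real setting, step 1 would compute several KPIs per trace (duration, number of retakes, grades) and step 2 would cluster in that multi-dimensional KPI space.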


Journal ArticleDOI
TL;DR: In this paper, the authors document the circulation in the vicinity of La Reunion and Mauritius islands, i.e., within 500 km offshore, on the intraseasonal time scale, using a high-resolution realistic modeling strategy.
Abstract: The objective of this study is to document the circulation in the vicinity of La Reunion and Mauritius islands, i.e., within 500 km offshore, on the intraseasonal time scale, using a high-resolution realistic modeling strategy. The simulated sea level anomalies, water mass properties, and large-scale circulation compare favorably with satellite and in situ observations. Our high-resolution simulation suggests that the currents around the islands are maximal locally, oriented southwestward, to the southeast of both islands which is not visible in low-resolution satellite observations. It also highlights the high degree of variability of the circulation, which is dominated by westward propagating features. The predominant time scale of variability is 60 days. This coincides with the period of a barotropic mode of variability confined to the Mascarene Basin. The characteristics of the westward propagating anomalies are related to baroclinic Rossby waves crossing the Indian Ocean but only in the long-wave resting ocean limit. Tracking those anomalies as eddies shows that they also have a meridional tendency in their trajectory, northward for cyclones and southward for anticyclones, which is consistent with previous studies. Sensitivity experiments suggest that they are predominantly advected from the east, but there is also local generation in the lee of the islands, due to interaction between the circulation and topography.

28 citations


Journal ArticleDOI
TL;DR: In this paper, a textile soft surface is proposed to reduce back radiation of a textile patch antenna, and the performance is analyzed when the antenna is placed on a bent surface, which is assumed to be c...
Abstract: A textile soft surface is proposed to reduce back radiation of a textile patch antenna, and the performance is analyzed when the antenna is placed on a bent surface. This surface is assumed to be c ...

24 citations


Journal ArticleDOI
TL;DR: High CD3 and CD8 lymphocyte SR densities are associated with better cancer-specific survival for MIBC, and the Th1 reaction against the tumour seems to be protective for bladder cancer.
Abstract: Objective: To define an immunoscore in bladder cancer by studying the T helper 1 (Th1) immunoreaction, and to define a cancer-specific survival model based on Th1 cell infiltration. Methods: A total of 252 patients underwent primary transurethral resection of bladder tumour at our Institution. A retrospective review of a selected cohort with pT1 and muscle-invasive bladder cancer (MIBC) lesions was performed. Pathology blocks were marked with CD3 and CD8 antibodies. Immune cell density in the stromal reaction (SR) was measured on five distinct high-power fields (HPF) by two dedicated uro-pathologists blinded to the patients' evolution. Statistics: Student's t-test or the non-parametric Wilcoxon test, as appropriate, to compare means between two groups; receiver operating characteristics (ROC) curves to define marker thresholds; Cox models to assess survival predictors. Results: Ten pT1 and 20 MIBC consecutive cases were analysed. Median follow-up was 33.4 months. Immunohistological analysis for pT1 lesions featured limited SR. For MIBC, the mean density of lymphocytes in the SR was 105/HPF (CD3) and 86/HPF (CD8). Survivors harboured higher lymphocyte densities than non-survivors (CD3: p = 0.0319; CD8: p = 0.0279). CD3 (p = 0.034) and CD8 (p = 0.034) lymphocyte densities were independently associated with cancer-specific survival in Cox model analyses. The retrospective design and small size of the cohorts are the study limitations.

17 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an exploratory study aiming to understand the use of maintenance documentation by the technicians in the aircraft maintenance context and why they do not systematically use it.
Abstract: The aim of this article is to present an exploratory study aiming to understand the use of maintenance documentation by the technicians in the aircraft maintenance context and why they do not systematically use it. We seek to establish a global model based on the results. Previous studies can provide us with an understanding of why aircraft maintenance technicians sometimes do not follow the requisite procedure. Here we use these empirical data and psychological models as a framework, and consider the use by an aircraft maintenance technician of a document specifically as an information-seeking task and as a secondary task. A qualitative survey involving 13 maintenance technicians was conducted, with observations and semidirected interviews. The survey gives preliminary results about why, when, and how technicians use their maintenance documents, and why they sometimes do not use them although they are required to do so. Thus, the decision by an aircraft technician to use or not use a prescribed document ...

15 citations


Book ChapterDOI
16 Oct 2014
TL;DR: A semi-automatic approach for solving the ill-posed problem of initial alignment for Augmented Reality systems during liver surgery using an image-based soft-tissue reconstruction technique and an atlas-based approach.
Abstract: Each year in Europe 50,000 new liver cancer cases are diagnosed, for which hepatic surgery combined with chemotherapy is the most common treatment. In particular, the number of laparoscopic liver surgeries has increased significantly over the past years. This type of minimally invasive procedure, which presents many benefits for the patient, is challenging for surgeons due to the limited field of view. Recently, new augmented reality techniques, which merge preoperative data and intraoperative images and make it possible to visualize internal structures, have been proposed to help surgeons during this type of surgery. One of the difficulties is to align the preoperative data with the intraoperative images. We propose in this paper a semi-automatic approach for solving the ill-posed problem of initial alignment for augmented reality systems during liver surgery. Our registration method relies on anatomical landmarks extracted from both the laparoscopic images and the three-dimensional model, using an image-based soft-tissue reconstruction technique and an atlas-based approach, respectively. The registration evolves automatically from a quasi-rigid to a non-rigid registration. Furthermore, the surface-driven deformation is induced in the volume via a patient-specific biomechanical model. The experiments conducted on both synthetic and in vivo data show promising results, with a registration error of 2 mm when dealing with a visible surface of 30% of the whole liver.

14 citations


Proceedings ArticleDOI
22 Jun 2014
TL;DR: This study applies the concept of artificial noise (AN) transmission in an untrusted relay network to enhance secure communication between the source and destination and derives the analytical symbol error rate (SER) expression, which is maximized for the optimal AN design.
Abstract: We apply the concept of artificial noise (AN) transmission in an untrusted relay network to enhance secure communication between the source and destination. Specifically, in addition to square quadrature amplitude modulated (QAM) signals broadcasted by the source, the relay simultaneously receives AN symbols designed and transmitted by the destination. For the relay, based on the assumption of additive white Gaussian noise, we derive the analytical symbol error rate (SER) expression, which is maximized for the optimal AN design. Under an average power constraint, we find the optimal phase and power distribution of the AN. Interestingly, our study shows that the Gaussian distribution is generally not optimal to generate AN and the results in this paper can be used as benchmarks for future analyses of AN-based techniques. More importantly, rather than conducting analysis from an information-theoretic perspective, our SER-based approach takes practical communication issues into account, such as QAM signalling.

12 citations
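The effect of artificial noise on the relay's detector can be mimicked with a small Monte Carlo sketch. This is a simplified illustration under strong assumptions: QPSK stands in for general square QAM, the AN design is a constant-modulus symbol with uniformly random phase (one simple non-Gaussian design), and only AWGN is modelled at the relay; it is not the paper's analytical SER derivation.

```python
import cmath
import math
import random

random.seed(1)

# Unit-average-power QPSK constellation (stands in for the paper's general
# square QAM).
QPSK = [(1 + 1j) / math.sqrt(2), (1 - 1j) / math.sqrt(2),
        (-1 + 1j) / math.sqrt(2), (-1 - 1j) / math.sqrt(2)]

def detect(y):
    # Nearest-neighbour (maximum-likelihood under white noise) detection.
    return min(QPSK, key=lambda s: abs(y - s))

def ser(an_power, noise_sigma=0.3, n=20000):
    errors = 0
    for _ in range(n):
        s = random.choice(QPSK)
        # Artificial noise from the destination: constant modulus with a
        # uniformly random phase (one simple non-Gaussian AN design).
        an = cmath.rect(math.sqrt(an_power), random.uniform(0, 2 * math.pi))
        # Relay's own receiver noise: circularly symmetric Gaussian.
        w = complex(random.gauss(0, noise_sigma), random.gauss(0, noise_sigma))
        if detect(s + an + w) != s:
            errors += 1
    return errors / n

ser_clean = ser(an_power=0.0)   # relay without artificial noise
ser_an = ser(an_power=1.0)      # relay jammed by unit-power AN
```

With unit-power AN the relay's symbol error rate rises sharply over the AN-free case; the paper's contribution is to choose the AN phase and power distribution that maximizes this error rate under an average power constraint.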


Proceedings ArticleDOI
01 Oct 2014
TL;DR: This work deals with 2D error estimation of the edge detection process, the starting step of the whole tracking procedure, which determines the outline of the imaged marker, and with fitting techniques to describe the geometric features composing the outline.
Abstract: Work described in this contribution focuses on error analysis in augmented reality (AR) systems. The tracking, i.e. the process of locating an object (e.g. a fiducial marker) in an environment, is critical to the accuracy of AR applications, as more realistic results can be obtained in the presence of accurate AR registration. This work deals with 2D error estimation of the edge detection process, the starting step of the whole tracking procedure, which determines the outline of the imaged marker. Using fitting techniques to describe the geometric features composing the outline, error bounds are determined and, as a result of this step, edge detection errors are estimated. These 2D errors are then propagated up to the final tracking step.

11 citations



Book ChapterDOI
Johannes Kanig1, Roderick Chapman2, Cyrille Comar1, Jerome Guitton1, Yannick Moy1, Emyr Rees2 
24 Jul 2014
TL;DR: This paper shows how formal verification was addressed in an industrial project using the SPARK formal verification technology, and proposes a partial automation of this process, using the notion of explicit assumptions, for fine-grain integration of formal verification and testing of Ada programs.
Abstract: Formal modular verification of software is based on assume-guarantee reasoning, where each software module is shown to provide some guarantees under certain assumptions and an overall argument linking results for individual modules justifies the correctness of the approach. However, formal verification is almost never applied to the entire code, posing a potential soundness risk if some assumptions are not verified. In this paper, we show how this problem was addressed in an industrial project using the SPARK formal verification technology, developed at Altran UK. Based on this and similar experiences, we propose a partial automation of this process, using the notion of explicit assumptions. This partial automation may have the role of an enabler for formal verification, allowing the application of the technology to isolated modules of a code base while simultaneously controlling the risk of invalid assumptions. We demonstrate a possible application of this concept for the fine-grain integration of formal verification and testing of Ada programs.
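The explicit-assumptions idea can be caricatured in a few lines of Python. This is a toy sketch only — the paper's actual setting is SPARK/Ada and its proof toolchain; the `assume` registry, `verified_search`, and the data below are invented for illustration of the concept that a "verified" module records the assumptions its proof relies on, and unverified code must discharge them (e.g. by testing).

```python
# Registry of assumptions made by "formally verified" modules.
assumptions = []

def assume(description, check):
    """Record an explicit assumption that a verified module's proof relies on."""
    assumptions.append((description, check))

# A module proved correct *under the assumption* that its input is sorted;
# the proof does not re-establish this, so the assumption is made explicit.
def verified_search(xs, key):
    assume("input to verified_search is sorted",
           lambda: xs == sorted(xs))
    lo, hi = 0, len(xs)
    while lo < hi:
        mid = (lo + hi) // 2
        if xs[mid] < key:
            lo = mid + 1
        else:
            hi = mid
    return lo if lo < len(xs) and xs[lo] == key else -1

# Unverified caller: the recorded assumptions are discharged separately,
# e.g. by the test campaign, controlling the soundness risk.
data = [2, 3, 5, 7, 11]
idx = verified_search(data, 7)
undischarged = [d for d, chk in assumptions if not chk()]
```

The point of the paper is precisely to partially automate this bookkeeping, so that verifying isolated modules of a code base does not silently rest on unchecked assumptions.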

Journal ArticleDOI
TL;DR: In this paper, the in situ follow-up of the viscosity evolution with reaction time during the synthesis of isocyanate-terminated urethane prepolymers is reported.

Journal ArticleDOI
M Sabin1, S Piva
02 Jul 2014
TL;DR: In this paper, the results of a numerical investigation of heat and fluid flow in a liquid cold plate for FM radio power amplifiers are presented, and the performance of a blister cold plate, designed to dissipate the heat generated by a known set of electronic components in order to limit their maximum temperature during operation, is verified.
Abstract: The results of a numerical investigation of heat and fluid flow in a liquid cold plate for FM radio power amplifiers are presented. The objective is to verify, using a commercial CFD code, the performance of a blister cold plate designed to dissipate the heat generated by a known set of electronic components, in order to limit their maximum temperature during operation. Since in a blister cold plate only the cover is thermally active, the cold plate is simplified and lightened by using plastics for the base plate. A 3-D conjugate CFD approach, in which thermal and fluid flow analyses are combined, is followed. Several design options for the cold plate are examined, and the validity of the full 3-D CFD approach in the dimensioning of cooling systems for electronic equipment is demonstrated.

Journal ArticleDOI
TL;DR: The experimental method of interlaboratory trials is of great interest for quantifying the uncertainties of a measurement method under given conditions, as discussed by the authors.
Abstract: Alongside the uncertainty propagation method, the reference method (GUM - Guide to the Expression of Uncertainty in Measurement), which requires modeling the entire measurement process, the experimental method of interlaboratory trials is of great interest for quantifying the uncertainties of a measurement method under given conditions. This method, widely used in certain fields (chemistry, biology, mechanical testing, etc.), is governed by ISO standards compatible with the GUM describing the reference method. Thus, hydrometric interlaboratory comparisons make it possible to quantify the uncertainty resulting from the measurement errors expressed when simultaneous gaugings are repeated by several teams under repeatability and reproducibility conditions, in particular over a range of constant discharge. Simple formulas make it possible to quantify, from the experimental results of repeatability variance and interlaboratory variance, the uncertainty of the gauging method under test, assumed to be unbiased. Based on recent examples of intercomparisons of gaugings by ADCP, wading rod, truck-mounted gauging and dilution, general recommendations are proposed for organizing hydrometric intercomparisons that can be used to quantify uncertainties in accordance with the reference texts. The standardized interlaboratory trial method improves the evaluation of uncertainties in gauging techniques.
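The "simple formulas" referred to above follow the ISO 5725-style decomposition into repeatability and between-laboratory variance. A minimal sketch, with invented gauging data standing in for a real intercomparison:

```python
from statistics import mean, variance

# Hypothetical simultaneous gaugings (discharge, l/s) by three teams repeating
# the measurement under repeatability conditions; all values are invented.
labs = {
    "team_A": [100, 102, 101],
    "team_B": [98, 97, 99],
    "team_C": [103, 104, 105],
}
n = 3   # repetitions per team

# Repeatability variance: pooled within-team variance.
sr2 = mean(variance(v) for v in labs.values())

# Between-laboratory variance: variance of the team means, corrected for the
# repeatability contribution that inflates them (ISO 5725-2 style).
lab_means = [mean(v) for v in labs.values()]
sL2 = max(variance(lab_means) - sr2 / n, 0.0)

# Reproducibility variance of the gauging method (assumed unbiased).
sR2 = sr2 + sL2
```

The reproducibility standard deviation `sR2 ** 0.5` is then the uncertainty component attributed to the gauging method under the tested conditions.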

11 Apr 2014
TL;DR: The PREVIMER project as mentioned in this paper developed, funded and organized part of in situ observing networks in the Bay of Biscay and the Channel in order to sustain model applications.
Abstract: To design a prototype for an Integrated Ocean Observing System (IOOS), at least three components are mandatory: a modeling platform, an in situ observing system and a structure to collect and to disseminate the information (e.g. database, website). The PREVIMER project followed this approach and, in order to sustain model applications, PREVIMER has developed, funded and organized part of the in situ observing networks in the Bay of Biscay and the Channel. For a comprehensive system, focus was placed on fixed platforms (MAREL MOLIT, MAREL Iroise, Island network and D4 for sediment dynamics), ships of opportunity (RECOPESCA program and FerryBoxes), and coastal profilers (ARVOR-C/Cm). Each system is briefly described, and examples of scientific results obtained with the corresponding data are highlighted to show how these systems contribute to solving multidisciplinary scientific issues from coastal ocean dynamics to biodiversity, including pelagic and benthic habitats.

Journal ArticleDOI
TL;DR: In this article, the authors discuss the issues in defining sustainability metrics and propose some methodologies and a system of indicators to assess the sustainability of a PV module, including recyclability, viability of PV industry, equilibrium along the value chain or social acceptability.
Abstract: In the context of a rapidly growing energy demand and concerns about global climate change, renewable energies and in particular photovoltaic (PV) power are considered long-term solutions towards secured energy supply and for the reduction of greenhouse gas emissions. However, is solar PV a truly sustainable solution? Climate change will certainly not be the only environmental issue we will have to deal with. Regarding the issue of mineral resources, current research efforts aim at reducing raw material consumption in the manufacturing of PV panels. But, how do we assess the environmental efficiency of a panel, especially concerning raw material consumption? Moreover, the PV industry consumes raw materials which are principally produced in non-EU countries, such as Cadmium (Cd), Gallium (Ga) and Indium (In). How do we consider the issue of critical substances and accessibility? Required materials for PV modules may also be used in other applications (for example Gallium and Indium in electrical appliance production). Should competitiveness between applications be taken into account in a sustainability assessment? Are we going to valorize the PV module's ability to use substitute substances? In addition to responsible resource management indicators, many other aspects have to be taken into account in order to achieve a complete sustainability assessment, especially recyclability, viability of the PV industry, equilibrium along the value chain, or social indicators such as social acceptability. Designing sustainability metrics is a new and complex research field. The whole value chain has to be evaluated and all dimensions (environmental, economic, and social) need to be explored. The paper will discuss the issues in defining sustainability metrics and propose some methodologies and a system of indicators to assess the sustainability of a PV module.

Proceedings ArticleDOI
20 Nov 2014
TL;DR: The application of the Monte Carlo Method is presented as an approach to propagate the uncertainty of the input data sets in order to estimate a confidence interval for each FSV indicator.
Abstract: The Feature Selective Validation (FSV) is the standard method used for validation assessment in Computational Electromagnetics, and it uses both quantitative and qualitative indicators to measure the similarity between a pair of data sets. However, the standardized FSV relies on a heuristic procedure for graphical comparison that does not include considerations about the uncertainty of the data sets involved. The reliability of the validation results, and therefore of the model under validation, depends on the uncertainty of the data sets used as input for the FSV, even more so considering that some measurements associated with electromagnetic compatibility tests are characterized by a large uncertainty. Nonetheless, the FSV algorithm makes the propagation of such uncertainties a difficult and cumbersome task with conventional approaches. This paper presents the application of the Monte Carlo Method as an approach to propagate the uncertainty of the input data sets in order to estimate a confidence interval for each FSV indicator. Finally, a numerical example is presented and discussed.
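The Monte Carlo approach can be sketched generically: perturb the input data with its assumed uncertainty, recompute the indicator, and read a confidence interval off the empirical distribution. In the sketch below a simplified global-difference measure stands in for the actual FSV indicators, and the uncertainty level is an invented assumption.

```python
import math
import random
from statistics import mean

random.seed(0)

# Two hypothetical data sets to be compared (say, measured vs simulated
# responses). A simplified global-difference indicator stands in for the
# actual FSV figures of merit, which are out of scope for this sketch.
measured = [math.sin(0.1 * i) for i in range(100)]
simulated = [math.sin(0.1 * i + 0.05) for i in range(100)]

def indicator(a, b):
    # Simplified global-difference-style measure (not the standardized GDM).
    return mean(abs(x - y) / (abs(x) + abs(y) + 1e-9) for x, y in zip(a, b))

# Monte Carlo propagation: perturb the input data with its assumed standard
# uncertainty and collect the indicator's empirical distribution.
u_meas = 0.02   # assumed standard uncertainty of the measured data
samples = sorted(
    indicator([x + random.gauss(0, u_meas) for x in measured], simulated)
    for _ in range(2000)
)
ci_low, ci_high = samples[50], samples[1949]   # ~95% confidence interval
```

The same loop applied to each FSV indicator yields per-indicator confidence intervals instead of the single point values the standardized procedure produces.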

Proceedings ArticleDOI
20 Nov 2014
TL;DR: A statistical analysis of 40 different case studies is carried out, covering a wide range of real-life applications, showing that more effort is required to achieve generally coherent validation results between the FSV and the FSNMI.
Abstract: This paper presents a performance comparison between two validation methods developed specifically for Computational Electromagnetics purposes: the Feature Selective Validation (FSV) and the Feature Selective Normalized Mutual Information (FSNMI) index. To achieve this goal, a statistical analysis of 40 different case studies (pairs of data sets) is carried out, covering a wide range of real-life applications, such as frequency-domain, noisy and transient data, among others. The results provide an insight into the relationships between the methods, showing that more effort is required to achieve generally coherent validation results between the FSV and the FSNMI.

Journal ArticleDOI
TL;DR: In this article, a central limit theorem for the annual or pluri-annual wind power production is derived and quantiles for one, ten or twenty years future periods are obtained.
Abstract: Wind power is an intermittent resource due to wind speed intermittency. However, wind speed can be described as a stochastic process with short memory. This allows us to derive a central limit theorem for the annual or pluri-annual wind power production and then get quantiles of the wind power production for one, ten or twenty years future periods. On the one hand, the interquantile spread offers a measurement of the intrinsic uncertainties of wind power production. On the other hand, different quantiles with different periods of time are used by financial institutions to quantify the financial risk of the wind turbine. Our method is then applied to real datasets corresponding to a French wind turbine. Since confidence intervals can be enhanced by taking into account seasonality, we present some tools for change point analysis on wind series.
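The CLT-based quantile construction can be sketched as follows. This is a minimal sketch under invented assumptions: an AR(1) daily production series stands in for the real wind data, and the long-run variance is estimated from truncated autocovariances rather than the paper's estimator.

```python
import math
import random
from statistics import NormalDist, mean

random.seed(42)

# Hypothetical daily wind power production (MWh) as a short-memory AR(1)
# process; all coefficients are invented for illustration, not fitted data.
phi, mu, sigma_eps = 0.6, 20.0, 6.0
x, series = mu, []
for _ in range(5000):
    x = mu + phi * (x - mu) + random.gauss(0, sigma_eps)
    series.append(max(x, 0.0))

m = mean(series)

def autocov(s, lag):
    mu_s = mean(s)
    return mean((s[i] - mu_s) * (s[i + lag] - mu_s) for i in range(len(s) - lag))

# Long-run variance of a short-memory series: sum of autocovariances
# (truncated here at lag 30).
sigma2_lr = autocov(series, 0) + 2 * sum(autocov(series, k) for k in range(1, 30))

def production_quantiles(days, p_low=0.05, p_high=0.95):
    # CLT: total production over `days` is approximately
    # Normal(days * m, days * sigma2_lr).
    nd = NormalDist(days * m, math.sqrt(days * sigma2_lr))
    return nd.inv_cdf(p_low), nd.inv_cdf(p_high)

q1 = production_quantiles(365)     # one-year quantiles
q10 = production_quantiles(3650)   # ten-year quantiles
```

The relative interquantile spread shrinks like 1/√N with the horizon, which is what makes multi-year production quantiles usable as financial risk bounds.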

Proceedings ArticleDOI
01 Oct 2014
TL;DR: A new X-band High Stable Synthesizer (HSS) architecture, which combines Phase Lock Loop (PLL), Direct Digital Synthesis (DDS) and multiplier techniques, in order to cope with low phase noise and frequency agility requirements of the next radar generations.
Abstract: Spectral purity and source stability represent two of the primary goals in the design of high-performance frequency synthesizers for modern radar systems. This paper presents a new X-band High Stable Synthesizer (HSS) architecture, which combines Phase Lock Loop (PLL), Direct Digital Synthesis (DDS) and multiplier techniques, in order to cope with the low phase noise and frequency agility requirements of the next radar generations. This synthesizer offers frequency coverage in X band with a percentage bandwidth of at least 10%, with fast switching time, high digitally-tunable resolution and very low phase noise. The achievement of such performance has been demonstrated by experimental results measured on a laboratory demonstrator: they show a lock-in time within 10 µs and a phase noise improvement of up to 30 dB compared to the current state-of-the-art Frequency Generation Unit (FGU). Moreover, for its versatility, the architecture is able to provide an intermediate output covering the needs of new S-L band applications.

09 Apr 2014
TL;DR: In this article, the authors propose sustainable business models for ICT platforms of eHealth services, highlighting economic and social benefits (medical cost reduction, better health care, enhanced coordination and data sharing among stakeholders, etc.).
Abstract: The recent spread of innovative ICT eHealth service platforms at the international and transnational levels confirms their role as a market booster in structuring the emerging eHealth sector. Indeed, such platforms can provide different types of both medical and social services packaged in a "bouquet". The integration of ICT platforms in the medico-social sector brings interesting opportunities in terms of economic and social benefits (medical cost reduction, better health care, enhanced coordination and data sharing amongst stakeholders, etc.), but also risks, raising economic, human and ethical barriers. While many studies have identified key success factors for these projects, this paper proposes to move forward and study the possible sustainable business models for an innovative ICT platform of eHealth services. In the light of the digital economy and through microeconomic theory, this paper offers potential solutions for dealing with various types of contracts and pricing models, related to revenue structure models. However, the potential models discussed may be difficult to transpose to the ICT eHealth sector, since its specific nature (trust, ethics) needs to be taken into account.

Book ChapterDOI
01 Jan 2014
TL;DR: In this chapter, the authors show that the assumption that any value within an interval can occur is a theoretical one, owed to our notion of "continuity".
Abstract: If a random measurement can in principle take any value on a metric numerical scale, one speaks of a continuous random variable. The body temperature or the systolic blood pressure of participants in a clinical study are typically continuous random variables. Time and length measurements are likewise typical examples. That any arbitrary value within an interval can occur is a theoretical assumption, owed to our notion of "continuity". In practice, owing to limited measurement accuracy and unavoidable rounding, the data will naturally always be available in discrete form. Nevertheless, it is sensible to treat such characteristics as continuous random variables when modeling.
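The chapter's point — that rounded, hence discrete, recordings are still well modelled by a continuous random variable — can be illustrated numerically. The temperature distribution and the one-decimal rounding grid below are invented for illustration.

```python
import random
from statistics import NormalDist

random.seed(0)

# Body temperature modelled as a continuous random variable (the 36.8 °C mean
# and 0.4 °C spread are invented); a thermometer reports it rounded to one
# decimal place, i.e. discretely.
model = NormalDist(36.8, 0.4)
recorded = [round(random.gauss(36.8, 0.4), 1) for _ in range(5000)]

def ecdf(data, x):
    # Empirical cumulative distribution function of the rounded sample.
    return sum(v <= x for v in data) / len(data)

# Evaluated between grid points, the rounded data still track the continuous
# model closely, which is the chapter's justification for continuous modelling.
max_gap = max(abs(ecdf(recorded, t) - model.cdf(t))
              for t in (36.05, 36.45, 36.85, 37.25))
```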

Book ChapterDOI
01 Jan 2014
TL;DR: In this chapter, the authors treat the χ²-test and Fisher's exact test of independence of characteristics, together with homogeneity tests that proceed computationally in complete analogy to them.
Abstract: This chapter treats the χ²-test and the exact test of independence of characteristics going back to R.A. Fisher. The χ² independence test is based directly on the χ² goodness-of-fit test, and the corresponding p-values are computed approximately when "larger" sample sizes are available. By contrast, "Fisher's exact test", which is preferably applied for smaller counts, delivers an exact computation of the probabilities in question. In homogeneity tests, the homogeneity of distributions across several samples is examined. Even though such tests are in principle to be distinguished from independence tests, we will encounter versions that proceed computationally in complete analogy to the χ²-test and to Fisher's exact test.
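For a 2×2 table, both tests can be computed from first principles. A minimal sketch (not the book's treatment, which uses R), exercised on Fisher's classic tea-tasting data:

```python
from math import comb

def chi2_stat(table):
    # Pearson chi-square statistic for a 2x2 contingency table.
    (a, b), (c, d) = table
    n = a + b + c + d
    rows, cols = (a + b, c + d), (a + c, b + d)
    stat = 0.0
    for i, obs in enumerate((a, b, c, d)):
        expected = rows[i // 2] * cols[i % 2] / n
        stat += (obs - expected) ** 2 / expected
    return stat

def fisher_exact(table):
    # Two-sided Fisher exact test: sum the hypergeometric probabilities of
    # all tables with the same margins that are no more likely than the
    # observed one.
    (a, b), (c, d) = table
    r1, c1, n = a + b, a + c, a + b + c + d
    def p(x):
        return comb(c1, x) * comb(n - c1, r1 - x) / comb(n, r1)
    p_obs = p(a)
    return sum(p(x) for x in range(max(0, r1 + c1 - n), min(r1, c1) + 1)
               if p(x) <= p_obs + 1e-12)

tea = [[3, 1], [1, 3]]   # Fisher's classic tea-tasting data
```

For this table the chi-square statistic is 2.0 (all expected counts equal 2), while the exact two-sided p-value is 34/70 ≈ 0.486 — illustrating why, for such small counts, the chapter recommends the exact test over the approximate χ² p-value.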

Book ChapterDOI
01 Jan 2014
TL;DR: In this chapter, the authors present real-world application examples of the statistical methods discussed so far.
Abstract: In this practice-oriented part we present some real-world application examples of the statistical methods discussed so far. Here we also want to take a closer look at the non-mathematical context in which the methods are applied.

Proceedings ArticleDOI
01 Dec 2014
TL;DR: In this paper, the authors focused on the design of the transmission lines utilized in microelectromechanical capacitive shunt connected switches and employed these transmission lines sections as matching elements to improve the performance of the component in the on state.
Abstract: This paper is focused on the design of the transmission lines utilized in microelectromechanical capacitive shunt connected switches. These transmission lines sections are employed as matching elements to improve the performance of the component in the on state. The image phase parameter is used to develop an analytic procedure for the synthesis of such transmission lines. The proposed method is applied for the design of a switch, adopting for the capacitive element the experimental data of a microelectromechanical device previously realized by the authors.

Book ChapterDOI
01 Jan 2014
TL;DR: In this article, the R-Commander is described as a grafisch erwahnte grafische Oberflache, die die Arbeit mit R wesentlich erleichtern soll.
Abstract: Nach der Installation des Programmes und den ersten zaghaften Schritten stellt sich dem Nutzer schnell die Frage: Wie arbeitet man am einfachsten und effzientesten mit R? Wir wollen im vorliegenden Kapitel versuchen, erste Antworten auf diese Frage zu geben, indem wir zuerst die verschiedenen Programmkomponenten und deren Zusammenspiel miteinander erlautern (Abschnitt 19.1). Der Begriff des Objektes ist bei der Arbeit mit R omniprasent. Ein gewisses Grundwissen uber die wichtigsten Objekttypen von R ist daher unumganglich zum Verstandnis der Arbeits- und Funktionsweise des Programms (Abschnitt 19.2). Der R-Commander ist die im vorherigen Kapitel schon mehrfach erwahnte grafische Oberflache, die die Arbeit mit R wesentlich erleichtern soll. Da der R-Commander daruber hinaus in diesem Buch eine zentrale Stellung einnimmt, geben wir in Abschnitt 19.3 eine erste Einfuhrung in seine Funktionsweise.

Book ChapterDOI
01 Jan 2014
TL;DR: This chapter centers on random experiments with a binary outcome, i.e., the random variable can take only two possible values.
Abstract: This chapter centers on random experiments with a binary outcome, i.e., the random variable can take only two possible values. Problems of this kind arise, for example, in industrial settings, as the following example is meant to show.
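A minimal numerical companion to the chapter's setting, using the binomial distribution that governs repeated binary trials (the quality-control numbers are invented):

```python
from math import comb

def binom_pmf(k, n, p):
    # Probability of exactly k "successes" in n independent binary trials.
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Invented quality-control example: a batch of 10 parts, each independently
# defective with probability 0.1.
p_no_defect = binom_pmf(0, 10, 0.1)
p_at_most_1 = binom_pmf(0, 10, 0.1) + binom_pmf(1, 10, 0.1)
```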

Book ChapterDOI
01 Jan 2014
TL;DR: In this chapter, the authors note that one of the nicest aspects of a statistician's work is coming into contact with a great many different disciplines and topics.
Abstract: One of the nicest aspects of a statistician's work is coming into contact with a great many different disciplines and topics. Biologists and physicians in particular are groups that a statistician can frequently support in their work.

Journal ArticleDOI
01 Jul 2014-Insight
TL;DR: In this article, the issues facing an owner operator who needs to be the Design Authority are discussed, as well as aspects of SE that require further work to support an owner DA, and how SE can support a DA on a complex systems program in such a shared design management arrangement and regulated environment.
Abstract: Over the years, acquisition methodologies, e.g. BVP, have resulted in a shift in engineering procurement whereby large organisations have increasingly asked their supply chain to take on responsibility for part of, or the overall, system design. However, just handing a set of requirements over to a supplier and then waiting for them to deliver is a high-risk strategy, especially with novel/complex programmes. To reduce the risk of problems, and to comply with some regulatory environments, the intelligent customer needs to take design responsibility by focusing on technically assuring that the system will be 'fit for purpose'. This paper explores the issues facing an owner operator who needs to be the Design Authority, identifies aspects of SE that require further work to support an Owner DA, and shows how SE can support a DA on a complex systems programme in such a shared design management arrangement and regulated environment.