
Showing papers by "City University London published in 1998"


Journal ArticleDOI
TL;DR: This paper considers some of the key developments in camera calibration for aerial mapping, attempts to put them into perspective, and highlights the driving forces behind each improvement.
Abstract: Correcting images for camera distortion has been a major concern of users for as long as they have wanted to use, or faithfully reconstruct, the information observed. Mapping was initially the principal application. While that task continues today, other applications also require precise camera calibration, such as close-range three-dimensional determinations and many other two-dimensional measurements. In the past, the cameras used were few in number and very expensive, whereas today a large industrial company will typically own many inexpensive cameras that it uses for very substantial measurement work. Cameras are used today far more than they ever were, but the golden age of calibrating aerial survey cameras for mapping is now over. This paper examines some of the key trials and developments and puts them into perspective. In particular, it highlights the driving forces behind each improvement.

404 citations


Journal ArticleDOI
TL;DR: A method and software assistant tool for scenario-based RE that integrates with use case approaches to object-oriented development is reported; the tool suggests appropriate generic requirements to deal with the problems encountered.
Abstract: Scenarios have been advocated as a means of improving requirements engineering yet few methods or tools exist to support scenario based RE. The paper reports a method and software assistant tool for scenario based RE that integrates with use case approaches to object oriented development. The method and operation of the tool are illustrated with a financial system case study. Scenarios are used to represent paths of possible behavior through a use case, and these are investigated to elaborate requirements. The method commences by acquisition and modeling of a use case. The use case is then compared with a library of abstract models that represent different application classes. Each model is associated with a set of generic requirements for its class, hence, by identifying the class(es) to which the use case belongs, generic requirements can be reused. Scenario paths are automatically generated from use cases, then exception types are applied to normal event sequences to suggest possible abnormal events resulting from human error. Generic requirements are also attached to exceptions to suggest possible ways of dealing with human error and other types of system failure. Scenarios are validated by rule based frames which detect problematic event patterns. The tool suggests appropriate generic requirements to deal with the problems encountered. The paper concludes with a review of related work and a discussion of the prospects for scenario based RE methods and tools.

376 citations
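The scenario-generation step described in the abstract above — deriving abnormal event sequences from a normal path through a use case by applying exception types modelling human error — can be sketched as follows. This is an invented illustration, not the tool's actual algorithm; the exception types and event names are hypothetical.

```python
def omit(events, i):
    """Exception type 'omission': a required event is skipped."""
    return events[:i] + events[i + 1:]

def swap_adjacent(events, i):
    """Exception type 'reversal': two adjacent events occur out of order."""
    out = list(events)
    out[i], out[i + 1] = out[i + 1], out[i]
    return out

def generate_scenarios(normal_path):
    """Apply each exception type at every position of the normal path,
    yielding candidate abnormal scenarios for requirements analysis."""
    scenarios = []
    for i in range(len(normal_path)):
        scenarios.append(("omission", omit(normal_path, i)))
    for i in range(len(normal_path) - 1):
        scenarios.append(("reversal", swap_adjacent(normal_path, i)))
    return scenarios

# Hypothetical normal event sequence through an ATM use case:
normal = ["insert card", "enter PIN", "choose amount", "take cash"]
for kind, path in generate_scenarios(normal):
    print(kind, "->", path)
```

Each generated path would then be checked against generic requirements for handling that class of error, as the method describes.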


Journal ArticleDOI
TL;DR: To be included, studies had to detail the number of participants in each group and the nature, duration, span, and delivery of treatment, provide a comparison of pre- and post-intervention speech and language measures, and fulfill one of three design criteria.
Abstract: Study selection (study designs of evaluations included in the review): for intervention, the studies had to detail the number of participants in each group and the nature, duration, span, and delivery of treatment, provide a comparison of pre- and post-intervention speech and language measures, and fulfill one of three design criteria: an experimental study with randomised non-treatment controls; a quasi-experimental study (with non-random/pseudo-random or non-equivalent non-treatment control groups); or a single-subject experimental design with graphical displays or session-by-session data for individuals.

335 citations


Journal ArticleDOI
TL;DR: The authors presented and discussed transcripts of some 270 explanations subjects provided subsequently for recognition memory decisions that had been associated with remember, know, or guess responses at the time the recognition decisions were made.

321 citations


Journal ArticleDOI
TL;DR: A model for matching COTS product features with user requirements is proposed, extending state-of-the-art requirements acquisition techniques to the component-based software engineering process.
Abstract: Commercial off the shelf software can save development time and money if you can find a package that meets your customer's needs. The authors propose a model for matching COTS product features with user requirements. To support requirements acquisition for selecting commercial off the shelf products, we propose a method we used recently for selecting a complex COTS software system that had to comply with over 130 customer requirements. The lessons we learned from that experience refined our design of PORE (procurement oriented requirements engineering), a template based method for requirements acquisition. We report 11 of these lessons, with particular focus on the typical problems that arose and solutions to avoid them in the future. These solutions, we believe, extend state of the art requirements acquisition techniques to the component based software engineering process.

300 citations


Journal ArticleDOI
TL;DR: The paper is an attempt to explore some of the issues underlying scenario-based approaches in requirements engineering and to propose a framework for their classification, a four-dimensional framework which advocates that a scenario- based approach can be well defined by its form, content, purpose and life cycle.
Abstract: The requirements engineering, information systems and software engineering communities recently advocated scenario-based approaches which emphasise the user/system interaction perspective in developing computer systems. Use of examples, scenes, narrative descriptions of contexts, mock-ups and prototypes-all these ideas can be called scenario-based approaches, although exact definitions are not easy beyond stating that these approaches emphasise some description of the real world. Experience seems to tell us that people react to ‘real things’ and that this helps in clarifying requirements. Indeed, the widespread acceptance of prototyping in system development points to the effectiveness of scenario-based approaches. However, we have little understanding about how scenarios should be constructed, little hard evidence about their effectiveness and even less idea about why they work. The paper is an attempt to explore some of the issues underlying scenario-based approaches in requirements engineering and to propose a framework for their classification. The framework is a four-dimensional framework which advocates that a scenario-based approach can be well defined by its form, content, purpose and life cycle. Every dimension is itself multifaceted and a metric is associated with each facet. Motivations for developing the framework are threefold: (a) to help in understanding and clarifying existing scenario-based approaches; (b) to situate the industrial practice of scenarios; and (c) to assist researchers develop more innovative scenario-based approaches.

267 citations


Journal ArticleDOI
TL;DR: The adequacy of similarity to prototype as an account of categorization in natural concepts was assessed by analyzing the monotonicity of the relation between typicality of an item in a category and the probability of a positive categorization response using data from McCloskey and Glucksberg (1978).

197 citations


Journal ArticleDOI
TL;DR: This paper investigated perceived occupational stress in a sample of 582 academic staff members working in institutions of higher education in the UK and found that women perceived the structure and content of their jobs similarly to men.
Abstract: The study investigated perceived occupational stress in a sample of 582 academic staff members working in institutions of higher education in the UK. Data was collected using the Maslach Burnout Inventory (Maslach and Jackson 1986), The Job Diagnostic Survey (Hackman and Oldham 1974) and the Faculty Stress Index (Gmelch et al. 1986). The results indicate that women academics perceive the structure and content of their jobs similarly to men. However, women generally experience higher overall levels of stress in their jobs and results indicate that they may cope better with the demands placed upon them than their male counterparts. There is some evidence of the presence of a ‘glass ceiling’ in the institutions studied, with women holding more junior positions, and remaining in them longer, than men. A difference in effect size was found between those women who do achieve senior positions and men in similar posts. Higher grades predict greater job strain for women but not for men. The results from this study suggest fruitful avenues for future research.

174 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine the relationship between the two testing goals, using a probabilistic analysis, and try to answer the question of how to attain program reliability: is it better to test by probing for defects as in debug testing, or to assess reliability directly as in operational testing?
Abstract: There are two main goals in testing software: (1) to achieve adequate quality (debug testing), where the objective is to probe the software for defects so that these can be removed, and (2) to assess existing quality (operational testing), where the objective is to gain confidence that the software is reliable. Debug methods tend to ignore random selection of test data from an operational profile, while for operational methods this selection is all-important. Debug methods are thought to be good at uncovering defects so that these can be repaired, but having done so they do not provide a technically defensible assessment of the reliability that results. On the other hand, operational methods provide accurate assessment, but may not be as useful for achieving reliability. This paper examines the relationship between the two testing goals, using a probabilistic analysis. We define simple models of programs and their testing, and try to answer the question of how to attain program reliability: is it better to test by probing for defects as in debug testing, or to assess reliability directly as in operational testing? Testing methods are compared in a model where program failures are detected and the software changed to eliminate them. The "better" method delivers higher reliability after all test failures have been eliminated. Special cases are exhibited in which each kind of testing is superior. An analysis of the distribution of the delivered reliability indicates that even simple models have unusual statistical properties, suggesting caution in interpreting theoretical comparisons.

170 citations
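The debug-versus-operational comparison described above can be caricatured in a toy Monte Carlo model. All parameters here are invented for illustration, and the model is far simpler than the paper's: defects have operational-profile occurrence rates; operational testing triggers (and fixes) defects in proportion to those rates, while debug testing probes remaining defects uniformly regardless of how often they would occur in operation.

```python
import random

def operational_test(rates, n_tests, rng):
    """Each test is drawn from the operational profile: defect i is
    triggered (and then fixed) with probability rates[i]."""
    remaining = dict(enumerate(rates))
    for _ in range(n_tests):
        for i, p in list(remaining.items()):
            if rng.random() < p:
                del remaining[i]
                break
    return sum(remaining.values())  # delivered failure probability

def debug_test(rates, n_tests, rng):
    """Each test probes one remaining defect chosen uniformly and
    (optimistically) always exposes it, so it can be fixed."""
    remaining = dict(enumerate(rates))
    for _ in range(n_tests):
        if not remaining:
            break
        del remaining[rng.choice(list(remaining))]
    return sum(remaining.values())

rng = random.Random(0)
# One large-rate defect and many small-rate ones (invented numbers):
rates = [0.05] + [0.001] * 20
trials = 1000
op = sum(operational_test(rates, 10, rng) for _ in range(trials)) / trials
dbg = sum(debug_test(rates, 10, rng) for _ in range(trials)) / trials
print(f"mean delivered failure prob - operational: {op:.4f}  debug: {dbg:.4f}")
```

Varying the rate distribution and detection assumptions flips which method delivers higher reliability, echoing the paper's point that special cases exist in which each kind of testing is superior.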


Journal ArticleDOI
TL;DR: This paper summarises the reflective properties of the ocular tapeta often found in deep-sea fish, the pigmentation of their lenses and the absorption characteristics of their visual pigments, and highlights three genera of stomiid dragonfishes, which uniquely produce far red bioluminescence from suborbital photophores.

147 citations


Journal ArticleDOI
TL;DR: An empirical study of attempts to achieve change in clinical behaviour across a United Kingdom National Health Service (NHS) Health Authority (HA) suggests that the evidence based medicine (EBM) movement underpinning such attempts is premised upon a highly rationalistic conception of change.


Journal ArticleDOI
TL;DR: The authors used the longitudinal employment history records for the 3,898 33-year-old mothers in the Fifth Sweep of the 1958 National Child Development Study cohort in the United Kingdom to examine the dynamics of women's labour supply at a crucial stage of their lifecycle.
Abstract: The dynamics of women's labour supply are examined at a crucial stage of their lifecycle. This paper uses the longitudinal employment history records for the 3,898 33-year-old mothers in the Fifth Sweep of the 1958 National Child Development Study cohort in the United Kingdom. Models of binary recurrent events are estimated, which correct for unobserved heterogeneity, using SABRE software. These focus on women's first transition to employment after the first childbirth, and on the monthly transitions from first childbirth until censoring at the interview. Evidence of a polarization is found between highly educated, high-wage mothers and lower-educated, low-wage mothers.

Journal ArticleDOI
TL;DR: A method for scenario-based requirements engineering is described, which uses two types of scenario: structure models of the system context and scripts of system usage.
Abstract: A method for scenario-based requirements engineering is described. The method uses two types of scenario: structure models of the system context and scripts of system usage. A modelling language is reported for describing scenarios, and heuristics are given to cross-check dependencies between scenario models and the requirements specification. Heuristics are grouped into several analytic treatments that investigate correspondences between users’ goals and system functions; input events and system processes to deal with them; system output and its destination in the scenario model, and acceptability analysis of system output for different stakeholders. The method is illustrated with a case study taken from the London Ambulance Service report. The prospects for scenario-based requirements engineering and related work are discussed.

Journal ArticleDOI
TL;DR: A theory of domain knowledge is proposed to define the semantics and composition of generic domain models in the context of requirements engineering and a modeling language and a library of models arranged in families of classes are described.
Abstract: Retrieval, validation, and explanation tools are described for cooperative assistance during requirements engineering and are illustrated by a library system case study. Generic models of applications are reused as templates for modeling and critiquing requirements for new applications. The validation tools depend on a matching process which takes facts describing a new application and retrieves the appropriate generic model from the system library. The algorithms of the matcher, which implement a computational theory of analogical structure matching, are described. A theory of domain knowledge is proposed to define the semantics and composition of generic domain models in the context of requirements engineering. A modeling language and a library of models arranged in families of classes are described. The models represent the basic transaction processing or 'use case' for a class of applications. Critical difference rules are given to distinguish between families and hierarchical levels. Related work and future directions of the domain theory are discussed.
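The retrieval step described above — taking facts about a new application and selecting the best generic model from the library — might be sketched, at its crudest, as scoring fact overlap. This is a hypothetical illustration only (the library entries and fact names are invented, and the actual matcher implements analogical structure matching, not simple set overlap):

```python
# Invented miniature library of generic domain models, each described
# by a set of characteristic facts:
library = {
    "resource_lending": {"resource", "borrower", "loan", "return"},
    "resource_hiring":  {"resource", "client", "payment", "return"},
    "object_sale":      {"goods", "buyer", "payment", "transfer"},
}

def retrieve(facts):
    """Return the library model whose fact set best overlaps `facts`
    (Jaccard similarity stands in for structure matching)."""
    def score(model_facts):
        return len(model_facts & facts) / len(model_facts | facts)
    return max(library, key=lambda name: score(library[name]))

# A library system presents lending-like facts:
print(retrieve({"resource", "borrower", "loan", "catalogue"}))
```

Once the class is retrieved, its associated generic requirements can be reused to critique the new application's requirements, as the abstract describes.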

Journal ArticleDOI
TL;DR: Nouns were produced more successfully than verbs in spontaneous speech, picture naming and when naming to definition, suggesting that an inability to access verbs' phonological representations can severely impair sentence formulation.

Journal ArticleDOI
TL;DR: The 'Bart's Nursing OSCE' is an innovative approach to the assessment of clinical skills, through the medium of simulated professional practice, which focuses upon the rationale, authenticity, validity and reliability of the assessment.


Journal ArticleDOI
TL;DR: Results suggest a specific influence of estrogen in men on verbal memory tasks, similar to that seen in prior studies of women, which is discussed in terms of differential processing demands of the two memory tasks.

Journal ArticleDOI
TL;DR: To determine the repeatability of measurement of the corneal sensation threshold using the Non‐Contact Corneal Aesthesiometer (NCCA) and to compare the sensation thresholds found with the NCCA and with the Cochet–Bonnet Aesthesiometer (C–BA) on the same group of normal human subjects.

Proceedings ArticleDOI
06 Apr 1998
TL;DR: A new method which integrates techniques from several disciplines in response to lessons learned from a complex commercial off-the-shelf product selection exercise is proposed, and lessons learned inform design of PORE (Procurement-Oriented Requirements Engineering), a template-based method for requirements acquisition.
Abstract: An increasing number of organisations are procuring off-the-shelf systems from commercial suppliers. However, successful selection of off-the-shelf systems to fit customer requirements remains problematic. The London Ambulance Service fiasco in 1992 is a well-known example of system failure due, at least in part, to poor product selection. New methods and techniques for requirements acquisition and product selection are needed. The authors propose a new method which integrates techniques from several disciplines in response to lessons learned from a complex commercial off-the-shelf product selection exercise undertaken by the authors. They report on a recent experience in selecting a complex commercial off-the-shelf software system to be compliant with over 130 customer requirements, and lessons learned from the experience. These lessons learned inform design of PORE (Procurement-Oriented Requirements Engineering), a template-based method for requirements acquisition. This paper reports 11 of these lessons. Particular focus is put on the typical problems which arose during acquisition of requirements to enable this selection, and solutions to avoid these problems in the future.

Journal ArticleDOI
TL;DR: This article attempts to lay down theoretical groundwork for information retrieval (IR) that involves the combined efforts of several users based on the ethos of voluntary cooperation to facilitate free exchange of ideas and stimulate creativity.
Abstract: This article attempts to lay down theoretical groundwork for information retrieval (IR) that involves the combined efforts of several users. It is argued that the fundamental intellectual problems of IR are the production and consumption of knowledge. Knowledge production is fundamentally a collaborative labor, which is deeply embedded in the practices of a community of participants constituting a domain. The current technological advances in networked systems make the intertextual and intersubjective nature of meaning production and communication readily visible by merging various heterogeneous media into the homogenizing framework of the digital medium. Collaborative IR as envisaged in this article would be based on the ethos of voluntary cooperation to facilitate free exchange of ideas and stimulate creativity. What sorts of functionalities can be expected in a Collaborative IR system are illustrated with the help of some examples of collaborative systems and services from various domains. © 1998 John Wiley & Sons, Inc.

Journal ArticleDOI
TL;DR: The research, based at the Social Statistics Research Unit, City University, Northampton Square, London EC1V 0HB, UK, was sponsored by the Basic Skills Agency for England and Wales, which funded the surveys of the 1958 and 1970 cohorts involving the assessment of basic skills.
Abstract: Requests for reprints should be sent to John Bynner, Social Statistics Research Unit, City University, Northampton Square, London EC1V 0HB, UK. The research reported here was sponsored by the Basic Skills Agency for England and Wales who funded the surveys of the 1958 and 1970 cohorts involving the assessment of basic skills. The views expressed in this paper are, however, those of the author alone. The whole birth cohort study programme on which the work builds has been funded mainly by the UK Economic and Social Research Council and UK government departments. Gratitude is due to them and to the cohort members who have freely given their co-operation in the studies throughout their lives.

Journal ArticleDOI
TL;DR: An innovative programme of shared learning in acute care, involving final year medical students and newly qualified staff nurses, developed in response to the blurring of professional roles between nurses and junior doctors is discussed.

Journal ArticleDOI
TL;DR: This systematic review was hypothesis driven and aimed to establish whether the available evidence was sufficient to warrant the introduction of universal screening for speech and language delays in children up to seven years of age.
Abstract: Screening young children for developmental conditions such as speech and language delay is considered to be a part of the Child Health Surveillance programme in the UK. It is currently practised in many different ways throughout the country and, like screening for other conditions conventionally identified in infancy, has been the subject of some concern for those responsible for providing such services. This systematic review (Law et al. 1998) was hypothesis driven and aimed to: i) establish whether the available evidence was sufficient to warrant the introduction of universal screening for speech and language delays in children up to seven years of age; ii) identify gaps in the available literature; iii) identify priority areas in need of further investigation; and iv) provide evidence-based recommendations for the future provision of services.

Journal ArticleDOI
01 Feb 1998
TL;DR: A framework for modelling uncertainty and combining diverse evidence was provided in such a way that it could be used to represent an entire argument about a system's dependability and was subsequently proved to be practical and highly popular.
Abstract: A primary objective of the DATUM (Dependability Assessment of safety critical systems Through the Unification of Measurable evidence) project was to improve the way dependability of software intensive safety-critical systems was assessed. The authors' hypothesis was that improvements were possible if multiple types of evidence could be incorporated. To achieve the objective, the authors had to investigate how to obtain improved dependability predictions given certain specific information over and above failure data alone. A framework for modelling uncertainty and combining diverse evidence was provided in such a way that it could be used to represent an entire argument about a system's dependability. The various methods and technologies for modelling uncertainty were examined in depth and a Bayesian approach was selected as the most appropriate method. To implement this approach for combining evidence, Bayesian belief networks (BBNs) were used. With the help of a BBN tool, a framework for dependability assessment was provided that met the original objective and which was subsequently proved to be practical and highly popular. A major benefit of this approach was that otherwise hidden assumptions used in an assessment become visible and auditable.
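The core idea of combining diverse evidence in a Bayesian belief network can be illustrated with a minimal two-node fragment: a prior belief in system dependability updated by conditionally independent pieces of evidence. The numbers and evidence names below are invented for illustration and are not from the DATUM project.

```python
prior_dependable = 0.5  # assumed prior belief, before any evidence

# Likelihoods of observing each piece of evidence given system state:
#   name: (P(evidence | dependable), P(evidence | not dependable))
evidence = {
    "clean_failure_data": (0.9, 0.3),
    "good_process_audit": (0.8, 0.4),
}

def posterior(prior, observations):
    """Update P(dependable) by Bayes' rule, treating the observed
    evidence items as conditionally independent given system state."""
    p_d, p_nd = prior, 1.0 - prior
    for name in observations:
        l_d, l_nd = evidence[name]
        p_d, p_nd = p_d * l_d, p_nd * l_nd
    return p_d / (p_d + p_nd)

print(posterior(prior_dependable, ["clean_failure_data"]))
print(posterior(prior_dependable, ["clean_failure_data", "good_process_audit"]))
```

Writing the assessment this way makes the otherwise hidden assumptions (the prior and the likelihoods) explicit and auditable, which is the benefit the abstract claims for the BBN approach.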

Journal ArticleDOI
01 Jun 1998
TL;DR: It is argued that effective design and evaluation are dependent upon the adoption of appropriate methodology set firmly within a systemic framework, and systems modeling is proposed as an approach to system design, with evaluation adopting an approach incorporating evaluability analysis and formative and summative evaluation.
Abstract: In this paper, the design and evaluation of decision support systems, including those incorporating a telematic component, are considered. It is argued that effective design and evaluation are dependent upon the adoption of appropriate methodology set firmly within a systemic framework. Systems modeling is proposed as an approach to system design, with evaluation adopting an approach incorporating evaluability analysis and formative and summative evaluation, including the use of stakeholder matrix analysis. The relevance of such systemic methodology is demonstrated in the context of diabetes and end-stage renal disease as examples of the generic clinical problem of the management of chronic disease.

Proceedings ArticleDOI
06 Apr 1998
TL;DR: The method worked well but there were problems in the use of design rationale and control of turn taking in RE sessions, and future improvements for the method are discussed.
Abstract: A method of scenario based requirements engineering is described that uses a combination of early prototypes, scenario scripts and design rationale to elicit and validate user requirements. Experience in using the method on an EU project, Multimedia Broker, is reported. Quantitative data on requirements sessions is analysed to assess user participation and quality of requirements captured. The method worked well but there were problems in the use of design rationale and control of turn taking in RE sessions. Lessons learned in using the method are summarised and future improvements for the method are discussed.

Journal ArticleDOI
TL;DR: The pattern of latency variation for pupil responses and reaction times suggests that the mechanisms that trigger the responses lie at different levels in cortex, which is surprising given present knowledge of visual cortical organization.
Abstract: Visual latencies, and their variation with stimulus attributes, can provide information about the level in the visual system at which different attributes of the image are analysed, and decisions about them made. A change in the colour, structure or movement of a visual stimulus brings about a highly reproducible transient constriction of the pupil that probably depends on visual cortical mechanisms. We measured this transient response to changes in several attributes of visual stimuli, and also measured manual reaction times to the same stimulus changes. Through analysis of latencies, we hoped to establish whether changes in different stimulus attributes were processed by mechanisms at the same or different levels in the visual pathway. Pupil responses to a change in spatial structure or colour are almost identical, but both are ca. 40 ms slower than those to a change in light flux, which are thought to depend largely on subcortical pathways. Manual reaction times to a change in spatial structure or colour, or to the onset of coherent movement, differ reliably, and all are longer than the reaction time to a change in light flux. On average, observers take 184 ms to detect a change in light flux, 6 ms more to detect the onset of a grating, 30 ms more to detect a change in colour, and 37 ms more to detect the onset of coherent motion. The pattern of latency variation for pupil responses and reaction times suggests that the mechanisms that trigger the responses lie at different levels in cortex. Given our present knowledge of visual cortical organization, the long reaction time to the change in motion is surprising. The range of reaction times across different stimuli is consistent with decisions about the onset of a grating being made in V1 and decisions about the change in colour or change in motion being made in V4.

Journal ArticleDOI
TL;DR: The ageing eye shows a more rapid increase in forward scatter, and a reduction in contrast sensitivity despite apparently good visual acuity, above 45 years of age, according to a new scatter test implemented on the P_SCAN 100 pupillometer.