
Showing papers on "Cloud computing published in 2007"


Journal ArticleDOI
01 Dec 2007
TL;DR: This research presents a meta-service architecture that automates the labor-intensive, time-consuming, and therefore expensive process of developing and deploying new types of services and applications.
Abstract: Powerful services and applications are being integrated and packaged on the Web in what the industry now calls "cloud computing"

677 citations


01 Jan 2007
TL;DR: It is found that this collection of Amazon Web Services (AWS) has great promise but is hobbled by service consistency problems, the lack of a Service Level Agreement (SLA), and a problematic Web Services Licensing Agreement (WSLA).
Abstract: Amazon.com’s Elastic Compute Cloud (EC2), Simple Storage Service (S3) and Simple Queue Service (SQS) offer enterprise-class computing, storage and coordination facilities to any organization or individual in the world with a valid credit card. This paper details our experience working with these commodity grid computing services between November 2006 and May 2007, including an analysis of the overall system’s API and ease-of-use; an analysis of EC2’s management and security facilities; an end-to-end performance analysis of S3’s throughput and latency as observed from Amazon’s EC2 cluster and other locations on the Internet; and an analysis of the SQS operation and performance. We conclude with a report of our experience moving a large-scale research application from dedicated hardware to the Amazon offering. We find that this collection of Amazon Web Services (AWS) has great promise but is hobbled by service consistency problems, the lack of a Service Level Agreement (SLA), and a problematic Web Services Licensing Agreement (WSLA).

322 citations


Proceedings ArticleDOI
27 Aug 2007
TL;DR: The design and implementation of distributed rate limiters are presented, which work together to enforce a global rate limit across traffic aggregates at multiple sites, enabling the coordinated policing of a cloud-based service's network traffic.
Abstract: Today's cloud-based services integrate globally distributed resources into seamless computing platforms. Provisioning and accounting for the resource usage of these Internet-scale applications presents a challenging technical problem. This paper presents the design and implementation of distributed rate limiters, which work together to enforce a global rate limit across traffic aggregates at multiple sites, enabling the coordinated policing of a cloud-based service's network traffic. Our abstraction not only enforces a global limit, but also ensures that congestion-responsive transport-layer flows behave as if they traversed a single, shared limiter. We present two designs - one general purpose, and one optimized for TCP - that allow service operators to explicitly trade off between communication costs and system accuracy, efficiency, and scalability. Both designs are capable of rate limiting thousands of flows with negligible overhead (less than 3% in the tested configuration). We demonstrate that our TCP-centric design is scalable to hundreds of nodes while robust to both loss and communication delay, making it practical for deployment in nationwide service providers.
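The core idea — apportioning one global limit among distributed limiters according to observed demand — can be sketched in a few lines. This is an illustrative sketch only, not the paper's actual algorithm (which coordinates token buckets and TCP-aware flow estimators); the site names and demand figures below are invented.

```python
# Sketch of demand-proportional global rate limiting (illustrative; not the
# paper's algorithm). Each site periodically reports its recent traffic
# demand, and the global limit is divided among sites in proportion to it.

def apportion_limit(global_limit, demands):
    """Split a global rate limit across sites proportionally to demand."""
    total = sum(demands.values())
    if total == 0:
        # No demand anywhere: split the limit evenly.
        share = global_limit / len(demands)
        return {site: share for site in demands}
    return {site: global_limit * d / total for site, d in demands.items()}

# Hypothetical sites and demands (Mbit/s); local caps always sum to 100.
limits = apportion_limit(100.0, {"us-east": 30.0, "eu-west": 60.0, "ap-south": 10.0})
print(limits)
```

Each limiter then enforces only its local share (e.g. with an ordinary token bucket), so the aggregate never exceeds the global limit even though no single box sees all the traffic.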

244 citations


Journal ArticleDOI
TL;DR: The U.S. Department of Energy Atmospheric Radiation Measurement (ARM) Program operates millimeter-wavelength cloud radars in several climatologically distinct regions; a new set of operational modes enabled by recently upgraded digital signal processors is presented.
Abstract: The U.S. Department of Energy Atmospheric Radiation Measurement (ARM) Program operates millimeter-wavelength cloud radars in several climatologically distinct regions. The digital signal processors for these radars were recently upgraded and allow for enhancements in the operational parameters running on them. Recent evaluations of millimeter-wavelength cloud radar signal processing performance relative to the range of cloud dynamical and microphysical conditions encountered at the ARM Program sites have indicated that improvements are necessary, including significant improvement in temporal resolution (i.e., less than 1 s for dwell and 2 s for dwell and processing), wider Nyquist velocities, operational dealiasing of the recorded spectra, removal of pulse compression while sampling the boundary layer, and continuous recording of Doppler spectra. A new set of millimeter-wavelength cloud radar operational modes that incorporate these enhancements is presented. A significant change in radar samplin...

131 citations


Journal ArticleDOI
24 Oct 2007-Nature
TL;DR: Cloud computing is being pitched as a new nirvana for scientists drowning in data; this news feature investigates whether it can deliver for scientific applications.
Abstract: 'Cloud computing' is being pitched as a new nirvana for scientists drowning in data. But can it deliver? Eric Hand investigates.

92 citations


Journal ArticleDOI
TL;DR: In this paper, a quadrature-based method is proposed for the averaging of photochemistry over complex cloud fields within a grid square and can be readily implemented in current global models.
Abstract: A new approach defined here allows for the averaging of photochemistry over complex cloud fields within a grid square and can be readily implemented in current global models. As diagnosed from observations or meteorological models, fractional cloud cover with many overlying cloud layers can generate hundreds to thousands of different cloud profiles per grid square. We define a quadrature-based method, applied here to the problem of averaging photolysis rates over this range of cloud patterns, which opens new opportunities for modeling in-cloud chemistry in global models. We select up to four representative cloud profiles, optimizing the selection and weighting of each to minimize the difference in photolysis rates when compared with the integration over the entire set of cloud distributions. To implement our algorithm, we adapt the UCI fast-JX photolysis code to the cloud statistics from the ECMWF forecast model at T42L40 resolution. For the tropics and midlatitudes, grid-square-averaged photolysis rates for O3, NO2, and NO3 using four representative atmospheres differ by at most 3.2% RMS from rates averaged over the hundreds or more cloudy atmospheres derived from a maximum-random overlap scheme. Further, bias errors in both the free troposphere and the boundary layer are less than 1%. Similar errors are shown to be 10–20% for current approximation methods. Errors in the quadrature method are less than the uncertainty in the choice of maximum-random overlap schemes. We apply the method to the averaging of photochemistry over different cloud profiles and outline extensions to heterogeneous cloud chemistry.
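The representative-profile idea can be illustrated with a toy version: collapse many per-profile photolysis rates into a handful of (value, weight) pairs whose weighted sum approximates the full average. This sketch uses simple equal-probability binning rather than the paper's optimized quadrature selection, and the rate values are invented numbers.

```python
# Toy sketch of "few representatives instead of many profiles" (not the
# UCI fast-JX implementation): sort the per-profile photolysis rates into
# k equal-size bins; each bin's mean stands in for all of its members.

def representatives(rates, k):
    """Reduce many per-profile rates to k (rate, weight) pairs."""
    ordered = sorted(rates)
    n = len(ordered)
    reps = []
    for i in range(k):
        bin_ = ordered[i * n // k:(i + 1) * n // k]
        reps.append((sum(bin_) / len(bin_), len(bin_) / n))
    return reps

# Hypothetical photolysis rates for 8 cloud profiles in one grid square.
rates = [0.1, 0.2, 0.25, 0.3, 0.5, 0.55, 0.6, 0.9]
reps = representatives(rates, 4)        # only 4 profiles need full photochemistry
approx = sum(r * w for r, w in reps)    # weighted average over representatives
exact = sum(rates) / len(rates)         # average over all profiles
```

The payoff is that the expensive in-cloud chemistry runs only k times per grid square instead of hundreds, while the weighted average stays close to the full integration.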

76 citations


Journal Article
TL;DR: A novel similarity measuring method, namely the likeness comparing method based on cloud model (LICM), is proposed; it compares the similarity of two users at the knowledge level, which can overcome the drawback of strict attribute matching.
Abstract: Recommendation systems are among the most important technologies applied in e-commerce. The similarity measuring method is fundamental to collaborative filtering algorithms, and traditional methods are inefficient, especially when the user rating data are extremely sparse. Based on the outstanding characteristics of the cloud model in transforming a qualitative concept into a set of quantitative numerical values, a novel similarity measuring method, namely the likeness comparing method based on cloud model (LICM), is proposed in this paper. LICM compares the similarity of two users at the knowledge level, which can overcome the drawback of strict attribute matching. This work analyses traditional methods thoroughly and puts forward a novel collaborative filtering algorithm based on the LICM method. Experiments on a typical data set show the excellent performance of the present collaborative filtering algorithm based on LICM, even with extremely sparse data.
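A hedged sketch of how a cloud-model similarity of this kind can work: each user's ratings are reduced to the cloud model's three numerical characteristics — expectation Ex, entropy En, and hyper-entropy He, using the standard backward cloud generator formulas — and two users are compared by the cosine of their characteristic vectors. The formulas and rating data here are illustrative and may differ in detail from the paper's.

```python
import math

# Sketch of an LICM-style similarity (assumed formulas, not the paper's
# exact method): summarize a user's ratings by cloud characteristics
# (Ex, En, He), then compare users by cosine similarity of those vectors.

def cloud_features(ratings):
    n = len(ratings)
    ex = sum(ratings) / n                                   # expectation Ex
    en = math.sqrt(math.pi / 2) * sum(abs(r - ex) for r in ratings) / n  # entropy En
    var = sum((r - ex) ** 2 for r in ratings) / n
    he = math.sqrt(abs(var - en ** 2))                      # hyper-entropy He
    return (ex, en, he)

def licm_similarity(u, v):
    a, b = cloud_features(u), cloud_features(v)
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Different item-by-item ratings, but identical rating distributions:
# similarity is 1.0, illustrating comparison "at the knowledge level"
# rather than by strict attribute matching.
print(licm_similarity([5, 4, 4, 3], [4, 5, 3, 4]))
```

Because the comparison is between rating distributions rather than co-rated items, it remains usable even when two users share few or no items, which is the sparse-data case the abstract highlights.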

68 citations


Journal ArticleDOI
G S Pankiewicz1
TL;DR: Currently available pattern recognition techniques, including supervised and unsupervised classification, image segmentation and scale context, are reviewed, together with a method for selecting the optimal image characteristics for the application of interest.
Abstract: A wealth of often under-used information is present in visible and infrared meteorological satellite imagery, in the form of cloud size, shape, texture and context. In an effort to provide an objective system that can identify meteorological objects over a range of scales, a study of currently available pattern recognition techniques has been undertaken. These techniques, which include supervised and unsupervised classification, image segmentation and scale context, are reviewed, together with a method for selecting the optimal image characteristics for the application of interest. This paper also considers a number of meteorological applications where these techniques can be used, such as cloud classification for operational or climatological purposes and the improvement of numerical weather prediction models. The result of the study has been the design of a synoptic-scale object recognition system based on three layers of artificial neural networks, each operating at a different scale in a bottom-up approach.

63 citations



Journal ArticleDOI
TL;DR: In this article, the authors survey the optical and microphysical properties of high (ice) clouds over the Tropics (30°S–30°N) over a 3-yr period from September 2002 through August 2005.
Abstract: This study surveys the optical and microphysical properties of high (ice) clouds over the Tropics (30°S–30°N) over a 3-yr period from September 2002 through August 2005. The analyses are based on the gridded level-3 cloud products derived from the measurements acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard both the NASA Earth Observing System Terra and Aqua platforms. The present analysis is based on the MODIS collection-4 data products. The cloud products provide daily, weekly, and monthly mean cloud fraction, cloud optical thickness, cloud effective radius, cloud-top temperature, cloud-top pressure, and cloud effective emissivity, which is defined as the product of cloud emittance and cloud fraction. This study is focused on high-level ice clouds. The MODIS-derived high clouds are classified as cirriform and deep convective clouds using the International Satellite Cloud Climatology Project (ISCCP) classification scheme. Cirriform clouds make up more tha...

62 citations


Proceedings ArticleDOI
21 May 2007
TL;DR: A new paradigm is proposed, called Transparent Computing, to store and manage commodity programs, including OS code, centrally, while streaming them to run on non-state clients through a distributed 4VP+ platform.
Abstract: With the rapid improvements in hardware, software and networks, the computing paradigm has shifted from mainframe computing to ubiquitous or pervasive computing, in which users can focus on their desired services rather than on specific computing devices and technologies. However, the emergence of ubiquitous computing has brought many challenges, one of which is that it is hard to allow users to freely obtain desired services, such as heterogeneous OSes and applications, via different light-weight devices. We have proposed a new paradigm, called Transparent Computing, to store and manage commodity programs, including OS code, centrally, while streaming them to run on non-state clients. This leads to a service-centric computing environment, in which users can select desired services on demand, without being concerned with these services' administration, such as installation, maintenance, management, upgrades, and so on. In this paper, we introduce a novel concept, Meta OS, to support such program streaming through a distributed 4VP+ platform. Based on this platform, a pilot system has been implemented that supports Windows and Linux environments. We verify the effectiveness of the platform through both real deployments and testbed experiments. The evaluation results suggest that the 4VP+ platform is a feasible and promising solution for future computing infrastructure in ubiquitous computing.

01 Jan 2007
TL;DR: It is found that under high volume of video demand, a P2P built-in incentive model performs better than any other model for both high-definition and standard-definition media, while the usage-based model generally generates more profits when the request rate is low.
Abstract: This paper studies the conditions under which peer-to-peer (P2P) technology may be beneficial in providing IPTV services over typical network architectures. It has two major contributions. First, we contrast two network models used to study the performance of such a system: a commonly used logical “Internet as a cloud” model and a “physical” model that reflects the characteristics of the underlying network. Specifically, we show that the cloud model overlooks important architectural aspects of the network and may drastically overstate the benefits of P2P technology by a factor of 3 or more. Second, we provide a cost-benefit analysis of P2P video content delivery focusing on the profit trade-offs for different pricing/incentive models rather than purely on capacity maximization. In particular, we find that under high volume of video demand, a P2P built-in incentive model performs better than any other model for both high-definition and standard-definition media, while the usage-based model generally generates more profits when the request rate is low. The flat-reward model generally falls in-between the usage-based model and the built-in model in terms of profitability.

Proceedings ArticleDOI
23 Jul 2007
TL;DR: An architecture for the Event Cloud system is presented, which supports a continuous near real-time integration of business events with the aim of decreasing the time it takes to make them available for searching purposes.
Abstract: Market players that can respond to critical business events faster than their competitors will end up as winners in the fast moving economy. Event-based systems have been developed and used to implement networked and adaptive business environments based on loosely coupled systems. In this paper, we introduce Event Cloud, a system that allows searching for business events in a variety of contexts that also take the relationships between events into consideration. Event Cloud supports knowledge workers in their daily operations in order to perform investigations and analyses based on historical events. It enables users to search in large sets of historical events which are correlated and indexed in a data staging process with an easy-to-use search interface. For improving the search results, we propose an index based ranking system. We present an architecture for the Event Cloud system, which supports a continuous near real-time integration of business events with the aim of decreasing the time it takes to make them available for searching purposes. We have fully implemented the proposed architecture and discuss implementation details.

28 Mar 2007
TL;DR: In this paper, a method of protecting relational database copyright with a cloud watermark is proposed, according to the idea of digital watermarking and the properties of relational databases, and the corresponding watermark algorithms, such as the cloud watermark embedding and detection algorithms, are proposed.
Abstract: With the development of the Internet and database application techniques, it has become common for many databases on the Internet to permit remote query and access by authorized users, and the problem of how to protect the copyright of relational databases has arisen. This paper first briefly introduces the cloud model, including cloud generators and similar clouds. Then, combining the properties of the cloud with the idea of digital watermarking and the properties of relational databases, a method of protecting relational database copyright with a cloud watermark is proposed. Meanwhile, the corresponding watermark algorithms, such as the cloud watermark embedding algorithm and the detection algorithm, are proposed. Some experiments are then run and the results analyzed to validate the correctness and feasibility of the watermark scheme. In the end, the prospects of watermarking relational databases and its research directions are discussed. Keywords—cloud watermark, copyright protection, digital watermark, relational database

Journal ArticleDOI
TL;DR: In this article, the performance of a five-channel estimation-based method was evaluated in the context of numerical synthetic experiments and real-world data and examined the implications of these results on the global retrieval of ice cloud microphysical properties over the global oceans.
Abstract: This work determines the performance of a five-channel ice cloud retrieval scheme in the context of numerical synthetic experiments and real-world data and examines the implications of these results on the global retrieval of ice cloud microphysical properties over the global oceans. This estimation-based scheme, designed from information content principles, uses a rigorous, state-dependent error analysis to combine measurements from the visible, near-infrared, and infrared spectral regions. In the synthetic experiments, the five-channel scheme performed as well or better in terms of retrieval bias and random error than the traditional split-window and Nakajima and King bispectral retrieval techniques for all states of the atmosphere. Although the five-channel scheme performed favorably compared to the other methods, the inherently large uncertainties associated with ice cloud physics dictate typical retrieval uncertainties in both IWP and effective radius of 30–40%. These relatively large uncertainties suggest caution in the strict interpretation of small temporal or spatial trends found in existing cloud products. In MODIS and CRYSTAL-FACE applications, the five-channel scheme exploited the strengths of each of the bispectral approaches to smoothly transition from a split-window type approach for thin clouds to a Nakajima and King type approach for thick clouds. Uniform application of such a retrieval scheme across different satellite and field measurement campaigns would provide a set of consistent cloud products to the user community, theoretically allowing the direct comparison of cloud properties for the climate processes studies found throughout the literature.


Proceedings ArticleDOI
20 Aug 2007
TL;DR: The use of the kinematics of the evolving cloud to approximate its shape using splinegons is explored; this approach is efficient in that only the vertices and segment curvatures are required to define the cloud boundary, rather than a distribution function.
Abstract: In this paper we describe research work currently being undertaken to detect, model and track the shape of a contaminant cloud boundary using airborne sensor swarms. The model of the cloud boundary is then used to predict the future evolution of the cloud shape so that an airborne sensor swarm of UAVs can perform manoeuvres that will enable the exact shape and track of the cloud to be determined accurately and in a timely fashion. The contaminant cloud models currently used are usually based on numerical techniques. However, in this research work the use of the kinematics of the evolving cloud to approximate its shape using splinegons is explored. This approach is efficient in that only the vertices and segment curvatures are required to define the cloud boundary, rather than a distribution function.

Journal ArticleDOI
TL;DR: The experimental results show that CIVIC (CROWN-based infrastructure for virtual computing) can offer separated and isolated computing environment for users and can also realize hardware and software consolidation and centralized management for computer administrators.
Abstract: Based on the analysis of hypervisor technologies and typical projects on the hypervisor-based virtual computing environment, the design of CIVIC (CROWN-based infrastructure for virtual computing) is provided, which has three characteristics. Firstly, it can offer a separated and isolated computing environment for users. Secondly, it can realize hardware and software consolidation and centralized management for computer administrators. Thirdly, it can be transparent to upper-layer applications, hiding the dynamicity, distribution and heterogeneity of underlying resources from applications. The experimental results show that CIVIC can

Journal ArticleDOI
TL;DR: In this paper, optimal methods are presented for the retrieval of cloud properties using five channels (0.6, 3.7, 6.7, 10.8 and 12.0 µm).
Abstract: The present study documents optimal methods for the retrieval of cloud properties using five channels (0.6, 3.7, 6.7, 10.8 and 12.0 µm) that are used in many geostationary meteorological satellite observations. Those channels are also to be adopted for the Communication, Ocean and Meteorological Satellite (COMS) scheduled to be launched in 2008. The cloud properties focused on are cloud thermodynamic phase, cloud optical thickness, effective particle radius and cloud-top properties with specific uncertainties. Discrete ordinate radiative transfer models are simulated to build up the retrieval algorithm. The cloud observations derived from the Moderate-resolution Imaging Spectroradiometer (MODIS) are compared with the results to assess the validity of the algorithm. The preliminary validation indicates that the additional use of a band at 6.7 µm would be better in discriminating the cloud ice phase. Cloud optical thickness and effective particle radius can also be produced up to, respectively, 64 and 32 µm by functionally eliminating both ground-reflected and cloud- and ground-thermal radiation components at 0.6 and 3.7 µm. Cloud-top temperature (pressure) in ±3 K (±50 hPa) uncertainties can be estimated by a simple 10.8-µm method for opaque clouds, and by an infrared ratioing method using 6.7 and 10.8 µm for semitransparent clouds.

Journal ArticleDOI
TL;DR: Cooperative edge cache grid (cooperative EC grid) is presented, which is a large-scale cooperative edge cache network for efficiently delivering highly dynamic Web content with varying server update frequencies, and the concept of cache clouds is introduced as a generic framework of cooperation in large-scale edge cache networks.
Abstract: In recent years, edge computing has emerged as a popular mechanism to deliver dynamic Web content to clients. However, many existing edge cache networks have not been able to harness the full potential of edge computing technology. In this paper, we argue and experimentally demonstrate that cooperation among the individual edge caches coupled with scalable server-driven document consistency mechanisms can significantly enhance the capabilities and performance of edge cache networks in delivering fresh dynamic content. However, designing large-scale cooperative edge cache networks presents many research challenges. Toward addressing these challenges, this paper presents cooperative edge cache grid (cooperative EC grid, for short), a large-scale cooperative edge cache network for efficiently delivering highly dynamic Web content with varying server update frequencies. The design of the cooperative EC grid focuses on the scalability and reliability of dynamic content delivery in addition to cache hit rates, and it incorporates several novel features. We introduce the concept of cache clouds as a generic framework of cooperation in large-scale edge cache networks. The architectural design of the cache clouds includes dynamic hashing-based document lookup and update protocols, which dynamically balance lookup and update loads among the caches in the cloud. We also present cooperative techniques for making the document lookup and update protocols resilient to the failures of individual caches. This paper reports a series of simulation-based experiments which show that the overheads of cooperation in the cooperative EC grid are very low, and our architecture and techniques enhance the performance of the cooperative edge networks.
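The hashing-based document lookup inside a cache cloud can be sketched with a plain consistent-hash ring: each document URL maps deterministically to one cache, spreading lookup and update load across the cloud. This is a generic illustration — the paper's protocols add dynamic load balancing and failure resilience on top — and the cache names here are invented.

```python
import bisect
import hashlib

# Illustrative sketch of hashing-based document lookup in a "cache cloud":
# a consistent-hash ring with virtual nodes maps each document to one of
# the caches (generic technique; not the paper's exact protocol).

def _h(key):
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class CacheCloud:
    def __init__(self, caches, replicas=64):
        # Each cache gets `replicas` virtual positions on the ring.
        self.ring = sorted((_h(f"{c}#{i}"), c) for c in caches for i in range(replicas))
        self.keys = [k for k, _ in self.ring]

    def lookup(self, doc_url):
        """Return the cache responsible for a document URL."""
        i = bisect.bisect(self.keys, _h(doc_url)) % len(self.ring)
        return self.ring[i][1]

cloud = CacheCloud(["edge-a", "edge-b", "edge-c"])
print(cloud.lookup("/news/front-page"))  # deterministic choice of one cache
```

Because both lookups and server-driven updates for a document hash to the same cache, no broadcast is needed, and a consistent-hash ring keeps most assignments stable when a cache joins or leaves the cloud.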

Journal ArticleDOI
TL;DR: In this article, the authors evaluate the ability of a cloud-resolving model (CRM) to simulate the physical properties of tropical deep convective cloud objects identified from a Clouds and the Earth's Radiant Energy System (CERES) data product.
Abstract: The present study evaluates the ability of a cloud-resolving model (CRM) to simulate the physical properties of tropical deep convective cloud objects identified from a Clouds and the Earth’s Radiant Energy System (CERES) data product. The emphasis of this study is the comparisons among the small-, medium-, and large-size categories of cloud objects observed during March 1998 and between the large-size categories of cloud objects observed during March 1998 (strong El Nino) and March 2000 (weak La Nina). Results from the CRM simulations are analyzed in a way that is consistent with the CERES retrieval algorithm and they are averaged to match the scale of the CERES satellite footprints. Cloud physical properties are analyzed in terms of their summary histograms for each category. It is found that there is a general agreement in the overall shapes of all cloud physical properties between the simulated and observed distributions. Each cloud physical property produced by the CRM also exhibits differen...

Journal ArticleDOI
TL;DR: This paper proposes and realises the first release of the Instrument Element (IE), a new grid component that provides the computational/data grid with an abstraction of real instruments, and grid users with a more interactive interface to control them.
Abstract: Current grid technologies offer unlimited computational power and storage capacity for scientific research and business activities in heterogeneous areas all over the world. Thanks to the grid, different virtual organisations can operate together in order to achieve common goals. However, concrete use cases demand a closer interaction between various types of instruments accessible from the grid on the one hand and the classical grid infrastructure, typically composed of Computing and Storage Elements, on the other. We cope with this open problem by proposing and realising the first release of the Instrument Element (IE), a new grid component that provides the computational/data grid with an abstraction of real instruments, and grid users with a more interactive interface to control them. In this paper we discuss in detail the implemented software architecture for this new component and we present concrete use cases where the IE has been successfully integrated.


Book ChapterDOI
11 Jul 2007
TL;DR: Based on the concept of Symbiotic computing, a symbiotic society will be realized, in which humans and the ubiquitous information environment can cooperatively co-exist.
Abstract: Aiming towards the next-generation ubiquitous stage, we have been pursuing research on an information and communication paradigm called "Symbiotic computing." Here, we first describe the basic concept and architecture of Symbiotic computing, and through several applications, such as a watch-over system for elderly people and ad hoc communication support, we show and discuss its effectiveness. We first define the traditional ubiquitous computing environment as consisting of two computing aspects: mobile computing and pervasive computing. Then, we define advanced ubiquitous computing as consisting of two computing axes: traditional ubiquitous computing and web computing. In addition to these two axes, we introduce a third new axis (creation of new value), and by integrating these three axes we create the Symbiotic computing paradigm. Based on this concept, a symbiotic society will be realized, in which humans and the ubiquitous information environment can cooperatively co-exist.

01 Oct 2007
TL;DR: In this paper, cloud characteristics in relation to radio wave propagation over selected locations in different geographical regions of eastern India are presented; it is seen that low cloud occurrence over Patna, Bhubaneswar and Ranchi is quite significant.
Abstract: In this paper, cloud characteristics in relation to radio wave propagation over selected locations in different geographical regions of eastern India are presented. It is seen that low cloud occurrence over Patna, Bhubaneswar and Ranchi is quite significant. The performance of radio systems deteriorates due to cloud attenuation as well as cloud noise temperature. Based on the cloud attenuation results, the total atmospheric noise temperature, including the noise contribution from cloud, for different months, times and cloud thicknesses at 10 GHz, 18 GHz, 32 GHz, 44 GHz and 70 GHz has been determined for the aforesaid stations. The results presented here are useful for designing future earth-space communication links over these locations in India.

Journal Article
TL;DR: The exploration of uncertainty and fuzziness, two internal properties of trust, provides a step toward a proper understanding and definition of human trust.
Abstract: Trust management models are fundamental for information security in open networks. A formal method for subjective trust using the cloud model is proposed, serving as a transformation model between the qualitative and the quantitative. The exploration of uncertainty and fuzziness, two internal properties of trust, provides a step toward a proper understanding and definition of human trust. The qualitative reasoning mechanisms of the trust cloud are given to enable trust-based decisions.

31 Jan 2007
TL;DR: This work presents an approach to visualizing clouds by means of a particle system that consists of soft balls, so-called metaballs, and yields a large-scale, realistic, 3D cloud visualization that supports cloud fly-throughs.
Abstract: Modern weather prediction models create new challenges but also offer new possibilities for weather visualization. Since weather model data has a complex three-dimensional structure and various abstract parameters, it cannot be presented directly to a lay audience. Nevertheless, visualizations of weather data are needed daily for weather presentations. One important visual cue for the perception of weather is given by clouds. After a discussion of weather data and its specific demands on graphical visualization, we present an approach to visualizing clouds by means of a particle system that consists of soft balls, so-called metaballs (Dobashi et al. 2000). Particular attention is given to the special requirements of large-scale cloud visualizations. Since weather forecast data typically lacks specific information on the small-scale structure of clouds, we explain how to interpret weather data in order to extract information on their appearance, thereby obtaining five visual cloud classes. Based on this cloud extraction and classification, modeling techniques for each visual cloud class are developed. For the actual rendering we extend and adapt the metaball approach by introducing flattened particles and derived metaball textures. As shown by our implementation, our approach yields a large-scale, realistic, 3D cloud visualization that supports cloud fly-throughs.
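The metaball approach can be sketched as a scalar field summed over particles: each particle contributes a smooth, radially decaying density, and the cloud surface is the iso-contour where the summed field crosses a threshold. The kernel below is a common Wyvill-style choice, not necessarily the one used in the paper, and the particle positions are made up.

```python
# Minimal metaball field sketch (generic technique; the paper additionally
# uses flattened particles and derived metaball textures).

def kernel(r, radius):
    """Smooth falloff: 1 at the particle center, 0 at `radius` and beyond."""
    if r >= radius:
        return 0.0
    t = (r / radius) ** 2
    return (1.0 - t) ** 3  # Wyvill-style polynomial falloff

def field(point, particles):
    """Sum the metaball contributions of (center, radius) particles at a point."""
    px, py, pz = point
    total = 0.0
    for (cx, cy, cz), radius in particles:
        r = ((px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2) ** 0.5
        total += kernel(r, radius)
    return total

# Two overlapping metaballs blend into one smooth cloud blob.
particles = [((0, 0, 0), 2.0), ((1.5, 0, 0), 2.0)]
inside = field((0.7, 0, 0), particles) > 0.5  # True: inside the iso-surface
```

Because nearby particles' contributions add before thresholding, neighbouring metaballs merge seamlessly, which is what gives the particle system its soft, cloud-like silhouette.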

Proceedings ArticleDOI
26 Nov 2007
TL;DR: A novel authentication scheme is proposed which can distinguish malicious attacks from natural processing with a cloud watermark, and it has good tamper detection capability even when the video is recompressed.
Abstract: Video authentication has gained more and more attention in recent years. However, many existing authentication methods have obvious drawbacks. In this paper, we propose a novel authentication scheme which can distinguish malicious attacks from natural processing with a cloud watermark. As videos are often stored and transmitted in compressed format nowadays, we focus on content authentication of MPEG-2 compressed video. The scheme can also be extended to MPEG-4 video. Our authentication system first separates the video sequence into shots and extracts a feature vector from each shot. The extracted feature is then used to generate watermark cloud drops with a cloud generator. Experimental results show that the proposed approach has good tamper detection capability even when the video is recompressed.

Journal ArticleDOI
TL;DR: In this paper, a variational parameter estimation technique is employed to adjust empirical model cloud parameters in both space and time, in order to better represent assimilated International Satellite Cloud Climatology Project (ISCCP) cloud fraction and optical depth and Special Sensor Microwave Imager (SSM/I) liquid water path.
Abstract: General circulation models are unable to resolve subgrid-scale moisture variability and associated cloudiness and so must parameterize grid-scale cloud properties. This typically involves various empirical assumptions and a failure to capture the full range (synoptic, geographic, diurnal) of the subgrid-scale variability. A variational parameter estimation technique is employed to adjust empirical model cloud parameters in both space and time, in order to better represent assimilated International Satellite Cloud Climatology Project (ISCCP) cloud fraction and optical depth and Special Sensor Microwave Imager (SSM/I) liquid water path. The value of these adjustments is verified by much improved cloud radiative forcing and persistent improvement in cloud fraction forecasts.

Patent
27 Sep 2007
TL;DR: In this paper, the authors describe methods and systems for providing multiple level text cloud navigation, where various categories are displayed in a first text cloud and, when a category is selected, a second text cloud is displayed having child nodes of the selected category and selected lower level nodes.
Abstract: Methods and systems for providing multiple level text cloud navigation are described, where various categories are displayed in a first text cloud and, when a category is selected, a second text cloud is displayed having child nodes of the selected category and selected lower level nodes. The categories, child nodes, and selected other nodes are displayed using an importance identifier indicative of the number of results in that category or node, a relative importance thereof, a similarities metric, a recommendations metric, or the like.