
Showing papers by "University of California" published in 2003


Journal ArticleDOI
TL;DR: The ubiquitous puns on "matter" do not, alas, mark a rethinking of the key concepts (materiality and signification) and the relationship between them, rather, it seems to be symptomatic of the extent to which matters of "fact" have been replaced with matters of signification (no scare quotes here).
Abstract: L anguage has been granted too much power. The linguistic turn, the semiotic turn, the interpretative turn, the cultural turn: it seems that at every turn lately every “thing”—even materiality—is turned into a matter of language or some other form of cultural representation. The ubiquitous puns on “matter” do not, alas, mark a rethinking of the key concepts (materiality and signification) and the relationship between them. Rather, it seems to be symptomatic of the extent to which matters of “fact” (so to speak) have been replaced with matters of signification (no scare quotes here). Language matters. Discourse matters. Culture matters. There is an important sense in which the only thing that does not seem to matter anymore is matter. What compels the belief that we have a direct access to cultural representations and their content that we lack toward the things represented? How did language come to be more trustworthy than matter? Why are language and culture granted their own agency and historicity while matter is figured as passive and immutable, or at best inherits a potential for change derivatively from language and culture? How does one even go about inquiring after the material conditions that have led us to such a brute reversal of naturalist beliefs when materiality itself is always already figured within a linguistic domain as its condition of possibility?

4,728 citations


Book
01 Jan 2003
TL;DR: Hirsch, Devaney, and Smale's classic "Differential Equations, Dynamical Systems, and an Introduction to Chaos" has been used by professors as the primary text for undergraduate and graduate level courses covering differential equations.
Abstract: Hirsch, Devaney, and Smale's classic "Differential Equations, Dynamical Systems, and an Introduction to Chaos" has been used by professors as the primary text for undergraduate and graduate level courses covering differential equations. It provides a theoretical approach to dynamical systems and chaos written for a diverse student population among the fields of mathematics, science, and engineering. Prominent experts provide everything students need to know about dynamical systems as students seek to develop sufficient mathematical skills to analyze the types of differential equations that arise in their area of study. By introducing linear systems of differential equations gradually, the authors present rigorous exercises and examples clearly and accessibly. Calculus is required, as the book includes specialized advanced topics not usually found in elementary differential equations courses, such as discrete dynamical systems and the description of chaotic systems. This is a classic text by three of the world's most prominent mathematicians. It continues the tradition of expository excellence. It contains updated material and expanded applications for use in applied studies.

1,214 citations
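The blurb mentions discrete dynamical systems and chaotic behavior; the logistic map is the standard first example of such a system. The snippet below is generic textbook material, not code from the book, and the parameter values are our own illustrative choices.

```python
def logistic_orbit(r, x0, n):
    """Iterate the logistic map x -> r*x*(1-x) n times, returning the orbit."""
    orbit = [x0]
    for _ in range(n):
        x0 = r * x0 * (1.0 - x0)
        orbit.append(x0)
    return orbit

# Sensitive dependence on initial conditions at r = 3.9 (chaotic regime):
a = logistic_orbit(3.9, 0.2, 50)
b = logistic_orbit(3.9, 0.2 + 1e-9, 50)
divergence = abs(a[-1] - b[-1])   # a tiny initial gap grows dramatically
```

Plotting successive orbits against `r` gives the familiar bifurcation diagram leading into chaos.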


Journal ArticleDOI
TL;DR: This article examined the ways in which consumers construct identities by digitally associating themselves with signs, symbols, material objects, and places, and revealed insights into the strategies behind constructing a digital self, projecting a digital likeness, and reorganizing linear narrative structures.
Abstract: This article examines personal Web sites as a conspicuous form of consumer self-presentation. Using theories of self-presentation, possessions, and computer-mediated environments (CMEs), we investigate the ways in which consumers construct identities by digitally associating themselves with signs, symbols, material objects, and places. Specifically, the issues of interest include why consumers create personal Web sites, what consumers want to communicate, what strategies they devise to achieve their goal of self-presentation, and how those Web space strategies compare to the self-presentation strategies of real life (RL). The data reveal insights into the strategies behind constructing a digital self, projecting a digital likeness, digitally associating as a new form of possession, and reorganizing linear narrative structures.

1,119 citations


Book ChapterDOI
17 Aug 2003
TL;DR: This paper proposes several efficient techniques for building private circuits resisting side channel attacks, and provides a formal threat model and proofs of security for their constructions.
Abstract: Can you guarantee secrecy even if an adversary can eavesdrop on your brain? We consider the problem of protecting privacy in circuits, when faced with an adversary that can access a bounded number of wires in the circuit. This question is motivated by side channel attacks, which allow an adversary to gain partial access to the inner workings of hardware. Recent work has shown that side channel attacks pose a serious threat to cryptosystems implemented in embedded devices. In this paper, we develop theoretical foundations for security against side channels. In particular, we propose several efficient techniques for building private circuits resisting this type of attack. We initiate a systematic study of the complexity of such private circuits and, in contrast to most prior work in this area, provide a formal threat model and give proofs of security for our constructions.

968 citations
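The constructions in this line of work split each wire into random XOR-shares so that any bounded subset of wires reveals nothing. The sketch below follows the standard presentation of this masking idea (share counts, helper names, and the single-bit setting are our simplifications, not the paper's exact scheme): XOR gates act share-wise, while AND gates need fresh randomness per pair of shares.

```python
import random

def xor_all(shares):
    """XOR-combine shares back into the underlying bit."""
    out = 0
    for s in shares:
        out ^= s
    return out

def share(bit, n):
    """Split a bit into n random XOR-shares (the wire encoding)."""
    s = [random.randrange(2) for _ in range(n - 1)]
    s.append(bit ^ xor_all(s))
    return s

def xor_gate(a, b):
    """XOR on shared values is computed share-wise, with no interaction."""
    return [ai ^ bi for ai, bi in zip(a, b)]

def and_gate(a, b):
    """AND gadget: fresh randomness z for every pair of shares keeps any
    small subset of intermediate wires independent of the secret inputs."""
    n = len(a)
    c = [ai & bi for ai, bi in zip(a, b)]
    for i in range(n):
        for j in range(i + 1, n):
            z = random.randrange(2)
            c[i] ^= z
            c[j] ^= (z ^ (a[i] & b[j])) ^ (a[j] & b[i])
    return c
```

Recombining the output shares of `and_gate` yields the AND of the recombined inputs, because each `z` cancels and the cross terms `a[i]&b[j]` sum to the product of the share sums.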


Journal ArticleDOI
TL;DR: In this paper, the optical and electronic properties of the In1−xGaxN alloys have been investigated and shown to exhibit a much higher resistance to high energy (2 MeV) proton irradiation than the standard currently used photovoltaic materials such as GaAs and GaInP, and therefore offer great potential for radiation-hard high-efficiency solar cells for space applications.
Abstract: High-efficiency multijunction or tandem solar cells based on group III–V semiconductor alloys are applied in a rapidly expanding range of space and terrestrial programs. Resistance to high-energy radiation damage is an essential feature of such cells as they power most satellites, including those used for communications, defense, and scientific research. Recently we have shown that the energy gap of In1−xGaxN alloys potentially can be continuously varied from 0.7 to 3.4 eV, providing a full-solar-spectrum material system for multijunction solar cells. We find that the optical and electronic properties of these alloys exhibit a much higher resistance to high-energy (2 MeV) proton irradiation than the standard currently used photovoltaic materials such as GaAs and GaInP, and therefore offer great potential for radiation-hard high-efficiency solar cells for space applications. The observed insensitivity of the semiconductor characteristics to the radiation damage is explained by the location of the band edge...

598 citations
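The abstract's endpoint gaps (0.7 eV for InN, 3.4 eV for GaN) fix the ends of the composition range; intermediate compositions are usually described by quadratic interpolation with a bowing parameter. The endpoints below come from the abstract, but the bowing value `b` is an assumed literature figure, not from this paper.

```python
def ingan_bandgap(x, b=1.43):
    """Approximate band gap (eV) of In(1-x)Ga(x)N via quadratic
    interpolation: endpoints 0.7 eV (InN) and 3.4 eV (GaN) are from the
    abstract; the bowing parameter b is an assumed literature value."""
    return 0.7 * (1.0 - x) + 3.4 * x - b * x * (1.0 - x)
```

Sweeping `x` from 0 to 1 traces the continuously tunable full-solar-spectrum range the abstract describes.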


Book ChapterDOI
21 Feb 2003
TL;DR: This paper describes an ongoing effort to define common APIs for structured peer-to-peer overlays and the key abstractions that can be built on them, aiming to facilitate independent innovation in overlay protocols, services, and applications, to allow direct experimental comparisons, and to encourage application development by third parties.
Abstract: In this paper, we describe an ongoing effort to define common APIs for structured peer-to-peer overlays and the key abstractions that can be built on them. In doing so, we hope to facilitate independent innovation in overlay protocols, services, and applications, to allow direct experimental comparisons, and to encourage application development by third parties. We provide a snapshot of our efforts and discuss open problems in an effort to solicit feedback from the research community.

578 citations
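The common API proposed in this line of work centers on key-based routing: an application issues `route(key, msg)` and receives upcalls as messages are forwarded and delivered. The interface sketch below is a loose, simplified rendering of that idea (names and signatures are ours), with a centralized toy overlay standing in for real per-hop routing.

```python
from bisect import bisect_left

class Application:
    """Upcalls an overlay makes into the application, in the spirit of the
    common-API proposal's deliver/forward hooks (simplified here)."""
    def deliver(self, key, msg):
        pass
    def forward(self, key, msg, next_hop):
        return True          # returning False would drop the message

class ToyOverlay:
    """Centralized stand-in for a structured overlay: a key is routed to
    the node whose id is numerically closest. Illustration only; a real
    overlay reaches the same root via distributed per-hop routing state."""
    def __init__(self, node_ids, app):
        self.ids = sorted(node_ids)
        self.app = app

    def route(self, key, msg):
        i = bisect_left(self.ids, key)
        candidates = self.ids[max(0, i - 1):i + 1] or [self.ids[-1]]
        root = min(candidates, key=lambda n: abs(n - key))
        if self.app.forward(key, msg, root):
            self.app.deliver(key, msg)
        return root
```

Services like DHTs, multicast, and decentralized object location can then be layered on this one routing primitive, which is the paper's point.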


Proceedings ArticleDOI
27 Oct 2003
TL;DR: This work designs a variant of the sketch data structure, the k-ary sketch, which uses a constant, small amount of memory and has constant per-record update and reconstruction cost; its linearity enables traffic to be summarized at various levels, and significant changes are detected by looking for flows with large forecast errors.
Abstract: Traffic anomalies such as failures and attacks are commonplace in today's network, and identifying them rapidly and accurately is critical for large network operators. The detection typically treats the traffic as a collection of flows that need to be examined for significant changes in traffic pattern (e.g., volume, number of connections). However, as link speeds and the number of flows increase, keeping per-flow state is either too expensive or too slow. We propose building compact summaries of the traffic data using the notion of sketches. We have designed a variant of the sketch data structure, k-ary sketch, which uses a constant, small amount of memory, and has constant per-record update and reconstruction cost. Its linearity property enables us to summarize traffic at various levels. We then implement a variety of time series forecast models (ARIMA, Holt-Winters, etc.) on top of such summaries and detect significant changes by looking for flows with large forecast errors. We also present heuristics for automatically configuring the model parameters. Using a large amount of real Internet traffic data from an operational tier-1 ISP, we demonstrate that our sketch-based change detection method is highly accurate, and can be implemented at low computation and memory costs. Our preliminary results are promising and hint at the possibility of using our method as a building block for network anomaly detection and traffic measurement.

549 citations
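To make the sketch idea concrete, here is a minimal k-ary-sketch-style structure (hash choice and parameters are illustrative, not the paper's). Each update touches one counter per hash table; estimates subtract the per-bucket mean and take a median across tables; and linearity (bucket-wise addition) is what allows a sketch of observed traffic to be compared against a forecast sketch so that keys with large forecast errors can be flagged.

```python
import statistics

class KarySketch:
    """Simplified k-ary sketch: H hash tables of K counters each."""
    def __init__(self, H=5, K=64):
        self.H, self.K = H, K
        self.tables = [[0.0] * K for _ in range(H)]
        self.total = 0.0

    def _bucket(self, h, key):
        return hash((h, key)) % self.K   # illustrative hash, not the paper's

    def update(self, key, value=1.0):
        """Constant per-record cost: one counter bump per hash table."""
        self.total += value
        for h in range(self.H):
            self.tables[h][self._bucket(h, key)] += value

    def estimate(self, key):
        """Per-table estimates (bucket count minus bucket mean, rescaled),
        combined by a median across the H tables."""
        return statistics.median(
            (self.tables[h][self._bucket(h, key)] - self.total / self.K)
            / (1.0 - 1.0 / self.K)
            for h in range(self.H))

    def add(self, other):
        """Bucket-wise sum: the linearity property, which also lets
        sketch differences (observed minus forecast) be formed directly."""
        for h in range(self.H):
            for k in range(self.K):
                self.tables[h][k] += other.tables[h][k]
        self.total += other.total
```
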


Book ChapterDOI
12 Oct 2003
TL;DR: The issues of multipath routing in MANETs are examined to support application constraints such as reliability, load-balancing, energy-conservation, and Quality-of-Service (QoS).
Abstract: Mobile ad hoc networks (MANETs) consist of a collection of wireless mobile nodes which dynamically exchange data among themselves without the reliance on a fixed base station or a wired backbone network. MANET nodes are typically distinguished by their limited power, processing, and memory resources as well as high degree of mobility. In such networks, the wireless mobile nodes may dynamically enter the network as well as leave the network. Due to the limited transmission range of wireless network nodes, multiple hops are usually needed for a node to exchange information with any other node in the network. Thus routing is a crucial issue in the design of a MANET. In this paper, we specifically examine the issues of multipath routing in MANETs. Multipath routing allows the establishment of multiple paths between a single source and single destination node. It is typically proposed in order to increase the reliability of data transmission (i.e., fault tolerance) or to provide load balancing. Load balancing is of especial importance in MANETs because of the limited bandwidth between the nodes. We also discuss the application of multipath routing to support application constraints such as reliability, load-balancing, energy-conservation, and Quality-of-Service (QoS).

525 citations
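The core operation multipath routing needs is discovering several disjoint paths between one source and one destination. A simple greedy way to illustrate this (not any specific MANET protocol, and not max-flow optimal) is to find a shortest path, remove its intermediate nodes, and repeat:

```python
from collections import deque

def bfs_path(adj, src, dst, banned):
    """Shortest path from src to dst avoiding banned intermediate nodes."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj[u]:
            if v not in prev and (v == dst or v not in banned):
                prev[v] = u
                q.append(v)
    return None

def node_disjoint_paths(adj, src, dst, k):
    """Greedily collect up to k node-disjoint src->dst paths by banning
    the intermediate nodes of each path found so far."""
    banned, paths = set(), []
    for _ in range(k):
        p = bfs_path(adj, src, dst, banned)
        if p is None:
            break
        banned.update(p[1:-1])
        paths.append(p)
    return paths
```

Traffic can then be spread across the returned paths for load balancing, or the extra paths kept as standbys for fault tolerance.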


Book ChapterDOI
TL;DR: Statistical considerations are frequently to the fore in the analysis of microarray data, as researchers sift through massive amounts of data and adjust for various sources of variability in order to identify the important genes amongst the many which are measured.
Abstract: Statistical considerations are frequently to the fore in the analysis of microarray data, as researchers sift through massive amounts of data and adjust for various sources of variability in order to identify the important genes amongst the many which are measured. This article summarizes some of the issues involved and provides a brief review of the analysis tools which are available to researchers to deal with them. Any microarray experiment involves a number of distinct stages. Firstly there is the design of the experiment. The researchers must decide which genes are to be printed on the arrays, which sources of RNA are to be hybridized to the arrays and on how many arrays the hybridizations will be replicated. Secondly, after hybridization, there follows a number of data-cleaning steps or 'low-level analysis' of the microarray data. The microarray images must be processed to acquire red and green foreground and background intensities for each spot. The acquired red/green ratios must be normalized to adjust for dye-bias and for any systematic variation other than that due to the differences between the RNA samples being studied. Thirdly, the normalized ratios are analyzed by various graphical and numerical means to select differentially expressed (DE) genes or to find groups of genes whose expression profiles can reliably classify the different RNA sources into meaningful groups. The sections of this article correspond roughly to the various analysis steps. The following notation will be used throughout the article. The foreground red and green …

504 citations
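The low-level steps the abstract describes (background correction, log-ratios, normalization) can be made concrete in a few lines. This is a generic illustration, not the article's code; median-centering here is a crude stand-in for the intensity-dependent loess normalization commonly used at this step.

```python
import math
import statistics

def normalized_log_ratios(spots):
    """spots: list of (R_fg, R_bg, G_fg, G_bg) intensities per spot.
    Background-correct, form M = log2(R/G) and A = (1/2)*log2(R*G),
    then median-center M as a simple global normalization."""
    M, A = [], []
    for rf, rb, gf, gb in spots:
        r = max(rf - rb, 1e-8)   # guard against non-positive intensities
        g = max(gf - gb, 1e-8)
        M.append(math.log2(r / g))
        A.append(0.5 * math.log2(r * g))
    med = statistics.median(M)
    return [m - med for m in M], A
```

Plotting the centered `M` values against `A` gives the usual MA-plot used to spot intensity-dependent dye bias.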


Journal ArticleDOI
TL;DR: It is confirmed that pitch processing is enhanced in high-functioning autism; as predicted by the enhanced perceptual functioning model for peaks of ability in autism, autistic individuals outperform the typically developing population in a variety of low-level perceptual tasks.
Abstract: Past research has shown a superiority of participants with high-functioning autism over comparison groups in memorizing picture-pitch associations and in detecting pitch changes in melodies. A subset of individuals with autism, known as "musical savants," is also known to possess absolute pitch. This superiority might be due to an abnormally high sensitivity to fine-grained pitch differences in sounds. To test this hypothesis, psychoacoustic tasks were devised so as to use a signal detection methodology. Participants were all musically untrained and were divided into a group of 12 high-functioning individuals with autism and a group of 12 normally developing individuals. Their task was to judge the pitch of pure tones in a "same-different" discrimination task and in a "high-low" categorization task. In both tasks, the obtained psychometric functions revealed higher pitch sensitivity for subjects with autism, with a more pronounced advantage over control participants in the categorization task. These findings confirm that pitch processing is enhanced in "high-functioning" autism. Superior performance in pitch discrimination and categorization extends previous findings of enhanced visual performance to the auditory domain. Thus, and as predicted by the enhanced perceptual functioning model for peaks of ability in autism (Mottron & Burack, 2001), autistic individuals outperform the typically developing population in a variety of low-level perceptual tasks.

456 citations
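The signal detection methodology the abstract mentions summarizes performance with the sensitivity index d', computed from hit and false-alarm rates. This is the generic textbook computation, not code from the study:

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Signal-detection sensitivity index: d' = z(hit) - z(false alarm),
    where z is the inverse of the standard normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)
```

A group with higher pitch sensitivity shows a larger d' on the same discrimination task, independent of response bias.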


Journal Article
TL;DR: It is clear from the data collated that the impact of musculoskeletal conditions and trauma varies among different parts of the world and is influenced by social structure, expectations, and economics, and that impact is most difficult to measure in less developed nations, where the predicted increase is greatest.
Abstract: Musculoskeletal conditions are extremely common and include more than 150 different diseases and syndromes, which are usually associated with pain and loss of function. In the developed world, where these conditions are already the most frequent cause of physical disability, ageing of the most populous demographic groups will further increase the burden these conditions impose. In the developing world, successful care of childhood and communicable diseases and an increase in road traffic accidents is shifting the burden to musculoskeletal and other noncommunicable conditions. To help better prepare nations for the increase in disability brought about by musculoskeletal conditions, a Scientific Group meeting was held to map out the burden of the most prominent musculoskeletal conditions at the start of the Bone and Joint Decade. In particular, the Group gathered data on the incidence and prevalence of rheumatoid arthritis, osteoarthritis, osteoporosis, major limb trauma and spinal disorders. Data were collected and organized by world region, gender and age groups to assist with the ongoing WHO Global Burden of Disease 2000 study. The Group also considered what is known about the severity and course of these conditions, along with their economic impact. The most relevant domains to assess and monitor the consequences of these conditions were identified and used to describe health states for the different stages of the conditions. Instruments that measure these most important domains for the different conditions were recommended. It is clear from data collated that the impact from musculoskeletal conditions and trauma varies among different parts of the world and is influenced by social structure, expectation and economics, and that it is most difficult to measure impact in less developed nations, where the predicted increase is greatest.

Book
08 Jul 2003
TL;DR: This book presents an empirical study of 14 pulp manufacturing mills in the United States, Canada, Australia, and New Zealand, finding that steadily tightening regulatory standards have been crucial for raising environmental performance.
Abstract: How much does regulation matter in shaping corporate behaviour? This in-depth study of 14 pulp manufacturing mills in the United States, Canada, Australia and New Zealand reveals that steadily tightening regulatory standards have been crucial for raising environmental performance. But while all firms have shown improvement, some have improved more than others, many going substantially beyond compliance. What explains the variation in compliance? It's not necessarily the differences in regulation in each country. Rather, variation is accounted for by the complex interaction between tightening regulations and a social license to operate - especially pressures from community and environmental activists - economic constraints, and differences in corporate environmental management style. This book provides a systematic empirical study of why firms achieve the levels of environmental performance that they do.

Journal ArticleDOI
TL;DR: It is shown quantitatively how repulsion must dominate attraction to avoid collapse of the group to a tight cluster, and the existence of a well-spaced, locally stable state with a characteristic individual distance is verified.
Abstract: We formulate a Lagrangian (individual-based) model to investigate the spacing of individuals in a social aggregate (e.g., swarm, flock, school, or herd). Mutual interactions of swarm members have been expressed as the gradient of a potential function in previous theoretical studies. In this specific case, one can construct a Lyapunov function, whose minima correspond to stable stationary states of the system. The ranges of repulsion (r) and attraction (a) and their magnitudes (R, A) must satisfy Rr^(d+1) > cAa^(d+1) (where c is a constant of order 1 and d is the space dimension) to avoid collapse of the group to a tight cluster. We also verify the existence of a well-spaced locally stable state, having a characteristic individual distance. When the number of individuals in a group increases, a dichotomy occurs between swarms in which individual distance is preserved versus those in which the physical size of the group is maintained at the expense of greater crowding.
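The repulsion-dominance condition can be seen in a two-individual toy version: with exponential repulsion and attraction of magnitudes R, A and ranges r, a, the pairwise potential has its minimum at a finite separation rather than at zero when repulsion is stronger but shorter-range. The parameter values below are our own illustrative choices, not the paper's.

```python
import math

def pair_potential(x, R=2.0, r=0.5, A=1.0, a=1.0):
    """Pairwise potential U(x) = R*r*exp(-x/r) - A*a*exp(-x/a), whose
    gradient gives repulsive/attractive forces of magnitudes R and A
    with ranges r and a. Here repulsion is stronger (R > A) but
    shorter-range (r < a)."""
    return R * r * math.exp(-x / r) - A * a * math.exp(-x / a)

# Grid-search the minimizer: with these parameters the minimum sits at a
# finite separation (a characteristic "individual distance"), not at
# x = 0 (which would mean collapse to a tight cluster).
xs = [i * 0.01 for i in range(1, 1000)]
x_star = min(xs, key=pair_potential)
```

For these parameters the potential reduces to e^(-2x) - e^(-x), whose exact minimum is at x = ln 2 ≈ 0.693.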

Book ChapterDOI
21 Feb 2003
TL;DR: It is suggested that the peer-to-peer network does not have enough capacity to make naive use of either class of search techniques attractive for Web search; a number of compromises that might achieve the last order of magnitude are proposed.
Abstract: This paper discusses the feasibility of peer-to-peer full-text keyword search of the Web. Two classes of keyword search techniques are in use or have been proposed: flooding of queries over an overlay network (as in Gnutella), and intersection of index lists stored in a distributed hash table. We present a simple feasibility analysis based on the resource constraints and search workload. Our study suggests that the peer-to-peer network does not have enough capacity to make naive use of either class of search techniques attractive for Web search. The paper presents a number of existing and novel optimizations for P2P search based on distributed hash tables, estimates their effects on performance, and concludes that in combination these optimizations would bring the problem to within an order of magnitude of feasibility. The paper suggests a number of compromises that might achieve the last order of magnitude.
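One family of optimizations for DHT-based index intersection is shipping a compact set summary, such as a Bloom filter, instead of a full posting list, at the cost of false positives that a final exact pass removes. The sketch below illustrates the idea generically; the hash scheme and sizes are ours, not the paper's.

```python
class BloomFilter:
    """Minimal Bloom filter over arbitrary hashable items."""
    def __init__(self, m=1024, k=4):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item):
        return [hash((i, item)) % self.m for i in range(self.k)]

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        return all(self.bits >> p & 1 for p in self._positions(item))

def intersect_via_bloom(list_a, list_b):
    """Intersect two posting lists as if node A shipped only a Bloom
    filter of its list to node B. The filter pass may admit false
    positives (never misses), so a final exact pass removes them."""
    bf = BloomFilter()
    for doc in list_a:
        bf.add(doc)
    candidates = [doc for doc in list_b if doc in bf]
    exact = set(list_a)
    return [doc for doc in candidates if doc in exact]
```

The bandwidth saving comes from sending `m` bits rather than the full list; the filter's size trades directly against its false-positive rate.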

Patent
01 Jul 2003
TL;DR: In this article, the authors present a system for transforming warped video images into rectilinear video images, real-time tracking of persons and objects, face recognition of persons, monitoring and tracking head pose of a person and associated perspective view of the person.
Abstract: Digital video imaging systems and techniques for efficiently transforming warped video images into rectilinear video images, real-time tracking of persons and objects, face recognition of persons, monitoring and tracking head pose of a person and associated perspective view of the person.

Book ChapterDOI
09 Jan 2003
TL;DR: In this paper, a detailed review of the theory of decoherence-free subspaces and subsystems is provided, focusing on their usefulness for the preservation of quantum information.
Abstract: Decoherence is the phenomenon of non-unitary dynamics that arises as a consequence of coupling between a system and its environment. It has important harmful implications for quantum information processing, and various solutions to the problem have been proposed. Here we provide a detailed review of the theory of decoherence-free subspaces and subsystems, focusing on their usefulness for preservation of quantum information.
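The canonical worked example in this literature (stated here as an illustration, not drawn from this particular review) is collective dephasing on two qubits, where the environment applies the same phase to each qubit:

```latex
U(\phi)\,|0\rangle = |0\rangle, \qquad U(\phi)\,|1\rangle = e^{i\phi}|1\rangle
\quad\Longrightarrow\quad
U(\phi)^{\otimes 2}\,|01\rangle = e^{i\phi}|01\rangle, \qquad
U(\phi)^{\otimes 2}\,|10\rangle = e^{i\phi}|10\rangle .
```

A logical qubit encoded as α|01⟩ + β|10⟩ therefore acquires only an unobservable global phase for every φ, so span{|01⟩, |10⟩} is a decoherence-free subspace for collective dephasing.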

Book ChapterDOI
01 Jan 2003
TL;DR: In this paper, the authors argue that the competitive advantage of firms stems from dynamic capabilities rooted in high performance routines operating inside the frm, embedded in the firm's processes, and conditioned by its history.
Abstract: An expanded paradigm is needed to explain how competitive advantage is gained and held. Firms resorting to ‘resource-based strategy’ attempt to accumulate valuable technology assets and employ an aggressive intellectual property stance. However, winners in the global marketplace have been firms demonstrating timely responsiveness and rapid and flexible product innovation, along with the management capability to effectively coordinate and redeploy internal and external competences. This source of competitive advantage, ‘dynamic capabilities’, emphasizes two aspects. First, it refers to the shifting character of the environment; second, it emphasizes the key role of strategic management in appropriately adapting, integrating, and re-configuring internal and external organizational skills, resources, and functional competences toward a changing environment. Only recently have researchers begun to focus on the specifics of developing firm-specific capabilities and the manner in which competences are renewed to respond to shifts in the business environment. The dynamic capabilities approach provides a coherent framework to integrate existing conceptual and empirical knowledge, and facilitate prescription. This chapter argues that the competitive advantage of firms stems from dynamic capabilities rooted in high performance routines operating inside the firm, embedded in the firm’s processes, and conditioned by its history. It offers dynamic capabilities as an emerging paradigm of the modern business firm that draws on multiple disciplines and advances, with the help of industry studies in the USA and elsewhere.

Patent
29 Dec 2003
TL;DR: In this paper, a monolithic elastomer membrane associated with an integrated pneumatic manifold allows the placement and actuation of a variety of fluid control structures, such as structures for pumping, isolating, mixing, routing, merging, splitting, preparing, and storing volumes of fluid.
Abstract: Methods and apparatus for implementing microfluidic analysis devices are provided. A monolithic elastomer membrane associated with an integrated pneumatic manifold allows the placement and actuation of a variety of fluid control structures, such as structures for pumping, isolating, mixing, routing, merging, splitting, preparing, and storing volumes of fluid. The fluid control structures can be used to implement a variety of sample introduction, preparation, processing, and storage techniques.

Journal ArticleDOI
TL;DR: In this paper, a study of 14 pulp and paper manufacturing plants in Australia, New Zealand, British Columbia, and the states of Washington and Georgia in the United States found that regulatory requirements and intensifying political pressures have brought about large improvements and considerable convergence in environmental performance by pulp manufacturers.
Abstract: How and to what extent does regulation matter in shaping corporate behavior? How important is it compared to other incentives and mechanisms of social control, and how does it interact with those mechanisms? How might we explain variation in corporate responses to law and other external pressures? This article addresses these questions through a study of environmental performance in 14 pulp and paper manufacturing mills in Australia, New Zealand, British Columbia, and the states of Washington and Georgia in the United States. Over the last three decades, we find tightening regulatory requirements and intensifying political pressures have brought about large improvements and considerable convergence in environmental performance by pulp manufacturers, most of which have gone "beyond compliance" in several ways. But regulation does not account for remaining differences in environmental performance across facilities. Rather, "social license" pressures (particularly from local communities and environmental activists) and corporate environmental management style prod some firms toward better environmental performance than others. At the same time, economic pressures impose limits on "beyond compliance" investments. In producing large gains in environmental performance, however, regulation still matters greatly, but less as a system of hierarchically imposed, uniformly enforced rules than as a coordinative mechanism, routinely interacting with market pressures, local and national environmental activists, and the culture of corporate management in generating environmental improvement while narrowing the spread between corporate leaders and laggards. I. Introduction In what ways and to what extent does regulation matter in shaping corporate behavior? How important is it compared to other incentives and mechanisms of social control, and how does it interact with those mechanisms?
As all firms do not respond in the same way to law or to other external pressures, how do we understand variation in corporate behavior? In seeking to answer these questions, the sociolegal and policy literature on regulatory administration traditionally has focused on explaining corporate compliance and noncompliance with existing legal requirements. The tacit assumption has been that legal compliance by targeted groups is the key to meeting the objectives of social regulation. Underlying that assumption is another: that regulated business corporations take costly measures to improve their performance only when they believe that legal noncompliance is likely to be detected and harshly penalized (Becker 1968; Stigler 1970; Miller & Anderson 1986; OECD 2000). From the viewpoint of traditional models of corporations as "amoral calculators" (Kagan & Scholz 1984), why would a profit-maximizing company want to do more than the law requires, since compliance is itself often expensive and overcompliance even more so? Yet it is becoming apparent that an increasing number of companies now perform, to a greater or lesser extent, "beyond compliance" with existing regulatory requirements. This suggests that the degree of variation in, and the motivations for, corporate behavior may be much broader than many researchers have imagined. This is of practical importance: some existing regulatory strategies, in focusing on compliance, have failed to facilitate, reward, or encourage beyond-compliance behavior, or have even inadvertently discouraged it, while other regulatory reformers, in contrast, have argued that government-mandated self-regulation is the key to progress. There is no better illustration of the importance of studying "overcompliance" as well as compliance than the arena of environmental regulation.
For here there is considerable variation in how firms respond to external pressures, including regulation, and in at least some industries, considerable evidence of "beyond-compliance" behavior (Smart 1992; Hoffman 1997; Prakash 2000). …

Journal ArticleDOI
TL;DR: Data support the concept that phenolic antioxidants from processed honey are bioavailable, and that they increase antioxidant activity of plasma, and support the idea that the substitution of honey in some foods for traditional sweeteners could result in an enhanced antioxidant defense system in healthy adults.
Abstract: Free radicals and reactive oxygen species (ROS) have been implicated in contributing to the processes of aging and disease. Humans protect themselves from these damaging compounds, in part, by absorbing antioxidants from high-antioxidant foods. This report describes the effects of consuming 1.5 g/kg body weight of corn syrup or buckwheat honey on the antioxidant and reducing capacities of plasma in healthy human adults. The corn syrup treatment contained 0.21 +/- 0.06 mg of phenolic antioxidants per gram, and the two buckwheat honey treatments contained 0.79 +/- 0.02 and 1.71 +/- 0.21 mg of phenolic antioxidants per gram. Following consumption of the two honey treatments, plasma total-phenolic content increased (P < 0.05) as did plasma antioxidant and reducing capacities (P < 0.05). These data support the concept that phenolic antioxidants from processed honey are bioavailable, and that they increase antioxidant activity of plasma. It can be speculated that these compounds may augment defenses against oxidative stress and that they might be able to protect humans from oxidative stress. Given that the average sweetener intake by humans is estimated to be in excess of 70 kg per year, the substitution of honey in some foods for traditional sweeteners could result in an enhanced antioxidant defense system in healthy adults.
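The abstract's figures allow a simple worked dose calculation. The 70 kg body weight below is a hypothetical example subject, not a figure from the study; the phenolic contents (0.21 and 1.71 mg/g) and the 1.5 g/kg dose are from the abstract.

```python
def phenolic_dose_mg(body_kg, phenolics_mg_per_g, dose_g_per_kg=1.5):
    """Phenolic antioxidants (mg) delivered by the study's sweetener
    dose of 1.5 g per kg body weight."""
    return body_kg * dose_g_per_kg * phenolics_mg_per_g

# A hypothetical 70 kg adult consumes 105 g of sweetener per dose:
corn_syrup = phenolic_dose_mg(70, 0.21)   # ~22 mg of phenolics
honey_high = phenolic_dose_mg(70, 1.71)   # ~180 mg of phenolics
```

The roughly eight-fold difference in delivered phenolics is what underlies the comparison between the treatments.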

Book ChapterDOI
17 Aug 2003
TL;DR: A block-cipher mode of operation, CMC, that turns an n-bit block cipher into a tweakable enciphering scheme that acts on strings of mn bits, where m ≥ 2.
Abstract: We describe a block-cipher mode of operation, CMC, that turns an n-bit block cipher into a tweakable enciphering scheme that acts on strings of mn bits, where m ≥ 2. When the underlying block cipher is secure in the sense of a strong pseudorandom permutation (PRP), our scheme is secure in the sense of tweakable, strong PRP. Such an object can be used to encipher the sectors of a disk, in-place, offering security as good as can be obtained in this setting. CMC makes a pass of CBC encryption, xors in a mask, and then makes a pass of CBC decryption; no universal hashing, nor any other non-trivial operation beyond the block-cipher calls, is employed. Besides proving the security of CMC we initiate a more general investigation of tweakable enciphering schemes, considering issues like the non-malleability of these objects.
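The "CBC pass, mask, CBC pass" shape can be illustrated on toy 8-bit blocks. To be clear: this is NOT the CMC specification (no tweak, a toy cipher in place of a real block cipher, a simplified mask, and simplified pass directions); it only shows how two chained passes with a mask in between yield an invertible wide-block transform.

```python
INV5 = 205                                  # 5 * 205 = 1025 = 1 (mod 256)

def E(k, b):
    """Toy invertible 8-bit 'block cipher' (NOT cryptographically secure)."""
    return (b * 5 + k) % 256

def D(k, b):
    """Inverse of E."""
    return ((b - k) * INV5) % 256

def encipher(k, blocks):
    ppp, prev = [], 0
    for p in blocks:                        # pass 1: CBC-style chaining
        prev = E(k, p ^ prev)
        ppp.append(prev)
    mask = ppp[0] ^ ppp[-1]                 # simplified mask (not CMC's 2*M)
    ccc = [x ^ mask for x in reversed(ppp)] # xor in mask, reverse order
    out, prev = [], 0
    for x in ccc:                           # pass 2: second CBC-style pass
        prev = E(k, x ^ prev)
        out.append(prev)
    return out

def decipher(k, blocks):
    ccc, prev = [], 0
    for c in blocks:                        # undo pass 2
        ccc.append(D(k, c) ^ prev)
        prev = c
    mask = ccc[0] ^ ccc[-1]                 # masks cancel, so recoverable
    ppp = [x ^ mask for x in reversed(ccc)]
    out, prev = [], 0
    for x in ppp:                           # undo pass 1
        out.append(D(k, x) ^ prev)
        prev = x
    return out
```

Because every plaintext block feeds the chain in both passes, changing any single input block changes every output block, which is the "enciphering scheme" behavior (as opposed to ordinary CBC) that disk-sector encryption needs.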

Journal ArticleDOI
TL;DR: In this article, the authors summarize the mechanism through which the pleiotropic effects of NF-κB are regulated within the cells, and show that reversible acetylation of RelA serves as an important intranuclear regulatory mechanism that further provides for dynamic control of NF -κB action.
Abstract: Although the proximal cytoplasmic signaling events that control the activation of the NF-κB transcription factor are understood in considerable detail, the subsequent intranuclear events that regulate the strength and duration of the NF-κB-mediated transcriptional response remain poorly defined. Recent studies have revealed that NF-κB is subject to reversible acetylation and that this posttranslational modification functions as an intranuclear molecular switch to control NF-κB action. In this review, we summarize this new and fascinating mechanism through which the pleiotropic effects of NF-κB are regulated within cells. NF-κB is a heterodimer composed of p50 and RelA subunits. Both subunits are acetylated at multiple lysine residues with the p300/CBP acetyltransferases playing a major role in this process in vivo. Further, the acetylation of different lysines regulates different functions of NF-κB, including transcriptional activation, DNA binding affinity, IκBα assembly, and subcellular localization. Acetylated forms of RelA are subject to deacetylation by histone deacetylase 3 (HDAC3). This selective action of HDAC3 promotes IκBα binding and rapid CRM1-dependent nuclear export of the deacetylated NF-κB complex, which terminates the NF-κB response and replenishes the cytoplasmic pool of latent NF-κB/IκBα complexes. This readies the cell for the next NF-κB-inducing stimulus. Thus, reversible acetylation of RelA serves as an important intranuclear regulatory mechanism that further provides for dynamic control of NF-κB action.

Journal ArticleDOI
TL;DR: This work proposes an interactional-normative framework that focuses on interpretations of messages from multiple perspectives in the situated and evolving context of appropriateness norms, and incorporates intentionality and individuals’ strategic choices in language use and channel selection.
Abstract: Researchers examining ‘flaming’ - defined as hostile and aggressive interactions via text-based computer-mediated communication - have proposed theoretical frameworks to explain possible causes. However, precise conceptual and operational definitions of ‘flaming’ have yet to be established, which has implications for understanding this phenomenon. Consequently, we propose an interactional-normative framework that focuses on interpretations of messages from multiple perspectives in the situated and evolving context of appropriateness norms. This framework incorporates intentionality and individuals’ strategic choices in language use and channel selection. We discuss the implications of this framework for research on flaming and other problematic interactions.

Journal ArticleDOI
TL;DR: In this paper, the authors describe the erosion and sedimentation associated with the 17 July 1998 Papua New Guinea tsunami, which deposited a layer of gray sand, averaging 8 cm thick, on a brown muddy soil.
Abstract: — This paper describes erosion and sedimentation associated with the 17 July 1998 Papua New Guinea tsunami. Observed within two months of the tsunami, the deposit formed a distinct layer of gray sand, averaging 8 cm thick, resting on a brown muddy soil. In most cases the sand is normally graded, with coarser sand near the base and fine sand at the top. In some cases the deposit contains rip-up clasts of muddy soil and in some locations it has a mud cap. Detailed measurements of coastal topography, tsunami flow height and direction indicators, and deposit thickness were made in the field, and samples of the deposit were collected for grain-size analysis in the laboratory. Four shore-normal transects were examined in detail to assess the shore-normal and alongshore distribution of the tsunami deposit. Near the shoreline, the tsunami eroded approximately 10–25 cm of sand from the beach and berm. The sandy layer deposited by the tsunami began 50–150 m inland from the shoreline and extended across the coastal plain to within about 40 m of the limit of inundation, a total distance of up to 750 m from the beach. As much as two-thirds of the sand in the deposit originated from offshore. Across most of the coastal plain the deposit thickness and mean grain size varied little. In the alongshore direction the deposit thickness varied with the tsunami wave height; both were largest near the entrance to Sissano Lagoon.

Patent
23 Jan 2003
TL;DR: In this paper, a long-term implantable ultrasound therapy system and method is provided that delivers directional, focused ultrasound to localized regions of tissue within body joints, such as spinal joints.
Abstract: A long-term implantable ultrasound therapy system and method is provided that delivers directional, focused ultrasound to localized regions of tissue within body joints, such as spinal joints. An ultrasound emitter or transducer is delivered to a location within the body associated with the joint and heats the target region of tissue associated with the joint from that location. Such locations for ultrasound transducer placement may include, for example, in or around the intervertebral discs, or bony structures such as vertebral bodies or posterior vertebral elements such as facet joints. Various modes of operation provide for selective, controlled heating at different temperature ranges to produce different intended results in the target tissue; these ranges are significantly affected by pre-stressed tissues such as in-vivo intervertebral discs. In particular, treatments above 70 degrees C, and in particular 75 degrees C, are used for structural remodeling, whereas lower temperatures achieve other responses without appreciable remodeling.

Journal ArticleDOI
TL;DR: Data suggest that reducing the population size structure, structural complexity and cover of living rhodoliths could decrease species richness and abundance, and increased anthropogenic disturbance from trawling, anchoring and changes in water quality can directly impact the bed community through substrate alteration.
Abstract: Rhodolith beds, unattached coralline reefs, support a high diversity and abundance of marine species from both hard and soft benthos. We used surveys in multiple shallow (3–20 m) beds in the Gulf of California to (1) examine seasonal patterns in associated floral and faunal diversity and abundance, (2) compare differences in faunal associations between rhodolith beds and adjacent sedimentary habitats, (3) examine the importance of complexity of rhodolith structure to community structure, and (4) estimate the impact of anthropogenic disturbance on rhodoliths and associated species.
2. Macroalgal richness was seasonal, and beds supported higher richness in winter (up to 36 species) than summer (6–7 species), primarily due to foliose red algae. Strong seasonal variation in the abundance of dominant cover organisms was due to a shift from macroalgae and mat-forming colonial invertebrate species to microalgae.
3. The community in a rhodolith bed of high-density thalli (El Coyote, average 11,000 thalli/m²) had higher richness (52 versus 30 species) and abundance of epibenthic, cryptofaunal and infaunal species compared with an adjacent sand community. Species diversity and abundance were particularly high for unique cryptofaunal organisms associated with rhodolith interstices. Cryptofauna reached average densities of 14.4 organisms/cm³ of rhodolith, the majority of which were crustaceans, polychaetes and cnidarians along with rhodolith-specific chitons.
4. Results from sampling across a range of rhodolith morphs in the El Requeson bed (with lower average cryptofaunal densities of 2.3 organisms/cm³) revealed that the total organisms supported by a rhodolith significantly increased with both complexity (branching density) and space available (thallus volume). These data suggest that reducing the population size structure, structural complexity and cover of living rhodoliths could decrease species richness and abundance.
5. While disturbance is a natural feature of these free-living beds, increased anthropogenic disturbance from trawling, anchoring and changes in water quality can directly impact the bed community through substrate alteration. Commercial fishing threatens rhodolith beds in the Gulf of California by decreasing rhodolith size and increasing sedimentation and burial rates. In addition to

PatentDOI
TL;DR: In this article, a white light emitting electrophosphorescent polymeric light-emitting diodes (PLEDs) are demonstrated using semiconducting polymers blended with organometallic emitters as emissive materials in a common region.
Abstract: White light-emitting electrophosphorescent polymeric light-emitting diodes (PLEDs) are demonstrated using semiconducting polymers blended with organometallic emitters as emissive materials in a common region. These materials may be cast from solution. The CIE coordinates, the color temperatures and the color rendering indices of the white emission are insensitive to the brightness, applied voltage and applied current density.

Journal ArticleDOI
TL;DR: A layered view of QoS provisioning in MANETs is considered, starting from the physical and going up to the application layer, and the efforts on QoS support at each of the layers are described.
Abstract: The widespread use of mobile and handheld devices is likely to popularize ad hoc networks, which do not require any wired infrastructure for intercommunication. The nodes of mobile ad hoc networks operate as end hosts as well as routers. They intercommunicate through single-hop and multihop paths in a peer-to-peer fashion. With the expanding scope of applications of MANETs, the need to support QoS in these networks is becoming essential. This article provides a survey of issues in supporting QoS in MANETs. We have considered a layered view of QoS provisioning in MANETs. In addition to the basic issues in QoS, the report describes the efforts on QoS support at each of the layers, starting from the physical and going up to the application layer. A few proposals on interlayer approaches to QoS provisioning are also addressed. The article concludes with a discussion on the future directions and challenges in the areas of QoS support in MANETs.
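One concrete building block of the layered QoS support the survey discusses is class-based packet scheduling at the network layer. The sketch below illustrates a strict-priority scheduler of the kind used in DiffServ-style designs; the traffic-class names and priority values are illustrative assumptions, not taken from the article.

```python
import heapq
import itertools

# Minimal sketch of a strict-priority packet scheduler, one building block
# of network-layer QoS. Class names and priority values are illustrative.
PRIORITY = {"voice": 0, "video": 1, "best_effort": 2}  # lower value served first

class PriorityScheduler:
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # FIFO tie-break within a class

    def enqueue(self, traffic_class, packet):
        heapq.heappush(self._heap,
                       (PRIORITY[traffic_class], next(self._seq), packet))

    def dequeue(self):
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]

sched = PriorityScheduler()
sched.enqueue("best_effort", "web-1")
sched.enqueue("voice", "rtp-1")
sched.enqueue("video", "stream-1")
sched.enqueue("voice", "rtp-2")
order = [sched.dequeue() for _ in range(4)]
print(order)  # all voice packets drain first, then video, then best effort
```

Strict priority alone can starve best-effort traffic under load, which is one reason the MANET QoS proposals surveyed here pair scheduling with admission control, resource reservation, and QoS-aware routing at other layers.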

Journal ArticleDOI
TL;DR: A behaviorally neutral agent-based model is developed involving a forager engaged in a random walk within a uniform environment; its ability to reproduce common archaeological patterns raises the possibility that Paleolithic behavioral adaptations were sometimes not responsive to differences between stone raw material types in the ways implied by current archaeological theory.
Abstract: Stone tool assemblage variability is considered a reliable proxy measure of adaptive variability. Raw material richness, transport distances, and the character of transported technologies are thought to signal (1) variation in raw material selectivity based on material quality and abundance, (2) optimization of time and energy costs associated with procurement of stone from spatially dispersed sources, (3) planning depth that weaves raw material procurement forays into foraging activities, and (4) risk minimization that sees materials transported in quantities and forms that are energetically economical and least likely to fail. This paper dispenses with assumptions that raw material type and abundance play any role in the organization of mobility and raw material procurement strategies. Rather, a behaviorally neutral agent-based model is developed involving a forager engaged in a random walk within a uniform environment. Raw material procurement in the model is dependent only upon random encounters with stone sources and the amount of available space in the mobile toolkit. Simulated richness-sample size relationships, frequencies of raw material transfers as a function of distance from source, and both quantity-distance and reduction intensity-distance relationships are qualitatively similar to commonly observed archaeological patterns. In some archaeological cases it may be difficult to reject the neutral model. At best, failure to reject the neutral model may mean that intervening processes (e.g., depositional time-averaging) have erased high-frequency adaptive signals in the data. At worst, we may have to admit the possibility that Paleolithic behavioral adaptations were sometimes not responsive to differences between stone raw material types in the ways implied by current archaeological theory.
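A neutral model of this kind can be sketched in a few dozen lines. The version below is a minimal reimplementation under stated assumptions, not the paper's exact simulation: a forager walks randomly on a toroidal lattice, restocks a fixed-capacity toolkit only on chance encounters with uniformly scattered sources, and discards items at random as it moves; the grid size, source count, capacity, and discard rate are all illustrative choices.

```python
import random

# Minimal sketch of a behaviorally neutral forager model: a random walk in
# a uniform environment, with raw material acquired only on chance encounters
# with sources and limited by free toolkit space. All parameter values are
# illustrative; this is not the paper's exact simulation.
random.seed(42)

GRID = 200                      # toroidal lattice size
N_SOURCES = 30                  # raw material sources, uniformly placed
TOOLKIT_CAPACITY = 20           # maximum items carried
STEPS = 50_000
DISCARD_PROB = 0.05             # chance per step of using/discarding one item

sources = {(random.randrange(GRID), random.randrange(GRID)): i
           for i in range(N_SOURCES)}
source_xy = {i: xy for xy, i in sources.items()}

def torus_dist(a, b):
    dx = min(abs(a[0] - b[0]), GRID - abs(a[0] - b[0]))
    dy = min(abs(a[1] - b[1]), GRID - abs(a[1] - b[1]))
    return (dx * dx + dy * dy) ** 0.5

x = y = GRID // 2
toolkit = []                    # source ids currently carried
discards = []                   # (source id, distance from source) at discard

for _ in range(STEPS):
    dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
    x, y = (x + dx) % GRID, (y + dy) % GRID
    if (x, y) in sources:                       # chance encounter: restock
        while len(toolkit) < TOOLKIT_CAPACITY:
            toolkit.append(sources[(x, y)])
    if toolkit and random.random() < DISCARD_PROB:
        sid = toolkit.pop(random.randrange(len(toolkit)))
        discards.append((sid, torus_dist((x, y), source_xy[sid])))

near = sum(1 for _, d in discards if d <= 20)
far = sum(1 for _, d in discards if d > 20)
print(f"{len(discards)} discards; {near} within 20 cells of source, {far} farther")
```

Even with no selectivity, planning, or risk minimization in the rules, the resulting discard-distance distribution typically concentrates transfers near sources with a thinning tail at distance, qualitatively resembling the archaeological fall-off patterns the paper discusses.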

Journal ArticleDOI
12 Feb 2003-Planta
TL;DR: The tla1 strain required a higher light intensity for the saturation of photosynthesis and showed greater solar conversion efficiencies and a higher photosynthetic productivity than the wild type under mass culture conditions.
Abstract: DNA insertional mutagenesis and screening of the green alga Chlamydomonas reinhardtii was employed to isolate tla1, a stable transformant having a truncated light-harvesting chlorophyll antenna size. Molecular analysis showed a single plasmid insertion into an open reading frame of the nuclear genome corresponding to a novel gene (Tla1) that encodes a protein of 213 amino acids. Genetic analysis showed co-segregation of plasmid and tla1 phenotype. Biochemical analyses showed the tla1 mutant to be chlorophyll deficient, with a functional chlorophyll antenna size of photosystem I and photosystem II being about 50% and 65% of that of the wild type, respectively. It contained a correspondingly lower amount of light-harvesting proteins than the wild type and had lower steady-state levels of Lhcb mRNA. The tla1 strain required a higher light intensity for the saturation of photosynthesis and showed greater solar conversion efficiencies and a higher photosynthetic productivity than the wild type under mass culture conditions. Results are discussed in terms of the tla1 mutation, its phenotype, and the role played by the Tla1 gene in the regulation of the photosynthetic chlorophyll antenna size in C. reinhardtii.