Texas Law Review
About: Texas Law Review is an academic journal. The journal publishes primarily in the areas of the Supreme Court and constitutional law. Its ISSN is 0040-4411. Over its lifetime, the journal has published 812 articles, which have received 8,791 citations.
TL;DR: In this article, the authors study several interconnected problems that arise under the current U.S. patent system when a patent covers one component or feature of a complex product and show how holdup problems are magnified in the presence of royalty stacking.
Abstract: We study several interconnected problems that arise under the current U.S. patent system when a patent covers one component or feature of a complex product. This situation is common in the information technology sector of the economy. Our analysis applies to cases involving reasonable royalties but not lost profits. First, we show using bargaining theory that the threat to obtain a permanent injunction greatly enhances the patent holder's negotiating power, leading to royalty rates that exceed a natural benchmark range based on the value of the patented technology and the strength of the patent. Such royalty overcharges are especially great for weak patents covering a minor feature of a product with a sizeable price/cost margin, including products sold by firms that themselves have made substantial research and development investments. These royalty overcharges do not disappear even if the allegedly infringing firm is fully aware of the patent when it initially designs its product. However, the holdup problems caused by the threat of injunctions are reduced if courts regularly grant stays to permanent injunctions to give defendants time to redesign their products to avoid infringement when this is possible. Second, we show how holdup problems are magnified in the presence of royalty stacking, i.e., when multiple patents read on a single product. Third, using third-generation cellular telephones and Wi-Fi as leading examples, we illustrate that royalty stacking can become a very serious problem, especially in the standard-setting context where hundreds or even thousands of patents can read on a single product standard. Fourth, we discuss the use of "reasonable royalties" to award damages in patent infringement cases. We report empirical results regarding the measurement of reasonable royalties by the courts and identify various practical problems that tend to lead courts to overestimate reasonable royalties in the presence of royalty stacking. 
Finally, we make suggestions for patent reform based on our theoretical and empirical findings.

I. Introduction

The patent system is designed with a paradigm invention in mind: a new device or machine covered by a single patent. Historically, this paradigm was a fairly accurate portrayal of the typical patent.1 As Robert Merges put it, "for Jefferson, if you put technology in a bag and shook it, it would make some noise."2 In the last few decades that has begun to change markedly. Not only have patents on chemical, biotechnological, hardware, and software inventions proliferated, but more and more products incorporate not a single new invention but a combination of many different components, each of which may be the subject of one or more patents.3 In the information technology sector in particular, modern products such as microprocessors, cell phones, or memory devices can easily be covered by dozens or even hundreds of different patents. As a striking example, literally thousands of patents have been identified as essential to the proposed new standards for 3G cellular telephone systems.4 The fact that a great many patents can read on a single product, and that this is common in certain critical industries, creates numerous practical problems for the operation of the patent system.5 We focus here on two critical, interacting areas in which problems arise: injunction threats and royalty stacking. We are especially interested in how these problems affect the royalties that will be negotiated between patent holders and downstream firms that produce products that may infringe those patents. After all, since far more patents are licensed or settled than litigated to judgment, the primary economic effect of rules governing patent litigation arises through the effect of those rules on the licensing terms that are negotiated in the shadow of litigation.
The threat that a patent holder will obtain an injunction that will force the downstream producer to pull its product from the market can be very powerful. …
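The abstract's core claim, that an injunction threat lets the patentee extract royalties above a benchmark based on patent strength and the value of the patented feature, can be illustrated numerically. The sketch below is not the article's exact model; the functional forms, parameter names, and 50/50 bargaining split are illustrative assumptions about a simple Nash-bargaining setup.

```python
# Illustrative holdup sketch (assumed functional forms, not the article's model).

def benchmark_royalty(theta, v):
    """Benchmark per-unit rate: patent strength (probability the patent is
    valid and infringed) times the per-unit value of the patented feature."""
    return theta * v

def negotiated_royalty(theta, v, margin, redesign_lag, product_life,
                       bargaining_share=0.5):
    """Per-unit rate negotiated under an injunction threat.

    If the patent is upheld (probability theta), an injunction forces the
    producer to forgo its full margin on every unit during the redesign lag.
    That loss worsens the producer's threat point, and the patentee captures
    bargaining_share of it on top of the benchmark rate.
    """
    holdup = theta * margin * (redesign_lag / product_life) * bargaining_share
    return benchmark_royalty(theta, v) + holdup

# A weak patent (theta = 0.3) on a minor feature (v = $0.10) of a product
# with a sizeable margin ($20) and a one-year redesign lag on a five-year life:
b = benchmark_royalty(0.3, 0.10)
r = negotiated_royalty(0.3, 0.10, 20.0, 1.0, 5.0)
print(b, r)  # the negotiated rate is many times the benchmark
```

Even in this toy setup, the overcharge is driven by the product's margin and the redesign lag rather than by the value of the patented feature, matching the abstract's point that overcharges are greatest for weak patents on minor features of high-margin products, and that stays pending redesign shrink the holdup term.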
TL;DR: This Article argues that the set of rules for information flows imposed by technology and communication networks form a "Lex Informatica" that policymakers must understand, consciously recognize, and encourage.
Abstract: Joel R. Reidenberg*

I. Introduction to Lex Informatica

During the Middle Ages, itinerant merchants traveling across Europe to trade at fairs, markets, and sea ports needed common ground rules to create trust and confidence for robust international trade. The differences among local, feudal, royal, and ecclesiastical law provided a significant degree of uncertainty and difficulty for merchants. Custom and practices evolved into a distinct body of law known as the "Lex Mercatoria," which was independent of local sovereign rules and assured commercial participants of basic fairness in their relationships.1 In the era of network and communications technologies, participants traveling on information infrastructures confront an unstable and uncertain environment of multiple governing laws, changing national rules, and conflicting regulations. For the information infrastructure, default ground rules are just as essential for participants in the Information Society as Lex Mercatoria was to merchants hundreds of years ago.2 Confusion and conflict over the rules for information flows run counter to an open, robust Information Society. Principles governing the treatment of digital information must offer stability and predictability so that participants have enough confidence for their communities to thrive, just as settled trading rules gave confidence and vitality to merchant communities. At present, three substantive legal policy areas are in a critical state of flux in the network environment. The treatment of content, the treatment of personal information, and the preservation of ownership rights each presents conflicting policies within nations and shows a lack of harmonization across national borders. In addition, serious jurisdictional obstacles confront the enforcement of any substantive legal rights in the network environment.3 But just as clear accounting rules reassured participants in twentieth-century financial markets, ground rules for the access, distribution, and use of information will shape the trust, confidence, and fairness in the twenty-first-century digital world for citizens, businesses, and governments. Historically, law and government regulation have established default rules for information policy, including constitutional rules on freedom of expression and statutory rights of ownership of information.4 This Article will show that for network environments and the Information Society, however, law and government regulation are not the only source of rulemaking. Technological capabilities and system design choices impose rules on participants.5 The creation and implementation of information policy are embedded in network designs and standards as well as in system configurations. Even user preferences and technical choices create overarching, local default rules.6 This Article argues, in essence, that the set of rules for information flows imposed by technology and communication networks form a "Lex Informatica" that policymakers must understand, consciously recognize, and encourage.7 The Article begins in Part II with a sketch of the information policy problems inherent in the legal regulation of content, personal information, and intellectual property on global networks. Part II proceeds to show specific technical solutions and responses to these policy problems as an illustration of the rule-making power of technology and networks. These illustrations serve as a prelude to the articulation of a theory of Lex Informatica. Part III then defines the theoretical foundation for Lex Informatica by showing technological constraints as a distinct source of rules for information flows. Lex Informatica intrinsically links rule-making capabilities well suited for the Information Society with substantive information policy choices. Lex Informatica may establish a single, immutable norm for information flows on the network or may enable the customization and automation of information flow policies for specific circumstances that adopt a rule of flexibility. …
TL;DR: This Article describes the new consumer Internet of Things (IoT), shows how four inherent features of sensor-based technologies create discrimination, privacy, security, and consent problems, and proposes concrete first steps toward a regulatory framework to protect consumers.
Abstract: The consumer "Internet of Things" is suddenly reality, not science fiction. Electronic sensors are now ubiquitous in our smartphones, cars, homes, electric systems, health-care devices, fitness monitors, and workplaces. These connected, sensor-based devices create new types and unprecedented quantities of detailed, high-quality information about our everyday actions, habits, personalities, and preferences. Much of this undoubtedly increases social welfare. For example, insurers can price automobile coverage more accurately by using sensors to measure exactly how you drive (e.g., Progressive's Snapshot system), which should theoretically lower the overall cost of insurance. But the Internet of Things raises new and difficult questions as well. This Article shows that four inherent aspects of sensor-based technologies-the compounding effects of what computer scientists call "sensor fusion," the near impossibility of truly de-identifying sensor data, the likelihood that Internet of Things devices will be inherently prone to security flaws, and the difficulty of meaningful consumer consent in this context-create very real discrimination, privacy, security, and consent problems. As connected, sensor-based devices tell us more and more about ourselves and each other, what discrimination-racial, economic, or otherwise-will that permit, and how should we constrain socially obnoxious manifestations? As the Internet of Things generates ever more massive and nuanced datasets about consumer behavior, how to protect privacy? How to deal with the reality that sensors are particularly vulnerable to security risks? How should the law treat-and how much should policy depend upon-consumer consent in a context in which true informed choice may be impossible? 
This Article is the first legal work to describe the new connected world we are creating, address these four interrelated problems, and propose concrete first steps for a regulatory approach to the Internet of Things.

[E]very animate and inanimate object on Earth will soon be generating data, including our homes, our cars, and yes, even our bodies.1 - Anthony D. Williams, in The Human Face of Big Data (2012)

Very soon, we will see inside ourselves like never before, with wearable, even internal [...] sensors that monitor even our most intimate biological processes. It is likely to happen even before we figure out the etiquette and laws around sharing this knowledge.2 - Quentin Hardy, The New York Times (2012)

[A]ll data is credit data, we just don't know how to use it yet. ... Data matters. More data is always better.3 - Douglas Merrill, Google's former CIO & CEO of ZestFinance

Introduction

The Breathometer is a small, black plastic device that plugs into the headphone jack of an Android or iPhone smartphone.4 Retailing for $49, the unit contains an ethanol sensor to estimate blood alcohol content from the breath.5 The company's website advertises that the device will give you "the power to make smarter decisions when drinking."6 The device works only in conjunction with the downloadable Breathometer application (app), which both displays the results of any given test and shows a user's longitudinal test history.

The Breathometer is representative of a huge array of new consumer devices promising to measure, record, and analyze different aspects of daily life that have exploded onto the market in the last twelve to eighteen months.7 For example, a Fitbit bracelet or Nike+ FuelBand can track the steps you take in a day, calories burned, and minutes asleep; a Basis sports watch will track your heart rate; a Withings cuff will graph your blood pressure on your mobile phone or tablet; an iBGStar iPhone add-on will monitor your blood glucose levels; a Scanadu Scout will measure your
temperature, heart rate, and hemoglobin levels; an Adidas miCoach Smart Ball will track your soccer performance;8 a UVeBand or JUNE bracelet will monitor your daily exposure to ultraviolet rays and notify your smartphone if you need to reapply sunscreen;9 a Helmet by LifeBEAM will track your heart rate, blood flow, and oxygen saturation as you cycle; a Mimo Baby Monitor "onesie" shirt will monitor your baby's sleep habits, temperature, and breathing patterns; a W/Me bracelet from Phyode will track changes in your autonomic nervous system to detect mental state (e. …
TL;DR: This article found a large, statistically significant, and robust relationship between aggregated institutionalization and homicide rates, using a Prais-Winsten regression model that corrects for autocorrelation in time-series data, and holding constant three leading structural covariates of homicide.
Abstract: The incarceration revolution of the late twentieth century fueled ongoing research on the relationship between rates of incarceration and crime, unemployment, education, and other social indicators. In this research, the variable intended to capture the level of confinement in society was conceptualized and measured as the rate of incarceration in state and federal prisons and county jails. This, however, fails to take account of other equally important forms of confinement, especially commitment to mental hospitals and asylums. When the data on mental hospitalization rates are combined with the data on imprisonment rates for the period 1928 through 2000, the incarceration revolution of the late twentieth century barely reaches the level of aggregated institutionalization that the United States experienced at mid-century. The highest rate of aggregated institutionalization during the entire period occurred in 1955 when almost 640 persons per 100,000 adults over age 15 were institutionalized in asylums, mental hospitals, and state and federal prisons. Equally surprising, the trend for aggregated institutionalization reflects a mirror image of the national homicide rate during the period 1928 through 2000. Using a Prais-Winsten regression model that corrects for autocorrelation in time-series data, and holding constant three leading structural covariates of homicide, this Article finds a large, statistically significant, and robust relationship between aggregated institutionalization and homicide rates. These findings underscore, more than anything, how much institutionalization there was at mid-century. The implications are both practical and theoretical. As a practical matter, empirical research that uses confinement as a value of interest should use an aggregated institutionalization rate that incorporates mental hospitalization rates. 
At a theoretical level, these findings suggest that it may be the continuity of confinement, and not just the incarceration explosion, that needs to be explored and explained.

I. Introduction

The classic texts of social theory from the 1960s tell a consistent story not only about the rise and (in some cases) fall of discrete carceral institutions, but also of the remarkable continuity of confinement and social exclusion. This pattern is reflected in the writings of Erving Goffman on Asylums,1 Gerald Grob on The State and the Mentally Ill,2 David Rothman on The Discovery of the Asylum,3 and Michel Foucault.4 In Madness and Civilization, for instance, Foucault traces the continuity of confinement through different stages of Western European history, from the lazar houses for lepers on the outskirts of medieval cities, to the Ships of Fools navigating down rivers of Renaissance Europe, to the establishment in the seventeenth century of the Hôpital Général in Paris, that enormous house of confinement for the poor, the unemployed, the homeless, the vagabond, the criminal, and the insane.5 Surprisingly, this literature never made its way into the empirical social science research on the incarceration revolution of the late twentieth century. With the marked exception of a few longitudinal studies on the interdependence of mental hospital and prison populations,6 as well as a small subset of the empirical research on the causes of the late-twentieth-century prison explosion,7 no published empirical research conceptualizes the level of confinement in society through the lens of institutionalization writ large. Uniformly, the research limits the prism to rates of imprisonment only.
None of the research that uses confinement as an independent variable-in other words, that studies the effect of confinement (and possibly other social indicators) on crime, unemployment, education, or other dependent variables-includes mental hospitalization in its measure of confinement.8 Moreover, none of the binary studies of confinement-in other words, research that explores the specific relationship between confinement and unemployment, or confinement and crime, or confinement and any other non-mental-health-related indicator-uses a measure of coercive social control that includes rates of mental hospitalization. …
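The article's estimator, a Prais-Winsten regression, is feasible GLS for a linear model with AR(1) errors: estimate the autocorrelation coefficient rho from OLS residuals, quasi-difference the data, and rescale (rather than drop) the first observation by sqrt(1 - rho^2). The sketch below is a minimal NumPy implementation on synthetic data; the variable names, the simulated series, and the coefficient values are illustrative assumptions, not the article's data or results.

```python
import numpy as np

def prais_winsten(y, X, n_iter=10):
    """Iterated Prais-Winsten FGLS for a linear model with AR(1) errors.

    Unlike Cochrane-Orcutt, the first observation is retained, rescaled by
    sqrt(1 - rho^2). X must already include an intercept column.
    Returns the coefficient vector and the estimated rho.
    """
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS starting values
    rho = 0.0
    for _ in range(n_iter):
        resid = y - X @ beta
        # AR(1) coefficient of the residuals
        rho = (resid[1:] @ resid[:-1]) / (resid[:-1] @ resid[:-1])
        scale = np.sqrt(1.0 - rho**2)
        ys = np.concatenate(([scale * y[0]], y[1:] - rho * y[:-1]))
        Xs = np.vstack([scale * X[0], X[1:] - rho * X[:-1]])
        beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)  # GLS step
    return beta, rho

# Synthetic example loosely patterned on the article's setup: an annual
# outcome series (73 observations, as in 1928-2000) driven by a single
# covariate, with AR(1) disturbances. All numbers are made up.
rng = np.random.default_rng(0)
n = 73
x = rng.normal(size=n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.6 * e[t - 1] + rng.normal()   # AR(1) errors, rho = 0.6
y = 2.0 - 1.5 * x + e                      # a negative slope, as in the finding
X = np.column_stack([np.ones(n), x])
beta, rho = prais_winsten(y, X)
print(beta, rho)  # slope recovered near -1.5, rho near 0.6
```

Ordinary OLS on such data gives unbiased coefficients but misleading standard errors, which is why the article corrects for autocorrelation before drawing inferences; the full model would add the three structural covariates of homicide as further columns of X.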
TL;DR: Martha Nussbaum's Frontiers of Justice is a major critical assessment of John Rawls's contractarian theory of justice and a highly original treatment of three "unsolved problems of justice": our duties to the impaired and disabled; to members of other nationalities; and to other animal species.
Abstract: Frontiers of Justice: The Capabilities Approach vs. Contractarianism. FRONTIERS OF JUSTICE: DISABILITIES, NATIONALITY, SPECIES MEMBERSHIP. By Martha C. Nussbaum.† Cambridge, MA: Harvard University Press, 2006. Pp. 487. $35.00. Martha Nussbaum's Frontiers of Justice1 is a major critical assessment of John Rawls's contractarian theory of justice and a highly original treatment of three "unsolved problems of justice": our duties to the impaired and disabled; to members of other nationalities; and to other animal species. Nussbaum develops and applies the "capabilities approach" to justice, which she set forth in Women and Human Development.2 In that book, Nussbaum presents the capabilities approach in connection with issues of women's rights and sex equality in developing countries. In Frontiers of Justice, she exhibits the versatility of the capabilities approach by further developing her theory in addressing three different problems. The intuitive idea behind the capabilities approach is that there are certain basic human needs and capacities that must be realized to a minimum degree if human beings are to live a decent life consonant with human dignity and flourishing. Some of these "capabilities for human functioning" are straightforward and noncontroversial: a normal life span; health and nutrition; bodily integrity; and adequate development of one's senses, imagination, and thinking capacities. Others are more controversial (e.g., opportunities for political participation; nondiscrimination on the basis of sex, sexual orientation, religion, race, ethnicity, etc.). Working from ideas of human dignity and "truly human functioning" said to be implicit in Aristotle and Marx, Nussbaum presents an account of the central human capabilities that society must satisfy (by the provision of necessary rights, entitlements, and background social conditions) if it is to render minimal justice to all its members.
In developing and applying her capabilities approach to rights of the disabled, other nationalities, and animals, Nussbaum engages in a thorough criticism of social contract doctrine. She focuses on John Rawls, providing a lengthy discussion and assessment of his position. Her main contention is that, for all its strengths in dealing with issues of social, political, and economic justice among "normal" cooperating members of a democratic society, contractarianism proves inadequate when extended to disabilities, other nationalities, and other species. For this and other reasons, she finds Rawls's account of justice "seriously flawed." In my discussion, I proceed in a different order from Nussbaum's exposition. Whereas she discusses contractarianism's shortcomings first and then argues that her position better deals with the issues, I first set forth in Part I the broad outlines of Nussbaum's capabilities approach to justice and briefly discuss how it applies to disabilities and other nationalities. This enables a better appreciation of the originality and strengths of her capabilities approach. In Part II, I review Nussbaum's general criticisms of Rawlsian contractarianism. I then present in Part III a different interpretation of Rawls's contractarianism than Nussbaum's. Parts IV and VI respond to Nussbaum's specific criticisms of Rawls's treatment of disabilities and international justice, respectively, while Part V explains why primary goods are the appropriate measure of the "currency" of justice. I conclude in Part VII with a brief exposition of her account of our duties to other animal species. This is a lengthy and complex book. I do not pretend to have addressed all of Nussbaum's criticisms of contractarianism, or to have fully done her book justice.
For all the disagreements I have with Nussbaum's criticisms of Rawls, her book is a major development of an alternative to the approach to justice and human rights provided by contractarianism and other contemporary positions. Her capabilities approach yields significant advances in our understanding of three difficult areas in moral and political philosophy and points the way toward addressing these problems. …