Author

Melanie Dulong de Rosnay

Bio: Melanie Dulong de Rosnay is an academic researcher at the Centre national de la recherche scientifique. Her research focuses on the public domain and metadata. She has an h-index of 10, has co-authored 68 publications, and has received 387 citations. Her previous affiliations include Harvard University and the University of Paris.


Papers
Journal ArticleDOI
TL;DR: The authors argue that even though sectors are currently regulated by different laws and policies governing data of a different nature, a common techno-legal framework can be defined to address legal, cultural and institutional challenges in a cross-sectorial manner.
Abstract: This paper addresses current trends and issues with regard to opening up data held by public entities in various sectors, including public sector information, geographic data, cultural heritage, scientific publications and data. In the paper, opening up public data is defined as making it available for any purpose of use. While several initiatives have been taken within Europe to make public data available, many issues remain unsolved. Based on the state of play in various sectors, this paper gives an overview of common issues that need to be addressed in order to improve the accessibility and reusability of public data. It argues that even though sectors are currently regulated by different laws and policies governing data of a different nature, a common techno-legal framework can be defined to address legal, cultural and institutional challenges in a cross-sectorial manner.

44 citations

Dissertation
26 Oct 2007
TL;DR: This work analyzes how law and technology were conceptualised independently and introduces a model based on the mutual influence between regulation by law and regulation by technology, toward a redesign of legal categories and an improved technical expression of rights.
Abstract: Technological developments have led to an exponential increase in the circulation of works and information on networks. Regulatory models from the analogue era, based on the scarcity and exclusivity of physical media, are challenged by the digital paradigms of copying, remixing and sharing. Copyright was developed and adapted in step with innovations in reproduction and dissemination technologies, as an artificial corrective granting a limited monopoly of exploitation. But copyright can also lead to commons. We analyse how law and technology were conceptualised independently. The process of producing technical standards and the extension of exclusive rights are creating tensions between cultural industries and the public. This conception led to an entanglement of regulation by law and technical protection measures, to the benefit of regulation by technology. Following research on lex informatica, we introduce a model based on the mutual influence between regulation by law and regulation by technology, toward a redesign of legal categories and an improved technical expression of rights. The development of applications, ontologies and legal metadata makes it possible to automate the management of exchanges of information and works. Integrating regulation by law and regulation by technology, this model was built on a systematic analysis of the various licensing models emerging on the networks, between access control and the constitution of commons.

38 citations

BookDOI
01 Jun 2012
TL;DR: In this article, the authors argue that the public domain is fundamental to a healthy society and that the norms regulating culture's use (i.e., copyright and related rights) have become increasingly restrictive.
Abstract: Digital technology has made culture more accessible than ever before. Texts, audio, pictures and video can easily be produced, disseminated, used and remixed using devices that are increasingly user-friendly and affordable. However, along with this technological democratization comes a paradoxical flipside: the norms regulating culture's use — copyright and related rights — have become increasingly restrictive. This book brings together essays by academics, librarians, entrepreneurs, activists and policy makers, who were all part of the EU-funded Communia project. Together the authors argue that the Public Domain — that is, the informational works owned by all of us, be that literature, music, the output of scientific research, educational material or public sector information — is fundamental to a healthy society. The essays range from more theoretical papers on the history of copyright and the Public Domain, to practical examples and case studies of recent projects that have engaged with the principles of Open Access and Creative Commons licensing.

29 citations

Proceedings ArticleDOI
04 Jun 2007
TL;DR: Generic means that rightsholders should only need to express the license they need once; semi-automatic tools should then translate this license so it can be browsed by any specific system. This requires modelling concept semantics in order to translate a license expressed in generic terms into more specific terms compliant with the specific standards used by distribution systems.
Abstract: Digital content distributed over the internet is regulated by law and by technical management systems. The latter include a semantic component that describes licenses, i.e. the rights of use granted to the user. These elements of Digital Rights Management (DRM) systems are called Rights Expression Languages (RELs); they gather the terms and relations needed to build licenses. Some are based on an ontology of online licenses that is not necessarily related to applicable law and the various legal systems, and they cannot interoperate. As a consequence, there is a need for a more generic way to express licenses. Here, generic means that rightsholders should only need to express the license they need once; semi-automatic tools should then translate this license so it can be browsed by any specific system. Hence the need to model concept semantics in order to translate a license expressed in generic terms into more specific terms compliant with the specific standards used by distribution systems. This work is part of larger studies on legal ontologies, legal systems and RELs.
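The generic-license idea described in the abstract (express a license once, then have tools translate it into each system's vocabulary) can be sketched as a simple term mapping. Everything in the sketch below is invented for illustration: the term names and system vocabularies are hypothetical stand-ins for real RELs such as ODRL or MPEG-21 REL, whose actual models are far richer.

```python
# Hypothetical sketch of a "generic license, translated per system" flow.
# The vocabularies and term names are invented for illustration only.

# A generic license: the rights a rightsholder grants, expressed once.
generic_license = {
    "reproduce": True,
    "distribute": True,
    "modify": False,
    "commercial_use": False,
}

# Per-system vocabularies: each distribution system names the same
# concepts differently (invented names, standing in for real RELs).
VOCABULARIES = {
    "system_a": {"reproduce": "copy", "distribute": "share",
                 "modify": "derive", "commercial_use": "sell"},
    "system_b": {"reproduce": "Reproduction", "distribute": "Distribution",
                 "modify": "DerivativeWorks", "commercial_use": "CommercialUse"},
}

def translate(license_terms, target_system):
    """Map a generic license onto one system's specific vocabulary."""
    vocab = VOCABULARIES[target_system]
    return {vocab[term]: granted for term, granted in license_terms.items()}

print(translate(generic_license, "system_a"))
print(translate(generic_license, "system_b"))
```

The point of the sketch is the shape of the problem: the mapping table is where shared concept semantics must live, and in practice the concepts of different RELs do not line up one-to-one, which is why the paper argues for an ontology rather than a lookup table.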

26 citations

01 Jan 2009
TL;DR: In this article, the authors present the different licenses, identify various possible sources of legal incompatibility, evaluate their actual impact, and finally propose recommendations to mitigate risks and improve compatibility, consistency, clarity, and legal security by restructuring and simplifying the system.
Abstract: Creative Commons licenses have been designed to facilitate the use and reuse of creative works by granting some permissions in advance. However, the system is complex, with a multiplicity of license options, formats and versions available, including translations into different languages and adaptations to specific legislations, yielding versions that are declared compatible with each other after an international porting process. It should be assessed whether all licenses cover exactly the same subject matter, rights and restrictions, or whether small differences in language may affect the rights actually granted, the legal security of current users, or the availability of works for future generations to access and build upon. As different licenses are phrased differently, these differences may change the content of the grant and its substantial conditions, thereby affecting users' expectations and threatening the validity of consent along the modification chain. Possible sources of legal uncertainty and incompatibility, as well as their actual or potential consequences for the validity and enforceability of the licenses across jurisdictions with different and possibly inconsistent legislations, need to be evaluated. This study presents the different licenses (chapter 2), identifies various possible sources of legal incompatibility (chapter 3), evaluates their actual impact (chapter 4) and finally proposes recommendations (chapter 5) to mitigate risks and improve compatibility, consistency, clarity, and legal security by restructuring and simplifying the system.
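The pairwise compatibility question the study raises can be made concrete with a toy model. The rules below are a deliberate oversimplification (real compatibility also turns on license versions and jurisdiction ports, which is precisely the study's point), but they show the shape of such an evaluation: NoDerivatives blocks remixing outright, and two different ShareAlike licenses each demand that the combined work carry their own terms.

```python
# Toy model of remix compatibility between Creative Commons license
# element sets, e.g. {"BY"} for Attribution or {"BY", "NC", "SA"} for
# Attribution-NonCommercial-ShareAlike. Deliberately simplified: it
# ignores versions, ports, and one-sided conflicts such as SA vs. NC.

def can_combine(elements_a: set, elements_b: set) -> bool:
    """Can works under these two element sets be combined in one remix?"""
    # NoDerivatives forbids building a new work on either input.
    if "ND" in elements_a or "ND" in elements_b:
        return False
    # Two different ShareAlike (copyleft) terms conflict: each requires
    # the combined work to be released under its own license.
    if "SA" in elements_a and "SA" in elements_b and elements_a != elements_b:
        return False
    return True

print(can_combine({"BY"}, {"BY", "SA"}))   # combinable in this toy model
print(can_combine({"BY", "ND"}, {"BY"}))   # ND blocks remixing
```

Even this crude model shows why the study calls for simplification: every new option, version, or ported variant multiplies the number of license pairs whose interaction has to be checked.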

19 citations


Cited by
Journal ArticleDOI
TL;DR: The open government data life-cycle is described, publishing and consuming processes required within open government data initiatives are discussed, and guidelines for publishing data are provided in an integrated overview.

567 citations

Journal Article
TL;DR: In Cognitive Surplus: Creativity and Generosity in a Connected Age, Clay Shirky argues that the free time of the world's educated citizenry, treated as an aggregate, amounts to a kind of cognitive surplus that has not always been used wisely.
Abstract: Cognitive Surplus: Creativity and Generosity in a Connected Age. Clay Shirky. New York, NY: Penguin, 2010. 242 pages, $25.95 pbk. It sometimes seems that the hardest thing to do in the Information Age is to communicate. In the rush of easily accessible data and the maelstrom of conflicting viewpoints, two otherwise intelligent people can talk past one another as they stake out territory with the tenacity of computer viruses. NYU professor Clay Shirky and media critic Nicholas Carr have been squaring off now for two years over what impact the Internet is having on our society. Shirky takes the more optimistic viewpoint, Carr the more pessimistic. Carr threw down the gauntlet with his 2008 Atlantic cover article "Is Google Making Us Stupid?" and has continued the debate with his recent book The Shallows. Shirky provides a response in Cognitive Surplus, but criticizes skeptics like Carr only obliquely. Shirky says their main frustration is with the profusion of choice: "Scarcity," he says, "is easier to deal with than abundance." But this book doesn't dwell much on the naysayers. Instead, it frames the subject of the Internet with a bold and startling vision about its potential. "Imagine," he says, "treating the free time of the world's educated citizenry as an aggregate, a kind of cognitive surplus." We've not always used that surplus wisely. Shirky's case in point is television, and how it has come to dominate our culture. Over much of the planet, Shirky writes, "the three most common activities are . . . work, sleep, and watching TV." Like gin in early-eighteenth-century London, twentieth-century television is one of those social habits that critics have denounced and tried hard to minimize, but without success. There are signs now, however, that TV viewing - still thoroughly popular - isn't quite the juggernaut it used to be.
Young people are increasingly turning to the Internet; and the Web, it turns out, allows humans to do things they can't do with other media - namely, create, produce, and connect. Instead of devoting twenty passive hours a week to the tube (the international average), people now use a medium that lets them make and share things. That may seem trivial, considering the amount of silly, offensive, or deceptive fare on the Web, but think of it this way: Which do you think contains more enduring cultural, intellectual, and societal value - posting comments on a blog or watching Gilligan's Island? The Internet is no utopia, but neither are older media. In fact, some of them may be a good deal less salutary. Take, for example, the online fantasy game World of Warcraft. As Shirky puts it in a tart retort: "However pathetic you may think it is to sit in your basement pretending to be an elf, I can tell you from personal experience: it's worse to sit in your basement trying to decide whether Ginger or Mary Ann is cuter." There's a wide variety of content on the Internet, of course - the widest in any medium - and the largest number of producers of content: a colossal and revolutionary force. The "hundred million hours of cumulative thought" it took to produce Wikipedia is one example of Shirky's "cognitive surplus," and while it still pales before the "two hundred billion hours of TV every year," you can readily see the potential. …

421 citations

Journal ArticleDOI
TL;DR: A review of Biopiracy: The Plunder of Nature and Knowledge, published in Social Epistemology.
Abstract: (1999). Biopiracy: The Plunder of Nature and Knowledge. Social Epistemology: Vol. 13, No. 2, pp. 239-240.

392 citations

Journal ArticleDOI
TL;DR: Darpa-sponsored technologies employed effectively by the Pentagon include such military computer networks as Milnet and Simnet, graphically oriented flight simulators used by the U.S. Air Force, packet-switching radio and satellite systems for a digital military communications system, and advanced command-and-control systems, among many others.
Abstract: Perhaps because networking and microcomputer companies have been so successful during the past few years, the current tendency in Washington, D.C., is to give the industrial sector all the credit for the grand successes in computing of the past half-century. In fact, at least as much of the technology that has driven this revolution originated in academia, and much of the research in both sectors would not have been undertaken without the far-sighted support of the U.S. government. By far the largest Federal supporter of computer R&D has been the Information Processing Techniques Office of the Defense Advanced Research Projects Agency (Darpa, or, during the first couple years of the Clinton administration, themselves in military terms alone. Examples of Darpa-sponsored technologies employed effectively by the Pentagon include such military computer networks as Milnet and Simnet, graphically oriented flight simulators used by the U.S. Air Force, packet-switching radio and satellite systems for a digital military communications system, and advanced command-and-control systems, among many others. Darpa originated in 1958 as a response to the Soviet Union's launch of the first space satellite, Sputnik. In setting up the agency, President Dwight D. Eisenhower's intention was to transcend the rivalries among the military services in advanced research and development. The agency's early years were devoted to the space program and nuclear test verification, and it was not until 1962 that the Information Processing Techniques office was formed. But the computing program grew with a vengeance, and within a year its expenditures were larger than all other U.S. government outlays on computing R&D combined. The influence of Darpa's computing office has been most noticeable in breakthrough technologies. In the 1960s, for example, it supported the Massachusetts Institute of Technology's Project MAC, which built the first timesharing computer. The result was that

339 citations

Journal Article
TL;DR: The Code and Other Laws of Cyberspace by Lawrence Lessig as discussed by the authors is perhaps the most original book yet written about cyberspace law, focusing on the relationship between law, economic markets, norms, and an intriguing category he calls "architecture".
Abstract: Code and Other Laws of Cyberspace. Lawrence Lessig. New York: Basic Books, 1999. 230 pp. $21 hbk. Lawrence Lessig's Code and Other Laws of Cyberspace is perhaps the most original book yet written about cyberspace law. With an accessible style that is rich in anecdotes and metaphors and surprisingly low on legal and technical jargon, Lessig, a constitutional law scholar at Harvard and a consultant on the Microsoft antitrust case, writes for a wide scholarly audience. Even more than the recent The Control Revolution (New York: Century Foundation, 1999) by Andrew Shapiro, Lessig has managed to write about a rapidly evolving subject at a level of abstraction and theoretical sophistication that ensures that his contribution will long remain relevant. Central to Lessig's theoretical framework is his breakdown of four modalities of regulation: law, economic markets, norms, and an intriguing category he calls "architecture." Architecture includes constraints that the natural world imposes or that people construct-for example, speed bumps on roads. The architecture in cyberspace is computer code: code constructs the cyber-world. Consider the example of pornography regulation. Lessig contrasts how these four modalities of regulation control access to pornography in "real space" and in cyberspace, vividly demonstrating why a simple translation of existing law from "real space" to cyberspace is often impossible. In real space, pornography is extensively regulated through all four modalities, in ways that limit children's access to it. Laws require that vendors sell pornography only to adults. Markets prevent children from accessing pornography because it costs money. Norms help limit children's access to pornography, both by stigmatizing pornography and stigmatizing dealers who would sell it to children. 
In the category of real space "architecture," there is the simple matter of how a child looks: even if a child attempted to disguise himself as an adult, he would probably not be successful at fooling a salesperson. These four kinds of constraints may be imperfect, but are reasonably effective in limiting children's access to pornography in real space, Lessig contends. In cyberspace, however, the modalities of regulation of pornography operate differently and less effectively. Law is unsettled. Markets are different in at least two ways: distribution of digital files is much cheaper than distribution of printed material or videos, and though there are plenty of commercial pornography sites, much pornography can be accessed at no cost through newsgroups. Differences in the "architecture" of cyberspace are crucial because they allow faceless transactions. Faceless transactions may make disguising identity and age simple, therefore allowing a child to avoid the restrictions of law and norms. Thus the "real space" regulatory regime is upset, making pornography generally more available to children in cyberspace. Ways of filtering content or zoning the Internet are now extensively debated, but no simple transfer of the offline constraints to cyberspace seems possible. As he applies his theory to pornography regulation and the regulation of intellectual property and privacy, Lessig emphasizes that not only the codes of law but computer code, the "architecture" of cyberspace, can "embed" values and enable government and private actors to control behavior. …

333 citations