Author

Federico Maggi

Bio: Federico Maggi is an academic researcher from the University of Sydney. The author has contributed to research in topics: Malware & Settling. The author has an h-index of 36 and has co-authored 185 publications receiving 4,120 citations. Previous affiliations of Federico Maggi include Lawrence Berkeley National Laboratory and Duke University.


Papers
Journal ArticleDOI
TL;DR: In this paper, the authors used a global database of pesticide applications and a spatially explicit environmental model to estimate the world geography of environmental pollution risk caused by 92 active ingredients in 168 countries.
Abstract: Pesticides are widely used to protect food production and meet global food demand, but they are also ubiquitous environmental pollutants, causing adverse effects on water quality, biodiversity and human health. Here we use a global database of pesticide applications and a spatially explicit environmental model to estimate the world geography of environmental pollution risk caused by 92 active ingredients in 168 countries. We considered a region to be at risk of pollution if pesticide residues in the environment exceeded the no-effect concentrations, and to be at high risk if residues exceeded this threshold by three orders of magnitude. We find that 64% of global agricultural land (approximately 24.5 million km²) is at risk of pesticide pollution by more than one active ingredient, and 31% is at high risk. Among the high-risk areas, about 34% are in high-biodiversity regions, 5% in water-scarce areas and 19% in low- and lower-middle-income nations. We identify watersheds in South Africa, China, India, Australia and Argentina as high-concern regions because they have high pesticide pollution risk, bear high biodiversity and suffer from water scarcity. Our study expands earlier pesticide risk assessments as it accounts for multiple active ingredients and integrates risks in different environmental compartments at a global scale. Pesticide pollution is a risk for two-thirds of agricultural land; a third of high-risk areas are in high-biodiversity regions and a fifth are in low- and lower-middle-income areas, according to environmental modelling combined with pesticide application data.
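The risk criterion stated in the abstract can be summarized as a simple threshold rule. The sketch below is purely illustrative (it is not the authors' spatially explicit model): a region is "at risk" when a residue concentration exceeds the no-effect concentration (NEC), and "high risk" when it exceeds the NEC by three orders of magnitude.

```python
# Illustrative sketch, not the authors' model: classify pollution risk for
# one active ingredient from the residue-to-NEC relationship in the abstract.
def risk_class(residue: float, nec: float) -> str:
    """Return 'no risk', 'at risk', or 'high risk'."""
    if residue <= nec:
        return "no risk"
    if residue > 1000 * nec:  # exceeds the NEC by three orders of magnitude
        return "high risk"
    return "at risk"

print(risk_class(0.5, 1.0))     # residue below the no-effect concentration
print(risk_class(5.0, 1.0))     # above NEC, within three orders of magnitude
print(risk_class(2000.0, 1.0))  # more than 1000x the NEC
```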

274 citations

Book ChapterDOI
10 Jul 2014
TL;DR: This work proposes Phoenix, a mechanism that, in addition to telling DGA- and non-DGA-generated domains apart using a combination of string and IP-based features, characterizes the DGAs behind them, and, most importantly, finds groups of DGAs that are representative of the respective botnets.
Abstract: Modern botnets rely on domain-generation algorithms (DGAs) to build resilient command-and-control infrastructures. Given the prevalence of this mechanism, recent work has focused on the analysis of DNS traffic to recognize botnets based on their DGAs. While previous work has concentrated on detection, we focus on supporting intelligence operations. We propose Phoenix, a mechanism that, in addition to telling DGA- and non-DGA-generated domains apart using a combination of string and IP-based features, characterizes the DGAs behind them, and, most importantly, finds groups of DGA-generated domains that are representative of the respective botnets. As a result, Phoenix can associate previously unknown DGA-generated domains to these groups, and produce novel knowledge about the evolving behavior of each tracked botnet. We evaluated Phoenix on 1,153,516 domains, including DGA-generated domains from modern, well-known botnets: without supervision, it correctly distinguished DGA- vs. non-DGA-generated domains in 94.8 percent of the cases, characterized families of domains that belonged to distinct DGAs, and helped researchers “on the field” in gathering intelligence on suspicious domains to identify the correct botnet.
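One intuition behind string features for DGA detection is that algorithmically generated labels look more random than human-registered ones. The sketch below shows a single such feature, character entropy of the second-level label; it is a common heuristic offered for illustration only and is not Phoenix's actual feature set or implementation.

```python
import math
from collections import Counter

# Illustrative string feature only (not Phoenix's implementation):
# character entropy of a domain's second-level label. DGA-generated
# labels tend to score higher than dictionary-like, human-chosen names.
def label_entropy(domain: str) -> float:
    label = domain.split(".")[0]
    counts = Counter(label)
    n = len(label)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(label_entropy("google.com"))        # low-ish entropy, repeated letters
print(label_entropy("xj4k9qz2vb7w.com"))  # all-distinct characters, high entropy
```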

211 citations

Proceedings ArticleDOI
05 Dec 2016
TL;DR: ShieldFS, an add-on driver that makes the Windows native filesystem immune to ransomware attacks, is proposed and evaluated in real-world working conditions on real, personal machines, against samples from state of the art ransomware families.
Abstract: Preventive and reactive security measures can only partially mitigate the damage caused by modern ransomware attacks. Indeed, the remarkable amount of illicit profit and the cyber-criminals' increasing interest in ransomware schemes suggest that a fair number of users are actually paying the ransoms. Unfortunately, pure-detection approaches (e.g., based on analysis sandboxes or pipelines) are not sufficient nowadays, because often we do not have the luxury of being able to isolate a sample to analyze, and when this happens it is already too late for several users! We believe that a forward-looking solution is to equip modern operating systems with practical self-healing capabilities against this serious threat. Towards such a vision, we propose ShieldFS, an add-on driver that makes the Windows native filesystem immune to ransomware attacks. For each running process, ShieldFS dynamically toggles a protection layer that acts as a copy-on-write mechanism, according to the outcome of its detection component. Internally, ShieldFS monitors the low-level filesystem activity to update a set of adaptive models that profile the system activity over time. Whenever one or more processes violate these models, their operations are deemed malicious and the side effects on the filesystem are transparently rolled back. We designed ShieldFS after an analysis of billions of low-level filesystem I/O requests generated by thousands of benign applications, which we collected from clean machines in use by real users for about one month. This is the first measurement of the filesystem activity of a large set of benign applications in real working conditions. We evaluated ShieldFS in real-world working conditions on real, personal machines, against samples from state-of-the-art ransomware families. ShieldFS was able to detect the malicious activity at runtime and transparently recover all the original files. Although the models can be tuned to fit various filesystem usage profiles, our results show that our initial tuning yields high accuracy even on unseen samples and variants.
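The copy-on-write-and-rollback idea in the abstract can be sketched in a few lines. This is a toy in-memory model for illustration only, not ShieldFS's kernel driver: before a monitored process first modifies a file, the original content is shadowed; if the process is later deemed malicious, all its side effects are restored.

```python
# Toy sketch of the copy-on-write protection layer described in the abstract
# (not ShieldFS's driver code). Files are modeled as an in-memory dict.
class ShadowedFS:
    def __init__(self, files):
        self.files = files          # path -> current content
        self.shadow = {}            # path -> pre-modification content

    def write(self, path, data):
        # Copy-on-write: preserve the first pre-modification version.
        self.shadow.setdefault(path, self.files.get(path, b""))
        self.files[path] = data

    def rollback(self):
        # Detection fired: transparently undo the process's side effects.
        for path, original in self.shadow.items():
            self.files[path] = original
        self.shadow.clear()

fs = ShadowedFS({"report.doc": b"quarterly numbers"})
fs.write("report.doc", b"ENCRYPTED!!!")  # ransomware-like overwrite
fs.rollback()                            # restore the original file
print(fs.files["report.doc"])
```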

206 citations

Book ChapterDOI
02 Nov 2015
TL;DR: HelDroid is presented, a fast, efficient and fully automated approach that recognizes known and unknown scareware and ransomware samples from goodware, based on detecting the "building blocks" that are typically needed to implement a mobile ransomware application.
Abstract: In ransomware attacks, the actual target is the human, as opposed to classic attacks that abuse the infected devices (e.g., botnet renting, information stealing). Mobile devices are by no means immune to ransomware attacks. However, there is little research work on this matter and only traditional protections are available. Even state-of-the-art mobile malware detection approaches are ineffective against ransomware apps because of the subtle attack scheme. As a consequence, the ample attack surface formed by the billions of mobile devices is left unprotected. First, in this work we summarize the results of our analysis of the existing mobile ransomware families, describing their common characteristics. Second, we present HelDroid, a fast, efficient and fully automated approach that recognizes known and unknown scareware and ransomware samples from goodware. Our approach is based on detecting the "building blocks" that are typically needed to implement a mobile ransomware application. Specifically, HelDroid detects, in a generic way, if an app is attempting to lock or encrypt the device without the user's consent, and if ransom requests are displayed on the screen. Our technique works without requiring that a sample of a certain family is available beforehand. We implemented HelDroid and tested it on real-world Android ransomware samples. On a large dataset comprising hundreds of thousands of APKs including goodware, malware, scareware, and ransomware, HelDroid exhibited nearly zero false positives and the capability of recognizing unknown ransomware samples.
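The "building blocks" logic in the abstract amounts to combining independent indicators. The sketch below is a hedged illustration of that combination only; the indicator names are hypothetical, and HelDroid's actual detectors (locking, encryption, and on-screen threatening-text analysis) are far more involved.

```python
# Hedged sketch of combining ransomware "building block" indicators, as the
# abstract describes; indicator flags are hypothetical, not HelDroid's API.
def classify_app(locks_device: bool, encrypts_files: bool,
                 shows_threatening_text: bool) -> str:
    if shows_threatening_text and (locks_device or encrypts_files):
        return "ransomware"
    if shows_threatening_text:
        return "scareware"  # threats without actual locking or encryption
    return "benign"

print(classify_app(True, False, True))   # locker behavior + ransom note
print(classify_app(False, False, True))  # only scare text on screen
print(classify_app(False, True, False))  # e.g. a backup app encrypting data
```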

192 citations

Journal ArticleDOI
TL;DR: An unsupervised host-based intrusion detection system based on system call arguments and sequences that has a good signal-to-noise ratio, and is also able to correctly contextualize alarms, giving the user more information to understand whether a true or false positive happened.
Abstract: We describe an unsupervised host-based intrusion detection system based on system call arguments and sequences. We define a set of anomaly detection models for the individual parameters of the call. We then describe a clustering process that helps to better fit models to system call arguments and creates interrelations among different arguments of a system call. Finally, we add a behavioral Markov model in order to capture time correlations and abnormal behaviors. The whole system needs no prior knowledge input; it has a good signal-to-noise ratio, and it is also able to correctly contextualize alarms, giving the user more information to understand whether a true or false positive happened, and to detect global variations over the entire execution flow, as opposed to punctual ones over individual instances.
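The behavioral Markov-model component mentioned in the abstract can be illustrated with first-order transition probabilities over system-call sequences. This is an illustrative sketch under that simplifying assumption, not the paper's implementation: train on benign traces, then flag transitions whose learned probability falls below a threshold.

```python
from collections import defaultdict

# Illustrative first-order Markov model over system-call sequences (a
# simplified stand-in for the behavioral model described in the abstract).
def train(sequences):
    counts = defaultdict(int)
    totals = defaultdict(int)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
            totals[a] += 1
    # Transition probability P(b | a) estimated from benign traces.
    return {(a, b): c / totals[a] for (a, b), c in counts.items()}

model = train([["open", "read", "read", "close"],
               ["open", "read", "close"]])

def is_anomalous(a, b, threshold=0.1):
    # Unseen or rare transitions are flagged as anomalous.
    return model.get((a, b), 0.0) < threshold

print(is_anomalous("open", "read"))  # frequent transition -> False
print(is_anomalous("open", "exec"))  # never observed -> True
```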

146 citations


Cited by
01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models which can be difficult to parallelize efficiently: those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine which allows for message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers: the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90% and a 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.
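The abstract names three decomposition strategies by their atom-to-processor assignment rule. The sketch below illustrates two of them, atom decomposition and (1-D, for simplicity) spatial decomposition, as bare assignment functions; this is a toy illustration, not the paper's MD code.

```python
# Toy illustration of two of the three decompositions in the abstract.
def atom_decomposition(n_atoms: int, n_procs: int, atom_id: int) -> int:
    # Each processor owns a fixed contiguous block of atom indices,
    # regardless of where the atoms move in space.
    return atom_id * n_procs // n_atoms

def spatial_decomposition(x: float, box_len: float, n_procs: int) -> int:
    # Each processor owns a fixed slab of a 1-D box; atoms migrate
    # between processors as their coordinates change.
    return min(int(x / box_len * n_procs), n_procs - 1)

print(atom_decomposition(100, 4, 10))       # atom 10 of 100 -> processor 0
print(spatial_decomposition(7.5, 10.0, 4))  # x=7.5 in a box of 10 -> processor 3
```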

29,323 citations

01 Jan 2002

9,314 citations

01 Jan 2016

5,249 citations

Journal ArticleDOI
23 Nov 2015-Nature
TL;DR: It is argued that the available evidence does not support the formation of large-molecular-size and persistent ‘humic substances’ in soils, and instead soil organic matter is a continuum of progressively decomposing organic compounds.
Abstract: Instead of containing stable and chemically unique ‘humic substances’, as has been widely accepted, soil organic matter is a mixture of progressively decomposing organic compounds; this has broad implications for soil science and its applications. The exchange of nutrients, energy and carbon between soil organic matter, the soil environment, aquatic systems and the atmosphere is important for agricultural productivity, water quality and climate. Long-standing theory suggests that soil organic matter is composed of inherently stable and chemically unique compounds. Here we argue that the available evidence does not support the formation of large-molecular-size and persistent ‘humic substances’ in soils. Instead, soil organic matter is a continuum of progressively decomposing organic compounds. We discuss implications of this view of the nature of soil organic matter for aquatic health, soil carbon–climate interactions and land management. Soil organic matter contains a large portion of the world's carbon and plays an important role in maintaining productive soils and water quality. Nevertheless, a consensus on the nature of soil organic matter is lacking. Johannes Lehmann and Markus Kleber argue that soil organic matter should no longer be seen as large and persistent, chemically unique substances, but as a continuum of progressively decomposing organic compounds.

2,206 citations