A framework for cyber surveillance of unlawful activities for critical infrastructure using computational grids
Summary
- News reports reveal the use of cyber infrastructure to plan and carry out unlawful activities against governments, critical infrastructure, and other targets.
- Moreover, if biometric identification of suspects (or persons with a criminal background) is considered, the system can scan images, videos, keystrokes, mouse movements, and so on.
- The volume of such data is enormous; grid computing is therefore a good tool for managing and mining it.
- Web mining has become critical for developing key intelligence to monitor social networks, cyber-terrorism-related activities, flooding of abusive content, Web management, security, business and support services, personalization, network traffic flow analysis, and so on.
2.1. Reducing and Indexing Voluminous Data
- Data reduction is a critical problem in countering cyber terrorism: there are large collections of documents that must be analyzed and processed, raising issues of performance, lossless reduction, polysemy (i.e., the meaning of individual words being influenced by their surrounding words), and synonymy (i.e., the possibility of the same concept being described in different ways).
- Traditional data mining algorithms are enhanced to find specific relationships or patterns in a particular data source.
- Non-negative matrix factorization (NMF), together with its tensor extensions, is a recently developed technique for finding parts-based, linear representations of non-negative data.
- Techniques like Singular Value Decomposition (SVD), Semi-Discrete Decomposition (SDD), and Higher-Order SVD also generate basis vectors, various additive and subtractive combinations of which can be used to reconstruct the original space; a minimal sketch of both approaches follows this list.
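As an illustration of the reduction step, the sketch below factorizes a tiny term-document matrix with both NMF and truncated SVD using scikit-learn. The corpus, the component count, and the library choice are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: reducing a term-document matrix with NMF and truncated SVD.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF, TruncatedSVD

# Tiny placeholder corpus standing in for a large document collection.
corpus = [
    "attack on power grid infrastructure",
    "suspicious network traffic flow detected",
    "power grid traffic monitoring report",
]

# Build a non-negative TF-IDF term-document matrix.
X = TfidfVectorizer().fit_transform(corpus)

# NMF: parts-based, purely additive representation (W @ H approximates X).
nmf = NMF(n_components=2, init="nndsvd", random_state=0)
W = nmf.fit_transform(X)   # document-topic weights
H = nmf.components_        # topic-term basis vectors

# Truncated SVD: basis vectors combine additively and subtractively.
docs_svd = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

print("NMF document weights:\n", W)
print("SVD document weights:\n", docs_svd)
```

Either factorization yields a low-dimensional index over which similarity queries run far faster than over the raw documents.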
2.2. Distributed Data Mining
- The authors need to devise a scheme for managing and efficiently updating the huge Web databases.
- In a distributed environment the data is updated periodically, and by managing the synchronization between the master and a replica (the periodically updated version) it is possible to manage the data effectively.
- Traditional approaches to knowledge discovery for Web data repositories concentrated on the process of collecting and synchronizing the data into a single location using some kind of client/server architecture and data warehousing, and then analyzing the data using fast parallel computers along with efficient knowledge discovery algorithms.
- The authors propose to analyze the full available data rather than taking samples at regular intervals or random samples.
- A particular data-mining algorithm analyzes each data source, and the individual descriptors are finally combined to formulate meta-knowledge, as shown in Figure 2(a) and sketched below.
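A minimal sketch of that combine step, assuming the per-node descriptor is a simple term-frequency counter; a real deployment would emit richer model fragments, and the partitions here are invented placeholders.

```python
# Minimal sketch of the meta-knowledge step of Figure 2(a): each grid node
# mines its full local partition, and only compact descriptors travel to
# the coordinator for combination.
from collections import Counter

def mine_local(partition):
    """Local mining step: analyze all records, return a compact descriptor."""
    descriptor = Counter()
    for record in partition:
        descriptor.update(record.split())
    return descriptor

def combine(descriptors):
    """Coordinator step: merge per-node descriptors into meta-knowledge."""
    meta = Counter()
    for d in descriptors:
        meta.update(d)
    return meta

# Each list stands in for the full data held at one grid node (no sampling).
node_partitions = [
    ["grid attack report", "traffic attack alert"],
    ["traffic flow normal", "grid maintenance log"],
]

local_descriptors = [mine_local(p) for p in node_partitions]
print(combine(local_descriptors).most_common(3))
```

Only the compact descriptors cross the network, which is what makes mining the full data at each node cheaper than shipping everything to a central warehouse.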
2.3. Web Traffic Flow Monitoring
- Accurate traffic flow monitoring can sometimes reveal new aspects of Internet behaviour that are otherwise unknown or poorly understood.
- Network monitoring and measurement would also help to spot abnormal traffic flow within a suspected social network.
- Traffic flow monitoring also helps us analyze network infrastructure trends and user behavior, and improve the security of the cyber-infrastructure.
- LOBSTER is a successful pilot European passive Internet traffic monitoring project that is capable of providing early warning alerts for traffic anomalies and security incidents, and of providing accurate and meaningful measurements of performance.
- The authors plan to deploy passive monitoring sensors at speeds starting from 2.5 Gbps and possibly up to 10 Gbps, based on LOBSTER technology, for advanced network traffic flow monitoring; a minimal anomaly check over such measurements is sketched below.
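A toy version of the early-warning idea, assuming the sensor reports one byte count per interval. The threshold rule, the synthetic counts, and the three-sigma cutoff are illustrative assumptions rather than LOBSTER's actual detection logic.

```python
# Minimal sketch of an anomaly alert: flag a flow-volume sample that
# deviates sharply from a sliding baseline of recent measurements.
from statistics import mean, stdev

def is_anomalous(history, sample, k=3.0):
    """Flag samples more than k standard deviations above the baseline."""
    mu, sigma = mean(history), stdev(history)
    return sample > mu + k * sigma

# Bytes observed per interval on a monitored link (synthetic values).
baseline = [980, 1010, 995, 1020, 1005, 990, 1015, 1000]
print(is_anomalous(baseline, 1012))   # False: within normal variation
print(is_anomalous(baseline, 5400))   # True: candidate alert for analysts
```

In practice each monitored link would keep its own sliding baseline, and alerts would feed the distributed analysis nodes described earlier.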
3. Surveillance Computing Grid
- Automatic surveillance is a relatively new application area for fast-advancing video and audio technologies in communication.
- A grid allows better resource management, scalability, and the application of more computationally intensive algorithms.
- Thermography also makes it easier to detect and track moving targets in real time.
- Some exemplary scenarios are given below (a background-subtraction sketch of scenario 1 follows this list):
  1. Street abnormal activity detection: detecting suspicious/unlawful activities like theft, robbery, fights, gatherings, etc.
- Surveillance Computing Grid disadvantages:
  1. Need for an unobstructed, secure communications channel and procedures between crucial parts of the system (data gathering -> data analysis -> physical action).
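A minimal sketch of the detection front end for scenario 1, using OpenCV background subtraction to flag frames with large moving regions. The video path, the history length, and the area threshold are invented placeholders; a real system would forward flagged frames to grid analysis nodes for classification.

```python
# Minimal sketch: flag frames in a street camera feed that contain large
# moving regions, as a cheap first filter before deeper grid-side analysis.
import cv2

cap = cv2.VideoCapture("street.mp4")  # placeholder path to a camera feed
subtractor = cv2.createBackgroundSubtractorMOG2(history=200)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)  # foreground (moving) pixels
    # A large foreground region suggests activity worth forwarding.
    if cv2.countNonZero(mask) > 0.05 * mask.size:
        print("activity candidate: forward frame to analysis nodes")

cap.release()
```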
3.1. Surveillance Systems
- The authors predict substantial growth in the importance of, and need for, automatic surveillance systems.
- Furthermore, terrorist activities in many European cities make the need for security a pressing issue.
- Biometric personal identification systems are in some cases complex and computationally intensive.
- Improving their effectiveness often involves increasing the number of features and incorporating additional pre-processing steps.
- Grids are naturally suited to distributing the complex stages of such systems among many parallel units, as in the sketch below.
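A minimal sketch of that distribution, farming per-sample feature extraction out to worker processes. The extractor and the sample vectors are illustrative stand-ins for real biometric pre-processing stages (minutiae extraction, face embeddings, keystroke timing, and the like).

```python
# Minimal sketch: biometric feature extraction is independent per sample,
# so samples can be distributed across parallel workers (or grid nodes).
from concurrent.futures import ProcessPoolExecutor

def extract_features(sample):
    """Placeholder extractor: summarise a raw measurement vector."""
    return (min(sample), max(sample), sum(sample) / len(sample))

samples = [
    [0.12, 0.80, 0.44],   # e.g. normalised keystroke intervals
    [0.55, 0.31, 0.90],
    [0.20, 0.20, 0.21],
]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        features = list(pool.map(extract_features, samples))
    print(features)
```

Because each sample is processed independently, the same pattern scales from local worker processes to nodes spread across a computational grid.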
5. Data Integrity and Security
- The challenge is to implement security in a grid computing environment without appreciable performance degradation.
- This is particularly difficult since key sizes have to increase to keep encrypted data protected.
- The process that needs to execute at a dedicated Domain Resource Manager (DRM) must first authenticate itself with that DRM.
- The authors suggest different methods, based on the exchange of key information, to secure subsequent communication and data transfer; one such exchange is sketched after this list.
- Distributed Intrusion Detection and Prevention Systems could consolidate intrusion detection information from many different individual sensors in the grid network and even multiple grids.
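As one concrete possibility for the key-exchange step, the sketch below runs an X25519 Diffie-Hellman exchange between a process and its DRM and derives a session key with HKDF, using the third-party cryptography package. The paper does not specify this particular scheme; it is an assumed example, and in practice the exchange would be bound to the DRM authentication step rather than standing alone.

```python
# Minimal sketch: ephemeral Diffie-Hellman key exchange between a grid
# process and its Domain Resource Manager (DRM), then derivation of a
# symmetric session key for securing subsequent data transfer.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each side generates an ephemeral key pair and sends the public half.
proc_priv = X25519PrivateKey.generate()
drm_priv = X25519PrivateKey.generate()

# Both sides compute the same shared secret from the peer's public key.
proc_secret = proc_priv.exchange(drm_priv.public_key())
drm_secret = drm_priv.exchange(proc_priv.public_key())
assert proc_secret == drm_secret

# Derive a symmetric session key from the shared secret.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"grid-session",
).derive(proc_secret)
print(session_key.hex())
```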
6. QoS-Based Surveillance System
- Privacy can be seen as an aspect of security, one in which trade-offs between the interests of one group and another can become particularly clear.
- The right against unsanctioned invasion of privacy by the government, corporations or individuals is part of many countries' privacy laws, and in some cases, constitutions.
- Nations have different laws, which limit privacy in different ways.
- To protect critical infrastructures, in some cases governments or policy makers might have to voluntarily sacrifice privacy in exchange for perceived benefits, very often accepting specific dangers and losses.
- The final aspect of this project is to develop an adaptive surveillance system that can maintain and protect privacy depending on the regulations and constitution (quality of service) of the countries involved; a toy policy table illustrating the idea is sketched below.
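A toy sketch of such jurisdiction-dependent behaviour: a per-country policy table decides how much anonymisation is applied before data leaves a node. The country keys, the policy fields, and the redaction rule are invented placeholders, not real legal requirements.

```python
# Minimal sketch: redact a surveillance record according to the privacy
# policy (quality of service) of the jurisdiction it was captured in.
POLICIES = {
    "country_a": {"blur_faces": True,  "retain_days": 7},
    "country_b": {"blur_faces": False, "retain_days": 90},
}

def apply_privacy_qos(record, jurisdiction):
    """Redact a record according to the jurisdiction's policy."""
    policy = POLICIES[jurisdiction]
    out = dict(record)
    if policy["blur_faces"]:
        out["face_region"] = "<blurred>"
    out["retain_days"] = policy["retain_days"]
    return out

print(apply_privacy_qos({"face_region": "raw-pixels", "ts": 1}, "country_a"))
```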
- This paper detailed a framework for cyber surveillance of unlawful activities using a computational grid-based environment that is capable of distributed data mining.
- The proposed framework integrates several areas of computer science research, namely data mining, computational grids, biometrics, and social networks.