scispace - formally typeset

Use of Big Data Analysis in MIS?


Best insight from top research papers

Big data analysis plays a crucial role in Management Information Systems (MIS) by enhancing security, detecting hidden patterns, and improving operational efficiency. By adopting big data technologies, MIS can benefit from advanced analytics techniques such as predictive analysis, data mining, and statistical analysis. In network security, big data analysis aids in detecting malicious traffic, identifying network attacks, and enhancing intelligence testing. The rapid development of technology, including the Internet of Things (IoT), has propelled the growth of the big data era, making it necessary to extract valuable insights from complex data in order to analyze market trends and ensure network security. Integrating big data analysis into MIS therefore not only strengthens network security but also improves decision-making and overall operational effectiveness.
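As a concrete illustration of the statistical techniques mentioned above, the sketch below flags anomalous network traffic volumes with a simple z-score test. The data, threshold, and function names are invented for illustration and are not drawn from the cited papers.

```python
# Illustrative sketch: flagging anomalous traffic volumes with a z-score
# test, one statistical technique an MIS security layer might apply over
# aggregated log data. All figures are synthetic.
from statistics import mean, stdev

def flag_anomalies(volumes, threshold=3.0):
    """Return indices of observations more than `threshold` standard
    deviations away from the mean."""
    mu = mean(volumes)
    sigma = stdev(volumes)
    return [i for i, v in enumerate(volumes)
            if abs(v - mu) / sigma > threshold]

# Hourly request counts; the spike at index 5 simulates a traffic anomaly.
hourly_requests = [120, 115, 130, 125, 118, 900, 122, 128, 119, 124]
print(flag_anomalies(hourly_requests, threshold=2.0))  # flags index 5
```

In practice a production system would use streaming aggregation and more robust detectors, but the principle of scoring observations against a learned baseline is the same.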

Answers from top 5 papers

Papers (5): Insights
Not addressed in the paper.
Big data analytics in MIS can enhance operational efficiency, customer service, marketing strategies, and revenue opportunities by analyzing large datasets to uncover patterns, correlations, and trends.
Big data analysis enhances network security and intelligence. It aids in extracting valuable insights from complex data, crucial for analyzing market trends and enforcing security standards in computer networks.
The paper discusses the application of big data analysis in studying the characteristics of Miao people's folk songs in Xiangxi, showcasing how big data technology aids in understanding cultural elements.
Big data analysis is utilized in information security operation and maintenance management through distributed data mining based on SNMP statistical observation for detecting network attacks effectively.

Related Questions

What big data analytic technology is used in industry?
5 answers
Big data analytics technology is extensively utilized in various industries, such as shipping, manufacturing, and tourism. This technology involves processing and analyzing massive amounts of data to extract valuable insights and improve decision-making processes. In the shipping industry, big data analysis is employed to diagnose mechanical faults and enhance innovation. Industry 4.0 relies on big data for communication between machines, forming the foundation for digital factories and smart products. Furthermore, the tourism sector leverages big data technologies to understand tourist preferences and enhance decision-making processes. The application of big data analytics transcends traditional data handling methods, offering revolutionary changes in businesses and scientific research. Overall, big data analytics plays a crucial role in driving innovation and efficiency across various industries.
How can big data analytics influence AI adoption and use in SMEs?
5 answers
Big Data Analytics (BDA) plays a crucial role in influencing AI adoption and use in Small and Medium Enterprises (SMEs). Research indicates that BDA adoption is influenced by factors such as perceived benefits, top management support, organizational readiness, and environmental factors. BDA acts as a mediator between project performance and various factors like project knowledge management, green purchasing, and project operational capabilities, positively impacting project success in SMEs. Additionally, the acceptance of BDA in SMEs is influenced by factors like infrastructure quality and the perceived difficulty of implementation, emphasizing the importance of investing in a good infrastructure for successful BDA implementation in SMEs. Overall, integrating BDA can enhance AI adoption and utilization in SMEs by providing valuable insights, improving decision-making processes, and fostering innovation.
How to use genAI in MIS/IS research?
4 answers
GenAI can be used in MIS/IS research by leveraging its capabilities in analyzing and visualizing large genomic datasets. One approach is to develop user-friendly tools with graphical user interfaces (GUI) hosted on web-based servers. These GUI applications, such as Shiny, make complex workflows accessible to researchers without programming knowledge. For example, GENAVi (Gene Expression Normalization Analysis and Visualization) is a GUI-based tool that allows scientists without bioinformatics expertise to perform normalization and differential expression analysis (DEA) of RNA-Seq data. GENAVi combines Bioconductor packages and provides features like visualization of expression, clustering samples, principal components analysis, DEA, and gene set enrichment. It also offers a range of normalization methods and clear session reporting for reproducible analysis. This user-friendly application provides a convenient platform for researchers in MIS/IS to analyze and visualize gene expression data without requiring extensive bioinformatics skills [Reyes; Wilkinson; Jain].
What do Big Data analytics do?
5 answers
Big Data analytics refers to the process of examining large and complex datasets to uncover hidden patterns, correlations, and insights that can help organizations make informed business decisions. It leverages advanced computational techniques to extract insights from voluminous data sets. Big Data analytics can help organizations make sense of the massive amounts of data generated by businesses and individuals. It is essential in supply chain management-related activities as it can manage global, complex, tempestuous, and dynamic value chains. Big Data analytics is a critical success element for organizations in the global and dynamic market. It provides computational approaches for extracting meaningful insights from massive amounts of data generated from various sources. Organizations are leveraging Big Data analytics to understand user behavior and provide valuable solutions. Overall, Big Data analytics enables organizations to gain insights, make informed decisions, and gain a competitive edge in the business world.
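One of the core tasks named above, uncovering correlations between business metrics, can be sketched in a few lines. The metric names and figures below are invented for illustration.

```python
# Minimal sketch of a correlation analysis, the kind of pattern-finding
# described above. Data and metric names are synthetic.
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

ad_spend = [10, 20, 30, 40, 50]  # e.g. weekly marketing spend
revenue  = [12, 24, 33, 41, 55]  # e.g. weekly revenue
print(round(pearson(ad_spend, revenue), 3))  # near 1: strong linear relation
```

At big-data scale the same statistic would be computed with a distributed engine over billions of rows, but the underlying calculation is identical.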
How can big data be used in auditing?
5 answers
Big data can be used in auditing to improve efficiency, effectiveness, and the quality of audit results. By utilizing big data analytics, auditors can extract multi-level features from various materials, such as visual characteristics and textual abilities, using deep learning models like convolution neural networks (CNN). Big data analytics can also minimize the risks associated with sampling, allowing auditors to provide a reasonable level of assurance. Additionally, big data technology can be applied in forensic accounting and auditing practices, aiding in fraud detection and resolving internal abnormalities prior to external audits. Optimizing big data can help auditors process data faster, change the sampling paradigm, and resolve audit time pressure issues. Overall, the application of big data technology in auditing is seen as a beneficial trend that can enhance audit processes and outcomes.
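As one hedged illustration of full-population audit analytics (as opposed to sampling), the sketch below screens transaction amounts against Benford's first-digit law, a common fraud-detection heuristic; the data and function names are invented and not taken from the cited papers.

```python
# Sketch of a Benford's-law first-digit screen over a full population of
# transaction amounts, a classic audit fraud-detection heuristic.
import math
from collections import Counter

def first_digit(amount):
    """First significant digit of a positive amount."""
    return int(str(amount).lstrip("0.")[0])

def benford_deviation(amounts):
    """Mean absolute deviation between observed first-digit frequencies
    and the Benford expectation log10(1 + 1/d)."""
    counts = Counter(first_digit(a) for a in amounts)
    n = len(amounts)
    return sum(abs(counts.get(d, 0) / n - math.log10(1 + 1 / d))
               for d in range(1, 10)) / 9

suspicious = [900, 91, 95, 9200, 99]       # all leading 9s: unnatural
mixed = [1, 2, 3, 4, 5, 6, 7, 8, 9]        # uniform first digits
print(benford_deviation(suspicious) > benford_deviation(mixed))
```

A real audit engagement would pair such a screen with follow-up testing; a high deviation is a flag for investigation, not proof of fraud.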
How can AI be used to analyze big data?
5 answers
AI can be used to analyze big data by applying state-of-the-art AI software to help decision makers after major global crises such as climate change, natural disasters, and pandemics. It can also be used to develop an AI model using big data and AI statistical software, which can be valuable to many socioeconomic stakeholders. Another approach is to utilize a multi-feature information retrieval approach and an AI-powered big data MFE scheme to analyze and extract keywords from current events and news. The confluence of Big Data and AI allows companies to automate and improve complex analytical tasks, deriving insights and unlocking the value of big data. Additionally, AI can aid in the resolution of critical healthcare concerns by utilizing big data analysis and data standardization. In industrial systems, AI models can be used for big data analytics, but they are vulnerable to adversarial attacks. Information fingerprinting techniques based on deep learning can be used to detect adversarial samples and ensure the reliability and safety of AI models.

See what other people are reading

What are the different versions of the STAR (Statistical Tool for Agricultural Research) application?
4 answers
The STAR (Statistical Tool for Agricultural Research) application has multiple versions tailored for data analysis in agricultural research. The current version of STAR offers modules for data management, randomization, layout, analysis of variance, descriptive statistics, basic graphs, correlation analysis, linear and non-linear regression analysis, multivariate analysis, and non-parametric analysis. Additionally, the Assistat software version 7.7 serves as another statistical tool for analyzing experimental data in agriculture, focusing on analysis of variance for randomized block and factorial experiments, as well as regression analysis for quantitative treatments. These tools aim to provide user-friendly interfaces and efficient functionalities for researchers and students in the agricultural field, addressing the need for accessible and effective statistical analysis platforms.
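The kind of computation STAR's analysis-of-variance module automates can be shown directly. Below is a minimal sketch of a one-way ANOVA F statistic; the treatments and yield figures are invented for illustration.

```python
# Sketch of a one-way ANOVA F statistic, the core calculation behind the
# analysis-of-variance modules mentioned above. Yield data is synthetic.
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)   # k - 1 degrees of freedom
    ms_within = ss_within / (n - k)     # n - k degrees of freedom
    return ms_between / ms_within

treatments = [
    [5.1, 4.9, 5.3],   # treatment A yields (t/ha)
    [6.2, 6.0, 6.4],   # treatment B
    [4.2, 4.0, 4.4],   # treatment C
]
print(round(one_way_anova_f(treatments), 2))
```

A large F (here well above typical critical values) indicates that between-treatment variation dominates within-treatment noise; STAR would additionally report the p-value and post-hoc comparisons.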
Applications of visual analytics besides radiology in the medical sector?
5 answers
Visual analytics finds applications beyond radiology in the medical sector. It is utilized in clinical applications for patient data representation, quality assessment, and improvement, enabling selective filtering of diagnostic results and treatments. In cancer research, visual analytics is employed in Computational Pathology to extract imaging features from histopathology images for cancer diagnosis, prognosis, and treatment decisions. Moreover, visual analytics methods are applied in epidemiologic cohort studies to identify correlations, subgroup deviations in health disorder risks, and address missing value problems through appropriate imputation strategies. These diverse applications showcase the versatility of visual analytics in enhancing decision-making, prognostic evaluations, and data analysis in various medical domains beyond radiology.
How is carbon predicted in the context of climate change?
4 answers
Carbon prediction in the context of climate change is facilitated through various methods such as machine learning models, attention indices, and emission forecasting techniques. Studies have shown that incorporating climate attention indices into predictive models for carbon futures returns can enhance forecasting accuracy and economic performance. Machine learning models like Support Vector Machine (SVM) and Convolutional Neural Network (CNN) have been employed to predict CO2 trapping efficiency and emissions, aiding in decision-making for global emission reduction policies. Additionally, advanced techniques like the wavelet enhanced extreme learning machine (W-EELM) have been utilized to predict CO2 concentrations on different time scales, proving to be robust and superior in modeling carbon dioxide levels. These approaches collectively contribute to a better understanding and prediction of carbon dynamics in the context of climate change.
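The papers above use SVM, CNN, and W-EELM models, but the underlying task, fitting a predictive model to a CO2 series, can be illustrated with a simple least-squares trend line. All readings below are synthetic, not measured data.

```python
# Hedged illustration: ordinary least squares as a stand-in for the more
# sophisticated CO2 forecasting models described above. Data is synthetic.
def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

years = [0, 1, 2, 3, 4]                        # years since start of record
co2_ppm = [410.0, 412.4, 414.7, 417.1, 419.6]  # synthetic annual mean CO2
slope, intercept = linear_fit(years, co2_ppm)
forecast = intercept + slope * 5               # projection for year 5
print(round(forecast, 2))
```

The cited models replace this straight line with nonlinear learners and wavelet decompositions, but the workflow of fitting on history and extrapolating forward is the same.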
What outcomes are needed from a talent development model?
5 answers
The outcomes needed from a talent development model span across various dimensions, reflecting the multifaceted nature of talent cultivation and its impact on individuals and organizations. Firstly, a talent development model should aim to identify predictors and indicators of talent, providing a multidimensional and dynamic view of talent development, as seen in the TAD music model which emphasizes cognitive abilities, personality traits, and psychosocial skills. It should also improve the orientation of talent training, adapting to new societal requirements and enhancing the acceptance and effectiveness of training models among students and teachers, as demonstrated by the improved evaluation scores in music education. Moreover, the model should incorporate predictive analytics to forecast talent retention, thereby aiding organizations in making informed hiring decisions and optimizing resource investment in talent development. It's crucial for the model to foster an inclusive educational environment that supports all students, particularly those from disadvantaged backgrounds, leading to improvements in school climate, student attendance, and promotion rates. Additionally, the model should facilitate the identification of gifted and talented students and promote tools for their development, addressing the needs of these students through comprehensive and integrated approaches. The Talent Development model should also redirect the traditional function of schooling towards maximizing every child's potential for academic development, ensuring that all students can learn in demanding, high-expectation settings. It's important for the model to support the specific and evolving nature of talent, providing assessment and intervention strategies that sustain talent development. Furthermore, the model should employ evidence-based strategies to promote academic achievement and social competence, particularly among minority and low-income students. Lastly, it should contribute to the reform of secondary education in urban settings, aligning with recent commitments from educators and policymakers.
What are the recent trends in educational games research according to the Association for Psychological Science (APS)?
5 answers
Recent trends in educational games research, according to various studies, indicate a growing interest in the field. Studies show an increasing number of publications focusing on game-based learning, serious games, and digital, mobile, and computer-based games since the 2000s. Additionally, there is a shift towards simulating real environments for validation purposes and diverse implementations for educational, training, and support purposes. The USA, Taiwan, and Spain are prominent in global research collaborations, with institutions like the National Taiwan University of Science and Technology playing critical roles. Overall, these trends highlight the evolving landscape of educational games research, emphasizing the importance of understanding and exploring new avenues in this interdisciplinary field.
What is advanced analytics?
5 answers
Advanced analytics refers to the application of predictive and prescriptive models to analyze large and complex datasets for making critical business decisions. It plays a crucial role in various fields such as renewable energy, social media analysis, and workload management systems in high-energy physics experiments. In the context of renewable energy, advanced analytics is highlighted as a key area for research and application in engineering and management. The use of advanced analytics is essential for organizations to gain a competitive advantage, achieve sustainable development, and enhance productivity by leveraging data, analytics, and automation. Overall, advanced analytics enables organizations to harness the power of data-driven insights to optimize operations, improve efficiency, and drive innovation across different industries and sectors.
How have public policies impacted the adoption and utilization of Business Intelligence (BI) in different industries?
4 answers
Public policies play a crucial role in shaping the adoption and utilization of Business Intelligence (BI) across various industries. In Taiwan, concerns about AI have led to societal alarm due to potential privacy infringements. Governments worldwide, including in Malaysia, are recognizing the transformative potential of AI in public service delivery but are also cautious about the associated challenges and pitfalls, emphasizing the need for clear regulatory policies. Conversely, the lack of wide application of BI in the Portuguese Public Administration hinders the full achievement of efficiency and performance analysis goals, highlighting the importance of policy support for broader BI initiatives. Overall, public policies can either facilitate or impede the adoption and effective utilization of BI in different sectors, underscoring the significance of regulatory frameworks in driving successful BI implementation.
What are the most effective methods for conducting exploratory research?
5 answers
The most effective methods for conducting exploratory research involve utilizing various techniques and tools to analyze complex datasets without preexisting hypotheses. Exploratory research is crucial for generating hypotheses, especially from pilot studies with small sample sizes or when investigating secondary outcomes in larger trials. It is essential to employ exploratory data analysis methods such as drawing, tabulation, and calculation to explore data structures and patterns effectively. Additionally, exploratory research often involves iterative modeling with computationally intensive methods to provide optimal outcomes for learning assessment and performance management. Researchers should be cautious when interpreting exploratory results, as they have a high risk of Type I errors, which can lead to false positive findings. By combining these approaches, researchers can conduct thorough exploratory research to derive meaningful insights and hypotheses for further investigation.
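The tabulation-and-calculation step of exploratory data analysis described above can be sketched with an invented set of survey responses; the dataset and variable names are illustrative only.

```python
# Sketch of basic exploratory data analysis: tabulate a variable, then
# calculate summary statistics. The responses are synthetic.
from collections import Counter
from statistics import mean, median

responses = [3, 4, 4, 5, 2, 4, 3, 5, 5, 4, 1, 4]  # 1-5 Likert ratings

# Tabulation: frequency table of each rating, drawn as a text histogram
freq = Counter(responses)
for rating in sorted(freq):
    print(rating, "#" * freq[rating])

# Calculation: summary statistics describing the distribution
print("mean:", round(mean(responses), 2), "median:", median(responses))
```

In an exploratory setting such summaries suggest hypotheses (here, a skew toward agreement) that would then be tested on independent data to avoid the Type I error risk noted above.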
What is a research variable?
4 answers
A research variable is a fundamental component in a study, representing what is measured to answer research questions and achieve objectives. Variables can be classified based on their nature and measurement scale, such as nominal, ordinal, interval, or ratio. Understanding the types and characteristics of variables is crucial for designing, conducting, and interpreting research studies. The selection of variables is essential to the research protocol, as they help in conceptualizing relationships between them and in planning statistical analyses. Variables play a significant role in formulating hypotheses, clarifying research problems, and choosing appropriate measurement scales, particularly in social science research. Neglecting variables can lead to erroneous conclusions and misrepresentations of reality in research findings.
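A minimal sketch of the scale classification described above, mapping each measurement scale to an example variable and the summary statistics it permits. The example variables are illustrative choices, not drawn from the cited papers.

```python
# Sketch of the nominal/ordinal/interval/ratio classification and the
# summary statistics each scale supports. Examples are illustrative.
SCALES = {
    "nominal":  {"example": "blood type",
                 "permits": ["mode"]},
    "ordinal":  {"example": "Likert agreement rating",
                 "permits": ["mode", "median"]},
    "interval": {"example": "temperature in Celsius",
                 "permits": ["mode", "median", "mean"]},
    "ratio":    {"example": "annual income",
                 "permits": ["mode", "median", "mean", "ratio comparisons"]},
}

def permitted_stats(scale):
    """Summary statistics meaningful for a variable on the given scale."""
    return SCALES[scale]["permits"]

print(permitted_stats("ordinal"))
```

Choosing analyses this way is exactly the planning step the answer describes: the scale of each variable constrains which statistical tests are valid.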
What are the steps of the ASUM-DM methodology?
4 answers
The ASUM-DM methodology involves several key steps for big data and analytics projects. First, ASUM-DM, a refinement of CRISP-DM, is adapted to the project, incorporating big data analysis, application prototyping, and prototype evaluation. Strong project management practices that emphasize communication also play a crucial role in the success of the methodology. One referenced implementation begins with the initiation of a resident process in the terminal equipment system, followed by the start of a management protocol host process and the opening of a thread for receiving local configuration information; the thread transfers this information to the resident process, which forwards it to a target process for processing and receives feedback. This structured approach supports effective handling of data processing tasks in cross-disciplinary, multi-organization, geographically distributed settings.
What is an L frame used for in motion capture systems?
4 answers
In motion capture systems, an L frame is utilized for key frame extraction and analysis. This process involves feature representation, critical point screening, key degree curve structure, weight learning, and key frame extraction based on fitting curves. The L frame plays a crucial role in identifying key critical points in motion component curves, dividing motion component curves, and ultimately extracting key frames without the need for manually setting thresholds. By using the L frame method, motion capture data can be processed in real-time, allowing for efficient extraction of key frames from human motion postures and sequences. This approach enhances the accuracy and efficiency of motion capture data analysis, benefiting various fields such as manufacturing, robotics, and human-computer interaction.
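As a simplified, hypothetical stand-in for the critical-point screening described above, the sketch below takes local extrema of one motion-component curve as candidate key frames; the joint-angle data and function names are invented for illustration.

```python
# Sketch of key-frame candidate selection: local extrema of a 1-D motion
# component curve, plus the first and last frames. Data is synthetic.
def candidate_key_frames(curve):
    """Return frame indices of local minima/maxima in a motion curve,
    always including the first and last frames."""
    keys = [0]
    for i in range(1, len(curve) - 1):
        left, mid, right = curve[i - 1], curve[i], curve[i + 1]
        if (mid > left and mid > right) or (mid < left and mid < right):
            keys.append(i)  # turning point in the motion component
    keys.append(len(curve) - 1)
    return keys

# Elbow angle (degrees) over 9 captured frames of a waving motion
elbow = [10, 25, 60, 90, 70, 40, 55, 80, 95]
print(candidate_key_frames(elbow))  # turning points at frames 3 and 5
```

The methods described in the answer go further, weighting critical points by a learned key-degree curve rather than keeping every extremum, but the first screening pass looks broadly like this.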