
Showing papers in "Journal of Computer Science in 2006"


Journal ArticleDOI
TL;DR: This study examines different types of normalization, each of which was tested with the ID3 methodology on the HSV data set, and concludes with recommendations on the best normalization method based on the chosen factors and their priorities.
Abstract: This study focuses on different types of normalization, each of which was tested with the ID3 methodology on the HSV data set. Three factors were taken into account: the number of leaf nodes, accuracy and tree-growing time. Different learning methods were compared as they were applied to each normalization method. A new matrix was designed to identify the best normalization method based on these factors and their priorities, and recommendations are given.

391 citations
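
The abstract does not spell out which normalization formulas were compared; as a rough illustration only, the sketch below implements three normalizations commonly tested in such studies (min-max, z-score and decimal scaling) as they might be applied to a numeric attribute before decision-tree induction. The function names and the choice of methods are assumptions, not taken from the paper.

```python
import numpy as np

def min_max(x, new_min=0.0, new_max=1.0):
    """Rescale values linearly into [new_min, new_max]."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min()) * (new_max - new_min) + new_min

def z_score(x):
    """Center on the mean and scale by the standard deviation."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def decimal_scaling(x):
    """Divide by the smallest power of 10 that maps all values into (-1, 1)."""
    x = np.asarray(x, dtype=float)
    j = int(np.ceil(np.log10(np.abs(x).max() + 1)))
    return x / (10 ** j)

attribute = [12.0, 45.5, 78.2, 130.0, 7.3]
for name, fn in [("min-max", min_max), ("z-score", z_score), ("decimal", decimal_scaling)]:
    print(name, np.round(fn(attribute), 3))
```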


Journal ArticleDOI
TL;DR: The potential use of classification-based data mining techniques such as rule-based methods, decision trees and artificial neural networks on massive volumes of healthcare data is examined.
Abstract: The healthcare environment is generally perceived as being ‘information rich’ yet ‘knowledge poor’. There is a wealth of data available within healthcare systems. However, there is a lack of effective analysis tools to discover hidden relationships and trends in the data. Knowledge discovery and data mining have found numerous applications in business and scientific domains. Valuable knowledge can be discovered by applying data mining techniques in healthcare systems. In this study, we briefly examine the potential use of classification-based data mining techniques such as rule-based methods, decision trees and artificial neural networks on massive volumes of healthcare data. In particular, we consider a case study using classification techniques on a medical data set of diabetic patients.

230 citations
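
The paper's diabetes data set is not available here, so the sketch below only illustrates the general workflow the abstract describes (training a decision-tree classifier on patient records and checking its accuracy), with synthetic data standing in for the real medical records. Everything about the data is an assumption.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a table of patient attributes and a diabetic/non-diabetic label.
X, y = make_classification(n_samples=500, n_features=8, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = DecisionTreeClassifier(max_depth=4, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```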


Journal ArticleDOI
TL;DR: A modified version of the well-known COCOMO model is provided to explore the effect of the adopted software development methodology on effort computation, along with two new model structures that use Genetic Algorithms to estimate the effort required to develop software projects.
Abstract: Defining a project's estimated cost, duration and maintenance effort early in the development life cycle is a valuable goal for software projects. Many model structures have evolved in the literature. These model structures treat software effort as a function of the developed lines of code (DLOC). Building such a function helps project managers allocate the available resources for the project accurately. In this study, we present two new model structures for estimating the effort required to develop software projects using Genetic Algorithms (GAs). A modified version of the well-known COCOMO model is provided to explore the effect of the adopted software development methodology on effort computation. The performance of the developed models was tested on a NASA software project dataset (1). The developed models were able to provide good estimation capabilities.

145 citations
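
The paper's exact model structures are not given in the abstract, so the sketch below only shows, under assumed toy data and parameter ranges, how a simple genetic algorithm could tune the coefficients of a basic COCOMO-style model effort = a * KLOC^b against measured projects. The data, population size and mutation scheme are illustrative, not the authors' method.

```python
import random

# Hypothetical (KLOC, measured effort) pairs standing in for the NASA data set.
projects = [(2.1, 5.0), (10.0, 26.0), (46.0, 240.0), (90.0, 530.0), (161.0, 1100.0)]

def mse(params):
    a, b = params
    return sum((a * kloc ** b - effort) ** 2 for kloc, effort in projects) / len(projects)

def ga(pop_size=60, generations=200):
    rng = random.Random(1)
    pop = [(rng.uniform(0.5, 10.0), rng.uniform(0.5, 2.0)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=mse)
        survivors = pop[: pop_size // 2]                        # selection
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = rng.sample(survivors, 2)
            child = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)  # crossover (averaging)
            child = (child[0] + rng.gauss(0, 0.1),              # mutation
                     child[1] + rng.gauss(0, 0.02))
            children.append(child)
        pop = survivors + children
    return min(pop, key=mse)

a, b = ga()
print(f"fitted COCOMO-style model: effort ~= {a:.2f} * KLOC^{b:.2f}")
```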


Journal ArticleDOI
TL;DR: This study presents a general framework model for E-Government Readiness Assessment, which represents the basic components to be assessed before launching the "e-initiative" to guarantee the right implementation in the right direction.
Abstract: This study presents a general framework model for E-Government Readiness Assessment. There are six key factors necessary to implement any E-Government initiative worldwide. These factors represent the basic components to be assessed before launching the "e-initiative" in order to guarantee the right implementation in the right direction. The organizational building blocks that need to be assessed are: Organizational Readiness, Governance and Leadership Readiness, Customer Readiness, Competency Readiness, Technology Readiness and Legal Readiness [1]. Under Organizational Readiness, the bureaucratic nature of government, business processes, long process delays and the need for re-engineering are discussed. Under Governance and Leadership Readiness, the importance of leadership and governance for the e-initiative, the importance of procedures and service level agreements, the way public officials perform, and commitment and accountability for public jobs are all shown. Under Customer Readiness, the main public concerns regarding accessibility, trust and security are highlighted. Under Competency Readiness, the lack of qualified personnel in the public sector and the different alternatives for overcoming this issue are discussed. Under Technology Readiness, many issues are worth considering, such as hardware, software, communication, current technology, legacy systems, sharing applications and data, and setting up a secure infrastructure for exchanging services. The last factor is Legal Readiness, where the adoption of the Jordanian Temporary Law No. 85 of 2001, the "Electronic Transactions Law" (ETL), paved the road towards the big shift to the e-initiative and privacy. Some of these factors are discussed in detail. The need for this detail arises from the fact that all government activities are governed by law; for this reason, it is important to start from this key factor.

103 citations


Journal ArticleDOI
TL;DR: A new model has been built that supports a standard set of quality characteristics suitable for evaluating COTS components, along with newly defined sets of sub-characteristics associated with them.
Abstract: Studies show that COTS-based (Commercial Off-The-Shelf) systems now exceed 40% of all developed software systems. Therefore, a model that ensures the quality characteristics of such systems becomes a necessity. Among the most critical processes in COTS-based systems are the evaluation and selection of the COTS components. There are several existing quality models used to evaluate software systems in general; however, none of them is dedicated to COTS-based systems. In this contribution, an analysis has been carried out on several existing software quality models, namely: McCall's, Boehm, ISO 9126, FURPS, Dromey, ISO/IEC TR 15504-2 1998(E), Triangle and Quality Cube, for the purpose of evaluating them and laying the ground for a new model specializing in evaluating and selecting COTS components. The study also outlines limitations found in the existing models, such as the tendency to ignore a certain quality feature like functionality, or the failure to describe how quality measurement in these models is carried out. As a result of this analysis, a new model has been built that supports a standard set of quality characteristics suitable for evaluating COTS components, along with newly defined sets of sub-characteristics associated with them. The new model avoids some of the limitations found in the existing models: it omits quality characteristics that are not applicable to COTS components and adds new ones that are. In addition, it matches the appropriate type of stakeholders with the corresponding quality characteristics, a feature missing in all existing models. The objective of the new model is to guide organizations that are in the process of building COTS-based systems in evaluating and choosing the appropriate products, which is essential to the success of the entire system.

89 citations


Journal ArticleDOI
TL;DR: This paper, along with test results, shows that the probability of guessing keys is far lower than with the Data Encryption Standard (DES), a widely used method of data encryption.
Abstract: In this paper, an efficient and scalable technique for computer network security is presented. On one hand, the decryption scheme and the public key creation used in this work are based on a multi-layer neural network trained by the backpropagation learning algorithm. On the other hand, the encryption scheme and the private key creation process are based on Boolean algebra. This is a potential new source of public key cryptographic schemes that are not based on number-theoretic functions and have small time and memory complexities. The test results show that the probability of guessing keys is far lower than with the Data Encryption Standard (DES), a widely used method of data encryption. The presented results were obtained using MATLAB 6.5.1.

70 citations
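
The abstract gives no construction details, so the sketch below is not the paper's scheme; it only illustrates the general idea of training a multi-layer network with backpropagation to learn a decryption mapping from example (ciphertext, plaintext) pairs, using a trivial Boolean (XOR-with-key) cipher as the stand-in encryption. The block sizes and network settings are arbitrary.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_bits = 8
key = rng.integers(0, 2, n_bits)

# Trivial Boolean "encryption" used only as a stand-in: ciphertext = plaintext XOR key.
plaintexts = rng.integers(0, 2, size=(2000, n_bits))
ciphertexts = plaintexts ^ key

# Multi-layer network trained with backpropagation to recover plaintexts from ciphertexts.
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(ciphertexts, plaintexts)

test = rng.integers(0, 2, size=(5, n_bits))
decoded = np.rint(net.predict(test ^ key)).astype(int)
print("original :", test[0])
print("recovered:", decoded[0])
```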


Journal ArticleDOI
TL;DR: The study focuses on the launch of the E-Government project in Jordan, the main planning and implementation features observed there and the main obstacles, and proposes a simplified model for some of the Jordan E-Government Portal online services.
Abstract: This study explains an architecture for the E-Government system: its main concepts, objectives, most common applications, well-known worldwide experiences and E-Government in Jordan. It introduces the E-Government model as a modern evolution of Information and Communication Technology (ICT) and shows how it moves the life of societies into the communication and networked age. It presents the experiences of countries such as the USA, UK, Singapore, UAE and Egypt. The study focuses on the launch of the E-Government project in Jordan, the main planning and implementation features observed there and the main obstacles. It proposes a simplified model for some of the Jordan E-Government Portal online services.

52 citations


Journal ArticleDOI
TL;DR: This paper proposes a distinct measure that combines the qualitative and quantitative factors referred to in the literature with rarely mentioned factors such as trust and feature state, and determines the usability of a website from the user's point of view.
Abstract: The presence of multiple websites offering similar services has changed the user's outlook. Users now prefer to visit sites that are easy to use. Many different methods have been proposed to measure the usability of a website. The quantitative methods focus on performance measurement of the website, whereas the qualitative methods estimate the user's opinion of a website. However, none of these measures human emotions: the emotional experiences of the user during the website visit. This emotional aspect is a strong guiding force in the way a user uses the website. In this paper we propose a distinct measure that combines the qualitative and quantitative factors referred to in the literature with rarely mentioned factors such as trust and feature state. The measure thus obtained determines the usability of a website from the user's point of view. It can be employed to compare the usability of different websites.

48 citations


Journal ArticleDOI
TL;DR: This study proposes a string searching algorithm, named the Occurrences algorithm, as an improvement of the brute-force searching algorithm; it is based on preprocessing both the pattern and the text before the search for the pattern begins.
Abstract: This study proposes a string searching algorithm as an improvement of the brute-force searching algorithm. The algorithm is named the Occurrences algorithm. It is based on preprocessing both the pattern and the text before the search for the pattern begins.

43 citations
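
The abstract does not describe the preprocessing the Occurrences algorithm performs, so no attempt is made to reproduce it here. For reference, the sketch below is only the brute-force baseline the paper sets out to improve: a naive scan that returns every position at which the pattern occurs in the text.

```python
def brute_force_search(text: str, pattern: str) -> list[int]:
    """Return all start indices where pattern occurs in text (naive O(n*m) scan)."""
    n, m = len(text), len(pattern)
    positions = []
    for i in range(n - m + 1):
        j = 0
        while j < m and text[i + j] == pattern[j]:
            j += 1
        if j == m:
            positions.append(i)
    return positions

print(brute_force_search("abracadabra", "abra"))  # [0, 7]
```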


Journal ArticleDOI
TL;DR: The implementation of WiFi with respect to future market opportunities in the Kingdom of Bahrain is discussed, and an analysis of various demographics is outlined, with particular attention to the acceptance of WiFi by society in the Kingdom of Bahrain.
Abstract: Computer networks have played a major role in expanding the operational boundaries of organizations today. Until now, traditional methods of networking, in which computers are wired directly to a hub or switch, have been the norm. Recent advances in networking technology have made it possible for devices to communicate using various light- and wave-emitting technologies. WiFi is a perfect example of one of these emerging technologies, enabling computers to communicate with each other without the use of traditional cables. The implementation of WiFi with respect to future market opportunities in the Kingdom of Bahrain is discussed in this study. Finally, an analysis of various demographics is outlined, with particular attention to the acceptance of WiFi by society in the Kingdom of Bahrain. Some concerns, along with recommendations that need to be taken into account when using WiFi, are also outlined.

42 citations


Journal ArticleDOI
TL;DR: The objective of this work was to implement a classifier based on an MLP (Multi-Layer Perceptron) neural network for face detection, targeting a Xilinx FPGA as the implementation platform.
Abstract: Face detection and recognition have many applications in a variety of fields, such as security systems, videoconferencing and identification. Face classification is currently implemented in software. A hardware implementation allows real-time processing, but has a higher cost and time-to-market. The objective of this work was to implement a classifier based on an MLP (Multi-Layer Perceptron) neural network for face detection. The MLP was used to classify face and non-face patterns. The system was first described in C on a P4 (2.4 GHz) to extract the weight values. A hardware implementation was then achieved using a VHDL-based methodology. We targeted a Xilinx FPGA as the implementation platform.
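
For orientation only, the snippet below shows the arithmetic such a hardware MLP classifier ultimately has to implement: a forward pass (matrix multiply, bias add, activation) over a flattened image patch, with weights that would be extracted offline as the abstract describes. The layer sizes and random weights are placeholders, not the paper's trained network.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(patch, w_hidden, b_hidden, w_out, b_out):
    """One face/non-face decision: hidden layer then a single output neuron."""
    h = sigmoid(patch @ w_hidden + b_hidden)
    return sigmoid(h @ w_out + b_out)

rng = np.random.default_rng(0)
patch = rng.random(20 * 20)                  # flattened 20x20 grey-level window
w_hidden = rng.standard_normal((400, 25))
b_hidden = rng.standard_normal(25)
w_out = rng.standard_normal(25)
b_out = rng.standard_normal()

score = mlp_forward(patch, w_hidden, b_hidden, w_out, b_out)
print("face" if score > 0.5 else "non-face", round(float(score), 3))
```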

Journal ArticleDOI
TL;DR: This study proposes a hybrid method combining rules and machine learning for tagging Arabic words: tagging is based first on rules, and anomalies are then corrected by a memory-based learning (MBL) method.
Abstract: Many natural language expressions are ambiguous and need to draw on other sources of information to be interpreted. Whether a given Arabic word is interpreted as a noun or a verb depends on the presence of contextual cues. This study proposes a hybrid method combining rules and machine learning for tagging Arabic words. The method is based first on rules (which consider the post-position, the ending of a word and patterns); anomalies are then corrected by adopting a memory-based learning (MBL) method. Memory-based learning is an efficient method for integrating various sources of information and handling exceptional data in natural language processing tasks. Secondly, the exceptional cases of the rules are checked and more information is made available to the learner for treating those cases. To evaluate the proposed method, a number of experiments were run in order to assess the contribution of the various sources of information to learning.

Journal ArticleDOI
TL;DR: A top-down progressive deepening method is developed for mining level-crossing association rules in large transaction databases by extending some existing multiple-level association rule mining techniques.
Abstract: Existing algorithms for mining association rules at multiple concept levels are restricted to mining strong associations among concepts at the same level of a hierarchy. However, mining level-crossing association rules at multiple concept levels may lead to the discovery of strong associations among concepts at different levels of the hierarchy. In this study, a top-down progressive deepening method is developed for mining level-crossing association rules in large transaction databases by extending some existing multiple-level association rule mining techniques. The method uses the concept of reduced support and refines the transaction table at each level.
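
The abstract only summarizes the method (top-down progressive deepening with reduced support), so the sketch below is a simplified illustration of the underlying idea rather than the paper's algorithm: items are mapped to higher-level concepts through a small taxonomy, frequent concepts are found at the top level first, and a lower (reduced) support threshold is then applied when counting pairs that cross levels. The toy transactions and thresholds are assumptions.

```python
# Toy taxonomy: item -> higher-level concept.
taxonomy = {"skim_milk": "milk", "2%_milk": "milk",
            "white_bread": "bread", "wheat_bread": "bread", "apple": "fruit"}

transactions = [
    {"skim_milk", "white_bread"}, {"2%_milk", "wheat_bread"},
    {"skim_milk", "wheat_bread", "apple"}, {"2%_milk", "white_bread"},
    {"apple", "white_bread"},
]

def support(itemset, rows):
    return sum(itemset <= row for row in rows) / len(rows)

# Level 1: frequent high-level concepts under the normal support threshold.
level1_rows = [{taxonomy[i] for i in t} for t in transactions]
frequent_concepts = {c for c in set(taxonomy.values()) if support({c}, level1_rows) >= 0.6}

# Level-crossing pairs: a frequent concept together with a concrete item from another
# branch, judged against a reduced support threshold.
reduced_min_sup = 0.4
mixed_rows = [t | {taxonomy[i] for i in t} for t in transactions]
for concept in sorted(frequent_concepts):
    for item in sorted(taxonomy):
        if taxonomy[item] != concept and support({concept, item}, mixed_rows) >= reduced_min_sup:
            print(f"level-crossing association: {{{concept}, {item}}}")
```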

Journal ArticleDOI
TL;DR: This paper deals with sensitivity analysis of combinatorial optimization problems and its fundamental term, the tolerance; for three classes of objective functions (Σ, Π, MAX) it is shown that the upper tolerance of an element is well defined, how to compute the upper tolerance of an element, and equivalent formulations of when the upper tolerance is +∞ or > 0.
Abstract: In this paper we deal with sensitivity analysis of combinatorial optimization problems and its fundamental term, the tolerance. For three classes of objective functions (Σ, Π, MAX) we give some basic ...
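
For readers unfamiliar with the term, the following is the standard textbook notion of tolerance for a combinatorial minimization problem with a sum-type objective; the paper itself treats the Σ, Π and MAX classes and should be consulted for its exact definitions.

```latex
% Ground set E, cost function c, optimal solution S^*.
% Upper tolerance of e in S^*: how far c(e) may be raised before S^* stops being optimal.
u_{S^*}(e) = \sup\{\alpha \ge 0 : S^* \text{ remains optimal when } c(e) \text{ is replaced by } c(e)+\alpha\}
% Lower tolerance of e not in S^*: how far c(e) may be lowered before S^* stops being optimal.
l_{S^*}(e) = \sup\{\alpha \ge 0 : S^* \text{ remains optimal when } c(e) \text{ is replaced by } c(e)-\alpha\}
```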

Journal ArticleDOI
TL;DR: An advanced software tool assists in analyzing and classifying human chromosomes, providing a new filter that removes noise depending on the objects present in the image.
Abstract: This study presents an advanced software tool that assists in analyzing and classifying human chromosomes. Our new software handles both the classification and the analysis of medical images of human chromosomes in the following steps: (1) the image processing utilities in Matlab are used to analyze the image, with a new filter that removes noise depending on the objects present in the image; (2) the filtered image is passed to a segmentation algorithm that produces the most acceptable segmentation; (3) the segments then enter two tracks for classifying the chromosomes they contain: the first uses image processing to measure the length of each chromosome, while the second uses a neural network to classify the chromosomes according to their characteristics. There is a noticeable enhancement in the image processing approach as a result of using our filter, which improves image quality by a factor of 91.7%.
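
The paper's Matlab filter and segmentation details are not reproduced here; purely as an illustration of the kind of pipeline described (denoise, segment into objects, measure each object), the sketch below uses scipy's image utilities on a synthetic image. The thresholds and structure sizes are arbitrary.

```python
import numpy as np
from scipy import ndimage

# Synthetic noisy image with two bright elongated "chromosome-like" objects.
rng = np.random.default_rng(0)
img = rng.normal(0.1, 0.05, (64, 64))
img[10:40, 8:12] += 1.0      # object 1
img[20:55, 40:44] += 1.0     # object 2

denoised = ndimage.median_filter(img, size=3)    # step 1: noise removal
mask = denoised > 0.5                            # step 2: crude segmentation
labels, n = ndimage.label(mask)                  # connected components

# step 3: per-object measurement (pixel count as a crude length/size proxy)
sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
print(f"{n} objects, sizes: {sizes}")
```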

Journal ArticleDOI
TL;DR: The shell footing is found to have a better load carrying capacity compared with the conventional slab/flat footing of similar cross sectional area and the FE analysis showed a reasonably good agreement with the laboratory experimental results.
Abstract: This study describes the geotechnical behavior of shell footings using non-linear finite element analysis with the finite element code PLAXIS. The shell footing is found to have a better load-carrying capacity than a conventional slab/flat footing of similar cross-sectional area. The FE analysis also showed reasonably good agreement with the laboratory experimental results. The effect of adding edge beams at the bottom of the shell footings has been studied numerically and found to be beneficial in increasing the load-carrying capacity of the footing. Increasing the embedment ratio is also found to increase the load-carrying capacity of the shell footings.

Journal ArticleDOI
TL;DR: The integer factoring attacks, attacks on the underlying mathematical function, as well as attacks that exploit details in implementations of the algorithm are described.
Abstract: In this paper some of the most common attacks against the Rivest-Shamir-Adleman (RSA) cryptosystem are presented. We describe the integer factoring attacks, attacks on the underlying mathematical function, as well as attacks that exploit details in implementations of the algorithm. Algorithms for each type of attack are developed and analyzed in terms of their complexity, memory requirements and area of usage.
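
As a concrete illustration of the first attack class mentioned (integer factoring), the toy example below breaks a deliberately small RSA modulus by trial division and then recovers the private exponent. Real moduli are far too large for this, which is exactly why the paper analyzes attack complexity; the key sizes here are chosen only so the code runs instantly.

```python
def trial_division(n: int) -> int:
    """Return a nontrivial odd factor of n (only feasible for small n)."""
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    raise ValueError("no small factor found")

# Toy RSA key: n = p*q with tiny primes (insecure by construction).
p, q, e = 1009, 1013, 65537
n = p * q

p_found = trial_division(n)
q_found = n // p_found
phi = (p_found - 1) * (q_found - 1)
d = pow(e, -1, phi)                      # private exponent recovered from the factorization

m = 42
assert pow(pow(m, e, n), d, n) == m      # decryption with the recovered key works
print("factored n =", n, "->", p_found, "*", q_found, "; d =", d)
```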

Journal ArticleDOI
TL;DR: A comparative study of the ANN and the conventional Auto-Regression (AR) model indicates that the artificial neural networks performed better than the AR model, and ANN is recommended as a useful tool for river flow forecasting.
Abstract: Forecasting a time series has become one of the most challenging tasks for a wide variety of data sets. The large number of parameters to be estimated and the effect of uncertainty and outliers in the measurements make time series modeling complicated. Recently, Artificial Neural Networks (ANNs) have become a quite successful tool for handling time series modeling problems. This paper provides a solution to the forecasting problem of river flow for two well-known rivers in the USA: the Black Water River and the Gila River. The selected ANN models were used to train and forecast the daily flows of the first station, no. 02047500, for the Black Water River near Dendron in Virginia, and the second station, no. 0944200, for the Gila River near Clifton in Arizona. The feed-forward network is trained using the conventional backpropagation learning algorithm with many variations in the network inputs. We explored models built using various historical data. The selection process for the various architectures and training data sets of the proposed NN models is presented. A comparative study of the ANN and the conventional Auto-Regression (AR) model indicates that the artificial neural networks performed better than the AR model. Hence, we recommend ANNs as a useful tool for river flow forecasting.
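
The river gauge data itself is not reproduced here; the sketch below only illustrates the kind of comparison the abstract describes, fitting a conventional auto-regressive model by least squares and a small feed-forward network on the same lagged inputs of a synthetic flow-like series, then comparing one-step-ahead errors. The series, lag count and network size are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(1500)
flow = 10 + 3 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.5, t.size)  # synthetic daily flow

lags = 3
X = np.column_stack([flow[i:-(lags - i)] for i in range(lags)])  # [y_{t-3}, y_{t-2}, y_{t-1}]
y = flow[lags:]
split = 1200
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

# Conventional AR model: linear least squares on lagged values (plus intercept).
A = np.column_stack([X_tr, np.ones(len(X_tr))])
coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
ar_pred = np.column_stack([X_te, np.ones(len(X_te))]) @ coef

# Feed-forward ANN trained by backpropagation on the same lagged inputs.
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0).fit(X_tr, y_tr)
ann_pred = ann.predict(X_te)

print("AR  MSE:", round(float(np.mean((ar_pred - y_te) ** 2)), 4))
print("ANN MSE:", round(float(np.mean((ann_pred - y_te) ** 2)), 4))
```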

Journal ArticleDOI
TL;DR: The results showed that the He-Ne laser emits a pure Gaussian beam, whereas the diode laser emits an elliptical beam shape.
Abstract: The laser beam profiles captured by a CCD camera were processed, analyzed using 2D graphics and displayed on the monitor. The analyses of the He-Ne and laser diode profiles are presented. The results showed that the He-Ne laser emits a pure Gaussian beam, whereas the diode laser emits an elliptical beam shape. The laser beam profile analysis was developed to study the intensity distribution, the laser power, the number of modes and the beam width; these parameters are quite important in many applications, such as medical, industrial and military uses. The analysis of the image profile can also be used to determine the distance of objects, based on the distribution of the spot contours of the laser beam and its width, as well as to capture images of objects and determine their distance and shape.

Journal ArticleDOI
TL;DR: This study proposes an improved Case-Based Reasoning (CBR) approach integrated with multi-agent technology to retrieve similar projects from multi-organizational distributed datasets to build a software cost estimation model by collecting software cost data from distributed predefined project cost databases.
Abstract: Accurate software cost estimation is a vital task that affects a firm's software investment decisions before committing the required resources to a project or bidding for a contract. This study proposes an improved Case-Based Reasoning (CBR) approach, integrated with multi-agent technology, to retrieve similar projects from multi-organizational distributed datasets. The study explores the possibility of building a software cost estimation model by collecting software cost data from distributed predefined project cost databases. The model applies the CBR method to find similar projects in historical data derived from measured software projects developed by different organizations.
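
The multi-agent retrieval machinery is beyond a short sketch, but the core CBR step the abstract relies on (retrieve the most similar historical projects and adapt their effort) can be illustrated as a simple nearest-neighbour lookup over normalized project features. The feature set and toy case base below are assumptions, not the paper's data.

```python
import numpy as np

# Hypothetical case base: (size_KLOC, team_size, complexity 1-5) -> effort (person-months).
cases = np.array([[10, 4, 2], [25, 6, 3], [60, 12, 4], [5, 2, 1], [40, 8, 3]], dtype=float)
efforts = np.array([28.0, 70.0, 190.0, 12.0, 115.0])

def estimate_effort(new_project, k=2):
    """Retrieve the k most similar past projects and average their efforts."""
    lo, hi = cases.min(axis=0), cases.max(axis=0)
    scaled_cases = (cases - lo) / (hi - lo)
    scaled_new = (np.asarray(new_project, dtype=float) - lo) / (hi - lo)
    dist = np.linalg.norm(scaled_cases - scaled_new, axis=1)   # smaller distance = more similar
    nearest = np.argsort(dist)[:k]
    return efforts[nearest].mean(), nearest

estimate, used = estimate_effort([30, 7, 3])
print(f"estimated effort: {estimate:.1f} person-months (from cases {used.tolist()})")
```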

Journal ArticleDOI
TL;DR: The roles and functions of various autonomic components are described, and past and current approaches that are being developed to address specific areas of the autonomic computing environment that focus on improving system dependability including both reliability and security concerns are reviewed.
Abstract: The rapidly increasing complexity of computing systems is driving the movement towards autonomic systems that are capable of managing themselves without the need for human intervention. Without autonomic technologies, many conventional systems suffer reliability degradation and compromised security. Autonomic management techniques reverse this trend. This study describes the roles and functions of various autonomic components, and systematically reviews past and current approaches that have been or are being developed to address specific areas of the autonomic computing environment focused on improving system dependability, including both reliability and security concerns. Analyzing past research can lead to the design of a more advanced, dependable autonomic computing system. Finally, a novel and promising prototype system that is a work in progress is presented.

Journal ArticleDOI
TL;DR: This study introduces a method for generating a particular permutation P of a given size N, out of the N! possible permutations, from a given key; the method computes a unique permutation for a given key and size, so the same permutation is reproduced each time the same key and size are applied.
Abstract: This study introduces a method for generating a particular permutation P of a given size N, out of the N! possible permutations, from a given key. This method computes a unique permutation for a specific size and key; therefore, the same permutation can be computed each time the same key and size are applied. The name "random permutation" comes from the fact that the probability of obtaining this particular permutation is 1 out of the N! possible permutations. Besides that, the permutation cannot be guessed, because its generating method depends completely on the given key and size.
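
The abstract does not reveal the authors' construction, so the sketch below is only one common way to achieve the stated property (the same key and size always reproduce the same permutation): hash the key to a seed and drive a Fisher-Yates shuffle from it. This is illustrative, not the paper's method.

```python
import hashlib
import random

def permutation_from_key(key: str, n: int) -> list[int]:
    """Deterministically derive one of the n! permutations of 0..n-1 from a key."""
    seed = int.from_bytes(hashlib.sha256(key.encode()).digest(), "big")
    rng = random.Random(seed)
    perm = list(range(n))
    rng.shuffle(perm)            # Fisher-Yates shuffle driven by the key-derived seed
    return perm

print(permutation_from_key("secret key", 10))
print(permutation_from_key("secret key", 10))   # identical: same key and size, same permutation
```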

Journal ArticleDOI
TL;DR: The purpose of this article is to present a survey of the current techniques of aspect mining, and to understand both the strengths and limitations of this new area.
Abstract: In the object-oriented paradigm, the implementation of a concern is typically scattered over many locations and tangled with the implementation of other concerns, resulting in a system that is hard to explore and understand. Identifying such code automatically greatly improves both the maintainability and the evolvability of the application. Aspect mining aims to identify crosscutting concerns in existing systems, thereby improving the system's comprehensibility and enabling the migration of existing (object-oriented) programs to aspect-oriented ones. Aspects are mined using either static or dynamic information about the code. The purpose of this article is to present a survey of the current techniques of aspect mining. We seek to understand both the strengths and the limitations of this new area.

Journal ArticleDOI
TL;DR: This study presents a framework for a web-based e-learning system using Semantic Web technology, which focuses on the RDF data model, the OWL ontology language and RAP for parsing RDF documents.
Abstract: E-learning is increasingly viewed as an important activity in the field of distance and continuing education. Web-based courses offer obvious advantages for learners by making access to educational resources very fast, just-in-time and relevant, at any time or place. In this study, based on our previous work, we present a framework for our web-based e-learning system using Semantic Web technology. In addition, we present an approach for implementing a Semantic Web-based e-learning system, which focuses on the RDF data model, the OWL ontology language and RAP for parsing RDF documents. The use of RAP, a Semantic Web toolkit, for developing our application is also discussed in more detail.
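
The paper builds on RAP, a PHP toolkit; purely as a language-neutral illustration of the same RDF data-model idea, the sketch below uses Python's rdflib (a substitute, not what the authors used) to describe a course resource and serialize it. The vocabulary and URIs are placeholders.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# Hypothetical e-learning vocabulary; all URIs are placeholders.
EL = Namespace("http://example.org/elearning#")

g = Graph()
course = URIRef("http://example.org/courses/semantic-web-101")
g.add((course, RDF.type, EL.Course))
g.add((course, EL.title, Literal("Introduction to the Semantic Web")))
g.add((course, EL.hasPrerequisite, URIRef("http://example.org/courses/web-basics")))

print(g.serialize(format="turtle"))
```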

Journal ArticleDOI
TL;DR: This study considers an operational software system with multiple degradations and derives the optimal software rejuvenation policy, minimizing the expected operation cost per unit time in the steady state, via the dynamic programming approach.
Abstract: Software rejuvenation is a preventive and proactive maintenance policy that is particularly useful for counteracting the phenomenon of software aging. In this study we consider an operational software system with multiple degradations and derive the optimal software rejuvenation policy minimizing the expected operation cost per unit time in the steady state, via a dynamic programming approach. In particular, we show analytically that the control-limit type of software rejuvenation policy is optimal. A numerical example is presented to construct a decision table and to perform a sensitivity analysis of the cost parameters.
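
The paper's steady-state average-cost formulation is not reproduced here; as a loosely related illustration of how a dynamic-programming treatment can expose a control-limit structure, the sketch below runs value iteration on a small discounted Markov decision process with degradation levels, where each state offers a choice between continuing operation (cost growing with degradation) and rejuvenating back to the fresh state. All transition probabilities and costs are made up.

```python
# States 0..K are degradation levels; "continue" may degrade further,
# "rejuvenate" pays a fixed cost and resets the system to state 0.
K, gamma = 5, 0.95
p_degrade = 0.3
op_cost = [s * 2.0 for s in range(K + 1)]   # operating cost grows with degradation
rejuv_cost = 6.0

V = [0.0] * (K + 1)
for _ in range(2000):                        # value iteration
    new_V = []
    for s in range(K + 1):
        nxt = min(s + 1, K)
        cont = op_cost[s] + gamma * ((1 - p_degrade) * V[s] + p_degrade * V[nxt])
        rejuv = rejuv_cost + gamma * V[0]
        new_V.append(min(cont, rejuv))
    V = new_V

policy = []
for s in range(K + 1):
    nxt = min(s + 1, K)
    cont = op_cost[s] + gamma * ((1 - p_degrade) * V[s] + p_degrade * V[nxt])
    rejuv = rejuv_cost + gamma * V[0]
    policy.append("rejuvenate" if rejuv < cont else "continue")

print(policy)   # typically "continue" below some degradation level and "rejuvenate" above it
```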

Journal ArticleDOI
TL;DR: The final results indicate that the proposed AHOCR technique achieves excellent test recognition accuracy of up to 97% for isolated Arabic characters and 96% for Arabic text.
Abstract: The objective of this study was to present a new technique that assists in developing a recognition system for handwritten Arabic text. The proposed system, called Arabic Handwritten Optical Character Recognition (AHOCR), is concerned with the recognition of handwritten alphanumeric Arabic characters. In the present work, extracted characters are represented using geometric moment invariants of order three. The advantage of using moment invariants for pattern classification, compared with other methods, is their invariance with respect to position, size and rotation. The proposed technique is divided into three major steps: the first step is concerned with digitizing and preprocessing documents to create connected components and to detect and correct character skew. The second step uses geometric moment invariants of the input Arabic characters to extract features. The third step describes an advanced classification system using a Probabilistic Neural Network structure, which yields a significant speed improvement. Our final results indicate that the proposed AHOCR technique achieves excellent test recognition accuracy of up to 97% for isolated Arabic characters and 96% for Arabic text.
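
The paper uses third-order geometric moment invariants; without its data, the sketch below simply shows how translation- and scale-invariant quantities are computed from an image: central moments, normalized central moments, and the first two Hu invariants as representative features. It is a generic illustration of moment invariants, not the paper's feature extractor.

```python
import numpy as np

def hu_first_two(img):
    """First two Hu moment invariants of a 2-D grey-level image."""
    img = np.asarray(img, dtype=float)
    y, x = np.mgrid[: img.shape[0], : img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00     # centroid

    def mu(p, q):                                             # central moment
        return ((x - xc) ** p * (y - yc) ** q * img).sum()

    def eta(p, q):                                            # normalized central moment
        return mu(p, q) / m00 ** ((p + q) / 2 + 1)

    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2

# The same shape drawn at two positions gives (near-)identical invariants.
a = np.zeros((32, 32)); a[5:15, 5:9] = 1
b = np.zeros((32, 32)); b[18:28, 20:24] = 1
print(hu_first_two(a))
print(hu_first_two(b))
```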

Journal ArticleDOI
TL;DR: Four different models for dealing with missing data were studied, and a framework is established that removes inconsistencies before and after filling the attributes with missing values with the new expected value generated by one of the four models.
Abstract: Most information systems have some missing values due to unavailable data. Missing values reduce the quality of the classification rules generated by a data mining system and also affect the quantity of classification rules achieved. Missing values can influence the coverage percentage and the number of reducts generated, and they make it difficult to extract useful information from a data set. Solving the problem of missing data is therefore a high priority in the field of data mining and knowledge discovery. Replacing missing values with a specific value should not affect the quality of the data. Four different models for dealing with missing data were studied. A framework is established that removes inconsistencies before and after filling the attributes with missing values with the new expected value generated by one of the four models. Comparative results are discussed and recommendations concluded.
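
The four models the authors compare are not spelled out in the abstract, so the sketch below just demonstrates, on a toy table, three widely used replacement strategies of the kind such comparisons typically cover: column mean, most frequent value, and k-nearest-neighbour imputation.

```python
import numpy as np
from sklearn.impute import SimpleImputer, KNNImputer

# Toy table with missing entries (np.nan marks unavailable data).
X = np.array([[25.0, 50.0], [30.0, np.nan], [np.nan, 65.0], [40.0, 80.0]])

for name, imputer in [
    ("mean", SimpleImputer(strategy="mean")),
    ("most frequent", SimpleImputer(strategy="most_frequent")),
    ("kNN (k=2)", KNNImputer(n_neighbors=2)),
]:
    print(name)
    print(imputer.fit_transform(X))
```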

Journal ArticleDOI
TL;DR: Comparing different websites with respect to their trust levels can therefore provide the designer with insight into the weak features of a website, help retain users and hence increase the usability of the site.
Abstract: Technological advancement has significantly influenced the style of human-computer interaction, especially for the World Wide Web. Users can now afford to choose among multiple websites offering similar services. The website that provides a usable interface along with the requisite services scores over its competitors. User interaction with a website is driven by two major factors: the performance factors associated with the site and the emotional factors associated with the human user. Among the emotional factors, trust is a dominant driving factor of web usage. The more a user trusts a website, the more it will be used, and vice versa. Trust has almost always been specified qualitatively. This study presents a distinct method of measuring user trust in a website by considering the features of that website. Four distinct states of a feature are considered in the study. Each of these states has a different impact on the formation of user trust. The proposed method considers the features, their states and their contribution towards trust formation to compute the user's trust in a website. Such a measure can be effectively employed to determine the trust level of a website as perceived by its users. Comparing different websites with respect to their trust levels can therefore provide the designer with insight into the weak features of a site. Identifying and correcting these can help retain users and hence increase the usability of the site.
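
The paper's actual weighting scheme is not given in the abstract; the sketch below is a purely hypothetical illustration of the idea it describes: each website feature is in one of four states, each state contributes differently to trust, and the per-feature contributions are combined into a single score. The feature names, state multipliers and weights are invented.

```python
# Hypothetical state multipliers: how much each feature state contributes to trust.
STATE_WEIGHT = {"works_well": 1.0, "present_but_flawed": 0.4, "absent": 0.0, "broken": -0.5}

def trust_score(features):
    """features: {name: (importance_weight, state)} -> normalized trust score."""
    total_importance = sum(w for w, _ in features.values())
    raw = sum(w * STATE_WEIGHT[state] for w, state in features.values())
    return raw / total_importance

site = {
    "https_padlock":   (3.0, "works_well"),
    "privacy_policy":  (2.0, "present_but_flawed"),
    "contact_details": (1.0, "absent"),
    "checkout_flow":   (3.0, "broken"),
}
print(round(trust_score(site), 3))
```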

Journal ArticleDOI
TL;DR: This study collects some information about the features and related issues regarding the SmartCards which are to be issued to the public and evaluates the efforts taken by the government to create awareness among the public about the usage and features of the SmartCard.
Abstract: SmartCards are one of the latest additions to the continuing list of advancements and innovations in the world of information and communication technology. A SmartCard resembles a normal credit card or bank ATM card in size and shape, with a microprocessor chip implanted in the plastic card. These cards are used not just as identity cards, like the earliest versions of such cards, but hold a relatively large amount of editable information including the card holder's bank data, e-purse, fingerprint, health record, blood type, traffic and license details and other such vital information. The purpose of this study was to present a general overview of SmartCards, a brief history, the features and applications of SmartCards and the introduction of SmartCards in the Kingdom of Bahrain. This study collects some information about the features and related issues regarding the SmartCards which are to be issued to the public. A total of 513 questionnaires were distributed to students of the University of Bahrain. The questions asked checked people's acceptance of replacing their current cards with a SmartCard and their awareness of the new national SmartCard in Bahrain. The study also evaluates the efforts taken by the government to create awareness among the public about the usage and features of the SmartCards.

Journal ArticleDOI
TL;DR: The experimental results show that the average message length and the efficiency of compression for Arabic text are better than for English text, and that the main factor significantly affecting the compression ratio and average message length is the frequency of the symbols in the text.
Abstract: Developing an efficient compression scheme for processing the Arabic language is a difficult task. This paper applies dynamic Huffman coding with variable-length bit codes to data compression of Arabic text. Experimental tests have been performed on both Arabic and English text. A comparison is made to measure the efficiency of compressing data for both Arabic and English text. A comparison is also made between the compression rate and the size of the file to be compressed. It has been found that as the file size increases, the compression ratio decreases for both Arabic and English text. The experimental results show that the average message length and the efficiency of compression for Arabic text are better than for English text. The results also show that the main factor that significantly affects the compression ratio and average message length is the frequency of the symbols in the text.
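
The paper uses dynamic (adaptive) Huffman coding; a full adaptive coder is too long to sketch here, so the snippet below instead builds a static Huffman code, which is enough to show the quantities the abstract compares: symbol frequencies, variable-length codes, and the resulting average code length in bits per symbol. The sample text is a placeholder for the Arabic and English test files.

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict[str, str]:
    """Static Huffman code built from the symbol frequencies of text."""
    freq = Counter(text)
    if len(freq) == 1:                        # degenerate case: one distinct symbol
        return {next(iter(freq)): "0"}
    # Heap entries: [total count, tie-breaker, concatenated symbols of this subtree].
    heap = [[count, i, sym] for i, (sym, count) in enumerate(freq.items())]
    heapq.heapify(heap)
    codes = {sym: "" for sym in freq}
    i = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for sym in lo[2]:
            codes[sym] = "0" + codes[sym]      # left branch gets a leading 0
        for sym in hi[2]:
            codes[sym] = "1" + codes[sym]      # right branch gets a leading 1
        heapq.heappush(heap, [lo[0] + hi[0], i, lo[2] + hi[2]])
        i += 1
    return codes

text = "the quick brown fox jumps over the lazy dog"
codes = huffman_codes(text)
avg_len = sum(len(codes[c]) for c in text) / len(text)
print(f"average code length: {avg_len:.2f} bits/symbol (vs 8 for fixed-length)")
```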