Book ChapterDOI

Planned Random Algorithm

01 Jan 2018, pp. 119–127
TL;DR: An algorithm is proposed that minimizes repetition of values in a returned data set so that the result appears more random to human observers.
Abstract: Computers are highly systematic, and none of the procedures they carry out is truly random. Yet computers are often required to generate random numbers for practical applications such as gaming, accounting, and encryption/decryption. The number generated by the computer typically relies on the time or the CPU clock. A given computer can be programmed to return random number (or character) arrays from a number (or character) data set. The returned data set can contain repeated values. Although repeated values are unrelated to the degree of randomness (in fact, they may be a sign of higher randomization), to humans they appear biased or non-random. We propose an algorithm to minimize repetition of values in the returned data set so as to make it appear more random. The concept is to return a data set using a biased or non-random procedure in order to make it more random in "appearance".
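The abstract does not include an implementation, but the core idea can be sketched: draw values at random while applying a non-random constraint that avoids recently returned values, trading true randomness for apparent randomness. The function name, the `history_size` parameter, and the sliding-window approach below are illustrative assumptions, not details from the paper.

```python
import random

def planned_random(pool, n, history_size=2):
    """Return n picks from pool, biasing against recent repeats.

    The history check is the deliberately non-random step: it reduces
    runs of identical values so the output *appears* more random.
    This is a sketch of the concept, not the authors' algorithm.
    """
    recent = []   # sliding window of recently returned values
    result = []
    for _ in range(n):
        # Prefer candidates not seen in the last `history_size` picks;
        # fall back to the whole pool if the filter empties it.
        candidates = [v for v in pool if v not in recent] or list(pool)
        pick = random.choice(candidates)
        result.append(pick)
        recent.append(pick)
        if len(recent) > history_size:
            recent.pop(0)
    return result
```

With a pool larger than `history_size`, this guarantees no value repeats within the window, which is exactly the kind of "planned" bias the abstract describes.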
Citations
Journal ArticleDOI
TL;DR: The Planned Random algorithm (PR algorithm), as discussed by the authors, was created as a solution to this problem: it introduces a non-random process to lessen the repetition of data in the returned data set, making the result less random in fact but more random in appearance.
Abstract: Like many other innovations, randomization has been a key factor in various fields such as games, science, art, and statistics. Although treated as common knowledge, randomization is often misunderstood by users when they start to perceive relationships or connections between randomly distributed data, a tendency known as apophenia. It is observed mostly when a randomized data set shows the same element in a row or in multiple repetitions, an occurrence that is plausible given the nature of randomization. Nevertheless, this may appear biased or non-random to humans, who think that since an element has appeared once or a few times it should not appear again soon, a misunderstanding known as the gambler's fallacy. The Planned Random algorithm (PR algorithm) was created as a solution to this problem: it introduces a non-random process to lessen the repetition of data in the returned data set, making the result less random in fact but more random in appearance. This study aims to enhance the Planned Random algorithm by increasing its apparent randomness, further minimizing the repetition of data in the resulting data set. Five final test simulations were run for both the existing PR algorithm and the enhanced PR algorithm, in which the enhanced algorithm showed a promising 22.22% decrease in both total and average repetitions compared to the existing algorithm. In conclusion, the enhanced algorithm showed a significant increase in apparent randomness compared to the existing PR algorithm.
References
Journal ArticleDOI
TL;DR: The neurobiological study of coincidence rests upon the brain's need for order and predictability; the brain is predisposed to use coincidences to create or discover patterns, which suggests the possibility that we can look where we cannot.
Abstract: The neurobiological study of coincidence rests upon the brain's need for order and predictability. Coincidences alert the brain to possible causal relationships between events. Through the apprehension of such relationships, the world appears more orderly and more predictable. Even though the scientific method has created a systematic way of determining the validity of possible causal connections between events, the human brain persists in its often non-scientific interpretations of coincidences. The same brain processes that manage coincidence interpretation can yield the strangest superstitions as well as new ideas about the nature of reality. This article addresses the following concepts: 1. The brain seeks patterns; 2. The brain is predisposed to use coincidences to create or discover patterns; 3. The philosophical basis for interpreting coincidences is provided by fundamental association cortex schemas; 4. Personally relevant coincidence interpretation is influenced by a person's biases; 5. Hemispheric lateralization influences coincidence detection and interpretation (the right brain associates while the left brain inhibits); and 6. Coincidences suggest the possibility that we can look where we cannot

23 citations