How do I turn off sound detection in Webex? 

Answers from top 7 papers

Therefore, sound detection is probably based on spike rate and not synchronization criteria.
The macula neglecta is a sensitive vibrational detector and may possibly be involved in underwater sound detection.
I observed engineers conducting two concurrent but contrasting experiments; results indicate how settings both enable and constrain the interpretation of sound.
Proceedings article · DOI
Jiangtao Zhai, Guangjie Liu, Yuewei Dai 
04 Nov 2010
22 Citations
Experiments show that the proposed algorithm achieves sound detection performance.
The obtained results show an evident influence of the stand-off distance on sound emission.
Open access · Proceedings article · DOI
Keisuke Imoto, Seisuke Kyochi 
12 May 2019
16 Citations
In this paper, we propose a technique of sound event detection utilizing graph Laplacian regularization taking the sound event co-occurrence into account. (A rough illustrative sketch of such a regularizer appears after this list.)
The equivalence of behavioral and neural thresholds indicates that the filters used in behavioral sound detection are simply the bandwidths of saccular fibers.
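The graph Laplacian regularization mentioned in the Imoto and Kyochi entry above can be pictured with a short sketch. This is only an illustrative reconstruction, not the authors' implementation: it assumes per-frame class probabilities `probs`, frame-level targets `targets`, and a class co-occurrence matrix `cooccurrence` estimated from training labels, and it adds a penalty that is small when frequently co-occurring event classes receive similar activations.

```python
import numpy as np

def class_laplacian(cooccurrence: np.ndarray) -> np.ndarray:
    """Unnormalized graph Laplacian L = D - A of the class co-occurrence graph."""
    degree = np.diag(cooccurrence.sum(axis=1))
    return degree - cooccurrence

def regularized_sed_loss(probs: np.ndarray,
                         targets: np.ndarray,
                         cooccurrence: np.ndarray,
                         lam: float = 0.1) -> float:
    """Frame-wise binary cross-entropy plus a graph smoothness penalty.

    probs, targets: (n_frames, n_classes); cooccurrence: (n_classes, n_classes).
    The penalty sum_t p_t^T L p_t equals 0.5 * sum_t sum_ij A_ij (p_ti - p_tj)^2,
    so it is small when co-occurring classes get similar activation scores.
    """
    eps = 1e-7
    bce = -np.mean(targets * np.log(probs + eps)
                   + (1.0 - targets) * np.log(1.0 - probs + eps))
    L = class_laplacian(cooccurrence)
    smoothness = np.sum((probs @ L) * probs) / probs.shape[0]
    return float(bce + lam * smoothness)
```

In the paper itself the regularizer is applied during model training; the weight `lam` and the choice of an unnormalized Laplacian here are arbitrary simplifications made for brevity.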

See what other people are reading

What is the typical pipeline for a star centroiding algorithm?
5 answers
How does noise affect people in the office?
5 answers
How to do an electrocardiogram in a shaking patient?
5 answers
Do B vitamins influence listening effort or listening fatigue?
5 answers
Does identifying bird calls pose a challenge for bird watchers?
5 answers
Identifying bird calls poses a significant challenge for bird watchers and researchers. Manual or semi-automatic detection methods often require tuning and post-processing, limiting efficiency. To address this issue, there is a need for tuning-free and species-agnostic approaches in automatic bird sound detection. Additionally, the VAST Challenge 2018 Mini Challenge 1 highlights the difficulty in uncovering patterns in spatio-temporal data, emphasizing the need for interactive visual analytic systems to extract valuable information more efficiently. Furthermore, the "drone-vs-bird detection challenge" under the SafeShore project aims to tackle the technical challenges posed by small drones, indicating the ongoing efforts to enhance detection capabilities in various contexts.
How often does faults due to vibrations occur in ships?
5 answers
Faults due to vibrations in ships are a significant concern, with various studies shedding light on their occurrence. Research indicates that wave-induced vibrations can contribute up to 50% of fatigue damage in large ocean-going ships, emphasizing the impact of vibrations on ship structures. Additionally, equipment fault detection in engines for boats focuses on shock and vibration information, with a high probability (98%) of engine fault determination and 72% probability of fault detection. Furthermore, metal fatigue in vessels, caused by dynamic movement, has been diagnosed using vibration analysis, highlighting the importance of reliability engineering in detecting faults related to vibrations. Overall, these studies underscore the frequency and significance of faults attributed to vibrations in ships, necessitating effective monitoring and diagnostic strategies for ensuring maritime safety.
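As a purely illustrative aside, and not the method of any of the studies cited above, vibration-based fault screening of this kind often reduces to comparing band-limited vibration energy against a healthy baseline. The sampling rate, frequency band, and threshold factor below are assumptions chosen only to make the idea concrete.

```python
import numpy as np

def band_rms(signal: np.ndarray, fs: float, f_lo: float, f_hi: float) -> float:
    """RMS of the signal restricted to the [f_lo, f_hi] Hz band via an FFT mask."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    band = np.fft.irfft(np.where(mask, spectrum, 0), n=len(signal))
    return float(np.sqrt(np.mean(band ** 2)))

def fault_suspected(signal: np.ndarray, fs: float, baseline_rms: float,
                    band=(10.0, 1000.0), factor=3.0) -> bool:
    """Flag a possible fault when band-limited vibration energy exceeds
    `factor` times the RMS measured on a known-healthy machine."""
    return band_rms(signal, fs, band[0], band[1]) > factor * baseline_rms

# Example (hypothetical values): 5 kHz accelerometer capture from an engine mount.
# flag = fault_suspected(samples, fs=5000.0, baseline_rms=0.02)
```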
How does music affect short term memory?
5 answers
Music has a significant impact on short-term memory. Studies have shown that background music can influence both short-term and long-term memory recall. Additionally, music expertise, particularly in musicians, has been linked to better auditory memory performance, especially for nonverbal stimuli with contour information. Furthermore, listening to music, such as the Quran or classical music, has been found to enhance short-term memory abilities. Musicians, in particular, exhibit advantages in memory tasks involving auditory and visual stimuli, potentially due to the encoding strategies they employ. Overall, music can enhance the speed of memory processing, especially in auditory and visual memory tasks, although the effect on accuracy may vary.
Is noisy environment a challenge in teaching reading?
5 answers
Yes, a noisy environment poses a significant challenge in teaching reading. Research indicates that teachers in noisy environments tend to exhibit vocally demanding behaviors, potentially leading to vocal fatigue and dysphonia. Additionally, studies have shown that classroom noise, particularly generated by chatter, can have a detrimental impact on students' comprehension performance during reading tasks. Furthermore, for individuals with blindness or low vision using screen readers, noisy environments can impair speech intelligibility, potentially affecting their ability to access information effectively. Therefore, it is evident that noise in educational settings can hinder both teachers' vocal health and students' reading comprehension, highlighting the importance of creating quieter environments to support effective teaching and learning.
How to optimize p1 delay in nmr?
5 answers
To optimize the P1 delay in NMR, various approaches can be considered based on the research findings. One method involves combining Carr–Purcell–Meiboom–Gill (CPMG) during acquisition with selective or nonselective excitation, leading to intensity enhancement but a loss in chemical shift information. Additionally, for quantitative NMR analysis, using an angle of approximately 83° with a pulse-repetition period of 4.5 times the longest T1 is suggested as an optimal procedure, although results are only marginally better than the standard method. Furthermore, a swarming approach can be employed to optimize the one-hop delay for interplatoon communications by adjusting the minimum contention window size of backbone vehicles iteratively, resulting in improved performance metrics including end-to-end delay and throughput.
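The suggested acquisition settings (a flip angle of roughly 83° with a repetition period of 4.5 times the longest T1) can be sanity-checked with the standard steady-state signal expression for repeated pulses, S ∝ sin θ (1 − E)/(1 − E cos θ) with E = exp(−TR/T1). The sketch below only illustrates that trade-off; the T1 value is an assumption, not taken from the cited work.

```python
import math

def steady_state_signal(flip_deg: float, tr: float, t1: float) -> float:
    """Relative steady-state signal per scan for a given flip angle and
    repetition period, assuming longitudinal relaxation time t1."""
    theta = math.radians(flip_deg)
    e1 = math.exp(-tr / t1)
    return math.sin(theta) * (1.0 - e1) / (1.0 - e1 * math.cos(theta))

t1_longest = 2.0              # s, assumed longest T1 in the sample (illustrative)
tr = 4.5 * t1_longest         # repetition period suggested in the answer above

print(steady_state_signal(83.0, tr, t1_longest))   # ~0.98 of the equilibrium signal
print(steady_state_signal(90.0, tr, t1_longest))   # ~0.99; differences are marginal
```

With a repetition period this long, the magnetization relaxes almost completely between scans, which is consistent with the answer's note that the optimized procedure is only marginally better than the standard one.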
How context awareness help in speech intelligibility in hearing aids?
5 answers
Contextual information plays a crucial role in enhancing speech intelligibility for individuals using hearing aids. Studies show that individuals with hearing loss, including cochlear implant users, benefit from contextual cues to improve speech recognition. The ability to utilize context is linked to cognitive factors like verbal working memory capacity. Additionally, the presence of semantic context in sentences has been found to increase speech intelligibility and reduce listening effort, especially in challenging listening environments like reverberant spaces. Furthermore, some individuals with hearing loss rely on a brief moment of silence after a sentence to mentally repair misperceptions, highlighting the importance of context in continuous communication for better speech recognition outcomes.