
K-space vs. Gaussian-function studies with molecules?


Best insight from top research papers

The conformational profile of small organic molecules is important in molecular recognition and computer-assisted drug design. A parametric fitting procedure based on cluster analysis and Gaussian distributions can estimate the probability density function (PDF) over the conformational space of drug-like molecules. Gaussian functions on spaces of type G/K have foundational properties and can be used as test functions, leading to explicit formulas and the heat kernel. Data obtained from the GAUSSIAN program can provide qualitative results about a molecule, although the computed frequencies of spectral lines may not be as accurate as astronomers require. Gaussian molecules have prolate shapes on average, except for regular stars with densely radiated long arms, which exhibit perfect symmetry. An approximate variational method using distributed Gaussian functions has been applied to study the rotation-vibration spectra of molecules.
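The PDF-estimation idea above lends itself to a small illustration. The following is a minimal sketch, assuming synthetic torsion-angle data and scikit-learn's GaussianMixture as a stand-in for the cluster-plus-Gaussian parametric fit described in the cited work; it is not that paper's implementation.

```python
# Minimal sketch: estimating a probability density over a molecule's
# conformational space with a Gaussian mixture, in the spirit of the
# cluster-analysis + Gaussian-fitting approach described above.
# The torsion-angle data here are synthetic placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Pretend conformers described by two torsion angles (degrees), three clusters.
torsions = np.vstack([
    rng.normal([-60,  60], 15, size=(200, 2)),
    rng.normal([ 60, 180], 15, size=(200, 2)),
    rng.normal([180, -60], 15, size=(200, 2)),
])

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(torsions)

# Log PDF of a query conformation in torsion space.
query = np.array([[-55, 65]])
print("log p(query) =", gmm.score_samples(query)[0])
print("cluster weights:", gmm.weights_)
```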

Answers from top 5 papers

Papers (5) · Insight

- Insight: The provided paper does not discuss k-space studies with molecules. It focuses on the use of distributed Gaussian functions (DGF) for studying the rotation-vibration spectra of triatomic molecules.
- Journal article (Gaoyuan Wei, Bruce E. Eichinger; 37 citations)
  Insight: The provided paper does not discuss k-space or Gaussian-function studies with molecules.
- Insight: The provided paper does not discuss k-space or Gaussian-function studies with molecules.
- Journal article (Jay Jorgenson, Serge Lang; 1 citation)
  Insight: The provided paper does not discuss k-space or Gaussian-function studies with molecules.
- Open-access journal article (03 Apr 2019; 1 citation)
  Insight: The provided paper does not discuss the comparison between k-space and Gaussian-function studies with molecules.

Related Questions

What is the Gaussian distribution with identity link?
5 answers
The Gaussian distribution with an identity link refers to a distribution whose log-density has the negative identity function as its derivative, a view that deepens existing characterizations of the normal distribution and extends to more general densities. The concept also appears in the study of random matrices, where products of random $2\times 2$ matrices exhibit Gaussian fluctuations around their Lyapunov exponents, with the distribution supported on a small neighborhood of the identity matrix. In biometric identification, a Gaussian confidence-scoring approach has been proposed for converting match scores to probabilistic confidence scores, increasing discrimination accuracy and recognition rates, especially in face recognition systems.
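As a concrete reading of the first point, here is the standard-normal case written out; the cited work treats general densities, so this is only the simplest instance.

$$\varphi(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^{2}/2}, \qquad
\frac{d}{dx}\log\varphi(x) = \frac{d}{dx}\!\left(-\tfrac{1}{2}\log(2\pi) - \tfrac{x^{2}}{2}\right) = -x .$$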
Dot product of complex Gaussians
4 answers
The dot product of complex Gaussian random variables has been studied in several papers. The joint probability density function (pdf) of the amplitude and phase of the product of two correlated, non-zero-mean complex Gaussian random variables with arbitrary variances has been derived; this distribution is useful in radar and communication systems. The distribution of the product of a complex Gaussian matrix and a complex Gaussian vector has also been derived, along with the distribution of the sum of this product and a complex Gaussian vector, which generalizes previous results. These results have been applied to analyze the performance of the energy detector in a multiple-input multiple-output (MIMO) communication system. Additionally, the joint pdf of a complex vector formed as the product of a complex Gaussian scalar and a complex Gaussian vector, added to another complex Gaussian vector, has been derived and used to characterize the performance of the energy detector in a radar system.
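A quick way to build intuition for these product distributions is Monte Carlo. The sketch below samples two correlated, non-zero-mean complex Gaussians and inspects the amplitude and phase of their product; the means, variances, and correlation are illustrative placeholders, not values from the cited derivations.

```python
# Minimal Monte Carlo sketch (not the papers' closed-form derivation):
# sample two correlated, non-zero-mean complex Gaussians and look at the
# amplitude/phase of their product.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
rho = 0.5                       # assumed correlation between the two variables
mu1, mu2 = 1 + 1j, 0.5 - 0.5j   # assumed (non-zero) means

# Correlated circularly-symmetric noise via a shared common component.
common = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
indep1 = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
indep2 = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

z1 = mu1 + np.sqrt(rho) * common + np.sqrt(1 - rho) * indep1
z2 = mu2 + np.sqrt(rho) * common + np.sqrt(1 - rho) * indep2

prod = z1 * z2
print("mean amplitude:", np.abs(prod).mean())
print("mean phase (rad):", np.angle(prod).mean())
```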
What's new in Gaussian math?
4 answers
New developments in Gaussian mathematics include the extension of classical second-order differential subordination results to third-order differential subordinations. Another advance is the investigation of the three-dimensional Gaussian product inequality, which is equivalent to an improved Cauchy-Schwarz inequality and has led to novel moment inequalities for bivariate Gaussian random variables. In addition, a new Gaussian mixture model (GMM) algorithm has been developed for better representations of atomic models and electron-microscopy 3D density maps; it treats the input atoms or voxels as a set of Gaussian functions and addresses the size-ignorance, singularity, and computation-time problems of the standard GMM algorithm. Furthermore, a new method for solving the Gaussian integral has been proposed that introduces a parameter representing the Taylor series coefficients, with applications to non-extensive particle number density, the gamma and factorial functions, and fractional calculus.
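For reference, any new route to the Gaussian integral (such as the Taylor-coefficient parameterization mentioned above) must reproduce the classical polar-coordinate result:

$$\left(\int_{-\infty}^{\infty} e^{-x^{2}}\,dx\right)^{\!2}
= \int_{0}^{2\pi}\!\!\int_{0}^{\infty} e^{-r^{2}}\, r\,dr\,d\theta = \pi,
\qquad\text{so}\qquad
\int_{-\infty}^{\infty} e^{-x^{2}}\,dx = \sqrt{\pi}.$$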
What are the connections between Gaussian processes and stochastic differential equations?
5 answers
Gaussian processes and stochastic differential equations are connected in several ways. Gaussian processes are used to model spatial and spatio-temporal dependence structures through covariance functions and reproducing kernel Hilbert spaces, and stochastic integration with respect to Gaussian processes has been studied with applications in various fields. Recent advances in probabilistic machine learning leverage Gaussian process priors to discover governing equations expressed by linear operators, including differential and fractional-order operators. Gaussian process regression has been used to reconstruct fields and physical parameters from data that satisfy a given linear differential equation, yielding more reliable regression and physically interpretable solutions. Furthermore, Gaussian processes serve as priors in nonparametric methods for estimating the drift and diffusion terms of stochastic differential equations, allowing the analysis of complex dynamics.
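One of these connections, estimating an SDE's drift with a Gaussian-process prior, can be sketched in a few lines. The example below is a simplified illustration using a simulated Ornstein-Uhlenbeck path and scikit-learn's GaussianProcessRegressor; the process, its parameters, and the finite-difference drift observations are assumptions, not the cited papers' methods.

```python
# Minimal sketch: fit a GP to finite-difference increments of a simulated
# path to recover the drift of dX = -theta * X dt + sigma dW.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
dt, n_steps = 0.02, 500
theta, sigma = 1.5, 0.3          # true drift rate and diffusion (assumed)

# Euler-Maruyama simulation of the Ornstein-Uhlenbeck process.
x = np.zeros(n_steps)
for t in range(n_steps - 1):
    x[t + 1] = x[t] - theta * x[t] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Noisy drift observations: increments divided by dt.
X = x[:-1].reshape(-1, 1)
y = np.diff(x) / dt

gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1.0), normalize_y=True)
gp.fit(X, y)

grid = np.linspace(x.min(), x.max(), 5).reshape(-1, 1)
print("estimated drift:", gp.predict(grid).round(2))
print("true drift     :", (-theta * grid.ravel()).round(2))
```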
How to do QSAR analysis with Gaussian and JMP?
4 answers
QSAR analysis can be performed using Gaussian and JMP software. Gaussian is a widely used method that applies multivariate statistics to relate chemical structures to responses or properties based on experimental observations; it offers speed, simplicity, and the ability to account for transport and metabolic processes that occur after compound administration. JMP, in turn, provides tools for statistical modeling and analysis, including the evaluation and validation of statistical models in QSAR and the setup of QSAR datasets. Standardizing QSAR datasets with an open XML format (QSAR-ML) and an extensible descriptor ontology makes dataset setup completely reproducible and facilitates data exchange and collaboration. In practice, one can use Gaussian for statistical modeling and JMP for dataset setup and analysis.
What's the difference between a mean function and the kernel in Gaussian processes?
3 answers
A mean function and a kernel are both important components of a Gaussian process. The mean function represents the expected value of the process at each point in the input space; it encodes a prior assumption about the behavior of the process and can capture trends or biases in the data. The kernel, on the other hand, describes the covariance between neighboring points in the data set; it determines the smoothness and shape of the functions the Gaussian process can generate, and its choice affects the flexibility and generalization ability of the model. Both the mean function and the kernel are crucial in determining the behavior and performance of a Gaussian process.
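The sketch below makes the distinction concrete by drawing functions from a GP prior with a linear mean function and an RBF kernel; both choices are illustrative.

```python
# Minimal sketch: the mean function sets the prior trend, the kernel sets the
# covariance (smoothness) of functions drawn from the GP.
import numpy as np

def mean_fn(x):
    return 0.5 * x                      # prior trend: a linear bias

def rbf_kernel(xa, xb, length_scale=1.0, variance=1.0):
    d = xa[:, None] - xb[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

x = np.linspace(-3, 3, 50)
K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))   # jitter for numerical stability

rng = np.random.default_rng(3)
samples = rng.multivariate_normal(mean_fn(x), K, size=3)
print("sample shape:", samples.shape)           # 3 functions at 50 points
print("empirical value at x=3:", samples[:, -1].mean().round(2),
      "vs mean function:", mean_fn(x[-1]))
```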

See what other people are reading

What are the challenges of Gaussian process regression?
5 answers
What is the most cited paper on machine learning in space weather?
5 answers
Having a lot of things to handle at the same time while learning?
5 answers
Multitasking in learning involves handling multiple tasks simultaneously to enhance learning outcomes by sharing information between related tasks. It can improve generalization performance and save computation time during inference by training a single neural network to solve various tasks. However, the effectiveness of multitasking depends on task cooperation and competition, as some tasks may compete with each other, leading to inferior performance. Understanding which tasks should be learned together in one network is crucial to optimize learning outcomes. Attention plays a vital role in the learning process, acting as the gatekeeper to memory and learning. Therefore, while multitasking can be beneficial, it is essential to manage tasks strategically to leverage the advantages of shared information without compromising learning efficiency.
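The "shared information" idea usually takes the form of hard parameter sharing. The sketch below shows a forward pass through a shared trunk feeding two task-specific heads; the sizes, weights, and tasks are illustrative placeholders, and no training or task-weighting scheme from the cited work is implemented.

```python
# Minimal sketch of hard parameter sharing: one shared trunk produces a
# representation consumed by two task-specific heads. Forward pass only.
import numpy as np

rng = np.random.default_rng(4)
d_in, d_hidden = 16, 32

# Shared trunk parameters (learned jointly by both tasks during training).
W_shared = rng.standard_normal((d_in, d_hidden)) * 0.1
# Task-specific heads: task A is a regression, task B a 3-class classification.
W_task_a = rng.standard_normal((d_hidden, 1)) * 0.1
W_task_b = rng.standard_normal((d_hidden, 3)) * 0.1

def forward(x):
    h = np.tanh(x @ W_shared)           # shared representation
    y_a = h @ W_task_a                  # regression output
    logits_b = h @ W_task_b
    y_b = np.exp(logits_b) / np.exp(logits_b).sum(-1, keepdims=True)  # softmax
    return y_a, y_b

x = rng.standard_normal((8, d_in))      # a batch of 8 examples
y_a, y_b = forward(x)
print(y_a.shape, y_b.shape)             # (8, 1) and (8, 3)
```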
How Bayesian Principle used in Data Pre-processing?
5 answers
Bayesian principles are utilized in data pre-processing to enhance efficiency and scalability in various fields. For instance, in hydrological modeling, techniques like Principal Component Analysis (PCA) and the Bayesian Information Criterion (BIC) aid in reducing input dimensions and optimizing training data length. Bayesian inference also plays a crucial role in approximate Bayesian computation (ABC) by enabling the use of synthetic likelihoods and precomputed binding functions, significantly improving scalability and reducing model-fitting time from hours to minutes. Moreover, in fusion research, Bayesian probability theory is applied to tomography problems such as Gaussian process tomography (GPT), allowing fast and robust reconstruction of emissivity fields with uncertainty analysis. These applications showcase the versatility and effectiveness of Bayesian principles in streamlining data pre-processing tasks.
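The PCA-plus-BIC pattern mentioned for hydrological modeling can be illustrated generically. The sketch below uses synthetic data, scikit-learn's PCA for dimension reduction, and BIC to pick the complexity of a Gaussian mixture; the model family and data are stand-ins, not the cited study's setup.

```python
# Minimal sketch: PCA compresses the inputs, and BIC scores candidate models
# so the most parsimonious adequate one can be kept.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
X = rng.standard_normal((500, 20))
X[:, :3] += rng.choice([-3.0, 3.0], size=(500, 3))   # 3 informative dimensions

X_reduced = PCA(n_components=3).fit_transform(X)      # dimension reduction

# BIC-based selection of model complexity on the reduced inputs.
bics = {k: GaussianMixture(n_components=k, random_state=0)
            .fit(X_reduced).bic(X_reduced)
        for k in range(1, 6)}
best_k = min(bics, key=bics.get)
print("BIC per k:", {k: round(v) for k, v in bics.items()})
print("selected number of components:", best_k)
```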
Does research exist that uses generative AI to lower the dimension of motion capture data?
4 answers
Research in the field of generative AI has indeed explored methods to reduce the dimensionality of motion capture data. One approach involves disentangling video frames into foreground and background to focus on generating the foreground, effectively reducing the network output dimension. Another study introduces a probabilistic motion model encoded in a low-dimensional space, termed the motion matrix, enabling tasks like motion analysis, simulation, and interpolation of realistic motion patterns, ultimately lowering the dimension of the data for faster acquisition and augmentation. These innovative techniques showcase how generative AI can be leveraged to streamline the representation of complex motion data, opening up possibilities for enhanced analysis and synthesis in various applications.
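The core operation, compressing motion-capture frames into a low-dimensional latent code and reconstructing them, can be sketched with a linear stand-in. The example below uses PCA in place of the learned generative models described above, on synthetic joint trajectories.

```python
# Minimal sketch: encode mocap frames into a low-dimensional latent space and
# reconstruct them. PCA is a linear stand-in for a learned generative model.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
n_frames, n_joints = 300, 20
t = np.linspace(0, 4 * np.pi, n_frames)[:, None]
phases = rng.uniform(0, 2 * np.pi, size=(1, n_joints * 3))
frames = np.sin(t + phases) + 0.05 * rng.standard_normal((n_frames, n_joints * 3))

latent_dim = 8                                   # "motion matrix" dimensionality
pca = PCA(n_components=latent_dim).fit(frames)
codes = pca.transform(frames)                    # (300, 8) latent trajectory
recon = pca.inverse_transform(codes)             # back to (300, 60) joint space

err = np.sqrt(((frames - recon) ** 2).mean())
print(f"compressed {frames.shape[1]}-D frames to {latent_dim}-D, RMSE = {err:.3f}")
```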
Are ANNs more interpretable than Gaussian process regression?
4 answers
Neural networks are generally less interpretable than Gaussian Processes (GPs) due to their complexity. While neural networks excel in solving tabular data modeling challenges, their interpretability is limited compared to GPs, which are increasingly seen as an alternative to neural and deep learning methods. Recent research has focused on enhancing the interpretability of models like GPs, aiming to make the decision process transparent to humans. In fact, a novel Gaussian process kernel has been proposed that leverages a deep neural network structure while maintaining good interpretability, addressing issues of optimality, explainability, model complexity, and sample efficiency. Therefore, GPs, with their explainable approaches and focus on interpretability, can be considered more interpretable than artificial neural networks.
How to find vortex rings from a variational formulation?
5 answers
To find vortex rings using a variational formulation, one approach constructs a special family of "trial functions" known as α-rings: circular-sector vortex rings positioned on the border of a circular sector with aperture angle α. These rings admit an explicit formula for the velocity potential, derived in terms of trigonometric and hypergeometric functions, which allows comparisons with known solutions such as circular vortex rings. Additionally, the Lagrangian pseudovorticity field computed from particles suspended in a fluid can be used to estimate the dynamics of isolated vortex rings accurately, especially in the context of superfluid experiments. This method provides valuable insight into the behavior and properties of vortex rings in various fluid-flow scenarios.
Does high data dimensionality impact active learning?
5 answers
High data dimensionality significantly impacts active learning strategies. While uncertainty sampling is often favored in low-dimensional settings, its effectiveness in high-dimensional data is not guaranteed. Novel methods like adaptive weighted uncertainty sampling (AWUS) have been developed to address this challenge, showing superior performance in high-dimensional datasets. Additionally, hierarchical ensemble clustering algorithms have been proposed to handle high-dimensional unsupervised learning tasks effectively. Furthermore, active learning methods based on intrinsic data geometries learned through diffusion processes have been shown to provide accurate labelings with minimal training labels in high-dimensional scenarios. These findings collectively highlight the complex interplay between data dimensionality and active learning performance, emphasizing the need for tailored approaches in high-dimensional settings.
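Plain uncertainty (least-confidence) sampling, the baseline whose high-dimensional behavior is questioned above, looks roughly like the loop below. The dataset, classifier, and query budget are illustrative; AWUS and the diffusion-geometry methods are not implemented here.

```python
# Minimal sketch of an uncertainty-sampling active-learning loop.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=1000, n_features=50, n_informative=10,
                           random_state=0)
rng = np.random.default_rng(7)
labeled = list(rng.choice(len(X), size=10, replace=False))   # seed set
unlabeled = [i for i in range(len(X)) if i not in labeled]

for _ in range(20):                                          # query budget
    clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[unlabeled])
    uncertainty = 1 - proba.max(axis=1)                      # least-confidence score
    pick = unlabeled[int(np.argmax(uncertainty))]
    labeled.append(pick)                                     # "ask the oracle"
    unlabeled.remove(pick)

print("labels used:", len(labeled), "accuracy:", clf.score(X, y).round(3))
```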
What is shared latent space of image features and class embeddings?
5 answers
The shared latent space of image features and class embeddings refers to a common low-dimensional space where both visual features and class-related semantic information are represented. This shared space aims to capture essential cross-modal correlations associated with unseen classes. By aligning the distributions learned from multimodal input features and class-embeddings, this latent space preserves class-discriminative information. Additionally, the latent space is structured by visual concepts, where the two modalities should be indistinguishable, enabling the translation of image features into semantically structured embeddings for generating language descriptions of scenes. This approach allows for efficient retrieval, knowledge transfer to unseen classes, and outperforms previous methods in various tasks.
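A minimal way to picture the shared space: project image features and class embeddings into one low-dimensional space and classify by similarity. The sketch below is forward-only with random projections; the dimensions and features are placeholders, and no alignment loss is trained.

```python
# Minimal sketch: image features and class embeddings projected into a shared
# latent space, with cosine similarity used for classification.
import numpy as np

rng = np.random.default_rng(8)
d_img, d_cls, d_shared = 512, 300, 64

W_img = rng.standard_normal((d_img, d_shared)) * 0.05   # image-side projection
W_cls = rng.standard_normal((d_cls, d_shared)) * 0.05   # class-embedding projection

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

image_feats = rng.standard_normal((4, d_img))            # e.g. CNN features
class_embeds = rng.standard_normal((10, d_cls))          # e.g. word vectors

z_img = normalize(image_feats @ W_img)                   # into the shared space
z_cls = normalize(class_embeds @ W_cls)

scores = z_img @ z_cls.T                                 # cosine similarities
print("predicted classes:", scores.argmax(axis=1))
```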
How is HPLC data analysis beneficial for plant extracts that will be subjected to LC-MS before molecular docking?
5 answers
HPLC-MS plays a crucial role in analyzing plant extracts before subjecting them to LC-MS for molecular docking. Data preprocessing in LC-MS experiments is essential for extracting information effectively, and normalization of LC-MS data is vital to reduce bias and correct for instrument drift, enhancing the accuracy of the analysis. Reliable data on phytoestrogen content in food are necessary for investigating biological effects, and a generic LC-MS method developed for this purpose shows good reproducibility. Solvents used in natural product chemistry can affect the extraction process and introduce artefacts that influence biological activity assessments, so careful analysis and correlation with biological effects are crucial. Overall, using HPLC-MS and LC-MS techniques with proper data handling is essential for accurate plant extract analysis before molecular docking.
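One concrete normalization step of the kind mentioned above is total-ion-current (TIC) scaling of a feature-intensity matrix. The sketch below applies it to a synthetic matrix with per-run drift; the cited studies may use other normalization schemes.

```python
# Minimal sketch: total-ion-current (TIC) normalization of an LC-MS
# feature-intensity matrix to correct for run-to-run intensity differences.
import numpy as np

rng = np.random.default_rng(9)
n_samples, n_features = 6, 200
true = rng.gamma(shape=2.0, scale=100.0, size=(n_samples, n_features))
drift = rng.uniform(0.5, 1.5, size=(n_samples, 1))       # per-run instrument drift
observed = true * drift

tic = observed.sum(axis=1, keepdims=True)                # total ion current per run
normalized = observed / tic * tic.mean()                 # rescale to a common TIC

print("raw TIC spread       :", observed.sum(axis=1).std().round(1))
print("normalized TIC spread:", normalized.sum(axis=1).std().round(6))
```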