Multisensory Research — Template for authors

Publisher: Brill
Category | Rank | Trend in last 3 yrs
Ophthalmology | #37 of 116 | down by 4 ranks
Experimental and Cognitive Psychology | #68 of 148 | down by 7 ranks
Computer Vision and Pattern Recognition | #44 of 85 | down by 17 ranks
Cognitive Neuroscience | #58 of 96 | down by 5 ranks
Sensory Systems | #27 of 40 | -

Journal quality: Good
Last 4 years overview: 161 Published Papers | 500 Citations
Indexed in: Scopus
Last updated: 19/07/2020

Related Journals

Open Access
Frontiers Media
Quality: High | CiteRatio: 5.7 | SJR: 2.036 | SNIP: 1.066

Open Access | Recommended
Springer
Quality: High | CiteRatio: 8.6 | SJR: 0.86 | SNIP: 1.676

Open Access
Springer
Quality: High | CiteRatio: 4.4 | SJR: 1.196 | SNIP: 1.258

Open Access | Recommended
BMJ Publishing Group
Quality: High | CiteRatio: 7.3 | SJR: 2.016 | SNIP: 2.055

Journal Performance & Insights

Impact Factor

Determines the importance of a journal by taking a measure of the frequency with which the average article in the journal has been cited in a particular year.

1.553 (down 15% from 2018)

Impact factor for Multisensory Research, 2016-2019:
Year | Value
2019 | 1.553
2018 | 1.829
2017 | 2.339
2016 | 1.962

Insights:
  • The impact factor of this journal has decreased by 15% in the last year.
  • This journal's impact factor is in the top 10 percentile category.

CiteRatio

A measure of average citations received per peer-reviewed paper published in the journal.

3.1 (down 18% from 2019)

CiteRatio for Multisensory Research, 2016-2020:
Year | Value
2020 | 3.1
2019 | 3.8
2018 | 3.5
2017 | 3.1
2016 | 2.6

Insights:
  • The CiteRatio of this journal has decreased by 18% in the last year.
  • This journal's CiteRatio is in the top 10 percentile category.

SCImago Journal Rank (SJR)

Measures weighted citations received by the journal. Citation weighting depends on the categories and prestige of the citing journal.

0.521 (down 33% from 2019)

SJR for Multisensory Research, 2016-2020:
Year | Value
2020 | 0.521
2019 | 0.783
2018 | 0.983
2017 | 0.834
2016 | 0.62

Insights:
  • The SJR of this journal has decreased by 33% in the last year.
  • This journal's SJR is in the top 10 percentile category.

Source Normalized Impact per Paper (SNIP)

Measures actual citations received relative to citations expected for the journal's category.

0.661 (down 31% from 2019)

SNIP for Multisensory Research, 2016-2020:
Year | Value
2020 | 0.661
2019 | 0.957
2018 | 0.728
2017 | 0.912
2016 | 0.566

Insights:
  • The SNIP of this journal has decreased by 31% in the last year.
  • This journal's SNIP is in the top 10 percentile category.
Multisensory Research

Guideline source: View

All company, product and service names used in this website are for identification purposes only. All product names, trademarks and registered trademarks are property of their respective owners.

Use of these names, trademarks and brands does not imply endorsement or affiliation.

Brill

Multisensory Research

Approved by publishing and review experts on SciSpace, this template is built as per the Multisensory Research formatting guidelines as mentioned in the Brill author instructions. The current version was created on 19 Jul 2020 and has been used by 896 authors to write and format their manuscripts for this journal.

Ophthalmology

Computer Vision and Pattern Recognition

Experimental and Cognitive Psychology

Cognitive Neuroscience

Sensory Systems

Medicine

Last updated on: 19 Jul 2020
ISSN: 2213-4794
Impact Factor: Medium - 0.656
Open Access: No
Sherpa RoMEO Archiving Policy: Yellow
Plagiarism Check: Available via Turnitin
Endnote Style: Download Available
Bibliography Name: plainnat
Citation Type: Author Year, e.g., (Blonder et al., 1982)
Bibliography Example: G. E. Blonder, M. Tinkham, and T. M. Klapwijk. Transition from metallic to tunneling regimes in superconducting microconstrictions: Excess current, charge imbalance, and supercurrent conversion. Phys. Rev. B, 25(7):4515-4532, 1982. doi:10.1103/PhysRevB.25.4515.
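As an illustration, the Author Year citation type and the plainnat bibliography style listed above map onto a standard natbib setup in LaTeX; the citation key and the .bib file name below are hypothetical, and this is only a minimal sketch rather than the journal's official template.

```latex
% Minimal sketch of author-year citations with the plainnat style.
\documentclass{article}
\usepackage{natbib} % provides \citet and \citep for (Author, Year) citations

\begin{document}
Excess current was characterised by \citet{blonder1982}; % -> Blonder et al. (1982)
charge imbalance was also measured \citep{blonder1982}.  % -> (Blonder et al., 1982)

\bibliographystyle{plainnat} % the bibliography style listed for this journal
\bibliography{references}    % hypothetical .bib file containing the entry
\end{document}
```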

Top papers written in this journal

Journal Article DOI: 10.1163/22134808-00002478
Multisensory Integration and Calibration in Children and Adults with and without Sensory and Motor Disabilities.
Monica Gori
01 Jan 2015 - Multisensory Research

Abstract:

During the first years of life, sensory modalities communicate with each other. This process is fundamental for the development of unisensory and multisensory skills. The absence of one sensory input impacts the development of other modalities. Since 2008 we have studied these aspects and developed our cross-sensory calibration theory. This theory emerged from the observation that children start to integrate multisensory information (such as vision and touch) only after 8-10 years of age. Before this age the more accurate sense teaches (calibrates) the others; when one calibrating modality is missing, the other modalities are impaired. Children with visual disabilities have problems understanding the haptic or auditory perception of space, and children with motor disabilities have problems understanding the visual dimension of objects. This review presents our recent studies on multisensory integration and cross-sensory calibration in children and adults with and without sensory and motor disabilities. The goal of this review is to show the importance of interaction between sensory systems during the early period of life in order for correct perceptual development to occur.

Topics:

Multisensory integration (63%), Perception (55%), Modality (human-computer interaction) (55%), Stimulus modality (55%), Modalities (51%)
97 Citations
Open Access Journal Article DOI: 10.1163/22134808-20191403
Extrinsic Auditory Contributions to Food Perception & Consumer Behaviour: an Interdisciplinary Review.
Charles Spence, Felipe Reinoso-Carvalho, Carlos Velasco, Qian Janice Wang
01 Jan 2019 - Multisensory Research

Abstract:

Food product-extrinsic sounds (i.e., those auditory stimuli that are not linked directly to a food or beverage product, or its packaging) have been shown to exert a significant influence over various aspects of food perception and consumer behaviour, often operating outside of conscious awareness. In this review, we summarise the latest evidence concerning the various ways in which what we hear can influence what we taste. According to one line of empirical research, background noise interferes with tasting, due to attentional distraction. A separate body of marketing-relevant research demonstrates that music can be used to bias consumers' food perception, judgments, and purchasing/consumption behaviour in various ways. Some of these effects appear to be driven by the arousal elicited by loud music as well as the entrainment of people's behaviour to the musical beat. However, semantic priming effects linked to the type and style of music are also relevant. Another route by which music influences food perception comes from the observation that our liking/preference for the music that we happen to be listening to carries over to influence our hedonic judgments of what we are tasting. A final route by which hearing influences tasting relates to the emerging field of 'sonic seasoning'. A developing body of research now demonstrates that people often rate tasting experiences differently when listening to soundtracks that have been designed to be (or are chosen because they are) congruent with specific flavour experiences (e.g., when compared to when listening to other soundtracks, or else when tasting in silence). Taken together, such results lead to the growing realization that the crossmodal influences of music and noise on food perception and consumer behaviour may have some important if, as yet, unrecognized implications for public health.

Topics:

Perception (53%), Loud music (53%), Crossmodal (52%), Consumer behaviour (52%), Active listening (50%)
91 Citations
Journal Article DOI: 10.1163/22134808-00002429
Visual and haptic representations of material properties.
01 Jan 2013 - Multisensory Research

Abstract:

Research on material perception has received an increasing amount of attention recently. Clearly, both the visual and the haptic sense play important roles in the perception of materials, yet it is still unclear how both senses compare in material perception tasks. Here, we set out to investigate the degree of correspondence between the visual and the haptic representations of different materials. We asked participants to both categorize and rate 84 different materials for several material properties. In the haptic case, participants were blindfolded and asked to assess the materials based on haptic exploration. In the visual condition, participants assessed the stimuli based on their visual impressions only. While categorization performance was less consistent in the haptic condition than in the visual one, ratings correlated highly between the visual and the haptic modality. PCA revealed that all material samples were similarly organized within the perceptual space in both modalities. Moreover, in both senses the first two principal components were dominated by hardness and roughness. These are two material features that are fundamental for the haptic sense. We conclude that although the haptic sense seems to be crucial for material perception, the information it can gather alone might not be quite fine-grained and rich enough for perfect material recognition.

Topics:

Haptic memory (63%), Haptic technology (56%), Stereotaxy (54%), Perception (50%)
90 Citations
Journal Article DOI: 10.1163/22134808-00002502
Crossmodal Correspondences: Standing Issues and Experimental Guidelines
Cesare Parise
01 Jan 2016 - Multisensory Research

Abstract:

Crossmodal correspondences refer to the systematic associations often found across seemingly unrelated sensory features from different sensory modalities. Such phenomena constitute a universal trait of multisensory perception even in non-human species, and seem to result, at least in part, from the adaptation of sensory systems to natural scene statistics. Despite recent developments in the study of crossmodal correspondences, there are still a number of standing questions about their definition, their origins, their plasticity, and their underlying computational mechanisms. In this paper, I will review such questions in the light of current research on sensory cue integration, where crossmodal correspondences can be conceptualized in terms of natural mappings across different sensory cues that are present in the environment and learnt by the sensory systems. Finally, I will provide some practical guidelines for the design of experiments that might shed new light on crossmodal correspondences.

Topics:

Crossmodal (69%), Sensory cue (53%), Perception (53%), Perceptual learning (50%), Stimulus modality (50%)
85 Citations
Journal Article DOI: 10.1163/22134808-00002510
Statistically optimal multisensory cue integration: A practical tutorial
Marieke Rohde, Loes van Dam, Marc O. Ernst
01 Jan 2016 - Multisensory Research

Abstract:

Humans combine redundant multisensory estimates into a coherent multimodal percept. Experiments in cue integration have shown for many modality pairs and perceptual tasks that multisensory information is fused in a statistically optimal manner: observers take the unimodal sensory reliability into consideration when performing perceptual judgments. They combine the senses according to the rules of Maximum Likelihood Estimation to maximize overall perceptual precision. This tutorial explains in an accessible manner how to design optimal cue integration experiments and how to analyse the results from these experiments to test whether humans follow the predictions of the optimal cue integration model. The tutorial is meant for novices in multisensory integration and requires very little training in formal models and psychophysical methods. For each step in the experimental design and analysis, rules of thumb and practical examples are provided. We also publish Matlab code for an example experiment on cue integration and a Matlab toolbox for data analysis that accompanies the tutorial online. This way, readers can learn about the techniques by trying them out themselves. We hope to provide readers with the tools necessary to design their own experiments on optimal cue integration and enable them to take part in explaining when, why and how humans combine multisensory information optimally.

Topics:

Multisensory integration (56%)
78 Citations

SciSpace is a very innovative solution to the formatting problem, and existing providers, such as Mendeley or Word, have not really evolved in recent years.

- Andreas Frutiger, Researcher, ETH Zurich, Institute for Biomedical Engineering

Get MS-Word and LaTeX output to any Journal within seconds
1. Choose a template
Select a template from a library of 40,000+ templates

2. Import a MS Word file or start fresh
It takes only a few seconds to import

3. View and edit your final output
SciSpace will automatically format your output to meet journal guidelines

4. Submit directly or Download
Submit to the journal directly or download in PDF, MS Word or LaTeX

(Before submission check for plagiarism via Turnitin)

Less than 3 minutes

What to expect from SciSpace?

Speed and accuracy over MS Word


With SciSpace, you do not need a word template for Multisensory Research.

It automatically formats your research paper to Brill formatting guidelines and citation style.

You can download a submission-ready research paper in PDF, LaTeX and DOCX formats.

Time comparison: time taken to format a paper, and compliance with guidelines.

Plagiarism Reports via Turnitin

SciSpace has partnered with Turnitin, the leading provider of Plagiarism Check software.

Using this service, researchers can compare submissions against more than 170 million scholarly articles and a database of 70+ billion current and archived web pages.


Freedom from formatting guidelines

One editor, 100K journal formats – world's largest collection of journal templates

With such a huge verified library, what you need is already there.


Easy support from all your favorite tools

The Multisensory Research format uses the plainnat citation style.

Automatically format and order your citations and bibliography in a click.

SciSpace allows imports from all major reference managers, such as Mendeley, Zotero, EndNote, and Google Scholar.

Frequently asked questions

1. Can I write Multisensory Research in LaTeX?

Absolutely! Our tool has been designed to help you focus on writing. You can write your entire paper as per the Multisensory Research guidelines and auto format it.

2. Do you follow the Multisensory Research guidelines?

Yes, the template is compliant with the Multisensory Research guidelines. Our experts at SciSpace ensure that. If there are any changes to the journal's guidelines, we'll change our algorithm accordingly.

3. Can I cite my article in multiple styles in Multisensory Research?

Of course! We support all the top citation styles, such as APA style, MLA style, Vancouver style, Harvard style, and Chicago style. When you write your paper and hit autoformat, our system will automatically update your article as per the Multisensory Research citation style.

4. Can I use the Multisensory Research templates for free?

Sign up for our free trial, and you'll be able to use all our features for seven days. You'll see how helpful they are and how inexpensive they are compared to other options, especially for Multisensory Research.

5. Can I use a manuscript in Multisensory Research that I have written in MS Word?

Yes. You can choose the right template, copy-paste the contents from the Word document, and click on auto-format. Once you're done, you'll have a publish-ready Multisensory Research paper that you can download at the end.

6. How long does it usually take you to format my papers in Multisensory Research?

It only takes a matter of seconds to format your manuscript. Besides that, our intuitive editor saves you the effort of writing and formatting it to the Multisensory Research guidelines manually.

7. Where can I find the template for the Multisensory Research?

It is possible to find the Word template for any journal on Google. However, why use a template when you can write your entire manuscript on SciSpace, auto format it as per Multisensory Research's guidelines, and download the same in Word, PDF and LaTeX formats? Give us a try!

8. Can I reformat my paper to fit the Multisensory Research's guidelines?

Of course! You can do this using our intuitive editor. It's very easy. If you need help, our support team is always ready to assist you.

9. Is Multisensory Research an online tool or is there a desktop version?

SciSpace's Multisensory Research is currently available as an online tool. We're developing a desktop version, too. You can request (or upvote) any features that you think would be helpful for you and other researchers in the "feature request" section of your account once you've signed up with us.

10. I cannot find my template in your gallery. Can you create it for me like Multisensory Research?

Sure. You can request any template and we'll have it set up within a few days. You can find the request box in the Journal Gallery on the right side bar under the heading "Couldn't find the format you were looking for like Multisensory Research?"

11. What is the output that I would get after using Multisensory Research?

After writing your paper and autoformatting it in Multisensory Research, you can download it in multiple formats, viz. PDF, DOCX, and LaTeX.

12. Is Multisensory Research's impact factor high enough that I should try publishing my article there?

To be honest, the answer is no. The impact factor is only one of the many elements that determine the quality of a journal. A few of these factors include the review board, rejection rates, frequency of inclusion in indexes, and the Eigenfactor. You need to assess all these factors before you make your final call.

13. What is Sherpa RoMEO Archiving Policy for Multisensory Research?

SHERPA/RoMEO Database

We extracted this data from Sherpa Romeo to help researchers understand the access level of this journal in accordance with the Sherpa Romeo Archiving Policy for Multisensory Research. The table below indicates the level of access a journal has as per Sherpa Romeo's archiving policy.

RoMEO Colour | Archiving policy
Green | Can archive pre-print and post-print or publisher's version/PDF
Blue | Can archive post-print (i.e., final draft post-refereeing) or publisher's version/PDF
Yellow | Can archive pre-print (i.e., pre-refereeing)
White | Archiving not formally supported
FYI:
  1. Pre-print: the version of the paper before peer review.
  2. Post-print: the version of the paper after peer review, with revisions having been made.

14. What are the most common citation types In Multisensory Research?

The 5 most common citation types, in order of usage, for Multisensory Research are:

S. No. Citation Style Type
1. Author Year
2. Numbered
3. Numbered (Superscripted)
4. Author Year (Cited Pages)
5. Footnote

15. How do I submit my article to the Multisensory Research?

You can write your manuscript on SciSpace, auto format it as per Multisensory Research's guidelines, and then either submit it to the journal directly or download it in Word, PDF or LaTeX format and submit it through the journal's submission system.

16. Can I download Multisensory Research in Endnote format?

Yes, SciSpace provides this functionality. After signing up, you would need to import your existing references from a Word or Bib file to SciSpace. SciSpace will then allow you to download your references in Multisensory Research Endnote style according to Brill guidelines.
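For reference, the bibliography example shown earlier on this page corresponds to a BibTeX entry roughly along these lines; the entry key is hypothetical, and this is a sketch of the plainnat-compatible format rather than an official export:

```bibtex
@article{blonder1982,
  author  = {Blonder, G. E. and Tinkham, M. and Klapwijk, T. M.},
  title   = {Transition from metallic to tunneling regimes in superconducting
             microconstrictions: Excess current, charge imbalance, and
             supercurrent conversion},
  journal = {Phys. Rev. B},
  volume  = {25},
  number  = {7},
  pages   = {4515--4532},
  year    = {1982},
  doi     = {10.1103/PhysRevB.25.4515}
}
```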

Fast and reliable,
built for compliance.

Instant formatting to 100% publisher guidelines on SciSpace.

Available only on desktops 🖥

No word template required

SciSpace automatically formats your research paper to Multisensory Research formatting guidelines and citation style.

Verified journal formats

One editor, 100K journal formats.
With the largest collection of verified journal formats, what you need is already there.

Trusted by academicians

I spent hours with MS word for reformatting. It was frustrating - plain and simple. With SciSpace, I can draft my manuscripts and once it is finished I can just submit. In case, I have to submit to another journal it is really just a button click instead of an afternoon of reformatting.

Andreas Frutiger
Researcher & Ex MS Word user
Use this template