Posted Content · DOI

Micro-Meta App: an interactive software tool to facilitate the collection of microscopy metadata based on community-driven specifications [preprint]

TL;DR: Micro-Meta App is an open-source software tool designed to facilitate the extraction and collection of relevant microscopy metadata, which can be used to document imaging experiments and be shared with the community.
Abstract: For the information content of microscopy images to be appropriately interpreted, reproduced, and meet FAIR (Findable, Accessible, Interoperable and Reusable) principles, they should be accompanied by detailed descriptions of microscope hardware, image acquisition settings, image pixel and dimensional structure, and instrument performance. Nonetheless, the thorough documentation of imaging experiments is significantly impaired by the lack of community-sanctioned, easy-to-use software tools to facilitate the extraction and collection of relevant microscopy metadata. Here we present Micro-Meta App, an intuitive open-source software tool designed to tackle these issues. It was developed in the context of nascent global bioimaging community organizations, including BioImaging North America (BINA) and QUAlity Assessment and REProducibility in Light Microscopy (QUAREP-LiMi), whose goal is to improve reproducibility, data quality and sharing value for imaging experiments. The App provides a user-friendly interface for building comprehensive descriptions of the conditions utilized to produce individual microscopy datasets, as specified by the recently proposed 4DN-BINA-OME tiered system of Microscopy Metadata specifications. To achieve this goal, the App provides a visual guide for a microscope user to: 1) interactively build diagrammatic representations of the hardware configuration of a given microscope, which can be easily reused and shared with colleagues needing to document similar instruments; 2) automatically extract relevant metadata from image files and collect missing image acquisition settings and calibration metrics associated with a given experiment; and 3) output all collected Microscopy Metadata to interoperable files that can be used to document imaging experiments and be shared with the community.
In addition to significantly lowering the burden of quality assurance, the visual nature of Micro-Meta App makes it particularly suited for training users who have limited knowledge of the intricacies of light microscopy experiments. To ensure wide adoption by microscope users with different needs, Micro-Meta App closely interoperates with MethodsJ2 and OMERO.mde, two complementary tools described in parallel manuscripts.



Micro-Meta App: an interactive software tool to facilitate the collection of microscopy metadata based on community-driven specifications
Alex Rigano^1, Shannon Ehmsen^2, Serkan Utku Ozturk^2, Joel Ryan^3, Alexander Balashov^2, Mathias Hammer^4, Koray Kirli^2, Karl Bellve^1, Ulrike Boehm^#5, Claire M. Brown^#3, James J. Chambers^#6, Robert A. Coleman^7, Andrea Cosolo^2, Orestis Faklaris^#8, Kevin Fogarty^1, Thomas Guilbert^9, Anna B. Hamacher^10, Michelle S. Itano^11, Daniel P. Keeley^11, Susanne Kunis^12, Judith Lacoste^#13, Alex Laude^#14, Willa Ma^11, Marco Marcello^15, Paula Montero-Llopis^16, Glyn Nelson^#14, Roland Nitschke^#17, Jaime A. Pimentel^#18, Stefanie Weidtkamp-Peters^10, Peter J. Park^2, Burak Alver^2, David Grunwald^3, and Caterina Strambio-De-Castillia^#1
# Members of the Bioimaging North America Quality Control and Data Management Working Group
Keywords: bioimage informatics, calibration, data formats, data provenance, data standards, image quality, imaging, metadata, microscopy, open microscopy, quality control, reproducibility
Abbreviation list: BINA, BioImaging North America; 4DN, 4D Nucleome; FAIR, Findable Accessible Interoperable and Reusable; OME, Open Microscopy Environment; QUAREP-LiMi, QUAlity Assessment and REProducibility for Instruments and Images in Light Microscopy
1 Program in Molecular Medicine, UMass Medical School, Worcester MA 01605, USA
2 Department of Biomedical Informatics, Harvard Medical School, Boston, MA 02115, USA
3 Advanced BioImaging Facility (ABIF), McGill University, Montreal, Quebec, H3G 0B1, Canada
4 RNA Therapeutics Institute, UMass Medical School, Worcester MA 01605, USA
5 Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA
6 Institute for Applied Life Sciences, University of Massachusetts, Amherst, MA 01003, USA
7 Gruss-Lipper Biophotonics Center, Department of Anatomy and Structural Biology, Albert Einstein College of Medicine, Bronx, NY 10461
8 BCM, Univ. Montpellier, CNRS, INSERM, Montpellier, France
9 Institut Cochin, Inserm U1016-CNRS UMR8104-Université de Paris, 75014 Paris, France
10 Center for Advanced Imaging, Heinrich-Heine University Duesseldorf, 40225 Düsseldorf, Germany
11 UNC Neuroscience Microscopy Core Facility, University of North Carolina, Chapel Hill, NC 27599-7250
12 Department of Biology/Chemistry and Centre for Cellular Nanoanalytics, University Osnabrueck, 49076 Osnabrück, Germany
13 MIA Cellavie Inc., Montreal, Quebec, H1K 4G6, Canada
14 Bioimaging Unit, Newcastle University, Newcastle upon Tyne, NE2 4HH, UK
15 Center for Cell Imaging, University of Liverpool, Liverpool, L69 3BE, UK
16 Microscopy Resources of the North Quad, Harvard Medical School, Boston, MA 02115
17 Life Imaging Center and BIOSS Centre for Biological Signaling Studies, Albert-Ludwigs-University Freiburg, Freiburg, 79104, Germany
18 Instituto de Biotecnología, Universidad Nacional Autónoma de México, Cuernavaca, Morelos, 62210, México
bioRxiv preprint doi: https://doi.org/10.1101/2021.05.31.446382; this version posted June 1, 2021. The copyright holder for this preprint (which was not certified by peer review) is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.

Background
The establishment of community-driven, shared documentation and quality control specifications for light microscopy would allow scientists to appropriately document imaging experiments, minimize errors and quantify any residual uncertainty associated with each step of the procedure (16). In addition to providing essential
information about the provenance (i.e., origin, lineage) (7, 8) of microscopy results, this would make it possible
to faithfully interpret scientific claims, facilitate comparisons within and between experiments, foster
reproducibility, and maximize the likelihood that data can be re-used by other scientists for further insight (5, 6,
9, 10). First and foremost, such information would serve to facilitate the compilation of accurate Methods sections
for publications that utilize the quantitative power of microscopy experiments to answer scientific questions (11–13). Furthermore, it would provide clear guidance to the manufacturers of microscopy instruments, hardware
components, and processing software about what information the scientific community requires to ensure
scientific rigor so that it can be automatically provided during acquisition and written in the headers of image
files. Last but not least, machine-actionable versions of the same information (14) could be provided alongside
image datasets on the growing number of public image data resources (3) that allow the deposition of raw image
data associated with scientific manuscripts, a promise to emulate for light microscopy the successful path that has
led to community standards in the field of genomics (1519) (e.g., the IDR (20), EMPIAR (21), and Bioimage
Archive (22) hosted at the EMBL - EBI; the European Movincell (23); the Japanese SSBD hosted by RIKEN
(24); and, in the USA, the NIH-funded Cell Image Library (25, 26), BRAIN initiative’s imaging resources (27),
the Allen Cell Explorer (28), and the Human Cell Atlas (29–32)).
In order to promote the development of shared community-driven Microscopy Metadata standards, the NIH
funded 4D Nucleome (4DN) (33, 34) and the Chan Zuckerberg Initiative (CZI) funded BioImaging North
America (BINA) Quality Control and Data Management Working Group (QC-DM-WG) (35) have recently
proposed 4DN-BINA-OME (NBO), a tiered system of Microscopy Metadata specifications (36–39). The
4DN-BINA-OME specifications lay the foundations for upcoming community-sanctioned standards being
developed in the context of the Metadata Working Group (WG7) of the QUAlity Assessment and REProducibility
for Instrument and Images in Light Microscopy (QUAREP-LiMi) initiative (quarep.org) (4, 40). Their purpose is
to provide a scalable, interoperable and Open Microscopy Environment (OME) (41–43) Next-Generation File
Format (NGFF) (44) compatible framework for light microscopy metadata guiding scientists as to what
provenance metadata and calibration metrics should be provided to ensure the reproducibility of different
categories of imaging experiments.
Despite their value in indicating a path forward, guidelines, specifications, and standards on their own lack
the one essential feature that would make them actionable by experimental scientists faced with the challenge of
producing well-documented, high-quality, reproducible and re-usable datasets: namely, easy-to-use software tools or, even better, automated pipelines to extract all available metadata from microscope configuration and image
data files.
While some advances have been proposed, such as OMERO.forms (45), PyOmeroUpload (46) and MethodsJ
(5), these tools offer only limited functionality, are not integrated with community standards and are not, per se, future-proof. To provide a way forward, in this and in two related manuscripts, we present a suite of three
interoperable software tools (Supplemental Figure 1) that were developed to provide highly complementary,
intuitive approaches for the bench-side collection of Image Metadata, with particular reference to Experimental
Metadata and Microscopy Metadata (37, 38). In two related manuscripts, we describe: 1) OMERO.mde, which
is highly integrated with the widely used OMERO image data management repository and emphasizes the
development of flexible, nascent specifications for experimental metadata (47–49); and 2) MethodsJ2 (50),
which is designed as an ImageJ plugin and emphasizes the consolidation of metadata from multiple sources and
automatic generation of Methods sections of scientific publications.
In this manuscript, we present Micro-Meta App (Figure 1), which works both as a stand-alone app on the
user’s desktop and as an integrated resource in third party web data portals. It offers a visual guide to navigate
through the different steps required for the rigorous documentation of imaging experiments (Figures 2-4) as
sanctioned by community specifications such as the 4DN-BINA-OME (NBO) Microscopy Metadata
specifications that were recently put forth to extend the OME Data Model (36–38, 51).
Methods: Implementation and Availability
Micro-Meta App is available in two JavaScript (JS) implementations. The first was designed to facilitate the
incorporation of the software in existing third party web portals (i.e., the 4DN Data Portal) (34, 52) and was
developed using the JavaScript React library, which is widely used to build web-based user interfaces. Starting
from this version, a stand-alone version of the App was developed by wrapping the React implementation using
the JavaScript Electron library, with the specific purpose of lowering the barrier to adoption of the tool by labs that do not have access to, or prefer not to use, imaging databases. More details about the implementation of Micro-Meta App are available in Supplemental Material.
In order to promote the adoption of Micro-Meta App, incorporation in third party data portals and re-use of the
source code by other developers, the executables and source code for both the JavaScript React and Electron implementations of Micro-Meta App are available on GitHub (53, 54). In addition, a website describing Micro-Meta App (55) was developed alongside complete documentation and tutorials (56).
Results + Discussion
Micro-Meta App: an intuitive, highly visual interface to facilitate microscopy metadata collection
While the establishment of data formats, metadata standards and QC procedures is important, it is not per se
sufficient to make sure reporting and data quality guidelines are adopted by the community. To ensure their
routine utilization, it is, therefore, necessary to produce software tools that expedite QC procedures and image
data documentation and make it straightforward for investigators to reproduce results and make decisions
regarding the utility of specific datasets for addressing their specific questions. However, despite the availability
of the OME Data Model and Bioformats (41, 43), the lack of standards has resulted in limited adoption of minimal-information criteria; as a result, the metadata provided by instrument and software manufacturers is scarce (Supplemental Tables I and II).
Micro-Meta App was developed to address this unmet need. It is a Graphical User Interface (GUI)-based, open-source and interoperable software tool designed to facilitate and (when possible) automate the
annotation of fluorescence microscopy datasets. The App provides an interactive approach to navigate through
the different steps required for documenting light microscopy experiments based on available OME-compatible
community-sanctioned tiered systems of specifications. Thus, Micro-Meta App is capable of adapting not only to varying levels of imaging complexity and user experience but also to evolving data models that might emerge
from the community. At the time of writing, the App implements the Core of the OME Data Model and the tiered
4DN-BINA-OME Basic extension (36–38, 51). Efforts to implement the current Confocal and Advanced as well
as the Calibration and Performance 4DN-BINA-OME extensions are underway (see Future Directions). To
achieve this goal, Micro-Meta App is organized around two highly related data processing flows (Figure 1):
1) In the Manage Instrument modality, the App guides the users through the steps required to interactively build
a diagrammatic representation of a given Microscope (Figures 2A and 3) by dragging-and-dropping individual
components onto the workspace and entering the relevant attribute values based on the tier-level that best
suits the microscope modality, experimental design, instrument complexity, and image analysis needs (38).
2) From this, Micro-Meta App automatically generates structured descriptions of the microscope Hardware
Specifications and outputs them as interoperable Microscope.JSON files (example available on Zenodo as
illustrated in Supplemental Material) (57) that can be saved locally, used by existing third-party web-portals
(52), integrated with other software tools (MethodsJ2) and shared with other scientists, thus significantly
lowering the manual work required for rigorous record-keeping and enabling rapid uptake and widespread
implementation.
3) When the user is ready to collect metadata to document the acquisition of specific image data sets, the Manage
Settings section of the App automatically imports Hardware Specifications metadata from previously-
prepared Microscope.JSON files and uses the BioFormats library (43) to extract available, OME-compatible
metadata from an image data file of interest. From this basis, the App interactively guides the user to enter
missing metadata specifying the tier-appropriate Settings used for a specific Image Acquisition session
(Figures 2B and 4).
4) As a final step, the App generates interoperable paired Microscope- and Settings- JSON files (example
available on Zenodo as illustrated in Supplemental Material) (57) that together contain comprehensive
documentation of the conditions utilized to produce individual microscopy datasets and can be stored locally
or integrated by third-party data portals (i.e., the 4D Nucleome Data Portal) (58).
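In outline, the paired output might look like the following (a purely illustrative sketch: the actual structure is defined by the 4DN-BINA-OME JSON schemas, and every field name and value shown here is invented):

```json
{
  "Microscope": {
    "Name": "Widefield scope, Room 101",
    "Tier": 2,
    "Stand": { "Manufacturer": "ExampleCorp", "Model": "EC-1000" },
    "LightSources": [ { "Category": "Laser", "Wavelength_nm": 488 } ],
    "Objectives": [ { "Magnification": 60, "NumericalAperture": 1.49, "Immersion": "Oil" } ]
  },
  "AcquisitionSettings": {
    "ImageFile": "experiment_042.ome.tif",
    "ObjectiveUsed": "60x/1.49 Oil",
    "Channels": [ { "Excitation_nm": 488, "ExposureTime_ms": 100 } ]
  }
}
```

In practice the App writes the Microscope and Settings descriptions as two separate JSON files, so that a single Microscope file can be reused across many acquisition sessions.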
Depending on the specific implementation of the Micro-Meta App being used (see Implementation section), the
workflow varies slightly. The discussion below refers specifically to the stand-alone version of Micro-Meta App
implemented in JavaScript Electron.
Manage Instrument Hardware
The purpose of this section of Micro-Meta App is to guide microscope users and custodians in the creation of
accurate but at the same time intuitive and easy-to-generate visual depictions of a given microscope. This is done
while collecting relevant information for each hardware component that scales with experimental intent,
instrument complexity and analytical needs of individual imaging experiments depending on tier-levels
sanctioned by the 4DN-BINA-OME Microscopy Metadata specifications (36–38, 51). Specifically, the workflow
(Figure 3) is composed of the following steps:
1) After launching the application, the user selects an appropriate Tier to be used (Figure 3A) to document a
given imaging experiment as determined by following the 4DN-BINA-OME tiered specifications (36–38,
51) and launches the Manage Instrument modality of Micro-Meta App by clicking the appropriate button
(Figure 3B). Because Micro-Meta App was specifically designed to be tier-aware, it automatically displays only the metadata fields that 4DN-BINA-OME specifies as belonging to the tier that
was selected upon launching the App (Figure 3A), thus massively reducing the documentation burden. In
addition, to increase flexibility, the tier-level utilized for validation can be modified dynamically after
opening the main Manage Instrument workspace. This way, the user can, for example, be presented with all
Tier 2 appropriate fields while being required to only fill in Tier 1 fields for validation (see also point 3 ii).
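The tier-aware behavior described above amounts to filtering the metadata schema by tier level, with separate display and validation tiers. A minimal sketch in plain JavaScript (the field names and tier assignments are invented for illustration and are not taken from the actual 4DN-BINA-OME schema):

```javascript
// Each metadata field in this (hypothetical) schema fragment carries the
// lowest tier at which it becomes relevant.
const objectiveFields = [
  { name: "Magnification", tier: 1 },
  { name: "NumericalAperture", tier: 1 },
  { name: "Immersion", tier: 2 },
  { name: "WorkingDistance", tier: 3 },
];

// Show every field at or below the display tier, but mark as required
// only those at or below the (possibly lower) validation tier.
function fieldsForTiers(fields, displayTier, validationTier) {
  return fields
    .filter((f) => f.tier <= displayTier)
    .map((f) => ({ ...f, required: f.tier <= validationTier }));
}

// Example: present all Tier 2 fields while validating only Tier 1.
const shown = fieldsForTiers(objectiveFields, 2, 1);
console.log(shown.map((f) => `${f.name}${f.required ? "*" : ""}`).join(", "));
// Magnification*, NumericalAperture*, Immersion
```

Keeping the display and validation tiers independent mirrors the App's ability to show a richer form than it strictly enforces, which eases gradual adoption.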

Citations
Posted Content · DOI
26 Apr 2021-bioRxiv
TL;DR: The 4D Nucleome Initiative (4DN) and BioImaging North America (BINA) have proposed the 4DN-BINA-OME (NBO namespace) metadata specifications for light microscopy data, which scale with experimental intent and with the complexity of the instrumentation and analytical requirements.
Abstract: 1 - ABSTRACT Digital light microscopy provides powerful tools for quantitatively probing the real-time dynamics of subcellular structures. While the power of modern microscopy techniques is undeniable, rigorous record-keeping and quality control are required to ensure that imaging data may be properly interpreted (quality), reproduced (reproducibility), and used to extract reliable information and scientific knowledge which can be shared for further analysis (value). Keeping notes on microscopy experiments and quality control procedures ought to be straightforward, as the microscope is a machine whose components are defined and the performance measurable. Nevertheless, to this date, no universally adopted community-driven specifications exist that delineate the required information about the microscope hardware and acquisition settings (i.e., microscopy “data provenance” metadata) and the minimally accepted calibration metrics (i.e., microscopy quality control metadata) that should be automatically recorded by both commercial microscope manufacturers and customized microscope developers. In the absence of agreed guidelines, it is inherently difficult for scientists to create comprehensive records of imaging experiments and ensure the quality of resulting image data or for manufacturers to incorporate standardized reporting and performance metrics. To add to the confusion, microscopy experiments vary greatly in aim and complexity, ranging from purely descriptive work to complex, quantitative and even sub-resolution studies that require more detailed reporting and quality control measures. 
To solve this problem, the 4D Nucleome Initiative (4DN) (1, 2) Imaging Standards Working Group (IWG), working in conjunction with the BioImaging North America (BINA) Quality Control and Data Management Working Group (QC-DM-WG) (3), here propose light Microscopy Metadata specifications that scale with experimental intent and with the complexity of the instrumentation and analytical requirements. They consist of a revision of the Core of the Open Microscopy Environment (OME) Data Model, which forms the basis for the widely adopted Bio-Formats library (4–6), accompanied by a suite of three extensions, each with three tiers, allowing the classification of imaging experiments into levels of increasing imaging and analytical complexity (7, 8). Hence these specifications not only provide an OME-based comprehensive set of metadata elements that should be recorded, but they also specify which subset of the full list should be recorded for a given experimental tier. In order to evaluate the extent of community interest, an extensive outreach effort was conducted to present the proposed metadata specifications to members of several core-facilities and international bioimaging initiatives including the European Light Microscopy Initiative (ELMI), Global BioImaging (GBI), and European Molecular Biology Laboratory (EMBL) - European Bioinformatics Institute (EBI). Consequently, close ties were established between our endeavour and the undertakings of the recently established QUAlity Assessment and REProducibility for Instruments and Images in Light Microscopy global community initiative (9). 
As a result, this flexible 4DN-BINA-OME (NBO namespace) framework (7, 8) represents a turning point toward achieving community-driven Microscopy Metadata standards that will increase data fidelity, improve repeatability and reproducibility, ease future analysis, and facilitate the verifiable comparison of different datasets, experimental setups, and assays, while also demonstrating a method for future extensions. Such universally accepted microscopy standards would serve a purpose similar to that of the ENCODE guidelines successfully adopted by the genomics community (10, 11). The intention of this proposal is therefore to encourage participation, critiques, and contributions from the entire imaging community and all stakeholders, including research and imaging scientists, facility personnel, instrument manufacturers, software developers, standards organizations, scientific publishers, and funders.
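The tiered idea described above — each tier adds required metadata fields on top of the tiers below it — can be illustrated with a small sketch. The tier contents and field names below are invented simplifications for illustration, not the actual 4DN-BINA-OME schema:

```python
# Illustrative sketch of a tiered metadata checklist.
# Tier contents and field names are hypothetical simplifications,
# NOT the actual 4DN-BINA-OME specification.

# Higher tiers require everything from the lower tiers plus more detail.
REQUIRED_FIELDS_BY_TIER = {
    1: ["microscope_manufacturer", "objective_magnification", "pixel_size_um"],
    2: ["objective_na", "detector_model", "excitation_wavelength_nm"],
    3: ["psf_measurement", "illumination_uniformity", "stage_drift_nm"],
}

def missing_fields(record: dict, tier: int) -> list:
    """Return the fields a metadata record lacks for a given tier,
    including all fields required by the lower tiers."""
    required = [f for t in range(1, tier + 1)
                for f in REQUIRED_FIELDS_BY_TIER[t]]
    return [f for f in required if f not in record]

record = {
    "microscope_manufacturer": "ExampleCorp",
    "objective_magnification": 63,
    "pixel_size_um": 0.1,
    "objective_na": 1.4,
}
print(missing_fields(record, 1))  # []
print(missing_fields(record, 2))  # ['detector_model', 'excitation_wavelength_nm']
```

The cumulative lookup is the point: a record valid at Tier 2 is, by construction, also valid at Tier 1, mirroring how the specifications classify experiments into levels of increasing complexity.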

12 citations

Journal ArticleDOI
TL;DR: This review aims to summarize the key points that need to be considered when setting up and analyzing a live-cell imaging experiment and puts a particular focus on yeast, but many of the concepts discussed are applicable also to other organisms.
Abstract: Live-cell microscopy is a powerful tool that can reveal cellular behavior as well as the underlying molecular processes. A key advantage of microscopy is that by visualizing biological processes, it can provide direct insights. Nevertheless, live-cell imaging can be technically challenging and prone to artifacts. For a successful experiment, many careful decisions are required at all steps from hardware selection to downstream image analysis. Facing these questions can be particularly intimidating due to the requirement for expertise in multiple disciplines, ranging from optics, biophysics, and programming to cell biology. In this review, we aim to summarize the key points that need to be considered when setting up and analyzing a live-cell imaging experiment. While we put a particular focus on yeast, many of the concepts discussed are applicable also to other organisms. In addition, we discuss reporting and data sharing strategies that we think are critical to improve reproducibility in the field.

10 citations

Posted ContentDOI
24 Jun 2021-bioRxiv
TL;DR: MethodsJ2, as discussed by the authors, is an ImageJ/Fiji-based software tool that gathers metadata and automatically generates text for the methods section of publications, making it easier to reproduce microscopy experiments, interpret results, and share images.
Abstract: Proper reporting of metadata is essential to reproduce microscopy experiments, interpret results, and share images. Experimental scientists can report details about sample preparation and imaging conditions, while imaging scientists have the expertise required to collect and report the image acquisition, hardware, and software metadata. MethodsJ2 is an ImageJ/Fiji-based software tool that gathers metadata and automatically generates text for the methods section of publications.
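The core idea — filling a methods-section template from structured acquisition metadata — can be sketched in a few lines. This is an illustrative Python mock-up, not the actual ImageJ/Fiji implementation of MethodsJ2, and the field names are invented:

```python
# Sketch of the idea behind metadata-to-methods-text generation.
# Illustrative only: not the MethodsJ2 implementation, and the
# metadata field names below are invented for this example.

TEMPLATE = (
    "Images were acquired on a {microscope} using a "
    "{magnification}x/{na} NA objective, with a pixel size of "
    "{pixel_size_um} um and an exposure time of {exposure_ms} ms."
)

REQUIRED = ("microscope", "magnification", "na", "pixel_size_um", "exposure_ms")

def methods_text(meta: dict) -> str:
    """Fill the template, refusing to emit text if metadata is missing."""
    missing = [k for k in REQUIRED if k not in meta]
    if missing:
        raise ValueError("missing metadata: %s" % missing)
    return TEMPLATE.format(**meta)

print(methods_text({
    "microscope": "ExampleScope X1",
    "magnification": 100,
    "na": 1.45,
    "pixel_size_um": 0.065,
    "exposure_ms": 50,
}))
```

Failing loudly on missing fields, rather than emitting a sentence with gaps, is what makes such a tool useful for reproducibility: the author learns which metadata were never recorded.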

1 citation

References
Journal ArticleDOI
TL;DR: The ultimate goal of this work is to establish a standard for recording and reporting microarray-based gene expression data, which will in turn facilitate the establishment of databases and public repositories and enable the development of data analysis tools.
Abstract: Microarray analysis has become a widely used tool for the generation of gene expression data on a genomic scale. Although many significant results have been derived from microarray studies, one limitation has been the lack of standards for presenting and exchanging such data. Here we present a proposal, the Minimum Information About a Microarray Experiment (MIAME), that describes the minimum information required to ensure that microarray data can be easily interpreted and that results derived from its analysis can be independently verified. The ultimate goal of this work is to establish a standard for recording and reporting microarray-based gene expression data, which will in turn facilitate the establishment of databases and public repositories and enable the development of data analysis tools. With respect to MIAME, we concentrate on defining the content and structure of the necessary information rather than the technical format for capturing it.
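A minimum-information standard like MIAME is, operationally, a completeness checklist that repositories can verify. The sketch below paraphrases the six MIAME components (consult the specification itself for the authoritative definitions and structure):

```python
# Toy completeness check in the spirit of minimum-information
# checklists such as MIAME. Component names paraphrase the six MIAME
# parts; the real specification defines their content in detail.

MIAME_COMPONENTS = [
    "raw_data",
    "processed_data",
    "sample_annotation",
    "experimental_design",
    "array_annotation",
    "protocols",
]

def miame_report(submission: dict) -> dict:
    """Map each checklist component to whether the submission provides it."""
    return {c: c in submission for c in MIAME_COMPONENTS}

report = miame_report({"raw_data": "...", "sample_annotation": "..."})
print([c for c, ok in report.items() if not ok])
# ['processed_data', 'experimental_design', 'array_annotation', 'protocols']
```

MIAME deliberately specifies content rather than format, which is why a check like this operates on named components instead of a fixed file layout.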

4,030 citations

Journal ArticleDOI
Aviv Regev, Sarah A. Teichmann, Eric S. Lander, Ido Amit, Christophe Benoist, Ewan Birney, Bernd Bodenmiller, Peter J. Campbell, Piero Carninci, Menna R. Clatworthy, Hans Clevers, Bart Deplancke, Ian Dunham, James Eberwine, Roland Eils, Wolfgang Enard, Andrew Farmer, Lars Fugger, Berthold Göttgens, Nir Hacohen, Muzlifah Haniffa, Martin Hemberg, Seung K. Kim, Paul Klenerman, Arnold R. Kriegstein, Ed S. Lein, Sten Linnarsson, Emma Lundberg, Joakim Lundeberg, Partha P. Majumder, John C. Marioni, Miriam Merad, Musa M. Mhlanga, Martijn C. Nawijn, Mihai G. Netea, Garry P. Nolan, Dana Pe'er, Anthony Phillipakis, Chris P. Ponting, Stephen R. Quake, Wolf Reik, Orit Rozenblatt-Rosen, Joshua R. Sanes, Rahul Satija, Ton N. Schumacher, Alex K. Shalek, Ehud Shapiro, Padmanee Sharma, Jay W. Shin, Oliver Stegle, Michael R. Stratton, Michael J. T. Stubbington, Fabian J. Theis, Matthias Uhlen, Alexander van Oudenaarden, Allon Wagner, Fiona M. Watt, Jonathan S. Weissman, Barbara J. Wold, Ramnik J. Xavier, Nir Yosef, Human Cell Atlas Meeting Participants
05 Dec 2017-eLife
TL;DR: An open comprehensive reference map of the molecular state of cells in healthy human tissues would propel the systematic study of physiological states, developmental trajectories, regulatory circuitry and interactions of cells, and also provide a framework for understanding cellular dysregulation in human disease.
Abstract: The recent advent of methods for high-throughput single-cell molecular profiling has catalyzed a growing sense in the scientific community that the time is ripe to complete the 150-year-old effort to identify all cell types in the human body. The Human Cell Atlas Project is an international collaborative effort that aims to define all human cell types in terms of distinctive molecular profiles (such as gene expression profiles) and to connect this information with classical cellular descriptions (such as location and morphology). An open comprehensive reference map of the molecular state of cells in healthy human tissues would propel the systematic study of physiological states, developmental trajectories, regulatory circuitry and interactions of cells, and also provide a framework for understanding cellular dysregulation in human disease. Here we describe the idea, its potential utility, early proofs-of-concept, and some design considerations for the Human Cell Atlas, including a commitment to open data, code, and community.

1,391 citations

Journal ArticleDOI
TL;DR: An open standard format for multidimensional microscopy image data is described and it is called on the community to use open image data standards and to insist that all imaging platforms support these file formats.
Abstract: Data sharing is important in the biological sciences to prevent duplication of effort, to promote scientific integrity, and to facilitate and disseminate scientific discovery. Sharing requires centralized repositories, and submission to and utility of these resources require common data formats. This is particularly challenging for multidimensional microscopy image data, which are acquired from a variety of platforms with a myriad of proprietary file formats (PFFs). In this paper, we describe an open standard format that we have developed for microscopy image data. We call on the community to use open image data standards and to insist that all imaging platforms support these file formats. This will build the foundation for an open image data repository.
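The open format the abstract calls for stores image metadata as structured XML alongside the pixel data. The fragment below sketches this shape using the real 2016-06 OME namespace and a few real `Pixels` attributes, built with Python's standard library; it is deliberately abbreviated and omits many elements and attributes the full OME schema mandates:

```python
# Build a minimal, simplified OME-XML-style fragment with the standard
# library. Uses the real 2016-06 OME namespace, but this is a sketch:
# the full schema requires many more elements and attributes.
import xml.etree.ElementTree as ET

OME_NS = "http://www.openmicroscopy.org/Schemas/OME/2016-06"
ET.register_namespace("", OME_NS)

ome = ET.Element(f"{{{OME_NS}}}OME")
image = ET.SubElement(ome, f"{{{OME_NS}}}Image",
                      {"ID": "Image:0", "Name": "demo"})
ET.SubElement(image, f"{{{OME_NS}}}Pixels", {
    "ID": "Pixels:0",
    "DimensionOrder": "XYCZT",
    "Type": "uint16",
    "SizeX": "512", "SizeY": "512",
    "SizeC": "2", "SizeZ": "1", "SizeT": "10",
})

xml_text = ET.tostring(ome, encoding="unicode")
print(xml_text)
```

Because the dimensional structure travels with the pixels in a documented schema, any reader that understands the namespace can interpret the data without the proprietary software that produced it, which is the interoperability argument the paper makes.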

818 citations

Journal ArticleDOI
TL;DR: These evaluations show that, unlike previous bio-inspired models, the latest DNNs rival the representational performance of IT cortex on this visual object recognition task and propose an extension of “kernel analysis” that measures the generalization accuracy as a function of representational complexity.
Abstract: The primate visual system achieves remarkable visual object recognition performance even in brief presentations, and under changes to object exemplar, geometric transformations, and background variation (a.k.a. core visual object recognition). This remarkable performance is mediated by the representation formed in inferior temporal (IT) cortex. In parallel, recent advances in machine learning have led to ever higher performing models of object recognition using artificial deep neural networks (DNNs). It remains unclear, however, whether the representational performance of DNNs rivals that of the brain. To accurately produce such a comparison, a major difficulty has been a unifying metric that accounts for experimental limitations, such as the amount of noise, the number of neural recording sites, and the number of trials, and computational limitations, such as the complexity of the decoding classifier and the number of classifier training examples. In this work, we perform a direct comparison that corrects for these experimental limitations and computational considerations. As part of our methodology, we propose an extension of “kernel analysis” that measures the generalization accuracy as a function of representational complexity. Our evaluations show that, unlike previous bio-inspired models, the latest DNNs rival the representational performance of IT cortex on this visual object recognition task. Furthermore, we show that models that perform well on measures of representational performance also perform well on measures of representational similarity to IT, and on measures of predicting individual IT multi-unit responses. Whether these DNNs rely on computational mechanisms similar to the primate visual system is yet to be determined, but, unlike all previous bio-inspired models, that possibility cannot be ruled out merely on representational performance grounds.

773 citations

Journal ArticleDOI
04 Jun 2020-Nature
TL;DR: The results obtained by seventy different teams analysing the same functional magnetic resonance imaging dataset show substantial variation, highlighting the influence of analytical choices and the importance of sharing workflows publicly and performing multiple analyses.
Abstract: Data analysis workflows in many scientific domains have become increasingly complex and flexible. Here we assess the effect of this flexibility on the results of functional magnetic resonance imaging by asking 70 independent teams to analyse the same dataset, testing the same 9 ex-ante hypotheses1. The flexibility of analytical approaches is exemplified by the fact that no two teams chose identical workflows to analyse the data. This flexibility resulted in sizeable variation in the results of hypothesis tests, even for teams whose statistical maps were highly correlated at intermediate stages of the analysis pipeline. Variation in reported results was related to several aspects of analysis methodology. Notably, a meta-analytical approach that aggregated information across teams yielded a significant consensus in activated regions. Furthermore, prediction markets of researchers in the field revealed an overestimation of the likelihood of significant findings, even by researchers with direct knowledge of the dataset2-5. Our findings show that analytical flexibility can have substantial effects on scientific conclusions, and identify factors that may be related to variability in the analysis of functional magnetic resonance imaging. The results emphasize the importance of validating and sharing complex analysis workflows, and demonstrate the need for performing and reporting multiple analyses of the same data. Potential approaches that could be used to mitigate issues related to analytical variability are discussed.

551 citations
