- Source: Neuroinformatics
Neuroinformatics is the emergent field that combines informatics and neuroscience. It deals with neuroscience data and with information processing by artificial neural networks. There are three main directions in which neuroinformatics is applied:
the development of computational models of the nervous system and neural processes;
the development of tools for analyzing and modeling neuroscience data; and
the development of tools and databases for management and sharing of neuroscience data at all levels of analysis.
Neuroinformatics encompasses philosophy (computational theory of mind), psychology (information processing theory), and computer science (natural computing, bio-inspired computing), among other disciplines. Because it deals with information rather than matter or energy, neuroinformatics can be seen as the branch of neurobiology that studies the informational aspects of nervous systems. The term neuroinformatics seems to be used synonymously with cognitive informatics, described by the Journal of Biomedical Informatics as an interdisciplinary domain that focuses on human information-processing mechanisms and processes within the context of computing and computing applications. According to the German National Library, neuroinformatics is synonymous with neurocomputing. The Proceedings of the 10th IEEE International Conference on Cognitive Informatics and Cognitive Computing introduced the following description: cognitive informatics (CI) is a transdisciplinary enquiry of computer science, information sciences, cognitive science, and intelligence science; CI investigates the internal information-processing mechanisms and processes of the brain and natural intelligence, as well as their engineering applications in cognitive computing. According to the INCF, neuroinformatics is a research field devoted to the development of neuroscience data and knowledge bases together with computational models.
Neuroinformatics in neuropsychology and neurobiology
= Models of neural computation =
Models of neural computation are attempts to elucidate, in an abstract and mathematical fashion, the core principles that underlie information processing in biological nervous systems, or functional components thereof. Due to the complexity of nervous system behavior, the associated experimental error bounds are ill-defined, but the relative merit of the different models of a particular subsystem can be compared according to how closely they reproduce real-world behaviors or respond to specific input signals. In the closely related field of computational neuroethology, the practice is to include the environment in the model in such a way that the loop is closed. In the cases where competing models are unavailable, or where only gross responses have been measured or quantified, a clearly formulated model can guide the scientist in designing experiments to probe biochemical mechanisms or network connectivity.
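As an illustration of this kind of model, a minimal leaky integrate-and-fire neuron can be sketched in a few lines. The parameter values below are generic textbook-style choices, not fitted to any particular preparation discussed here:

```python
import numpy as np

def simulate_lif(input_current, dt=1e-4, tau=0.02, v_rest=-0.070,
                 v_thresh=-0.050, v_reset=-0.070, resistance=1e8):
    """Euler-integrate dV/dt = (-(V - v_rest) + R*I) / tau; spike and
    reset whenever V crosses v_thresh. Returns the membrane-potential
    trace and the spike times (in seconds)."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        v += (-(v - v_rest) + resistance * i_in) * (dt / tau)
        if v >= v_thresh:            # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset              # reset after the spike
        trace.append(v)
    return np.array(trace), spikes

# A constant 0.3 nA drive for 200 ms produces regular spiking.
current = np.full(2000, 0.3e-9)
trace, spikes = simulate_lif(current)
```

Comparing how well such a model's spike trains match recorded ones is exactly the kind of relative-merit evaluation described above.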
Neurocomputing technologies
= Artificial neural networks =
Artificial neural networks (ANNs), usually simply called neural networks (NNs), are computing systems vaguely inspired by the biological neural networks that constitute animal brains. An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain. Each connection, like the synapses in a biological brain, can transmit a signal to other neurons. An artificial neuron that receives a signal then processes it and can signal neurons connected to it. The "signal" at a connection is a real number, and the output of each neuron is computed by some non-linear function of the sum of its inputs. The connections are called edges. Neurons and edges typically have a weight that adjusts as learning proceeds. The weight increases or decreases the strength of the signal at a connection. Neurons may have a threshold such that a signal is sent only if the aggregate signal crosses that threshold. Typically, neurons are aggregated into layers. Different layers may perform different transformations on their inputs. Signals travel from the first layer (the input layer), to the last layer (the output layer), possibly after traversing the layers multiple times.
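The mechanics just described can be sketched directly: real-valued signals, a weight on every edge, and a non-linear function applied to each neuron's weighted input sum. The weights below are arbitrary random values rather than trained ones, and the two-layer shape is chosen only for illustration:

```python
import numpy as np

def sigmoid(x):
    """A common non-linear activation; squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, layers):
    """Propagate an input vector through a list of (weights, biases) layers.
    Each neuron outputs a non-linear function of its weighted input sum."""
    signal = x
    for weights, biases in layers:
        signal = sigmoid(weights @ signal + biases)
    return signal

rng = np.random.default_rng(0)
layers = [
    (rng.normal(size=(4, 3)), np.zeros(4)),  # input layer -> hidden layer
    (rng.normal(size=(2, 4)), np.zeros(2)),  # hidden layer -> output layer
]
output = forward(np.array([0.5, -1.0, 2.0]), layers)  # two signals in (0, 1)
```

Learning would then consist of adjusting the weight matrices so that the output layer's signals approach some desired target.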
= Brain emulation and mind uploading =
Brain emulation is the concept of creating a functioning computational model and emulation of a brain or part of a brain. In December 2006, the Blue Brain project completed a simulation of a rat's neocortical column. The neocortical column is considered the smallest functional unit of the neocortex, the part of the brain thought to be responsible for higher-order functions such as conscious thought; in the rat brain it contains about 10,000 neurons and 10⁸ synapses. In November 2007, the project reported the end of its first phase, delivering a data-driven process for creating, validating, and researching the neocortical column. In 2007, an artificial neural network described as being "as big and as complex as half of a mouse brain" was run on an IBM Blue Gene supercomputer by a University of Nevada research team. Each second of simulated time took ten seconds of computer time. The researchers claimed to observe "biologically consistent" nerve impulses flowing through the virtual cortex. However, the simulation lacked the structures seen in real mouse brains, and the team intends to improve the accuracy of its neuron and synapse models.
Mind uploading is the process of scanning the physical structure of a brain accurately enough to create an emulation of its mental state (including long-term memory and "self") and copying it to a computer in digital form. The computer would then run a simulation of the brain's information processing, such that it would respond in essentially the same way as the original brain and experience having a sentient conscious mind. Substantial mainstream research in related areas is being conducted in animal brain mapping and simulation, the development of faster supercomputers, virtual reality, brain–computer interfaces, connectomics, and information extraction from dynamically functioning brains. According to supporters, many of the tools and ideas needed to achieve mind uploading already exist or are under active development; they concede that others remain highly speculative, but argue that these are still within the realm of engineering possibility.
= Brain–computer interface =
Research on brain–computer interfaces began in the 1970s at the University of California, Los Angeles, under a grant from the National Science Foundation, followed by a contract from DARPA. The papers published after this research also mark the first appearance of the expression "brain–computer interface" in the scientific literature. More recently, studies in human–computer interaction applying machine learning to statistical temporal features extracted from frontal-lobe EEG brainwave data have shown high levels of success in classifying mental states (relaxed, neutral, concentrating), mental emotional states (negative, neutral, positive), and thalamocortical dysrhythmia.
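The shape of such a pipeline can be sketched as follows. The synthetic "EEG" signals, the two state labels, and the nearest-centroid classifier are illustrative stand-ins for the published methods, not reconstructions of them:

```python
import numpy as np

rng = np.random.default_rng(1)

def features(window):
    """Statistical temporal features of one signal window:
    mean, standard deviation, and peak-to-peak amplitude."""
    return np.array([window.mean(), window.std(), np.ptp(window)])

def make_windows(amplitude, n=50, length=256):
    """Synthetic stand-in signals whose variance depends on mental state."""
    return amplitude * rng.normal(size=(n, length))

# Hypothetical states: higher-variance activity while concentrating.
states = {"relaxed": 1.0, "concentrating": 3.0}
train = {s: np.array([features(w) for w in make_windows(a)])
         for s, a in states.items()}
centroids = {s: f.mean(axis=0) for s, f in train.items()}

def classify(window):
    """Assign the state whose feature centroid is nearest."""
    f = features(window)
    return min(centroids, key=lambda s: np.linalg.norm(f - centroids[s]))

predicted = classify(make_windows(3.0, n=1)[0])  # a high-variance window
```

Real EEG pipelines extract many more features per window and use stronger classifiers, but the structure (windowing, feature extraction, supervised classification) is the same.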
= Neuroengineering & Neuroinformatics =
Neuroinformatics is the scientific study of information flow and processing in the nervous system. In the clinical and neuroengineering context, there are three main directions in which neuroinformatics is applied:
the development of computational models of the nervous system and neural processes,
the development of tools for analyzing data from neurological diagnostic devices, and
the development of tools and databases for the management and sharing of patients' brain data in healthcare institutions.
Auxiliary sciences of neuroinformatics
= Data analysis and knowledge organisation =
Neuroinformatics (in the context of library science) is also devoted to the development of neurobiology knowledge through computational models and analytical tools for the sharing, integration, and analysis of experimental data, and to the advancement of theories about nervous system function. In the INCF context, the field refers to scientific information about primary experimental data, ontologies, metadata, analytical tools, and computational models of the nervous system. The primary data cover experiments and experimental conditions at the genomic, molecular, structural, cellular, network, systems, and behavioural levels, in all species and preparations, in both normal and disordered states. In the past decade, as vast amounts of diverse data about the brain were gathered by many research groups, the problem arose of how to integrate the data from thousands of publications into efficient tools for further research. Biological and neuroscience data are highly interconnected and complex, and integration itself represents a great challenge for scientists.
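A minimal sketch of the integration problem: two hypothetical datasets from different labs are joined on a shared ontology identifier so that they can be queried together. All identifiers and measurements below are invented for illustration:

```python
# Lab A: firing rates keyed by a (hypothetical) neuron-type ontology ID.
electrophysiology = {
    "NEURON:001": {"mean_rate_hz": 12.5},
    "NEURON:002": {"mean_rate_hz": 4.1},
}
# Lab B: marker genes keyed by the same ontology.
gene_expression = {
    "NEURON:001": {"marker_gene": "PVALB"},
    "NEURON:003": {"marker_gene": "SST"},
}

def integrate(*datasets):
    """Outer-join records that share an ontology ID across datasets."""
    merged = {}
    for dataset in datasets:
        for ontology_id, record in dataset.items():
            merged.setdefault(ontology_id, {}).update(record)
    return merged

combined = integrate(electrophysiology, gene_expression)
# combined["NEURON:001"] now holds both the firing rate and the marker gene.
```

In practice the hard part is exactly what this sketch assumes away: agreeing on shared ontologies and metadata so that records from different labs key to the same identifiers at all.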
History
The United States National Institute of Mental Health (NIMH), the National Institute on Drug Abuse (NIDA), and the National Science Foundation (NSF) provided the National Academy of Sciences Institute of Medicine with funds to undertake a careful analysis and study of the need to introduce computational techniques into brain research. The positive recommendations were reported in 1991, enabling NIMH, then directed by Allan Leshner, to create the "Human Brain Project" (HBP), with the first grants awarded in 1993. Next, Stephen Koslow pursued the globalization of the HBP and of neuroinformatics through the European Union and the Organisation for Economic Co-operation and Development (OECD) in Paris, France. Two particular opportunities occurred in 1996.
The first was the existence of the US/European Commission Biotechnology Task Force, co-chaired by Mary Clutter of NSF. Within the mandate of this committee, of which Koslow was a member, the United States–European Commission Committee on Neuroinformatics was established, co-chaired by Koslow on the United States side. This committee led the European Commission to begin supporting neuroinformatics in Framework 5, and it has continued to support neuroinformatics research and training activities.
A second opportunity for the globalization of neuroinformatics occurred when the participating governments of the Mega Science Forum (MSF) of the OECD were asked whether they had any new scientific initiatives to bring forward for global scientific cooperation. The White House Office of Science and Technology Policy requested that federal agencies meet at the NIH to decide whether cooperation of global benefit was needed. The NIH held a series of meetings in which proposals from different agencies were discussed. The U.S. proposal recommendation for the MSF was a combination of the NSF and NIH proposals; Jim Edwards of NSF supported databases and data sharing in the area of biodiversity.
The two related initiatives were combined to form the United States proposal on "Biological Informatics". This initiative was supported by the White House Office of Science and Technology Policy and presented at the OECD MSF by Edwards and Koslow. An MSF committee on Biological Informatics was established with two subcommittees: 1. Biodiversity (chair: James Edwards, NSF) and 2. Neuroinformatics (chair: Stephen Koslow, NIH). At the end of two years, the Neuroinformatics subcommittee of the Biological Working Group issued a report supporting a global neuroinformatics effort. Koslow, working with the NIH and the White House Office of Science and Technology Policy, then established a new Neuroinformatics working group to develop specific recommendations in support of the more general recommendations of the first report. The Global Science Forum (GSF; renamed from MSF) of the OECD supported these recommendations.
Community
Institute of Neuroinformatics, University of Zurich
The Institute of Neuroinformatics was established at the University of Zurich and ETH Zurich at the end of 1995. The mission of the Institute is to discover the key principles by which brains work and to implement these in artificial systems that interact intelligently with the real world.
Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh
Computational Neuroscience and Neuroinformatics Group in Institute for Adaptive and Neural Computation of University of Edinburgh's School of Informatics study how the brain processes information.
The International Neuroinformatics Coordinating Facility
An international organization with the mission to develop, evaluate, and endorse standards and best practices that embrace the principles of open, FAIR, and citable neuroscience. As of October 2019, the INCF has active nodes in 18 countries. The GSF neuroinformatics committee presented three recommendations to the member governments of the GSF:
National neuroinformatics programs should be continued or initiated in each country; each should have a national node both to provide research resources nationally and to serve as the contact point for national and international coordination.
An International Neuroinformatics Coordinating Facility should be established. The INCF will coordinate the implementation of a global neuroinformatics network through integration of national neuroinformatics nodes.
A new international funding scheme should be established.
This scheme should eliminate national and disciplinary barriers and provide the most efficient approach to global collaborative research and data sharing. Under this scheme, each country is expected to fund its own participating researchers.

The GSF neuroinformatics committee then developed a business plan for the operation, support, and establishment of the INCF, which the GSF Science Ministers supported and approved at the 2004 meeting. In 2006, the INCF was created and its central office established at the Karolinska Institute, Stockholm, Sweden, under the leadership of Sten Grillner. Sixteen countries (Australia, Canada, China, the Czech Republic, Denmark, Finland, France, Germany, India, Italy, Japan, the Netherlands, Norway, Sweden, Switzerland, the United Kingdom, and the United States) and the EU Commission established the legal basis for the INCF and the Programme in International Neuroinformatics (PIN). To date, eighteen countries (Australia, Belgium, the Czech Republic, Finland, France, Germany, India, Italy, Japan, Malaysia, the Netherlands, Norway, Poland, the Republic of Korea, Sweden, Switzerland, the United Kingdom, and the United States) are members of the INCF; membership is pending for several other countries.

The goal of the INCF is to coordinate and promote international activities in neuroinformatics. The INCF contributes to the development and maintenance of databases, computational infrastructure, and support mechanisms for neuroscience applications. The system is expected to give the international research community access to all freely available human brain data and resources. The more general task of the INCF is to provide the conditions for developing convenient and flexible applications for neuroscience laboratories, in order to improve our knowledge of the human brain and its disorders.
Laboratory of Neuroinformatics, Nencki Institute of Experimental Biology
The main activity of the group is the development of computational tools and models, and their use in understanding brain structure and function.
Neuroimaging & Neuroinformatics, Howard Florey Institute, University of Melbourne
Institute scientists utilize brain imaging techniques, such as magnetic resonance imaging, to reveal the organization of brain networks involved in human thought. Led by Gary Egan.
Montreal Neurological Institute, McGill University
Led by Alan Evans, MCIN conducts computationally intensive brain research using innovative mathematical and statistical approaches to integrate clinical, psychological, and brain imaging data with genetics. MCIN researchers and staff also develop infrastructure and software tools in the areas of image processing, databasing, and high-performance computing. The MCIN community, together with the Ludmer Centre for Neuroinformatics and Mental Health, collaborates with a broad range of researchers and increasingly focuses on open data sharing and open science, including for the Montreal Neurological Institute.
The THOR Center for Neuroinformatics
Established April 1998 at the Department of Mathematical Modelling, Technical University of Denmark. Besides pursuing independent research goals, the THOR Center hosts a number of related projects concerning neural networks, functional neuroimaging, multimedia signal processing, and biomedical signal processing.
The Neuroinformatics Portal Pilot
The project is part of a larger effort to enhance the exchange of neuroscience data, data-analysis tools, and modeling software. The portal is supported by many members of the OECD Working Group on Neuroinformatics. The Portal Pilot is promoted by the German Ministry for Science and Education.
Computational Neuroscience, ITB, Humboldt-University Berlin
This group focuses on computational neurobiology, in particular on the dynamics and signal processing capabilities of systems with spiking neurons. Led by Andreas VM Herz.
The Neuroinformatics Group in Bielefeld
Active in the field of artificial neural networks since 1989. Current research programmes within the group focus on improving man–machine interfaces, robot force control, eye-tracking experiments, machine vision, virtual reality, and distributed systems.
Laboratory of Computational Embodied Neuroscience (LOCEN)
This group, part of the Institute of Cognitive Sciences and Technologies, Italian National Research Council (ISTC-CNR) in Rome and founded in 2006, is currently led by Gianluca Baldassarre. It has two objectives: (a) understanding the brain mechanisms underlying the learning and expression of sensorimotor behaviour, and the related motivations and higher-level cognition grounded on it, on the basis of embodied computational models; and (b) transferring the acquired knowledge to building innovative controllers for autonomous humanoid robots capable of open-ended learning on the basis of intrinsic and extrinsic motivations.
Japan national neuroinformatics resource
The Visiome Platform is a neuroinformatics search service that provides access to mathematical models, experimental data, analysis libraries, and related resources. An online portal for neurophysiological data sharing is also available at BrainLiner.jp as part of the MEXT Strategic Research Program for Brain Sciences (SRPBS).
Laboratory for Mathematical Neuroscience, RIKEN Brain Science Institute (Wako, Saitama)
The target of the Laboratory for Mathematical Neuroscience is to establish mathematical foundations of brain-style computation toward the construction of a new type of information science. Led by Shun-ichi Amari.
Netherlands state program in neuroinformatics
Started in light of the international OECD Global Science Forum, whose aim is to create a worldwide program in neuroinformatics.
NUST-SEECS Neuroinformatics Research Lab
The establishment of the Neuro-Informatics Lab at SEECS-NUST has enabled Pakistani researchers and faculty members to participate actively in such efforts, becoming part of the experimentation, simulation, and visualization processes mentioned above. The lab collaborates with leading international institutions to develop highly skilled human resources in the field. It enables neuroscientists and computer scientists in Pakistan to conduct experiments and analyses on collected data using state-of-the-art research methodologies, without investing in experimental neuroscience facilities of their own. Its key goal is to provide state-of-the-art experimental and simulation facilities to all beneficiaries, including higher-education institutes, medical researchers and practitioners, and the technology industry.
The Blue Brain Project
The Blue Brain Project was founded in May 2005, and uses an 8000 processor Blue Gene/L supercomputer developed by IBM. At the time, this was one of the fastest supercomputers in the world.
The project involves:
Databases: 3D reconstructed model neurons, synapses, synaptic pathways, microcircuit statistics, computer model neurons, virtual neurons.
Visualization: a microcircuit builder and a simulation-results visualizer; 2D, 3D, and immersive visualization systems are being developed.
Simulation: a simulation environment for large-scale simulations of morphologically complex neurons on 8000 processors of IBM's Blue Gene supercomputer.
Simulations and experiments: iterations between large-scale simulations of neocortical microcircuits and experiments in order to verify the computational model and explore predictions.
The mission of the Blue Brain Project is to understand mammalian brain function and dysfunction through detailed simulations. The project will invite researchers to build their own models of different brain regions in different species and at different levels of detail using Blue Brain software for simulation on Blue Gene. These models will be deposited in an internet database from which Blue Brain software can extract and connect models together to build brain regions and begin the first whole-brain simulations.
Genes to Cognition Project
The Genes to Cognition Project is a neuroscience research programme that studies genes, the brain, and behaviour in an integrated manner. It is engaged in a large-scale investigation of the function of molecules found at the synapse, focusing mainly on proteins that interact with the NMDA receptor, a receptor for the neurotransmitter glutamate that is required for processes of synaptic plasticity such as long-term potentiation (LTP). Many of the techniques used are high-throughput in nature, and integrating the various data sources, along with guiding the experiments, has raised numerous informatics questions. The programme is primarily run by Professor Seth Grant at the Wellcome Trust Sanger Institute, with many other teams of collaborators across the world.
The CARMEN project
The CARMEN project is a multi-site (11 universities in the United Kingdom) research project aimed at using GRID computing to enable experimental neuroscientists to archive their datasets in a structured database, making them widely accessible for further research, and for modellers and algorithm developers to exploit.
EBI Computational Neurobiology, EMBL-EBI (Hinxton)
The main goal of the group is to build realistic models of neuronal function at various levels, from the synapse to the micro-circuit, based on the precise knowledge of molecule functions and interactions (Systems Biology). Led by Nicolas Le Novère.
Neurogenetics GeneNetwork
GeneNetwork started as a component of the NIH Human Brain Project in 1999, with a focus on the genetic analysis of brain structure and function. This international program consists of tightly integrated genome and phenome data sets for human, mouse, and rat, designed specifically for large-scale systems and network studies relating gene variants to differences in mRNA and protein expression and to differences in CNS structure and behavior. The great majority of the data are open access. GeneNetwork has a companion neuroimaging web site, the Mouse Brain Library, which contains high-resolution images for thousands of genetically defined strains of mice.
The Neuronal Time Series Analysis (NTSA)
The NTSA Workbench is a set of tools, techniques, and standards designed to meet the needs of neuroscientists who work with neuronal time series data. The goal of the project is to develop an information system that makes the storage, organization, retrieval, analysis, and sharing of experimental and simulated neuronal data easier.
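The kind of storage-and-retrieval layer described here can be sketched as follows. The schema and field names are assumptions made for illustration, not the NTSA Workbench's actual design:

```python
import numpy as np

class TimeSeriesStore:
    """Toy store for neuronal time series: each recording is kept
    alongside free-form metadata and can be retrieved by attribute."""

    def __init__(self):
        self._records = []

    def add(self, data, **metadata):
        """Store one recording together with descriptive metadata."""
        self._records.append({"data": np.asarray(data), "meta": metadata})

    def query(self, **criteria):
        """Return all recordings whose metadata match every criterion."""
        return [r for r in self._records
                if all(r["meta"].get(k) == v for k, v in criteria.items())]

store = TimeSeriesStore()
# Hypothetical subject/region labels, for illustration only.
store.add([0.1, 0.4, 0.2], subject="rat01", region="V1", kind="simulated")
store.add([0.3, 0.0, 0.5], subject="rat02", region="V1", kind="experimental")
hits = store.query(region="V1", kind="experimental")  # one matching record
```

A production system would add persistence, standardized metadata vocabularies, and access control, but the organize-by-metadata, retrieve-by-query pattern is the core idea.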
The Cognitive Atlas
The Cognitive Atlas is a project developing a shared knowledge base in cognitive science and neuroscience. It comprises two basic kinds of knowledge, tasks and concepts, providing definitions and properties of each as well as the relationships between them. An important feature of the site is the ability to cite literature for assertions (e.g., "the Stroop task measures executive control") and to discuss their validity. The project contributes to NeuroLex and the Neuroscience Information Framework, allows programmatic access to the database, and is built around semantic web technologies.
Brain Big Data research group at the Allen Institute for Brain Science (Seattle, WA)
Led by Hanchuan Peng, this group focuses on using large-scale image computing and data analysis techniques to reconstruct single-neuron models and map them in the brains of different animals.
See also
References
= Citations =
= Sources =
Further reading
= Books =
= Journals and conferences =