Part One: Examination Guidelines
I. Nature of the Examination
This Computational Neuroscience examination syllabus applies to the master's entrance examination for all programs of the School of Brain and Cognitive Sciences at Beijing Normal University. Candidates are expected to have a comprehensive and systematic command of the basic concepts and research methods of computational neuroscience, and to be able to apply this knowledge proficiently to analyze fundamental problems in neurobiology. The examination is intended for qualified candidates registered for our university's master's entrance examination.
II. Examination Format and Paper Structure
(1) Format: closed-book, written examination
(2) Duration: 180 minutes
(3) Question types and scores (total: 300 points)
1. Term definitions: 30 points
2. Short-answer questions: 90 points
3. Essay questions: 180 points
III. Examination Requirements
1. Master the basic concepts and fundamental theories of computational neuroscience.
2. Be familiar with classic advances in computational neuroscience.
3. Be able to apply basic concepts and fundamental theories to analyze and solve problems.
Part Two: Examination Topics
PART I - ANALYZING AND MODELING NEURAL RESPONSES
Chapter 1 - Neural Encoding I: Firing Rates and Spike Statistics
1. Properties of Neurons; Recording Neuronal Responses; From Stimulus to Response
2. Spike Trains and Firing Rates; Tuning Curves; Spike-Count Variability
3. Describing the Stimulus; The Spike-Triggered Average; White-Noise Stimuli; Multiple-Spike-Triggered Averages and Spike-Triggered Correlations
4. Spike-Train Statistics; The Homogeneous Poisson Process; The Spike-Train Autocorrelation Function; The Inhomogeneous Poisson Process; The Poisson Spike Generator; Comparison with Data
5. The Neural Code; Independent-Spike, Independent-Neuron, and Correlation Codes; Temporal Codes
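The Poisson spike generator listed under item 4 can be illustrated with a minimal sketch (assuming NumPy; the 40 Hz rate and 100 s duration are arbitrary example values, not from the syllabus):

```python
import numpy as np

def poisson_spike_train(rate_hz, duration_s, dt=0.001, rng=None):
    """Generate a homogeneous Poisson spike train.

    In each small bin of width dt, a spike occurs with
    probability rate_hz * dt (valid when rate_hz * dt << 1).
    Returns an array of spike times in seconds.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_bins = int(duration_s / dt)
    spikes = rng.random(n_bins) < rate_hz * dt
    return np.nonzero(spikes)[0] * dt

rng = np.random.default_rng(0)
times = poisson_spike_train(rate_hz=40.0, duration_s=100.0, rng=rng)
isis = np.diff(times)
print(len(times) / 100.0)        # empirical rate, close to 40 Hz
print(isis.std() / isis.mean())  # interspike-interval CV, close to 1
```

For a homogeneous Poisson process the empirical rate approaches the nominal rate and the interspike-interval coefficient of variation approaches 1, which is the standard comparison-with-data check the chapter discusses.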
Chapter 2 - Neural Encoding II: Reverse Correlation and Receptive Fields
1. Estimating Firing Rates; The Most Effective Stimulus; Static Nonlinearities
2. Early Visual System; The Retinotopic Map; Visual Stimuli; The Nyquist Frequency
3. Reverse-Correlation Methods - Simple Cells
Spatial Receptive Fields; Temporal Receptive Fields; Response of a Simple Cell to a Counterphase Grating; Space-Time Receptive Fields; Nonseparable Receptive Fields; Static Nonlinearities - Simple Cells
4. Static Nonlinearities - Complex Cells
5. Receptive Fields in the Retina and LGN
6. Constructing V1 Receptive Fields
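A minimal reverse-correlation sketch for the methods above (assuming NumPy; the biphasic kernel and the rectifying nonlinearity are illustrative choices): it simulates a linear-nonlinear neuron driven by Gaussian white noise and recovers its temporal filter as the spike-triggered average.

```python
import numpy as np

rng = np.random.default_rng(1)
n_steps, k_len = 200_000, 20

# "True" temporal filter of a model linear-nonlinear neuron.
t = np.arange(k_len)
kernel = np.exp(-t / 4.0) - 0.5 * np.exp(-t / 8.0)   # biphasic shape
kernel /= np.linalg.norm(kernel)

# Gaussian white-noise stimulus, filtered and passed through a
# rectifying nonlinearity to get a spiking probability per bin.
stim = rng.standard_normal(n_steps)
drive = np.convolve(stim, kernel)[:n_steps]
p_spike = np.clip(0.1 * drive, 0.0, 1.0)
spikes = rng.random(n_steps) < p_spike

# Spike-triggered average: mean stimulus segment preceding each spike.
spike_idx = np.nonzero(spikes)[0]
spike_idx = spike_idx[spike_idx >= k_len]
sta = np.mean([stim[i - k_len + 1:i + 1] for i in spike_idx], axis=0)
sta_rev = sta[::-1]  # reverse so it aligns with the filter's time axis

corr = np.corrcoef(sta_rev, kernel)[0, 1]
print(corr)  # close to 1: the STA recovers the filter
```

For a Gaussian white-noise stimulus the STA is proportional to the neuron's linear filter, which is why white noise is the stimulus of choice for these methods.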
Chapter 3 - Neural Decoding
1. Encoding and Decoding
2. Discrimination; ROC Curves; ROC Analysis of Motion Discrimination; The Likelihood Ratio Test
3. Population Decoding; Encoding and Decoding Direction; Optimal Decoding Methods; Fisher Information; Optimal Discrimination
4. Spike-Train Decoding
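Decoding direction from a population can be sketched with the population-vector method (assuming NumPy; the cosine tuning curves, neuron count, and noise level are illustrative assumptions, not from the syllabus):

```python
import numpy as np

rng = np.random.default_rng(2)
n_neurons = 64
preferred = np.linspace(0.0, 2 * np.pi, n_neurons, endpoint=False)

def responses(theta, rng):
    """Noisy, rectified cosine tuning around each preferred direction."""
    mean = np.maximum(0.0, 20.0 * np.cos(theta - preferred))  # Hz
    return np.maximum(0.0, mean + 2.0 * rng.standard_normal(n_neurons))

def population_vector(r):
    """Decode direction as the angle of the response-weighted
    sum of preferred-direction unit vectors."""
    x = np.sum(r * np.cos(preferred))
    y = np.sum(r * np.sin(preferred))
    return np.arctan2(y, x)

theta_true = 1.2
theta_hat = population_vector(responses(theta_true, rng))
err = np.angle(np.exp(1j * (theta_hat - theta_true)))  # wrapped error
print(abs(err))  # small (a few degrees) for this many neurons
```

The population vector is simple but suboptimal; the chapter contrasts it with maximum-likelihood decoding, whose accuracy is bounded by the Fisher information.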
PART II - MODELING NEURONS AND NETWORKS
Chapter 4 - Model Neurons I: Neuroelectronics
1. Levels of Neuron Modeling
2. Electrical Properties of Neurons
Intracellular Resistance; Membrane Capacitance and Resistance; Equilibrium and Reversal Potentials; The Membrane Current
3. Single-Compartment Models
Integrate-and-Fire Models; Spike-Rate Adaptation and Refractoriness
4. Voltage-Dependent Conductances
Persistent Conductances; Transient Conductances; Hyperpolarization-Activated Conductances
5. The Hodgkin-Huxley Model
6. Modeling Channels
7. Synaptic Conductances
The Postsynaptic Conductance; Release Probability and Short-Term Plasticity
8. Synapses on Integrate-and-Fire Neurons
Regular and Irregular Firing Modes
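The integrate-and-fire model of item 3 can be sketched in a few lines (plain Python; the membrane parameters are textbook-scale but arbitrary example values):

```python
def simulate_lif(i_e, duration=1.0, dt=1e-4,
                 tau_m=0.02, e_l=-0.070, v_th=-0.054,
                 v_reset=-0.080, r_m=1e7):
    """Leaky integrate-and-fire neuron, forward-Euler integration of
        tau_m dV/dt = E_L - V + R_m * I_e .
    A spike is counted when V crosses v_th; V is then reset.
    Units: seconds, volts, ohms, amperes.
    """
    v = e_l
    spikes = 0
    for _ in range(int(duration / dt)):
        v += dt / tau_m * (e_l - v + r_m * i_e)
        if v >= v_th:
            spikes += 1
            v = v_reset
    return spikes

# Spike counts over 1 s: zero below rheobase ((v_th - e_l)/r_m = 1.6 nA
# for these parameters), then increasing with injected current.
rates = [simulate_lif(i) for i in (1.0e-9, 2.0e-9, 3.0e-9)]
print(rates)
```

This reproduces the model's defining behavior: no firing below rheobase, and a firing rate that grows monotonically with the injected current.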
Chapter 5 - Model Neurons II: Conductances and Morphology
1. Levels of Neuron Modeling
2. Conductance-Based Models
The Connor-Stevens Model; Postinhibitory Rebound and Bursting
3. The Cable Equation
Linear Cable Theory (An Infinite Cable; An Isolated Branching Node); The Rall Model; The Morphoelectrotonic Transform
4. Multi-Compartment Models
Action Potential Propagation Along an Unmyelinated Axon; Propagation Along a Myelinated Axon
Chapter 6 - Network Models
1. Firing-Rate Models
Feedforward and Recurrent Networks; Continuously Labeled Networks
2. Feedforward Networks
Neural Coordinate Transformations
3. Recurrent Networks
Linear Recurrent Networks (Selective Amplification; Input Integration; Continuous Linear Recurrent Networks);
Nonlinear Recurrent Networks (Nonlinear Amplification; A Recurrent Model of Simple Cells in Primary Visual Cortex; A Recurrent Model of Complex Cells in Primary Visual Cortex; Winner-Take-All Input Selection; Gain Modulation; Sustained Activity; Maximum Likelihood and Network Recoding)
4. Network Stability
Associative Memory
5. Excitatory-Inhibitory Networks
Homogeneous Excitatory and Inhibitory Populations (Phase-Plane Methods and Stability Analysis); The Olfactory Bulb; Oscillatory Amplification
6. Stochastic Networks
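Selective amplification in a linear recurrent firing-rate network can be sketched as follows (assuming NumPy; the rank-one weight matrix with eigenvalue 0.9 is an illustrative construction). The steady state of tau dv/dt = -v + M v + h amplifies the input component along each eigenvector of M by 1/(1 - lambda):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50

# Symmetric recurrent weights with a single eigenvalue 0.9: the input
# component along that eigenvector is amplified by 1 / (1 - 0.9) = 10,
# while all orthogonal components pass through with gain 1.
e1 = rng.standard_normal(n)
e1 /= np.linalg.norm(e1)
M = 0.9 * np.outer(e1, e1)

h = rng.standard_normal(n)        # static input
tau, dt = 0.01, 1e-4
v = np.zeros(n)
for _ in range(5000):             # integrate tau dv/dt = -v + M v + h
    v += dt / tau * (-v + M @ v + h)

gain_along_e1 = (v @ e1) / (h @ e1)
print(gain_along_e1)              # close to 10
```

The network thus acts as a selective amplifier for input patterns matching its dominant eigenvector, the mechanism behind the recurrent models of orientation tuning listed above.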
PART III - PLASTICITY AND LEARNING
Chapter 7 - Plasticity and Learning
1. Stability and Competition
2. Synaptic Plasticity Rules
The Basic Hebb Rule; The Covariance Rule; The BCM Rule; Synaptic Normalization (Subtractive Normalization; Multiplicative Normalization and the Oja Rule); Timing-Based Rules
3. Unsupervised Learning
Single Postsynaptic Neuron (Principal-Component Projection; Hebbian Development and Ocular Dominance; Hebbian Development of Orientation Selectivity; Temporal Hebbian Rules and Trace Learning)
Multiple Postsynaptic Neurons (Fixed Linear Recurrent Connections; Competitive Hebbian Learning; Feature-Based Models; Anti-Hebbian Modification; Timing-Based Plasticity and Prediction)
4. Supervised Learning
Supervised Hebbian Learning (Classification and the Perceptron; Function Approximation)
5. Supervised Error-Correcting Rules
The Perceptron Learning Rule
6. The Delta Rule
Contrastive Hebbian Learning
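The perceptron learning rule of item 5 can be illustrated with a self-contained sketch (assuming NumPy; the toy data set, margin filter, and learning rate are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)

# Linearly separable toy data: labels come from a hidden weight vector,
# and samples too close to the boundary are dropped to leave a margin.
n_samples, n_dim = 200, 5
X = rng.standard_normal((n_samples, n_dim))
w_true = rng.standard_normal(n_dim)
y = np.where(X @ w_true > 0, 1, -1)
keep = np.abs(X @ w_true) > 0.5
X, y = X[keep], y[keep]

# Perceptron learning rule: on each misclassified sample,
# w <- w + eta * y * x. Guaranteed to converge on separable data.
w = np.zeros(n_dim)
eta = 0.1
for _ in range(100):                      # epochs
    errors = 0
    for x_i, y_i in zip(X, y):
        if y_i * (x_i @ w) <= 0:          # misclassified (or on boundary)
            w += eta * y_i * x_i
            errors += 1
    if errors == 0:
        break

accuracy = np.mean(np.where(X @ w > 0, 1, -1) == y)
print(accuracy)  # 1.0 once the rule has converged
```

The rule updates weights only on errors, which is what distinguishes error-correcting rules from the purely Hebbian supervised rules listed above.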