Interacting with the NEST simulator#

This section explains how to create detailed neuronal networks and then run simulations on them using the NEST simulator.

Readers are expected to have a good grasp of the way NEST handles neurons and models, and of how to create and set up NEST nodes. If this is not the case, please read the NEST user documentation and the PyNEST tutorials first.

NNGT tools should work with both NEST 2 and NEST 3; they can be separated into three categories:

  • the structural tools (Network, NeuralPop …), which are used to prepare the neuronal network and set up its properties and connectivity; these tools must be used first, before the network is sent to NEST;

  • the make_nest_network() and associated to_nest() functions, which are used to send the previously prepared network to NEST;

  • then, once one of the previous functions has been called, all the other functions in the nngt.simulation module can be used to add stimulations to the neurons or to monitor them (a minimal end-to-end sketch follows this list).
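Schematically, and assuming NEST is installed, these three steps map onto the minimal sketch below. The structure-generation and simulation helpers it uses (exc_and_inhib(), gaussian_degree(), to_nest()) appear later in this section; set_poisson_input() is assumed to be one of the stimulation helpers of nngt.simulation, so treat the snippet as a sketch rather than a verbatim recipe.

import nngt
import nngt.generation as ng

# 1. structural tools: define the population and its connectivity
pop = nngt.NeuralPop.exc_and_inhib(1000)
net = ng.gaussian_degree(100., 5., population=pop)

if nngt.get_config('with_nest'):
    import nngt.simulation as ns

    # 2. send the prepared network to NEST
    gids = net.to_nest()

    # 3. stimulate or monitor the NEST nodes via nngt.simulation
    ns.set_poisson_input(gids, rate=3500.)    # assumed helper, rate in Hz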

Note

Calls to nest.ResetKernel will also reset all networks and populations, which means that after such a call, populations, parameters, etc., can again be changed until the next invocation of make_nest_network() or to_nest().
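For instance (a sketch, assuming a network net and its population pop with a group named 'excitatory' already exist; set_param() is described at the end of this page):

nest.ResetKernel()                             # wipes everything on the NEST side

pop.set_param({'E_L': -60.}, 'excitatory')     # parameters can be changed again...
gids = net.to_nest()                           # ...until the network is re-sent to NEST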

Example files illustrating the interactions between NEST and NNGT can be found here: docs/examples/nest_network.py and docs/examples/nest_receptor_ports.py.

Creating detailed neuronal networks#

NeuralPop and NeuralGroup#

These two classes are the basic building blocks used to design neuronal networks: a NeuralGroup is a set of neurons sharing common properties, while the NeuralPop is the main container representing the whole network as an ensemble of groups.

Depending on your perspective, you can either create the groups first, then build the population from them, or create the population first, then split it into various groups.

For more details on groups and populations, see Groups, structures, and neuronal populations.

Neuronal groups before the population

Neural groups can be created as follows:

# 100 inhibitory neurons
basic_group = nngt.NeuralGroup(100, neuron_type=-1)
# 10 excitatory (default) aeif neurons
aeif_group  = nngt.NeuralGroup(10, neuron_model="aeif_psc_alpha")
# an unspecified number of aeif neurons with specific parameters
p = {"E_L": -58., "V_th": -54.}
aeif_g2 = nngt.NeuralGroup(neuron_model="aeif_psc_alpha", neuron_param=p)

In the case where the number of neurons is specified upon creation, NNGT can check that the numbers of neurons in the network and in the associated population match, and it raises a warning if they don’t. However, this is just a sanity check; it does not prevent the network from being created if the numbers don’t match.

Once the groups are created, you can simply generate the population using

pop = nngt.NeuralPop.from_groups([basic_group, aeif_group], ["b", "a"])

This creates a population containing two groups, named “b” and “a”, from the previously created groups.
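The resulting population behaves like a container of groups, so you can quickly inspect it; the attribute names below (size, ids, neuron_model) follow the group properties used elsewhere in this section and should be checked against your NNGT version:

print(pop.size)                # 110 neurons in total
print(pop["b"].ids)            # ids of the 100 inhibitory neurons
print(pop["a"].neuron_model)   # 'aeif_psc_alpha'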

Population before the groups

A population with excitatory and inhibitory neurons

pop = nngt.NeuralPop(1000)
pop.create_group(800, "first")
pop.create_group(200, "second", neuron_type=-1)

or, more compactly:

pop = nngt.NeuralPop.exc_and_inhib(1000, iratio=0.2)
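Since the population is a dict-like container of groups, you can iterate over it to check what was created; with iratio=0.2, the inhibitory group should contain 200 of the 1000 neurons (the attribute names used here are assumptions, check them against your NNGT version):

for name, group in pop.items():
    print(name, group.size, group.neuron_type)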

The Network class#

Besides connectivity, the main advantage of the NeuralGroup is that you can pass it the biological properties that the neurons belonging to this group will share.

Since we are using NEST, these properties are:

  • the model’s name

  • its non-default properties

  • the synapses that the neurons have and their properties

  • the type of the neurons (1 for excitatory or -1 for inhibitory)

The following example, in the spirit of docs/examples/nest_network.py, creates three groups of aeif neurons with different parameters, then specifies the synaptic properties of the connections between them (the parameter values below are illustrative):

import nngt
import nngt.generation as ng

# base parameters for the aeif neurons and two variations (illustrative values)
base_params = {'E_L': -60., 'V_th': -55., 'b': 20., 'tau_w': 100.}
params1 = dict(base_params, E_L=-65.)             # oscillator-like neurons
params2 = dict(base_params, b=35., V_reset=-58.)  # bursting neurons

# three groups of excitatory aeif neurons (sizes are illustrative)
oscill = nngt.NeuralGroup(
    nodes=600, neuron_model='aeif_psc_alpha', neuron_type=1,
    neuron_param=params1)

burst = nngt.NeuralGroup(
    nodes=200, neuron_model='aeif_psc_alpha', neuron_type=1,
    neuron_param=params2)

adapt = nngt.NeuralGroup(
    nodes=200, neuron_model='aeif_psc_alpha', neuron_type=1,
    neuron_param=base_params)

# NEST 2 expects the synapse model under the key 'model',
# NEST 3 under 'synapse_model'
model = 'model'

try:
    import nest
    nest.NodeCollection()     # NodeCollection only exists in NEST 3
    model = 'synapse_model'
except Exception:
    pass

# synapse models and parameters, either per (source group, target group) pair
# or via a 'default' entry
synapses = {
    'default': {model: 'tsodyks2_synapse'},
    ('oscillators', 'bursters'): {model: 'tsodyks2_synapse', 'U': 0.6},
    ('oscillators', 'oscillators'): {model: 'tsodyks2_synapse', 'U': 0.7},
    ('oscillators', 'adaptive'): {model: 'tsodyks2_synapse', 'U': 0.5}
}

# gather the groups into a population, passing the synapse specification
# (group names must match the keys used in `synapses`)
pop = nngt.NeuralPop.from_groups(
    [oscill, burst, adapt], ['oscillators', 'bursters', 'adaptive'],
    syn_spec=synapses)
'''
Create the network from this population,
using a Gaussian in-degree
'''
net = ng.gaussian_degree(
    100., 15., population=pop, weights=155., delays=5.)


'''
Send the network to NEST, monitor and simulate
'''
if nngt.get_config('with_nest'):
    import nngt.simulation as ns
    import nest

    nest.ResetKernel()

    nest.SetKernelStatus({'local_num_threads': 4})

    gids = net.to_nest()

Once this network is created, it can simply be sent to NEST through the command gids = net.to_nest(), which returns the NEST gids.

In order to access the gids from each group, you can do:

oscill_gids = net.nest_gids[oscill.ids]

or directly:

oscill_gids = oscill.nest_gids

As shown in “Use with NEST”, synaptic strengths from inhibitory neurons are stored as positive values in NNGT (for compatibility with graph analysis tools), but they are automatically converted to negative values when the network is created in NEST.
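To complete the “monitor and simulate” part announced in the code comment above, a possible continuation of the if nngt.get_config('with_nest') block, loosely following docs/examples/nest_network.py (the exact helper signatures may differ between NNGT versions), is:

    # add an excitatory Poisson drive to all neurons (rate is illustrative)
    ns.set_poisson_input(gids, rate=3500.)

    # record the spikes of every group of the population
    recorders, recordables = ns.monitor_groups(
        [name for name in net.population], net)

    # simulate and plot the resulting activity
    simtime = 1000.
    nest.Simulate(simtime)
    ns.plot_activity(recorders, recordables, network=net, show=True)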

Changing the parameters of neurons#

Before sending the network to NEST#

Once the NeuralPop has been created, you can change the parameters of the neuron groups before you send the network to NEST.

To do this, you can use the set_param() function, to which you pass the parameter dict and the name of the NeuralGroup you want to modify.

If you are dealing directly with NeuralGroup objects, you can access and modify their neuron_param attribute as long as the network has not been sent to NEST. Once sent, these parameters become unsettable, and any workaround to circumvent this will not change the values inside NEST anyway.
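For instance, with the population containing a group named “first” created earlier (a sketch; whether the group is passed positionally or by keyword should be checked in set_param()'s documentation):

# change the leak potential of the neurons in the "first" group only
pop.set_param({'E_L': -60.}, 'first')

# or modify the group's parameters directly, before the network is sent to NEST
group = pop['first']
group.neuron_param = dict(group.neuron_param, E_L=-60.)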

Randomizing parameters after sending the network to NEST#

Once the network has been sent to NEST, neuronal parameters can still be changed, but only for randomization purposes. It is possible to randomize the neuronal parameters through the randomize_neural_states() function. This sets the parameters using a specified distribution and stores their values inside the network nodes’ attributes.
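A minimal sketch, assuming the instructions are given as a dict mapping each parameter to a (distribution name, distribution parameters) tuple; here the initial membrane potentials are drawn uniformly:

import nngt.simulation as ns

# draw V_m uniformly between -70 and -55 mV for all neurons of the network
ns.randomize_neural_states(net, {'V_m': ('uniform', -70., -55.)})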

