Working in Multipoint Mode
To build behavioral models, the Nemesida AI MLC module requires a significant amount of free RAM. When using more than one server with the Nemesida WAF module, you can save hardware resources by using the point-to-multipoint operation scheme (one server with the Nemesida AI MLC module installed interacts with many servers with Nemesida WAF modules installed).
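The point-to-multipoint scheme can be sketched as follows (the number of servers is illustrative; in this scheme each Nemesida WAF server runs its own RabbitMQ service, and the single Nemesida AI MLC server connects to all of them):

                 Nemesida AI MLC server
                (builds behavioral models)
                 /          |          \
           AMQP 5672    AMQP 5672    AMQP 5672
               /            |            \
        WAF server 1   WAF server 2   WAF server 3
      (Nemesida WAF + RabbitMQ on each WAF server)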
On a server with the Nemesida WAF module installed
– Create a user of the RabbitMQ service:
# rabbitmqctl add_user USER PASSWORD
# rabbitmqctl set_permissions -p / USER ".*" ".*" ".*"
USER and PASSWORD are the username and password that the Nemesida AI MLC module will use to connect; the second command grants this user full permissions on the default virtual host.
– Make changes to the RabbitMQ configuration file so that the service accepts connections from remote addresses.
– Allow access from the server on which the Nemesida AI MLC module is installed to the RabbitMQ port (by default 5672 TCP).
– Complete the RabbitMQ setup:
# service rabbitmq-server restart
On a server with the Nemesida AI MLC module installed
Create additional configuration files in the /opt/mlc/conf/ directory by copying the /opt/mlc/mlc.conf file. Edit the new configuration files so that each one connects to the corresponding remote RabbitMQ server. After making the changes, restart the service:
# service mlc_main restart
# service mlc_main status
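A minimal sketch of one such file, for example /opt/mlc/conf/waf1.conf (the file name, the address 192.0.2.21 and all parameter names except nwaf_license_key are assumptions for illustration; keep the remaining parameters from the copied /opt/mlc/mlc.conf and consult the comments in that file for the actual parameter names):

# Connection to the RabbitMQ service on a remote Nemesida WAF server
rmq_host = 192.0.2.21:5672    # remote RabbitMQ address and port (assumed name)
rmq_user = USER               # user created earlier with rabbitmqctl add_user (assumed name)
rmq_password = PASSWORD       # password of that user (assumed name)
nwaf_license_key = ...        # must carry the same WAF ID as the remote Nemesida WAF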
In the additional configuration files, nwaf_license_key is a required parameter: the license key specified in the Nemesida AI MLC settings and the one used by the remote Nemesida WAF must have the same WAF ID. When using additional configuration files, it is recommended to delete the /opt/mlc/mlc.conf file.
Using remote RabbitMQ services, the Nemesida AI MLC module collects requests and then trains models in the same way as in normal operation.
Working with the Nemesida AI MLS cloud server
The Nemesida AI MLS cloud server is designed to generate behavioral models from a copy of the traffic coming from remote servers. It is used when the Nemesida WAF user does not have enough RAM for the local Nemesida AI MLC module to work. To use the capabilities of the Nemesida AI MLS cloud server, contact technical support.