Merge pull request #453 from LibreQoE/main

Main > Develop
Robert Chacón 2024-01-12 07:55:19 -07:00 committed by GitHub
commit c1ed0c2b13
4 changed files with 73 additions and 7 deletions

View File

@ -6,6 +6,8 @@ Servers running LibreQoS can shape traffic for many thousands of customers.
Learn more at [LibreQoS.io](https://libreqos.io/)!
<img alt="LibreQoS" src="https://user-images.githubusercontent.com/22501920/223866474-603e1112-e2e6-4c67-93e4-44c17b1b7c43.png"></a>
## Sponsors
LibreQoS' development is made possible by our sponsors, the NLnet Foundation and Equinix.
@ -27,4 +29,14 @@ Please support the continued development of LibreQoS by sponsoring us via [GitHu
Our Matrix chat channel is available at [https://matrix.to/#/#libreqos:matrix.org](https://matrix.to/#/#libreqos:matrix.org).
<img alt="LibreQoS" src="https://user-images.githubusercontent.com/22501920/223866474-603e1112-e2e6-4c67-93e4-44c17b1b7c43.png"></a>
## Long-Term Stats (LTS)
Long-Term Stats (LTS) is an analytics service built for LibreQoS that revolutionizes the way you track and analyze your network.
With flexible time window views ranging from 5 minutes to 1 month, LTS gives you comprehensive insights into your network's performance.
Built from the ground up for performance and efficiency, LTS greatly outperforms our original InfluxDB plugin and renders data quickly to help you maximize your network's performance.
We provide a free 30-day trial of LTS, after which the rate is $0.30 USD per shaped subscriber.
You can enroll in the 30-day free trial by [upgrading to the latest version of LibreQoS v1.4](https://libreqos.readthedocs.io/en/latest/docs/Updates/update.html) and selecting "Start Stats Free Trial" in the top-right corner of the local LibreQoS WebUI.
<img alt="LibreQoS Long Term Stats" src="https://i0.wp.com/libreqos.io/wp-content/uploads/2023/11/01-Dashboard.png"></a>

View File

@ -14,7 +14,38 @@ On the first successful run, it will create a network.json and ShapedDevices.csv
If a network.json file exists, it will not be overwritten.
You can modify the network.json file to more accurately reflect bandwidth limits.
ShapedDevices.csv will be overwritten every time the UISP integration is run.
You have the option to run integrationUISP.py automatically on boot and every 10 minutes, which is recommended. This can be enabled by setting ```automaticImportUISP = True``` in ispConfig.py.
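For reference, enabling this in ispConfig.py amounts to a one-line change (only the flag below appears in this commit; your existing UISP credential and URL settings in ispConfig.py are left untouched):

```python
# ispConfig.py -- run integrationUISP.py on boot and every 10 minutes
automaticImportUISP = True
```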
## Powercode Integration
First, set the relevant parameters for Powercode (powercode_api_key, powercode_api_url, etc.) in ispConfig.py.
To test the Powercode Integration, use
```shell
python3 integrationPowercode.py
```
On the first successful run, it will create a ShapedDevices.csv file.
You can modify the network.json file manually to reflect Site/AP bandwidth limits.
ShapedDevices.csv will be overwritten every time the Powercode integration is run.
You have the option to run integrationPowercode.py automatically on boot and every 10 minutes, which is recommended. This can be enabled by setting ```automaticImportPowercode = True``` in ispConfig.py.
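Putting that together with the ispConfig.py additions later in this commit, a filled-in Powercode block looks roughly like this (the key and URL values are placeholders):

```python
# ispConfig.py -- Powercode Integration
automaticImportPowercode = True   # run integrationPowercode.py on boot and every 10 minutes
powercode_api_key = 'YOUR-POWERCODE-API-KEY'          # placeholder
# Everything before :444/api/ in your Powercode instance URL
powercode_api_url = 'https://powercode.example.com'   # placeholder
```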
## Sonar Integration
First, set the relevant parameters for Sonar (sonar_api_key, sonar_api_url, etc.) in ispConfig.py.
To test the Sonar Integration, use
```shell
python3 integrationSonar.py
```
On the first successful run, it will create a ShapedDevices.csv file.
If a network.json file exists, it will not be overwritten.
You can modify the network.json file to more accurately reflect bandwidth limits.
ShapedDevices.csv will be overwritten every time the Sonar integration is run.
You have the option to run integrationSonar.py automatically on boot and every 10 minutes, which is recommended. This can be enabled by setting ```automaticImportSonar = True``` in ispConfig.py.
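The matching Sonar block added to ispConfig.py in this commit looks roughly like this once filled in (the key, URL, community string, and model IDs below are placeholders or examples):

```python
# ispConfig.py -- Sonar Integration
automaticImportSonar = True       # run integrationSonar.py on boot and every 10 minutes
sonar_api_key = 'YOUR-SONAR-API-KEY'                           # placeholder
sonar_api_url = 'https://company.sonar.software/api/graphql'   # placeholder
# Optional: discover clients on airMAX / LTU APs via SNMP (requires snmpwalk)
sonar_airmax_ap_model_ids = ['29', '43']
sonar_ltu_ap_model_ids = ['4']
snmp_community = 'public'                                      # placeholder
# Leave empty to shape any account status marked "Activates Account" in Sonar
sonar_active_status_ids = []
```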
## Splynx Integration
@ -31,4 +62,4 @@ python3 integrationSplynx.py
On the first successful run, it will create a ShapedDevices.csv file.
You can manually create your network.json file to more accurately reflect bandwidth limits.
ShapedDevices.csv will be overwritten every time the Splynx integration is run.
You have the option to run integrationSplynx.py automatically on boot and every 10 minutes, which is recommended. This can be enabled by setting ```automaticImportSplynx = True``` in ispConfig.py.
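As with the other integrations, these settings live in ispConfig.py; based on the fields visible in this commit, enabling the automatic import looks like this (the key value is a placeholder, and any other Splynx fields in your ispConfig.py keep their existing values):

```python
# ispConfig.py -- Splynx Integration
automaticImportSplynx = True             # run integrationSplynx.py on boot and every 10 minutes
splynx_api_key = 'YOUR-SPLYNX-API-KEY'   # placeholder
```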

View File

@ -83,6 +83,17 @@ powercode_api_key = ''
# Everything before :444/api/ in your Powercode instance URL
powercode_api_url = ''
# Sonar Integration
automaticImportSonar = False
sonar_api_key = ''
sonar_api_url = '' # ex 'https://company.sonar.software/api/graphql'
# If there are radios in these lists, we will try to get the clients using SNMP. This requires snmpwalk to be installed on the server. You can use "sudo apt-get install snmp" for that. You will also need to fill in the snmp_community.
sonar_airmax_ap_model_ids = [] # ex ['29','43']
sonar_ltu_ap_model_ids = [] # ex ['4']
snmp_community = ''
# This is for all account statuses where we should be applying QoS. If you leave it blank, we'll use any account status marked with "Activates Account" in Sonar.
sonar_active_status_ids = []
# Splynx Integration
automaticImportSplynx = False
splynx_api_key = ''

View File

@ -27,6 +27,13 @@ def pullMikrotikIPv6():
    # pass
    # Map MAC -> IPv4 from the router's ARP table
    list_arp4 = api.get_resource('/ip/arp')
    entries = list_arp4.get()
    for entry in entries:
        try:
            macToIPv4[entry['mac-address']] = entry['address']
        except:
            # Skip ARP entries that lack a MAC or address
            pass
    # Map MAC -> IPv4 from DHCPv4 server leases
    list_dhcp4 = api.get_resource('/ip/dhcp-server/lease')
    entries = list_dhcp4.get()
    for entry in entries:
        try:
            macToIPv4[entry['mac-address']] = entry['address']
@ -35,10 +42,15 @@ def pullMikrotikIPv6():
    list_binding6 = api.get_resource('/ipv6/dhcp-server/binding')
    entries = list_binding6.get()
    for entry in entries:
        if len(entry['duid']) == 14:
            # A 14-character DUID carries the client's MAC in its last 12 hex characters
            mac = entry['duid'][2:14].upper()
            macNew = mac[0:2] + ':' + mac[2:4] + ':' + mac[4:6] + ':' + mac[6:8] + ':' + mac[8:10] + ':' + mac[10:12]
            macToIPv6[macNew] = entry['address']
        else:
            try:
                clientAddressToIPv6[entry['client-address']] = entry['address']
            except:
                pass
    # Map MAC -> IPv6 from the router's IPv6 neighbor table
    list_neighbor6 = api.get_resource('/ipv6/neighbor')
    entries = list_neighbor6.get()
    for entry in entries:
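As a side note on the binding-table change above, the DUID handling can be read as the following standalone sketch (duid_to_mac is illustrative only and not part of this commit):

```python
def duid_to_mac(duid):
    """Mirror of the MAC extraction applied to 14-character DUIDs above.

    Assumes the last 12 characters of such a DUID are the client's MAC
    address in hex; returns None for any other DUID layout.
    """
    if len(duid) != 14:
        return None
    mac = duid[2:14].upper()
    return ':'.join(mac[i:i + 2] for i in range(0, 12, 2))

# e.g. duid_to_mac('XXAABBCCDDEEFF') == 'AA:BB:CC:DD:EE:FF' (the first two characters are ignored)
```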