mirror of https://github.com/craigerl/aprsd.git synced 2024-11-24 00:48:54 -05:00

Compare commits


45 Commits

Author SHA1 Message Date
Adam Fourney
24714923be Addressing comments in PR. 2024-11-11 20:49:23 -08:00
afourney
c460cefc1a
Merge branch 'master' into option-to-disable-help 2024-11-11 20:40:33 -08:00
03ce5a3d50 Change healthcheck email thread check timeout
Email thread by default runs every 5 minutes to check for email.
The healthcheck max timeout for a loop run is also 5 minutes, leaving
little room for a race condition for the healthcheck.
This patch updates the healthcheck timeout to 5 minutes 30 seconds.
2024-11-11 12:44:01 -05:00
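The race described above can be made concrete with a short sketch; the constants mirror the commit message, while the function name and the healthcheck shape are illustrative, not APRSD's actual code:

```python
import datetime

# The email thread checks in every 5 minutes.  A healthcheck limit of
# exactly 5 minutes can therefore fail a thread that is merely on
# schedule; the patch adds 30 seconds of slack.
EMAIL_LOOP = datetime.timedelta(minutes=5)
MAX_AGE = datetime.timedelta(minutes=5, seconds=30)  # patched limit

def email_thread_healthy(last_update: datetime.datetime,
                         now: datetime.datetime) -> bool:
    """Return True if the email thread has checked in recently enough."""
    return (now - last_update) <= MAX_AGE

now = datetime.datetime.now()
# A thread that last ran a full loop ago plus a little jitter passes
# the 5m30s limit but would have failed a strict 5m limit.
jittered = now - (EMAIL_LOOP + datetime.timedelta(seconds=10))
assert email_thread_healthy(jittered, now)
assert not ((now - jittered) <= EMAIL_LOOP)
```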
98a62102b7 Don't break logging aprslib failures
this patch removes the newline when logging failures to parse
aprs packets in aprslib
2024-11-08 13:47:02 -05:00
7d1e739502 Added new features to listen command.
Changed the default to not log incoming packets.  If you want to see
the packets logged, then pass in --log-packets.

Added the ability to specify a list of plugins to load by passing in
--enable-plugin <fully qualified python path to class>
You can specify --enable-plugin multiple times to enable multiple
plugins.

Added new switch to enable the packet stats thread logging of stats
of all the packets seen.  --enable-packet-stats.  This is off by
default.
2024-11-08 13:28:46 -05:00
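The --enable-plugin value is a fully qualified python path to a class. Resolving such a dotted path can be sketched with importlib; this is a generic illustration, not APRSD's PluginManager internals:

```python
import importlib

def load_plugin_class(dotted_path: str):
    """Resolve a fully qualified 'package.module.ClassName' path to a class."""
    module_name, _, class_name = dotted_path.rpartition(".")
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# Any importable class resolves the same way a plugin path would:
cls = load_plugin_class("collections.OrderedDict")
assert cls.__name__ == "OrderedDict"
```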
bd0bcc1924 Fixed the protocol for Stats Collector
The stats() method had an inconsistent name for serializable.
2024-11-08 13:22:53 -05:00
adcf94d8c7 Catch and log exceptions in consumer
This patch adds a try except block around the APRSIS
consumer.  This gives us a chance to log the specific
exception, so we can see why the consumer failed.
2024-11-08 13:21:38 -05:00
9f3c8f889f Allow loading a specific list of plugins
Updated the PluginManager to allow only activating a
specific list of plugins passed in, instead of what is
in the config file.
2024-11-08 13:20:42 -05:00
6e62ac14b8 Allow disabling sending all AckPackets
This patch adds a new config option 'enable_sending_ack_packets', which
is by default set to True.  This allows the admin to disable sending Ack
Packets for MessagePackets entirely.
2024-11-06 18:21:46 -05:00
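The effect of such a flag can be sketched as follows; maybe_send_ack and the plain-dict config are illustrative stand-ins for APRSD's real oslo.config handling:

```python
# 'enable_sending_ack_packets' defaults to True; when an admin sets it
# to False, no AckPacket is sent for incoming MessagePackets.
def maybe_send_ack(conf: dict, send_fn, ack_packet) -> bool:
    """Send the ack only when acks have not been disabled."""
    if not conf.get("enable_sending_ack_packets", True):
        return False  # admin disabled acks entirely
    send_fn(ack_packet)
    return True

sent = []
assert maybe_send_ack({}, sent.append, "ack-1") is True
assert maybe_send_ack({"enable_sending_ack_packets": False}, sent.append, "ack-2") is False
assert sent == ["ack-1"]  # the disabled ack was never sent
```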
d0018a8cd3 Added rich output for dump-stats
this patch adds table formatted output for the stats in the
aprsd dump-stats command.  You can also show the stats in raw json/dict
format by passing --raw.  You can also limit the sections of the
stats by passing --show-section aprsdstats
2024-11-06 11:39:50 -05:00
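The --show-section selection amounts to a simple filter over the stats dict; this sketch uses plain dicts rather than the rich tables the command actually renders, and the helper name is illustrative:

```python
# Keep every section when "All" is selected, otherwise only the named ones.
def filter_sections(stats: dict, show_section: tuple) -> dict:
    if "All" in show_section:
        return stats
    return {k: v for k, v in stats.items() if k in show_section}

stats = {
    "APRSDStats": {"version": "3.4.3"},
    "PacketList": {"rx": 10, "tx": 2},
    "SeenList": {},
}
assert filter_sections(stats, ("All",)) == stats
assert filter_sections(stats, ("PacketList",)) == {"PacketList": {"rx": 10, "tx": 2}}
```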
2fdc7b111d Only load EmailStats if email is enabled
This patch updates the stats collector to only register the EmailStats
when the email plugin is enabled.
2024-11-06 08:43:25 -05:00
229155d0ee updated README.rst
this patch includes information on building your own
plugins for APRSD
2024-11-05 20:49:11 -05:00
7d22148b0f
Merge pull request #181 from craigerl/unit-tests
Added unit test for client base
2024-11-05 20:48:27 -05:00
563b06876c fixed name for dump-stats output
Also added a console.stats during loading of the stats
2024-11-05 20:15:52 -05:00
579d0c95a0 optimized Packet.get() 2024-11-05 15:04:48 -05:00
224686cac5 Added unit test for APRSISClient 2024-11-05 13:39:44 -05:00
ab2de86726 Added unit test for ClientFactory 2024-11-05 12:32:16 -05:00
f1d066b8a9 Added unit test for client base
This patch adds a unit test for the APRSClient base class.
2024-11-05 12:15:59 -05:00
0be87d8b4f Calculate delta once and reuse it 2024-11-05 11:54:07 -05:00
d808e217a2 Updated APRSClient
Added some doc strings and some types for returns as well
as an exception catching around create_client
2024-11-05 11:46:50 -05:00
7e8d7cdf86 Update PacketList
This patch updates some of the code in PacketList to be
a bit more efficient.  Thanks to the Cursor IDE :P
2024-11-05 11:34:12 -05:00
add18f1a6f Added new dump-stats command
This new command will dump the existing packetstats from the
last time it was written to disk.
2024-11-05 11:33:19 -05:00
c4bf89071a
Merge pull request #180 from craigerl/walt-listen-test
Walt listen test
2024-11-05 11:32:38 -05:00
df0ca04483 Added some changes to listen
to collect stats and only show those stats during listen
2024-11-05 11:29:44 -05:00
3fd606946d Fix a small issue with packet sending failures
When a packet _send_direct() failed to send due to a network
timeout or client issue, we don't want to count that as a send
attempt for the packet.  This patch catches that and allows for
another retry.
2024-10-31 18:10:46 -04:00
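The fix amounts to not charging a failed transmit against the packet's retry budget. A hedged sketch of that accounting (try_send and the state dict are illustrative, not the real _send_direct() code):

```python
def try_send(packet_state: dict, send_fn, packet) -> bool:
    """Attempt one transmit; count it only if the send actually went out."""
    try:
        send_fn(packet)
    except OSError:        # network timeout / client issue
        return False       # do NOT increment send_attempts; caller retries
    packet_state["send_attempts"] += 1
    return True

state = {"send_attempts": 0}
outcomes = iter([OSError(), None])  # first send fails, second succeeds

def flaky_send(pkt):
    err = next(outcomes)
    if err:
        raise err

assert try_send(state, flaky_send, "pkt") is False
assert state["send_attempts"] == 0   # the failure did not count
assert try_send(state, flaky_send, "pkt") is True
assert state["send_attempts"] == 1
```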
dependabot[bot]
fbfac97140 Bump werkzeug from 3.0.4 to 3.0.6
Bumps [werkzeug](https://github.com/pallets/werkzeug) from 3.0.4 to 3.0.6.
- [Release notes](https://github.com/pallets/werkzeug/releases)
- [Changelog](https://github.com/pallets/werkzeug/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/werkzeug/compare/3.0.4...3.0.6)

---
updated-dependencies:
- dependency-name: werkzeug
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-31 18:10:45 -04:00
f265e8f354 Fix a small issue with packet sending failures
When a packet _send_direct() failed to send due to a network
timeout or client issue, we don't want to count that as a send
attempt for the packet.  This patch catches that and allows for
another retry.
2024-10-31 17:42:43 -04:00
d863474c13 Added some changes to listen
to collect stats and only show those stats during listen
2024-10-31 09:17:36 -04:00
993b40d936
Merge pull request #178 from craigerl/dependabot/pip/werkzeug-3.0.6
Bump werkzeug from 3.0.4 to 3.0.6
2024-10-29 12:35:17 -04:00
0271ccd145 Added new aprsd admin command
This patch adds the aprsd admin command back.
If you don't have lots of web traffic, then use
aprsd admin to start the admin interface.
2024-10-29 12:30:19 -04:00
578062648b Update Changelog for v3.4.3 2024-10-29 11:08:27 -04:00
ecf30d3397 Fixed issue in send_message command
Send Message was using an old mechanism for logging ack packets.
This patch fixes that problem.
2024-10-29 09:52:39 -04:00
882e90767d Change virtual env name to .venv 2024-10-29 09:52:18 -04:00
dependabot[bot]
0ca62e727e
Bump werkzeug from 3.0.4 to 3.0.6
Bumps [werkzeug](https://github.com/pallets/werkzeug) from 3.0.4 to 3.0.6.
- [Release notes](https://github.com/pallets/werkzeug/releases)
- [Changelog](https://github.com/pallets/werkzeug/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/werkzeug/compare/3.0.4...3.0.6)

---
updated-dependencies:
- dependency-name: werkzeug
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-26 00:26:57 +00:00
14274c93b5 3.4.2 2024-10-18 16:08:09 -04:00
14c0a699cb Cleanup test failures 2024-10-18 12:25:16 -04:00
c12c42b876 cleaned up some requirements
we don't really need gevent, eventlet.
those are only needed for the web admin interface
2024-10-18 12:25:06 -04:00
765e02f5b3 Collector cleanup 2024-10-18 12:07:02 -04:00
8cdbf18bef Add final stages in Dockerfile
This patch adds another final stage in the Dockerfile
2024-10-17 17:10:59 -04:00
a65262d2ff Sort changelog commits by date 2024-10-17 17:10:03 -04:00
9951b12e2d Log closing client connection.
This patch updates the aprsis connection client to add logging
when the close() happens
2024-10-17 17:09:11 -04:00
3e9bf2422a Added packet log distance and new arrows
this patch adds unicode arrows during logging of packet arrows
(tx/rx) and adds distance for GPSPackets
2024-10-17 17:06:28 -04:00
5e9f92dfa6 Added color logging of thread names at keepalive
This patch adds logging of the thread name in color
during keepalive loop output.
2024-10-17 17:04:33 -04:00
5314856101 Removed dumping of the stats on exit
This patch removes the logging of the raw stats dict when the commands
exit.
2024-10-17 17:01:36 -04:00
758007ea3f Removed remnants of QueryPlugin
QueryPlugin was removed a while back after the stats rework.
This patch removes the config options for the Query plugin
2024-10-03 10:34:35 -07:00
45 changed files with 1860 additions and 988 deletions

File diff suppressed because it is too large


@@ -1,5 +1,5 @@
 WORKDIR?=.
-VENVDIR ?= $(WORKDIR)/.aprsd-venv
+VENVDIR ?= $(WORKDIR)/.venv
 .DEFAULT_GOAL := help
@@ -24,7 +24,7 @@ run: venv ## Create a virtual environment for running aprsd commands
 changelog: dev
 	npm i -g auto-changelog
-	auto-changelog -l false -o ChangeLog.md
+	auto-changelog -l false --sort-commits date -o ChangeLog.md
 docs: changelog
 	m2r --overwrite ChangeLog.md


@@ -11,6 +11,37 @@ ____________________
 `APRSD <http://github.com/craigerl/aprsd>`_ is a Ham radio `APRS <http://aprs.org>`_ message command gateway built on python.
+
+Table of Contents
+=================
+
+1. `What is APRSD <#what-is-aprsd>`_
+2. `APRSD Overview Diagram <#aprsd-overview-diagram>`_
+3. `Typical Use Case <#typical-use-case>`_
+4. `Installation <#installation>`_
+5. `Example Usage <#example-usage>`_
+6. `Help <#help>`_
+7. `Commands <#commands>`_
+   - `Configuration <#configuration>`_
+   - `Server <#server>`_
+   - `Current List of Built-in Plugins <#current-list-of-built-in-plugins>`_
+   - `Pypi.org APRSD Installable Plugin Packages <#pypiorg-aprsd-installable-plugin-packages>`_
+   - `🐍 APRSD Installed 3rd Party Plugins <#aprsd-installed-3rd-party-plugins>`_
+   - `Send Message <#send-message>`_
+   - `Send Email (Radio to SMTP Server) <#send-email-radio-to-smtp-server>`_
+   - `Receive Email (IMAP Server to Radio) <#receive-email-imap-server-to-radio>`_
+   - `Location <#location>`_
+   - `Web Admin Interface <#web-admin-interface>`_
+8. `Development <#development>`_
+   - `Building Your Own APRSD Plugins <#building-your-own-aprsd-plugins>`_
+9. `Workflow <#workflow>`_
+10. `Release <#release>`_
+11. `Docker Container <#docker-container>`_
+    - `Building <#building-1>`_
+    - `Official Build <#official-build>`_
+    - `Development Build <#development-build>`_
+    - `Running the Container <#running-the-container>`_
+
 What is APRSD
 =============
 APRSD is a python application for interacting with the APRS network and providing
@@ -147,8 +178,7 @@ look for incomming commands to the callsign configured in the config file
 Current list of built-in plugins
-======================================
+--------------------------------

 ::

     └─> aprsd list-plugins
@@ -300,18 +330,21 @@ AND... ping, fortune, time.....
 Web Admin Interface
 ===================

+APRSD has a web admin interface that allows you to view the status of the running APRSD server instance.
+The web admin interface shows graphs of packet counts, packet types, number of threads running, the latest
+packets sent and received, and the status of each of the plugins that are loaded. You can also view the logfile
+and view the raw APRSD configuration file.
+
 To start the web admin interface, You have to install gunicorn in your virtualenv that already has aprsd installed.

 ::

     source <path to APRSD's virtualenv>/bin/activate
-    pip install gunicorn
-    gunicorn --bind 0.0.0.0:8080 "aprsd.wsgi:app"
+    aprsd admin --loglevel INFO

 The web admin interface will be running on port 8080 on the local machine. http://localhost:8080

 Development
 ===========
@@ -320,7 +353,7 @@ Development
 * ``make``

 Workflow
-========
+--------

 While working aprsd, The workflow is as follows:
@@ -349,7 +382,7 @@ While working aprsd, The workflow is as follows:
 Release
-=======
+-------

 To do release to pypi:
@@ -370,6 +403,29 @@ To do release to pypi:

 ``make upload``

+Building your own APRSD plugins
+-------------------------------
+
+APRSD plugins are the mechanism by which APRSD can respond to APRS Messages. The plugins are loaded at server startup
+and can also be loaded at listen startup. When a packet is received by APRSD, it is passed to each of the plugins
+in the order they were registered in the config file. The plugins can then decide what to do with the packet.
+When a plugin is called, it is passed a APRSD Packet object. The plugin can then do something with the packet and
+return a reply message if desired. If a plugin does not want to reply to the packet, it can just return None.
+
+When a plugin does return a reply message, APRSD will send the reply message to the appropriate destination.
+For example, when a 'ping' message is received, the PingPlugin will return a reply message of 'pong'. When APRSD
+receives the 'pong' message, it will be sent back to the original caller of the ping message.
+
+APRSD plugins are simply python packages that can be installed from pypi.org. They are installed into the
+aprsd virtualenv and can be imported by APRSD at runtime. The plugins are registered in the config file and loaded
+at startup of the aprsd server command or the aprsd listen command.
+
+Overview
+--------
+You can build your own plugins by following the instructions in the `Building your own APRSD plugins`_ section.
+
+Plugins are called by APRSD when packe
+
 Docker Container
 ================


@@ -126,7 +126,10 @@ class APRSISClient(base.APRSClient):
         return aprs_client

     def consumer(self, callback, blocking=False, immortal=False, raw=False):
-        self._client.consumer(
-            callback, blocking=blocking,
-            immortal=immortal, raw=raw,
-        )
+        try:
+            self._client.consumer(
+                callback, blocking=blocking,
+                immortal=immortal, raw=raw,
+            )
+        except Exception as e:
+            LOG.error(f"Exception in consumer: {e}")


@@ -32,7 +32,11 @@ class APRSClient:
     @abc.abstractmethod
     def stats(self) -> dict:
-        pass
+        """Return statistics about the client connection.
+
+        Returns:
+            dict: Statistics about the connection and packet handling
+        """

     def set_filter(self, filter):
         self.filter = filter
@@ -46,22 +50,31 @@ class APRSClient:
         return self._client

     def _create_client(self):
-        self._client = self.setup_connection()
-        if self.filter:
-            LOG.info("Creating APRS client filter")
-            self._client.set_filter(self.filter)
+        try:
+            self._client = self.setup_connection()
+            if self.filter:
+                LOG.info("Creating APRS client filter")
+                self._client.set_filter(self.filter)
+        except Exception as e:
+            LOG.error(f"Failed to create APRS client: {e}")
+            self._client = None
+            raise

     def stop(self):
         if self._client:
             LOG.info("Stopping client connection.")
             self._client.stop()

-    def send(self, packet: core.Packet):
-        """Send a packet to the network."""
+    def send(self, packet: core.Packet) -> None:
+        """Send a packet to the network.
+
+        Args:
+            packet: The APRS packet to send
+        """
         self.client.send(packet)

     @wrapt.synchronized(lock)
-    def reset(self):
+    def reset(self) -> None:
         """Call this to force a rebuild/reconnect."""
         LOG.info("Resetting client connection.")
         if self._client:
@@ -76,7 +89,11 @@ class APRSClient:
     @abc.abstractmethod
     def setup_connection(self):
-        pass
+        """Initialize and return the underlying APRS connection.
+
+        Returns:
+            object: The initialized connection object
+        """

     @staticmethod
     @abc.abstractmethod
@@ -90,7 +107,11 @@ class APRSClient:
     @abc.abstractmethod
     def decode_packet(self, *args, **kwargs):
-        pass
+        """Decode raw APRS packet data into a Packet object.
+
+        Returns:
+            Packet: Decoded APRS packet
+        """

     @abc.abstractmethod
     def consumer(self, callback, blocking=False, immortal=False, raw=False):


@@ -33,7 +33,11 @@ class Aprsdis(aprslib.IS):
     def stop(self):
         self.thread_stop = True
-        LOG.info("Shutdown Aprsdis client.")
+        LOG.warning("Shutdown Aprsdis client.")
+
+    def close(self):
+        LOG.warning("Closing Aprsdis client.")
+        super().close()

     @wrapt.synchronized(lock)
     def send(self, packet: core.Packet):
@@ -189,14 +193,14 @@ class Aprsdis(aprslib.IS):
         except ParseError as exp:
             self.logger.log(
                 11,
-                "%s\n Packet: %s",
+                "%s Packet: '%s'",
                 exp,
                 exp.packet,
             )
         except UnknownFormat as exp:
             self.logger.log(
                 9,
-                "%s\n Packet: %s",
+                "%s Packet: '%s'",
                 exp,
                 exp.packet,
             )

aprsd/cmds/admin.py (new file, 57 lines)

@@ -0,0 +1,57 @@
+import logging
+import os
+import signal
+
+import click
+from oslo_config import cfg
+import socketio
+
+import aprsd
+from aprsd import cli_helper
+from aprsd import main as aprsd_main
+from aprsd import utils
+from aprsd.main import cli
+
+os.environ["APRSD_ADMIN_COMMAND"] = "1"
+# this import has to happen AFTER we set the
+# above environment variable, so that the code
+# inside the wsgi.py has the value
+from aprsd import wsgi as aprsd_wsgi  # noqa
+
+CONF = cfg.CONF
+LOG = logging.getLogger("APRSD")
+
+
+# main() ###
+@cli.command()
+@cli_helper.add_options(cli_helper.common_options)
+@click.pass_context
+@cli_helper.process_standard_options
+def admin(ctx):
+    """Start the aprsd admin interface."""
+    signal.signal(signal.SIGINT, aprsd_main.signal_handler)
+    signal.signal(signal.SIGTERM, aprsd_main.signal_handler)
+
+    level, msg = utils._check_version()
+    if level:
+        LOG.warning(msg)
+    else:
+        LOG.info(msg)
+    LOG.info(f"APRSD Started version: {aprsd.__version__}")
+    # Dump all the config options now.
+    CONF.log_opt_values(LOG, logging.DEBUG)
+    async_mode = "threading"
+    sio = socketio.Server(logger=True, async_mode=async_mode)
+    aprsd_wsgi.app.wsgi_app = socketio.WSGIApp(sio, aprsd_wsgi.app.wsgi_app)
+    aprsd_wsgi.init_app()
+    sio.register_namespace(aprsd_wsgi.LoggingNamespace("/logs"))
+    CONF.log_opt_values(LOG, logging.DEBUG)
+    aprsd_wsgi.app.run(
+        threaded=True,
+        debug=False,
+        port=CONF.admin.web_port,
+        host=CONF.admin.web_ip,
+    )


@@ -101,7 +101,7 @@ def test_plugin(
     pm = plugin.PluginManager()
     if load_all:
-        pm.setup_plugins()
+        pm.setup_plugins(load_help_plugin=CONF.load_help_plugin)
     obj = pm._create_class(plugin_path, plugin.APRSDPluginBase)
     if not obj:
         click.echo(ctx.get_help())


@@ -11,6 +11,7 @@ from rich.table import Table
 import aprsd
 from aprsd import cli_helper
 from aprsd.main import cli
+from aprsd.threads.stats import StatsStore

 # setup the global logger
@@ -154,3 +155,157 @@ def fetch_stats(ctx, host, port):
             watch_table.add_row(key, value["last"])

     console.print(watch_table)
+
+
+@cli.command()
+@cli_helper.add_options(cli_helper.common_options)
+@click.option(
+    "--raw",
+    is_flag=True,
+    default=False,
+    help="Dump raw stats instead of formatted output.",
+)
+@click.option(
+    "--show-section",
+    default=["All"],
+    help="Show specific sections of the stats. "
+    " Choices: All, APRSDStats, APRSDThreadList, APRSClientStats,"
+    " PacketList, SeenList, WatchList",
+    multiple=True,
+    type=click.Choice(
+        [
+            "All",
+            "APRSDStats",
+            "APRSDThreadList",
+            "APRSClientStats",
+            "PacketList",
+            "SeenList",
+            "WatchList",
+        ],
+        case_sensitive=False,
+    ),
+)
+@click.pass_context
+@cli_helper.process_standard_options
+def dump_stats(ctx, raw, show_section):
+    """Dump the current stats from the running APRSD instance."""
+    console = Console()
+    console.print(f"APRSD Dump-Stats started version: {aprsd.__version__}")
+
+    with console.status("Dumping stats"):
+        ss = StatsStore()
+        ss.load()
+        stats = ss.data
+        if raw:
+            if "All" in show_section:
+                console.print(stats)
+                return
+            else:
+                for section in show_section:
+                    console.print(f"Dumping {section} section:")
+                    console.print(stats[section])
+                return
+
+        t = Table(title="APRSD Stats")
+        t.add_column("Key")
+        t.add_column("Value")
+        for key, value in stats["APRSDStats"].items():
+            t.add_row(key, str(value))
+
+        if "All" in show_section or "APRSDStats" in show_section:
+            console.print(t)
+
+        # Show the thread list
+        t = Table(title="Thread List")
+        t.add_column("Name")
+        t.add_column("Class")
+        t.add_column("Alive?")
+        t.add_column("Loop Count")
+        t.add_column("Age")
+        for name, value in stats["APRSDThreadList"].items():
+            t.add_row(
+                name,
+                value["class"],
+                str(value["alive"]),
+                str(value["loop_count"]),
+                str(value["age"]),
+            )
+
+        if "All" in show_section or "APRSDThreadList" in show_section:
+            console.print(t)
+
+        # Show the plugins
+        t = Table(title="Plugin List")
+        t.add_column("Name")
+        t.add_column("Enabled")
+        t.add_column("Version")
+        t.add_column("TX")
+        t.add_column("RX")
+        for name, value in stats["PluginManager"].items():
+            t.add_row(
+                name,
+                str(value["enabled"]),
+                value["version"],
+                str(value["tx"]),
+                str(value["rx"]),
+            )
+
+        if "All" in show_section or "PluginManager" in show_section:
+            console.print(t)
+
+        # Now show the client stats
+        t = Table(title="Client Stats")
+        t.add_column("Key")
+        t.add_column("Value")
+        for key, value in stats["APRSClientStats"].items():
+            t.add_row(key, str(value))
+
+        if "All" in show_section or "APRSClientStats" in show_section:
+            console.print(t)
+
+        # now show the packet list
+        packet_list = stats.get("PacketList")
+        t = Table(title="Packet List")
+        t.add_column("Key")
+        t.add_column("Value")
+        t.add_row("Total Received", str(packet_list["rx"]))
+        t.add_row("Total Sent", str(packet_list["tx"]))
+
+        if "All" in show_section or "PacketList" in show_section:
+            console.print(t)
+
+        # now show the seen list
+        seen_list = stats.get("SeenList")
+        sorted_seen_list = sorted(
+            seen_list.items(),
+        )
+        t = Table(title="Seen List")
+        t.add_column("Callsign")
+        t.add_column("Message Count")
+        t.add_column("Last Heard")
+        for key, value in sorted_seen_list:
+            t.add_row(
+                key,
+                str(value["count"]),
+                str(value["last"]),
+            )
+
+        if "All" in show_section or "SeenList" in show_section:
+            console.print(t)
+
+        # now show the watch list
+        watch_list = stats.get("WatchList")
+        sorted_watch_list = sorted(
+            watch_list.items(),
+        )
+        t = Table(title="Watch List")
+        t.add_column("Callsign")
+        t.add_column("Last Heard")
+        for key, value in sorted_watch_list:
+            t.add_row(
+                key,
+                str(value["last"]),
+            )
+
+        if "All" in show_section or "WatchList" in show_section:
+            console.print(t)


@@ -63,7 +63,7 @@ def healthcheck(ctx, timeout):
         if email_thread_last_update != "never":
             d = now - email_thread_last_update
-            max_timeout = {"hours": 0.0, "minutes": 5, "seconds": 0}
+            max_timeout = {"hours": 0.0, "minutes": 5, "seconds": 30}
             max_delta = datetime.timedelta(**max_timeout)
             if d > max_delta:
                 console.log(f"Email thread is very old! {d}")


@@ -10,12 +10,13 @@ import sys
 import time

 import click
+from loguru import logger
 from oslo_config import cfg
 from rich.console import Console

 # local imports here
 import aprsd
-from aprsd import cli_helper, packets, plugin, threads
+from aprsd import cli_helper, packets, plugin, threads, utils
 from aprsd.client import client_factory
 from aprsd.main import cli
 from aprsd.packets import collector as packet_collector
@@ -24,12 +25,14 @@ from aprsd.packets import seen_list
 from aprsd.stats import collector
 from aprsd.threads import keep_alive, rx
 from aprsd.threads import stats as stats_thread
+from aprsd.threads.aprsd import APRSDThread

 # setup the global logger
 # log.basicConfig(level=log.DEBUG) # level=10
 LOG = logging.getLogger("APRSD")
 CONF = cfg.CONF
+LOGU = logger
 console = Console()
@@ -42,16 +45,21 @@ def signal_handler(sig, frame):
             ),
         )
         time.sleep(5)
-        LOG.info(collector.Collector().collect())
+        # Last save to disk
+        collector.Collector().collect()


 class APRSDListenThread(rx.APRSDRXThread):
-    def __init__(self, packet_queue, packet_filter=None, plugin_manager=None):
+    def __init__(
+        self, packet_queue, packet_filter=None, plugin_manager=None,
+        enabled_plugins=[], log_packets=False,
+    ):
         super().__init__(packet_queue)
         self.packet_filter = packet_filter
         self.plugin_manager = plugin_manager
         if self.plugin_manager:
             LOG.info(f"Plugins {self.plugin_manager.get_message_plugins()}")
+        self.log_packets = log_packets

     def process_packet(self, *args, **kwargs):
         packet = self._client.decode_packet(*args, **kwargs)
@@ -72,13 +80,16 @@ class APRSDListenThread(rx.APRSDRXThread):
             if self.packet_filter:
                 filter_class = filters[self.packet_filter]
                 if isinstance(packet, filter_class):
-                    packet_log.log(packet)
+                    if self.log_packets:
+                        packet_log.log(packet)
                     if self.plugin_manager:
                         # Don't do anything with the reply
                         # This is the listen only command.
                         self.plugin_manager.run(packet)
             else:
-                packet_log.log(packet)
+                if self.log_packets:
+                    LOG.error("PISS")
+                    packet_log.log(packet)
                 if self.plugin_manager:
                     # Don't do anything with the reply.
                     # This is the listen only command.
@@ -87,6 +98,42 @@ class APRSDListenThread(rx.APRSDRXThread):
         packet_collector.PacketCollector().rx(packet)


+class ListenStatsThread(APRSDThread):
+    """Log the stats from the PacketList."""
+
+    def __init__(self):
+        super().__init__("PacketStatsLog")
+        self._last_total_rx = 0
+
+    def loop(self):
+        if self.loop_count % 10 == 0:
+            # log the stats every 10 seconds
+            stats_json = collector.Collector().collect()
+            stats = stats_json["PacketList"]
+            total_rx = stats["rx"]
+            rx_delta = total_rx - self._last_total_rx
+            rate = rx_delta / 10
+
+            # Log summary stats
+            LOGU.opt(colors=True).info(
+                f"<green>RX Rate: {rate} pps</green> "
+                f"<yellow>Total RX: {total_rx}</yellow> "
+                f"<red>RX Last 10 secs: {rx_delta}</red>",
+            )
+            self._last_total_rx = total_rx
+
+            # Log individual type stats
+            for k, v in stats["types"].items():
+                thread_hex = f"fg {utils.hex_from_name(k)}"
+                LOGU.opt(colors=True).info(
+                    f"<{thread_hex}>{k:<15}</{thread_hex}> "
+                    f"<blue>RX: {v['rx']}</blue> <red>TX: {v['tx']}</red>",
+                )
+
+        time.sleep(1)
+        return True
+
+
 @cli.command()
 @cli_helper.add_options(cli_helper.common_options)
 @click.option(
@@ -121,6 +168,11 @@ class APRSDListenThread(rx.APRSDRXThread):
     ),
     help="Filter by packet type",
 )
+@click.option(
+    "--enable-plugin",
+    multiple=True,
+    help="Enable a plugin. This is the name of the file in the plugins directory.",
+)
 @click.option(
     "--load-plugins",
     default=False,
@@ -132,6 +184,18 @@ class APRSDListenThread(rx.APRSDRXThread):
     nargs=-1,
     required=True,
 )
+@click.option(
+    "--log-packets",
+    default=False,
+    is_flag=True,
+    help="Log incoming packets.",
+)
+@click.option(
+    "--enable-packet-stats",
+    default=False,
+    is_flag=True,
+    help="Enable packet stats periodic logging.",
+)
 @click.pass_context
 @cli_helper.process_standard_options
 def listen(
@@ -139,8 +203,11 @@ def listen(
     aprs_login,
     aprs_password,
     packet_filter,
+    enable_plugin,
     load_plugins,
     filter,
+    log_packets,
+    enable_packet_stats,
 ):
     """Listen to packets on the APRS-IS Network based on FILTER.
@@ -194,22 +261,32 @@ def listen(
         aprs_client.set_filter(filter)

     keepalive = keep_alive.KeepAliveThread()
+    # keepalive.start()

     if not CONF.enable_seen_list:
         # just deregister the class from the packet collector
         packet_collector.PacketCollector().unregister(seen_list.SeenList)

     pm = None
-    pm = plugin.PluginManager()
     if load_plugins:
+        pm = plugin.PluginManager()
         LOG.info("Loading plugins")
         pm.setup_plugins(load_help_plugin=False)
+    elif enable_plugin:
+        pm = plugin.PluginManager()
+        pm.setup_plugins(
+            load_help_plugin=False,
+            plugin_list=enable_plugin,
+        )
     else:
         LOG.warning(
             "Not Loading any plugins use --load-plugins to load what's "
             "defined in the config file.",
         )

+    if pm:
+        for p in pm.get_plugins():
+            LOG.info("Loaded plugin %s", p.__class__.__name__)
+
     stats = stats_thread.APRSDStatsStoreThread()
     stats.start()
@@ -218,9 +295,14 @@ def listen(
         packet_queue=threads.packet_queue,
         packet_filter=packet_filter,
         plugin_manager=pm,
+        enabled_plugins=enable_plugin,
+        log_packets=log_packets,
     )
     LOG.debug("Start APRSDListenThread")
     listen_thread.start()
+
+    if enable_packet_stats:
+        listen_stats = ListenStatsThread()
+        listen_stats.start()

     keepalive.start()
     LOG.debug("keepalive Join")


@@ -12,7 +12,9 @@ from aprsd import cli_helper, packets
 from aprsd import conf  # noqa : F401
 from aprsd.client import client_factory
 from aprsd.main import cli
+import aprsd.packets  # noqa : F401
 from aprsd.packets import collector
+from aprsd.packets import log as packet_log
 from aprsd.threads import tx
@@ -94,10 +96,6 @@ def send_message(
     else:
         LOG.info(f"L'{aprs_login}' To'{tocallsign}' C'{command}'")

-    packets.PacketList()
-    packets.WatchList()
-    packets.SeenList()
-
     got_ack = False
     got_response = False
@@ -106,7 +104,7 @@ def send_message(
         cl = client_factory.create()
         packet = cl.decode_packet(packet)
         collector.PacketCollector().rx(packet)
-        packet.log("RX")
+        packet_log.log(packet, tx=False)
         # LOG.debug("Got packet back {}".format(packet))
         if isinstance(packet, packets.AckPacket):
             got_ack = True


@ -8,7 +8,7 @@ from oslo_config import cfg
import aprsd
from aprsd import cli_helper
from aprsd import main as aprsd_main
from aprsd import packets, plugin, threads, utils
from aprsd import plugin, threads, utils
from aprsd.client import client_factory
from aprsd.main import cli
from aprsd.packets import collector as packet_collector
@ -65,7 +65,7 @@ def server(ctx, flush):
# log file output.
LOG.info("Loading Plugin Manager and registering plugins")
plugin_manager = plugin.PluginManager()
plugin_manager.setup_plugins()
plugin_manager.setup_plugins(load_help_plugin=CONF.load_help_plugin)
# Dump all the config options now.
CONF.log_opt_values(LOG, logging.DEBUG)
@ -87,29 +87,24 @@ def server(ctx, flush):
LOG.error("APRS client is not properly configured in config file.") LOG.error("APRS client is not properly configured in config file.")
sys.exit(-1) sys.exit(-1)
# Now load the msgTrack from disk if any
packets.PacketList()
if flush:
LOG.debug("Deleting saved MsgTrack.")
packets.PacketTrack().flush()
packets.WatchList().flush()
packets.SeenList().flush()
packets.PacketList().flush()
else:
# Try and load saved MsgTrack list
LOG.debug("Loading saved MsgTrack object.")
packets.PacketTrack().load()
packets.WatchList().load()
packets.SeenList().load()
packets.PacketList().load()
keepalive = keep_alive.KeepAliveThread()
keepalive.start()
if not CONF.enable_seen_list:
# just deregister the class from the packet collector
packet_collector.PacketCollector().unregister(seen_list.SeenList)
# Now load the msgTrack from disk if any
if flush:
LOG.debug("Flushing All packet tracking objects.")
packet_collector.PacketCollector().flush()
else:
# Try and load saved MsgTrack list
LOG.debug("Loading saved packet tracking data.")
packet_collector.PacketCollector().load()
# Now start all the main processing threads.
keepalive = keep_alive.KeepAliveThread()
keepalive.start()
stats_store_thread = stats_thread.APRSDStatsStoreThread()
stats_store_thread.start()


@ -62,9 +62,7 @@ def signal_handler(sig, frame):
threads.APRSDThreadList().stop_all()
if "subprocess" not in str(frame):
time.sleep(1.5)
# packets.WatchList().save()
# packets.SeenList().save()
LOG.info(stats.stats_collector.collect())
stats.stats_collector.collect()
LOG.info("Telling flask to bail.")
signal.signal(signal.SIGTERM, sys.exit(0))
@ -647,11 +645,6 @@ def webchat(ctx, flush, port):
LOG.error("APRS client is not properly configured in config file.") LOG.error("APRS client is not properly configured in config file.")
sys.exit(-1) sys.exit(-1)
packets.PacketList()
packets.PacketTrack()
packets.WatchList()
packets.SeenList()
keepalive = keep_alive.KeepAliveThread()
LOG.info("Start KeepAliveThread")
keepalive.start()


@ -141,6 +141,12 @@ aprsd_opts = [
default=True,
help="Set this to False to disable the help plugin.",
),
cfg.BoolOpt(
"enable_sending_ack_packets",
default=True,
help="Set this to False, to disable sending of ack packets. This will entirely stop"
"APRSD from sending ack packets.",
),
]
watch_list_opts = [
@ -210,7 +216,6 @@ enabled_plugins_opts = [
"aprsd.plugins.fortune.FortunePlugin", "aprsd.plugins.fortune.FortunePlugin",
"aprsd.plugins.location.LocationPlugin", "aprsd.plugins.location.LocationPlugin",
"aprsd.plugins.ping.PingPlugin", "aprsd.plugins.ping.PingPlugin",
"aprsd.plugins.query.QueryPlugin",
"aprsd.plugins.time.TimePlugin", "aprsd.plugins.time.TimePlugin",
"aprsd.plugins.weather.OWMWeatherPlugin", "aprsd.plugins.weather.OWMWeatherPlugin",
"aprsd.plugins.version.VersionPlugin", "aprsd.plugins.version.VersionPlugin",


@ -31,13 +31,6 @@ aprsfi_opts = [
),
]
query_plugin_opts = [
cfg.StrOpt(
"callsign",
help="The Ham callsign to allow access to the query plugin from RF.",
),
]
owm_wx_opts = [
cfg.StrOpt(
"apiKey",
@ -172,7 +165,6 @@ def register_opts(config):
config.register_group(aprsfi_group)
config.register_opts(aprsfi_opts, group=aprsfi_group)
config.register_group(query_group)
config.register_opts(query_plugin_opts, group=query_group)
config.register_group(owm_wx_group)
config.register_opts(owm_wx_opts, group=owm_wx_group)
config.register_group(avwx_group)
@ -184,7 +176,6 @@ def register_opts(config):
def list_opts():
return {
aprsfi_group.name: aprsfi_opts,
query_group.name: query_plugin_opts,
owm_wx_group.name: owm_wx_opts,
avwx_group.name: avwx_opts,
location_group.name: location_opts,


@ -54,7 +54,7 @@ def cli(ctx):
def load_commands():
from .cmds import ( # noqa
completion, dev, fetch_stats, healthcheck, list_plugins, listen,
admin, completion, dev, fetch_stats, healthcheck, list_plugins, listen,
send_message, server, webchat,
)
@ -79,11 +79,15 @@ def signal_handler(sig, frame):
),
)
time.sleep(1.5)
packets.PacketTrack().save()
packets.WatchList().save()
packets.SeenList().save()
packets.PacketList().save()
LOG.info(collector.Collector().collect())
try:
packets.PacketTrack().save()
packets.WatchList().save()
packets.SeenList().save()
packets.PacketList().save()
collector.Collector().collect()
except Exception as e:
LOG.error(f"Failed to save data: {e}")
sys.exit(0)
# signal.signal(signal.SIGTERM, sys.exit(0))
# sys.exit(0)


@ -1,4 +0,0 @@
# What to return from a plugin if we have processed the message
# and it's ok, but don't send a usage string back
# REMOVE THIS FILE


@ -1,3 +1,4 @@
from aprsd.packets import collector
from aprsd.packets.core import ( # noqa: F401
AckPacket, BeaconPacket, BulletinPacket, GPSPacket, MessagePacket,
MicEPacket, ObjectPacket, Packet, RejectPacket, StatusPacket,
@ -9,4 +10,11 @@ from aprsd.packets.tracker import PacketTrack # noqa: F401
from aprsd.packets.watch_list import WatchList # noqa: F401
# Register all the packet tracking objects.
collector.PacketCollector().register(PacketList)
collector.PacketCollector().register(SeenList)
collector.PacketCollector().register(PacketTrack)
collector.PacketCollector().register(WatchList)
NULL_MESSAGE = -1


@ -20,6 +20,14 @@ class PacketMonitor(Protocol):
"""When we send a packet out the network.""" """When we send a packet out the network."""
... ...
def flush(self) -> None:
"""Flush out any data."""
...
def load(self) -> None:
"""Load any data."""
...
@singleton
class PacketCollector:
@ -27,30 +35,45 @@ class PacketCollector:
self.monitors: list[Callable] = []
def register(self, monitor: Callable) -> None:
if not isinstance(monitor, PacketMonitor):
raise TypeError(f"Monitor {monitor} is not a PacketMonitor")
self.monitors.append(monitor)
def unregister(self, monitor: Callable) -> None:
if not isinstance(monitor, PacketMonitor):
raise TypeError(f"Monitor {monitor} is not a PacketMonitor")
self.monitors.remove(monitor)
def rx(self, packet: type[core.Packet]) -> None:
for name in self.monitors:
cls = name()
if isinstance(cls, PacketMonitor):
try:
cls.rx(packet)
except Exception as e:
LOG.error(f"Error in monitor {name} (rx): {e}")
else:
raise TypeError(f"Monitor {name} is not a PacketMonitor")
try:
cls.rx(packet)
except Exception as e:
LOG.error(f"Error in monitor {name} (rx): {e}")
def tx(self, packet: type[core.Packet]) -> None:
for name in self.monitors:
cls = name()
if isinstance(cls, PacketMonitor):
try:
cls.tx(packet)
except Exception as e:
LOG.error(f"Error in monitor {name} (tx): {e}")
else:
raise TypeError(f"Monitor {name} is not a PacketMonitor")
try:
cls.tx(packet)
except Exception as e:
LOG.error(f"Error in monitor {name} (tx): {e}")
def flush(self):
"""Call flush on the objects. This is used to flush out any data."""
for name in self.monitors:
cls = name()
try:
cls.flush()
except Exception as e:
LOG.error(f"Error in monitor {name} (flush): {e}")
def load(self):
"""Call load on the objects. This is used to load any data."""
for name in self.monitors:
cls = name()
try:
cls.load()
except Exception as e:
LOG.error(f"Error in monitor {name} (load): {e}")


@ -63,15 +63,11 @@ def _init_msgNo(): # noqa: N802
def _translate_fields(raw: dict) -> dict:
translate_fields = {
"from": "from_call",
"to": "to_call",
}
# First translate some fields
for key in translate_fields:
if key in raw:
raw[translate_fields[key]] = raw[key]
del raw[key]
# Direct key checks instead of iteration
if "from" in raw:
raw["from_call"] = raw.pop("from")
if "to" in raw:
raw["to_call"] = raw.pop("to")
# addresse overrides to_call
if "addresse" in raw:
@ -110,11 +106,7 @@ class Packet:
via: Optional[str] = field(default=None, compare=False, hash=False)
def get(self, key: str, default: Optional[str] = None):
"""Emulate a getter on a dict."""
if hasattr(self, key):
return getattr(self, key)
else:
return default
return getattr(self, key, default)
@property
def key(self) -> str:


@ -1,10 +1,12 @@
import logging
from typing import Optional
from geopy.distance import geodesic
from loguru import logger
from oslo_config import cfg
from aprsd.packets.core import AckPacket, RejectPacket
from aprsd import utils
from aprsd.packets.core import AckPacket, GPSPacket, RejectPacket
LOG = logging.getLogger()
@ -16,6 +18,8 @@ TO_COLOR = "fg #D033FF"
TX_COLOR = "red"
RX_COLOR = "green"
PACKET_COLOR = "cyan"
DISTANCE_COLOR = "fg #FF5733"
DEGREES_COLOR = "fg #FFA900"
def log_multiline(packet, tx: Optional[bool] = False, header: Optional[bool] = True) -> None:
@ -97,19 +101,19 @@ def log(packet, tx: Optional[bool] = False, header: Optional[bool] = True) -> No
if header:
if tx:
via_color = "red"
arrow = f"<{via_color}>-></{via_color}>"
arrow = f"<{via_color}>\u2192</{via_color}>"
logit.append(
f"<red>TX {arrow}</red> "
f"<red>TX\u2191</red> "
f"<cyan>{name}</cyan>"
f":{packet.msgNo}"
f" ({packet.send_count + 1} of {pkt_max_send_count})",
)
else:
via_color = "fg #828282"
via_color = "fg #1AA730"
arrow = f"<{via_color}>-></{via_color}>"
arrow = f"<{via_color}>\u2192</{via_color}>"
left_arrow = f"<{via_color}><-</{via_color}>"
logit.append(
f"<fg #1AA730>RX</fg #1AA730> {left_arrow} "
f"<fg #1AA730>RX\u2193</fg #1AA730> "
f"<cyan>{name}</cyan>"
f":{packet.msgNo}",
)
@ -139,5 +143,19 @@ def log(packet, tx: Optional[bool] = False, header: Optional[bool] = True) -> No
msg = msg.replace("<", "\\<")
logit.append(f"<light-yellow><b>{msg}</b></light-yellow>")
# is there distance information?
if isinstance(packet, GPSPacket) and CONF.latitude and CONF.longitude:
my_coords = (CONF.latitude, CONF.longitude)
packet_coords = (packet.latitude, packet.longitude)
try:
bearing = utils.calculate_initial_compass_bearing(my_coords, packet_coords)
except Exception as e:
LOG.error(f"Failed to calculate bearing: {e}")
bearing = 0
logit.append(
f" : <{DEGREES_COLOR}>{utils.degrees_to_cardinal(bearing, full_string=True)}</{DEGREES_COLOR}>"
f"<{DISTANCE_COLOR}>@{geodesic(my_coords, packet_coords).miles:.2f}miles</{DISTANCE_COLOR}>",
)
LOGU.opt(colors=True).info(" ".join(logit))
log_multiline(packet, tx, header)


@ -3,7 +3,7 @@ import logging
from oslo_config import cfg
from aprsd.packets import collector, core
from aprsd.packets import core
from aprsd.utils import objectstore
@ -37,9 +37,10 @@ class PacketList(objectstore.ObjectStoreMixin):
self._total_rx += 1
self._add(packet)
ptype = packet.__class__.__name__
if ptype not in self.data["types"]:
self.data["types"][ptype] = {"tx": 0, "rx": 0}
self.data["types"][ptype]["rx"] += 1
type_stats = self.data["types"].setdefault(
ptype, {"tx": 0, "rx": 0},
)
type_stats["rx"] += 1
def tx(self, packet: type[core.Packet]):
"""Add a packet that was received."""
@ -47,9 +48,10 @@ class PacketList(objectstore.ObjectStoreMixin):
self._total_tx += 1
self._add(packet)
ptype = packet.__class__.__name__
if ptype not in self.data["types"]:
self.data["types"][ptype] = {"tx": 0, "rx": 0}
self.data["types"][ptype]["tx"] += 1
type_stats = self.data["types"].setdefault(
ptype, {"tx": 0, "rx": 0},
)
type_stats["tx"] += 1
def add(self, packet):
with self.lock:
@ -81,36 +83,18 @@ class PacketList(objectstore.ObjectStoreMixin):
return self._total_tx
def stats(self, serializable=False) -> dict:
# limit the number of packets to return to 50
with self.lock:
tmp = OrderedDict(
reversed(
list(
self.data.get("packets", OrderedDict()).items(),
),
),
)
pkts = []
count = 1
for packet in tmp:
pkts.append(tmp[packet])
count += 1
if count > CONF.packet_list_stats_maxlen:
break
# Get last N packets directly using list slicing
packets_list = list(self.data.get("packets", {}).values())
pkts = packets_list[-CONF.packet_list_stats_maxlen:][::-1]
stats = {
"total_tracked": self._total_rx + self._total_rx,
"total_tracked": self._total_rx + self._total_tx,  # Fixed typo: was rx + rx
"rx": self._total_rx,
"tx": self._total_tx,
"types": self.data.get("types", []),
"types": self.data.get("types", {}),  # Changed default from [] to {}
"packet_count": len(self.data.get("packets", [])),
"maxlen": self.maxlen,
"packets": pkts,
}
return stats
# Now register the PacketList with the collector
# every packet we RX and TX goes through the collector
# for processing for whatever reason is needed.
collector.PacketCollector().register(PacketList)
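The two rewrites in this file — `dict.setdefault()` for the per-type counters, and list slicing for "last N packets, newest first" — behave as follows. This is a self-contained sketch; `maxlen` here stands in for `CONF.packet_list_stats_maxlen`:

```python
from collections import OrderedDict

# setdefault() returns the existing bucket, or inserts and returns the
# default, replacing the old "if ptype not in self.data['types']" dance.
types = {}
for ptype, direction in [
    ("MessagePacket", "rx"),
    ("AckPacket", "tx"),
    ("MessagePacket", "rx"),
]:
    type_stats = types.setdefault(ptype, {"tx": 0, "rx": 0})
    type_stats[direction] += 1

# Last-N-newest-first in one expression, replacing the reversed
# OrderedDict plus manual counter loop.
maxlen = 2  # stands in for CONF.packet_list_stats_maxlen
packets = OrderedDict((i, f"pkt{i}") for i in range(5))
pkts = list(packets.values())[-maxlen:][::-1]
```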


@ -3,7 +3,7 @@ import logging
from oslo_config import cfg
from aprsd.packets import collector, core
from aprsd.packets import core
from aprsd.utils import objectstore
@ -47,8 +47,3 @@ class SeenList(objectstore.ObjectStoreMixin):
def tx(self, packet: type[core.Packet]):
"""We don't care about TX packets."""
# Register with the packet collector so we can process the packet
# when we get it off the client (network)
collector.PacketCollector().register(SeenList)


@ -3,7 +3,7 @@ import logging
from oslo_config import cfg
from aprsd.packets import collector, core
from aprsd.packets import core
from aprsd.utils import objectstore
@ -101,9 +101,3 @@ class PacketTrack(objectstore.ObjectStoreMixin):
del self.data[key]
except KeyError:
pass
# Now register the PacketList with the collector
# every packet we RX and TX goes through the collector
# for processing for whatever reason is needed.
collector.PacketCollector().register(PacketTrack)


@ -4,7 +4,7 @@ import logging
from oslo_config import cfg
from aprsd import utils
from aprsd.packets import collector, core
from aprsd.packets import core
from aprsd.utils import objectstore
@ -117,6 +117,3 @@ class WatchList(objectstore.ObjectStoreMixin):
return False
else:
return False
collector.PacketCollector().register(WatchList)


@ -25,7 +25,6 @@ CORE_MESSAGE_PLUGINS = [
"aprsd.plugins.fortune.FortunePlugin", "aprsd.plugins.fortune.FortunePlugin",
"aprsd.plugins.location.LocationPlugin", "aprsd.plugins.location.LocationPlugin",
"aprsd.plugins.ping.PingPlugin", "aprsd.plugins.ping.PingPlugin",
"aprsd.plugins.query.QueryPlugin",
"aprsd.plugins.time.TimePlugin", "aprsd.plugins.time.TimePlugin",
"aprsd.plugins.weather.USWeatherPlugin", "aprsd.plugins.weather.USWeatherPlugin",
"aprsd.plugins.version.VersionPlugin", "aprsd.plugins.version.VersionPlugin",
@ -471,24 +470,27 @@ class PluginManager:
def reload_plugins(self):
with self.lock:
del self._pluggy_pm
self.setup_plugins()
self.setup_plugins(load_help_plugin=CONF.load_help_plugin)
def setup_plugins(self, load_help_plugin=None):
def setup_plugins(
self, load_help_plugin=True,
plugin_list=[],
):
"""Create the plugin manager and register plugins.""" """Create the plugin manager and register plugins."""
# If load_help_plugin is not specified, load it from the config
if load_help_plugin is None:
load_help_plugin = CONF.load_help_plugin
LOG.info("Loading APRSD Plugins") LOG.info("Loading APRSD Plugins")
# Help plugin is always enabled. # Help plugin is always enabled.
if load_help_plugin: if load_help_plugin:
_help = HelpPlugin() _help = HelpPlugin()
self._pluggy_pm.register(_help) self._pluggy_pm.register(_help)
enabled_plugins = CONF.enabled_plugins
if enabled_plugins:
for p_name in enabled_plugins:
# if plugins_list is passed in, only load
# those plugins.
if plugin_list:
for plugin_name in plugin_list:
self._load_plugin(plugin_name)
elif CONF.enabled_plugins:
for p_name in CONF.enabled_plugins:
self._load_plugin(p_name)
else:
# Enabled plugins isn't set, so we default to loading all of


@ -12,6 +12,7 @@ import imapclient
from oslo_config import cfg
from aprsd import packets, plugin, threads, utils
from aprsd.stats import collector
from aprsd.threads import tx
from aprsd.utils import trace
@ -126,6 +127,11 @@ class EmailPlugin(plugin.APRSDRegexCommandPluginBase):
shortcuts = _build_shortcuts_dict()
LOG.info(f"Email shortcuts {shortcuts}")
# Register the EmailStats producer with the stats collector
# We do this here to prevent EmailStats from being registered
# when email is not enabled in the config file.
collector.Collector().register_producer(EmailStats)
else:
LOG.info("Email services not enabled.")
self.enabled = False


@ -8,7 +8,7 @@ from aprsd.utils import trace
LOG = logging.getLogger("APRSD") LOG = logging.getLogger("APRSD")
DEFAULT_FORTUNE_PATH = '/usr/games/fortune' DEFAULT_FORTUNE_PATH = "/usr/games/fortune"
class FortunePlugin(plugin.APRSDRegexCommandPluginBase): class FortunePlugin(plugin.APRSDRegexCommandPluginBase):
@ -45,7 +45,7 @@ class FortunePlugin(plugin.APRSDRegexCommandPluginBase):
command,
shell=True,
timeout=3,
universal_newlines=True,
text=True,
)
output = (
output.replace("\r", "")


@ -2,8 +2,10 @@ import logging
import re
import time
from geopy.geocoders import ArcGIS, AzureMaps, Baidu, Bing, GoogleV3
from geopy.geocoders import HereV7, Nominatim, OpenCage, TomTom, What3WordsV3, Woosmap
from geopy.geocoders import (
ArcGIS, AzureMaps, Baidu, Bing, GoogleV3, HereV7, Nominatim, OpenCage,
TomTom, What3WordsV3, Woosmap,
)
from oslo_config import cfg
from aprsd import packets, plugin, plugin_utils
@ -39,8 +41,8 @@ class USGov:
result = plugin_utils.get_weather_gov_for_gps(lat, lon)
# LOG.info(f"WEATHER: {result}")
# LOG.info(f"area description {result['location']['areaDescription']}")
if 'location' in result:
loc = UsLocation(result['location']['areaDescription'])
if "location" in result:
loc = UsLocation(result["location"]["areaDescription"])
else:
loc = UsLocation("Unknown Location")


@ -1,7 +1,6 @@
from aprsd import plugin
from aprsd.client import stats as client_stats
from aprsd.packets import packet_list, seen_list, tracker, watch_list
from aprsd.plugins import email
from aprsd.stats import app, collector
from aprsd.threads import aprsd
@ -15,6 +14,5 @@ stats_collector.register_producer(watch_list.WatchList)
stats_collector.register_producer(tracker.PacketTrack)
stats_collector.register_producer(plugin.PluginManager)
stats_collector.register_producer(aprsd.APRSDThreadList)
stats_collector.register_producer(email.EmailStats)
stats_collector.register_producer(client_stats.APRSClientStats)
stats_collector.register_producer(seen_list.SeenList)


@ -10,7 +10,7 @@ LOG = logging.getLogger("APRSD")
@runtime_checkable
class StatsProducer(Protocol):
"""The StatsProducer protocol is used to define the interface for collecting stats."""
def stats(self, serializeable=False) -> dict:
def stats(self, serializable=False) -> dict:
"""provide stats in a dictionary format."""
...
@ -25,14 +25,18 @@ class Collector:
stats = {}
for name in self.producers:
cls = name()
if isinstance(cls, StatsProducer):
try:
stats[cls.__class__.__name__] = cls.stats(serializable=serializable).copy()
except Exception as e:
LOG.error(f"Error in producer {name} (stats): {e}")
else:
raise TypeError(f"{cls} is not an instance of StatsProducer")
try:
stats[cls.__class__.__name__] = cls.stats(serializable=serializable).copy()
except Exception as e:
LOG.error(f"Error in producer {name} (stats): {e}")
return stats
def register_producer(self, producer_name: Callable):
if not isinstance(producer_name, StatsProducer):
raise TypeError(f"Producer {producer_name} is not a StatsProducer")
self.producers.append(producer_name) self.producers.append(producer_name)
def unregister_producer(self, producer_name: Callable):
if not isinstance(producer_name, StatsProducer):
raise TypeError(f"Producer {producer_name} is not a StatsProducer")
self.producers.remove(producer_name)
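Moving the `isinstance` check from `collect()` into `register_producer()` relies on `typing.runtime_checkable`, which makes `isinstance` verify that the required method names exist (it checks names, not signatures). A standalone sketch of the new fail-fast registration, simplified here to register instances rather than classes:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class StatsProducer(Protocol):
    def stats(self, serializable=False) -> dict:
        ...


class GoodProducer:
    def stats(self, serializable=False) -> dict:
        return {"ok": True}


class BadProducer:
    pass  # no stats() method


def register(producers, producer):
    # Reject non-conforming producers once, at registration time,
    # instead of checking on every collect() pass.
    if not isinstance(producer, StatsProducer):
        raise TypeError(f"Producer {producer} is not a StatsProducer")
    producers.append(producer)


producers = []
register(producers, GoodProducer())
try:
    register(producers, BadProducer())
    rejected = False
except TypeError:
    rejected = True
```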


@ -3,6 +3,7 @@ import logging
import time
import tracemalloc
from loguru import logger
from oslo_config import cfg
from aprsd import packets, utils
@ -14,6 +15,7 @@ from aprsd.threads import APRSDThread, APRSDThreadList
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOGU = logger
class KeepAliveThread(APRSDThread):
@ -87,7 +89,12 @@ class KeepAliveThread(APRSDThread):
key = thread["name"] key = thread["name"]
if not alive: if not alive:
LOG.error(f"Thread {thread}") LOG.error(f"Thread {thread}")
LOG.info(f"{key: <15} Alive? {str(alive): <5} {str(age): <20}")
thread_hex = f"fg {utils.hex_from_name(key)}"
t_name = f"<{thread_hex}>{key:<15}</{thread_hex}>"
thread_msg = f"{t_name} Alive? {str(alive): <5} {str(age): <20}"
LOGU.opt(colors=True).info(thread_msg)
# LOG.info(f"{key: <15} Alive? {str(alive): <5} {str(age): <20}")
# check the APRS connection
cl = client_factory.create()


@ -151,6 +151,11 @@ class APRSDProcessPacketThread(APRSDThread):
def __init__(self, packet_queue):
self.packet_queue = packet_queue
super().__init__("ProcessPKT")
if not CONF.enable_sending_ack_packets:
LOG.warning(
"Sending ack packets is disabled, messages "
"will not be acknowledged.",
)
def process_ack_packet(self, packet):
"""We got an ack for a message, no need to resend it."""
@ -329,15 +334,8 @@ class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
# response, then we send a usage statement.
if to_call == CONF.callsign and not replied:
# Is the help plugin installed?
help_available = False
for p in pm.get_message_plugins():
if isinstance(p, plugin.HelpPlugin):
help_available = True
break
# Tailor the messages accordingly
if help_available:
if CONF.load_help_plugin:
LOG.warning("Sending help!")
message_text = "Unknown command! Send 'help' message for help"
else:


@ -53,7 +53,10 @@ def send(packet: core.Packet, direct=False, aprs_client=None):
# After prepare, as prepare assigns the msgNo
collector.PacketCollector().tx(packet)
if isinstance(packet, core.AckPacket):
_send_ack(packet, direct=direct, aprs_client=aprs_client)
if CONF.enable_sending_ack_packets:
_send_ack(packet, direct=direct, aprs_client=aprs_client)
else:
LOG.info("Sending ack packets is disabled. Not sending AckPacket.")
else:
_send_packet(packet, direct=direct, aprs_client=aprs_client)
@ -89,6 +92,9 @@ def _send_direct(packet, aprs_client=None):
except Exception as e:
LOG.error(f"Failed to send packet: {packet}")
LOG.error(e)
return False
else:
return True
class SendPacketThread(aprsd_threads.APRSDThread):
@ -150,8 +156,17 @@ class SendPacketThread(aprsd_threads.APRSDThread):
# no attempt time, so lets send it, and start
# tracking the time.
packet.last_send_time = int(round(time.time()))
_send_direct(packet)
packet.send_count += 1
sent = False
try:
sent = _send_direct(packet)
except Exception:
LOG.error(f"Failed to send packet: {packet}")
else:
# If an exception happens while sending
# we don't want this attempt to count
# against the packet
if sent:
packet.send_count += 1
time.sleep(1)
# Make sure we get called again.
@ -199,8 +214,18 @@ class SendAckThread(aprsd_threads.APRSDThread):
send_now = True
if send_now:
_send_direct(self.packet)
self.packet.send_count += 1
sent = False
try:
sent = _send_direct(self.packet)
except Exception:
LOG.error(f"Failed to send packet: {self.packet}")
else:
# If an exception happens while sending
# we don't want this attempt to count
# against the packet
if sent:
self.packet.send_count += 1
self.packet.last_send_time = int(round(time.time()))
time.sleep(1)


@ -2,6 +2,7 @@
import errno
import functools
import math
import os
import re
import sys
@ -82,6 +83,16 @@ def rgb_from_name(name):
return red, green, blue
def hextriplet(colortuple):
"""Convert a color tuple to a hex triplet."""
return "#" + "".join(f"{i:02X}" for i in colortuple)
def hex_from_name(name):
"""Create a hex color from a string."""
return hextriplet(rgb_from_name(name))
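`hextriplet()` above is plain string formatting and can be checked in isolation (reproduced here verbatim so the example is self-contained):

```python
def hextriplet(colortuple):
    """Convert an (R, G, B) tuple to a #RRGGBB hex triplet."""
    return "#" + "".join(f"{i:02X}" for i in colortuple)
```

This is what lets the keepalive thread feed a per-thread-name RGB tuple into loguru's `fg #RRGGBB` color markup.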
def human_size(bytes, units=None):
"""Returns a human readable string representation of bytes"""
if not units:
@ -161,3 +172,47 @@ def load_entry_points(group):
except Exception as e:
print(f"Extension {ep.name} of group {group} failed to load with {e}", file=sys.stderr)
print(traceback.format_exc(), file=sys.stderr)
def calculate_initial_compass_bearing(start, end):
if (type(start) != tuple) or (type(end) != tuple): # noqa: E721
raise TypeError("Only tuples are supported as arguments")
lat1 = math.radians(float(start[0]))
lat2 = math.radians(float(end[0]))
diff_long = math.radians(float(end[1]) - float(start[1]))
x = math.sin(diff_long) * math.cos(lat2)
y = math.cos(lat1) * math.sin(lat2) - (
math.sin(lat1)
* math.cos(lat2) * math.cos(diff_long)
)
initial_bearing = math.atan2(x, y)
# Now we have the initial bearing but math.atan2 return values
# from -180° to + 180° which is not what we want for a compass bearing
# The solution is to normalize the initial bearing as shown below
initial_bearing = math.degrees(initial_bearing)
compass_bearing = (initial_bearing + 360) % 360
return compass_bearing
def degrees_to_cardinal(bearing, full_string=False):
if full_string:
directions = [
"North", "North-Northeast", "Northeast", "East-Northeast", "East", "East-Southeast",
"Southeast", "South-Southeast", "South", "South-Southwest", "Southwest", "West-Southwest",
"West", "West-Northwest", "Northwest", "North-Northwest", "North",
]
else:
directions = [
"N", "NNE", "NE", "ENE", "E", "ESE",
"SE", "SSE", "S", "SSW", "SW", "WSW",
"W", "WNW", "NW", "NNW", "N",
]
cardinal = directions[round(bearing / 22.5)]
return cardinal
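A quick sanity check of the two helpers added above. The functions are re-stated from the diff so the example is self-contained (the abbreviated-cardinal branch only; the example coordinates are illustrative):

```python
import math


def calculate_initial_compass_bearing(start, end):
    """Initial great-circle bearing from `start` to `end`,
    each a (latitude, longitude) tuple in decimal degrees."""
    lat1 = math.radians(float(start[0]))
    lat2 = math.radians(float(end[0]))
    diff_long = math.radians(float(end[1]) - float(start[1]))
    x = math.sin(diff_long) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - (
        math.sin(lat1) * math.cos(lat2) * math.cos(diff_long)
    )
    # atan2 yields -180°..+180°; normalize to a 0..360 compass bearing
    return (math.degrees(math.atan2(x, y)) + 360) % 360


def degrees_to_cardinal(bearing):
    # 17 entries: both ends are "N" so bearings near 360° wrap correctly
    directions = [
        "N", "NNE", "NE", "ENE", "E", "ESE", "SE", "SSE",
        "S", "SSW", "SW", "WSW", "W", "WNW", "NW", "NNW", "N",
    ]
    return directions[round(bearing / 22.5)]


# Seattle -> Spokane is roughly due east
bearing = calculate_initial_compass_bearing((47.6, -122.3), (47.7, -117.4))
print(degrees_to_cardinal(bearing))  # prints "E"
```

Note the duplicated "N" at both ends of the list: a bearing of 350° rounds to index 16, so the wrap-around is handled without a modulo.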


@@ -3,6 +3,7 @@ import importlib.metadata as imp
 import io
 import json
 import logging
+import os
 import queue

 import flask
@@ -23,6 +24,12 @@ CONF = cfg.CONF
 LOG = logging.getLogger("gunicorn.access")
 logging_queue = queue.Queue()

+# ADMIN_COMMAND True means we are running from `aprsd admin`
+# the `aprsd admin` command will import this file after setting
+# the APRSD_ADMIN_COMMAND environment variable.
+ADMIN_COMMAND = os.environ.get("APRSD_ADMIN_COMMAND", False)
+
 auth = HTTPBasicAuth()
 users: dict[str, str] = {}
 app = Flask(
@@ -297,7 +304,7 @@ if __name__ == "uwsgi_file_aprsd_wsgi":
     CONF.log_opt_values(LOG, logging.DEBUG)

-if __name__ == "aprsd.wsgi":
+if __name__ == "aprsd.wsgi" and not ADMIN_COMMAND:
     # set async_mode to 'threading', 'eventlet', 'gevent' or 'gevent_uwsgi' to
     # force a mode else, the best mode is selected automatically from what's
     # installed
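The `APRSD_ADMIN_COMMAND` guard above is a common pattern: a launcher process sets an environment variable before importing a module, and the module reads it at import time to change behavior. A minimal sketch of the same idea; the `MYAPP_*` names are illustrative, not aprsd's:

```python
import os

# The launcher would set the flag before importing the module...
os.environ["MYAPP_ADMIN_COMMAND"] = "1"

# ...and the imported module reads it once at import time.
# os.environ.get() returns the variable's string value, or the
# default (False here) when it is unset, so ADMIN_COMMAND is truthy
# exactly when the launcher set it.
ADMIN_COMMAND = os.environ.get("MYAPP_ADMIN_COMMAND", False)

if ADMIN_COMMAND:
    print("running under the admin command; skipping standalone setup")
```

Because the value is read at import time, the launcher must set the variable before the `import`, which is exactly why the comment in the diff stresses the ordering.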


@@ -1,4 +1,4 @@
-FROM python:3.11-slim as build
+FROM python:3.11-slim AS build

 ARG VERSION=3.4.0
 # pass this in as 'dev' if you want to install from github repo vs pypi
@@ -40,7 +40,7 @@ RUN set -ex \

 ### Final stage
-FROM build as final
+FROM build AS install

 WORKDIR /app
 RUN pip3 install -U pip
@@ -64,6 +64,8 @@ RUN aprsd --version
 ADD bin/setup.sh /app
 ADD bin/admin.sh /app

+FROM install AS final
+
 # For the web admin interface
 EXPOSE 8001


@@ -8,23 +8,23 @@ add-trailing-comma==3.1.0  # via gray
 alabaster==1.0.0  # via sphinx
 autoflake==1.5.3  # via gray
 babel==2.16.0  # via sphinx
-black==24.8.0  # via gray
-build==1.2.2  # via -r requirements-dev.in, check-manifest, pip-tools
+black==24.10.0  # via gray
+build==1.2.2.post1  # via -r requirements-dev.in, check-manifest, pip-tools
 cachetools==5.5.0  # via tox
 certifi==2024.8.30  # via requests
 cfgv==3.4.0  # via pre-commit
 chardet==5.2.0  # via tox
-charset-normalizer==3.3.2  # via requests
-check-manifest==0.49  # via -r requirements-dev.in
+charset-normalizer==3.4.0  # via requests
+check-manifest==0.50  # via -r requirements-dev.in
 click==8.1.7  # via black, fixit, moreorless, pip-tools
 colorama==0.4.6  # via tox
 commonmark==0.9.1  # via rich
 configargparse==1.7  # via gray
-coverage[toml]==7.6.1  # via pytest-cov
-distlib==0.3.8  # via virtualenv
+coverage[toml]==7.6.3  # via pytest-cov
+distlib==0.3.9  # via virtualenv
 docutils==0.21.2  # via m2r, sphinx
 exceptiongroup==1.2.2  # via pytest
-filelock==3.16.0  # via tox, virtualenv
+filelock==3.16.1  # via tox, virtualenv
 fixit==2.1.0  # via gray
 flake8==7.1.1  # via -r requirements-dev.in, pep8-naming
 gray==0.15.0  # via -r requirements-dev.in
@@ -34,35 +34,35 @@ imagesize==1.4.1  # via sphinx
 iniconfig==2.0.0  # via pytest
 isort==5.13.2  # via -r requirements-dev.in, gray
 jinja2==3.1.4  # via sphinx
-libcst==1.4.0  # via fixit
+libcst==1.5.0  # via fixit
 m2r==0.3.1  # via -r requirements-dev.in
-markupsafe==2.1.5  # via jinja2
+markupsafe==3.0.2  # via jinja2
 mccabe==0.7.0  # via flake8
 mistune==0.8.4  # via m2r
 moreorless==0.4.0  # via fixit
-mypy==1.11.2  # via -r requirements-dev.in
+mypy==1.12.0  # via -r requirements-dev.in
 mypy-extensions==1.0.0  # via black, mypy
 nodeenv==1.9.1  # via pre-commit
 packaging==24.1  # via black, build, fixit, pyproject-api, pytest, sphinx, tox
 pathspec==0.12.1  # via black, trailrunner
 pep8-naming==0.14.1  # via -r requirements-dev.in
 pip-tools==7.4.1  # via -r requirements-dev.in
-platformdirs==4.3.3  # via black, tox, virtualenv
+platformdirs==4.3.6  # via black, tox, virtualenv
 pluggy==1.5.0  # via pytest, tox
-pre-commit==3.8.0  # via -r requirements-dev.in
+pre-commit==4.0.1  # via -r requirements-dev.in
 pycodestyle==2.12.1  # via flake8
 pyflakes==3.2.0  # via autoflake, flake8
 pygments==2.18.0  # via rich, sphinx
-pyproject-api==1.7.1  # via tox
-pyproject-hooks==1.1.0  # via build, pip-tools
+pyproject-api==1.8.0  # via tox
+pyproject-hooks==1.2.0  # via build, pip-tools
 pytest==8.3.3  # via -r requirements-dev.in, pytest-cov
 pytest-cov==5.0.0  # via -r requirements-dev.in
-pyupgrade==3.17.0  # via gray
+pyupgrade==3.18.0  # via gray
 pyyaml==6.0.2  # via libcst, pre-commit
 requests==2.32.3  # via sphinx
 rich==12.6.0  # via gray
 snowballstemmer==2.2.0  # via sphinx
-sphinx==8.0.2  # via -r requirements-dev.in
+sphinx==8.1.3  # via -r requirements-dev.in
 sphinxcontrib-applehelp==2.0.0  # via sphinx
 sphinxcontrib-devhelp==2.0.0  # via sphinx
 sphinxcontrib-htmlhelp==2.1.0  # via sphinx
@@ -71,14 +71,14 @@ sphinxcontrib-qthelp==2.0.0  # via sphinx
 sphinxcontrib-serializinghtml==2.0.0  # via sphinx
 tokenize-rt==6.0.0  # via add-trailing-comma, pyupgrade
 toml==0.10.2  # via autoflake
-tomli==2.0.1  # via black, build, check-manifest, coverage, fixit, mypy, pip-tools, pyproject-api, pytest, sphinx, tox
-tox==4.18.1  # via -r requirements-dev.in
+tomli==2.0.2  # via black, build, check-manifest, coverage, fixit, mypy, pip-tools, pyproject-api, pytest, sphinx, tox
+tox==4.23.0  # via -r requirements-dev.in
 trailrunner==1.4.0  # via fixit
-typing-extensions==4.12.2  # via black, mypy
+typing-extensions==4.12.2  # via black, mypy, tox
 unify==0.5  # via gray
 untokenize==0.1.1  # via unify
 urllib3==2.2.3  # via requests
-virtualenv==20.26.4  # via pre-commit, tox
+virtualenv==20.27.0  # via pre-commit, tox
 wheel==0.44.0  # via -r requirements-dev.in, pip-tools

 # The following packages are considered to be unsafe in a requirements file:


@@ -5,12 +5,10 @@ click
 click-params
 dataclasses
 dataclasses-json
-eventlet
 flask
 flask-httpauth
 flask-socketio
 geopy
-gevent
 imapclient
 kiss3
 loguru
@@ -18,7 +16,6 @@ oslo.config
 pluggy
 python-socketio
 pyyaml
-pytz
 requests
 # Pinned due to gray needing 12.6.0
 rich~=12.6.0
@@ -30,3 +27,4 @@ thesmuggler
 tzlocal
 update_checker
 wrapt
+pytz


@@ -4,78 +4,162 @@
 #
 #    pip-compile --annotation-style=line requirements.in
 #
-aprslib==0.7.2  # via -r requirements.in
-attrs==24.2.0  # via ax253, kiss3, rush
-ax253==0.1.5.post1  # via kiss3
-beautifulsoup4==4.12.3  # via -r requirements.in
-bidict==0.23.1  # via python-socketio
-bitarray==2.9.2  # via ax253, kiss3
-blinker==1.8.2  # via flask
-certifi==2024.8.30  # via requests
-charset-normalizer==3.3.2  # via requests
-click==8.1.7  # via -r requirements.in, click-params, flask
-click-params==0.5.0  # via -r requirements.in
-commonmark==0.9.1  # via rich
-dataclasses==0.6  # via -r requirements.in
-dataclasses-json==0.6.7  # via -r requirements.in
-debtcollector==3.0.0  # via oslo-config
-deprecated==1.2.14  # via click-params
-dnspython==2.6.1  # via eventlet
-eventlet==0.37.0  # via -r requirements.in
-flask==3.0.3  # via -r requirements.in, flask-httpauth, flask-socketio
-flask-httpauth==4.8.0  # via -r requirements.in
-flask-socketio==5.3.7  # via -r requirements.in
-geographiclib==2.0  # via geopy
-geopy==2.4.1  # via -r requirements.in
-gevent==24.2.1  # via -r requirements.in
-greenlet==3.1.0  # via eventlet, gevent
-h11==0.14.0  # via wsproto
-idna==3.10  # via requests
-imapclient==3.0.1  # via -r requirements.in
-importlib-metadata==8.5.0  # via ax253, kiss3
-itsdangerous==2.2.0  # via flask
-jinja2==3.1.4  # via flask
-kiss3==8.0.0  # via -r requirements.in
-loguru==0.7.2  # via -r requirements.in
-markupsafe==2.1.5  # via jinja2, werkzeug
-marshmallow==3.22.0  # via dataclasses-json
-mypy-extensions==1.0.0  # via typing-inspect
-netaddr==1.3.0  # via oslo-config
-oslo-config==9.6.0  # via -r requirements.in
-oslo-i18n==6.4.0  # via oslo-config
-packaging==24.1  # via marshmallow
-pbr==6.1.0  # via oslo-i18n, stevedore
-pluggy==1.5.0  # via -r requirements.in
-pygments==2.18.0  # via rich
-pyserial==3.5  # via pyserial-asyncio
-pyserial-asyncio==0.6  # via kiss3
-python-engineio==4.9.1  # via python-socketio
-python-socketio==5.11.4  # via -r requirements.in, flask-socketio
-pytz==2024.2  # via -r requirements.in
-pyyaml==6.0.2  # via -r requirements.in, oslo-config
-requests==2.32.3  # via -r requirements.in, oslo-config, update-checker
-rfc3986==2.0.0  # via oslo-config
-rich==12.6.0  # via -r requirements.in
-rush==2021.4.0  # via -r requirements.in
-shellingham==1.5.4  # via -r requirements.in
-simple-websocket==1.0.0  # via python-engineio
-six==1.16.0  # via -r requirements.in
-soupsieve==2.6  # via beautifulsoup4
-stevedore==5.3.0  # via oslo-config
-tabulate==0.9.0  # via -r requirements.in
-thesmuggler==1.0.1  # via -r requirements.in
-typing-extensions==4.12.2  # via typing-inspect
-typing-inspect==0.9.0  # via dataclasses-json
-tzlocal==5.2  # via -r requirements.in
-update-checker==0.18.0  # via -r requirements.in
-urllib3==2.2.3  # via requests
-validators==0.22.0  # via click-params
-werkzeug==3.0.4  # via flask
-wrapt==1.16.0  # via -r requirements.in, debtcollector, deprecated
-wsproto==1.2.0  # via simple-websocket
-zipp==3.20.2  # via importlib-metadata
-zope-event==5.0  # via gevent
-zope-interface==7.0.3  # via gevent
-
-# The following packages are considered to be unsafe in a requirements file:
-# setuptools
+aprslib==0.7.2
+    # via -r requirements.in
+attrs==24.2.0
+    # via
+    #   ax253
+    #   kiss3
+    #   rush
+ax253==0.1.5.post1
+    # via kiss3
+beautifulsoup4==4.12.3
+    # via -r requirements.in
+bidict==0.23.1
+    # via python-socketio
+bitarray==3.0.0
+    # via
+    #   ax253
+    #   kiss3
+blinker==1.8.2
+    # via flask
+certifi==2024.8.30
+    # via requests
+charset-normalizer==3.4.0
+    # via requests
+click==8.1.7
+    # via
+    #   -r requirements.in
+    #   click-params
+    #   flask
+click-params==0.5.0
+    # via -r requirements.in
+commonmark==0.9.1
+    # via rich
+dataclasses==0.6
+    # via -r requirements.in
+dataclasses-json==0.6.7
+    # via -r requirements.in
+debtcollector==3.0.0
+    # via oslo-config
+deprecated==1.2.14
+    # via click-params
+flask==3.0.3
+    # via
+    #   -r requirements.in
+    #   flask-httpauth
+    #   flask-socketio
+flask-httpauth==4.8.0
+    # via -r requirements.in
+flask-socketio==5.4.1
+    # via -r requirements.in
+geographiclib==2.0
+    # via geopy
+geopy==2.4.1
+    # via -r requirements.in
+h11==0.14.0
+    # via wsproto
+idna==3.10
+    # via requests
+imapclient==3.0.1
+    # via -r requirements.in
+importlib-metadata==8.5.0
+    # via
+    #   ax253
+    #   kiss3
+itsdangerous==2.2.0
+    # via flask
+jinja2==3.1.4
+    # via flask
+kiss3==8.0.0
+    # via -r requirements.in
+loguru==0.7.2
+    # via -r requirements.in
+markupsafe==3.0.2
+    # via
+    #   jinja2
+    #   werkzeug
+marshmallow==3.23.0
+    # via dataclasses-json
+mypy-extensions==1.0.0
+    # via typing-inspect
+netaddr==1.3.0
+    # via oslo-config
+oslo-config==9.6.0
+    # via -r requirements.in
+oslo-i18n==6.4.0
+    # via oslo-config
+packaging==24.1
+    # via marshmallow
+pbr==6.1.0
+    # via
+    #   oslo-i18n
+    #   stevedore
+pluggy==1.5.0
+    # via -r requirements.in
+pygments==2.18.0
+    # via rich
+pyserial==3.5
+    # via pyserial-asyncio
+pyserial-asyncio==0.6
+    # via kiss3
+python-engineio==4.10.1
+    # via python-socketio
+python-socketio==5.11.4
+    # via
+    #   -r requirements.in
+    #   flask-socketio
+pytz==2024.2
+    # via -r requirements.in
+pyyaml==6.0.2
+    # via
+    #   -r requirements.in
+    #   oslo-config
+requests==2.32.3
+    # via
+    #   -r requirements.in
+    #   oslo-config
+    #   update-checker
+rfc3986==2.0.0
+    # via oslo-config
+rich==12.6.0
+    # via -r requirements.in
+rush==2021.4.0
+    # via -r requirements.in
+shellingham==1.5.4
+    # via -r requirements.in
+simple-websocket==1.1.0
+    # via python-engineio
+six==1.16.0
+    # via -r requirements.in
+soupsieve==2.6
+    # via beautifulsoup4
+stevedore==5.3.0
+    # via oslo-config
+tabulate==0.9.0
+    # via -r requirements.in
+thesmuggler==1.0.1
+    # via -r requirements.in
+typing-extensions==4.12.2
+    # via typing-inspect
+typing-inspect==0.9.0
+    # via dataclasses-json
+tzlocal==5.2
+    # via -r requirements.in
+update-checker==0.18.0
+    # via -r requirements.in
+urllib3==2.2.3
+    # via requests
+validators==0.22.0
+    # via click-params
+werkzeug==3.0.6
+    # via flask
+wrapt==1.16.0
+    # via
+    #   -r requirements.in
+    #   debtcollector
+    #   deprecated
+wsproto==1.2.0
+    # via simple-websocket
+zipp==3.20.2
+    # via importlib-metadata


@@ -0,0 +1,81 @@
+import datetime
+import unittest
+from unittest import mock
+
+from aprsd import exception
+from aprsd.client.aprsis import APRSISClient
+
+
+class TestAPRSISClient(unittest.TestCase):
+    """Test cases for APRSISClient."""
+
+    def setUp(self):
+        """Set up test fixtures."""
+        super().setUp()
+
+        # Mock the config
+        self.mock_conf = mock.MagicMock()
+        self.mock_conf.aprs_network.enabled = True
+        self.mock_conf.aprs_network.login = "TEST"
+        self.mock_conf.aprs_network.password = "12345"
+        self.mock_conf.aprs_network.host = "localhost"
+        self.mock_conf.aprs_network.port = 14580
+
+    @mock.patch("aprsd.client.base.APRSClient")
+    @mock.patch("aprsd.client.drivers.aprsis.Aprsdis")
+    def test_stats_not_configured(self, mock_aprsdis, mock_base):
+        """Test stats when client is not configured."""
+        mock_client = mock.MagicMock()
+        mock_aprsdis.return_value = mock_client
+
+        with mock.patch("aprsd.client.aprsis.cfg.CONF", self.mock_conf):
+            self.client = APRSISClient()
+
+        with mock.patch.object(APRSISClient, "is_configured", return_value=False):
+            stats = self.client.stats()
+            self.assertEqual({}, stats)
+
+    @mock.patch("aprsd.client.base.APRSClient")
+    @mock.patch("aprsd.client.drivers.aprsis.Aprsdis")
+    def test_stats_configured(self, mock_aprsdis, mock_base):
+        """Test stats when client is configured."""
+        mock_client = mock.MagicMock()
+        mock_aprsdis.return_value = mock_client
+
+        with mock.patch("aprsd.client.aprsis.cfg.CONF", self.mock_conf):
+            self.client = APRSISClient()
+
+        mock_client = mock.MagicMock()
+        mock_client.server_string = "test.server:14580"
+        mock_client.aprsd_keepalive = datetime.datetime.now()
+        self.client._client = mock_client
+        self.client.filter = "m/50"
+
+        with mock.patch.object(APRSISClient, "is_configured", return_value=True):
+            stats = self.client.stats()
+            self.assertEqual(
+                {
+                    "server_string": mock_client.server_string,
+                    "sever_keepalive": mock_client.aprsd_keepalive,
+                    "filter": "m/50",
+                }, stats,
+            )
+
+    def test_is_configured_missing_login(self):
+        """Test is_configured with missing login."""
+        self.mock_conf.aprs_network.login = None
+        with self.assertRaises(exception.MissingConfigOptionException):
+            APRSISClient.is_configured()
+
+    def test_is_configured_missing_password(self):
+        """Test is_configured with missing password."""
+        self.mock_conf.aprs_network.password = None
+        with self.assertRaises(exception.MissingConfigOptionException):
+            APRSISClient.is_configured()
+
+    def test_is_configured_missing_host(self):
+        """Test is_configured with missing host."""
+        self.mock_conf.aprs_network.host = None
+        with mock.patch("aprsd.client.aprsis.cfg.CONF", self.mock_conf):
+            with self.assertRaises(exception.MissingConfigOptionException):
+                APRSISClient.is_configured()


@@ -0,0 +1,140 @@
+import unittest
+from unittest import mock
+
+from aprsd.client.base import APRSClient
+from aprsd.packets import core
+
+
+class MockAPRSClient(APRSClient):
+    """Concrete implementation of APRSClient for testing."""
+
+    def stats(self):
+        return {"packets_received": 0, "packets_sent": 0}
+
+    def setup_connection(self):
+        mock_connection = mock.MagicMock()
+        # Configure the mock with required methods
+        mock_connection.close = mock.MagicMock()
+        mock_connection.stop = mock.MagicMock()
+        mock_connection.set_filter = mock.MagicMock()
+        mock_connection.send = mock.MagicMock()
+        self._client = mock_connection
+        return mock_connection
+
+    def decode_packet(self, *args, **kwargs):
+        return mock.MagicMock()
+
+    def consumer(self, callback, blocking=False, immortal=False, raw=False):
+        pass
+
+    def is_alive(self):
+        return True
+
+    def close(self):
+        pass
+
+    @staticmethod
+    def is_enabled():
+        return True
+
+    @staticmethod
+    def transport():
+        return "mock"
+
+    def reset(self):
+        """Mock implementation of reset."""
+        if self._client:
+            self._client.close()
+        self._client = self.setup_connection()
+        if self.filter:
+            self._client.set_filter(self.filter)
+
+
+class TestAPRSClient(unittest.TestCase):
+    def setUp(self):
+        # Reset the singleton instance before each test
+        APRSClient._instance = None
+        APRSClient._client = None
+        self.client = MockAPRSClient()
+
+    def test_singleton_pattern(self):
+        """Test that multiple instantiations return the same instance."""
+        client1 = MockAPRSClient()
+        client2 = MockAPRSClient()
+        self.assertIs(client1, client2)
+
+    def test_set_filter(self):
+        """Test setting APRS filter."""
+        # Get the existing mock client that was created in __init__
+        mock_client = self.client._client
+        test_filter = "m/50"
+        self.client.set_filter(test_filter)
+        self.assertEqual(self.client.filter, test_filter)
+        # The filter is set once during set_filter() and once during reset()
+        mock_client.set_filter.assert_called_with(test_filter)
+
+    @mock.patch("aprsd.client.base.LOG")
+    def test_reset(self, mock_log):
+        """Test client reset functionality."""
+        # Create a new mock client with the necessary methods
+        old_client = mock.MagicMock()
+        self.client._client = old_client
+        self.client.reset()
+        # Verify the old client was closed
+        old_client.close.assert_called_once()
+        # Verify a new client was created
+        self.assertIsNotNone(self.client._client)
+        self.assertNotEqual(old_client, self.client._client)
+
+    def test_send_packet(self):
+        """Test sending an APRS packet."""
+        mock_packet = mock.Mock(spec=core.Packet)
+        self.client.send(mock_packet)
+        self.client._client.send.assert_called_once_with(mock_packet)
+
+    def test_stop(self):
+        """Test stopping the client."""
+        # Ensure client is created first
+        self.client._create_client()
+        self.client.stop()
+        self.client._client.stop.assert_called_once()
+
+    @mock.patch("aprsd.client.base.LOG")
+    def test_create_client_failure(self, mock_log):
+        """Test handling of client creation failure."""
+        # Make setup_connection raise an exception
+        with mock.patch.object(
+            self.client, "setup_connection",
+            side_effect=Exception("Connection failed"),
+        ):
+            with self.assertRaises(Exception):
+                self.client._create_client()
+            self.assertIsNone(self.client._client)
+            mock_log.error.assert_called_once()
+
+    def test_client_property(self):
+        """Test the client property creates client if none exists."""
+        self.client._client = None
+        client = self.client.client
+        self.assertIsNotNone(client)
+
+    def test_filter_applied_on_creation(self):
+        """Test that filter is applied when creating new client."""
+        test_filter = "m/50"
+        self.client.set_filter(test_filter)
+        # Force client recreation
+        self.client.reset()
+        # Verify filter was applied to new client
+        self.client._client.set_filter.assert_called_with(test_filter)
+
+
+if __name__ == "__main__":
+    unittest.main()
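The `setUp` in the test file above resets `APRSClient._instance` so each test starts fresh, which implies the base client is a `__new__`-based singleton. A minimal sketch of that pattern (illustrative only, not aprsd's actual implementation):

```python
class Singleton:
    """Minimal __new__-based singleton.  The class-level _instance slot
    plays the same role as the APRSClient._instance attribute the tests
    above reset between runs."""

    _instance = None

    def __new__(cls):
        # First call constructs the instance; later calls return it
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance


a = Singleton()
b = Singleton()
print(a is b)  # prints True: both names refer to one shared instance
```

This is also why tests of singletons must reset the class attribute in `setUp`: otherwise state from one test's instance leaks into the next.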


@@ -0,0 +1,75 @@
+import unittest
+from unittest import mock
+
+from aprsd.client.factory import Client, ClientFactory
+
+
+class MockClient:
+    """Mock client for testing."""
+
+    @classmethod
+    def is_enabled(cls):
+        return True
+
+    @classmethod
+    def is_configured(cls):
+        return True
+
+
+class TestClientFactory(unittest.TestCase):
+    """Test cases for ClientFactory."""
+
+    def setUp(self):
+        """Set up test fixtures."""
+        self.factory = ClientFactory()
+        # Clear any registered clients from previous tests
+        self.factory.clients = []
+
+    def test_singleton(self):
+        """Test that ClientFactory is a singleton."""
+        factory2 = ClientFactory()
+        self.assertEqual(self.factory, factory2)
+
+    def test_register_client(self):
+        """Test registering a client."""
+        self.factory.register(MockClient)
+        self.assertIn(MockClient, self.factory.clients)
+
+    def test_register_invalid_client(self):
+        """Test registering an invalid client raises error."""
+        invalid_client = mock.MagicMock(spec=Client)
+        with self.assertRaises(ValueError):
+            self.factory.register(invalid_client)
+
+    def test_create_client(self):
+        """Test creating a client."""
+        self.factory.register(MockClient)
+        client = self.factory.create()
+        self.assertIsInstance(client, MockClient)
+
+    def test_create_no_clients(self):
+        """Test creating a client with no registered clients."""
+        with self.assertRaises(Exception):
+            self.factory.create()
+
+    def test_is_client_enabled(self):
+        """Test checking if any client is enabled."""
+        self.factory.register(MockClient)
+        self.assertTrue(self.factory.is_client_enabled())
+
+    def test_is_client_enabled_none(self):
+        """Test checking if any client is enabled when none are."""
+        MockClient.is_enabled = classmethod(lambda cls: False)
+        self.factory.register(MockClient)
+        self.assertFalse(self.factory.is_client_enabled())
+
+    def test_is_client_configured(self):
+        """Test checking if any client is configured."""
+        self.factory.register(MockClient)
+        self.assertTrue(self.factory.is_client_configured())
+
+    def test_is_client_configured_none(self):
+        """Test checking if any client is configured when none are."""
+        MockClient.is_configured = classmethod(lambda cls: False)
+        self.factory.register(MockClient)
+        self.assertFalse(self.factory.is_client_configured())
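The factory tests above exercise a register/create registry: classes register themselves, registering a non-class raises `ValueError`, and `create()` instantiates an enabled, configured client. A minimal standalone sketch of that behavior; this is a hypothetical simplification, not aprsd's actual `ClientFactory`:

```python
class ClientFactory:
    """Registry of client classes; creates the first usable one."""

    def __init__(self):
        self.clients = []

    def register(self, client_cls):
        # Registering an instance (e.g. a MagicMock) is a usage error
        if not isinstance(client_cls, type):
            raise ValueError("register() expects a class, not an instance")
        self.clients.append(client_cls)

    def create(self):
        for client_cls in self.clients:
            if client_cls.is_enabled() and client_cls.is_configured():
                return client_cls()
        raise Exception("No enabled and configured client registered")


class DummyClient:
    """Stand-in client satisfying the factory's classmethod protocol."""

    @classmethod
    def is_enabled(cls):
        return True

    @classmethod
    def is_configured(cls):
        return True


factory = ClientFactory()
factory.register(DummyClient)
client = factory.create()  # an instance of DummyClient
```

Keeping `is_enabled`/`is_configured` as classmethods lets the factory interrogate candidates without instantiating them, which is the property the `test_is_client_enabled_none` cases rely on.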


@@ -1,15 +1,9 @@
-import sys
 import unittest
+from unittest import mock

 from aprsd.plugins import email

-if sys.version_info >= (3, 2):
-    from unittest import mock
-else:
-    from unittest import mock
-

 class TestMain(unittest.TestCase):
     @mock.patch("aprsd.plugins.email._imap_connect")
     @mock.patch("aprsd.plugins.email._smtp_connect")