Compare commits

...

765 Commits

Author SHA1 Message Date
Walter A. Boring IV 1828342ef2
Merge pull request #164 from craigerl/client_rework
Refactor client and drivers
2024-05-23 12:00:04 -04:00
Hemna b317d0eb63 Refactor client and drivers
this patch refactors the client, drivers and client factory
to use the same Protocol mechanism used by the stats collector
to construct the proper client to be used according to
the configuration
2024-05-23 11:38:27 -04:00
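As a rough illustration of the Protocol-plus-factory approach described above (a minimal sketch, not APRSD's actual driver code; the class and method names here are hypothetical):

    from typing import Protocol

    class ClientDriver(Protocol):
        def is_enabled(self) -> bool: ...
        def setup_connection(self) -> None: ...

    class APRSISDriver:
        def is_enabled(self) -> bool:
            return True          # e.g. driven by the aprs_network config in a real setup
        def setup_connection(self) -> None:
            print("connecting to APRS-IS")

    class KISSDriver:
        def is_enabled(self) -> bool:
            return False
        def setup_connection(self) -> None:
            print("connecting to a KISS TNC")

    def client_factory(drivers) -> ClientDriver:
        # return the first driver the configuration enables
        for driver_cls in drivers:
            driver = driver_cls()
            if driver.is_enabled():
                return driver
        raise RuntimeError("no client driver enabled in the configuration")

    client_factory([APRSISDriver, KISSDriver]).setup_connection()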
Walter A. Boring IV 63962acfe6
Merge pull request #167 from craigerl/docker-rework
Refactor Dockerfile
2024-05-23 11:37:50 -04:00
Walter A. Boring IV 44a72e813e
Merge pull request #166 from craigerl/dependabot/pip/requests-2.32.0
Bump requests from 2.31.0 to 2.32.0
2024-05-23 10:59:46 -04:00
Hemna afeb11a085 Refactor Dockerfile
This patch reworks the main Dockerfile to do builds for
both the pypi upstream release of aprsd as well as the
github repo branch of aprsd for development.  This eliminates
the need for Dockerfile-dev.

This patch also installs aprsd as a user in the container image
instead of as root.
2024-05-23 10:58:46 -04:00
dependabot[bot] 18fb2a9e2b
Bump requests from 2.31.0 to 2.32.0
---
updated-dependencies:
- dependency-name: requests
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-05-21 05:54:04 +00:00
Hemna fa2d2d965d updated requirements 2024-05-18 11:20:05 -04:00
Hemna 2abf8bc750 Use newer python -m build to build aprsd wheel
This patch changes the Makefile to make use of the
more modern mechanism in python to build a package
and wheel.
2024-05-18 11:19:10 -04:00
Hemna f15974131c Eliminate need for PBR
This patch also removes the setup.cfg and replaces it with
the pyproject.toml.

This also renames the dev-requirements.txt to requirements-dev.txt

To install dev
pip install -e ".[dev]"
2024-05-18 11:19:07 -04:00
Walter A. Boring IV 4d1dfadbde
Merge pull request #163 from craigerl/dependabot/pip/jinja2-3.1.4
Bump jinja2 from 3.1.3 to 3.1.4
2024-05-07 20:01:37 -04:00
Hemna 93a9cce0c0 Put an upper bound on the QueueHandler queue
This patch overrides the base QueueHandler class
from logging to ensure that the queue doesn't grow
infinitely.  That can be a problem when there is
no consumer pulling items out of the queue.
the queue is now capped at 200 entries max.
2024-05-07 20:00:17 -04:00
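A minimal sketch of a bounded queue behind the standard logging.handlers.QueueHandler; the drop-on-full behavior is an assumption about the override, not the exact APRSD code:

    import logging
    import logging.handlers
    import queue

    log_queue = queue.Queue(maxsize=200)   # hard cap of 200 entries

    class BoundedQueueHandler(logging.handlers.QueueHandler):
        def enqueue(self, record):
            try:
                self.queue.put_nowait(record)
            except queue.Full:
                pass                        # drop the record rather than grow forever

    logging.getLogger().addHandler(BoundedQueueHandler(log_queue))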
dependabot[bot] 321260ff7a
Bump jinja2 from 3.1.3 to 3.1.4
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.3 to 3.1.4.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.3...3.1.4)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-05-06 20:55:03 +00:00
Hemna cb2a3441b4 Updated Changelog for 3.4.0 2024-04-29 09:38:47 -04:00
Hemna fc9ab4aa74 Change setup.sh 2024-04-24 19:36:15 -04:00
Hemna a5680a7cbb Fixed docker setup.sh comparison 2024-04-24 19:11:59 -04:00
Hemna c4b17eee9d Fixed unit tests failing with WatchList 2024-04-24 16:27:40 -04:00
Hemna 63f3de47b7 Added config enable_packet_logging
If you want to disable the logging of packets to the log file, set this
new common config option to False
2024-04-24 13:57:24 -04:00
Hemna c206f52a76 Make all the Objectstore children use the same lock
This patch updates the ObjectStore and its child classes to
all use the same lock.
2024-04-24 13:53:23 -04:00
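A minimal sketch of sharing one lock across all ObjectStore children (illustrative only; the real mixin persists its data and differs in detail):

    import threading

    class ObjectStoreMixin:
        lock = threading.Lock()            # one lock shared by every child class

        def __init__(self):
            self.data = {}

        def add(self, key, value):
            with ObjectStoreMixin.lock:
                self.data[key] = value

    class PacketTrack(ObjectStoreMixin):
        pass

    class WatchList(ObjectStoreMixin):
        pass

    # both children serialize their updates on the same class-level lock
    PacketTrack().add("N0CALL-1", object())
    WatchList().add("N0CALL-2", object())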
Hemna 2b2bf6c92d Fixed PacketTrack with UnknownPacket
This patch fixes an issue with rx() for an UnknownPacket type
trying to access ackMsgNo (reply ack)
2024-04-24 10:45:47 -04:00
Hemna 992485e9c7 Removed the requirement on click-completion
This was an older way to do command line completion with
click.  Now we use the built in completion with click itself.
click.shell_completion
2024-04-23 16:14:29 -04:00
Hemna f02db20c3e Update Dockerfiles
this patch changes the entrypoint and commands to be in line
with how Docker defines their usage.  this allows the admin using
this container to specify which command to run in the
docker-compose.yml if they want to run something other than the
aprsd server command.

This now allows to easily run webchat as a container :)!
2024-04-23 09:38:37 -04:00
Hemna 09b97086bc Added fix for entry_points with old python 2024-04-21 12:41:19 -04:00
Hemna c43652dbea Added config for enable_seen_list
This patch allows the admin to disable the callsign seen list
packet tracking feature.
2024-04-20 19:54:02 -04:00
Hemna 29d97d9f0c Fix APRSDStats start_time 2024-04-20 17:07:48 -04:00
Hemna 813bc7ea29 Added default_packet_send_count config
This allows you to configure how many times a non ACK packet
will be sent before giving up.
2024-04-19 15:59:55 -04:00
Hemna bef32059f4 Call packet collector after prepare during tx.
We have to call the packet collector.tx() only after
a packet has been prepared for tx, because that's when the
new msgNo is assigned.
2024-04-19 13:02:58 -04:00
Hemna 717db6083e Added PacketTrack to packet collector
Now the PacketTrack object is a packet collector as well.
2024-04-17 16:54:08 -04:00
Hemna 4c7e27c88b Webchat Send Beacon uses Path selected in UI
This patch changes the Send Beacon button capability in
webchat to use the path selected in the UI for the
actual beacon being sent out.
2024-04-17 12:34:01 -04:00
Hemna 88d26241f5 Added try except blocks in collectors
This patch adds some try except blocks in both the stats collector
and the packets collector calls to registered objects.  This can
prevent the rest of APRSD falling down when the collector objects
have a failure of some sort.
2024-04-17 12:24:56 -04:00
Hemna 27359d61aa Remove error logs from watch list 2024-04-17 09:01:49 -04:00
Hemna 7541f13174 Fixed issue with PacketList being empty 2024-04-16 23:12:58 -04:00
Hemna a656d93263 Added new PacketCollector
this patch adds the new PacketCollector class.
It's a single point for collecting information about
packets sent and received from the APRS client.
Basically instead of having the packetlist call the seen list
when we get a packet, we simply call the PacketCollector.rx(),
which in turn calls each registered PacketMonitor class.

This allows us to decouple the packet-stats-like classes inside
of APRSD.  More importantly, it allows extensions to append their
own PacketMonitor class to the chain without modifying APRSD.
2024-04-16 14:34:14 -04:00
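A minimal sketch of the collector described above: monitors register with the collector, and every received packet is fanned out to each of them (names follow the commit message, the bodies are illustrative):

    from typing import Protocol

    class PacketMonitor(Protocol):
        def rx(self, packet) -> None: ...
        def tx(self, packet) -> None: ...

    class PacketCollector:
        def __init__(self):
            self._monitors = []

        def register(self, monitor: PacketMonitor) -> None:
            self._monitors.append(monitor)

        def rx(self, packet) -> None:
            # fan the received packet out to every registered monitor
            for monitor in self._monitors:
                monitor.rx(packet)

        def tx(self, packet) -> None:
            for monitor in self._monitors:
                monitor.tx(packet)

    class SeenList:
        def rx(self, packet) -> None:
            print("seen:", packet)
        def tx(self, packet) -> None:
            pass

    collector = PacketCollector()
    collector.register(SeenList())
    collector.rx({"from": "N0CALL-1"})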
Hemna cb0cfeea0b Fixed Keepalive access to email stats
this patch fixes a potential issue accessing an email stat
that might not be set yet.
2024-04-16 13:09:33 -04:00
Hemna 8d86764c23 Added support for RX replyacks
This patch adds support for processing incoming packets that have
the 'new' acks embedded in messages called replyacks as described here:

http://www.aprs.org/aprs11/replyacks.txt
2024-04-16 11:39:46 -04:00
Hemna dc4879a367 Changed Stats Collector registration
This patch changes the stats Collector object registration
to take a class name instead of an object.   This allows the
app to start up and fetch the configuration correctly so that
when objects are created the CONF has the proper values.
This is so singleton objects can assign settings values at
creation time.
2024-04-16 11:06:38 -04:00
Hemna 4542c0a643 Added PacketList.set_maxlen()
If you want a constructor-time member to have a configured
value, you have to set it after the stats collector
registration is done; otherwise it will only have the default,
since the CONF isn't set up at that point yet.
2024-04-15 21:43:01 -04:00
Hemna 3e8716365e another fix for tx send 2024-04-15 11:29:26 -04:00
Hemna 758ea432ed removed Packet.last_send_attempt and just use send_count 2024-04-15 10:00:35 -04:00
Hemna 1c9f25a3b3 Fix access to PacketList._maxlen 2024-04-15 09:19:05 -04:00
Hemna 7c935345e5 added packet_count in packet_list stats 2024-04-15 08:34:45 -04:00
Hemna c2f8af06bc force uwsgi to 2.0.24 2024-04-14 20:27:26 -04:00
Hemna 5b2a59fae3 small update 2024-04-14 14:08:46 -04:00
Hemna 8392d6b8ef Added new config options for PacketList
This allows the admin to set the number of packets to store
in the PacketList object for tracking.  For apps like IRC,
we need to store lots more packets to detect dupes.
2024-04-14 12:48:09 -04:00
Hemna 1a7694e7e2 Update requirements 2024-04-13 10:41:49 -04:00
Hemna f2d39e5fd2 Added threads chart to admin ui graphs 2024-04-12 17:43:11 -04:00
Hemna 3bd7adda44 set packetlist max back to 100 2024-04-12 17:17:53 -04:00
Hemna 91ba6d10ce ensure thread count is updated 2024-04-12 17:03:10 -04:00
Hemna c6079f897d Added threads table in the admin web ui 2024-04-12 16:33:52 -04:00
Hemna 66e4850353 Fixed issue with APRSDThreadList stats()
The stats method was setting the key to the class name
and not the thread name, giving an inaccurate list
of the actual running threads.
2024-04-12 15:08:39 -04:00
Hemna 40c028c844 Added new default_ack_send_count config option
There may be applications where the admin might not want a hard
coded 3 acks sent for every RX'd packet.  This patch adds the
ability to change the number of acks sent per RX'd packet.
The default is still 3.
2024-04-12 14:36:27 -04:00
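APRSD's configuration is handled with oslo.config (CONF appears throughout this log); a hedged sketch of how an option such as default_ack_send_count could be declared — the group placement and help text here are assumptions:

    from oslo_config import cfg

    CONF = cfg.CONF

    CONF.register_opts([
        cfg.IntOpt(
            "default_ack_send_count",
            default=3,     # still 3 acks per RX'd packet by default
            help="How many times an ack is sent for a received message.",
        ),
    ])

    print(CONF.default_ack_send_count)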
Hemna 4c2a40b7a7 Remove packet from tracker after max attempts 2024-04-12 11:12:57 -04:00
Hemna f682890ef0 Limit packets to 50 in PacketList 2024-04-12 09:01:57 -04:00
Hemna 026dc6e376 synchronize the add for StatsStore 2024-04-11 22:55:01 -04:00
Hemna f59b65d13c Lock on stats for PacketList 2024-04-11 22:24:02 -04:00
Hemna 5ff62c9bdf Fixed PacketList maxlen
This patch removes the MutableMapping from PacketList
and fixes the code that keeps the max packets in the internal
dict.
2024-04-11 21:40:43 -04:00
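A minimal sketch of keeping a plain dict capped at a maximum number of packets, in the spirit of the fix above (the real PacketList differs in detail):

    from collections import OrderedDict

    class CappedPacketDict:
        def __init__(self, maxlen=100):
            self.maxlen = maxlen
            self.data = OrderedDict()

        def add(self, key, packet):
            self.data[key] = packet
            self.data.move_to_end(key)           # newest entries live at the end
            while len(self.data) > self.maxlen:
                self.data.popitem(last=False)    # evict the oldest packet

    store = CappedPacketDict(maxlen=2)
    for i in range(5):
        store.add(f"pkt-{i}", object())
    print(list(store.data))                      # only the two newest keys remain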
Hemna 5fa4eaf909 Fixed a problem with the webchat tab notification
Somehow the hidden div for the webchat interface's
tab notification was removed.  This patch adds it back in
so the user knows that they have message(s) for a tab that
isn't selected.
2024-04-11 18:11:05 -04:00
Hemna f34120c2df Another fix for ACK packets 2024-04-11 17:28:47 -04:00
Hemna 3bef1314f8 Fix issue not tracking RX Ack packets for stats
This patch updates the RX tracking for packets.  Every
packet that comes into the rx thread is now tracked
so the stats are accurate.
2024-04-11 16:54:46 -04:00
Hemna 94f36e0aad Fix time plugin
This patch adds the tzlocal package to help find the local timezone
correctly such that pytz can correctly build the time needed for
the time plugin.
2024-04-10 22:03:29 -04:00
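A hedged sketch of the tzlocal/pytz combination mentioned above; the actual call sites in the time plugin are not shown:

    import datetime

    import pytz
    from tzlocal import get_localzone

    local_tz = get_localzone()                    # the host's local timezone
    utc_now = datetime.datetime.now(pytz.UTC)
    local_now = utc_now.astimezone(local_tz)      # what the time plugin would report
    print(local_now.strftime("%H:%M %Z"))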
Craig Lamparter 886ad9be09
add GATE route to webchat along with WIDE1, etc 2024-04-10 13:19:46 -07:00
Craig Lamparter aa6e732935
Update webchat, include GATE route along with WIDE, ARISS, etc 2024-04-10 13:18:24 -07:00
Hemna b3889896b9 Get rid of some useless warning logs 2024-04-10 13:59:32 -04:00
Hemna 8f6f8007f4 Added human_info property to MessagePackets
This patch adds the human_info property to the MessagePacket
object to just return the filtered message_text
2024-04-10 13:58:44 -04:00
Hemna 2e9cf3ce88 Fixed scrolling problem with new webchat sent msg
The Webchat ui was failing to scroll properly upon sending
a new message from a tab that had a lot of messages already.
2024-04-09 10:07:12 -04:00
Hemna 8728926bf4 Fix some issues with listen command
The listen command had some older references to some of the
thread modules.  this patch fixes those.
2024-04-09 09:58:59 -04:00
Hemna 2c5bc6c1f7 Admin interface catch empty stats
This patch adds checks in the admin js to ensure that the
specific stats aren't empty before trying to dereference.
2024-04-09 07:46:06 -04:00
Hemna 80705cb341 Ensure StatsStore has empty data
This patch ensures that the StatsStore object has a default
empty dict for data.
2024-04-09 06:59:22 -04:00
Hemna a839dbd3c5 Ensure latest pip is in docker image
this patch adds a command to update pip in both Dockerfile's
2024-04-08 17:00:42 -04:00
Walter A. Boring IV 1267a53ec8
Merge pull request #159 from craigerl/stats-rework
Reworked the stats making the rpc server obsolete.
2024-04-08 16:12:16 -04:00
Hemna da882b4f9b LOG failed requests post to admin ui 2024-04-08 13:07:15 -04:00
Hemna 6845d266f2 changed admin web_ip to StrOpt
The option was an IPOpt, which prevented the user
from setting the ip to a hostname
2024-04-08 12:47:17 -04:00
Hemna db2fbce079 Updated prism to 1.29 2024-04-08 10:26:54 -04:00
Hemna bc3bdc48d2 Removed json-viewer 2024-04-08 10:16:08 -04:00
Hemna 7114269cee Remove rpyc as a requirement 2024-04-05 16:00:45 -04:00
Hemna fcc02f29af Delete more stats from webchat
This patch removes some more stats that the webchat
ui doesn't need.
2024-04-05 15:24:11 -04:00
Hemna 0ca9072c97 Admin UI working again 2024-04-05 15:03:22 -04:00
Hemna 333feee805 Removed RPC Server and client.
This patch removes the need for the RPC Server from aprsd.

APRSD now saves its stats to a pickled file on disk in the
aprsd.conf configured save_location.  The web admin UI
will depickle that file to fetch the stats.  The aprsd server
will periodically pickle and save the stats to disk.

The Logmonitor will now do a url post to the web admin ui
to send it the latest log entries.

Updated the healthcheck app to use the pickled stats file
and the fetch-stats command to make a url request to the running
admin ui to fetch the stats of the remote aprsd server.
2024-04-05 12:50:01 -04:00
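A minimal sketch of the pickled-stats handoff described above; the file path is a placeholder for the configured save_location:

    import pickle

    STATS_FILE = "/tmp/aprsd_stats.pkl"    # placeholder; APRSD uses the configured save_location

    def save_stats(stats):
        # server side: periodically dump the stats dict
        with open(STATS_FILE, "wb") as f:
            pickle.dump(stats, f)

    def load_stats():
        # admin UI / healthcheck side: depickle the same file
        with open(STATS_FILE, "rb") as f:
            return pickle.load(f)

    save_stats({"packets_rx": 42, "uptime": "1:00:00"})
    print(load_stats()["packets_rx"])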
Hemna a8d56a9967 Remove the logging of the conf password if not set 2024-04-03 18:01:11 -04:00
Hemna 50e491bab4 Lock around client reset
We now have multiple places where we call reset in case
a network connection fails, so now there is a mutex lock
around the reset method.
2024-04-02 18:23:37 -04:00
Hemna 71d72adf06 Allow stats collector to serialize upon creation
This does some cleanup with the stats collector and
usage of the stats.  The patch adds a new optional
param to the collector's collect() method to tell
the object to provide serializable stats.  This is
used for the webchat app that sends stats to the
browser.
2024-04-02 14:07:37 -04:00
Hemna e2e58530b2 Fixed issues with watch list at startup 2024-04-02 09:30:45 -04:00
Hemna 01cd0a0327 Fixed access to log_monitor 2024-04-02 09:30:45 -04:00
Hemna f92b2ee364 Got unit tests working again 2024-04-02 09:30:45 -04:00
Hemna a270c75263 Fixed pep8 errors and missing files 2024-04-02 09:30:45 -04:00
Hemna bd005f628d Reworked the stats making the rpc server obsolete.
This patch implements a new stats collector paradigm
which uses the typing Protocol.  Any object that wants to
supply stats to the collector has to implement the
aprsd.stats.collector.StatsProducer protocol, which at the
current time is implementing a stats() method on the object.

Then register the stats singleton producer with the collector by
calling collector.Collector().register_producer()

This only works if the stats producer object is a singleton.
2024-04-02 09:30:43 -04:00
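A minimal sketch of the StatsProducer/Collector pattern the commit describes; the singleton handling and exact Collector API are simplified here:

    from typing import Protocol

    class StatsProducer(Protocol):
        def stats(self) -> dict: ...

    class Collector:
        producers = []                     # classes are registered, not instances

        @classmethod
        def register_producer(cls, producer_cls):
            cls.producers.append(producer_cls)

        def collect(self) -> dict:
            # instantiate each registered (singleton) producer and gather its stats()
            return {p.__name__: p().stats() for p in self.producers}

    class APRSDStats:
        def stats(self) -> dict:
            return {"uptime": "0:00:01"}

    Collector.register_producer(APRSDStats)
    print(Collector().collect())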
Walter A. Boring IV 200944f37a
Merge pull request #158 from craigerl/client-update
Update client.py to add consumer in the API.
2024-04-02 09:26:30 -04:00
Hemna a62e490353 Update client.py to add consumer in the API.
This adds a layer between the client object and the
actual client instance, so we can reset the actual
client object instance upon failure of connection.
2024-03-28 16:51:56 -04:00
Hemna 428edaced9 Fix for sample-config warning
This patch fixes a small issue with the sample-config command
outputting a warning during generation.
2024-03-27 10:29:30 -04:00
Hemna 8f588e653d update requirements 2024-03-25 09:47:16 -04:00
Walter A. Boring IV 144ad34ae5
Merge pull request #154 from craigerl/packet_updates
Packet updates
2024-03-25 09:20:35 -04:00
Hemna 0321cb6cf1 Put packet.json back in 2024-03-23 21:06:20 -04:00
Hemna c0623596cd Change debug log color
this patch changes the debug log color from dark blue to grey
2024-03-23 19:27:23 -04:00
Hemna f400c6004e Fix for filtering curse words
This patch adds a fix for filtering out curse words.
This adds a flag to the regex to ignore case!
2024-03-23 18:02:01 -04:00
Hemna 873fc06608 added packet counter random int
The packet counter now starts at a random number between 1 and 9999
instead of always at 1.
2024-03-23 17:56:49 -04:00
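A minimal sketch of a message counter that starts at a random value between 1 and 9999 and wraps; the thread-safety detail is an assumption:

    import random
    import threading

    class MessageCounter:
        MAX = 9999

        def __init__(self):
            self._lock = threading.Lock()
            self._value = random.randint(1, self.MAX)   # random start instead of always 1

        def next(self) -> str:
            with self._lock:
                self._value = self._value + 1 if self._value < self.MAX else 1
                return str(self._value)   # msgNo is handled as a string elsewhere in the tracker

    print(MessageCounter().next())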
Hemna f53df24988 More packet cleanup and tests 2024-03-23 17:05:41 -04:00
Hemna f4356e4a20 Show comment in multiline packet output
This patch adds the comment for a packet if it exists
in the multiline log output
2024-03-23 13:00:51 -04:00
Hemna c581dc5020 Added new config option log_packet_format
This new DEFAULT group option specifies what format to use
when logging a packet.
2024-03-23 11:50:01 -04:00
Hemna da7b7124d7 Some packet cleanup 2024-03-23 10:54:10 -04:00
Hemna 9e26df26d6 Added new webchat config option for logging
This patch adds a new config option for the webchat command
to disable url request logging.
2024-03-23 10:46:17 -04:00
Hemna b461231c00 Fix some pep8 issues 2024-03-23 10:24:02 -04:00
Hemna 1e6c483002 Completely redo logging of packets!!
refactored all logging of packets.

Packet class now doesn't do logging.
the format of the packet log now lives on a single line with
colors.

Created a new packet property called human_info, which
creates a string for the payload of each packet type
in a human readable format.

TODO: need to create a config option to allow showing the
older style of multiline logs for packets.
2024-03-22 23:20:16 -04:00
Hemna 127d3b3f26 Fixed some logging in webchat 2024-03-22 23:19:54 -04:00
Hemna f450238348 Added missing packet types in listen command
This patch adds some missing packet objects for the
listen command.  Also moves the keepalive startup
a little later
2024-03-22 23:18:47 -04:00
Hemna 9858955d34 Don't call stats so often in webchat 2024-03-22 23:16:00 -04:00
Hemna e386e91f6e Eliminated need for from_aprslib_dict
This patch eliminates the need for a custom
static method on each Packetclass to convert an aprslib
raw decoded dictionary -> correct Packet class.

This now uses the built in dataclasses_json from_dict()
mixin with an override for both the WeatherPacket and
the ThirdPartyPacket.

This patch also adds the TelemetryPacket and adds some
missing members to a few of the classes from test runs
decoding all packets from APRS-IS -> Packet classes.

Also adds some verification for packets in test_packets
2024-03-20 21:46:43 -04:00
Hemna 386d2bea62 Fix for micE packet decoding with mbits 2024-03-20 16:12:18 -04:00
Hemna eada5e9ce2 updated dev-requirements 2024-03-20 15:52:01 -04:00
Hemna 00e185b4e7 Fixed some tox errors related to mypy 2024-03-20 15:41:29 -04:00
Hemna 1477e61b0f Refactored packets
this patch removes the need for the dacite2 package for creating
packet objects from the aprslib decoded packet dictionary.

moved the factory method from the base Packet object
to the core module.
2024-03-20 15:41:25 -04:00
Hemna 6f1d6b4122 removed print 2024-03-20 15:39:18 -04:00
Hemna 90f212e6dc small refactor of stats usage in version plugin 2024-03-20 15:39:18 -04:00
Hemna 9c77ca26be Added type setting on pluging.py for mypy 2024-03-20 15:39:18 -04:00
Hemna d80277c9d8 Moved Threads list for mypy
This patch moves the APRSDThreadList to the bottom
of the file so that we can specify the type in the
threads_list member for mypy.
2024-03-20 15:39:18 -04:00
Hemna 29b4b04eee No need to synchronize on stats
this patch updates the stats object to remove the synchronize
on calling stats.  Each property on the stats object is already
synchronized.
2024-03-20 15:39:18 -04:00
Hemna 12dab284cb Start to add types 2024-03-20 15:39:18 -04:00
Hemna d0f53c563f Update tox for mypy runs 2024-03-20 15:39:18 -04:00
Walter A. Boring IV 24830ae810
Merge pull request #155 from craigerl/dependabot/pip/black-24.3.0
Bump black from 24.2.0 to 24.3.0
2024-03-20 15:38:59 -04:00
dependabot[bot] 52896a1c6f
Bump black from 24.2.0 to 24.3.0
Bumps [black](https://github.com/psf/black) from 24.2.0 to 24.3.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.2.0...24.3.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-20 18:14:03 +00:00
Hemna 82b3761628 replaced access to conf from uwsgi 2024-03-14 12:15:23 -04:00
Hemna 8797dfd072 Fixed call to setup_logging in uwsgi 2024-03-14 12:11:30 -04:00
Hemna c1acdc2510 Fixed access to conf.log in logging_setup 2024-03-14 11:41:34 -04:00
Hemna 71cd7e0ab5 Changelog for 3.3.2 2024-03-13 13:49:11 -04:00
Hemna d485f484ec Remove warning during sample-config
This patch removes a warning log during sample-config
generation
2024-03-13 13:47:01 -04:00
Hemna f810c02d5d Removed print in utils
this patch removes a leftover debug print in utils.load_entry_points
that was causing sample-config output to be bogus.
2024-03-13 13:44:09 -04:00
Hemna 50e24abb81 Updates for 3.3.1 2024-03-12 10:41:16 -04:00
Hemna 10d023dd7b Fixed failure with fetch-stats
This patch makes fetch-stats fail gracefully if it can't connect
to the rpc server on the other end.
2024-03-12 10:37:17 -04:00
Hemna cb9456b29d Fixed problem with list-plugins
This patch includes a fix to the list-plugins and
list-extensions commands.
2024-03-12 10:36:26 -04:00
Hemna c37e1d58bb Changelog for 3.3.0 2024-03-12 10:08:38 -04:00
Hemna 0ca5ceee7e sample-config fix
This patch makes a change on how it's calling importlib.entry_points
to only fetch the group we want, which is 'oslo.config.opts'.
This fixes a problem with python 3.12 compatibility.
2024-03-11 11:53:28 -04:00
Hemna 2e9c9d40e1 Fixed registry url post 2024-03-08 11:49:10 -05:00
Hemna 66004f639f Changed processpkt message
this includes the pkt.key in the log entry
2024-03-08 11:25:46 -05:00
Hemna 0b0afd39ed Fixed RegistryThread not sending requests 2024-03-08 09:18:28 -05:00
Hemna aec88d4a7e use log.setup_logging 2024-03-07 12:43:10 -05:00
Hemna 24bbea1d49 Disable debug logs for aprslib
This patch disables propagating the debug logs
from the aprslib parsing.  We don't really need to see
this in our aprsd services.
2024-03-07 09:46:36 -05:00
Hemna 5d3f42f411 Make registry thread sleep
This patch adds a required sleep of 1 second in each
registry thread loop to prevent runaway cpu usage
2024-03-07 08:37:09 -05:00
Walter A. Boring IV 44a98850c9
Merge pull request #147 from craigerl/loguru
Replace slow rich logging with loguru
2024-03-06 14:18:19 -05:00
Hemna 2cb9c2a31c Put threads first after date/time 2024-03-06 13:39:51 -05:00
Hemna 2fefa9fcd6 Replace slow rich logging with loguru
This patch removes the rich logging with
the modern loguru logging
2024-03-06 13:00:52 -05:00
Hemna d092a43ec9 Updated requirements 2024-03-06 12:59:21 -05:00
Hemna d1a09fc6b5 Fixed pep8 2024-02-28 16:24:01 -05:00
Hemna ff051bc285 Added list-extensions and updated README.rst
This patch adds the list-extensions command to support
showing the available extensions for APRSD that live on
pypi.
2024-02-28 16:10:55 -05:00
Hemna 5fd91a2172 Change defaults for beacon and registry
The beacon frequency is now every 30 minutes by default.
The registry call is now every hour.
2024-02-28 13:23:11 -05:00
Hemna a4630c15be Add log info for Beacon and Registry threads 2024-02-27 16:01:15 -05:00
Hemna 6a7d7ad79b fixed frequency_seconds to IntOpt 2024-02-27 15:53:03 -05:00
Hemna 7a5b55fa77 fixed references to conf 2024-02-27 15:48:58 -05:00
Hemna a1e21e795d changed the default packet timeout to 5 minutes 2024-02-27 15:11:39 -05:00
Hemna cb291de047 Fixed default service registry url 2024-02-27 15:10:21 -05:00
Hemna e9c48c1914 fix pep8 failures 2024-02-27 14:21:04 -05:00
Hemna f0ad6d7577 py311 fails in github 2024-02-27 13:46:28 -05:00
Hemna 38fe408c82 Don't send uptime to registry 2024-02-27 13:40:39 -05:00
Hemna 8264c94bd6 Added sending software string to registry
This patch adds sending the APRSD signature and url
along with the registry request.
2024-02-27 11:05:41 -05:00
Hemna 1ad2e135dc add py310 gh actions 2024-02-27 08:14:05 -05:00
Hemna 1e4f0ca65a Added the new APRS Registry thread
This patch adds the new APRSRegistryThread,
which, when enabled in the config, will send a small
packet of information to the not-yet-deployed
APRS service registry every 900 seconds.

The data that this thread will send is
the service callsign, a description of the service,
and a website url for the service.

The idea is that the registry website this thread
sends information to will show all the services that are
running on the APRS network, so ham operators can discover
them and try them out.
2024-02-26 18:28:52 -05:00
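A hedged sketch of what the registry thread's periodic report could look like; the endpoint, field names, and use of the requests library are assumptions, not the actual registry API:

    import time

    import requests

    REGISTRY_URL = "https://example.org/api/v1/registry"   # hypothetical endpoint

    def registry_loop():
        payload = {
            "callsign": "MYCALL",                  # the service callsign
            "description": "Example APRSD service",
            "website": "https://example.org",
        }
        while True:
            try:
                requests.post(REGISTRY_URL, json=payload, timeout=10)
            except requests.RequestException:
                pass                               # keep running even if the registry is down
            time.sleep(900)                        # report every 900 seconds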
Hemna 41185416cb Added installing extensions to Docker run
This patch adds the installation of APRSD extensions via pip during startup
time for the main server run.sh, admin.sh and listen.sh
2024-02-25 15:05:45 -05:00
Hemna 68f23d8ca7 Cleanup some logs
This patch removes some debug logging from the clients.
2024-02-25 15:04:26 -05:00
Hemna 11f1e9533e Added BeaconPacket
This patch adds the BeaconPacket and BeaconSendThread.
This will enable APRSD server to send a beacon if enabled in
the config.
2024-02-25 14:21:17 -05:00
Hemna 275bf67b9e updated requirements files 2024-02-24 14:30:08 -05:00
Hemna 968345944a removed some unneeded code
removed the callsigns locations iterator
2024-02-24 14:28:37 -05:00
Hemna df2798eafb Added iterator to objectstore
Since the objectstore mixin uses an iterable to store its data,
it was easy to add an __iter__ to the objectstore class itself.
2024-02-24 14:27:39 -05:00
Hemna e89f8a805b Added some missing classes to threads
Added new APRSDupeThread
2024-02-24 14:26:55 -05:00
Hemna b14307270c Added support for loading extensions
This patch adds support for loading extensions
to APRSD!!

You can create another separate aprsd project, and register
your extension in your setup.cfg as a new entry point for aprsd
like

[entry_points]
aprsd.extension =
    cool = my_project.extension

in your my_project/extension.py file
import your commands and away you go.
2024-02-23 16:53:42 -05:00
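A minimal sketch of discovering and importing the aprsd.extension entry point group at startup, shown with importlib.metadata; this is an assumption about the mechanism rather than APRSD's exact loader:

    from importlib.metadata import entry_points

    def load_extensions():
        try:
            eps = entry_points(group="aprsd.extension")       # Python 3.10+
        except TypeError:
            eps = entry_points().get("aprsd.extension", [])   # older importlib.metadata
        for ep in eps:
            module = ep.load()                                # imports e.g. my_project.extension
            print(f"loaded extension {ep.name} -> {module.__name__}")

    load_extensions()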
Walter A. Boring IV ebee8e1439
Merge pull request #146 from craigerl/webchat-location
Added location for callsign tabs in webchat
2024-02-20 10:30:21 -05:00
Hemna a7e30b0bed Added location for callsign tabs in webchat
This patch adds the new feature of trying to fetch the location
distance and bearing for each callsign in the webchat tabs.
This is handy when out on the go: you can get a general idea
of where the other callsign is when chatting with them.

First, aprsd webchat tries to fetch the location with an aprs.fi
REST api call.  This assumes internet access.  If this fails,
then webchat will send a special message to REPEAT to ask it for
the location information for the callsign.  This is sent over
the air.
2024-02-20 10:18:22 -05:00
Hemna 1a5c5f0dce updated gitignore 2024-02-15 14:42:04 -05:00
Walter A. Boring IV a00c4ea840
Create codeql.yml
Try out code scanning
2024-02-07 09:43:29 -05:00
Hemna a88de2f09c update github action branches to v8 2024-02-07 08:55:24 -05:00
Hemna d6f0f05315 Added Location info on webchat interface
This patch adds a new popover in the webchat tab to show
the location information for a callsign.

webchat will try to hit aprs.fi to fetch the location from the
callsign's last beacon.  If there is no internet, this will fail
and webchat will send a request to REPEAT callsign for the location
information.
2024-02-06 16:52:56 -05:00
Hemna 03c58f83cd Updated dev test-plugin command
This patch updates the output of the aprsd dev test-plugin command
to show the packets that would actually get sent by the plugin
results.
2024-01-19 11:30:15 -05:00
Hemna a4230d324a Update requirements.txt 2024-01-16 16:45:07 -05:00
Hemna 8bceb827ec Update for v3.2.3 2024-01-09 09:12:01 -05:00
Hemna 12a3113192 Force fortune path during setup test
For whatever reason shutil.which() can't find
fortune in the path, unless you specify the entire path.
2024-01-09 01:30:43 +00:00
Hemna 026a64c003 added /usr/games to path 2024-01-08 22:56:45 +00:00
Hemna 682e138ec2 Added fortune to Dockerfile-dev 2024-01-08 19:40:08 +00:00
Walter A. Boring IV e4e9c6e98b
Merge pull request #144 from v-rzh/vrzh-fix-typo-0
aprsd: main.py: Fix premature return in sample_config
2024-01-08 13:01:34 -05:00
Hemna f02824b796 Added missing fortune app 2024-01-08 18:00:26 +00:00
Martiros Shakhzadyan 530ac30a09 aprsd: main.py: Fix premature return in sample_config
Fix a typo in sample_config that causes the function to return before
config is generated.
2024-01-04 08:41:06 -05:00
Craig Lamparter 9350cf6534
Update weather.py because you can't sort icons by penis 2023-12-21 11:07:43 -08:00
Craig Lamparter 651cf014b7
Update weather.py both weather plugins have new Ww regex 2023-12-21 11:01:23 -08:00
Craig Lamparter b6df9de8aa
Update weather.py
get back the "starts with w" is the weather command regex
2023-12-21 10:54:07 -08:00
Walter A. Boring IV 0fd7daaae0
Merge pull request #140 from craigerl/location_plugin
Rework Location Plugin
2023-11-24 19:48:22 -05:00
Hemna 0433768784 Fixed a bug with OWMWeatherPlugin
The weather plugin wasn't able to find the from callsign,
so all of the weather reports were random and wrong.
2023-11-24 19:15:52 -05:00
Hemna a8f73610fe Rework Location Plugin
This patch updates the location plugin to allow configuring which
of the geopy library's supported geocoders to use.  This patch also adds a fake
geopy geocoder class that uses the US government's API for location.
2023-11-22 20:55:38 -05:00
Hemna c0e2ef1199 Update for v3.2.2 release 2023-11-22 12:35:12 -05:00
Hemna 809a41f123 Fix for types 2023-11-17 14:23:29 -05:00
Hemna b0bfdaa1fb Fix wsgi for prod 2023-11-17 14:02:29 -05:00
Walter A. Boring IV b73373db3f
Merge pull request #139 from craigerl/walt-test
Walt test
2023-11-17 13:47:05 -05:00
Hemna 6b397cbdf1 pep8 fixes 2023-11-17 13:34:10 -05:00
Hemna 638128adf8 remove python 3.12 from github builds 2023-11-17 13:15:44 -05:00
Hemna b9dd21bc14 Fixed datetime access in core.py 2023-11-17 13:01:55 -05:00
Hemna fae7032346 removed invalid reference to config.py 2023-11-17 11:59:50 -05:00
Hemna 4b1214de74 Updated requirements 2023-11-17 11:44:12 -05:00
Hemna 763c9ab897 Reworked the admin graphs
This patch fixes some bugs with the rpc for packets as well
as reworks the admin graphs to use echarts.
2023-11-17 11:39:42 -05:00
Hemna fe1ebf2ec1 Test new packet serialization 2023-11-17 11:39:42 -05:00
Walter A. Boring IV c01037d398
Merge pull request #138 from craigerl/no-internets
Try to localize js libs and css for no internet
2023-10-31 08:04:42 -04:00
Walter A. Boring IV 072a1f4430
Merge pull request #137 from jhmartin/mismatched-arguments
Normalize listen --aprs-login
2023-10-28 19:39:09 -04:00
Hemna 8b2613ec47 Try to localize js libs and css for no internet
this patch fixes some issues with webchat not loading css and js
when there is no internet.  The index.html was relying on internet
being available to fetch remote css and js.
2023-10-28 19:26:50 -04:00
Jason Martin d39ce76475
Normalize listen --aprs-login
The click block specifies aprs-login but the error indicated aprs_login
2023-10-27 23:39:56 +00:00
Walter A. Boring IV 3e9c3612ba
Merge pull request #136 from craigerl/dependabot/pip/werkzeug-3.0.1
Bump werkzeug from 2.3.7 to 3.0.1
2023-10-26 09:22:03 -04:00
Walter A. Boring IV 8746a9477c
Merge pull request #135 from jhmartin/update-installdoc
Update INSTALL with new conf files
2023-10-26 09:20:20 -04:00
dependabot[bot] 7d0524cee5
Bump werkzeug from 2.3.7 to 3.0.1
Bumps [werkzeug](https://github.com/pallets/werkzeug) from 2.3.7 to 3.0.1.
- [Release notes](https://github.com/pallets/werkzeug/releases)
- [Changelog](https://github.com/pallets/werkzeug/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/werkzeug/compare/2.3.7...3.0.1)

---
updated-dependencies:
- dependency-name: werkzeug
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-10-25 19:55:49 +00:00
Jason Martin 5828643f2e
Update INSTALL with new conf files
The name of the config files has changed, update INSTALL with the new names.
2023-10-23 00:14:59 +00:00
Walter A. Boring IV 313ea5b6a5
Merge pull request #134 from craigerl/dependabot/pip/urllib3-2.0.7
Bump urllib3 from 2.0.6 to 2.0.7
2023-10-17 17:28:04 -04:00
dependabot[bot] 7853e19c79
Bump urllib3 from 2.0.6 to 2.0.7
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.0.6 to 2.0.7.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.0.6...2.0.7)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-10-17 21:09:50 +00:00
Hemna acf2b62bce Changelog for 3.2.1 2023-10-09 11:39:53 -04:00
Craig Lamparter 8e9a0213e9
Update index.html disable form autocomplete 2023-10-07 10:06:42 -07:00
Hemna bf905a0e9f Update the packet_dupe_timeout warning
The warning text was hardcoded at the old 60 second value,
instead of using the config option.
2023-10-06 16:06:41 -04:00
Hemna 5ae45ce42f Update the webchat paths
This reorders the paths available for selection in webchat and
sets the selected path to the default.
2023-10-06 16:02:00 -04:00
Hemna 0155923341 Changed the path option to a ListOpt
Both the serial_kiss and tcp_kiss path options are converted to a ListOpt
to help generate a single line during sample-config generation.
2023-10-06 15:44:25 -04:00
Hemna 156d9d9592 Fixed default path for tcp_kiss client.
The tcp_kiss client initialization was using the serial_kiss client's
path setting.
2023-10-06 15:41:12 -04:00
Hemna 81169600bd Set a default password for admin
This patch sets a default password of "password" for the admin webui.
2023-10-06 15:32:31 -04:00
Hemna 746eeb81b0 Fix path for KISS clients
The kiss client send method was always forcing the config
path.  If a packet has a path specified in it, that now
overrides the kiss client's path setting in the config.
2023-10-05 18:00:45 -04:00
Hemna f41488b48a Added packet_dupe_timeout conf
This patch adds the new packet_dupe_timeout config option, defaulting to
60 seconds.   If a packet matching the same from, to, and msgNo is RX'd
within that timeout the packet is considered a dupe and will be
dropped.  Ack packets are not subject to dupe checking.
2023-10-05 13:56:02 -04:00
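A minimal sketch of the dupe rule described above: a packet with the same from, to, and msgNo seen again within packet_dupe_timeout is dropped, and acks are never checked (illustrative names only):

    import time

    PACKET_DUPE_TIMEOUT = 60   # seconds, matching the config default

    _last_seen = {}

    def is_dupe(from_call, to_call, msg_no, is_ack=False):
        if is_ack:
            return False       # ack packets are not subject to dupe checking
        key = (from_call, to_call, msg_no)
        now = time.time()
        seen = _last_seen.get(key)
        _last_seen[key] = now
        return seen is not None and (now - seen) < PACKET_DUPE_TIMEOUT

    print(is_dupe("N0CALL-1", "MYCALL", "23"))   # False the first time
    print(is_dupe("N0CALL-1", "MYCALL", "23"))   # True within the timeout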
Walter A. Boring IV 116f201394
Merge pull request #133 from craigerl/dependabot/pip/urllib3-2.0.6
Bump urllib3 from 2.0.4 to 2.0.6
2023-10-05 10:42:41 -04:00
Hemna ddd4d25e9d Add ability to change path on every TX packet
This patch adds the ability in webchat to set the path
on every outbound packet for the KISS clients as well as
the fake client.  The path dropdown includes the options for
Default path (which will default to the config setting)
WIDE1-1,WIDE2-1
ARISS
2023-10-05 10:33:07 -04:00
Walter A. Boring IV e2f89a6043
Merge pull request #132 from craigerl/RF_dupe_fix
Fix for dupe packets.
2023-10-03 16:18:34 -04:00
Hemna 544600a96b Make Packet objects hashable
This patch makes the packet key a property of the Packet object and
makes packet objects comparable and hashable.
2023-10-03 16:01:43 -04:00
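A minimal sketch of a packet that is comparable and hashable through a key property, as the commit describes; the key layout is inferred and may differ from the real Packet:

    class Packet:
        def __init__(self, from_call, to_call, msgNo):
            self.from_call = from_call
            self.to_call = to_call
            self.msgNo = msgNo

        @property
        def key(self):
            return f"{self.from_call}:{self.to_call}:{self.msgNo}"

        def __eq__(self, other):
            return isinstance(other, Packet) and self.key == other.key

        def __hash__(self):
            return hash(self.key)

    seen = {Packet("N0CALL-1", "MYCALL", "23")}
    print(Packet("N0CALL-1", "MYCALL", "23") in seen)   # True: same key, same hash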
dependabot[bot] c16f3a0bb2
Bump urllib3 from 2.0.4 to 2.0.6
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.0.4 to 2.0.6.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.0.4...2.0.6)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-10-03 00:04:40 +00:00
Hemna 59cec1317d Don't process AckPackets as dupes
If we RX an AckPacket, then send it on for processing.  There is no need
to check for a dupe.
2023-10-02 08:42:00 -04:00
Hemna 751bbc2514 Fixed another msgNo int issue 2023-09-29 15:40:42 -04:00
Hemna 9bdfd166fd Fixed issue with packet tracker and msgNO Counter
The packet msgNo field is a string, but is typically an integer
counter to keep track of a specific packet id.  The counter was
returning an int, but the packet.msgNo is a string.  So, when trying to
delete a packet from the packet tracker, the key for accessing the
packet is the msgNo, which has to be a string.  Passing an int will
cause the packet tracker to not find the packet, and hence silently
fail.

This patch forces the msgNo counter to be a string.
2023-09-29 10:04:15 -04:00
Hemna f79b88ec1b Fixed import of MutableMapping
python 3.10 moved it to collections.abc
2023-09-28 15:30:54 -04:00
Hemna 99a0f877f4 pep8 fixes 2023-09-28 12:34:01 -04:00
Hemna 4f87d5da12 rewrote packet_list and drop dupe packets
This patch rewrites the packet_list internally to be a dictionary
instead of a list for very fast lookups.  This was needed to test for
duplicate packets already in the list.

This patch drops packets that have the same data and are < 60 seconds
in age from the last time we got the packet.   On RF based clients
we can get dupes!!
2023-09-28 12:19:18 -04:00
Hemna 0d7e50d2ba Log a warning on dupe
This patch logs a warning if we detect a dupe packet inbound.
2023-09-27 15:45:39 -04:00
Hemna 1f6c55d2bf Fix for dupe packets.
Sometimes over KISS clients (RF), we can get duplicate packets
due to having many digipeaters in range of the TNC that aprsd is
connected to.   We will now filter out any dupe packets while aprsd
is still in the process of sending its 3 acks.
2023-09-27 14:55:47 -04:00
Hemna 740889426a Update Changelog for 3.2.0 2023-09-26 16:15:19 -04:00
Hemna c9dc4f67d4 minor cleanup prior to release 2023-09-26 15:27:51 -04:00
Hemna 788a72c643 Webchat: fix input maxlength
This changes the maxlength of the input message box to 67 characters.
Also changes the GPS beacon text.
2023-09-26 12:53:08 -04:00
Walter A. Boring IV 1e3d0d4faf
Merge pull request #131 from craigerl/dependabot/pip/gevent-23.9.1
Bump gevent from 23.9.0.post1 to 23.9.1
2023-09-26 12:08:34 -04:00
Hemna 82d25915fc WebChat: cleanup some console.logs 2023-09-26 12:07:28 -04:00
Hemna 12dfdefb62 WebChat: flash a dupe message 2023-09-26 12:00:02 -04:00
Hemna d63c6854af Webchat: Fix issue accessing msg.id
After the refactor of the messages object in webchat, we are sending
a direct json dict version of the packet now.  This means there is no
msg.id in the dict, but msg.msgNo instead.  This should help fix
the display of dupes.
2023-09-26 11:04:59 -04:00
Hemna 6b083d4c4d Webchat: Fix chat css on older browsers
Some older browsers can't handle the new css syntax
for a subclass in the same css definition.
2023-09-26 10:47:53 -04:00
Hemna ff358987a9 WebChat: new tab should get focus
When a new tab is created it now gets the focus.
2023-09-26 10:31:00 -04:00
dependabot[bot] 412ab54303
Bump gevent from 23.9.0.post1 to 23.9.1
Bumps [gevent](https://github.com/gevent/gevent) from 23.9.0.post1 to 23.9.1.
- [Release notes](https://github.com/gevent/gevent/releases)
- [Changelog](https://github.com/gevent/gevent/blob/master/docs/changelog_pre.rst)
- [Commits](https://github.com/gevent/gevent/compare/23.9.0.post1...23.9.1)

---
updated-dependencies:
- dependency-name: gevent
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-09-25 19:28:21 +00:00
Hemna 3f5dbe0a12 Webchat: Fix pep8 errors 2023-09-21 18:34:03 -04:00
Hemna 9635893934 Webchat: Added tab notifications and raw packet
This patch adds a mouseover popover for displaying
the raw APRS packet.

This patch also adds the notification counter for an unselected tab.
2023-09-21 16:29:15 -04:00
Hemna f151ae4348 WebChat: Prevent sending message without callsign
This patch raises an error if the user doesn't set the
to-call callsign when sending a message.
2023-09-15 14:32:22 -04:00
Hemna 7130ca2fd9 WebChat: fixed content area scrolling
This patch fixes some issues when switching between tabs.
2023-09-15 14:12:55 -04:00
Hemna b393060edb Webchat: tweaks to UI for expanding chat
This patch changes the layout containers a bit.  Moved the tabs to the
header section and made the tab contents fill the rest of the height of
the browser and it is the only portion that scrolls.
2023-09-15 11:34:38 -04:00
Hemna f770c5ffd5 Webchat: Fixed bug deleteing first tab
This patch fixes a UI issue when the user deletes the first tab and the
remaining tabs aren't refreshed/shown.
2023-09-15 09:13:51 -04:00
Hemna ef206b1283 Ensure Keepalive doesn't reset client at startup
This patch ensures that the keepalive thread doesn't try and
reset/restart the aprs connection at startup.
2023-09-14 16:46:00 -04:00
Hemna 140fa4ace4 Ensure parse_delta_str doesn't puke
This patch fixes an issue where the parse_delta_str regex doesn't
match anything.
2023-09-14 16:23:49 -04:00
Hemna 81a19dd101 WebChat: Send GPS Beacon working
This patch adds back in the jquery toast plugin that used to come as
part of the fomantic ui js code.
2023-09-13 12:05:40 -04:00
Walter A. Boring IV 9985c8bf25
Merge pull request #130 from craigerl/webchat-saved-bootstrapjs
Webchat saved bootstrapjs
2023-09-12 21:17:49 -04:00
Hemna 1400e3e711 webchat: got active tab onclick working
This patch adds the ability to click on the already existing active tab
and have it populate the to_call input box.
2023-09-12 16:37:06 -04:00
Hemna 8a90d5480a webchat: set to_call to value of tab when selected
This patch will set the to_call form field to the callsign of the
tab when the tab is activated in the UI.

NOTE: still need to populate it when clicking on the already active
tab.
2023-09-12 15:44:34 -04:00
Hemna b4e02c760e Center the webchat input form
This patch centers the input form for the webchat page.
2023-09-10 12:13:05 -04:00
Hemna ba6b410795 Update index.html to use chat.css 2023-09-10 11:12:38 -04:00
Hemna 70ddc44b5c Deleted webchat mobile pages
removed user-agents package dependency
2023-09-08 15:45:32 -04:00
Hemna 852760220f Added close X on webchat tabs
This patch adds an X on each tab as a way to close the conversation
and nuke the local storage for the conversation.
2023-09-08 12:43:33 -04:00
Hemna 14e984c9b4 Reworked webchat with new UI
This patch reworks the webchat UI to work in both desktop
and mobile layouts.  Compromises were made, but there is now one
codebase shared between desktop and mobile.
This patch also includes the new imessage/sms chat look.
2023-09-08 11:19:24 -04:00
Hemna 29f21a9469 Updated the webchat UI to look like iMessage 2023-09-06 11:20:59 -04:00
Hemna 7292744a78 Restore previous conversations in webchat
This patch saves the webchat conversation messages in the browser's
local storage.  When the user comes back to the page, the
conversations are restored.
2023-09-05 14:14:52 -04:00
Hemna 619b1b708e Remove VIM from Dockerfile
If you need vim, you can just ssh into the container and apt-get
install vim.
2023-09-05 08:01:42 -04:00
Hemna 008b2ab09e recreate client during reset()
This patch re-creates the client object during a client.reset() call.
2023-09-01 16:11:19 -04:00
Hemna 4b56e99689 updated github workflows 2023-09-01 14:51:55 -04:00
Hemna 10bf04929e Updated documentation build 2023-09-01 14:43:29 -04:00
Hemna a9e8050ae6 Removed admin_web.py
This patch removes the old admin_web.py.   Use the aprsd.wsgi
for the admin interface.
2023-09-01 14:38:55 -04:00
Hemna 82f77b7a6a Removed some RPC server log noise 2023-08-28 09:19:54 -04:00
Hemna 570fdb98a7 Fixed admin page packet date
The date timestamp was always showing as 1970.  Had to adjust
the javascript conversion from epoch to Date object
2023-08-28 09:18:50 -04:00
Hemna 9582812041 RPC Server logs the client IP on failed auth
this patch adds an error log with the client IP of whoever connects
to the rpc server without the proper auth key.
2023-08-23 13:48:09 -04:00
Hemna 859f904602 Start keepalive thread first
This patch changes the order of the threads starting.  The Keepalive
thread's job is to test the aprsis/kiss client to see if it's up and
running, and then issue a reset if it's down.   On SIGINT, the keepalive
might issue that reset in the middle of a shutdown, which might cause
things to hang when everything should be shutting down.  Making the
KeepaliveThread first means it will be the first to be shut down as
well, preventing its next loop from resetting the client.
2023-08-23 13:45:46 -04:00
Hemna 34311f0fbd fixed an issue in the mobile webchat
The global socket var wasn't defined globally in send-message-mobile.js
2023-08-23 13:15:50 -04:00
Hemna 2416f0ea1a Added dupe checking code to webchat mobile 2023-08-22 16:03:27 -04:00
Hemna 377842c2ec Click on the div after it is added. 2023-08-22 13:46:43 -04:00
Hemna a8dd9ce012 Webchat suppress the display of dupe messages
This patch updates the web ui for webchat to suppress the display
of duplicate received messages.  Dupes can happen over the KISS
interface due to packets being encapsulated by nearby repeaters into 3rd
party packets.

When a dupe message is received, the dupe message is flashed 3 times.
2023-08-22 13:37:43 -04:00
Hemna 1d6a667987 Convert webchat internet urls to local static urls 2023-08-22 12:51:50 -04:00
Hemna 2e9a204c74 Make use of webchat gps config options
This patch makes use of the gps settings in the webchat section.
If the user sets the latitude and longitude in the config file, then
the gps beacon button will be enabled.  The gps button will still be
enabled if the http connection is over SSL.
2023-08-22 12:31:44 -04:00
Hemna f922b3f97b Added new webchat config section
This patch adds a new webchat config section to specify:
web_ip (the ip address to listen on)
web_port
latitude (latitude to use for the GPS beacon button)
longitude (long to use for the GPS beacon button)
2023-08-22 12:01:34 -04:00
Hemna 8dd3b05bb1 fixed webchat logging.logformat typo
This fixes a problem with webchat when specifying the logfile
in aprsd config
2023-08-15 21:49:43 -04:00
Hemna e06305fceb prep for 3.1.3 2023-08-15 17:52:36 -04:00
Hemna 33c7871dbe Forcefully allow development webchat flask
This patch forces werkzeug to allow the development environment
so that aprsd webchat works from inside of systemd
2023-08-15 17:42:56 -04:00
Hemna b2f95b0f4e Updated Changelog for 3.1.2 2023-08-15 15:25:01 -04:00
Hemna ae9e4d31ad Added support for ThirdParty packet types
The kiss clients now detect if the incoming packet is a third party
packet and then send the subpacket up to the consumer instead of the
encapsulating packet.
2023-08-15 14:24:03 -04:00
Hemna 65a5a90458 Disable the Send GPS Beacon button
This patch disables the 'Send GPS Beacon' button for the webchat
command if the browser isn't connected over https
2023-08-14 18:45:13 -04:00
Hemna 182887c20a Removed adhoc ssl support in webchat
This immediately breaks the beacon button.
This patch removes the dep for pyopenssl and cryptography
so that aprsd installs on the rpi.

Unfortunately in order for the web page to get the Lat/Lon, the
browser must be connected over SSL.  Will have to create a workaround
for this later.
2023-08-14 18:34:25 -04:00
Hemna f228144f4b Updated Changelog for v3.1.1 2023-08-07 13:01:52 -04:00
Hemna db9e1d23d1 Fixed pep8 failures 2023-08-07 11:07:01 -04:00
Hemna 986df391b2 re-enable USWeatherPlugin to use mapClick
The old MapClick.php api seems to work...re-enabling
2023-07-31 21:53:02 -04:00
Walter A. Boring IV 3994235380
Merge pull request #128 from craigerl/fix_kiss
Fix sending packets over KISS interface
2023-07-28 18:08:54 -04:00
Hemna 9ebf2f9a30 Fix sending packets over KISS interface
The KISS client sends the path as part of the headers, so we had
to strip out the path from the payload of each message so the path
wouldn't get listed twice.
2023-07-28 17:25:06 -04:00
Hemna 011cfc55e1 Use config web_ip for running admin ui from module
When running the web admin interface with
'python -m aprsd.wsgi' the Flask app global now uses
the web_ip config entry for listening.  Also disabled
debug output.
2023-07-26 08:47:22 -04:00
Hemna e0c3c5cbbf remove loop log 2023-07-25 20:45:55 -04:00
Hemna 26f354b3a9 Max out the client reconnect backoff to 5
This patch adjusts the backoff mechanism for aprs client
reconnect to a max backoff sleep of 5 seconds.   This caps
the exponential backoff when retrying the connection.
2023-07-24 17:03:29 -04:00
Walter A. Boring IV 922a6dbb35
Merge pull request #125 from craigerl/update-Dockerfile
Update the Dockerfile
2023-07-24 14:34:06 -04:00
Hemna d03c4fc096 Update the Dockerfile
This updates the main Dockerfile to be the same as the
Dockerfile-dev other than using the official pypi package for
aprsd.
2023-07-24 11:36:28 -04:00
Hemna dfd3688d8f Changelog updates for v3.1.0
This patch is an update to the Changelog for the
3.1.0 release.
2023-07-24 11:22:53 -04:00
Hemna c7d629f88a Use CONF.admin.web_port for single launch web admin
This patch changes the non uwsgi launch of the admin page
to use the config for the web port
2023-07-24 09:39:16 -04:00
Hemna 099b87e250 Fixed sio namespace registration 2023-07-23 20:22:48 -04:00
Hemna 1ab9c3fee4 Update Dockerfile-dev to include uwsgi 2023-07-24 00:13:28 +00:00
Walter A. Boring IV 8891cd3002
Merge pull request #124 from craigerl/wsgi-rework
replacement of flask-socketio with python-socketio
2023-07-23 19:45:51 -04:00
Hemna 4664ead9e7 Fixed pep8 2023-07-23 19:34:02 -04:00
Hemna e51a501544 change port to 8000 2023-07-23 19:19:55 -04:00
Hemna 89576a3c43 replacement of flask-socketio with python-socketio
This patch starts the work to replace flask-socketio with
python-socketio so that uwsgi can be used instead of gunicorn.
uwsgi can support websockets.

Have to rework webchat command next
2023-07-23 18:54:23 -04:00
Hemna 5383b698ea Change how fetch-stats gets its defaults
The defaults come from the aprsd.conf CONF attributes now.
2023-07-22 17:05:11 -04:00
Hemna cbef93b327 Ensure fetch-stats ip is a string 2023-07-22 16:41:54 -04:00
Hemna 6ae55fc9a1 Add info logging for rpc server calls 2023-07-20 16:43:31 -04:00
Hemna 588e140a7f updated wsgi config default /config/aprsd.conf
This patch changes wsgi.py to default to /config/aprsd.conf

It's assumed that this will be used as a docker container
2023-07-20 15:59:44 -04:00
Walter A. Boring IV d251a2727a
Merge pull request #123 from craigerl/flask-update
Remove flask pinning
2023-07-20 15:10:07 -04:00
Hemna d3a93b735d Added timing after each thread loop
This is to help keep track of which non-blocking threads are still
alive.

The RPC Server thread blocks, so the time will always increase.
2023-07-20 14:44:46 -04:00
Hemna fa452cc773 Update docker bin/admin.sh
This patch uses the wsgi.py instead of admin_web.py
2023-07-20 14:34:32 -04:00
Hemna 6a6e854caf Removed flask-classful from webchat
This patch removed the dependency on flask-classful.  This required
making all of the flask web routing non class based.

This patch also changes the aprsis class to allow retries for failed
connections when the aprsis servers are full and not responding to
login requests.
2023-07-20 14:34:31 -04:00
Hemna e1183a7e30 Remove flask pinning
Also removed need for flask-classful. Created new
aprsd/wsgi.py for the web admin interface.
2023-07-20 14:34:31 -04:00
Hemna 5723e3a77b removed linux/arm/v8 2023-07-20 14:33:59 -04:00
Hemna dee73c1060 Update master build to include linux/arm/v8 2023-07-17 22:08:00 +00:00
Hemna d8318f2ae2 Update Dockerfile-dev to fix plugin permissions
This patch changes the user creation to include creating
a home directory so the plugin install installs those plugins
with the --user option.
2023-07-17 21:57:39 +00:00
Walter A. Boring IV 2825cac446
Merge pull request #122 from craigerl/crypto-upgrade
Update requirements for upgraded cryptography
2023-07-17 16:56:30 -04:00
Hemna fa6e738a20 update manual build github 2023-07-17 13:05:31 -04:00
Hemna 0c179005ee Update requirements for upgraded cryptography
This patch updates the requirements.in to remove the
pinning to cryptography 38.0.1.  Let's see if the docker
images build.
2023-07-17 11:15:50 -04:00
Hemna ad004633de Added more libs for Dockerfile-dev 2023-07-17 14:09:37 +00:00
Hemna ccd564a52e Replace Dockerfile-dev with python3 slim 2023-07-17 13:28:34 +00:00
Hemna 35d41582ee Moved logging to log for wsgi.py
Added wsgi.py to be used with gunicorn to start aprsd's web admin
interface.

gunicorn -b :8080 "aprsd.wsgi:app"
2023-07-16 16:32:39 -04:00
Hemna 565ffe3f72 Changed weather plugin regex pattern
The weather plugins used to match on w, but now require wx
2023-07-15 18:22:24 -04:00
Hemna 0bd11d05c6 Limit the float values to 3 decimal places 2023-07-14 11:35:32 -04:00
Walter A. Boring IV 62eff8645d
Merge pull request #119 from craigerl/wx-fixes
Fixed rain numbers from aprslib
2023-07-14 11:18:11 -04:00
Hemna aa547cbef5 Fixed rain numbers from aprslib 2023-07-14 10:42:36 -04:00
Hemna 7f2aba702a Fixed rpc client initialization 2023-07-13 14:58:12 -04:00
Hemna 63bf82aab5 Fix in for aprslib issue #80
aprslib incorrectly decodes weather packets and doesn't provide
wind_speed or wind_direction from the CSE/SPD 7 bytes in the APRS
packet.  This patch puts a temporary fix in place until the
aprslib pull request lands and is released.

https://github.com/rossengeorgiev/aprs-python/issues/80

https://github.com/rossengeorgiev/aprs-python/pull/81
2023-07-13 14:35:55 -04:00
Hemna bba7b68112 Try and fix Dockerfile-dev 2023-07-10 16:34:14 +00:00
Hemna 005675cb46 Fixed pep8 errors 2023-07-10 11:01:41 -04:00
Hemna 191e1ff552 Populate stats object with threads info
This patch adds the thread names and state to the stats object
so the aprsd fetch-stats command can show it.
2023-07-10 10:44:24 -04:00
Hemna 0a14b07fae added counts to the fetch-stats table 2023-07-09 21:29:29 -04:00
Hemna b2e621da4b Added the fetch-stats command
You can now fetch and view the stats of a live running aprsd server
if it has enabled the rpc server in the config file's rpc_settings
block.
You just have to match the magic word as specified in the config file to
authorize against the rpc server.

aprsd fetch-stats --ip-address <ip of aprsd> --port <port> --magic-word
<magic word>
2023-07-09 21:06:57 -04:00
Hemna fe0d71de4d Replace ratelimiter with rush
This patch replaces the ratelimiter library with rush for rate limiting
as the ratelimiter package doesn't work with python 3.11.

This patch also refactors the flask.py to admin_web.py and
aprsd.py to main.py.
2023-07-08 17:30:22 -04:00
Hemna 9b944142bd Added some utilities to Dockerfile-dev
This patch adds telnet, sudo and vim to the development
Dockerfile-dev file for testing aprsd in a container
2023-06-22 15:51:45 -04:00
Hemna b172c692a1 add arm64 for manual github build 2023-06-22 10:44:08 -04:00
Hemna 311cebaf27 Added manual master build 2023-06-22 10:08:28 -04:00
Walter A. Boring IV f4d60357ee
Update master-build.yml
undo
2023-06-22 10:04:59 -04:00
Hemna 09a0c4cb02 Add github manual trigger for master build 2023-06-22 10:03:39 -04:00
Hemna 80b85e648f Fixed unit tests for Location plugin 2023-06-22 09:06:55 -04:00
Hemna 9931c8a6c5 Use new tox and update github workflows
This patch updates tox to the latest and updates the github workflows
to use tox-gh, which is claimed to work with github parallel tox runs
2023-06-22 07:58:35 -04:00
Hemna 319969cc08 Updated requirements 2023-06-21 19:12:38 -04:00
Hemna da20ff038b force tox to 4.3.5 2023-06-21 19:09:26 -04:00
Hemna 15bf3710d2 Update github workflows
removed building for arm64
2023-06-21 18:55:32 -04:00
Hemna 5bc589f21f Fixed pep8 violation
This patch fixes a pep8 violation in the location plugin
2023-06-21 18:51:53 -04:00
Hemna 8b73372b6e Added rpc server for listen
Added the ability to start the rpc server for fetching stats from the
listen command.  If the rpc server is enabled in config, the rpc
server will now start.
2023-06-21 18:48:08 -04:00
Hemna 26c1e7afbb Update location plugin and reworked requirements
Added geopy as a dependency for the location plugin.
The us weather service API is now broken upstream.

Reworked the requirements.txt and dev-requirements.txt files
2023-06-15 16:08:28 -04:00
Walter A. Boring IV c99d5b859e
Merge pull request #116 from jhmartin/fix-example-plugin
Example plugin wrong function
2023-06-14 11:26:38 -04:00
Hemna cad22e1744 Fixed .readthedocs.yaml format 2023-06-14 09:31:19 -04:00
Hemna 43d6b62760 Add .readthedocs.yaml
Read the docs service is now requiring the config file
.readthedocs.yaml for it to be able to build the online documentation
for a project.
2023-06-14 09:11:39 -04:00
Jason Martin 96fa4330ba
Example plugin wrong function
The example plugin, used verbatim, complains about an abstract class. The interface requires 'process', not 'command'.
2023-05-19 22:53:15 +00:00
Hemna 4e99e30f16 Ensure conf is imported for threads/tx
Import the conf for threads/tx.py to ensure that the
msg_rate_limit_period is defined prior to the conf entry
being referenced.
2023-05-05 11:07:23 -04:00
Hemna 00f1c3a2ba Update Dockerfile to help build cryptography 2023-04-26 14:31:50 +00:00
Hemna 0527ddfdba Update Changelog to 3.0.3 2023-04-25 14:44:56 -04:00
Hemna 5694cabd93 cleanup some debug messages 2023-04-25 14:29:26 -04:00
Hemna e21e2a7c50 Fixed loading of plugins for server
Some instances the plugins failed to load
2023-04-20 14:31:50 -04:00
Hemna 17d9c06b07 Don't load help plugin for listen command
This patch disables loading the help plugin for the listen command.
2023-04-17 15:37:48 -04:00
Hemna 66ebb286d8 Added listen args. 2023-04-17 15:31:07 -04:00
Hemna 0ec41f7605 Change listen command plugins
The listen command now adds the --load-plugins, which is false by
default, to load all the plugins as defined in the config file.
2023-04-17 15:01:57 -04:00
Hemna c353877321 Added listen.sh for docker
This patch adds the listen.sh entry point for the docker image.
2023-04-17 15:45:49 +00:00
Hemna 483afce5ad Update Listen command
This patch updates the aprsd listen command to add the packet-plugins
argument which allows enabling a single plugin to work against the
packets received from the aprsis network.
2023-04-17 10:51:17 -04:00
Hemna 8a456cac48 Update Dockerfile 2023-01-31 17:36:39 +00:00
Walter A. Boring IV 62e1d69272
Merge pull request #111 from craigerl/ratelimit
Add ratelimiting for acks and other packets
2023-01-18 14:01:41 -05:00
Hemna 840b0aba97 Add ratelimiting for acks and other packets
This patch adds basic ratelimiting to sending out AckPackets
and non AckPackets.  This provides a basic way to prevent
aprsd from sending out packets as fast as possible, which isn't
great for a bandwidth limited network.

This patch also adds some keepalive checks to all threads in the
threads list as well as the network client objects (aprsis, kiss)
2023-01-18 13:00:10 -05:00
Hemna 357a193a75 Update Changelog for 3.0.2 2023-01-16 11:41:42 -05:00
Hemna 4aa4a4b5d3 Import RejectPacket 2023-01-16 11:38:48 -05:00
Hemna 062f3caf83 3.0.1 2023-01-14 12:56:03 -05:00
Walter A. Boring IV 9ac9835541
Merge pull request #109 from craigerl/reject_packet
Add support for Reject messages.
2023-01-14 12:53:19 -05:00
Hemna c68b270ee2 Add support for Reject messages.
This patch adds support for receiving reject messages.
2023-01-14 12:41:22 -05:00
Hemna 38725907f3 Update Docker builds for 3.0.0 2023-01-09 11:54:50 -05:00
Hemna 4a10511d8b Update Changelog for 3.0.0 2023-01-09 11:05:14 -05:00
Hemna c5aba17ad1 Ensure server command main thread doesn't exit
This patch adds join calls on the running threads to prevent
the main thread from exiting prematurely.
2023-01-07 14:57:25 -05:00
Hemna 233d49bb4c Fixed save directory default 2023-01-03 15:38:19 -05:00
Hemna 6391c7eed6 Fixed pep8 failure 2023-01-03 09:01:53 -05:00
Hemna 0758a58101 Cleaned up KISS interfaces use of old config 2023-01-02 14:20:13 -05:00
Hemna a5520b2cd3 reworked usage of importlib.metadata
For whatever reason, passing group to
importlib_metadata.entry_points fails on Python 3.9.x.  This patch
fetches all entry points and filters them to get the real
oslo.config.opts entry points.  This is how aprsd finds all
of the config options of aprsd and the plugins.
2023-01-02 14:13:49 -05:00
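A minimal sketch of the fetch-all-and-filter approach described above, assuming the Python 3.9 behavior where entry_points() returns a mapping of group name to entry points; the helper name is hypothetical:

    import importlib.metadata as importlib_metadata

    def find_oslo_config_entry_points():
        # entry_points(group=...) is unreliable on 3.9.x, so fetch every
        # group and filter for oslo.config.opts by hand.
        all_groups = importlib_metadata.entry_points()
        return [
            ep
            for group, eps in all_groups.items()
            for ep in eps
            if group == "oslo.config.opts"
        ]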
Hemna 29b8764124 Added new docs files for 3.0.0 2023-01-02 14:13:49 -05:00
Hemna fe2f7b5b71 Removed url option from healthcheck in dev 2023-01-01 17:37:43 +00:00
Hemna c5acdba6de Updated Healthcheck to use rpc to call aprsd
After adding the rpc service for aprsd server and separating the
admin web REST interface, healthcheck no longer worked.   The stats
are available via rpc now.
2022-12-31 16:52:50 -05:00
Hemna 79e7ed1e91 Updated docker/bin/run.sh to use new conf
This patch updates the docker shell run script to use the
new aprsd.conf file.  The aprsd config is now an aprsd.conf file,
not aprsd.yml.
2022-12-30 10:13:25 -05:00
Hemna ed284a42cc Added ObjectPacket
This patch adds the ObjectPacket.  This is used by the REPEAT plugins
to send out an object in a message packet to let radios tune directly
to the station.
2022-12-30 09:44:25 -05:00
Hemna 3d0bb8ae8e Update regex processing and regex for plugins
The regex search is now by default case insensitive.
Also update each core plugin to better match the command.

ping plugin can now match on
p
p foo
ping
pIng

Weather plugins can now match on
w
wx
wX
Wx KM6LYW
weather
WeaTher
2022-12-29 14:34:46 -05:00
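A small illustration of the case-insensitive matching described above; the two patterns are hypothetical stand-ins, not the exact regexes the plugins use:

    import re

    ping_re = re.compile(r"^p(ing)?\b", re.IGNORECASE)
    wx_re = re.compile(r"^(w(x)?|weather)\b", re.IGNORECASE)

    for msg in ("p", "p foo", "ping", "pIng"):
        assert ping_re.match(msg)
    for msg in ("w", "wx", "wX", "Wx KM6LYW", "weather", "WeaTher"):
        assert wx_re.match(msg)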
Hemna 83d2e708eb Change ordering of starting up of server command
This patch moves the plugin manager to early in the startup
process so that the plugins get loaded, which also means each
plugin's custom config settings will be in the CONF object.
This allows dumping the entire CONF with all the plugin settings.
2022-12-29 14:15:56 -05:00
Walter A. Boring IV 473f00973b
Merge pull request #107 from craigerl/oslo-config
Convert config to oslo_config
2022-12-29 09:17:51 -05:00
Hemna c929689647 Update documentation and README
This updates the documentation in prep for 3.0.0
2022-12-28 16:50:34 -05:00
Hemna ff392395ed Decouple admin web interface from server command
This patch introduces rpyc based RPC client/server for
the flask web interface to call into the running aprsd server
command to fetch stats, logs, etc to send to the browser.

This allows running the web interface via gunicorn command
gunicorn -k gevent --reload --threads 10 -w 1 aprsd.flask:app --log-level DEBUG
2022-12-28 15:55:09 -05:00
Hemna 02e4f78d0e Dockerfile now produces aprsd.conf
This patch updates Dockerfile and Dockerfile-dev
to produce aprsd.conf instead of aprsd.yaml
2022-12-27 15:44:32 -05:00
Hemna e9a954a8fd Fix some unit tests and loading of CONF w/o file 2022-12-27 15:31:49 -05:00
Hemna f4a6dfc8a0 Added missing conf 2022-12-27 14:46:41 -05:00
Hemna 7ccfc253cf Removed references to old custom config
Also updated unittests to pass.
2022-12-27 14:30:03 -05:00
Hemna e13ca0061a Convert config to oslo_config
This patch is the initial conversion of the custom config
and config file yaml format to oslo_config's configuration mechanism.

The resulting config format is now an ini type file.

The default location is ~/.config/aprsd/aprsd.conf

This is a backwards incompatible change.  You will have to rebuild
the config file and edit it.

Also any aprsd plugins can now define config options in code and
add a setup.cfg entry_point definition
oslo_config.opts  =
  foo.conf = foo.conf:list_opts
2022-12-24 16:51:40 -05:00
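A minimal sketch of the foo.conf:list_opts module that such an entry point would point at; the option group and option names are illustrative, not real aprsd plugin options:

    # foo/conf.py -- hypothetical module behind "foo.conf = foo.conf:list_opts"
    from oslo_config import cfg

    foo_group = cfg.OptGroup(name="foo", title="Options for the foo plugin")
    foo_opts = [
        cfg.StrOpt("apikey", help="API key the foo plugin needs."),
    ]

    def list_opts():
        # oslo.config's generator calls this to discover the plugin's options.
        return [(foo_group, foo_opts)]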
Hemna ce3b29f990 Added rain formatting unit tests to WeatherPacket 2022-12-22 12:04:17 -05:00
Hemna bbcd7c8a5b Fix Rain reporting in WeatherPacket send.
Made a fix for a rain setting of 1.04 inches; the packet
field has to be r104 instead of r001.
2022-12-22 09:42:30 -05:00
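A sketch of the encoding rule implied above (rain in hundredths of an inch, zero-padded to three digits); the function name is illustrative:

    def format_rain_1h(inches: float) -> str:
        # 1.04 inches -> "r104"; 0.01 inches -> "r001".
        return "r{:03d}".format(int(round(inches * 100)))

    assert format_rain_1h(1.04) == "r104"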
Hemna 4a65f52939 Removed Packet.send()
This patch decouples sending a message from the internals of
the Packet classes.  This allows the rest of the code to use
Packet objects as type hints in methods to enforce Packets
in the plugins.

The send method was moved to a single place in the threads.tx.send()
2022-12-21 16:26:36 -05:00
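A rough usage sketch of the decoupling described above; the import paths follow the commit's wording (threads.tx.send and packets.Packet) and may differ slightly in the tree:

    from aprsd import packets
    from aprsd.threads import tx

    def reply(packet: packets.Packet) -> None:
        # Plugins type-hint on Packet and hand transmission to the single
        # tx.send() entry point instead of calling packet.send().
        tx.send(packet)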
Hemna f464ff0785 Removed watchlist plugins
All plugins can be loaded with the enabled_plugins
Also added unit tests for the PluginManager
2022-12-21 11:18:26 -05:00
Hemna 2ca36362ec Fix PluginManager.get_plugins
This patch fixes the result of get_plugins to be a list correctly.
2022-12-21 10:05:27 -05:00
Walter A. Boring IV eca5972ebd
Merge pull request #106 from craigerl/dataclasses
Dataclasses
2022-12-20 17:26:46 -05:00
Hemna 7dfa4e6dbf Cleaned up PluginManager
Added a separate pluggy track for normal plugins
and watch list plugins.
2022-12-20 16:19:05 -05:00
Hemna 220fb58f97 Cleaned up PluginManager
Added a separate pluggy track for normal plugins
and watch list plugins.
2022-12-20 15:13:13 -05:00
Hemna 088cbb81ed Update routing for weatherpacket
update the routing to
WIDE1-1, WIDE2-1
2022-12-20 11:51:47 -05:00
Hemna f19043ecd9 Fix some WeatherPacket formatting 2022-12-19 21:27:05 -05:00
Hemna a1188d29d4 Fix pep8 violation 2022-12-19 20:41:57 -05:00
Hemna d01392f6a5 Add packet filtering for aprsd listen
Now aprsd listen can filter by packet types.
2022-12-19 17:28:18 -05:00
Hemna 899a6e5363 Added WeatherPacket encoding
This patch adds the ability to output a correctly formatted
APRS weather packet for sending.
2022-12-19 14:04:14 -05:00
Hemna ad0d89db40 Updated webchat and listen for queue based RX
This patch updates both the webchat and listen commands
to be able to use the new queue based packet RX processing.

APRSD used to start a thread for every packet received, now
packets are pushed into a queue for processing by other threads
already running.
2022-12-19 10:28:22 -05:00
Hemna e37f99a6dd reworked collecting and reporting stats
This is the start of the cleanup of reporting of
packet stats
2022-12-18 21:54:34 -05:00
Hemna 9fc5356456 Removed unused threading code 2022-12-18 09:14:12 -05:00
Hemna 123b3ffa81 Change RX packet processing to enqueue
This changes the RX thread to send the packet into a queue instead of
starting a new thread for every packet.
2022-12-18 08:52:58 -05:00
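A generic sketch of the queue-based RX model described above, not aprsd's actual thread classes:

    import queue
    import threading

    packet_queue: queue.Queue = queue.Queue()

    def rx_loop(frames):
        # RX side: just enqueue each incoming packet.
        for frame in frames:
            packet_queue.put(frame)

    def process_loop():
        # A long-running worker drains the shared queue instead of a new
        # thread being spawned per packet.
        while True:
            packet = packet_queue.get()
            print("processing", packet)
            packet_queue.task_done()

    threading.Thread(target=process_loop, daemon=True).start()
    rx_loop(["frame1", "frame2"])
    packet_queue.join()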
Hemna 1187f1ed73 Make tracking objectstores work w/o initializing
This changes the objectstore to test whether the config has been
set.  If not, it doesn't try to save/load from disk.
2022-12-17 20:06:28 -05:00
Hemna c201c93b5d Cleaned up packet transmit class attributes
This patch cleans up the Packet class attributes used to
keep track of how many times packets have been sent and
the last time they were sent.  This is used by the PacketTracker
and the tx threads for transmitting packets
2022-12-17 18:06:24 -05:00
Hemna f1de7bc681 Fix packets timestamp to int
Python's default timestamp is a float.
APRS packets expect an old-style unix integer
timestamp.
2022-12-16 15:58:03 -05:00
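In other words, a one-line illustration of the conversion:

    import time

    # APRS wants an old-style integer unix timestamp, not Python's float.
    timestamp = int(time.time())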
Hemna 6030cb394b More messaging -> packets cleanup
Fixed the unit tests and the notify plugin
2022-12-16 15:58:03 -05:00
Hemna bfc0a5a1e9 Cleaned out all references to messaging
The messaging.py now is nothing but a shell that
contains a link to packets.NULL_MESSAGE to help maintain
some backwards compatibility with plugins.

Packets dataclass has fully replaced messaging objects.
2022-12-16 15:58:02 -05:00
Hemna 59e5af8ee5 Added constructing a GPSPacket for sending
This patch adds the needed code to construct the raw output
string for sending a GPSPacket.

TODO: Need to incorporate speed, course, rng, position ambiguity ?
TODO: Need to add option to 'compress' the output location data.
2022-12-16 15:58:02 -05:00
Hemna 1b49f128a9 cleanup webchat 2022-12-16 15:58:02 -05:00
Hemna 94fb481014 Reworked all packet processing
This patch reworks all the packet processing to use the new
Packets objects.  Nuked all of the messaging classes.

Backwards incompatible changes:
all messaging.py classes are now gone and replaced by
packets.py classes.
2022-12-16 15:58:02 -05:00
Hemna 67a441d443 Updated plugins and plugin interfaces for Packet
This patch updates unit tests as well as the Plugin filter()
interface to accept a packets.Packet object instead of a
packet dictionary.
2022-12-16 15:58:02 -05:00
Hemna 082db7325d Started using dataclasses to describe packets
This patch adds new Packet classes to describe the
incoming packets parsed out from aprslib.
2022-12-16 15:58:02 -05:00
Hemna 2089b2575e v2.6.1 2022-12-16 15:56:48 -05:00
Hemna 9571b0bb38 Fixed position report for webchat beacon
With more testing of the webchat beaconing, found a problem
with the packet format for the beacon.  This patch fixes the
packet format of the beacon.

Also added a timeout when trying to get the GPS location in the browser,
otherwise it could never come back.
2022-12-16 12:01:35 -05:00
Hemna 87cbcaa47f Try and fix broken 32bit qemu builds on 64bit system
This patch adds a 'fix' for trying to build on armv7 32bit system
from a 64bit system.  qemu seems broken in this case.
2022-12-15 13:05:22 -05:00
Hemna 19e5cfa9cc Add unit tests for webchat 2022-12-14 08:26:12 -05:00
Walter A. Boring IV 24b16a29e8
Merge pull request #105 from craigerl/collections-fix
Collections fix
2022-12-13 09:55:02 -05:00
Hemna 321c5a2c25 remove armv7 build RUST sucks 2022-12-13 08:37:31 -05:00
Hemna 9d19502dd8 Fix for Collections change in 3.10
Python 3.10 moved a module in the collections package,
breaking backwards compatibility.  This patch puts a fix in
to account for it.
2022-12-12 20:45:19 -05:00
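One common form of such a compatibility shim, as a hedged sketch; the exact name aprsd guards may differ:

    # Python 3.10 dropped the ABC aliases from the top-level collections
    # package, so import from collections.abc and fall back for old code.
    try:
        from collections.abc import MutableMapping
    except ImportError:
        from collections import MutableMapping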
Hemna a6015adecc Update workflow again
to include tag latest
2022-12-12 19:49:51 -05:00
Hemna 4fe99c35b5 Update Dockerfile to 22.04 2022-12-12 19:37:55 -05:00
Hemna c1db238719 Update Dockerfile and build.sh
This fixes a problem with the github workflow
2022-12-12 19:30:30 -05:00
Hemna 40f23dcb48 Update workflow 2022-12-12 14:56:35 -05:00
Hemna 5891c71483 Prep for 2.6.0 release 2022-12-12 14:45:35 -05:00
Hemna 68472b0d84 Update requirements 2022-12-12 13:07:45 -05:00
Hemna 935f820271 Removed Makefile comment. 2022-12-07 14:23:35 -05:00
Hemna 576301ca20 Update Makefile for dev vs. run environments
This patch updates the Makefile to allow for creating
development vs runtime python virtual environments.

If you only want to run aprsd commands
make run

If you want to work on aprsd code
make dev
2022-12-07 14:19:42 -05:00
Hemna 6d34d9c514 Added pyopenssl for https for webchat
In order for webchat to support fetching the GPS location in the
browser, the connection from the browser needs to be https://
2022-12-07 14:03:25 -05:00
Walter A. Boring IV deeee71f8f
Merge pull request #103 from craigerl/user-agents
change from device-detector to user-agents
2022-12-07 13:59:26 -05:00
Hemna f2b1ad35f9 change from device-detector to user-agents
the device detector was taking 1 minute on a raspi to parse out the
user-agent string from the browser.  user-agents takes 2 seconds,
which still isn't great, but 'doable' for the webchat interface.
2022-12-07 13:40:08 -05:00
Walter A. Boring IV 1e65af2dea
Merge pull request #102 from craigerl/remove-twine
Remove twine from dev-requirements
2022-12-05 17:10:27 -05:00
Hemna 83370689b9 Remove twine from dev-requirements
twine is only used for building a distribution and uploading
to pypi.  Unfortunately it has a dependency that pulls in
cryptography which is painful on rpi systems as it requires
the latest version of rustc and cargo.
2022-12-05 16:13:39 -05:00
Hemna e4f93a2ab4 Update to latest Makefile.venv
This patch updates the Makefile.venv to the latest upstream.
2022-12-02 17:08:10 -05:00
Walter A. Boring IV acecba27e8
Merge pull request #101 from craigerl/webchat_mobile
Add support for mobile browsers for webchat
2022-12-02 17:06:17 -05:00
Hemna 51b80cd4ea Refactored threads a bit
This patch refactors the rx threads a bit to reuse some code
responsible for processing acks when packets are received.

This also eliminates a custom thread in the webchat command for
processing received packets now that there is common code in the base
classes.
2022-12-02 16:26:48 -05:00
Hemna 480094b0d4 Mark packets as acked in MsgTracker
This patch updates webchat to mark the msgs received as tracked
and acked, so the TX thread can stop trying to send.
2022-12-02 14:58:32 -05:00
Hemna 726c8f4f2f remove dev setting for template 2022-12-02 14:20:52 -05:00
Hemna ee96108324 Add GPS beacon to mobile page
This patch adds the GPS beacon button to the mobile layout.
2022-11-30 15:17:28 -05:00
Hemna 5067f745ca Allow werkzeug for admin interface.
This patch enables werkzeug for socketio for the admin interface
2022-11-30 14:31:44 -05:00
Hemna 98fe9daac5 Allow werkzeug for admin interface.
This patch enables werkzeug for socketio for the admin interface
2022-11-30 14:28:31 -05:00
Hemna f9e7195e25 Add support for mobile browsers for webchat
This patch adds initial support for changing the UI for webchat
based if the browser is on a mobile device.
2022-11-30 14:14:51 -05:00
Hemna 44696fbc56 Ignore callsign case while processing packets
This patch fixes an issue where aprsd was deciding if it was
supposed to process a packet destined for itself or not.  It was
making a case sensitive comparison.  This patch makes that comparison
case insensitive for the callsign itself.
2022-11-30 13:57:25 -05:00
Walter A. Boring IV 78329f79f4
Merge pull request #100 from craigerl/webchat_gps
Send GPS Beacon from webchat interface
2022-11-30 11:25:34 -05:00
Hemna 5add0f958d remove linux/arm/v7 for official builds for now 2022-11-26 18:34:32 -05:00
Hemna d40927d1c3 added workflow for building specific version 2022-11-26 18:28:14 -05:00
Hemna d5e56b553e Allow passing in version to the Dockerfile
This patch allows setting the version from pypi.org to use
when building the container.

Currently defaults to 2.5.9.
2022-11-26 18:23:07 -05:00
Hemna 1a1d00242b Send GPS Beacon from webchat interface
This patchset allow getting the GPS coordinates from the browser's
geolocation API (which can be denied by the user), then sends the GPS
coordinates to aprsd via socketio and then aprsd sends a beacon.

This allows the APRS network to know the location of the person running
the webchat app via browser so packets can get routed back to it.
2022-11-25 13:25:09 -05:00
Walter A. Boring IV 19f804bf68
Merge pull request #99 from craigerl/remove-email-validation
Remove email validation
2022-11-25 11:38:13 -05:00
Hemna 4111d16aaf specify Dockerfile-dev 2022-11-25 11:21:10 -05:00
Hemna d1a0a988f2 Fixed build.sh
This patch fixes passing the branch to the build script
2022-11-25 10:05:16 -05:00
Hemna d9b39734e6 Build on the source not released aprsd 2022-11-25 10:03:34 -05:00
Hemna d4bf0f1e3c Remove email validation
The package/library being used for email validation is basically
defunct now.
2022-11-25 09:29:41 -05:00
Hemna 117f81f55f Add support for building linux/arm/v7
This patch adds support for the github workflow for building
the raspi architecture
2022-11-24 09:44:32 -05:00
Hemna b41e4a9ef3 Remove python 3.7 from docker build github
This patch removes the testing of python 3.7 during the
github action workflow for building the docker image
2022-11-23 15:41:27 -05:00
Walter A. Boring IV e66dc344b8
Merge pull request #91 from craigerl/small_refactor
Small refactor
2022-11-23 13:33:23 -05:00
Hemna 5acddbd466 Fixed failing unit tests
This patch re-adds the pytz lib for the generic time plugins.
2022-11-23 13:28:38 -05:00
Hemna 17e784629e change github workflow
remove python 3.7
2022-11-23 13:06:33 -05:00
Hemna 528bdb99e7 Removed TimeOpenCageDataPlugin
This patch removes the TimeOpenCageDataPlugin as it's been superseded
by the aprsd-timeopencage-plugin
2022-11-23 13:02:46 -05:00
Hemna fc1ca52593 Dump config with aprsd dev test-plugin
This patch adds the dumping of the config read for the
aprsd dev test-plugin command
2022-11-23 13:02:46 -05:00
Hemna 075078b520 Updated requirements 2022-11-23 13:02:44 -05:00
Hemna 7d970cbe70 Got webchat working with KISS tcp
This patch reworks the KISS client to get rid of
aioax25 as it was too difficult to work with due to
heavy use of asyncio.

Switched to the kiss3 pypi library.
2022-11-23 13:01:43 -05:00
Hemna d717a22717 Added click auto_envvar_prefix
This allows setting environment variables that are
prefixed with APRSD_
2022-11-23 13:01:06 -05:00
Hemna 9b0c626b59 Update aprsd thread base class to use queue
This patch updates the main aprsd threads class to use
a shared queue to notify all aprsd thread classes they need
to exit.  This ensures any closing down of sockets, etc happens from
inside the context of the thread itself, not the MainThread that
calls stop.
2022-11-23 13:01:06 -05:00
Hemna 967959e7b3 Update packets to use wrapt
This patch updates aprsd/packets.py to use wrapt for its method
lock synchronization.
2022-11-23 13:01:06 -05:00
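A minimal sketch of wrapt-based method synchronization of the kind described; the class and method names are illustrative:

    import wrapt

    class PacketList:
        @wrapt.synchronized
        def add(self, packet):
            # wrapt.synchronized serializes concurrent calls on this method,
            # replacing hand-rolled lock acquire/release code.
            ...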
Hemna e5f60b5ce1 Add removing existing requirements
This patch updates the Makefile to do an rm on the requirements.txt
when updating the requirements files.
2022-11-23 13:01:06 -05:00
Hemna 2ce50d8861 Try sending raw APRSFrames to aioax25
This seems to work sending out, but still getting
third-party dropped packets as response from the local repeater.
2022-11-23 13:01:06 -05:00
Hemna ad79ed1261 Use new aprsd.callsign as the main callsign
This patch changes how aprsd identifies itself when connected to
any client, rather than relying on the login for each client.
There are 3 supported clients currently
aprsis,
tcpkiss
serialkiss.

Each client has their own potential login/callsign to connect
to the remote.  This patch tells aprsd to use the new config option
aprsd.callsign as a means to identify itself.  It will accept
packets as <aprsd.callsign> and reply as <aprsd.callsign> regardless
of which client object is being used to connect to the remote.

Note: this breaks backwards compatibility.  This patch now requires
the new config option
aprsd:
  callsign: <callsign>
2022-11-23 13:01:01 -05:00
Hemna 5f28788180 Fixed access to threads refactor 2022-11-23 13:00:37 -05:00
Hemna 585d55f10d Added webchat command
This patch adds the new aprsd webchat command which shows
a new webpage that allows you to aprsd chat with multiple
callsigns
2022-11-23 13:00:36 -05:00
Hemna 1ccb2f7695 Moved log.py to logging
Also renamed logging/logging.py to logging/rich.py
2022-11-23 13:00:36 -05:00
Hemna a62843920a Moved trace.py to utils
This patch moves trace.py to the utils directory
2022-11-23 13:00:36 -05:00
Hemna 29b84b453b Fixed pep8 errors 2022-11-23 13:00:36 -05:00
Hemna 347a6d69f7 Refactored threads.py
This patch creates a threads directory and separates out
the contents of threads.py into separate files in the
threads directory to make it easier to find and maintain.
2022-11-23 13:00:36 -05:00
Hemna bed060f1c5 Refactor utils to directory
This patch moves the utils.py to utils/__init__.py
and fuzzyclock.py to utils
and separates the ring_buffer into its own file in utils
2022-11-23 13:00:36 -05:00
Hemna ab6583666f remove arm build for now 2022-11-04 14:06:03 -04:00
Hemna 3580425ca3 Added rustc and cargo to Dockerfile
This is an attempt to fix the failing docker image build for
linux/arm/v7
2022-11-04 11:34:27 -04:00
Hemna 358aa59042 remove linux/arm/v6 from docker platform build 2022-11-04 10:41:25 -04:00
Hemna 9671dacb1c Only tag master build as master 2022-11-04 10:32:22 -04:00
Hemna f9d3bc433f Remove docker build from test
This patch removes the container build from the main python.yml
github action that is only supposed to test tox results for commits
2022-11-04 10:30:34 -04:00
Walter A. Boring IV 1383352e75
create master-build.yml
This patch adds the tox and docker image build for the latest container image on every push to master branch
2022-11-04 10:27:43 -04:00
Hemna b50f343440 Added container build action 2022-11-04 09:04:14 -04:00
Walter A. Boring IV 4c7c90b947
Merge pull request #98 from ranguli/ranguli-patch-1
Update docs on using Docker
2022-11-02 10:47:04 -04:00
ranguli bb09296efa
Update docs on using Docker 2022-11-01 21:49:58 -02:30
Hemna 7db2242060 Update dev-requirements pip-tools
This patch updates the pip-tools version to prevent
make update-requirements from failing.
2022-11-01 14:16:24 -04:00
Walter A. Boring IV 61655a0a85
Merge pull request #89 from wildeyedskies/update-eventlet
Bump dependencies to fix python 3.10
2022-11-01 14:02:13 -04:00
Walter A. Boring IV fdc8bfafc0
Merge pull request #96 from ranguli/fix-pypi-scraping
Fix #92 (PyPI scraping)
2022-11-01 14:01:37 -04:00
Walter A. Boring IV 0e5f7aa211
Merge pull request #93 from ranguli/fix-readme-formatting
README formatting fixes
2022-11-01 13:59:28 -04:00
Walter A. Boring IV c16886263f
Merge pull request #94 from ranguli/fix-exception-typo
Fix typo on exception
2022-11-01 13:51:41 -04:00
Walter A. Boring IV eb4b67d9b8
Merge pull request #97 from ranguli/patch-1
Fix plugins not installing via docker-compose
2022-11-01 13:47:59 -04:00
ranguli 389304c3f2
Fix typo in docker-compose.yml 2022-10-28 10:35:39 -02:30
ranguli 9ffd320353 Fix PyPI scraping 2022-10-27 12:33:31 -02:30
Walter A. Boring IV 74e4e2c4f5
Merge pull request #95 from ranguli/patch-1 2022-10-26 22:19:20 -04:00
ranguli b1db08a08c
Allow web interface when running in Docker 2022-10-26 20:03:42 -02:30
ranguli cc2918377e Fix typo on exception 2022-10-26 16:46:50 -02:30
ranguli f339ee3ebf
README formatting fixes 2022-10-26 16:01:51 -02:30
Zoe Moore 9d39b030fb Bump dependencies to fix python 3.10 2022-05-31 15:53:26 -07:00
Hemna 1c052a63c0 Fixed up config option checking for KISS
This patch updates the config option checking for
required fields in the config yaml file.  Specifically
for the existence of the aprsd: section
and the required fields for the 3 supported client types
aprsis,
kiss serial,
kiss tcp
2022-02-21 16:04:33 -05:00
Hemna e739441268 Fix logging issue with log messages
This patch changes the base Message class to
ensure that all printing of the message class only
outputs the message with truncation and bad word filtering
applied in the log.
2022-02-11 10:03:02 -05:00
Hemna 03a20ebb5c for 2.5.9 2022-01-26 14:59:46 -05:00
Hemna 6257c9ea90 FIX: logging exceptions
This patch fixes the logging of exceptions in the email
plugin.
2022-01-26 14:39:14 -05:00
Hemna b00c8db3d6 Updated build and run for rich lib 2022-01-08 09:41:17 -05:00
Hemna 79270f95be update build for 2.5.8 2022-01-08 09:29:26 -05:00
Hemna 29a60b7ed0 For 2.5.8 2022-01-07 15:19:44 -05:00
Hemna e8100d8777 Removed debug code 2022-01-07 15:17:16 -05:00
Hemna 764730c123 Updated list-plugins
This patch updates the README.rst with the new format for
`aprsd list-plugins`.
2021-12-15 10:48:16 -05:00
Hemna 610e40aecd Renamed virtualenv dir to .aprsd-venv
This helps with shell prompts showing the name of the venv.
When you have multiple venv environments on your system, naming them
helps to identify which one you are actively using.
2021-12-15 10:45:53 -05:00
Hemna 2f6e7e17e8 Added unit tests for dev test-plugin
Also added a check to make sure that the aprs_login
parameter is passed in for use as the fromcallsign.
2021-12-12 16:35:26 -05:00
Hemna a7bbde4a43 Send Message command defaults to config
The APRS_LOGIN and APRS_PASSWORD arguments now fallback
to the config file if it exists.

First it checks the passed in parameters, then checks the
environment vars, then checks the parsed config to find the
login and password.

This patch also adds unit tests for the send-message command to
check the fallback.
2021-12-12 16:13:08 -05:00
Hemna 7530bcf55c Updated Changelog 2021-12-11 07:59:50 -05:00
Walter A. Boring IV ab37a5e7a7
Merge pull request #79 from craigerl/fix_kiss_is_enabled
Fixed a KISS config disabled issue
2021-12-11 07:56:47 -05:00
Hemna 3b9970c0e7 Fixed a KISS config disabled issue
This patch fixes a small bug when both KISS interfaces are disabled.
2021-12-11 07:46:43 -05:00
Hemna e57a2e2ffc Fixed a bug with multiple notify plugins enabled
This patch fixes an issue with the processing of packets
and updating the watchlist.  Previously after the
notify plugin processed the packet it would update the watchlist.
This doesn't work when there is more than one notify plugin
enabled; only the first notify plugin seeing the packet will
recognize that the callsign is old.
2021-12-10 14:20:57 -05:00
Walter A. Boring IV 6a1cea63e4
Merge pull request #77 from craigerl/logs
Unify the logging to file and stdout
2021-12-10 11:12:48 -05:00
Hemna 592b328956 Unify the logging to file and stdout
This patch updates the logging facility to ensure that
logging to a file works even when --quiet mode is selected.
Also update the listen and list-plugins command to show
a console.status line while waiting for results to come in.
2021-12-10 10:49:09 -05:00
Walter A. Boring IV 450bacfe99
Merge pull request #76 from craigerl/list-plugins
Added new feature to list-plugins command
2021-12-09 09:44:49 -05:00
Hemna cd62db95c1 Added new feature to list-plugins command
This patch updates the output of the list-plugins command.
This also adds the ability to show the available plugins
to install that are published packages on pypi.org.

This also shows the list of installed packages from pypi.org
2021-12-08 17:16:17 -05:00
Hemna 28b54c330d more README.rst cleanup 2021-12-07 15:22:08 -05:00
Hemna 7c653cc100 Updated README examples
The examples in the README.rst were painfully old.
2021-12-07 15:18:27 -05:00
Hemna b7791eb4fa Changelog 2021-12-07 15:05:34 -05:00
Hemna 440c8d54ad Tightened up the packet logging 2021-12-07 15:00:38 -05:00
Walter A. Boring IV bcc1b4e309
Merge pull request #75 from craigerl/unittests
Unittests
2021-12-07 13:37:02 -05:00
Hemna 8ea00e9888 Added unit tests for USWeatherPlugin, USMetarPlugin 2021-12-07 13:31:58 -05:00
Hemna 5d6ac5cf31 Added test_location to test LocationPlugin 2021-12-07 12:38:12 -05:00
Hemna e0e75149a9 Updated pytest output
This patch changes tox.ini to update the output for the unit test
runs.
2021-12-07 11:57:01 -05:00
Hemna a5184fb98c Added py39 to tox for tests 2021-12-07 11:35:18 -05:00
Hemna 0ad791bdd9 Added NotifyPlugin unit tests and more
This patch restructures the unit tests for plugins.
This also adds unit tests for the NotifyPlugin
2021-12-07 11:25:14 -05:00
Hemna 96cc07d15f Small cleanup on packet logging
This patch reduces some of the leading whitespace
in the message/packet logging written to the log file.
2021-12-06 14:35:49 -05:00
Hemna d3dd08714b Reduced the APRSIS connection reset to 2 minutes
The time in which the KeepAlive Thread would reset the APRS-IS
socket connection used to be 5 minutes.   This patch changes
that to 2 minutes.
2021-12-06 14:34:22 -05:00
Hemna 055835cb3c Fixed the NotifyPlugin
The watchlist notify plugin is supposed to send an APRS message
to the configured callsign.  This patch makes sure that the
message is sent to the notify_callsign
2021-12-06 14:11:34 -05:00
Walter A. Boring IV ff8bf02e26
Merge pull request #74 from craigerl/rich_logging
Rich logging
2021-12-03 09:32:25 -05:00
Hemna b5b286e75c Fixed some pep8 errors 2021-12-03 09:10:33 -05:00
Hemna 1233137caf Add tracing for dev command
This patch enables tracing output in the log for the dev
test-plugin command
2021-12-03 08:53:08 -05:00
Hemna 1d5f76defc Added python rich library based logging.
The python rich library is extensive and has a really nice
log format that is easier to read and has built in formatting
and coloring of the log output.

To enable rich logging add rich_logging: True in the config file.
2021-12-03 08:05:03 -05:00
Walter A. Boring IV 950c62f49b
Merge pull request #73 from emresaglam/loglevel
Added LOG_LEVEL env variable for the docker
2021-12-03 08:02:15 -05:00
Emre Saglam 7aaa002a0e Added LOG_LEVEL env variable for the docker 2021-12-02 17:08:41 -08:00
Hemna e27887db1a Update requirements to use aprslib 0.7.0
aprslib 0.7.0 has a few aprs packet parsing fixes.

https://github.com/rossengeorgiev/aprs-python/pull/66

Support for the 'more recent' reply/ack msg format from 1999
2021-11-28 10:47:12 -05:00
Hemna 5e50792e80 fixed the failure during loading for objectstore
This patch fixes a silent failure of loading data from the objectstore
2021-11-13 15:07:28 -05:00
Hemna deec249c45 updated docker build 2021-11-13 10:01:38 -05:00
Hemna ade3c49e93 Updated Changelog 2021-11-13 10:00:40 -05:00
Hemna 6fb610582d Fixed dev command missing initialization
This patch fixes a few issues when running test-plugin command.
It was missing some initialization of the stats and packets classes.
2021-11-13 09:56:19 -05:00
Hemna bda2ef00dd Fix admin logging tab 2021-11-12 12:17:45 -05:00
Hemna 446484e631 Added new list-plugins command
This patch adds the new list-plugins command that shows the
list of built in plugins for APRSD.
2021-11-12 11:36:22 -05:00
Hemna a8a6b1aa07 Don't require check-version command to have a config
This patch removes the need for check-version to have a
config file.
2021-11-12 10:23:27 -05:00
Hemna 8842fb1b44 Healthcheck command doesn't need the aprsd.yml config
This patch updates the healthcheck command to not require
the aprsd.yml config file to exist.   The healthcheck
calls a running aprsd, collects the stats to determine if it's
healthy.
2021-11-10 11:52:51 -05:00
Hemna 152132b0ed Fix test failures 2021-11-10 11:51:21 -05:00
Hemna 7787dc1be4 Removed requirement for aprs.fi key
This removes the requirement that the aprs.fi key be specified
in the config file in order to run APRSD.  The plugins that need the
key have been updated to set enabled = False when the key is missing.
2021-11-10 11:01:10 -05:00
Hemna 10e34d8634 Updated Changelog 2021-11-09 15:06:40 -05:00
Hemna 9469410929 Removed stock plugin. 2021-11-09 15:02:54 -05:00
Walter A. Boring IV 998bc32c27
Merge pull request #72 from craigerl/remove_stock_plugin
Removed the stock plugin
2021-11-09 14:59:03 -05:00
Hemna 88db485eb4 Removed the stock plugin
This patch removed the built in stock plugin from APRSD.
This helps clean up the requirement tree from the yfinance
python module that pulled in a lot of other requirements.

The stock plugin is its own separate repo and module now.

https://github.com/hemna/aprsd-stock-plugin

https://pypi.org/project/aprsd-stock-plugin/
2021-11-09 14:53:20 -05:00
Hemna 5d17809895 Updated for v2.5.0 2021-11-09 10:31:29 -05:00
Hemna 059cc86a11 Updated Dockerfile's and build script for docker 2021-11-09 08:15:16 -05:00
Walter A. Boring IV ffdd1e47b2
Merge pull request #71 from craigerl/refactor_cli
Refactor cli
2021-11-09 08:07:19 -05:00
Hemna cdcb98e438 Cleaned up some verbose output & colorized output
Some commands now have some color if the shell detects it supports it.
2021-11-08 12:18:23 -05:00
Hemna 89727e2b8e Reworked all the common arguments
This patch reworks all the common arguments for the commands
and subcommands

--loglevel
--config_file
--quiet

These are all now processed in 1 place.
2021-11-08 11:52:41 -05:00
Hemna 617973f561 Fixed test-plugin 2021-11-05 16:40:07 -04:00
Hemna 9187b9781a Ensure common params are honored 2021-11-05 16:26:24 -04:00
Hemna 8287c09ce5 pep8 2021-11-05 14:38:23 -04:00
Hemna 82def598f0 Added healthcheck to the cmds
This patch moves the healthcheck to its own command.
aprsd healthcheck
2021-11-05 14:21:36 -04:00
Hemna 3463c6eb96 Removed the need for FROMCALL in dev test-plugin
We already use the env var for APRS_LOGIN, so that is now
used for the test-plugin command.
Also cleaned up some help text
2021-11-05 14:05:24 -04:00
Hemna 2ead6a97da Pep8 failures 2021-11-05 13:42:27 -04:00
Hemna 7d0006b0a6 Refactor the cli
This patch refactors the cli to incorporate
the dev, send-message, listen commands into the main aprsd app.
This also moves the command line completion installer/show into
its own subgroup.
2021-11-05 13:36:33 -04:00
Hemna 30df452e00 Updated Changelog for 4.2.3 2021-11-05 10:51:22 -04:00
Hemna 49f3ea8339 Fixed a problem with send-message command
This patch fixes a problem with the packets object
not being initialized correctly for the send-message command
from the command line.

Also adds the --wait-response option for send-message, which by
default is now False
2021-11-05 10:39:47 -04:00
Hemna 0d5b7166b3 Updated Changelog 2021-11-02 11:47:40 -04:00
Hemna cefb581bb8 Be more careful pickling data to/from disk
This patch ensures that the pickle file is opened and closed correctly
as well as trapping for any exceptions that might occur while loading
a pickle file.
2021-11-02 08:52:59 -04:00
Hemna d2e8fe660f Updated Changelog 2021-10-25 11:33:15 -04:00
Hemna 95fecd2394 Ensure plugins are last to be loaded.
This patch initializes all of the MsgTrack, WatchList and SeenList
prior to the plugins loading.  Some plugins may kick off messages
being sent immediately.  So everything has to be ready to go
prior to the plugins being loaded.
2021-10-25 11:22:46 -04:00
Hemna c8c23e6185 Fixed email connecting to smtp server
Fixed an issue with not passing config in the smtp_connect
2021-10-25 11:12:29 -04:00
Hemna a3a3a5aa23 Updated Changelog for 2.4.0 release 2021-10-22 16:24:26 -04:00
Hemna e009791b75 Converted MsgTrack to ObjectStoreMixin 2021-10-22 16:07:20 -04:00
Hemna b0d25a76f7 Fixed unit tests
Had to initialize the watchlist and seenlist with a config dict.
2021-10-21 09:20:24 -04:00
Hemna 89701c8a70 Make sure SeenList update has a from in packet
This makes sure that the packet being processed by the seenlist
has a from address.
2021-10-21 08:40:40 -04:00
Hemna 66c5d85b89 Ensure PacketList is initialized 2021-10-20 15:48:35 -04:00
Hemna 8ee8b149f1 Added SIGTERM to signal_handler 2021-10-20 15:37:54 -04:00
Hemna 0d51634ec2 Enable configuring where to save the objectstore data
This patch adds a new config entry aprsd.save_location
which is the directory used to store and load the objectstore
data.
2021-10-20 14:39:12 -04:00
Hemna 135e21cd8d PEP8 cleanup 2021-10-20 14:10:54 -04:00
Hemna 4233827dea Added objectstore Mixin
This patch adds the new objectstore Mixin class that enables
classes that store their data in self.data as a serializable dict,
to be able to be stored to disk at shutdown and loaded at startup.

The SeenList and WatchList are now saved/loaded to/from disk.
2021-10-20 14:07:22 -04:00
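A rough sketch of the save/load behavior such a mixin implies, assuming pickle as the on-disk format; this is illustrative, not aprsd's actual ObjectStoreMixin:

    import pickle

    class ObjectStoreMixin:
        # Children keep their state in self.data as a picklable dict.
        def __init__(self):
            self.data = {}

        def save(self, path):
            with open(path, "wb") as fp:
                pickle.dump(self.data, fp)

        def load(self, path):
            try:
                with open(path, "rb") as fp:
                    self.data = pickle.load(fp)
            except (OSError, pickle.UnpicklingError):
                # Missing or corrupt store: start empty instead of failing.
                self.data = {}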
Hemna 9b2212245f Added -num option to aprsd-dev test-plugin
This allows the user to specify how many times in a loop
to call the plugin.  The Default is 1.
2021-10-20 11:46:55 -04:00
Hemna 9150f3b6ff Only call stop_threads if it exists 2021-10-10 14:50:04 -04:00
Hemna 278bb6e882 Added new SeenList
This patch adds the seen list feature.  It tracks the callsign of every
packet that aprsd sees.
2021-10-09 14:29:25 -04:00
Hemna 004795dbf1 Added plugin version to stats reporting
This patch adds version to the plugin stats collected.
2021-10-09 09:31:51 -04:00
Hemna 3b7924b13d Added new HelpPlugin
This patch adds the always enabled HelpPlugin.  This plugin
now will respond to the 'help' or 'h' commands that will
automatically build a help string based on the number of
enabled plugins.  It will also respond to
help <plugin> with the plugin specific help
2021-10-08 12:01:04 -04:00
Hemna 2bf85db21b Updated aprsd-dev to use config for logfile format
This patch updates the aprsd-dev command's log file format
to use what's defined as the default and/or use the config file
setting like aprsd server does.
2021-10-08 08:47:24 -04:00
Walter A. Boring IV 14f77876f9
Merge pull request #70 from craigerl/utils_refactor
Refactoring/Cleanup
2021-10-08 08:41:14 -04:00
Hemna db9cbf51df Updated build.sh
This patch forces the rebuild of the docker buildx build container.
Also makes the tag, version available from cmdln
2021-10-07 10:56:09 -04:00
Hemna 5b17228811 removed usage of config.check_config_option
check_config_option has been superseded by the config UserDict
object's ability to see if a config option exists.
2021-10-07 10:11:48 -04:00
Hemna 725bb2fe35 Fixed send-message after config/client rework
This patch fixes the send-message from the command line
ability after the complete rework of the client classes.
2021-10-07 10:05:19 -04:00
Hemna f8d87d05bb Fixed issue with flask config
Flask was trying to serialize the UserDict object.  Use the
data (dict) inside of it instead.
2021-10-06 15:17:09 -04:00
Hemna 30671cbdbc Added some server startup info logs
This patch adds some general info logs around starting the
client connection as well as loading the plugins.
2021-10-06 12:55:17 -04:00
Hemna fdc8c0cd66 Increase email delay to +10
This patch updates the increasing of the email check delay to += 10
seconds instead of +1.
2021-10-06 12:12:49 -04:00
Hemna c097c31258 Updated dev to use plugin manager
Also ensure that main creates the client prior to starting the
plugins.
2021-10-06 12:09:52 -04:00
Hemna e3c5c7b408 Fixed notify plugins
The notify base filter() was missing the @hookimpl
2021-10-06 12:08:29 -04:00
Hemna 491644ece6 Added new Config object.
The config object now has builtin dot notation getter with default

config.get("some.path.here", default="Not found")
2021-10-04 15:37:14 -04:00
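A minimal sketch of a dot-notation getter with a default, matching the usage shown above; the real Config object is a UserDict with more behavior:

    class DotConfig(dict):
        def get(self, path, default=None):
            node = self
            for key in path.split("."):
                if isinstance(node, dict) and key in node:
                    node = node[key]
                else:
                    return default
            return node

    config = DotConfig({"aprs": {"login": "NOCALL"}})
    assert config.get("aprs.login") == "NOCALL"
    assert config.get("some.path.here", default="Not found") == "Not found"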
Hemna a6ed7b894b Fixed email plugin's use of globals
The email plugin was still using globals for tracking
the check_email_delay as well as the config.  This
patch creates a new singleton thread safe mechanism for
check_email_delay with the EmailInfo class.
2021-10-04 11:36:13 -04:00
Hemna 270be947b5 Refactored client classes
This patch completely refactors and simplifies how the clients
are created and used.  There is no need now to have a separate
KISSRXThread.  Since all the custom work for the KISS client is
encapsulated in the kiss client itself, the same RX thread and
callback mechanism works for both the APRSIS client and KISS Client
objects.  There is also no need to determine which transport
(aprsis vs kiss) is being used at runtime by any of the messages
objects.  The same API works for both APRSIS and KISS Client objects
2021-09-17 09:32:30 -04:00
Hemna 23e3876e7b Refactor utils usage
This patch separates out the config from utils.py.
utils.py has grown into a catchall for everything, and this
patch is the start of that cleanup.
2021-09-16 17:08:30 -04:00
Hemna 65ea33290a 2.3.1 Changelog 2021-09-13 13:30:06 -04:00
Hemna 560e152742 Fixed issue of aprs-is missing keepalive
Started noticing that aprs-is keepalive messages just stop
getting sent.  This causes aprsd to basically disconnect from
the APRS network.  Added a check into the KeepAlive thread to
restart the aprs-is connection if the last keepalive we got
from aprs-is was more than 5 minutes ago.
2021-09-13 13:22:06 -04:00
Hemna 69b215d4d8 Fixed packet processing issue with aprsd send-message
This patch adds the missing PacketList initialization
for the send-message command
2021-09-10 15:39:07 -04:00
Hemna 4164e89016 Prep 2.3.0 2021-09-08 14:54:59 -04:00
Hemna 1b9a9935fc Enable plugins to return message object
This patch enables the ability for plugins to return:
* string
* list of strings
* message object
* list of strings and message objects

Each string will be encapsulated in a message object prior to being sent.
Each message object will be sent directly.
Each list will be iterated over and processed according to the above 2
options.
2021-09-08 14:45:15 -04:00
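A sketch of the dispatch rule those bullets imply; the Message class here is a stand-in for aprsd's real reply type:

    class Message:
        # Stand-in for the real message object.
        def __init__(self, text):
            self.text = text

    def normalize_replies(result):
        # Strings get wrapped, message objects pass through untouched,
        # and lists are walked item by item with the same two rules.
        items = result if isinstance(result, list) else [result]
        return [Message(i) if isinstance(i, str) else i for i in items]

    replies = normalize_replies(["hello", Message("already wrapped")])
    assert all(isinstance(r, Message) for r in replies)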
Hemna 3faf41b203 Added enabled flag for every plugin object
This allows the admin interface to see which plugins are registered and
enabled.  Enabled is a flag that is set in the setup() method of the
plugin.  This gives the plugin developer a chance to disable the plugin
if something isn't right at setup time.   This allows aprsd to ignore
plugins that are registered but not enabled.
2021-09-08 14:25:12 -04:00
Hemna 7e6dffb34b Ensure plugin threads are valid
This patch makes sure that the plugin threads returned from
create_threads are somewhat valid.
2021-09-08 13:44:20 -04:00
Hemna 605911cb84 Updated Dockerfile to use v2.3.0
Upcoming release of aprsd is v2.3.0
2021-09-07 13:56:06 -04:00
Hemna 9eff99dde7 Removed fixed size on logging queue
If the logging queue gets full, due to a maxsize being set,
then any further logs will result in lots of errors being dumped
to stderr because the queue is full.
2021-09-07 13:43:48 -04:00
Hemna d6b3df93f1 Added Logfile tab in Admin ui
This patch adds a live view of the aprsd logfile in
the admin ui.  This uses a new Log QueueHandler and the
threads.logging_queue to push log entries into a queue.
The flask websockets server will push those log entries up
to a connected client browser.
2021-09-07 13:13:36 -04:00
Hemna 4f088e0a4a Updated Makefile clean target
This expands the clean target to clean up more stuff.
2021-09-05 19:47:26 -04:00
Hemna d643ca3892 Added self creating Makefile help target
This updates the Makefile to include a default target of help
and it automatically generates the help based on the targets
and their descriptions.
2021-09-03 16:49:16 -04:00
Hemna dfaf3aa3d1 Update dev.py
This patch ensures a valid packet is passed into the plugin prior to
testing.
2021-09-03 16:48:00 -04:00
Hemna 62ce84b315 Allow passing in aprsis_client
When the admin user uses the web ui to send a message
a new client instance is created with login credentials for
that particular message.  This patch ensures that send_direct
uses that client.
2021-09-02 11:17:15 -04:00
Hemna 8ada789d4d Fixed a problem with the AVWX plugin not working
the regex for the plugin was not matching correctly
2021-09-02 11:06:25 -04:00
Hemna 558710d348 Remove some noisy trace in email plugin
This removes the trace decorators from the email login
functions.  They have been stable for a while now.
2021-09-02 10:03:43 -04:00
Hemna 1ea6c05dec Fixed issue at startup with notify plugin
Ensure that the aprsis client is configured prior to starting
any plugins.
2021-09-02 09:54:13 -04:00
Hemna 0f6df5fc05 Fixed email validation
This patch adjusts the py3-email-validation usage.  Since we
upgraded to 1.0.2, the signature has changed.  This patch adjusts
the signature usage so it works again.
2021-09-02 09:43:33 -04:00
Hemna 1635feb820 Removed values from forms 2021-09-02 09:12:44 -04:00
Hemna c58031d772 Added send-message to the main admin UI 2021-09-02 08:56:25 -04:00
Walter A. Boring IV 266ae7f217
Merge pull request #59 from craigerl/web-send-msg
Send Message via admin Web interface
2021-09-01 17:44:50 -04:00
Hemna c537b54df6 Updated requirements 2021-09-01 17:38:59 -04:00
Hemna 84ce60bc50 Cleaned up some pep8 failures 2021-09-01 17:11:35 -04:00
Hemna c941379a5c Upgraded the send-message POC to use websockets
This patch updates the send message Admin page to use
websockets.  It makes updates to the messages list instant.
2021-09-01 17:10:59 -04:00
Hemna 23cbf32814 New Admin ui send message page working. 2021-09-01 17:10:13 -04:00
Hemna 6d3258e833 Send Message via admin Web interface
This patch adds the ability to send a message from the
admin interface's send-message.html page.
2021-09-01 17:06:56 -04:00
Walter A. Boring IV d243e577f0
Merge pull request #50 from craigerl/tcpkiss
Added the ability to use direwolf KISS socket
2021-09-01 16:58:57 -04:00
Hemna ca438c9c60 Updated Admin UI to show KISS connections
This updates the top area of the Admin UI to reflect the
connection type (aprs-is vs kiss).
2021-09-01 16:39:50 -04:00
Hemna f4dee4b202 Got TX/RX working with aioax25+direwolf over TCP
This patch gets APRSD fully working with the TCPKISS socket
to direwolf.
2021-09-01 14:48:22 -04:00
Hemna 54c9a6b55a Rebased from master 2021-08-30 13:34:25 -04:00
Hemna b53e2ba7fe Added the ability to use direwolf KISS socket
This patch adds APRS KISS connectivity.  I have tested this with
a running Direwolf install via either a serial KISS connection or
the optional new TCPKISS connection, both to Direwolf.

This adds the new required aioax25 python library for the underlying
KISS and AX25 support.

NOTE: For the TCPKISS connection, this patch requires a pull request
that patches the aioax25 library to include a TCP-based KISS TNC client to
enable the TCPKISS client.  So you will need to pull down this PR
https://github.com/sjlongland/aioax25/pull/7

To enable this,
  Edit your aprsd.yml file and enable one of the 2 KISS connections.
  Only one is supported at a time.

  kiss:
     serial:
         enabled: True
         device: /dev/ttyS1
         baudrate: 9600

  or

  kiss:
      tcp:
          enabled: True
          host: "ip address/hostname of direwolf"
          port: "direwolf configured kiss port"

This patch alters the Message object classes to be able to
send messages out via the aprslib socket connection to the APRS-IS
network on the internet, or via the direwolf KISS TCP socket,
depending on the origination of the initial message coming in.

If an APRS message comes in via APRS-IS, then replies will go out
APRS-IS.  IF an APRS message comes in via direwolf, then replies
will go out via direwolf KISS TCP socket.   Both can work at the same
time.

TODO:  I need some real APRS message packets to verify that
the new thread is processing packets correctly through the plugins
and able to send the resulting messages back out to direwolf.

Have a hard coded callsign for now in the kissclient consumer call,
just so I can see messages coming in from direwolf.  I don't have an
APRS capable radio at the moment to send messages directly to direwolf.
Might need to write a simple python socket server to send fake APRS
messages to aprsd kiss, just for finishing up development.
2021-08-30 13:28:39 -04:00
Hemna a7d79a6e1b Update Dockerfile to use 2.2.1 2021-08-26 10:50:34 -04:00
Hemna 44c4dd69c6 Update Changelog for 2.2.1 2021-08-25 08:28:21 -04:00
Hemna ec92b07e31 Silence some log noise
Removed an email thread log at the start of the loop.
Also bumped the KeepAlive thread time to 60 seconds
2021-08-25 08:25:36 -04:00
Hemna 81903534ed Updated Changelog for v2.2.0 2021-08-25 08:02:09 -04:00
Hemna d5d00643fa Updated overview image 2021-08-24 17:41:26 -04:00
Hemna daf1e21b45 Removed Black code style reference
APRSD no longer follows the Black code styling.  Black just sucks
due to its completely unreadable code for functions with long
parameter lists.  This patch removes the code style badge from
the README.rst
2021-08-24 15:46:16 -04:00
Walter A. Boring IV 3dd48ebb86
Merge pull request #69 from craigerl/refactor_message_processing
Refactor Message processing and MORE
2021-08-24 15:43:19 -04:00
Hemna 61967b5fe8 Removed TXThread
Since all outbound messages have a send() method that starts
a separate thread, there really is no reason for the transmit queue
thread at all.  All it did was get a message from the queue and then
call send on it, which would start another thread.  This removes that
intermediate TXThread.   When you want to send a message just call
send() on the message object.
2021-08-24 15:22:50 -04:00
Hemna 2e9b42d7af Added days to uptime string formatting
The uptime string formatter was missing days.
2021-08-24 14:08:24 -04:00
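For illustration, one way an uptime formatter gains a days field (a sketch, not the exact formatter used):

    import datetime

    def uptime_str(delta: datetime.timedelta) -> str:
        hours, rem = divmod(delta.seconds, 3600)
        minutes, seconds = divmod(rem, 60)
        return f"{delta.days}d {hours:02d}h {minutes:02d}m {seconds:02d}s"

    assert uptime_str(datetime.timedelta(days=2, hours=3, minutes=4, seconds=5)) == "2d 03h 04m 05s"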
Hemna 0f384b0e85 Updated select timeouts
This patch updates the select timeouts for threads.  This allows
threads to exit quicker when user hits CTRL-C.

Updates the KeepAlive Thread to include total packets.
2021-08-24 13:31:33 -04:00
Hemna 8b5f21eece Rebase from master and run gray
This patch is a rebase of master after the introduction
of switching from black to gray code formatting.
2021-08-23 14:08:14 -04:00
Hemna 8e627c98b3 Added tracking plugin processing
This patch adds plugin rx/tx processing of packets.
This tracks how many messages a plugin processes (receives) and
how many packets result in a plugin sending a message out.

This patch also adds a new plugins tab on the admin page.
2021-08-23 13:45:01 -04:00
Hemna 86777d838c Added threads functions to APRSDPluginBase
This patch updates the APRSDPluginBase class to include
standard methods for allowing plugins to create, start, stop
threads that the plugin might need/use.  Also update the aprsd-dev
to correctly start the threads and stop them for testing plugin
functionality.
Also added more unit tests and fake objects for unit tests.
2021-08-23 13:44:58 -04:00
Hemna 5f4cf89733 Refactor Message processing and MORE
This patch refactors how the received message processing happens.
We now handle all incoming packets the same.  Removed the notification
thread to handle the watchlist packets.  This is now done with a
unified plugins architecture that allows different capabilities
via the new plugin structure.  All packets sent to us will be
sent through all of the plugins.  It's the plugin's job to decide what to
do with that packet or ignore it.

Email is no longer a special case for the most part.  All email
functions have been migrated to the EmailPlugin, including starting the
EmailThread, which works in the background to check for new emails and
send those to the registered callsign.   The EmailPlugin now starts the
EmailThread itself.

All plugins are now build on the new APRSDPluginBase which has a common
set of features.  The APRSDPluginBase calls self.setup() upon creation,
which allows all plugins to do whatever they want for initial startup.
The EmailPlugin uses setup() to start the EmailThread if email is
enabled.
2021-08-23 13:43:53 -04:00
Hemna e175f77347 Use Gray instead of Black for code formatting.
The Black code formatter sucks with respect to function
declarations with a lot of params.  Completely unreadable.
2021-08-23 13:32:09 -04:00
Hemna d6643a8e96 Updated tox.ini
this patch updates tox.ini to support coverage tests.
2021-08-19 19:23:12 -04:00
Hemna f1f8aed8c4 Fixed LOG.debug issue in weather plugin 2021-08-19 19:07:45 -04:00
Hemna 2b694462f0 Updated slack channel link 2021-08-19 16:53:02 -04:00
Hemna e8ffaa92b6 Cleanup of the README.rst
Added badge section.
2021-08-19 16:45:23 -04:00
Hemna d71b0df314 Fixed aprsd-dev
This patch fixes running the aprsd-dev plugin development tool.
It currently only works for message based plugins.
2021-08-18 20:14:03 -04:00
Hemna 691b18fd1c Prep for v2.1.0
Update the Changelog for v2.1.0
2021-08-13 13:11:49 -04:00
Walter A. Boring IV 911730b28a
Merge pull request #68 from craigerl/plugins_multiple_msgs
Enable multiple replies for plugins
2021-08-13 12:45:52 -04:00
Hemna 349250685b Enable multiple replies for plugins
This patch adds the ability for plugins to send multiple messages
back in response to a command/message.  The plugin simply needs
to return a list of messages (Strings).  Each string in that list
will result in a separate message being sent back to the originator
of the message.
2021-08-13 12:36:48 -04:00
Hemna 840c8a990e Put in a fix for aprslib parse exceptions
This patch adds a fix for the aprslib consumer function
to ensure that we don't bail when logging a ParseError
2021-08-13 10:31:45 -04:00
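A hedged sketch of guarding the consumer's parsing against bad frames; the exception classes live in aprslib.exceptions, and the wrapper name is illustrative:

    import aprslib
    from aprslib.exceptions import ParseError, UnknownFormat

    def safe_parse(raw_frame):
        # Don't let one unparseable frame take down the consumer loop.
        try:
            return aprslib.parse(raw_frame)
        except (ParseError, UnknownFormat):
            return None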
Hemna ed4995b6eb Fixed time plugin 2021-07-29 20:17:58 -04:00
Hemna 6740ff80be Updated the charts Added the packets chart
This patch adds the APRS Packets chart to the charts admin ui.
Also moves the raw json to its own tab
2021-07-22 20:44:20 -04:00
Hemna be8179415a Added showing symbol images to watch list
This patch updates the Admin UI to display the APRS icon symbol
associated with a mic-e packet on the watch list tab for all
entries in the watch list.
2021-07-21 09:21:04 -04:00
Hemna b4713b2694 Updated docs for 2.0.0 2021-07-17 15:15:52 -04:00
Walter A. Boring IV b606495fbf
Merge pull request #66 from craigerl/notify_rework
Reworked the notification threads and admin ui.
2021-07-17 14:45:18 -04:00
Hemna 2fceba10e1 Reworked the notification threads and admin ui.
This patch updates the notification thread to send all packets
through the notification plugins.   The plugins themselves need to
do smart filtering to not reply to every packet.  This allows for
more interesting plugins.

Also fixed an issue with the messages tab in the admin ui, not
showing all of the received packets.   The messages tab now also
sees all the packets that aprsd receives.
2021-07-17 14:30:29 -04:00
Hemna 3d38402be2 Fixed small bug with packets get_packet_type
This fixes an issue with trying to decode the packet type.
Also updated some of the log entries.
2021-07-16 12:15:04 -04:00
Hemna 90a44bb5ed Updated overview images
This patch includes updated aprsd_overview diagram images
which now includes the new watch list feature.
2021-07-16 12:12:34 -04:00
Hemna 7dc4fb3e77 Move version string output to top of log 2021-07-16 12:11:51 -04:00
Walter A. Boring IV f31a4c07b4
Merge pull request #65 from craigerl/plugin_interface_change
Refactor the plugin interface and manager
2021-07-16 08:43:24 -04:00
Hemna 1a1fcba1c4 Add new watchlist feature
This patch adds a new optional feature called Watch list.
Aprsd will filter IN all aprs packets from a list of callsigns.
APRSD will keep track of the last time a callsign has been seen.
When the configured timeout value has been reached, the next time
a callsign is seen, APRSD will send the next packet from that callsign
through the new notification plugins list.

The new BaseNotifyPlugin is the default core APRSD notify based plugin.
When it gets a packet it will construct a reply message to be sent
to the configured alert callsign to alert them that the seen callsign
is now on the APRS network.

This basically acts as a notification that your watched callsign list is
available on APRS.

The new configuration options:
aprsd:
    watch_list:
        # The callsign to send a message to once a watch list callsign
        # is now seen on APRS-IS
        alert_callsign: NOCALL
        # The time in seconds to wait for notification.
        # The default is 12 hours.
        alert_time_seconds: 43200
        # The list of callsigns to watch for
        callsigns:
          - WB4BOR
          - KFART
        # Enable/disable this feature
        enabled: false
        # The list of notify based plugins to load for
        # processing a new seen packet from a callsign.
        enabled_plugins:
        - aprsd.plugins.notify.BaseNotifyPlugin

This patch also adds a new section in the Admin UI for showing the
watch list and the age of the last seen packet for each callsign since
APRSD startup.
2021-07-16 08:31:38 -04:00
Hemna 562ae52c1e Fixed the Ack thread not resending acks
This patch fixes a bug in the AckThread.  The thread loop
was exiting after the first attempt to send the ack.
Thread loops have to return True, in order to be called again
as this is the mechanism in which aprsd gracefully shuts down all
threads.
2021-07-15 14:11:30 -04:00
Hemna 3c45d8bd0f reworked the admin ui to use semantic ui more
Hemna 5afc7fb664 Added messages count to admin messages list.
This patch adds a simple count of packets shown in the
messages list on the admin ui.
2021-07-14 10:29:12 -04:00
Walter A. Boring IV ccaab72124
Merge pull request #64 from craigerl/web_tabs
Add admin UI tabs for charts, messages, config
2021-07-12 12:17:51 -04:00
Hemna de62579852 Add admin UI tabs for charts, messages, config
This patch updates the admin UI to include 3 tabs
of content.
Charts
messages
config

The charts tab is the existing line charts.
The messages tab shows a list of RX (green) and TX (red) messages
from/to aprsd.
The config tab shows the config loaded at startup time.
2021-07-12 12:12:14 -04:00
Hemna 1c66555450 Removed a noisy debug log 2021-07-09 15:22:53 -04:00
Walter A. Boring IV 13be30f772
Merge pull request #63 from craigerl/dump_config
Dump out the config during startup
2021-07-05 11:02:17 -04:00
Hemna 9a1ab1c0d6 Dump out the config during startup
This patch adds the dumping of a flattened config to the log
at startup.  This is helpful for seeing which config entries the aprsd
server is actually using at startup, and since it's in the log, you can
reference it later.
2021-07-05 10:57:22 -04:00
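A minimal sketch of what "flattening" a nested config for logging might look like (illustrative only; not aprsd's actual helper):

def flatten(config, prefix=""):
    """Turn a nested dict into dotted key/value pairs for easy logging."""
    flat = {}
    for key, value in config.items():
        dotted = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{dotted}."))
        else:
            flat[dotted] = value
    return flat

config = {"aprsd": {"units": "metric", "watch_list": {"enabled": False}}}
for key, value in sorted(flatten(config).items()):
    print(f"{key} = {value}")  # e.g. aprsd.units = metric
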
Hemna 3ae5717452 Added message counts for each plugin.
This patch adds a message counter for each plugin.  When the regex for
a plugin matches and the message is passed into the plugin for processing,
that message is tracked.  This message count is now reported by the stats
tracking object for the web admin UI.
2021-06-17 16:37:47 -04:00
Walter A. Boring IV d8950f0995
Merge pull request #61 from craigerl/dependabot/pip/urllib3-1.26.5
Bump urllib3 from 1.26.4 to 1.26.5
2021-06-03 17:20:54 -04:00
dependabot[bot] 6fb16421e8
Bump urllib3 from 1.26.4 to 1.26.5
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.4 to 1.26.5.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.4...1.26.5)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2021-06-02 03:40:23 +00:00
Walter A. Boring IV c14b1bc985
Merge pull request #60 from craigerl/update_checker
Added aprsd version checking
2021-05-04 10:10:09 -04:00
Hemna 17302aa76d Added aprsd version checking
This patch adds usage of update_checker to make sure the
version of APRSD being launched is the latest version.  Also added a
call to update_checker as part of the KeepAlive thread.  It will
call update_check every hour.  If aprsd has no network connectivity,
the update check will silently fail.
2021-05-04 10:06:43 -04:00
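A hedged sketch of that check, assuming the update_checker package's update_check(package_name, version) helper (the exact wiring into the KeepAlive thread is aprsd's own):

from update_checker import update_check  # pip install update_checker

try:
    # Prints a notice when a newer aprsd release is available on PyPI.
    update_check("aprsd", "1.6.1")
except Exception:
    # No network connectivity: silently skip, as the commit describes.
    pass
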
Walter A. Boring IV 2f7fa0c3d5
Merge pull request #56 from craigerl/dependabot/pip/urllib3-1.26.4
Bump urllib3 from 1.26.3 to 1.26.4
2021-05-03 09:54:54 -04:00
Hemna 9de0df31eb Updated INSTALL.txt
This patch added some changes to the INSTALL.txt file to fix a user
issue: https://github.com/craigerl/aprsd/issues/58

Added documentation to the INSTALL.txt to use the makefile.  It's
far easier to set up and install with than the manual
instructions.   Use PyPI for the official package.
2021-04-30 21:28:22 -04:00
Walter A. Boring IV b8dc6a329b
Update my callsign
This patch updates my callsign for the README.rst
2021-04-21 21:56:32 -04:00
Craig Lamparter 970b32f238
Update README.rst 2021-04-12 16:30:20 -07:00
Craig Lamparter 2a5ef58295
Update README.rst 2021-04-12 16:29:31 -07:00
dependabot[bot] 2696a399cb
Bump urllib3 from 1.26.3 to 1.26.4
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.3 to 1.26.4.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.3...1.26.4)

Signed-off-by: dependabot[bot] <support@github.com>
2021-04-06 18:24:15 +00:00
Hemna 55862a2790 Prep for v1.6.1 release 2021-04-05 14:32:36 -04:00
Hemna fc1ee19516 Removed debug log for KeepAlive thread
No need to dump out the length of the keepalive string for now.
2021-04-05 14:14:33 -04:00
Hemna 4aac17dc98 ignore Makefile.venv 2021-04-05 12:47:13 -04:00
Hemna a4a06c9763 Reworked Makefile to use Makefile.venv
Completely reworked the Makefile to make use
of an existing 'library' to manage python
virtual environments.

https://github.com/sio/Makefile.venv
2021-04-05 12:38:38 -04:00
Hemna 23c219f0d2 Fixed version unit tests 2021-04-05 08:55:46 -04:00
Hemna 7b019d24f0 Updated stats output for KeepAlive thread
Also added the aprsd uptime to the VersionPlugin
2021-04-02 18:54:00 -04:00
Hemna 3f21934c0f Update Dockerfile-dev to work with startup 2021-04-02 12:51:50 -04:00
Walter A. Boring IV cabe374909
Merge pull request #54 from craigerl/stats-web-ui
Added aprsd web index page
2021-04-02 12:00:55 -04:00
Hemna 3ac42edd82 Force all the graphs to 0 minimum
This patch updates all the graphs to have a minimum
Y value of 0.  Doesn't make sense to have negative messages.
2021-04-02 11:57:06 -04:00
Hemna d6806c429c Added email messages graphs
This patch cleans up the layout of the admin web page stats graphs
as well as adds in the email stats.  Added the titles to each
graph, so you know what you are looking at.
2021-04-02 11:47:52 -04:00
Hemna bf8d2c6088 Reworked the stats dict output and healthcheck
This patch reworks the stats object dict and includes more data.
Also includes the APRS-IS last update timestamp (from the last received message).
This is used to help determine if the aprsis server connection is still
alive and well.
2021-04-01 23:12:25 -04:00
Hemna 123266c9ad Added callsign to the web index page
This patch adds the aprs-is server callsign that aprsd is listening
on for messages.
2021-03-31 11:32:09 -04:00
Hemna 34d2c31d90 Added log config for flask and lnav config file
This patch adds the aprsd-lnav.json formatting file.
This is useful when you want to tail the logfile with the lnav
log tailing app.

http://lnav.org/

To install the aprsd-lnav.json formatter
1) install lnav
2) lnav -i aprsd-lnav.json
3) lnav -C  -- just to test it out

The next time you launch aprsd, pipe it through lnav:
aprsd server --loglevel DEBUG | lnav

This patch also updates the logging output from the flask
web service to 1) disable flask web url logging and 2)
use the same output format as the rest of the app.
2021-03-31 11:07:39 -04:00
Hemna d1a2a14370 Added showing APRS-IS server to stats
This patch updates the client.py to collect which APRS-IS server
that aprsd is connected to and displays that info on the stats web page.
2021-03-30 10:43:31 -04:00
Hemna fb979eda94 Provide an initial datapoint on rendering index
This patch adds a single data point when rendering the
initial stats for the index page.
2021-03-30 10:18:56 -04:00
Hemna 6297ebeb67 Make the index page behind auth
This patch makes the index page ask for login/password in order
to see the stats.
2021-03-30 09:55:14 -04:00
Walter A. Boring IV c8484eb195
Merge pull request #53 from craigerl/dependabot/pip/lxml-4.6.3
Bump lxml from 4.6.2 to 4.6.3
2021-03-30 09:52:27 -04:00
Walter A. Boring IV 5f8db9df37
Merge pull request #55 from craigerl/dependabot/pip/pygments-2.7.4
Bump pygments from 2.7.3 to 2.7.4
2021-03-30 09:52:13 -04:00
dependabot[bot] 0e6b46555a
Bump pygments from 2.7.3 to 2.7.4
Bumps [pygments](https://github.com/pygments/pygments) from 2.7.3 to 2.7.4.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.7.3...2.7.4)

Signed-off-by: dependabot[bot] <support@github.com>
2021-03-30 02:29:11 +00:00
Hemna f10372b320 Added acks with messages graphs 2021-03-26 11:13:32 -04:00
Hemna c7d10f53a3 Updated web stats index to show messages and ram usage
This patch updates the main index page to show both the
graph of tx/rx messages as well as peak/current ram usage.
2021-03-24 16:07:09 -04:00
Hemna f211e5cabb Added aprsd web index page
This patch adds an index page for the flask web server that users
can hit at /
2021-03-24 10:45:03 -04:00
dependabot[bot] 6f3486bc72
Bump lxml from 4.6.2 to 4.6.3
Bumps [lxml](https://github.com/lxml/lxml) from 4.6.2 to 4.6.3.
- [Release notes](https://github.com/lxml/lxml/releases)
- [Changelog](https://github.com/lxml/lxml/blob/master/CHANGES.txt)
- [Commits](https://github.com/lxml/lxml/compare/lxml-4.6.2...lxml-4.6.3)

Signed-off-by: dependabot[bot] <support@github.com>
2021-03-23 14:16:10 +00:00
Walter A. Boring IV 53cdccde10
Merge pull request #52 from craigerl/dependabot/pip/jinja2-2.11.3
Bump jinja2 from 2.11.2 to 2.11.3
2021-03-23 10:15:34 -04:00
Walter A. Boring IV a657eeb390
Merge pull request #51 from craigerl/dependabot/pip/urllib3-1.26.3
Bump urllib3 from 1.26.2 to 1.26.3
2021-03-23 10:15:12 -04:00
dependabot[bot] d9536434b0
Bump jinja2 from 2.11.2 to 2.11.3
Bumps [jinja2](https://github.com/pallets/jinja) from 2.11.2 to 2.11.3.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/master/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/2.11.2...2.11.3)

Signed-off-by: dependabot[bot] <support@github.com>
2021-03-20 05:37:19 +00:00
dependabot[bot] 5f3e067c96
Bump urllib3 from 1.26.2 to 1.26.3
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.2 to 1.26.3.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.2...1.26.3)

Signed-off-by: dependabot[bot] <support@github.com>
2021-03-19 20:14:09 +00:00
Hemna 0a038dae44 Added log format and dateformat to config file
This patch moves the default log format string and date format string
to the config file, so users can format the logs as they see fit.
The default log format also includes the file and line number that
posted the log entry.

The new entries in the config are here:
aprsd:
  logformat: "String here"
  dateformat: "string here"
2021-02-25 13:32:50 -05:00
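For reference, this is roughly how a format/date-format pair like that is applied with the standard logging module (illustrative; the format strings here are placeholders and aprsd wires this up through its own log setup):

import logging

LOG_FORMAT = "%(asctime)s [%(levelname)s] %(filename)s:%(lineno)d %(message)s"
DATE_FORMAT = "%m/%d/%Y %I:%M:%S %p"

logging.basicConfig(format=LOG_FORMAT, datefmt=DATE_FORMAT, level=logging.INFO)
logging.getLogger(__name__).info("aprsd logging configured")
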
Hemna 239e784d51 Added Dockerfile-dev and updated build.sh
Build.sh is used for multi-architecture building of docker images.
2021-02-18 18:59:08 -05:00
Hemna 933917bf9d Require python 3.7 and > 2021-02-18 16:51:31 -05:00
Hemna e6cafeb3d2 Added plugin live reload and StockPlugin
This patch adds 2 items.  First it adds the new StockPlugin,
which fetches stock quotes from the Yahoo Finance REST API using
the yfinance Python module.

Second, the web interface gains a new URL, /plugins, which tells
aprsd to reload all of its plugins from disk.  This is useful for
development, where the dev is editing an existing plugin and wants to
run the edited plugin without restarting aprsd itself.  The /plugins
URL requires admin login credentials.

TODO: it would be nice to live-reload the aprsd.yml config file as well,
so that plugins newly defined in aprsd.yml can be started by a later
/plugins reload.
2021-02-18 16:31:52 -05:00
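A rough sketch of fetching a quote with the yfinance module (hedged: yfinance's API has changed across versions, and this is not necessarily how StockPlugin does it):

import yfinance  # pip install yfinance

def stock_quote(symbol):
    """Return the most recent closing price for symbol, or None."""
    ticker = yfinance.Ticker(symbol)
    history = ticker.history(period="1d")
    if history.empty:
        return None
    return round(float(history["Close"].iloc[-1]), 2)

print(stock_quote("AAPL"))
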
Hemna 9f66774541 Updated Dockerfile and build.sh
This patch updates the main Dockerfile to work in multi-architecture
builds.  The Dockerfile now builds and installs aprsd from PyPI, not GitHub.
2021-02-18 16:15:49 -05:00
Hemna c177748340 Updated Dockerfile for multiplatform builds
This patch updates the main Dockerfile container build to use
the python:3.8-slim base image and installs aprsd from pypi.
This also results in a much smaller image.

Also added support for multiarchitecture builds so the same Dockerfile
builds for raspberry pi and linux/amd64
2021-02-16 10:20:34 -05:00
Hemna f0034fc517 Updated Dockerfile for multiplatform builds
This patch updates the main Dockerfile container build to use
the python:3.8-slim base image and installs aprsd from pypi.
This also results in a much smaller image.

Also added support for multiarchitecture builds so the same Dockerfile
builds for raspberry pi and linux/amd64
2021-02-16 10:05:11 -05:00
Hemna 2d5bb85071 Dockerfile: Make creation of /config quiet failure
This patch adds -p to mkdir command in the Dockerfile
to quiet the failure if /config already exists.
2021-02-13 10:41:43 -05:00
Hemna b6ba90de53 Updated README docs
This patch updates the 2 README files with copies of the latest
sample-config output to match the 1.6.0 release's config.

Also fixed a small issue with the Dockerfile.
2021-02-13 09:27:03 -05:00
Hemna a266c987fd 1.6.0 release prep 2021-02-12 19:46:31 -05:00
Walter A. Boring IV e2d8cc60e5
Merge pull request #49 from craigerl/dependabot/pip/cryptography-3.3.2
Bump cryptography from 3.3.1 to 3.3.2
2021-02-12 14:10:04 -05:00
Walter A. Boring IV 3a6316fa8a
Merge pull request #47 from craigerl/stabilize_1_6_0
Branch to stabilize for the 1.6.0 release.
2021-02-12 14:09:48 -05:00
Hemna 7df6462d91 Updated path of run.sh for docker build 2021-02-10 12:04:24 -05:00
Hemna 24edcad60a Moved docker related stuffs to docker dir 2021-02-10 11:58:02 -05:00
Hemna 9ba44a076c Removed some noisy debug log.
Use tracing instead to enable debugging of the
email calls.
2021-02-10 10:37:39 -05:00
dependabot[bot] afc53624e1
Bump cryptography from 3.3.1 to 3.3.2
Bumps [cryptography](https://github.com/pyca/cryptography) from 3.3.1 to 3.3.2.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/3.3.1...3.3.2)

Signed-off-by: dependabot[bot] <support@github.com>
2021-02-10 02:38:28 +00:00
Hemna 131919bdfb Wrap another server call with try except
Dreamhost email is total garbage.  Stop using it.
2021-02-05 15:32:36 -05:00
Walter A. Boring IV d0d0aef077
Merge pull request #48 from craigerl/dependabot/pip/bleach-3.3.0
Bump bleach from 3.2.1 to 3.3.0
2021-02-03 11:12:04 -05:00
Hemna a5cc274ff5 Wrap all imap calls with try except blocks
The Email Thread has been unstable due to some IMAP servers
being crap.  This patch wraps more of the IMAP server calls
in try/except blocks to trap errors.
2021-02-03 11:00:20 -05:00
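The kind of defensive wrapping described above, sketched with the standard imaplib module (illustrative; not the email plugin's actual code):

import imaplib
import socket

def fetch_unseen(server, user, password):
    """Return unseen message ids, or an empty list if the IMAP server misbehaves."""
    try:
        imap = imaplib.IMAP4_SSL(server, 993)
        imap.login(user, password)
        imap.select("INBOX")
        status, data = imap.search(None, "UNSEEN")
        if status != "OK":
            return []
        return data[0].split()
    except (imaplib.IMAP4.error, socket.error, OSError):
        # Flaky IMAP servers: trap the error instead of killing the thread.
        return []
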
dependabot[bot] d71937b176
Bump bleach from 3.2.1 to 3.3.0
Bumps [bleach](https://github.com/mozilla/bleach) from 3.2.1 to 3.3.0.
- [Release notes](https://github.com/mozilla/bleach/releases)
- [Changelog](https://github.com/mozilla/bleach/blob/master/CHANGES)
- [Commits](https://github.com/mozilla/bleach/compare/v3.2.1...v3.3.0)

Signed-off-by: dependabot[bot] <support@github.com>
2021-02-02 23:17:59 +00:00
Craig Lamparter 47135c6086 EmailThread was exiting because of IMAP timeout, added exceptions for this 2021-02-02 11:13:17 -08:00
Hemna db2b537317 Added memory tracing in keeplive 2021-01-29 11:02:21 -05:00
Hemna 0b44fc08eb Fixed tox pep8 failure for trace 2021-01-29 10:15:20 -05:00
Hemna af48c43eb2 Added tracing facility
You can enable debug tracing only if loglevel == DEBUG AND the
config file has aprsd:trace:True
2021-01-29 10:07:49 -05:00
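A minimal sketch of gating a trace decorator on the log level plus a config flag (the CONFIG dict and decorator here are hypothetical; aprsd's trace facility is more elaborate):

import functools
import logging

LOG = logging.getLogger("aprsd")
CONFIG = {"aprsd": {"trace": True}}  # stand-in for the parsed aprsd.yml

def trace(func):
    """Log entry/exit of func, but only when tracing is actually enabled."""
    enabled = (
        LOG.getEffectiveLevel() == logging.DEBUG
        and CONFIG["aprsd"].get("trace", False)
    )
    if not enabled:
        return func

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        LOG.debug("TRACE enter %s args=%s kwargs=%s", func.__name__, args, kwargs)
        result = func(*args, **kwargs)
        LOG.debug("TRACE exit  %s -> %r", func.__name__, result)
        return result

    return wrapper
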
Hemna 94bad95e26 Fixed email login issue.
This patch undoes an overzealous reworking of the
config.  The aprs login didn't move.
2021-01-26 13:33:39 -05:00
Craig Lamparter 57d768e010 duplicate email messages from RF would generate usage response 2021-01-26 09:18:43 -08:00
Hemna 030b02551f Enable debug logging for smtp and imap
Add the new config options for
aprsd:
  email:
    imap:
      debug: True

    smtp:
      debug: True
2021-01-25 16:16:08 -05:00
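In the standard library those two debug switches roughly correspond to the following (illustrative; the host names are placeholders and the config plumbing above is aprsd's):

import imaplib
import smtplib

def open_imap(host):
    conn = imaplib.IMAP4_SSL(host)
    conn.debug = 4  # IMAP4.debug: values above 3 print protocol traffic
    return conn

def open_smtp(host, port=587):
    server = smtplib.SMTP(host, port)
    server.set_debuglevel(1)  # echo the SMTP conversation to stderr
    return server
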
Craig Lamparter cfb172481d more debug around email thread 2021-01-25 12:54:24 -08:00
Craig Lamparter 3ca0eeff56 debug around EmailThread hanging or vanishing 2021-01-25 12:24:20 -08:00
Hemna c1e6792721 Fixed resend email after config rework
This patch fixes one missed access to the shortcuts after
the restructuring of the config file.
2021-01-25 15:15:53 -05:00
Hemna aa290692ab Added flask messages web UI and basic auth
This patch fixes the CTRL-C signal_handler.
This patch also adds the new Messages web UI page
as well as the save URL, both of which sit behind
HTTP basic auth.

The flask web service now has users in the config file
aprsd:
  web:
    users:
      admin: <password>
2021-01-25 11:24:39 -05:00
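A minimal sketch of HTTP basic auth in Flask against a users mapping like the one above (illustrative only; the USERS dict and route are placeholders, not aprsd's admin code):

from functools import wraps

from flask import Flask, Response, request

app = Flask(__name__)
USERS = {"admin": "secret"}  # stand-in for aprsd: web: users: in the config

def requires_auth(view):
    @wraps(view)
    def wrapper(*args, **kwargs):
        auth = request.authorization
        if not auth or USERS.get(auth.username) != auth.password:
            return Response(
                "Login required", 401,
                {"WWW-Authenticate": 'Basic realm="aprsd"'},
            )
        return view(*args, **kwargs)
    return wrapper

@app.route("/messages")
@requires_auth
def messages():
    return "message list goes here"
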
Hemna 0d18e54969 Fixed an issue with LocationPlugin
When calling LocationPlugin with a callsign outside of the US,
the forecast.weather.gov API wasn't raising an exception.  A valid JSON
dict was coming back, but it didn't have the location data we were
expecting.
2021-01-22 16:32:49 -05:00
Hemna 51894bbab8 Cleaned up the KeepAlive output
This patch cleans up the KeepAlive output a bit.
2021-01-22 16:05:48 -05:00
Hemna 8bfdefd5ad updated .gitignore
This patch updates .gitignore to ignore the docs/_build output dir
2021-01-22 15:36:22 -05:00
Walter A. Boring IV f5ae161ab8
Merge pull request #46 from craigerl/healthcheck
Added healthcheck app
2021-01-22 14:30:44 -05:00
Hemna c870207a96 Added healthcheck app
This patch adds the healthcheck app, which uses the flask stats URL
to fetch the internal stats of a running aprsd server.  If the server is
up, the stats are returned and checked for 'healthy' status.  If the URL
fails to return, healthcheck exits with -1, and you can use that exit
status to restart aprsd.

There is also a check against the email thread.  The email thread updates
a deadman's timer every 5 seconds.  If that timer gets older than 5
minutes, healthcheck treats it as a failure and exits with -1.
2021-01-22 12:51:11 -05:00
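A hedged sketch of such a healthcheck (the URL, field name, and thresholds here are placeholders, not aprsd's actual stats schema):

import sys
import time

import requests

STATS_URL = "http://localhost:5000/stats"  # placeholder URL
MAX_EMAIL_AGE = 300  # 5 minutes

def main():
    try:
        stats = requests.get(STATS_URL, timeout=5).json()
    except Exception:
        sys.exit(-1)  # server not answering: unhealthy

    # Placeholder field: epoch seconds of the email thread's last heartbeat.
    last_beat = stats.get("email_thread_last_update", 0)
    if time.time() - last_beat > MAX_EMAIL_AGE:
        sys.exit(-1)  # email thread looks dead

    sys.exit(0)

if __name__ == "__main__":
    main()
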
Walter A. Boring IV f932c203d7
Merge pull request #45 from craigerl/flask
Flask
2021-01-22 08:21:20 -05:00
Hemna cae8746690 Add flask and flask_classful reqs 2021-01-21 21:11:41 -05:00
Hemna 5c949343ec Added Flask web thread and stats collection
This patch adds the stats object to collect statistics of
the running server.  This also optionally adds the ability
to run a flask web service on a port to use as a keepalive
healthcheck.
2021-01-21 20:58:47 -05:00
Hemna 9630279d14 First hack at flask 2021-01-21 15:11:44 -05:00
Walter A. Boring IV c686543323
Merge pull request #44 from craigerl/optional_email
Allow email to be disabled.
2021-01-21 13:53:59 -05:00
Hemna 982f24c5f5 Allow email to be disabled.
The config file now defaults to email being off.  Using email requires
the user to fill in the email settings anyway, so off is the sensible default.
2021-01-21 13:50:19 -05:00
Walter A. Boring IV a656508ba6
Merge pull request #43 from craigerl/rework_config
Reworked the config file and options
2021-01-21 13:36:44 -05:00
Hemna ce5b09233c Reworked the config file and options
This patch reorganizes the config file layout and options
to make more logical sense as well as make it more readable.

This breaks backwards compatibility.
2021-01-21 13:32:19 -05:00
Walter A. Boring IV 2f7c1bfcc1
Merge pull request #41 from craigerl/openweathermap
Added openweathermap weather plugin
2021-01-21 10:10:26 -05:00
Hemna a35cb04ca7 Updated documentation and config output
This patch reformats the sample-config output with more
informative comments for the 3 external services:
openweathermap
opencagedata
avwx-api
2021-01-21 10:05:49 -05:00
Hemna fefb626c97 Fixed extracting lat/lon
This patch fixes an issue when aprs.fi returns a non-error response
that doesn't contain any real entries.
2021-01-20 19:51:59 -05:00
Hemna 2349024539 Added openweathermap weather plugin
This patch adds the openweathermap weather plugin.
Also adds a new config option to set the overall
units setting from imperial (default) to metric.

to change it add the following to the ~/.config/aprsd/aprsd.yaml

...
aprsd:
  units: metric
2021-01-20 16:12:17 -05:00
Craig Lamparter f8c001dc49
Merge pull request #40 from craigerl/new_plugins
Added new time plugins
2021-01-20 08:26:48 -08:00
Hemna fc3a747aa4 Added new time plugins
This patch adds 2 new time plugins to allow admins to use their
opencagedata API key or openweathermap API key to fetch the timezone
from the lat/lon GPS coordinates for the callsign requesting the time.

This enables fetching the time local to the ham radio's last beacon,
rather than the time local to the running aprsd server instance.  If the
location is not found, then the timezone will default to UTC.

The 2 new plugins are
- aprsd.plugins.time.TimeOpenCageDataPlugin
   Fetches timezone from lat/lon using the opencagedata api that can be
   found here:  https://opencagedata.com/dashboard#api-keys

   This requires a new ~/.config/aprsd/aprsd.yml entry to specify the
   api key.
   opencagedata:
       apiKey: <the api key hash here>

- aprsd.plugins.time.TimeOWMPlugin
   Fetches the timezone from lat/lon using the openweathermap api
   that can be found here:  https://home.openweathermap.org/api_keys

   This requires a new ~/.config/aprsd/aprsd.yml entry to specify the
   api key.
   openweathermap:
       apiKey: <the api key hash here>
2021-01-20 10:19:49 -05:00
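A hedged sketch of the OpenCage lookup: it assumes the reverse-geocode response carries an annotations.timezone.name field (per OpenCage's documented annotations); the plugins' real code may differ:

import requests

OPENCAGE_URL = "https://api.opencagedata.com/geocode/v1/json"

def timezone_for(lat, lon, api_key):
    """Return a timezone name like 'America/New_York', or 'UTC' on failure."""
    try:
        resp = requests.get(
            OPENCAGE_URL,
            params={"q": f"{lat},{lon}", "key": api_key},
            timeout=5,
        )
        results = resp.json().get("results", [])
        # Assumed response layout: annotations.timezone.name on the first result.
        return results[0]["annotations"]["timezone"]["name"]
    except Exception:
        return "UTC"  # location not found: default to UTC, as described above
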
Walter A. Boring IV fdd5a6ba41
Merge pull request #38 from craigerl/timezone-fix
Fixed TimePlugin timezone issue
2021-01-19 11:29:59 -05:00
Hemna 9f38fd179e Fixed TimePlugin timezone issue
The existing time plugin had a hard-coded PDT label for the Pacific
timezone, even when that wasn't the actual timezone.   This patch adds real
timezone conversion from UTC to the timezone of the running aprsd server.
This will eventually allow us to use either the timezone of the running
aprsd server and/or the timezone of the calling callsign, if we can get
the timezone string from the location beacon of the caller's callsign.
2021-01-19 11:26:07 -05:00
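The UTC-to-server-local conversion itself is small; a sketch with the standard library (illustrative only, not the plugin's actual code, which goes through a timezone library):

from datetime import datetime, timezone

# Take "now" in UTC and render it in the aprsd server's local timezone.
utc_now = datetime.now(timezone.utc)
local_now = utc_now.astimezone()  # no argument = the running host's timezone
print(local_now.strftime("%H:%M %Z"))
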
Craig Lamparter ca05676c98 remove fortune white space 2021-01-17 08:02:45 -08:00
Craig Lamparter 83f42dd7b7 Merge branch 'master' of https://github.com/craigerl/aprsd 2021-01-17 07:57:10 -08:00
Craig Lamparter 5fb363c9e7 fix git with install.txt 2021-01-17 07:56:59 -08:00
Craig Lamparter 7de2820caa change query char from ? to ! 2021-01-17 07:55:59 -08:00
Walter A. Boring IV 55360ba5d0
Merge pull request #35 from craigerl/aprsd-dev
Added aprsd-dev plugin test cli and WxPlugin
2021-01-17 08:10:06 -05:00
Hemna b9f6fcfa0c Updated readme to include readthedocs link 2021-01-16 10:46:00 -05:00
Hemna cc8fd178ce Added aprsd-dev plugin test cli and WxPlugin
This patch adds a new CLI app called aprsd-dev.  aprsd-dev is
used specifically for developing plugins.  It allows you to run a
plugin directly without the need to run the aprsd server.

This patch also adds the Weather METAR plugin called WxPlugin.
You can use it to fetch the METAR from the nearest station for a callsign
or from a known METAR station id.  Call WxPlugin with a message of
'wx' for the closest METAR station or 'wx KAUN' for the METAR at station KAUN.
2021-01-15 22:30:34 -05:00
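A hedged sketch of fetching a METAR for a known station id. It uses the api.weather.gov latest-observation endpoint as an assumption (not necessarily the data source WxPlugin uses), and the rawMessage field name is likewise assumed:

import requests

def metar_for(station_id):
    """Fetch the latest raw METAR for a station such as 'KAUN'."""
    url = f"https://api.weather.gov/stations/{station_id}/observations/latest"
    resp = requests.get(url, timeout=5, headers={"User-Agent": "metar-sketch"})
    resp.raise_for_status()
    # Assumed field: the raw METAR string under the observation's properties.
    return resp.json().get("properties", {}).get("rawMessage")

print(metar_for("KAUN"))
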
206 changed files with 19564 additions and 3776 deletions

14
.coveragerc Normal file
View File

@ -0,0 +1,14 @@
[run]
include =
aprsd/*
tests/*
*/lib/python*/site-packages/aprsd/*
*/pypy*/site-packages/aprsd/*
*\Lib\site-packages\aprsd\*
branch = 1
[paths]
source = aprsd/
*/lib/python*/site-packages/aprsd/
*/pypy*/site-packages/aprsd/
*\Lib\site-packages\aprsd\

84
.github/workflows/codeql.yml vendored Normal file
View File

@ -0,0 +1,84 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"
on:
push:
branches: [ "master" ]
pull_request:
branches: [ "master" ]
schedule:
- cron: '36 8 * * 0'
jobs:
analyze:
name: Analyze
# Runner size impacts CodeQL analysis time. To learn more, please see:
# - https://gh.io/recommended-hardware-resources-for-running-codeql
# - https://gh.io/supported-runners-and-hardware-resources
# - https://gh.io/using-larger-runners
# Consider using larger runners for possible analysis time improvements.
runs-on: ${{ (matrix.language == 'swift' && 'macos-latest') || 'ubuntu-latest' }}
timeout-minutes: ${{ (matrix.language == 'swift' && 120) || 360 }}
permissions:
# required for all workflows
security-events: write
# only required for workflows in private repositories
actions: read
contents: read
strategy:
fail-fast: false
matrix:
language: [ 'javascript-typescript', 'python' ]
# CodeQL supports [ 'c-cpp', 'csharp', 'go', 'java-kotlin', 'javascript-typescript', 'python', 'ruby', 'swift' ]
# Use only 'java-kotlin' to analyze code written in Java, Kotlin or both
# Use only 'javascript-typescript' to analyze code written in JavaScript, TypeScript or both
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v3
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3
with:
category: "/language:${{matrix.language}}"

53
.github/workflows/manual_build.yml vendored Normal file
View File

@ -0,0 +1,53 @@
name: Manual Build docker container
on:
workflow_dispatch:
inputs:
logLevel:
description: 'Log level'
required: true
default: 'warning'
type: choice
options:
- info
- warning
- debug
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Get Branch Name
id: branch-name
uses: tj-actions/branch-names@v8
- name: Extract Branch
id: extract_branch
run: |
echo "branch=${GITHUB_HEAD_REF:-${GITHUB_REF#refs/heads/}}" >> $GITHUB_OUTPUT
- name: What is the selected branch?
run: |
echo "Selected Branch '${{ steps.extract_branch.outputs.branch }}'"
- name: Setup QEMU
uses: docker/setup-qemu-action@v2
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to Docker HUB
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build the Docker image
uses: docker/build-push-action@v3
with:
context: "{{defaultContext}}:docker"
platforms: linux/amd64,linux/arm64
file: ./Dockerfile
build-args: |
INSTALL_TYPE=github
BRANCH=${{ steps.extract_branch.outputs.branch }}
BUILDX_QEMU_ENV=true
push: true
tags: |
hemna6969/aprsd:${{ steps.extract_branch.outputs.branch }}

63
.github/workflows/master-build.yml vendored Normal file
View File

@ -0,0 +1,63 @@
name: Test and Build Latest Container Image
on:
schedule:
- cron: "0 10 * * *"
push:
branches:
- "**"
tags:
- "v*.*.*"
pull_request:
branches:
- "master"
jobs:
tox:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.9", "3.10", "3.11"]
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox tox-gh>=1.2
- name: Test with tox
run: tox
build:
needs: tox
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Get Branch Name
id: branch-name
uses: tj-actions/branch-names@v8
- name: Setup QEMU
uses: docker/setup-qemu-action@v2
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to Docker HUB
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build the Docker image
uses: docker/build-push-action@v3
with:
context: "{{defaultContext}}:docker"
platforms: linux/amd64,linux/arm64
file: ./Dockerfile
build-args: |
INSTALL_TYPE=github
BRANCH=${{ steps.branch-name.outputs.current_branch }}
BUILDX_QEMU_ENV=true
push: true
tags: |
hemna6969/aprsd:${{ steps.branch-name.outputs.current_branch }}

View File

@ -1,4 +1,4 @@
name: python
name: TOX Test
on: [push, pull_request]
@ -7,7 +7,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [3.6, 3.7, 3.8, 3.9]
python-version: ["3.9", "3.10", "3.11"]
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
@ -17,6 +17,6 @@ jobs:
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox tox-gh-actions
pip install tox tox-gh>=1.2
- name: Test with tox
run: tox

49
.github/workflows/release_build.yml vendored Normal file
View File

@ -0,0 +1,49 @@
name: Build specific version
on:
workflow_dispatch:
inputs:
aprsd_version:
required: true
options:
- 3.0.0
logLevel:
description: 'Log level'
required: true
default: 'warning'
type: choice
options:
- info
- warning
- debug
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Get Branch Name
id: branch-name
uses: tj-actions/branch-names@v8
- name: Setup QEMU
uses: docker/setup-qemu-action@v2
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to Docker HUB
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build the Docker image
uses: docker/build-push-action@v3
with:
context: "{{defaultContext}}:docker"
platforms: linux/amd64,linux/arm64
file: ./Dockerfile
build-args: |
VERSION=${{ inputs.aprsd_version }}
BUILDX_QEMU_ENV=true
push: true
tags: |
hemna6969/aprsd:v${{ inputs.aprsd_version }}
hemna6969/aprsd:latest

5
.gitignore vendored
View File

@ -44,6 +44,7 @@ output/*/index.html
# Sphinx
doc/build
docs/_build
# pbr generates these
AUTHORS
@ -55,3 +56,7 @@ AUTHORS
.*sw?
.ropeproject
.idea
Makefile.venv
# Copilot
.DS_Store

View File

@ -1,11 +1,10 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.4.0
rev: v4.5.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-added-large-files
- id: detect-private-key
- id: check-merge-conflict
- id: check-case-conflict
@ -13,35 +12,11 @@ repos:
- id: check-builtin-literals
- repo: https://github.com/asottile/setup-cfg-fmt
rev: v1.16.0
rev: v2.5.0
hooks:
- id: setup-cfg-fmt
- repo: https://github.com/asottile/add-trailing-comma
rev: v2.0.2
- repo: https://github.com/dizballanze/gray
rev: v0.14.0
hooks:
- id: add-trailing-comma
args: [--py36-plus]
- repo: https://github.com/asottile/pyupgrade
rev: v2.7.4
hooks:
- id: pyupgrade
args:
- --py3-plus
- repo: https://github.com/pre-commit/mirrors-isort
rev: v5.7.0
hooks:
- id: isort
- repo: https://github.com/psf/black
rev: 20.8b1
hooks:
- id: black
- repo: https://gitlab.com/pycqa/flake8
rev: 3.8.4
hooks:
- id: flake8
additional_dependencies: [flake8-bugbear]
- id: gray

23
.readthedocs.yaml Normal file
View File

@ -0,0 +1,23 @@
---
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the version of Python and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# Build documentation in the docs/ directory with Sphinx
sphinx:
configuration: docs/conf.py
# We recommend specifying your dependencies to enable reproducible builds:
# https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: dev-requirements.txt

833
ChangeLog
View File

@ -1,10 +1,843 @@
CHANGES
=======
* Put an upper bound on the QueueHandler queue
v3.4.0
------
* Updated Changelog for 3.4.0
* Change setup.h
* Fixed docker setup.sh comparison
* Fixed unit tests failing with WatchList
* Added config enable\_packet\_logging
* Make all the Objectstore children use the same lock
* Fixed PacketTrack with UnknownPacket
* Removed the requirement on click-completion
* Update Dockerfiles
* Added fox for entry\_points with old python
* Added config for enable\_seen\_list
* Fix APRSDStats start\_time
* Added default\_packet\_send\_count config
* Call packet collecter after prepare during tx
* Added PacketTrack to packet collector
* Webchat Send Beacon uses Path selected in UI
* Added try except blocks in collectors
* Remove error logs from watch list
* Fixed issue with PacketList being empty
* Added new PacketCollector
* Fixed Keepalive access to email stats
* Added support for RX replyacks
* Changed Stats Collector registration
* Added PacketList.set\_maxlen()
* another fix for tx send
* removed Packet.last\_send\_attempt and just use send\_count
* Fix access to PacketList.\_maxlen
* added packet\_count in packet\_list stats
* force uwsgi to 2.0.24
* ismall update
* Added new config optons for PacketList
* Update requirements
* Added threads chart to admin ui graphs
* set packetlist max back to 100
* ensure thread count is updated
* Added threads table in the admin web ui
* Fixed issue with APRSDThreadList stats()
* Added new default\_ack\_send\_count config option
* Remove packet from tracker after max attempts
* Limit packets to 50 in PacketList
* syncronize the add for StatsStore
* Lock on stats for PacketList
* Fixed PacketList maxlen
* Fixed a problem with the webchat tab notification
* Another fix for ACK packets
* Fix issue not tracking RX Ack packets for stats
* Fix time plugin
* add GATE route to webchat along with WIDE1, etc
* Update webchat, include GATE route along with WIDE, ARISS, etc
* Get rid of some useless warning logs
* Added human\_info property to MessagePackets
* Fixed scrolling problem with new webchat sent msg
* Fix some issues with listen command
* Admin interface catch empty stats
* Ensure StatsStore has empty data
* Ensure latest pip is in docker image
* LOG failed requests post to admin ui
* changed admin web\_ip to StrOpt
* Updated prism to 1.29
* Removed json-viewer
* Remove rpyc as a requirement
* Delete more stats from webchat
* Admin UI working again
* Removed RPC Server and client
* Remove the logging of the conf password if not set
* Lock around client reset
* Allow stats collector to serialize upon creation
* Fixed issues with watch list at startup
* Fixed access to log\_monitor
* Got unit tests working again
* Fixed pep8 errors and missing files
* Reworked the stats making the rpc server obsolete
* Update client.py to add consumer in the API
* Fix for sample-config warning
* update requirements
* Put packet.json back in
* Change debug log color
* Fix for filtering curse words
* added packet counter random int
* More packet cleanup and tests
* Show comment in multiline packet output
* Added new config option log\_packet\_format
* Some packet cleanup
* Added new webchat config option for logging
* Fix some pep8 issues
* Completely redo logging of packets!!
* Fixed some logging in webchat
* Added missing packet types in listen command
* Don't call stats so often in webchat
* Eliminated need for from\_aprslib\_dict
* Fix for micE packet decoding with mbits
* updated dev-requirements
* Fixed some tox errors related to mypy
* Refactored packets
* removed print
* small refactor of stats usage in version plugin
* Added type setting on pluging.py for mypy
* Moved Threads list for mypy
* No need to synchronize on stats
* Start to add types
* Update tox for mypy runs
* Bump black from 24.2.0 to 24.3.0
* replaced access to conf from uwsgi
* Fixed call to setup\_logging in uwsgi
* Fixed access to conf.log in logging\_setup
v3.3.2
------
* Changelog for 3.3.2
* Remove warning during sample-config
* Removed print in utils
v3.3.1
------
* Updates for 3.3.1
* Fixed failure with fetch-stats
* Fixed problem with list-plugins
v3.3.0
------
* Changelog for 3.3.0
* sample-config fix
* Fixed registry url post
* Changed processpkt message
* Fixed RegistryThread not sending requests
* use log.setup\_logging
* Disable debug logs for aprslib
* Make registry thread sleep
* Put threads first after date/time
* Replace slow rich logging with loguru
* Updated requirements
* Fixed pep8
* Added list-extensions and updated README.rst
* Change defaults for beacon and registry
* Add log info for Beacon and Registry threads
* fixed frequency\_seconds to IntOpt
* fixed references to conf
* changed the default packet timeout to 5 minutes
* Fixed default service registry url
* fix pep8 failures
* py311 fails in github
* Don't send uptime to registry
* Added sending software string to registry
* add py310 gh actions
* Added the new APRS Registry thread
* Added installing extensions to Docker run
* Cleanup some logs
* Added BeaconPacket
* updated requirements files
* removed some unneeded code
* Added iterator to objectstore
* Added some missing classes to threads
* Added support for loading extensions
* Added location for callsign tabs in webchat
* updated gitignore
* Create codeql.yml
* update github action branchs to v8
* Added Location info on webchat interface
* Updated dev test-plugin command
* Update requirements.txt
* Update for v3.2.3
v3.2.3
------
* Force fortune path during setup test
* added /usr/games to path
* Added fortune to Dockerfile-dev
* Added missing fortune app
* aprsd: main.py: Fix premature return in sample\_config
* Update weather.py because you can't sort icons by penis
* Update weather.py both weather plugins have new Ww regex
* Update weather.py
* Fixed a bug with OWMWeatherPlugin
* Rework Location Plugin
v3.2.2
------
* Update for v3.2.2 release
* Fix for types
* Fix wsgi for prod
* pep8 fixes
* remove python 3.12 from github builds
* Fixed datetime access in core.py
* removed invalid reference to config.py
* Updated requirements
* Reworked the admin graphs
* Test new packet serialization
* Try to localize js libs and css for no internet
* Normalize listen --aprs-login
* Bump werkzeug from 2.3.7 to 3.0.1
* Update INSTALL with new conf files
* Bump urllib3 from 2.0.6 to 2.0.7
v3.2.1
------
* Changelog for 3.2.1
* Update index.html disable form autocomplete
* Update the packet\_dupe\_timeout warning
* Update the webchat paths
* Changed the path option to a ListOpt
* Fixed default path for tcp\_kiss client
* Set a default password for admin
* Fix path for KISS clients
* Added packet\_dupe\_timeout conf
* Add ability to change path on every TX packet
* Make Packet objects hashable
* Bump urllib3 from 2.0.4 to 2.0.6
* Don't process AckPackets as dupes
* Fixed another msgNo int issue
* Fixed issue with packet tracker and msgNO Counter
* Fixed import of Mutablemapping
* pep8 fixes
* rewrote packet\_list and drop dupe packets
* Log a warning on dupe
* Fix for dupe packets
v3.2.0
------
* Update Changelog for 3.2.0
* minor cleanup prior to release
* Webchat: fix input maxlength
* WebChat: cleanup some console.logs
* WebChat: flash a dupe message
* Webchat: Fix issue accessing msg.id
* Webchat: Fix chat css on older browsers
* WebChat: new tab should get focus
* Bump gevent from 23.9.0.post1 to 23.9.1
* Webchat: Fix pep8 errors
* Webchat: Added tab notifications and raw packet
* WebChat: Prevent sending message without callsign
* WebChat: fixed content area scrolling
* Webchat: tweaks to UI for expanding chat
* Webchat: Fixed bug deleteing first tab
* Ensure Keepalive doesn't reset client at startup
* Ensure parse\_delta\_str doesn't puke
* WebChat: Send GPS Beacon working
* webchat: got active tab onclick working
* webchat: set to\_call to value of tab when selected
* Center the webchat input form
* Update index.html to use chat.css
* Deleted webchat mobile pages
* Added close X on webchat tabs
* Reworked webchat with new UI
* Updated the webchat UI to look like iMessage
* Restore previous conversations in webchat
* Remove VIM from Dockerfile
* recreate client during reset()
* updated github workflows
* Updated documentation build
* Removed admin\_web.py
* Removed some RPC server log noise
* Fixed admin page packet date
* RPC Server logs the client IP on failed auth
* Start keepalive thread first
* fixed an issue in the mobile webchat
* Added dupe checkig code to webchat mobile
* click on the div after added
* Webchat suppress to display of dupe messages
* Convert webchat internet urls to local static urls
* Make use of webchat gps config options
* Added new webchat config section
* fixed webchat logging.logformat typeoh
v3.1.3
------
* prep for 3.1.3
* Forcefully allow development webchat flask
v3.1.2
------
* Updated Changelog for 3.1.2
* Added support for ThirdParty packet types
* Disable the Send GPS Beacon button
* Removed adhoc ssl support in webchat
v3.1.1
------
* Updated Changelog for v3.1.1
* Fixed pep8 failures
* re-enable USWeatherPlugin to use mapClick
* Fix sending packets over KISS interface
* Use config web\_ip for running admin ui from module
* remove loop log
* Max out the client reconnect backoff to 5
* Update the Dockerfile
v3.1.0
------
* Changelog updates for v3.1.0
* Use CONF.admin.web\_port for single launch web admin
* Fixed sio namespace registration
* Update Dockerfile-dev to include uwsgi
* Fixed pep8
* change port to 8000
* replacement of flask-socketio with python-socketio
* Change how fetch-stats gets it's defaults
* Ensure fetch-stats ip is a string
* Add info logging for rpc server calls
* updated wsgi config default /config/aprsd.conf
* Added timing after each thread loop
* Update docker bin/admin.sh
* Removed flask-classful from webchat
* Remove flask pinning
* removed linux/arm/v8
* Update master build to include linux/arm/v8
* Update Dockerfile-dev to fix plugin permissions
* update manual build github
* Update requirements for upgraded cryptography
* Added more libs for Dockerfile-dev
* Replace Dockerfile-dev with python3 slim
* Moved logging to log for wsgi.py
* Changed weather plugin regex pattern
* Limit the float values to 3 decimal places
* Fixed rain numbers from aprslib
* Fixed rpc client initialization
* Fix in for aprslib issue #80
* Try and fix Dockerfile-dev
* Fixed pep8 errors
* Populate stats object with threads info
* added counts to the fetch-stats table
* Added the fetch-stats command
* Replace ratelimiter with rush
* Added some utilities to Dockerfile-dev
* add arm64 for manual github build
* Added manual master build
* Update master-build.yml
* Add github manual trigger for master build
* Fixed unit tests for Location plugin
* USe new tox and update githubworkflows
* Updated requirements
* force tox to 4.3.5
* Update github workflows
* Fixed pep8 violation
* Added rpc server for listen
* Update location plugin and reworked requirements
* Fixed .readthedocs.yaml format
* Add .readthedocs.yaml
* Example plugin wrong function
* Ensure conf is imported for threads/tx
* Update Dockerfile to help build cryptography
v3.0.3
------
* Update Changelog to 3.0.3
* cleanup some debug messages
* Fixed loading of plugins for server
* Don't load help plugin for listen command
* Added listen args
* Change listen command plugins
* Added listen.sh for docker
* Update Listen command
* Update Dockerfile
* Add ratelimiting for acks and other packets
v3.0.2
------
* Update Changelog for 3.0.2
* Import RejectPacket
v3.0.1
------
* 3.0.1
* Add support to Reject messages
* Update Docker builds for 3.0.0
v3.0.0
------
* Update Changelog for 3.0.0
* Ensure server command main thread doesn't exit
* Fixed save directory default
* Fixed pep8 failure
* Cleaned up KISS interfaces use of old config
* reworked usage of importlib.metadata
* Added new docs files for 3.0.0
* Removed url option from healthcheck in dev
* Updated Healthcheck to use rpc to call aprsd
* Updated docker/bin/run.sh to use new conf
* Added ObjectPacket
* Update regex processing and regex for plugins
* Change ordering of starting up of server command
* Update documentation and README
* Decouple admin web interface from server command
* Dockerfile now produces aprsd.conf
* Fix some unit tests and loading of CONF w/o file
* Added missing conf
* Removed references to old custom config
* Convert config to oslo\_config
* Added rain formatting unit tests to WeatherPacket
* Fix Rain reporting in WeatherPacket send
* Removed Packet.send()
* Removed watchlist plugins
* Fix PluginManager.get\_plugins
* Cleaned up PluginManager
* Cleaned up PluginManager
* Update routing for weatherpacket
* Fix some WeatherPacket formatting
* Fix pep8 violation
* Add packet filtering for aprsd listen
* Added WeatherPacket encoding
* Updated webchat and listen for queue based RX
* reworked collecting and reporting stats
* Removed unused threading code
* Change RX packet processing to enqueu
* Make tracking objectstores work w/o initializing
* Cleaned up packet transmit class attributes
* Fix packets timestamp to int
* More messaging -> packets cleanup
* Cleaned out all references to messaging
* Added contructing a GPSPacket for sending
* cleanup webchat
* Reworked all packet processing
* Updated plugins and plugin interfaces for Packet
* Started using dataclasses to describe packets
v2.6.1
------
* v2.6.1
* Fixed position report for webchat beacon
* Try and fix broken 32bit qemu builds on 64bit system
* Add unit tests for webchat
* remove armv7 build RUST sucks
* Fix for Collections change in 3.10
v2.6.0
------
* Update workflow again
* Update Dockerfile to 22.04
* Update Dockerfile and build.sh
* Update workflow
* Prep for 2.6.0 release
* Update requirements
* Removed Makefile comment
* Update Makefile for dev vs. run environments
* Added pyopenssl for https for webchat
* change from device-detector to user-agents
* Remove twine from dev-requirements
* Update to latest Makefile.venv
* Refactored threads a bit
* Mark packets as acked in MsgTracker
* remove dev setting for template
* Add GPS beacon to mobile page
* Allow werkzeug for admin interface
* Allow werkzeug for admin interface
* Add support for mobile browsers for webchat
* Ignore callsign case while processing packets
* remove linux/arm/v7 for official builds for now
* added workflow for building specific version
* Allow passing in version to the Dockerfile
* Send GPS Beacon from webchat interface
* specify Dockerfile-dev
* Fixed build.sh
* Build on the source not released aprsd
* Remove email validation
* Add support for building linux/arm/v7
* Remove python 3.7 from docker build github
* Fixed failing unit tests
* change github workflow
* Removed TimeOpenCageDataPlugin
* Dump config with aprsd dev test-plugin
* Updated requirements
* Got webchat working with KISS tcp
* Added click auto\_envvar\_prefix
* Update aprsd thread base class to use queue
* Update packets to use wrapt
* Add remving existing requirements
* Try sending raw APRSFrames to aioax25
* Use new aprsd.callsign as the main callsign
* Fixed access to threads refactor
* Added webchat command
* Moved log.py to logging
* Moved trace.py to utils
* Fixed pep8 errors
* Refactored threads.py
* Refactor utils to directory
* remove arm build for now
* Added rustc and cargo to Dockerfile
* remove linux/arm/v6 from docker platform build
* Only tag master build as master
* Remove docker build from test
* create master-build.yml
* Added container build action
* Update docs on using Docker
* Update dev-requirements pip-tools
* Fix typo in docker-compose.yml
* Fix PyPI scraping
* Allow web interface when running in Docker
* Fix typo on exception
* README formatting fixes
* Bump dependencies to fix python 3.10
* Fixed up config option checking for KISS
* Fix logging issue with log messages
* for 2.5.9
v2.5.9
------
* FIX: logging exceptions
* Updated build and run for rich lib
* update build for 2.5.8
v2.5.8
------
* For 2.5.8
* Removed debug code
* Updated list-plugins
* Renamed virtualenv dir to .aprsd-venv
* Added unit tests for dev test-plugin
* Send Message command defaults to config
v2.5.7
------
* Updated Changelog
* Fixed an KISS config disabled issue
* Fixed a bug with multiple notify plugins enabled
* Unify the logging to file and stdout
* Added new feature to list-plugins command
* more README.rst cleanup
* Updated README examples
v2.5.6
------
* Changelog
* Tightened up the packet logging
* Added unit tests for USWeatherPlugin, USMetarPlugin
* Added test\_location to test LocationPlugin
* Updated pytest output
* Added py39 to tox for tests
* Added NotifyPlugin unit tests and more
* Small cleanup on packet logging
* Reduced the APRSIS connection reset to 2 minutes
* Fixed the NotifyPlugin
* Fixed some pep8 errors
* Add tracing for dev command
* Added python rich library based logging
* Added LOG\_LEVEL env variable for the docker
v2.5.5
------
* Update requirements to use aprslib 0.7.0
* fixed the failure during loading for objectstore
* updated docker build
v2.5.4
------
* Updated Changelog
* Fixed dev command missing initialization
v2.5.3
------
* Fix admin logging tab
v2.5.2
------
* Added new list-plugins command
* Don't require check-version command to have a config
* Healthcheck command doesn't need the aprsd.yml config
* Fix test failures
* Removed requirement for aprs.fi key
* Updated Changelog
v2.5.1
------
* Removed stock plugin
* Removed the stock plugin
v2.5.0
------
* Updated for v2.5.0
* Updated Dockerfile's and build script for docker
* Cleaned up some verbose output & colorized output
* Reworked all the common arguments
* Fixed test-plugin
* Ensure common params are honored
* pep8
* Added healthcheck to the cmds
* Removed the need for FROMCALL in dev test-plugin
* Pep8 failures
* Refactor the cli
* Updated Changelog for 4.2.3
* Fixed a problem with send-message command
v2.4.2
------
* Updated Changelog
* Be more careful picking data to/from disk
* Updated Changelog
v2.4.1
------
* Ensure plugins are last to be loaded
* Fixed email connecting to smtp server
v2.4.0
------
* Updated Changelog for 2.4.0 release
* Converted MsgTrack to ObjectStoreMixin
* Fixed unit tests
* Make sure SeenList update has a from in packet
* Ensure PacketList is initialized
* Added SIGTERM to signal\_handler
* Enable configuring where to save the objectstore data
* PEP8 cleanup
* Added objectstore Mixin
* Added -num option to aprsd-dev test-plugin
* Only call stop\_threads if it exists
* Added new SeenList
* Added plugin version to stats reporting
* Added new HelpPlugin
* Updated aprsd-dev to use config for logfile format
* Updated build.sh
* removed usage of config.check\_config\_option
* Fixed send-message after config/client rework
* Fixed issue with flask config
* Added some server startup info logs
* Increase email delay to +10
* Updated dev to use plugin manager
* Fixed notify plugins
* Added new Config object
* Fixed email plugin's use of globals
* Refactored client classes
* Refactor utils usage
* 2.3.1 Changelog
v2.3.1
------
* Fixed issue of aprs-is missing keepalive
* Fixed packet processing issue with aprsd send-message
v2.3.0
------
* Prep 2.3.0
* Enable plugins to return message object
* Added enabled flag for every plugin object
* Ensure plugin threads are valid
* Updated Dockerfile to use v2.3.0
* Removed fixed size on logging queue
* Added Logfile tab in Admin ui
* Updated Makefile clean target
* Added self creating Makefile help target
* Update dev.py
* Allow passing in aprsis\_client
* Fixed a problem with the AVWX plugin not working
* Remove some noisy trace in email plugin
* Fixed issue at startup with notify plugin
* Fixed email validation
* Removed values from forms
* Added send-message to the main admin UI
* Updated requirements
* Cleaned up some pep8 failures
* Upgraded the send-message POC to use websockets
* New Admin ui send message page working
* Send Message via admin Web interface
* Updated Admin UI to show KISS connections
* Got TX/RX working with aioax25+direwolf over TCP
* Rebased from master
* Added the ability to use direwolf KISS socket
* Update Dockerfile to use 2.2.1
v2.2.1
------
* Update Changelog for 2.2.1
* Silence some log noise
v2.2.0
------
* Updated Changelog for v2.2.0
* Updated overview image
* Removed Black code style reference
* Removed TXThread
* Added days to uptime string formatting
* Updated select timeouts
* Rebase from master and run gray
* Added tracking plugin processing
* Added threads functions to APRSDPluginBase
* Refactor Message processing and MORE
* Use Gray instead of Black for code formatting
* Updated tox.ini
* Fixed LOG.debug issue in weather plugin
* Updated slack channel link
* Cleanup of the README.rst
* Fixed aprsd-dev
v2.1.0
------
* Prep for v2.1.0
* Enable multiple replies for plugins
* Put in a fix for aprslib parse exceptions
* Fixed time plugin
* Updated the charts Added the packets chart
* Added showing symbol images to watch list
v2.0.0
------
* Updated docs for 2.0.0
* Reworked the notification threads and admin ui
* Fixed small bug with packets get\_packet\_type
* Updated overview images
* Move version string output to top of log
* Add new watchlist feature
* Fixed the Ack thread not resending acks
* reworked the admin ui to use semenatic ui more
* Added messages count to admin messages list
* Add admin UI tabs for charts, messages, config
* Removed a noisy debug log
* Dump out the config during startup
* Added message counts for each plugin
* Bump urllib3 from 1.26.4 to 1.26.5
* Added aprsd version checking
* Updated INSTALL.txt
* Update my callsign
* Update README.rst
* Update README.rst
* Bump urllib3 from 1.26.3 to 1.26.4
* Prep for v1.6.1 release
v1.6.1
------
* Removed debug log for KeepAlive thread
* ignore Makefile.venv
* Reworked Makefile to use Makefile.venv
* Fixed version unit tests
* Updated stats output for KeepAlive thread
* Update Dockerfile-dev to work with startup
* Force all the graphs to 0 minimum
* Added email messages graphs
* Reworked the stats dict output and healthcheck
* Added callsign to the web index page
* Added log config for flask and lnav config file
* Added showing APRS-IS server to stats
* Provide an initial datapoint on rendering index
* Make the index page behind auth
* Bump pygments from 2.7.3 to 2.7.4
* Added acks with messages graphs
* Updated web stats index to show messages and ram usage
* Added aprsd web index page
* Bump lxml from 4.6.2 to 4.6.3
* Bump jinja2 from 2.11.2 to 2.11.3
* Bump urllib3 from 1.26.2 to 1.26.3
* Added log format and dateformat to config file
* Added Dockerfile-dev and updated build.sh
* Require python 3.7 and >
* Added plugin live reload and StockPlugin
* Updated Dockerfile and build.sh
* Updated Dockerfile for multiplatform builds
* Updated Dockerfile for multiplatform builds
* Dockerfile: Make creation of /config quiet failure
* Updated README docs
v1.6.0
------
* 1.6.0 release prep
* Updated path of run.sh for docker build
* Moved docker related stuffs to docker dir
* Removed some noisy debug log
* Bump cryptography from 3.3.1 to 3.3.2
* Wrap another server call with try except
* Wrap all imap calls with try except blocks
* Bump bleach from 3.2.1 to 3.3.0
* EmailThread was exiting because of IMAP timeout, added exceptions for this
* Added memory tracing in keeplive
* Fixed tox pep8 failure for trace
* Added tracing facility
* Fixed email login issue
* duplicate email messages from RF would generate usage response
* Enable debug logging for smtp and imap
* more debug around email thread
* debug around EmailThread hanging or vanishing
* Fixed resend email after config rework
* Added flask messages web UI and basic auth
* Fixed an issue with LocationPlugin
* Cleaned up the KeepAlive output
* updated .gitignore
* Added healthcheck app
* Add flask and flask\_classful reqs
* Added Flask web thread and stats collection
* First hack at flask
* Allow email to be disabled
* Reworked the config file and options
* Updated documentation and config output
* Fixed extracting lat/lon
* Added openweathermap weather plugin
* Added new time plugins
* Fixed TimePlugin timezone issue
* remove fortune white space
* fix git with install.txt
* change query char from ? to !
* Updated readme to include readthedocs link
* Added aprsd-dev plugin test cli and WxPlugin
v1.5.1
------
* Updated Changelog for v1.5.1
* Updated README to fix pypi page
* Update INSTALL.txt
v1.5.0
------

View File

@ -1,42 +0,0 @@
FROM alpine:latest as aprsd
ENV VERSION=1.0.0
ENV APRS_USER=aprs
ENV HOME=/home/aprs
ENV VIRTUAL_ENV=$HOME/.venv3
ENV INSTALL=$HOME/install
RUN apk add --update git wget py3-pip py3-virtualenv bash fortune
# Setup Timezone
ENV TZ=US/Eastern
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN apt-get install -y tzdata
RUN dpkg-reconfigure --frontend noninteractive tzdata
RUN addgroup --gid 1000 $APRS_USER
RUN adduser -h $HOME -D -u 1001 -G $APRS_USER $APRS_USER
ENV LC_ALL=C.UTF-8
ENV LANG=C.UTF-8
USER $APRS_USER
RUN pip3 install wheel
RUN python3 -m venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
RUN echo "export PATH=\$PATH:\$HOME/.local/bin" >> $HOME/.bashrc
VOLUME ["/config", "/plugins"]
WORKDIR $HOME
RUN pip install aprsd
USER root
RUN aprsd sample-config > /config/aprsd.yml
RUN chown -R $APRS_USER:$APRS_USER /config
# override this to run another configuration
ENV CONF default
USER $APRS_USER
ADD build/bin/run.sh $HOME/
ENTRYPOINT ["/home/aprs/run.sh"]


@ -1,48 +0,0 @@
FROM alpine:latest as aprsd
# Dockerfile for building a container during aprsd development.
ENV VERSION=1.0.0
ENV APRS_USER=aprs
ENV HOME=/home/aprs
ENV APRSD=http://github.com/craigerl/aprsd.git
ENV APRSD_BRANCH="v1.1.0"
ENV VIRTUAL_ENV=$HOME/.venv3
ENV INSTALL=$HOME/install
RUN apk add --update git vim wget py3-pip py3-virtualenv bash fortune
# Setup Timezone
ENV TZ=US/Eastern
#RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
#RUN apt-get install -y tzdata
#RUN dpkg-reconfigure --frontend noninteractive tzdata
RUN addgroup --gid 1001 $APRS_USER
RUN adduser -h $HOME -D -u 1001 -G $APRS_USER $APRS_USER
ENV LC_ALL=C.UTF-8
ENV LANG=C.UTF-8
USER $APRS_USER
RUN pip3 install wheel
RUN python3 -m venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
RUN echo "export PATH=\$PATH:\$HOME/.local/bin" >> $HOME/.bashrc
VOLUME ["/config", "/plugins"]
WORKDIR $HOME
RUN mkdir $INSTALL
RUN git clone -b $APRSD_BRANCH $APRSD $INSTALL/aprsd
RUN cd $INSTALL/aprsd && pip3 install .
RUN which aprsd
USER root
RUN aprsd sample-config > /config/aprsd.yml
RUN chown -R $APRS_USER:$APRS_USER /config
# override this to run another configuration
ENV CONF default
USER $APRS_USER
ADD build/bin/run.sh $HOME/
ENTRYPOINT ["/home/aprs/run.sh"]


@ -1,24 +1,37 @@
# installation instructions for the short-attention-span user like me
mkdir -p ~/.aprsd
# First off the easiest way is to use the official pypi release.
# To use the official release:
pip install aprsd
# For developers, there are a few ways:
# The EASY way? Use the makefile:
git clone https://github.com/craigerl/aprsd.git
cd aprsd
make dev
source .venv/bin/activate
# The HARD way?
cd ~
virtualenv .venv_aprsd
cd .venv_aprsd/
source ./bin/activate
sudo apt get install virtualenv
virtualenv ~/.venv_aprsd
source ~/.venv_aprsd/bin/activate
mkdir ~/aprsd2
cd ~/aprsd2
git clone https://github.com/craigerl/aprsd.git
cd aprsd
pip install -e .
cd ~/.venv_aprsd/bin
./aprsd sample-config # generates a config.yml template
vi ~/.config/aprsd/config.yml # copy/edit config here
# CONFIGURE
# Now configure aprsd HERE
mkdir -p ~/.config/aprsd
./aprsd sample-config > ~/.config/aprsd/aprsd.conf # generates a config template
./aprsd server
vi ~/.config/aprsd/aprsd.conf # copy/edit config here
aprsd server
# profit! #

Makefile

@ -1,58 +1,87 @@
.PHONY: virtual dev build-requirements black isort flake8
WORKDIR?=.
VENVDIR ?= $(WORKDIR)/.aprsd-venv
all: pip dev
.DEFAULT_GOAL := help
virtual: .venv/bin/pip # Creates an isolated python 3 environment
.PHONY: dev docs server test
.venv/bin/pip:
virtualenv -p /usr/bin/python3 .venv
include Makefile.venv
Makefile.venv:
curl \
-o Makefile.fetched \
-L "https://raw.githubusercontent.com/sio/Makefile.venv/master/Makefile.venv"
echo " fb48375ed1fd19e41e0cdcf51a4a0c6d1010dfe03b672ffc4c26a91878544f82 *Makefile.fetched" \
| sha256sum --check - \
&& mv Makefile.fetched Makefile.venv
.venv/bin/aprsd: virtual
test -s .venv/bin/aprsd || .venv/bin/pip install -q -e .
help: # Help for the Makefile
@egrep -h '\s##\s' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-20s\033[0m %s\n", $$1, $$2}'
install: .venv/bin/aprsd
.venv/bin/pip install -Ur requirements.txt
dev: REQUIREMENTS_TXT = requirements.txt requirements-dev.txt
dev: venv ## Create a python virtual environment for development of aprsd
dev-pre-commit:
test -s .git/hooks/pre-commit || .venv/bin/pre-commit install
run: venv ## Create a virtual environment for running aprsd commands
dev-requirements:
test -s .venv/bin/twine || .venv/bin/pip install -q -r dev-requirements.txt
docs: dev
cp README.rst docs/readme.rst
cp Changelog docs/changelog.rst
tox -edocs
pip: virtual
.venv/bin/pip install -q -U pip
clean: clean-build clean-pyc clean-test clean-dev ## remove all build, test, coverage and Python artifacts
dev: pip .venv/bin/aprsd dev-requirements dev-pre-commit
clean-build: ## remove build artifacts
rm -fr build/
rm -fr dist/
rm -fr .eggs/
find . -name '*.egg-info' -exec rm -fr {} +
find . -name '*.egg' -exec rm -f {} +
pip-tools:
test -s .venv/bin/pip-compile || .venv/bin/pip install pip-tools
clean-pyc: ## remove Python file artifacts
find . -name '*.pyc' -exec rm -f {} +
find . -name '*.pyo' -exec rm -f {} +
find . -name '__pycache__' -exec rm -fr {} +
clean:
rm -rf dist/*
rm -rf .venv
clean-test: ## remove test and coverage artifacts
rm -fr .tox/
rm -f .coverage
rm -fr htmlcov/
rm -fr .pytest_cache
test: dev
.venv/bin/pre-commit run --all-files
clean-dev:
rm -rf $(VENVDIR)
rm Makefile.venv
test: dev ## Run all the tox tests
tox -p all
build: test
rm -rf dist/*
.venv/bin/python3 setup.py sdist bdist_wheel
.venv/bin/twine check dist/*
build: test ## Make the build artifact prior to doing an upload
$(VENV)/pip install twine
$(VENV)/python3 -m build
$(VENV)/twine check dist/*
upload: build
.venv/bin/twine upload dist/*
upload: build ## Upload a new version of the plugin
$(VENV)/twine upload dist/*
update-requirements: dev pip-tools
.venv/bin/pip-compile -q -U requirements.in
.venv/bin/pip-compile -q -U dev-requirements.in
.venv/bin/tox: # install tox
test -s .venv/bin/tox || .venv/bin/pip install -q -U tox
check: .venv/bin/tox # Code format check with isort and black
check: dev ## Code format check with tox and pep8
tox -efmt-check
tox -epep8
fix: .venv/bin/tox # fixes code formatting with isort and black
fix: dev ## fixes code formatting with gray
tox -efmt
server: venv ## Create the virtual environment and run aprsd server --loglevel DEBUG
$(VENV)/aprsd server --loglevel DEBUG
docker: test ## Make a docker container tagged with hemna6969/aprsd:latest
docker build -t hemna6969/aprsd:latest -f docker/Dockerfile docker
docker-dev: test ## Make a development docker container tagged with hemna6969/aprsd:master
docker build -t hemna6969/aprsd:master -f docker/Dockerfile-dev docker
update-requirements: dev ## Update the requirements.txt and dev-requirements.txt files
rm requirements.txt
rm requirements-dev.txt
touch requirements.txt
touch requirements-dev.txt
$(VENV)/pip-compile --resolver backtracking --annotation-style=line requirements.in
$(VENV)/pip-compile --resolver backtracking --annotation-style=line requirements-dev.in


@ -1,144 +1,93 @@
=====
APRSD
=====
===============================================
APRSD - Ham radio APRS-IS Message plugin server
===============================================
.. image:: https://badge.fury.io/py/aprsd.svg
:target: https://badge.fury.io/py/aprsd
KM6LYW and WB4BOR
____________________
.. image:: https://github.com/craigerl/aprsd/workflows/python/badge.svg
:target: https://github.com/craigerl/aprsd/actions
|pypi| |pytest| |versions| |slack| |issues| |commit| |imports| |down|
.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://black.readthedocs.io/en/stable/
.. image:: https://img.shields.io/badge/%20imports-isort-%231674b1?style=flat&labelColor=ef8336
:target: https://timothycrosley.github.io/isort/
.. image:: https://img.shields.io/github/issues/craigerl/aprsd
.. image:: https://img.shields.io/github/last-commit/craigerl/aprsd
.. image:: https://static.pepy.tech/personalized-badge/aprsd?period=month&units=international_system&left_color=black&right_color=orange&left_text=Downloads
:target: https://pepy.tech/project/aprsd
.. contents:: :local:
`APRSD <http://github.com/craigerl/aprsd>`_ is a Ham radio `APRS <http://aprs.org>`_ message command gateway built on python.
APRSD listens on the amateur radio APRS-IS network for messages and responds to them.
It has a plugin architecture for extensibility. Users of APRSD can write their own
plugins that can respond to APRS-IS messages.
You must have an amateur radio callsign to use this software. APRSD gets
messages for the configured HAM callsign, and sends those messages to a
list of plugins for processing. There is a set of core plugins that respond
to messages to check email, get location, ping, report the time of day, get
the weather, and tell fortunes, as well as report version information for
aprsd itself.
What is APRSD
=============
APRSD is a python application for interacting with the APRS network and providing
APRS services for HAM radio operators.
APRSD currently has 4 main commands to use.
* server - Connect to APRS and listen/respond to APRS messages
* webchat - web based chat program over APRS
* send-message - Send a message to a callsign via APRS_IS.
* listen - Listen to packets on the APRS-IS Network based on FILTER.
Each of those commands can connect to the APRS-IS network if internet connectivity
is available. If internet is not available, then APRSD can be configured to talk
to a TCP KISS TNC for radio connectivity.
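For example, a KISS-over-TCP setup might look roughly like this in
``~/.config/aprsd/aprsd.conf`` (a sketch only: the ``[aprs_network]`` and
``[kiss_tcp]`` group and option names come from the client code in this
changeset, while the host and port values are placeholders)::

    [aprs_network]
    enabled = false

    [kiss_tcp]
    enabled = true
    host = 192.168.1.100
    port = 8001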
Please `read the docs`_ to learn more!
APRSD Overview Diagram
----------------------
======================
.. image:: https://raw.githubusercontent.com/craigerl/aprsd/master/docs/_static/aprsd_overview.svg?sanitize=true
Typical use case
================
APRSD's typical use case is that of providing an APRS wide service to all HAM
radio operators. For example the callsign 'REPEAT' on the APRS network is actually
an instance of APRSD that can provide a list of HAM repeaters in the area of the
callsign that sent the message.
Ham radio operator using an APRS enabled HAM radio sends a message to check
the weather. an APRS message is sent, and then picked up by APRSD. The
the weather. An APRS message is sent, and then picked up by APRSD. The
APRS packet is decoded, and the message is sent through the list of plugins
for processing. For example, the WeatherPlugin picks up the message, fetches the weather
for the area around the user who sent the request, and then responds with
the weather conditions in that area.
the weather conditions in that area. APRSD also includes a watch list of HAM
callsigns to look out for. The watch list can notify you when a HAM callsign
in the list is seen and is now available to message on the APRS network.
APRSD Capabilities
==================
* server - The main aprsd server processor. Send/Rx APRS messages to HAM callsign
* send-message - use aprsd to send a command/message to aprsd server. Used for development testing
* sample-config - generate a sample aprsd.yml config file for use/editing
* bash completion generation. Uses python click bash completion to generate completion code for your .bashrc/.zshrc
List of core server plugins
===========================
Plugins function by specifying a regex that is searched for in the APRS message.
If it matches, the plugin runs; if the regex doesn't match, the plugin is skipped
(a minimal plugin sketch follows the list below).
* EmailPlugin - Check email and reply with contents. Have to configure IMAP and SMTP settings in aprs.yml
* FortunePlugin - Replies with old unix fortune random fortune!
* LocationPlugin - Checks location of ham operator
* PingPlugin - Sends pong with timestamp
* QueryPlugin - Allows querying the list of delayed messages that were not ACK'd by radio
* TimePlugin - Current time of day
* WeatherPlugin - Get weather conditions for current location of HAM callsign
* VersionPlugin - Reports the version information for aprsd
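As a rough illustration of the plugin mechanism described above, a plugin is a
class that declares the regex it wants and a handler for matching packets. The
sketch below is hypothetical (it is not one of the core plugins) and assumes the
``APRSDRegexCommandPluginBase`` base class and ``process()`` hook from
``aprsd.plugin``; check the plugin module for the exact API::

    # hello.py -- minimal, hypothetical plugin sketch
    from aprsd import plugin


    class HelloPlugin(plugin.APRSDRegexCommandPluginBase):
        # message text that triggers this plugin
        command_regex = "^[hH]ello"
        command_name = "hello"

        def process(self, packet):
            # The returned string is sent back to the calling station.
            return f"Hello {packet.from_call}!"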
Current messages this will respond to:
======================================
::
APRS messages:
l(ocation) [callsign] = descriptive current location of your radio
8 Miles E Auburn CA 1673' 39.92150,-120.93950 0.1h ago
w(eather) = weather forecast for your radio's current position
58F(58F/46F) Partly Cloudy. Tonight, Heavy Rain.
t(ime) = respond with the current time
f(ortune) = respond with a short fortune
-email_addr email text = send an email, say "mapme" to send a current position/map
-2 = resend the last 2 emails from your imap inbox to this radio
p(ing) = respond with Pong!/time
v(ersion) = Respond with current APRSD Version string
anything else = respond with usage
Meanwhile this code will monitor a single imap mailbox and forward email
to your BASECALLSIGN over the air. Only radios using the BASECALLSIGN are allowed
to send email, so consider this security risk before using this (or Amateur radio in
general). Email is single user at this time.
There are additional parameters in the code (sorry), so be sure to set your
email server and the associated logins and passwords. Search for "yourdomain" and
"password". Search for "shortcuts" to set up email aliases as well.
Installation:
Installation
=============
pip install aprsd
To install ``aprsd``, use Pip:
Example usage:
``pip install aprsd``
Example usage
==============
aprsd -h
``aprsd -h``
Help
====
::
└─[$] > aprsd -h
└─> aprsd -h
Usage: aprsd [OPTIONS] COMMAND [ARGS]...
Shell completion for click-completion-command Available shell types:
bash Bourne again shell fish Friendly interactive shell
powershell Windows PowerShell zsh Z shell Default type: auto
Options:
--version Show the version and exit.
-h, --help Show this message and exit.
Commands:
install Install the click-completion-command completion
sample-config This dumps the config to stdout.
check-version Check this version against the latest in pypi.org.
completion Click Completion subcommands
dev Development type subcommands
healthcheck Check the health of the running aprsd server.
list-plugins List the built in plugins available to APRSD.
listen Listen to packets on the APRS-IS Network based on FILTER.
sample-config Generate a sample Config file from aprsd and all...
send-message Send a message to a callsign via APRS_IS.
server Start the aprsd server process.
show Show the click-completion-command completion code
server Start the aprsd server gateway process.
version Show the APRSD version.
webchat Web based HAM Radio chat program!
@ -148,51 +97,14 @@ Commands
Configuration
=============
This command outputs a sample config yml formatted block that you can edit
and use to pass in to aprsd with -c. By default aprsd looks in ~/.config/aprsd/aprsd.yml
and use to pass in to ``aprsd`` with ``-c``. By default aprsd looks in ``~/.config/aprsd/aprsd.yml``
aprsd sample-config
``aprsd sample-config``
Output
======
::
└─[$] > aprsd sample-config
aprs:
host: rotate.aprs.net
logfile: /tmp/arsd.log
login: someusername
password: somepassword
port: 14580
aprsd:
enabled_plugins:
- aprsd.plugin.EmailPlugin
- aprsd.plugin.FortunePlugin
- aprsd.plugin.LocationPlugin
- aprsd.plugin.PingPlugin
- aprsd.plugin.TimePlugin
- aprsd.plugin.WeatherPlugin
- aprsd.plugin.VersionPlugin
plugin_dir: ~/.config/aprsd/plugins
ham:
callsign: KFART
imap:
host: imap.gmail.com
login: imapuser
password: something here too
port: 993
use_ssl: true
shortcuts:
aa: 5551239999@vtext.com
cl: craiglamparter@somedomain.org
wb: 555309@vtext.com
smtp:
host: imap.gmail.com
login: something
password: some lame password
port: 465
use_ssl: false
└─> aprsd sample-config
...
server
======
@ -203,30 +115,85 @@ look for incomming commands to the callsign configured in the config file
::
└─[$] > aprsd server --help
Usage: aprsd server [OPTIONS]
Usage: aprsd server [OPTIONS]
Start the aprsd server process.
Start the aprsd server gateway process.
Options:
--loglevel [CRITICAL|ERROR|WARNING|INFO|DEBUG]
The log level to use for aprsd.log
[default: DEBUG]
Options:
--loglevel [CRITICAL|ERROR|WARNING|INFO|DEBUG]
The log level to use for aprsd.log
[default: INFO]
-c, --config TEXT The aprsd config file to use for options.
[default:
/Users/i530566/.config/aprsd/aprsd.yml]
--quiet Don't log to stdout
-f, --flush Flush out all old aged messages on disk.
[default: False]
-h, --help Show this message and exit.
--quiet Don't log to stdout
--disable-validation Disable email shortcut validation. Bad
email addresses can result in broken email
responses!!
-c, --config TEXT The aprsd config file to use for options.
[default: ~/.config/aprsd/aprsd.yml]
-h, --help Show this message and exit.
(.venv3) ┌─[waboring@dl360-1] - [~/devel/aprsd] - [Sun Dec 20, 12:32] -
└─[$] <git:(master*)> aprsd server
└─> aprsd server
Load config
[12/20/2020 12:33:03 PM] [MainThread ] [INFO ] APRSD Started version: 1.0.2
[12/20/2020 12:33:03 PM] [MainThread ] [INFO ] Checking IMAP configuration
[12/20/2020 12:33:04 PM] [MainThread ] [INFO ] Checking SMTP configuration
12/07/2021 03:16:17 PM MainThread INFO APRSD is up to date server.py:51
12/07/2021 03:16:17 PM MainThread INFO APRSD Started version: 2.5.6 server.py:52
12/07/2021 03:16:17 PM MainThread INFO Using CONFIG values: server.py:55
12/07/2021 03:16:17 PM MainThread INFO ham.callsign = WB4BOR server.py:60
12/07/2021 03:16:17 PM MainThread INFO aprs.login = WB4BOR-12 server.py:60
12/07/2021 03:16:17 PM MainThread INFO aprs.password = XXXXXXXXXXXXXXXXXXX server.py:58
12/07/2021 03:16:17 PM MainThread INFO aprs.host = noam.aprs2.net server.py:60
12/07/2021 03:16:17 PM MainThread INFO aprs.port = 14580 server.py:60
12/07/2021 03:16:17 PM MainThread INFO aprs.logfile = /tmp/aprsd.log server.py:60
Current list of built-in plugins
======================================
::
└─> aprsd list-plugins
🐍 APRSD Built-in Plugins 🐍
┏━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Plugin Name ┃ Info ┃ Type ┃ Plugin Path ┃
┡━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ AVWXWeatherPlugin │ AVWX weather of GPS Beacon location │ RegexCommand │ aprsd.plugins.weather.AVWXWeatherPlugin │
│ EmailPlugin │ Send and Receive email │ RegexCommand │ aprsd.plugins.email.EmailPlugin │
│ FortunePlugin │ Give me a fortune │ RegexCommand │ aprsd.plugins.fortune.FortunePlugin │
│ LocationPlugin │ Where in the world is a CALLSIGN's last GPS beacon? │ RegexCommand │ aprsd.plugins.location.LocationPlugin │
│ NotifySeenPlugin │ Notify me when a CALLSIGN is recently seen on APRS-IS │ WatchList │ aprsd.plugins.notify.NotifySeenPlugin │
│ OWMWeatherPlugin │ OpenWeatherMap weather of GPS Beacon location │ RegexCommand │ aprsd.plugins.weather.OWMWeatherPlugin │
│ PingPlugin │ reply with a Pong! │ RegexCommand │ aprsd.plugins.ping.PingPlugin │
│ QueryPlugin │ APRSD Owner command to query messages in the MsgTrack │ RegexCommand │ aprsd.plugins.query.QueryPlugin │
│ TimeOWMPlugin │ Current time of GPS beacon's timezone. Uses OpenWeatherMap │ RegexCommand │ aprsd.plugins.time.TimeOWMPlugin │
│ TimePlugin │ What is the current local time. │ RegexCommand │ aprsd.plugins.time.TimePlugin │
│ USMetarPlugin │ USA only METAR of GPS Beacon location │ RegexCommand │ aprsd.plugins.weather.USMetarPlugin │
│ USWeatherPlugin │ Provide USA only weather of GPS Beacon location │ RegexCommand │ aprsd.plugins.weather.USWeatherPlugin │
│ VersionPlugin │ What is the APRSD Version │ RegexCommand │ aprsd.plugins.version.VersionPlugin │
└───────────────────┴────────────────────────────────────────────────────────────┴──────────────┴─────────────────────────────────────────┘
Pypi.org APRSD Installable Plugin Packages
Install any of the following plugins with 'pip install <Plugin Package Name>'
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━┳━━━━━━━━━━━━━━┳━━━━━━━━━━━━┓
┃ Plugin Package Name ┃ Description ┃ Version ┃ Released ┃ Installed? ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━╇━━━━━━━━━━━━━━╇━━━━━━━━━━━━┩
│ 📂 aprsd-stock-plugin │ Ham Radio APRSD Plugin for fetching stock quotes │ 0.1.3 │ Dec 2, 2022 │ No │
│ 📂 aprsd-sentry-plugin │ Ham radio APRSD plugin that does.... │ 0.1.2 │ Dec 2, 2022 │ No │
│ 📂 aprsd-timeopencage-plugin │ APRSD plugin for fetching time based on GPS location │ 0.1.0 │ Dec 2, 2022 │ No │
│ 📂 aprsd-weewx-plugin │ HAM Radio APRSD that reports weather from a weewx weather station. │ 0.1.4 │ Dec 7, 2021 │ Yes │
│ 📂 aprsd-repeat-plugins │ APRSD Plugins for the REPEAT service │ 1.0.12 │ Dec 2, 2022 │ No │
│ 📂 aprsd-telegram-plugin │ Ham Radio APRS APRSD plugin for Telegram IM service │ 0.1.3 │ Dec 2, 2022 │ No │
│ 📂 aprsd-twitter-plugin │ Python APRSD plugin to send tweets │ 0.3.0 │ Dec 7, 2021 │ No │
│ 📂 aprsd-slack-plugin │ Amateur radio APRS daemon which listens for messages and responds │ 1.0.5 │ Dec 18, 2022 │ No │
└──────────────────────────────┴────────────────────────────────────────────────────────────────────┴─────────┴──────────────┴────────────┘
🐍 APRSD Installed 3rd party Plugins 🐍
┏━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┳━━━━━━━━━┳━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Package Name ┃ Plugin Name ┃ Version ┃ Type ┃ Plugin Path ┃
┡━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━╇━━━━━━━━━╇━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ aprsd-weewx-plugin │ WeewxMQTTPlugin │ 1.0 │ RegexCommand │ aprsd_weewx_plugin.weewx.WeewxMQTTPlugin │
└────────────────────┴─────────────────┴─────────┴──────────────┴──────────────────────────────────────────┘
send-message
@ -238,32 +205,30 @@ test messages
::
└─[$] > aprsd send-message -h
Usage: aprsd send-message [OPTIONS] TOCALLSIGN [COMMAND]...
Usage: aprsd send-message [OPTIONS] TOCALLSIGN COMMAND...
Send a message to a callsign via APRS_IS.
Options:
--loglevel [CRITICAL|ERROR|WARNING|INFO|DEBUG]
The log level to use for aprsd.log
[default: DEBUG]
--quiet Don't log to stdout
[default: INFO]
-c, --config TEXT The aprsd config file to use for options.
[default: ~/.config/aprsd/aprsd.yml]
[default:
/Users/i530566/.config/aprsd/aprsd.yml]
--quiet Don't log to stdout
--aprs-login TEXT What callsign to send the message from.
[env var: APRS_LOGIN]
--aprs-password TEXT the APRS-IS password for APRS_LOGIN [env
var: APRS_PASSWORD]
-n, --no-ack Don't wait for an ack, just sent it to APRS-
IS and bail. [default: False]
-w, --wait-response Wait for a response to the message?
[default: False]
--raw TEXT Send a raw message. Implies --no-ack
-h, --help Show this message and exit.
Example output:
===============
SEND EMAIL (radio to smtp server)
=================================
@ -331,28 +296,52 @@ LOCATION
AND... ping, fortune, time.....
Web Admin Interface
===================
To start the web admin interface, you have to install gunicorn in the virtualenv that already has aprsd installed.
::
source <path to APRSD's virtualenv>/bin/activate
pip install gunicorn
gunicorn --bind 0.0.0.0:8080 "aprsd.wsgi:app"
The web admin interface will be running on port 8080 on the local machine. http://localhost:8080
Development
===========
* git clone git@github.com:craigerl/aprsd.git
* cd aprsd
* make
* ``git clone git@github.com:craigerl/aprsd.git``
* ``cd aprsd``
* ``make``
Workflow
========
While working aprsd, The workflow is as follows
While working on aprsd, the workflow is as follows:
* Checkout a new branch to work on by running
``git checkout -b mybranch``
* Make your changes to the code
* Run Tox with the following options:
- ``tox -epep8``
- ``tox -efmt``
- ``tox -p``
* Commit your changes. This will run the pre-commit hooks which does checks too
``git commit``
* checkout a new branch to work on
* git checkout -b mybranch
* Edit code
* run tox -epep8
* run tox -efmt
* run tox -p
* git commit ( This will run the pre-commit hooks which does checks too )
* Once you are done with all of your commits, then push up the branch to
github
* git push -u origin mybranch
github with:
``git push -u origin mybranch``
* Create a pull request from your branch so github tests can run and we can do
a code review.
@ -362,21 +351,21 @@ Release
To do release to pypi:
* Tag release with
* Tag release with:
git tag -v1.XX -m "New release"
``git tag -v1.XX -m "New release"``
* push release tag up
* Push release tag:
git push origin master --tags
``git push origin master --tags``
* Do a test build and verify build is valid
* Do a test build and verify build is valid by running:
make build
``make build``
* Once twine is happy, upload release to pypi
* Once twine is happy, upload release to pypi:
make upload
``make upload``
Docker Container
@ -394,24 +383,62 @@ the repo.
Official Build
==============
docker build -t hemna6969/aprsd:latest .
``docker build -t hemna6969/aprsd:latest .``
Development Build
=================
docker build -t hemna6969/aprsd:latest -f Dockerfile-dev .
``docker build -t hemna6969/aprsd:latest -f Dockerfile-dev .``
Running the container
=====================
There is a docker-compose.yml file that can be used to run your container.
There are 2 volumes defined that can be used to store your configuration
and the plugins directory: /config and /plugins
There is a ``docker-compose.yml`` file in the ``docker/`` directory
that can be used to run your container. To provide the container
an ``aprsd.conf`` configuration file, change your
``docker-compose.yml`` as shown below:
If you want to install plugins at container start time, then use the
environment var in docker-compose.yml specified as APRS_PLUGINS
Provide a csv list of pypi installable plugins. Then make sure the plugin
python file is in your /plugins volume and the plugin will be installed at
container startup. The plugin may have dependencies that are required.
The plugin file should be copied to /plugins for loading by aprsd
::
volumes:
- $HOME/.config/aprsd:/config
To install plugins at container start time, pass a comma-separated list of
PyPI-installable plugins using the ``APRSD_PLUGINS`` environment variable in
the ``docker-compose.yml`` file. Note that version constraints may also be
provided. For example:
::
environment:
- APRSD_PLUGINS=aprsd-slack-plugin>=1.0.2,aprsd-twitter-plugin
.. badges
.. |pypi| image:: https://badge.fury.io/py/aprsd.svg
:target: https://badge.fury.io/py/aprsd
.. |pytest| image:: https://github.com/craigerl/aprsd/workflows/python/badge.svg
:target: https://github.com/craigerl/aprsd/actions
.. |versions| image:: https://img.shields.io/pypi/pyversions/aprsd.svg
:target: https://pypi.org/pypi/aprsd
.. |slack| image:: https://img.shields.io/badge/slack-@hemna/aprsd-blue.svg?logo=slack
:target: https://hemna.slack.com/app_redirect?channel=C01KQSCP5RP
.. |imports| image:: https://img.shields.io/badge/%20imports-isort-%231674b1?style=flat&labelColor=ef8336
:target: https://timothycrosley.github.io/isort/
.. |issues| image:: https://img.shields.io/github/issues/craigerl/aprsd
.. |commit| image:: https://img.shields.io/github/last-commit/craigerl/aprsd
.. |down| image:: https://static.pepy.tech/personalized-badge/aprsd?period=month&units=international_system&left_color=black&right_color=orange&left_text=Downloads
:target: https://pepy.tech/project/aprsd
.. links
.. _read the docs:
https://aprsd.readthedocs.io

aprsd-lnav.json

@ -0,0 +1,39 @@
{
"aprsd" : {
"title" : "APRSD APRS-IS server log format",
"description" : "Log formats used by ARPRSD server",
"url" : "http://github.com/craigerl/aprsd",
"regex" : {
"std" : {
"pattern" : "^\\[(?<timestamp>\\d{2}\\/\\d{2}\\/\\d{4} \\d{2}:\\d{2}:\\d{2} ([AaPp][Mm]))\\] \\[(?<thread>\\w+\\s*)\\] \\[(?<alert_level>\\w+\\s*)\\] (?<body>([^-]*)-*)\\s\\[(?<file>([^:]*))\\:(?<line>\\d+)\\]"
}
},
"level-field" : "alert_level",
"level" : {
"info" : "INFO",
"error" : "ERROR",
"warning" : "WARN",
"debug" : "DEBUG",
"fatal" : "FATAL",
"info" : "UNKNOWN"
},
"value" : {
"alert_level": { "kind" : "string", "identifier" : true },
"thread": { "kind" : "string", "identifier" : true },
"body" : { "kind" : "string" },
"file" : { "kind" : "string" }
},
"timestamp-field" : "timestamp",
"timestamp-format" : [
"%m/%d/%Y %I:%M:%S %p"
],
"sample" : [
{
"line" : "[03/30/2021 08:57:44 PM] [MainThread ] [INFO ] Skipping Custom Plugins directory. - [/home/waboring/devel/aprsd/aprsd/plugin.py:232]"
},
{
"line" : "[03/30/2021 08:57:44 PM] [KeepAlive ] [DEBUG] Uptime (0:00:00.577754) Tracker(0) Msgs: TX:0 RX:0 EmailThread: N/A RAM: Current:50289 Peak:99697 - [/home/waboring/devel/aprsd/aprsd/threads.py:89]"
}
]
}
}


@ -10,6 +10,10 @@
# License for the specific language governing permissions and limitations
# under the License.
import pbr.version
from importlib.metadata import PackageNotFoundError, version
__version__ = pbr.version.VersionInfo("aprsd").version_string()
try:
__version__ = version("aprsd")
except PackageNotFoundError:
pass

aprsd/cli_helper.py

@ -0,0 +1,151 @@
from functools import update_wrapper
import logging
from pathlib import Path
import typing as t
import click
from oslo_config import cfg
import aprsd
from aprsd import conf # noqa: F401
from aprsd.log import log
from aprsd.utils import trace
CONF = cfg.CONF
home = str(Path.home())
DEFAULT_CONFIG_DIR = f"{home}/.config/aprsd/"
DEFAULT_SAVE_FILE = f"{home}/.config/aprsd/aprsd.p"
DEFAULT_CONFIG_FILE = f"{home}/.config/aprsd/aprsd.conf"
F = t.TypeVar("F", bound=t.Callable[..., t.Any])
common_options = [
click.option(
"--loglevel",
default="INFO",
show_default=True,
type=click.Choice(
["CRITICAL", "ERROR", "WARNING", "INFO", "DEBUG"],
case_sensitive=False,
),
show_choices=True,
help="The log level to use for aprsd.log",
),
click.option(
"-c",
"--config",
"config_file",
show_default=True,
default=DEFAULT_CONFIG_FILE,
help="The aprsd config file to use for options.",
),
click.option(
"--quiet",
is_flag=True,
default=False,
help="Don't log to stdout",
),
]
class AliasedGroup(click.Group):
def command(self, *args, **kwargs):
"""A shortcut decorator for declaring and attaching a command to
the group. This takes the same arguments as :func:`command` but
immediately registers the created command with this instance by
calling into :meth:`add_command`.
Copied from `click` and extended for `aliases`.
"""
def decorator(f):
aliases = kwargs.pop("aliases", [])
cmd = click.decorators.command(*args, **kwargs)(f)
self.add_command(cmd)
for alias in aliases:
self.add_command(cmd, name=alias)
return cmd
return decorator
def group(self, *args, **kwargs):
"""A shortcut decorator for declaring and attaching a group to
the group. This takes the same arguments as :func:`group` but
immediately registers the created command with this instance by
calling into :meth:`add_command`.
Copied from `click` and extended for `aliases`.
"""
def decorator(f):
aliases = kwargs.pop("aliases", [])
cmd = click.decorators.group(*args, **kwargs)(f)
self.add_command(cmd)
for alias in aliases:
self.add_command(cmd, name=alias)
return cmd
return decorator
def add_options(options):
def _add_options(func):
for option in reversed(options):
func = option(func)
return func
return _add_options
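# Example of how add_options() is meant to be wired up on a subcommand.
# This is a sketch only -- "my_command" is not a real aprsd subcommand --
# but the decorator order mirrors the existing commands: register with the
# cli group, attach the shared options, pass the click context, then let
# process_standard_options() (below) load the config and set up logging.
#
#   @cli.command()
#   @add_options(common_options)
#   @click.pass_context
#   @process_standard_options
#   def my_command(ctx):
#       # --loglevel/--config/--quiet were consumed by the decorator;
#       # their values now live in ctx.obj["loglevel"] and ctx.obj["quiet"].
#       ...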
def process_standard_options(f: F) -> F:
def new_func(*args, **kwargs):
ctx = args[0]
ctx.ensure_object(dict)
config_file_found = True
if kwargs["config_file"]:
default_config_files = [kwargs["config_file"]]
else:
default_config_files = None
try:
CONF(
[], project="aprsd", version=aprsd.__version__,
default_config_files=default_config_files,
)
except cfg.ConfigFilesNotFoundError:
config_file_found = False
ctx.obj["loglevel"] = kwargs["loglevel"]
# ctx.obj["config_file"] = kwargs["config_file"]
ctx.obj["quiet"] = kwargs["quiet"]
log.setup_logging(
ctx.obj["loglevel"],
ctx.obj["quiet"],
)
if CONF.trace_enabled:
trace.setup_tracing(["method", "api"])
if not config_file_found:
LOG = logging.getLogger("APRSD") # noqa: N806
LOG.error("No config file found!! run 'aprsd sample-config'")
del kwargs["loglevel"]
del kwargs["config_file"]
del kwargs["quiet"]
return f(*args, **kwargs)
return update_wrapper(t.cast(F, new_func), f)
def process_standard_options_no_config(f: F) -> F:
"""Use this as a decorator when config isn't needed."""
def new_func(*args, **kwargs):
ctx = args[0]
ctx.ensure_object(dict)
ctx.obj["loglevel"] = kwargs["loglevel"]
ctx.obj["config_file"] = kwargs["config_file"]
ctx.obj["quiet"] = kwargs["quiet"]
log.setup_logging(
ctx.obj["loglevel"],
ctx.obj["quiet"],
)
del kwargs["loglevel"]
del kwargs["config_file"]
del kwargs["quiet"]
return f(*args, **kwargs)
return update_wrapper(t.cast(F, new_func), f)


@ -1,182 +0,0 @@
import logging
import select
import time
import aprsd
import aprslib
from aprslib import is_py3
from aprslib.exceptions import LoginError
LOG = logging.getLogger("APRSD")
class Client:
"""Singleton client class that constructs the aprslib connection."""
_instance = None
aprs_client = None
config = None
connected = False
def __new__(cls, *args, **kwargs):
"""This magic turns this into a singleton."""
if cls._instance is None:
cls._instance = super().__new__(cls)
# Put any initialization here.
return cls._instance
def __init__(self, config=None):
"""Initialize the object instance."""
if config:
self.config = config
@property
def client(self):
if not self.aprs_client:
self.aprs_client = self.setup_connection()
return self.aprs_client
def reset(self):
"""Call this to force a rebuild/reconnect."""
del self.aprs_client
def setup_connection(self):
user = self.config["aprs"]["login"]
password = self.config["aprs"]["password"]
host = self.config["aprs"].get("host", "rotate.aprs.net")
port = self.config["aprs"].get("port", 14580)
connected = False
backoff = 1
while not connected:
try:
LOG.info("Creating aprslib client")
aprs_client = Aprsdis(user, passwd=password, host=host, port=port)
# Force the logging to be the same
aprs_client.logger = LOG
aprs_client.connect()
connected = True
backoff = 1
except LoginError as e:
LOG.error("Failed to login to APRS-IS Server '{}'".format(e))
connected = False
raise e
except Exception as e:
LOG.error("Unable to connect to APRS-IS server. '{}' ".format(e))
time.sleep(backoff)
backoff = backoff * 2
continue
LOG.debug("Logging in to APRS-IS with user '%s'" % user)
return aprs_client
class Aprsdis(aprslib.IS):
"""Extend the aprslib class so we can exit properly."""
# flag to tell us to stop
thread_stop = False
# timeout in seconds
select_timeout = 10
def stop(self):
self.thread_stop = True
LOG.info("Shutdown Aprsdis client.")
def _socket_readlines(self, blocking=False):
"""
Generator for complete lines, received from the server
"""
try:
self.sock.setblocking(0)
except OSError as e:
self.logger.error("socket error when setblocking(0): %s" % str(e))
raise aprslib.ConnectionDrop("connection dropped")
while not self.thread_stop:
short_buf = b""
newline = b"\r\n"
# set a select timeout, so we get a chance to exit
# when user hits CTRL-C
readable, writable, exceptional = select.select(
[self.sock],
[],
[],
self.select_timeout,
)
if not readable:
continue
try:
short_buf = self.sock.recv(4096)
# sock.recv returns empty if the connection drops
if not short_buf:
self.logger.error("socket.recv(): returned empty")
raise aprslib.ConnectionDrop("connection dropped")
except OSError as e:
# self.logger.error("socket error on recv(): %s" % str(e))
if "Resource temporarily unavailable" in str(e):
if not blocking:
if len(self.buf) == 0:
break
self.buf += short_buf
while newline in self.buf:
line, self.buf = self.buf.split(newline, 1)
yield line
def _send_login(self):
"""
Sends login string to server
"""
login_str = "user {0} pass {1} vers github.com/craigerl/aprsd {3}{2}\r\n"
login_str = login_str.format(
self.callsign,
self.passwd,
(" filter " + self.filter) if self.filter != "" else "",
aprsd.__version__,
)
self.logger.info("Sending login information")
try:
self._sendall(login_str)
self.sock.settimeout(5)
test = self.sock.recv(len(login_str) + 100)
if is_py3:
test = test.decode("latin-1")
test = test.rstrip()
self.logger.debug("Server: %s", test)
_, _, callsign, status, _ = test.split(" ", 4)
if callsign == "":
raise LoginError("Server responded with empty callsign???")
if callsign != self.callsign:
raise LoginError("Server: %s" % test)
if status != "verified," and self.passwd != "-1":
raise LoginError("Password is incorrect")
if self.passwd == "-1":
self.logger.info("Login successful (receive only)")
else:
self.logger.info("Login successful")
except LoginError as e:
self.logger.error(str(e))
self.close()
raise
except Exception:
self.close()
self.logger.error("Failed to login")
raise LoginError("Failed to login")
def get_client():
cl = Client()
return cl.client

aprsd/client/__init__.py

@ -0,0 +1,13 @@
from aprsd.client import aprsis, factory, fake, kiss
TRANSPORT_APRSIS = "aprsis"
TRANSPORT_TCPKISS = "tcpkiss"
TRANSPORT_SERIALKISS = "serialkiss"
TRANSPORT_FAKE = "fake"
client_factory = factory.ClientFactory()
client_factory.register(aprsis.APRSISClient)
client_factory.register(kiss.KISSClient)
client_factory.register(fake.APRSDFakeClient)
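# Callers are expected to go through the factory rather than instantiate a
# client class directly.  A rough usage sketch (assumes a valid aprsd config
# has already been loaded via oslo.config):
#
#   from aprsd import client
#
#   if client.client_factory.is_client_enabled():
#       cl = client.client_factory.create()   # first enabled client wins
#       cl.client                              # lazily builds the connection
#       print(cl.transport())                  # "aprsis", "tcpkiss", ...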

aprsd/client/aprsis.py

@ -0,0 +1,132 @@
import datetime
import logging
import time
from aprslib.exceptions import LoginError
from oslo_config import cfg
from aprsd import client, exception
from aprsd.client import base
from aprsd.client.drivers import aprsis
from aprsd.packets import core
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class APRSISClient(base.APRSClient):
_client = None
def __init__(self):
max_timeout = {"hours": 0.0, "minutes": 2, "seconds": 0}
self.max_delta = datetime.timedelta(**max_timeout)
def stats(self) -> dict:
stats = {}
if self.is_configured():
stats = {
"server_string": self._client.server_string,
"sever_keepalive": self._client.aprsd_keepalive,
"filter": self.filter,
}
return stats
@staticmethod
def is_enabled():
# Defaults to True if the enabled flag is non existent
try:
return CONF.aprs_network.enabled
except KeyError:
return False
@staticmethod
def is_configured():
if APRSISClient.is_enabled():
# Ensure that the config vars are correctly set
if not CONF.aprs_network.login:
LOG.error("Config aprs_network.login not set.")
raise exception.MissingConfigOptionException(
"aprs_network.login is not set.",
)
if not CONF.aprs_network.password:
LOG.error("Config aprs_network.password not set.")
raise exception.MissingConfigOptionException(
"aprs_network.password is not set.",
)
if not CONF.aprs_network.host:
LOG.error("Config aprs_network.host not set.")
raise exception.MissingConfigOptionException(
"aprs_network.host is not set.",
)
return True
return True
def _is_stale_connection(self):
delta = datetime.datetime.now() - self._client.aprsd_keepalive
if delta > self.max_delta:
LOG.error(f"Connection is stale, last heard {delta} ago.")
return True
def is_alive(self):
if self._client:
return self._client.is_alive() and not self._is_stale_connection()
else:
LOG.warning(f"APRS_CLIENT {self._client} alive? NO!!!")
return False
def close(self):
if self._client:
self._client.stop()
self._client.close()
@staticmethod
def transport():
return client.TRANSPORT_APRSIS
def decode_packet(self, *args, **kwargs):
"""APRS lib already decodes this."""
return core.factory(args[0])
def setup_connection(self):
user = CONF.aprs_network.login
password = CONF.aprs_network.password
host = CONF.aprs_network.host
port = CONF.aprs_network.port
self.connected = False
backoff = 1
aprs_client = None
while not self.connected:
try:
LOG.info(f"Creating aprslib client({host}:{port}) and logging in {user}.")
aprs_client = aprsis.Aprsdis(user, passwd=password, host=host, port=port)
# Force the log to be the same
aprs_client.logger = LOG
aprs_client.connect()
self.connected = True
backoff = 1
except LoginError as e:
LOG.error(f"Failed to login to APRS-IS Server '{e}'")
self.connected = False
time.sleep(backoff)
except Exception as e:
LOG.error(f"Unable to connect to APRS-IS server. '{e}' ")
self.connected = False
time.sleep(backoff)
# Don't allow the backoff to go to infinity.
if backoff > 5:
backoff = 5
else:
backoff += 1
continue
self._client = aprs_client
return aprs_client
def consumer(self, callback, blocking=False, immortal=False, raw=False):
self._client.consumer(
callback, blocking=blocking,
immortal=immortal, raw=raw,
)

aprsd/client/base.py

@ -0,0 +1,105 @@
import abc
import logging
import threading
from oslo_config import cfg
import wrapt
from aprsd.packets import core
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class APRSClient:
"""Singleton client class that constructs the aprslib connection."""
_instance = None
_client = None
connected = False
filter = None
lock = threading.Lock()
def __new__(cls, *args, **kwargs):
"""This magic turns this into a singleton."""
if cls._instance is None:
cls._instance = super().__new__(cls)
# Put any initialization here.
cls._instance._create_client()
return cls._instance
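# The __new__ override above is a cached-instance singleton: every
# instantiation of a given client class returns the same object.  The same
# pattern in isolation (standalone sketch, not aprsd code):
#
#   class OneShot:
#       _instance = None
#
#       def __new__(cls, *args, **kwargs):
#           if cls._instance is None:
#               cls._instance = super().__new__(cls)
#           return cls._instance
#
#   assert OneShot() is OneShot()   # both calls yield the cached instance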
@abc.abstractmethod
def stats(self) -> dict:
pass
def set_filter(self, filter):
self.filter = filter
if self._client:
self._client.set_filter(filter)
@property
def client(self):
if not self._client:
self._create_client()
return self._client
def _create_client(self):
self._client = self.setup_connection()
if self.filter:
LOG.info("Creating APRS client filter")
self._client.set_filter(self.filter)
def stop(self):
if self._client:
LOG.info("Stopping client connection.")
self._client.stop()
def send(self, packet: core.Packet):
"""Send a packet to the network."""
self.client.send(packet)
@wrapt.synchronized(lock)
def reset(self):
"""Call this to force a rebuild/reconnect."""
LOG.info("Resetting client connection.")
if self._client:
self._client.close()
del self._client
self._create_client()
else:
LOG.warning("Client not initialized, nothing to reset.")
# Recreate the client
LOG.info(f"Creating new client {self.client}")
@abc.abstractmethod
def setup_connection(self):
pass
@staticmethod
@abc.abstractmethod
def is_enabled():
pass
@staticmethod
@abc.abstractmethod
def transport():
pass
@abc.abstractmethod
def decode_packet(self, *args, **kwargs):
pass
@abc.abstractmethod
def consumer(self, callback, blocking=False, immortal=False, raw=False):
pass
@abc.abstractmethod
def is_alive(self):
pass
@abc.abstractmethod
def close(self):
pass


@ -0,0 +1,224 @@
import datetime
import logging
import select
import threading
import aprslib
from aprslib import is_py3
from aprslib.exceptions import (
ConnectionDrop, ConnectionError, GenericError, LoginError, ParseError,
UnknownFormat,
)
import wrapt
import aprsd
from aprsd.packets import core
LOG = logging.getLogger("APRSD")
class Aprsdis(aprslib.IS):
"""Extend the aprslib class so we can exit properly."""
# flag to tell us to stop
thread_stop = False
# date for last time we heard from the server
aprsd_keepalive = datetime.datetime.now()
# timeout in seconds
select_timeout = 1
lock = threading.Lock()
def stop(self):
self.thread_stop = True
LOG.info("Shutdown Aprsdis client.")
@wrapt.synchronized(lock)
def send(self, packet: core.Packet):
"""Send an APRS Message object."""
self.sendall(packet.raw)
def is_alive(self):
"""If the connection is alive or not."""
return self._connected
def _socket_readlines(self, blocking=False):
"""
Generator for complete lines, received from the server
"""
try:
self.sock.setblocking(0)
except OSError as e:
self.logger.error(f"socket error when setblocking(0): {str(e)}")
raise aprslib.ConnectionDrop("connection dropped")
while not self.thread_stop:
short_buf = b""
newline = b"\r\n"
# set a select timeout, so we get a chance to exit
# when user hits CTRL-C
readable, writable, exceptional = select.select(
[self.sock],
[],
[],
self.select_timeout,
)
if not readable:
if not blocking:
break
else:
continue
try:
short_buf = self.sock.recv(4096)
# sock.recv returns empty if the connection drops
if not short_buf:
if not blocking:
# We could just not be blocking, so empty is expected
continue
else:
self.logger.error("socket.recv(): returned empty")
raise aprslib.ConnectionDrop("connection dropped")
except OSError as e:
# self.logger.error("socket error on recv(): %s" % str(e))
if "Resource temporarily unavailable" in str(e):
if not blocking:
if len(self.buf) == 0:
break
self.buf += short_buf
while newline in self.buf:
line, self.buf = self.buf.split(newline, 1)
yield line
def _send_login(self):
"""
Sends login string to server
"""
login_str = "user {0} pass {1} vers github.com/craigerl/aprsd {3}{2}\r\n"
login_str = login_str.format(
self.callsign,
self.passwd,
(" filter " + self.filter) if self.filter != "" else "",
aprsd.__version__,
)
self.logger.debug("Sending login information")
try:
self._sendall(login_str)
self.sock.settimeout(5)
test = self.sock.recv(len(login_str) + 100)
if is_py3:
test = test.decode("latin-1")
test = test.rstrip()
self.logger.debug("Server: '%s'", test)
if not test:
raise LoginError(f"Server Response Empty: '{test}'")
_, _, callsign, status, e = test.split(" ", 4)
s = e.split(",")
if len(s):
server_string = s[0].replace("server ", "")
else:
server_string = e.replace("server ", "")
if callsign == "":
raise LoginError("Server responded with empty callsign???")
if callsign != self.callsign:
raise LoginError(f"Server: {test}")
if status != "verified," and self.passwd != "-1":
raise LoginError("Password is incorrect")
if self.passwd == "-1":
self.logger.info("Login successful (receive only)")
else:
self.logger.info("Login successful")
self.logger.info(f"Connected to {server_string}")
self.server_string = server_string
except LoginError as e:
self.logger.error(str(e))
self.close()
raise
except Exception as e:
self.close()
self.logger.error(f"Failed to login '{e}'")
self.logger.exception(e)
raise LoginError("Failed to login")
def consumer(self, callback, blocking=True, immortal=False, raw=False):
"""
When a position sentence is received, it will be passed to the callback function
blocking: if true (default), runs forever, otherwise will return after one sentence
You can still exit the loop, by raising StopIteration in the callback function
immortal: When true, consumer will try to reconnect and stop propagation of Parse exceptions
if false (default), consumer will return
raw: when true, raw packet is passed to callback, otherwise the result from aprs.parse()
"""
if not self._connected:
raise ConnectionError("not connected to a server")
line = b""
while True and not self.thread_stop:
try:
for line in self._socket_readlines(blocking):
if line[0:1] != b"#":
self.aprsd_keepalive = datetime.datetime.now()
if raw:
callback(line)
else:
callback(self._parse(line))
else:
self.logger.debug("Server: %s", line.decode("utf8"))
self.aprsd_keepalive = datetime.datetime.now()
except ParseError as exp:
self.logger.log(
11,
"%s\n Packet: %s",
exp,
exp.packet,
)
except UnknownFormat as exp:
self.logger.log(
9,
"%s\n Packet: %s",
exp,
exp.packet,
)
except LoginError as exp:
self.logger.error("%s: %s", exp.__class__.__name__, exp)
except (KeyboardInterrupt, SystemExit):
raise
except (ConnectionDrop, ConnectionError):
self.close()
if not immortal:
raise
else:
self.connect(blocking=blocking)
continue
except GenericError:
pass
except StopIteration:
break
except Exception:
self.logger.error("APRS Packet: %s", line)
raise
if not blocking:
break


@ -0,0 +1,73 @@
import logging
import threading
import time
import aprslib
from oslo_config import cfg
import wrapt
from aprsd import conf # noqa
from aprsd.packets import core
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class APRSDFakeClient(metaclass=trace.TraceWrapperMetaclass):
'''Fake client for testing.'''
# flag to tell us to stop
thread_stop = False
lock = threading.Lock()
path = []
def __init__(self):
LOG.info("Starting APRSDFakeClient client.")
self.path = ["WIDE1-1", "WIDE2-1"]
def stop(self):
self.thread_stop = True
LOG.info("Shutdown APRSDFakeClient client.")
def is_alive(self):
"""If the connection is alive or not."""
return not self.thread_stop
@wrapt.synchronized(lock)
def send(self, packet: core.Packet):
"""Send an APRS Message object."""
LOG.info(f"Sending packet: {packet}")
payload = None
if isinstance(packet, core.Packet):
packet.prepare()
payload = packet.payload.encode("US-ASCII")
if packet.path:
packet.path
else:
self.path
else:
msg_payload = f"{packet.raw}{{{str(packet.msgNo)}"
payload = (
":{:<9}:{}".format(
packet.to_call,
msg_payload,
)
).encode("US-ASCII")
LOG.debug(
f"FAKE::Send '{payload}' TO '{packet.to_call}' From "
f"'{packet.from_call}' with PATH \"{self.path}\"",
)
def consumer(self, callback, blocking=False, immortal=False, raw=False):
LOG.debug("Start non blocking FAKE consumer")
# Generate packets here?
raw = "GTOWN>APDW16,WIDE1-1,WIDE2-1:}KM6LYW-9>APZ100,TCPIP,GTOWN*::KM6LYW :KM6LYW: 19 Miles SW"
pkt_raw = aprslib.parse(raw)
pkt = core.factory(pkt_raw)
callback(packet=pkt)
LOG.debug(f"END blocking FAKE consumer {self}")
time.sleep(8)


@ -0,0 +1,119 @@
import logging
from ax253 import Frame
import kiss
from oslo_config import cfg
from aprsd import conf # noqa
from aprsd.packets import core
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class KISS3Client:
path = []
def __init__(self):
self.setup()
def is_alive(self):
return True
def setup(self):
# we can be TCP kiss or Serial kiss
if CONF.kiss_serial.enabled:
LOG.debug(
"KISS({}) Serial connection to {}".format(
kiss.__version__,
CONF.kiss_serial.device,
),
)
self.kiss = kiss.SerialKISS(
port=CONF.kiss_serial.device,
speed=CONF.kiss_serial.baudrate,
strip_df_start=True,
)
self.path = CONF.kiss_serial.path
elif CONF.kiss_tcp.enabled:
LOG.debug(
"KISS({}) TCP Connection to {}:{}".format(
kiss.__version__,
CONF.kiss_tcp.host,
CONF.kiss_tcp.port,
),
)
self.kiss = kiss.TCPKISS(
host=CONF.kiss_tcp.host,
port=CONF.kiss_tcp.port,
strip_df_start=True,
)
self.path = CONF.kiss_tcp.path
LOG.debug("Starting KISS interface connection")
self.kiss.start()
@trace.trace
def stop(self):
try:
self.kiss.stop()
self.kiss.loop.call_soon_threadsafe(
self.kiss.protocol.transport.close,
)
except Exception as ex:
LOG.exception(ex)
def set_filter(self, filter):
# This does nothing right now.
pass
def parse_frame(self, frame_bytes):
try:
frame = Frame.from_bytes(frame_bytes)
# Now parse it with aprslib
kwargs = {
"frame": frame,
}
self._parse_callback(**kwargs)
except Exception as ex:
LOG.error("Failed to parse bytes received from KISS interface.")
LOG.exception(ex)
def consumer(self, callback):
LOG.debug("Start blocking KISS consumer")
self._parse_callback = callback
self.kiss.read(callback=self.parse_frame, min_frames=None)
LOG.debug(f"END blocking KISS consumer {self.kiss}")
def send(self, packet):
"""Send an APRS Message object."""
payload = None
path = self.path
if isinstance(packet, core.Packet):
packet.prepare()
payload = packet.payload.encode("US-ASCII")
if packet.path:
path = packet.path
else:
msg_payload = f"{packet.raw}{{{str(packet.msgNo)}"
payload = (
":{:<9}:{}".format(
packet.to_call,
msg_payload,
)
).encode("US-ASCII")
LOG.debug(
f"KISS Send '{payload}' TO '{packet.to_call}' From "
f"'{packet.from_call}' with PATH '{path}'",
)
frame = Frame.ui(
destination="APZ100",
source=packet.from_call,
path=path,
info=payload,
)
self.kiss.write(frame)

aprsd/client/factory.py

@ -0,0 +1,88 @@
import logging
from typing import Callable, Protocol, runtime_checkable
from aprsd import exception
from aprsd.packets import core
LOG = logging.getLogger("APRSD")
@runtime_checkable
class Client(Protocol):
def __init__(self):
pass
def connect(self) -> bool:
pass
def disconnect(self) -> bool:
pass
def decode_packet(self, *args, **kwargs) -> type[core.Packet]:
pass
def is_enabled(self) -> bool:
pass
def is_configured(self) -> bool:
pass
def transport(self) -> str:
pass
def send(self, message: str) -> bool:
pass
def setup_connection(self) -> None:
pass
class ClientFactory:
_instance = None
clients = []
def __new__(cls, *args, **kwargs):
"""This magic turns this into a singleton."""
if cls._instance is None:
cls._instance = super().__new__(cls)
# Put any initialization here.
return cls._instance
def __init__(self):
self.clients: list[Callable] = []
def register(self, aprsd_client: Callable):
if isinstance(aprsd_client, Client):
raise ValueError("Client must be a subclass of Client protocol")
self.clients.append(aprsd_client)
def create(self, key=None):
for client in self.clients:
if client.is_enabled():
return client()
raise Exception("No client is configured!!")
def is_client_enabled(self):
"""Make sure at least one client is enabled."""
enabled = False
for client in self.clients:
if client.is_enabled():
enabled = True
return enabled
def is_client_configured(self):
enabled = False
for client in self.clients:
try:
if client.is_configured():
enabled = True
except exception.MissingConfigOptionException as ex:
LOG.error(ex.message)
return False
except exception.ConfigOptionBogusDefaultException as ex:
LOG.error(ex.message)
return False
return enabled

aprsd/client/fake.py

@ -0,0 +1,48 @@
import logging
from oslo_config import cfg
from aprsd import client
from aprsd.client import base
from aprsd.client.drivers import fake as fake_driver
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class APRSDFakeClient(base.APRSClient, metaclass=trace.TraceWrapperMetaclass):
def stats(self) -> dict:
return {}
@staticmethod
def is_enabled():
if CONF.fake_client.enabled:
return True
return False
@staticmethod
def is_configured():
return APRSDFakeClient.is_enabled()
def is_alive(self):
return True
def close(self):
pass
def setup_connection(self):
self.connected = True
return fake_driver.APRSDFakeClient()
@staticmethod
def transport():
return client.TRANSPORT_FAKE
def decode_packet(self, *args, **kwargs):
LOG.debug(f"kwargs {kwargs}")
pkt = kwargs["packet"]
LOG.debug(f"Got an APRS Fake Packet '{pkt}'")
return pkt

aprsd/client/kiss.py

@ -0,0 +1,103 @@
import logging
import aprslib
from oslo_config import cfg
from aprsd import client, exception
from aprsd.client import base
from aprsd.client.drivers import kiss
from aprsd.packets import core
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class KISSClient(base.APRSClient):
_client = None
def stats(self) -> dict:
stats = {}
if self.is_configured():
return {
"transport": self.transport(),
}
return stats
@staticmethod
def is_enabled():
"""Return if tcp or serial KISS is enabled."""
if CONF.kiss_serial.enabled:
return True
if CONF.kiss_tcp.enabled:
return True
return False
@staticmethod
def is_configured():
# Ensure that the config vars are correctly set
if KISSClient.is_enabled():
transport = KISSClient.transport()
if transport == client.TRANSPORT_SERIALKISS:
if not CONF.kiss_serial.device:
LOG.error("KISS serial enabled, but no device is set.")
raise exception.MissingConfigOptionException(
"kiss_serial.device is not set.",
)
elif transport == client.TRANSPORT_TCPKISS:
if not CONF.kiss_tcp.host:
LOG.error("KISS TCP enabled, but no host is set.")
raise exception.MissingConfigOptionException(
"kiss_tcp.host is not set.",
)
return True
return False
def is_alive(self):
if self._client:
return self._client.is_alive()
else:
return False
def close(self):
if self._client:
self._client.stop()
@staticmethod
def transport():
if CONF.kiss_serial.enabled:
return client.TRANSPORT_SERIALKISS
if CONF.kiss_tcp.enabled:
return client.TRANSPORT_TCPKISS
def decode_packet(self, *args, **kwargs):
"""We get a frame, which has to be decoded."""
LOG.debug(f"kwargs {kwargs}")
frame = kwargs["frame"]
LOG.debug(f"Got an APRS Frame '{frame}'")
# try and nuke the * from the fromcall sign.
# frame.header._source._ch = False
# payload = str(frame.payload.decode())
# msg = f"{str(frame.header)}:{payload}"
# msg = frame.tnc2
# LOG.debug(f"Decoding {msg}")
raw = aprslib.parse(str(frame))
packet = core.factory(raw)
if isinstance(packet, core.ThirdParty):
return packet.subpacket
else:
return packet
def setup_connection(self):
self._client = kiss.KISS3Client()
self.connected = True
return self._client
def consumer(self, callback, blocking=False, immortal=False, raw=False):
self._client.consumer(callback)

aprsd/client/stats.py

@ -0,0 +1,38 @@
import threading
from oslo_config import cfg
import wrapt
from aprsd import client
from aprsd.utils import singleton
CONF = cfg.CONF
@singleton
class APRSClientStats:
lock = threading.Lock()
@wrapt.synchronized(lock)
def stats(self, serializable=False):
cl = client.client_factory.create()
stats = {
"transport": cl.transport(),
"filter": cl.filter,
"connected": cl.connected,
}
if cl.transport() == client.TRANSPORT_APRSIS:
stats["server_string"] = cl.client.server_string
keepalive = cl.client.aprsd_keepalive
if serializable:
keepalive = keepalive.isoformat()
stats["server_keepalive"] = keepalive
elif cl.transport() == client.TRANSPORT_TCPKISS:
stats["host"] = CONF.kiss_tcp.host
stats["port"] = CONF.kiss_tcp.port
elif cl.transport() == client.TRANSPORT_SERIALKISS:
stats["device"] = CONF.kiss_serial.device
return stats
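A quick sketch of consuming the singleton above; stats() asks the client factory for the configured client, so a usable client configuration is assumed here:

from aprsd.client.stats import APRSClientStats

stats = APRSClientStats().stats(serializable=True)
# The dict always carries transport/filter/connected; the KISS transports add
# the kiss_tcp host/port or the kiss_serial device, and APRS-IS adds the server
# string plus an ISO-formatted keepalive timestamp when serializable=True.
print(stats)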

0
aprsd/cmds/__init__.py Normal file

22
aprsd/cmds/completion.py Normal file

@ -0,0 +1,22 @@
import click
import click.shell_completion
from aprsd.main import cli
CONTEXT_SETTINGS = dict(help_option_names=["-h", "--help"])
@cli.command()
@click.argument("shell", type=click.Choice(list(click.shell_completion._available_shells)))
def completion(shell):
"""Show the shell completion code"""
from click.utils import _detect_program_name
cls = click.shell_completion.get_completion_class(shell)
prog_name = _detect_program_name()
complete_var = f"_{prog_name}_COMPLETE".replace("-", "_").upper()
print(cls(cli, {}, prog_name, complete_var).source())
print("# Add the following line to your shell configuration file to have aprsd command line completion")
print("# but remove the leading '#' character.")
print(f"# eval \"$(aprsd completion {shell})\"")

162
aprsd/cmds/dev.py Normal file

@ -0,0 +1,162 @@
#
# Dev.py is used to help develop plugins
#
#
# python included libs
import logging
import click
from oslo_config import cfg
from aprsd import cli_helper, conf, packets, plugin
# local imports here
from aprsd.client import base
from aprsd.main import cli
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
CONTEXT_SETTINGS = dict(help_option_names=["-h", "--help"])
@cli.group(help="Development type subcommands", context_settings=CONTEXT_SETTINGS)
@click.pass_context
def dev(ctx):
pass
@dev.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"--aprs-login",
envvar="APRS_LOGIN",
show_envvar=True,
help="What callsign to send the message from.",
)
@click.option(
"-p",
"--plugin",
"plugin_path",
show_default=True,
default=None,
help="The plugin to run. Ex: aprsd.plugins.ping.PingPlugin",
)
@click.option(
"-a",
"--all",
"load_all",
show_default=True,
is_flag=True,
default=False,
help="Load all the plugins in config?",
)
@click.option(
"-n",
"--num",
"number",
show_default=True,
default=1,
help="Number of times to call the plugin",
)
@click.argument("message", nargs=-1, required=True)
@click.pass_context
@cli_helper.process_standard_options
def test_plugin(
ctx,
aprs_login,
plugin_path,
load_all,
number,
message,
):
"""Test an individual APRSD plugin given a python path."""
CONF.log_opt_values(LOG, logging.DEBUG)
if not aprs_login:
if CONF.aprs_network.login == conf.client.DEFAULT_LOGIN:
click.echo("Must set --aprs-login or APRS_LOGIN")
ctx.exit(-1)
return
else:
fromcall = CONF.aprs_network.login
else:
fromcall = aprs_login
if not plugin_path:
click.echo(ctx.get_help())
click.echo("")
click.echo("Failed to provide -p option to test a plugin")
ctx.exit(-1)
return
if type(message) is tuple:
message = " ".join(message)
if CONF.trace_enabled:
trace.setup_tracing(["method", "api"])
base.APRSClient()
pm = plugin.PluginManager()
if load_all:
pm.setup_plugins()
obj = pm._create_class(plugin_path, plugin.APRSDPluginBase)
if not obj:
click.echo(ctx.get_help())
click.echo("")
ctx.fail(f"Failed to create object from plugin path '{plugin_path}'")
ctx.exit()
# Register the plugin they wanted tested.
LOG.info(
"Testing plugin {} Version {}".format(
obj.__class__, obj.version,
),
)
pm.register_msg(obj)
packet = packets.MessagePacket(
from_call=fromcall,
to_call=CONF.callsign,
msgNo=1,
message_text=message,
)
LOG.info(f"P'{plugin_path}' F'{fromcall}' C'{message}'")
for x in range(number):
replies = pm.run(packet)
# Plugin might have threads, so lets stop them so we can exit.
# obj.stop_threads()
for reply in replies:
if isinstance(reply, list):
# one of the plugins wants to send multiple messages
for subreply in reply:
if isinstance(subreply, packets.Packet):
LOG.info(subreply)
else:
LOG.info(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=fromcall,
message_text=subreply,
),
)
elif isinstance(reply, packets.Packet):
# We have a message based object.
LOG.info(reply)
else:
# A plugin can return a null message flag which signals
# us that they processed the message correctly, but have
# nothing to reply with, so we avoid replying with a
# usage string
if reply is not packets.NULL_MESSAGE:
LOG.info(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=fromcall,
message_text=reply,
),
)
pm.stop()
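To have something to point the -p/--plugin option at, a throwaway plugin module is enough. The sketch below assumes the usual APRSDRegexCommandPluginBase contract of a process() hook returning the reply text; the base class itself is not part of this diff, so treat the hook name as an assumption, and hello_plugin / HelloPlugin are made-up names.

# hello_plugin.py -- a minimal, hypothetical plugin for exercising test_plugin
from aprsd import packets, plugin


class HelloPlugin(plugin.APRSDRegexCommandPluginBase):
    version = "1.0"
    command_regex = "^[hH]ello"
    short_description = "Reply with a greeting"

    def process(self, packet: packets.MessagePacket):
        # The returned string is wrapped in a reply MessagePacket by the loop above.
        return f"hello {packet.from_call}"

Its dotted path (hello_plugin.HelloPlugin, if the module is importable) is what gets passed to -p.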

156
aprsd/cmds/fetch_stats.py Normal file

@ -0,0 +1,156 @@
# Fetch active stats from a remote running instance of aprsd admin web interface.
import logging
import click
from oslo_config import cfg
import requests
from rich.console import Console
from rich.table import Table
# local imports here
import aprsd
from aprsd import cli_helper
from aprsd.main import cli
# setup the global logger
# log.basicConfig(level=log.DEBUG) # level=10
LOG = logging.getLogger("APRSD")
CONF = cfg.CONF
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"--host", type=str,
default=None,
help="IP address of the remote aprsd admin web ui to fetch stats from.",
)
@click.option(
"--port", type=int,
default=None,
help="Port of the remote aprsd web admin interface to fetch stats from.",
)
@click.pass_context
@cli_helper.process_standard_options
def fetch_stats(ctx, host, port):
"""Fetch stats from a APRSD admin web interface."""
console = Console()
console.print(f"APRSD Fetch-Stats started version: {aprsd.__version__}")
CONF.log_opt_values(LOG, logging.DEBUG)
if not host:
host = CONF.admin.web_ip
if not port:
port = CONF.admin.web_port
msg = f"Fetching stats from {host}:{port}"
console.print(msg)
with console.status(msg):
response = requests.get(f"http://{host}:{port}/stats", timeout=120)
if not response:
console.print(
f"Failed to fetch stats from {host}:{port}?",
style="bold red",
)
return
stats = response.json()
if not stats:
console.print(
f"Failed to fetch stats from aprsd admin ui at {host}:{port}",
style="bold red",
)
return
aprsd_title = (
"APRSD "
f"[bold cyan]v{stats['APRSDStats']['version']}[/] "
f"Callsign [bold green]{stats['APRSDStats']['callsign']}[/] "
f"Uptime [bold yellow]{stats['APRSDStats']['uptime']}[/]"
)
console.rule(f"Stats from {host}:{port}")
console.print("\n\n")
console.rule(aprsd_title)
# Show the connection to APRS
# It can be a connection to an APRS-IS server or a local TNC via KISS or KISSTCP
if "aprs-is" in stats:
title = f"APRS-IS Connection {stats['APRSClientStats']['server_string']}"
table = Table(title=title)
table.add_column("Key")
table.add_column("Value")
for key, value in stats["APRSClientStats"].items():
table.add_row(key, value)
console.print(table)
threads_table = Table(title="Threads")
threads_table.add_column("Name")
threads_table.add_column("Alive?")
for name, alive in stats["APRSDThreadList"].items():
threads_table.add_row(name, str(alive))
console.print(threads_table)
packet_totals = Table(title="Packet Totals")
packet_totals.add_column("Key")
packet_totals.add_column("Value")
packet_totals.add_row("Total Received", str(stats["PacketList"]["rx"]))
packet_totals.add_row("Total Sent", str(stats["PacketList"]["tx"]))
console.print(packet_totals)
# Show each of the packet types
packets_table = Table(title="Packets By Type")
packets_table.add_column("Packet Type")
packets_table.add_column("TX")
packets_table.add_column("RX")
for key, value in stats["PacketList"]["packets"].items():
packets_table.add_row(key, str(value["tx"]), str(value["rx"]))
console.print(packets_table)
if "plugins" in stats:
count = len(stats["PluginManager"])
plugins_table = Table(title=f"Plugins ({count})")
plugins_table.add_column("Plugin")
plugins_table.add_column("Enabled")
plugins_table.add_column("Version")
plugins_table.add_column("TX")
plugins_table.add_column("RX")
plugins = stats["PluginManager"]
for key, value in plugins.items():
plugins_table.add_row(
key,
str(plugins[key]["enabled"]),
plugins[key]["version"],
str(plugins[key]["tx"]),
str(plugins[key]["rx"]),
)
console.print(plugins_table)
seen_list = stats.get("SeenList")
if seen_list:
count = len(seen_list)
seen_table = Table(title=f"Seen List ({count})")
seen_table.add_column("Callsign")
seen_table.add_column("Message Count")
seen_table.add_column("Last Heard")
for key, value in seen_list.items():
seen_table.add_row(key, str(value["count"]), value["last"])
console.print(seen_table)
watch_list = stats.get("WatchList")
if watch_list:
count = len(watch_list)
watch_table = Table(title=f"Watch List ({count})")
watch_table.add_column("Callsign")
watch_table.add_column("Last Heard")
for key, value in watch_list.items():
watch_table.add_row(key, value["last"])
console.print(watch_table)
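The same /stats payload the tables above are built from can be inspected directly; the host and port below are illustrative (they default to the admin.web_ip / admin.web_port options):

import requests

stats = requests.get("http://127.0.0.1:8001/stats", timeout=120).json()
print(stats["APRSDStats"]["version"], stats["APRSDStats"]["callsign"])
print("rx:", stats["PacketList"]["rx"], "tx:", stats["PacketList"]["tx"])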

86
aprsd/cmds/healthcheck.py Normal file

@ -0,0 +1,86 @@
#
# Used to fetch the stats url and determine if
# aprsd server is 'healthy'
#
#
# python included libs
import datetime
import logging
import sys
import click
from oslo_config import cfg
from rich.console import Console
import aprsd
from aprsd import cli_helper
from aprsd import conf # noqa
# local imports here
from aprsd.main import cli
from aprsd.threads import stats as stats_threads
# setup the global logger
# log.basicConfig(level=log.DEBUG) # level=10
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
console = Console()
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"--timeout",
show_default=True,
default=3,
help="How long to wait for the healthcheck url to come back",
)
@click.pass_context
@cli_helper.process_standard_options
def healthcheck(ctx, timeout):
"""Check the health of the running aprsd server."""
ver_str = f"APRSD HealthCheck version: {aprsd.__version__}"
console.log(ver_str)
with console.status(ver_str):
try:
stats_obj = stats_threads.StatsStore()
stats_obj.load()
stats = stats_obj.data
# console.print(stats)
except Exception as ex:
console.log(f"Failed to load stats: '{ex}'")
sys.exit(-1)
else:
now = datetime.datetime.now()
if not stats:
console.log("No stats from aprsd")
sys.exit(-1)
email_stats = stats.get("EmailStats")
if email_stats:
email_thread_last_update = email_stats["last_check_time"]
if email_thread_last_update != "never":
d = now - email_thread_last_update
max_timeout = {"hours": 0.0, "minutes": 5, "seconds": 0}
max_delta = datetime.timedelta(**max_timeout)
if d > max_delta:
console.log(f"Email thread is very old! {d}")
sys.exit(-1)
client_stats = stats.get("APRSClientStats")
if not client_stats:
console.log("No APRSClientStats")
sys.exit(-1)
else:
aprsis_last_update = client_stats["server_keepalive"]
d = now - aprsis_last_update
max_timeout = {"hours": 0.0, "minutes": 5, "seconds": 0}
max_delta = datetime.timedelta(**max_timeout)
if d > max_delta:
LOG.error(f"APRS-IS last update is very old! {d}")
sys.exit(-1)
console.log("OK")
sys.exit(0)
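The staleness rule applied above is just a timedelta comparison; the same five-minute cutoff in isolation looks like this (the timestamps are made up):

import datetime

now = datetime.datetime.now()
last_keepalive = now - datetime.timedelta(minutes=7)            # pretend the last update is 7 minutes old
max_delta = datetime.timedelta(**{"hours": 0.0, "minutes": 5, "seconds": 0})
print((now - last_keepalive) > max_delta)                       # True, so healthcheck exits non-zero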

319
aprsd/cmds/list_plugins.py Normal file

@ -0,0 +1,319 @@
import fnmatch
import importlib
import inspect
import logging
import os
import pkgutil
import re
import sys
from traceback import print_tb
from urllib.parse import urljoin
from bs4 import BeautifulSoup
import click
import requests
from rich.console import Console
from rich.table import Table
from rich.text import Text
from thesmuggler import smuggle
from aprsd import cli_helper
from aprsd import plugin as aprsd_plugin
from aprsd.main import cli
from aprsd.plugins import (
email, fortune, location, notify, ping, time, version, weather,
)
LOG = logging.getLogger("APRSD")
PYPI_URL = "https://pypi.org/search/"
def onerror(name):
print(f"Error importing module {name}")
type, value, traceback = sys.exc_info()
print_tb(traceback)
def is_plugin(obj):
for c in inspect.getmro(obj):
if issubclass(c, aprsd_plugin.APRSDPluginBase):
return True
return False
def plugin_type(obj):
for c in inspect.getmro(obj):
if issubclass(c, aprsd_plugin.APRSDRegexCommandPluginBase):
return "RegexCommand"
if issubclass(c, aprsd_plugin.APRSDWatchListPluginBase):
return "WatchList"
if issubclass(c, aprsd_plugin.APRSDPluginBase):
return "APRSDPluginBase"
return "Unknown"
def walk_package(package):
return pkgutil.walk_packages(
package.__path__,
package.__name__ + ".",
onerror=onerror,
)
def get_module_info(package_name, module_name, module_path):
if not os.path.exists(module_path):
return None
dir_path = os.path.realpath(module_path)
pattern = "*.py"
obj_list = []
for path, _subdirs, files in os.walk(dir_path):
for name in files:
if fnmatch.fnmatch(name, pattern):
module = smuggle(f"{path}/{name}")
for mem_name, obj in inspect.getmembers(module):
if inspect.isclass(obj) and is_plugin(obj):
obj_list.append(
{
"package": package_name,
"name": mem_name, "obj": obj,
"version": obj.version,
"path": f"{'.'.join([module_name, obj.__name__])}",
},
)
return obj_list
def _get_installed_aprsd_items():
# installed plugins
plugins = {}
extensions = {}
for finder, name, ispkg in pkgutil.iter_modules():
if name.startswith("aprsd_"):
print(f"Found aprsd_ module: {name}")
if ispkg:
module = importlib.import_module(name)
pkgs = walk_package(module)
for pkg in pkgs:
pkg_info = get_module_info(module.__name__, pkg.name, module.__path__[0])
if "plugin" in name:
plugins[name] = pkg_info
elif "extension" in name:
extensions[name] = pkg_info
return plugins, extensions
def get_installed_plugins():
# installed plugins
plugins, extensions = _get_installed_aprsd_items()
return plugins
def get_installed_extensions():
# installed plugins
plugins, extensions = _get_installed_aprsd_items()
return extensions
def show_built_in_plugins(console):
modules = [email, fortune, location, notify, ping, time, version, weather]
plugins = []
for module in modules:
entries = inspect.getmembers(module, inspect.isclass)
for entry in entries:
cls = entry[1]
if issubclass(cls, aprsd_plugin.APRSDPluginBase):
info = {
"name": cls.__qualname__,
"path": f"{cls.__module__}.{cls.__qualname__}",
"version": cls.version,
"docstring": cls.__doc__,
"short_desc": cls.short_description,
}
if issubclass(cls, aprsd_plugin.APRSDRegexCommandPluginBase):
info["command_regex"] = cls.command_regex
info["type"] = "RegexCommand"
if issubclass(cls, aprsd_plugin.APRSDWatchListPluginBase):
info["type"] = "WatchList"
plugins.append(info)
plugins = sorted(plugins, key=lambda i: i["name"])
table = Table(
title="[not italic]:snake:[/] [bold][magenta]APRSD Built-in Plugins [not italic]:snake:[/]",
)
table.add_column("Plugin Name", style="cyan", no_wrap=True)
table.add_column("Info", style="bold yellow")
table.add_column("Type", style="bold green")
table.add_column("Plugin Path", style="bold blue")
for entry in plugins:
table.add_row(entry["name"], entry["short_desc"], entry["type"], entry["path"])
console.print(table)
def _get_pypi_packages():
query = "aprsd"
snippets = []
s = requests.Session()
for page in range(1, 3):
params = {"q": query, "page": page}
r = s.get(PYPI_URL, params=params)
soup = BeautifulSoup(r.text, "html.parser")
snippets += soup.select('a[class*="snippet"]')
if not hasattr(s, "start_url"):
s.start_url = r.url.rsplit("&page", maxsplit=1).pop(0)
return snippets
def show_pypi_plugins(installed_plugins, console):
snippets = _get_pypi_packages()
title = Text.assemble(
("Pypi.org APRSD Installable Plugin Packages\n\n", "bold magenta"),
("Install any of the following plugins with\n", "bold yellow"),
("'pip install ", "bold white"),
("<Plugin Package Name>'", "cyan"),
)
table = Table(title=title)
table.add_column("Plugin Package Name", style="cyan", no_wrap=True)
table.add_column("Description", style="yellow")
table.add_column("Version", style="yellow", justify="center")
table.add_column("Released", style="bold green", justify="center")
table.add_column("Installed?", style="red", justify="center")
for snippet in snippets:
link = urljoin(PYPI_URL, snippet.get("href"))
package = re.sub(r"\s+", " ", snippet.select_one('span[class*="name"]').text.strip())
version = re.sub(r"\s+", " ", snippet.select_one('span[class*="version"]').text.strip())
created = re.sub(r"\s+", " ", snippet.select_one('span[class*="created"]').text.strip())
description = re.sub(r"\s+", " ", snippet.select_one('p[class*="description"]').text.strip())
emoji = ":open_file_folder:"
if "aprsd-" not in package or "-plugin" not in package:
continue
under = package.replace("-", "_")
if under in installed_plugins:
installed = "Yes"
else:
installed = "No"
table.add_row(
f"[link={link}]{emoji}[/link] {package}",
description, version, created, installed,
)
console.print("\n")
console.print(table)
def show_pypi_extensions(installed_extensions, console):
snippets = _get_pypi_packages()
title = Text.assemble(
("Pypi.org APRSD Installable Extension Packages\n\n", "bold magenta"),
("Install any of the following extensions by running\n", "bold yellow"),
("'pip install ", "bold white"),
("<Plugin Package Name>'", "cyan"),
)
table = Table(title=title)
table.add_column("Extension Package Name", style="cyan", no_wrap=True)
table.add_column("Description", style="yellow")
table.add_column("Version", style="yellow", justify="center")
table.add_column("Released", style="bold green", justify="center")
table.add_column("Installed?", style="red", justify="center")
for snippet in snippets:
link = urljoin(PYPI_URL, snippet.get("href"))
package = re.sub(r"\s+", " ", snippet.select_one('span[class*="name"]').text.strip())
version = re.sub(r"\s+", " ", snippet.select_one('span[class*="version"]').text.strip())
created = re.sub(r"\s+", " ", snippet.select_one('span[class*="created"]').text.strip())
description = re.sub(r"\s+", " ", snippet.select_one('p[class*="description"]').text.strip())
emoji = ":open_file_folder:"
if "aprsd-" not in package or "-extension" not in package:
continue
under = package.replace("-", "_")
if under in installed_extensions:
installed = "Yes"
else:
installed = "No"
table.add_row(
f"[link={link}]{emoji}[/link] {package}",
description, version, created, installed,
)
console.print("\n")
console.print(table)
def show_installed_plugins(installed_plugins, console):
if not installed_plugins:
return
table = Table(
title="[not italic]:snake:[/] [bold][magenta]APRSD Installed 3rd party Plugins [not italic]:snake:[/]",
)
table.add_column("Package Name", style=" bold white", no_wrap=True)
table.add_column("Plugin Name", style="cyan", no_wrap=True)
table.add_column("Version", style="yellow", justify="center")
table.add_column("Type", style="bold green")
table.add_column("Plugin Path", style="bold blue")
for name in installed_plugins:
for plugin in installed_plugins[name]:
table.add_row(
name.replace("_", "-"),
plugin["name"],
plugin["version"],
plugin_type(plugin["obj"]),
plugin["path"],
)
console.print("\n")
console.print(table)
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.pass_context
@cli_helper.process_standard_options_no_config
def list_plugins(ctx):
"""List the built in plugins available to APRSD."""
console = Console()
with console.status("Show Built-in Plugins") as status:
show_built_in_plugins(console)
status.update("Fetching pypi.org plugins")
installed_plugins = get_installed_plugins()
show_pypi_plugins(installed_plugins, console)
status.update("Looking for installed APRSD plugins")
show_installed_plugins(installed_plugins, console)
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.pass_context
@cli_helper.process_standard_options_no_config
def list_extensions(ctx):
"""List the APRSD extensions available on pypi.org."""
console = Console()
with console.status("Show APRSD Extensions") as status:
status.update("Fetching pypi.org APRSD Extensions")
installed_extensions = get_installed_extensions()
show_pypi_extensions(installed_extensions, console)
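The is_plugin() and plugin_type() helpers at the top of this file classify any class by walking its MRO. A quick sanity check against one of the built-ins; PingPlugin ships with aprsd and should register as a regex-command plugin:

from aprsd.cmds.list_plugins import is_plugin, plugin_type
from aprsd.plugins import ping

print(is_plugin(ping.PingPlugin))     # True
print(plugin_type(ping.PingPlugin))   # expected to be "RegexCommand"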

230
aprsd/cmds/listen.py Normal file

@ -0,0 +1,230 @@
#
# License GPLv2
#
# python included libs
import datetime
import logging
import signal
import sys
import time
import click
from oslo_config import cfg
from rich.console import Console
# local imports here
import aprsd
from aprsd import cli_helper, packets, plugin, threads
from aprsd.client import client_factory
from aprsd.main import cli
from aprsd.packets import collector as packet_collector
from aprsd.packets import log as packet_log
from aprsd.packets import seen_list
from aprsd.stats import collector
from aprsd.threads import keep_alive, rx
from aprsd.threads import stats as stats_thread
# setup the global logger
# log.basicConfig(level=log.DEBUG) # level=10
LOG = logging.getLogger("APRSD")
CONF = cfg.CONF
console = Console()
def signal_handler(sig, frame):
threads.APRSDThreadList().stop_all()
if "subprocess" not in str(frame):
LOG.info(
"Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}".format(
datetime.datetime.now(),
),
)
time.sleep(5)
LOG.info(collector.Collector().collect())
class APRSDListenThread(rx.APRSDRXThread):
def __init__(self, packet_queue, packet_filter=None, plugin_manager=None):
super().__init__(packet_queue)
self.packet_filter = packet_filter
self.plugin_manager = plugin_manager
if self.plugin_manager:
LOG.info(f"Plugins {self.plugin_manager.get_message_plugins()}")
def process_packet(self, *args, **kwargs):
packet = self._client.decode_packet(*args, **kwargs)
filters = {
packets.Packet.__name__: packets.Packet,
packets.AckPacket.__name__: packets.AckPacket,
packets.BeaconPacket.__name__: packets.BeaconPacket,
packets.GPSPacket.__name__: packets.GPSPacket,
packets.MessagePacket.__name__: packets.MessagePacket,
packets.MicEPacket.__name__: packets.MicEPacket,
packets.ObjectPacket.__name__: packets.ObjectPacket,
packets.StatusPacket.__name__: packets.StatusPacket,
packets.ThirdPartyPacket.__name__: packets.ThirdPartyPacket,
packets.WeatherPacket.__name__: packets.WeatherPacket,
packets.UnknownPacket.__name__: packets.UnknownPacket,
}
if self.packet_filter:
filter_class = filters[self.packet_filter]
if isinstance(packet, filter_class):
packet_log.log(packet)
if self.plugin_manager:
# Don't do anything with the reply
# This is the listen only command.
self.plugin_manager.run(packet)
else:
packet_log.log(packet)
if self.plugin_manager:
# Don't do anything with the reply.
# This is the listen only command.
self.plugin_manager.run(packet)
packet_collector.PacketCollector().rx(packet)
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"--aprs-login",
envvar="APRS_LOGIN",
show_envvar=True,
help="What callsign to send the message from.",
)
@click.option(
"--aprs-password",
envvar="APRS_PASSWORD",
show_envvar=True,
help="the APRS-IS password for APRS_LOGIN",
)
@click.option(
"--packet-filter",
type=click.Choice(
[
packets.AckPacket.__name__,
packets.BeaconPacket.__name__,
packets.GPSPacket.__name__,
packets.MicEPacket.__name__,
packets.MessagePacket.__name__,
packets.ObjectPacket.__name__,
packets.RejectPacket.__name__,
packets.StatusPacket.__name__,
packets.ThirdPartyPacket.__name__,
packets.UnknownPacket.__name__,
packets.WeatherPacket.__name__,
],
case_sensitive=False,
),
help="Filter by packet type",
)
@click.option(
"--load-plugins",
default=False,
is_flag=True,
help="Load plugins as enabled in aprsd.conf ?",
)
@click.argument(
"filter",
nargs=-1,
required=True,
)
@click.pass_context
@cli_helper.process_standard_options
def listen(
ctx,
aprs_login,
aprs_password,
packet_filter,
load_plugins,
filter,
):
"""Listen to packets on the APRS-IS Network based on FILTER.
FILTER is the APRS Filter to use.\n
see http://www.aprs-is.net/javAPRSFilter.aspx\n
r/lat/lon/dist - Range Filter Pass posits and objects within dist km from lat/lon.\n
p/aa/bb/cc... - Prefix Filter Pass traffic with fromCall that start with aa or bb or cc.\n
b/call1/call2... - Budlist Filter Pass all traffic from exact call: call1, call2, ... (* wild card allowed) \n
o/obj1/obj2... - Object Filter Pass all objects with the exact name of obj1, obj2, ... (* wild card allowed)\n
"""
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGTERM, signal_handler)
if not aprs_login:
click.echo(ctx.get_help())
click.echo("")
ctx.fail("Must set --aprs-login or APRS_LOGIN")
ctx.exit()
if not aprs_password:
click.echo(ctx.get_help())
click.echo("")
ctx.fail("Must set --aprs-password or APRS_PASSWORD")
ctx.exit()
# CONF.aprs_network.login = aprs_login
# config["aprs"]["password"] = aprs_password
LOG.info(f"APRSD Listen Started version: {aprsd.__version__}")
CONF.log_opt_values(LOG, logging.DEBUG)
collector.Collector()
# Try and load saved MsgTrack list
LOG.debug("Loading saved MsgTrack object.")
# Initialize the client factory and create
# The correct client object ready for use
# Make sure we have 1 client transport enabled
if not client_factory.is_client_enabled():
LOG.error("No Clients are enabled in config.")
sys.exit(-1)
# Creates the client object
LOG.info("Creating client connection")
aprs_client = client_factory.create()
LOG.info(aprs_client)
LOG.debug(f"Filter by '{filter}'")
aprs_client.set_filter(filter)
keepalive = keep_alive.KeepAliveThread()
# keepalive.start()
if not CONF.enable_seen_list:
# just deregister the class from the packet collector
packet_collector.PacketCollector().unregister(seen_list.SeenList)
pm = None
pm = plugin.PluginManager()
if load_plugins:
LOG.info("Loading plugins")
pm.setup_plugins(load_help_plugin=False)
else:
LOG.warning(
"Not loading any plugins. Use --load-plugins to load what's "
"defined in the config file.",
)
stats = stats_thread.APRSDStatsStoreThread()
stats.start()
LOG.debug("Create APRSDListenThread")
listen_thread = APRSDListenThread(
packet_queue=threads.packet_queue,
packet_filter=packet_filter,
plugin_manager=pm,
)
LOG.debug("Start APRSDListenThread")
listen_thread.start()
keepalive.start()
LOG.debug("keepalive Join")
keepalive.join()
LOG.debug("listen_thread Join")
listen_thread.join()
stats.join()

174
aprsd/cmds/send_message.py Normal file

@ -0,0 +1,174 @@
import logging
import sys
import time
import aprslib
from aprslib.exceptions import LoginError
import click
from oslo_config import cfg
import aprsd
from aprsd import cli_helper, packets
from aprsd import conf # noqa : F401
from aprsd.client import client_factory
from aprsd.main import cli
from aprsd.packets import collector
from aprsd.threads import tx
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"--aprs-login",
envvar="APRS_LOGIN",
show_envvar=True,
help="What callsign to send the message from. Defaults to config entry.",
)
@click.option(
"--aprs-password",
envvar="APRS_PASSWORD",
show_envvar=True,
help="the APRS-IS password for APRS_LOGIN. Defaults to config entry.",
)
@click.option(
"--no-ack",
"-n",
is_flag=True,
show_default=True,
default=False,
help="Don't wait for an ack, just send it to APRS-IS and bail.",
)
@click.option(
"--wait-response",
"-w",
is_flag=True,
show_default=True,
default=False,
help="Wait for a response to the message?",
)
@click.option("--raw", default=None, help="Send a raw message. Implies --no-ack")
@click.argument("tocallsign", required=True)
@click.argument("command", nargs=-1, required=True)
@click.pass_context
@cli_helper.process_standard_options
def send_message(
ctx,
aprs_login,
aprs_password,
no_ack,
wait_response,
raw,
tocallsign,
command,
):
"""Send a message to a callsign via APRS_IS."""
global got_ack, got_response
quiet = ctx.obj["quiet"]
if not aprs_login:
if CONF.aprs_network.login == conf.client.DEFAULT_LOGIN:
click.echo("Must set --aprs-login or APRS_LOGIN")
ctx.exit(-1)
return
else:
aprs_login = CONF.aprs_network.login
if not aprs_password:
if not CONF.aprs_network.password:
click.echo("Must set --aprs-password or APRS_PASSWORD")
ctx.exit(-1)
return
else:
aprs_password = CONF.aprs_network.password
LOG.info(f"APRSD Send Message Started version: {aprsd.__version__}")
if type(command) is tuple:
command = " ".join(command)
if not quiet:
if raw:
LOG.info(f"L'{aprs_login}' R'{raw}'")
else:
LOG.info(f"L'{aprs_login}' To'{tocallsign}' C'{command}'")
packets.PacketList()
packets.WatchList()
packets.SeenList()
got_ack = False
got_response = False
def rx_packet(packet):
global got_ack, got_response
cl = client_factory.create()
packet = cl.decode_packet(packet)
collector.PacketCollector().rx(packet)
packet.log("RX")
# LOG.debug("Got packet back {}".format(packet))
if isinstance(packet, packets.AckPacket):
got_ack = True
else:
got_response = True
from_call = packet.from_call
our_call = CONF.callsign.lower()
tx.send(
packets.AckPacket(
from_call=our_call,
to_call=from_call,
msgNo=packet.msgNo,
),
direct=True,
)
if got_ack:
if wait_response:
if got_response:
sys.exit(0)
else:
sys.exit(0)
try:
client_factory.create().client
except LoginError:
sys.exit(-1)
# Send a message
# then we setup a consumer to rx messages
# We should get an ack back as well as a new message
# we should bail after we get the ack and send an ack back for the
# message
if raw:
tx.send(
packets.Packet(from_call="", to_call="", raw=raw),
direct=True,
)
sys.exit(0)
else:
tx.send(
packets.MessagePacket(
from_call=aprs_login,
to_call=tocallsign,
message_text=command,
),
direct=True,
)
if no_ack:
sys.exit(0)
try:
# This will register a packet consumer with aprslib
# When new packets come in the consumer will process
# the packet
aprs_client = client_factory.create().client
aprs_client.consumer(rx_packet, raw=False)
except aprslib.exceptions.ConnectionDrop:
LOG.error("Connection dropped, reconnecting")
time.sleep(5)
# Force the deletion of the client object connected to aprs
# This will cause a reconnect, next time client.get_client()
# is called
aprs_client.reset()

142
aprsd/cmds/server.py Normal file

@ -0,0 +1,142 @@
import logging
import signal
import sys
import click
from oslo_config import cfg
import aprsd
from aprsd import cli_helper
from aprsd import main as aprsd_main
from aprsd import packets, plugin, threads, utils
from aprsd.client import client_factory
from aprsd.main import cli
from aprsd.packets import collector as packet_collector
from aprsd.packets import seen_list
from aprsd.threads import keep_alive, log_monitor, registry, rx
from aprsd.threads import stats as stats_thread
from aprsd.threads import tx
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
# main() ###
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"-f",
"--flush",
"flush",
is_flag=True,
show_default=True,
default=False,
help="Flush out all old aged messages on disk.",
)
@click.pass_context
@cli_helper.process_standard_options
def server(ctx, flush):
"""Start the aprsd server gateway process."""
signal.signal(signal.SIGINT, aprsd_main.signal_handler)
signal.signal(signal.SIGTERM, aprsd_main.signal_handler)
level, msg = utils._check_version()
if level:
LOG.warning(msg)
else:
LOG.info(msg)
LOG.info(f"APRSD Started version: {aprsd.__version__}")
# Initialize the client factory and create
# The correct client object ready for use
if not client_factory.is_client_enabled():
LOG.error("No Clients are enabled in config.")
sys.exit(-1)
# Creates the client object
LOG.info("Creating client connection")
aprs_client = client_factory.create()
LOG.info(aprs_client)
# Create the initial PM singleton and Register plugins
# We register plugins first here so we can register each
# plugins config options, so we can dump them all in the
# log file output.
LOG.info("Loading Plugin Manager and registering plugins")
plugin_manager = plugin.PluginManager()
plugin_manager.setup_plugins()
# Dump all the config options now.
CONF.log_opt_values(LOG, logging.DEBUG)
message_plugins = plugin_manager.get_message_plugins()
watchlist_plugins = plugin_manager.get_watchlist_plugins()
LOG.info("Message Plugins enabled and running:")
for p in message_plugins:
LOG.info(p)
LOG.info("Watchlist Plugins enabled and running:")
for p in watchlist_plugins:
LOG.info(p)
# Make sure we have 1 client transport enabled
if not client_factory.is_client_enabled():
LOG.error("No Clients are enabled in config.")
sys.exit(-1)
if not client_factory.is_client_configured():
LOG.error("APRS client is not properly configured in config file.")
sys.exit(-1)
# Now load the msgTrack from disk if any
packets.PacketList()
if flush:
LOG.debug("Deleting saved MsgTrack.")
packets.PacketTrack().flush()
packets.WatchList().flush()
packets.SeenList().flush()
packets.PacketList().flush()
else:
# Try and load saved MsgTrack list
LOG.debug("Loading saved MsgTrack object.")
packets.PacketTrack().load()
packets.WatchList().load()
packets.SeenList().load()
packets.PacketList().load()
keepalive = keep_alive.KeepAliveThread()
keepalive.start()
if not CONF.enable_seen_list:
# just deregister the class from the packet collector
packet_collector.PacketCollector().unregister(seen_list.SeenList)
stats_store_thread = stats_thread.APRSDStatsStoreThread()
stats_store_thread.start()
rx_thread = rx.APRSDPluginRXThread(
packet_queue=threads.packet_queue,
)
process_thread = rx.APRSDPluginProcessPacketThread(
packet_queue=threads.packet_queue,
)
rx_thread.start()
process_thread.start()
if CONF.enable_beacon:
LOG.info("Beacon Enabled. Starting Beacon thread.")
bcn_thread = tx.BeaconSendThread()
bcn_thread.start()
if CONF.aprs_registry.enabled:
LOG.info("Registry Enabled. Starting Registry thread.")
registry_thread = registry.APRSRegistryThread()
registry_thread.start()
if CONF.admin.web_enabled:
log_monitor_thread = log_monitor.LogMonitorThread()
log_monitor_thread.start()
rx_thread.join()
process_thread.join()
return 0

681
aprsd/cmds/webchat.py Normal file

@ -0,0 +1,681 @@
import datetime
import json
import logging
import math
import signal
import sys
import threading
import time
import click
import flask
from flask import request
from flask_httpauth import HTTPBasicAuth
from flask_socketio import Namespace, SocketIO
from geopy.distance import geodesic
from oslo_config import cfg
from werkzeug.security import check_password_hash, generate_password_hash
import wrapt
import aprsd
from aprsd import (
cli_helper, client, packets, plugin_utils, stats, threads, utils,
)
from aprsd.client import client_factory, kiss
from aprsd.main import cli
from aprsd.threads import aprsd as aprsd_threads
from aprsd.threads import keep_alive, rx, tx
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger()
auth = HTTPBasicAuth()
users = {}
socketio = None
# List of callsigns that we don't want to track/fetch their location
callsign_no_track = [
"REPEAT", "WB4BOR-11", "APDW16", "WXNOW", "WXBOT", "BLN0", "BLN1", "BLN2",
"BLN3", "BLN4", "BLN5", "BLN6", "BLN7", "BLN8", "BLN9",
]
# Callsign location information
# callsign: {lat: 0.0, long: 0.0, last_update: datetime}
callsign_locations = {}
flask_app = flask.Flask(
"aprsd",
static_url_path="/static",
static_folder="web/chat/static",
template_folder="web/chat/templates",
)
def signal_handler(sig, frame):
click.echo("signal_handler: called")
LOG.info(
f"Ctrl+C, Sending all threads({len(threads.APRSDThreadList())}) exit! "
f"Can take up to 10 seconds {datetime.datetime.now()}",
)
threads.APRSDThreadList().stop_all()
if "subprocess" not in str(frame):
time.sleep(1.5)
# packets.WatchList().save()
# packets.SeenList().save()
LOG.info(stats.stats_collector.collect())
LOG.info("Telling flask to bail.")
signal.signal(signal.SIGTERM, sys.exit(0))
class SentMessages:
_instance = None
lock = threading.Lock()
data = {}
def __new__(cls, *args, **kwargs):
"""This magic turns this into a singleton."""
if cls._instance is None:
cls._instance = super().__new__(cls)
return cls._instance
def is_initialized(self):
return True
@wrapt.synchronized(lock)
def add(self, msg):
self.data[msg.msgNo] = msg.__dict__
@wrapt.synchronized(lock)
def __len__(self):
return len(self.data.keys())
@wrapt.synchronized(lock)
def get(self, id):
if id in self.data:
return self.data[id]
@wrapt.synchronized(lock)
def get_all(self):
return self.data
@wrapt.synchronized(lock)
def set_status(self, id, status):
if id in self.data:
self.data[id]["last_update"] = str(datetime.datetime.now())
self.data[id]["status"] = status
@wrapt.synchronized(lock)
def ack(self, id):
"""The message got an ack!"""
if id in self.data:
self.data[id]["last_update"] = str(datetime.datetime.now())
self.data[id]["ack"] = True
@wrapt.synchronized(lock)
def reply(self, id, packet):
"""We got a packet back from the sent message."""
if id in self.data:
self.data[id]["reply"] = packet
# HTTPBasicAuth doesn't work on a class method.
# This has to be out here. Rely on the APRSDFlask
# class to initialize the users from the config
@auth.verify_password
def verify_password(username, password):
global users
if username in users and check_password_hash(users[username], password):
return username
def calculate_initial_compass_bearing(point_a, point_b):
"""
Calculates the bearing between two points.
The formula used is the following:
θ = atan2(sin(Δlong).cos(lat2),
cos(lat1).sin(lat2) - sin(lat1).cos(lat2).cos(Δlong))
:Parameters:
- `point_a`: The tuple representing the latitude/longitude for the
first point. Latitude and longitude must be in decimal degrees
- `point_b`: The tuple representing the latitude/longitude for the
second point. Latitude and longitude must be in decimal degrees
:Returns:
The bearing in degrees
:Returns Type:
float
"""
if (type(point_a) is not tuple) or (type(point_b) is not tuple):
raise TypeError("Only tuples are supported as arguments")
lat1 = math.radians(point_a[0])
lat2 = math.radians(point_b[0])
diff_long = math.radians(point_b[1] - point_a[1])
x = math.sin(diff_long) * math.cos(lat2)
y = math.cos(lat1) * math.sin(lat2) - (
math.sin(lat1)
* math.cos(lat2) * math.cos(diff_long)
)
initial_bearing = math.atan2(x, y)
# Now we have the initial bearing but math.atan2 return values
# from -180° to + 180° which is not what we want for a compass bearing
# The solution is to normalize the initial bearing as shown below
initial_bearing = math.degrees(initial_bearing)
compass_bearing = (initial_bearing + 360) % 360
return compass_bearing
def _build_location_from_repeat(message):
# This is a location message Format is
# ^ld^callsign:latitude,longitude,altitude,course,speed,timestamp
a = message.split(":")
LOG.warning(a)
if len(a) == 2:
callsign = a[0].replace("^ld^", "")
b = a[1].split(",")
LOG.warning(b)
if len(b) == 6:
lat = float(b[0])
lon = float(b[1])
alt = float(b[2])
course = float(b[3])
speed = float(b[4])
time = int(b[5])
data = {
"callsign": callsign,
"lat": lat,
"lon": lon,
"altitude": alt,
"course": course,
"speed": speed,
"lasttime": time,
}
LOG.warning(f"Location data from REPEAT {data}")
return data
def _calculate_location_data(location_data):
"""Calculate all of the location data from data from aprs.fi or REPEAT."""
lat = location_data["lat"]
lon = location_data["lon"]
alt = location_data["altitude"]
speed = location_data["speed"]
lasttime = location_data["lasttime"]
# now calculate distance from our own location
distance = 0
if CONF.webchat.latitude and CONF.webchat.longitude:
our_lat = float(CONF.webchat.latitude)
our_lon = float(CONF.webchat.longitude)
distance = geodesic((our_lat, our_lon), (lat, lon)).kilometers
bearing = calculate_initial_compass_bearing(
(our_lat, our_lon),
(lat, lon),
)
return {
"callsign": location_data["callsign"],
"lat": lat,
"lon": lon,
"altitude": alt,
"course": f"{bearing:0.1f}",
"speed": speed,
"lasttime": lasttime,
"distance": f"{distance:0.3f}",
}
def send_location_data_to_browser(location_data):
global socketio
callsign = location_data["callsign"]
LOG.info(f"Got location for {callsign} {callsign_locations[callsign]}")
socketio.emit(
"callsign_location", callsign_locations[callsign],
namespace="/sendmsg",
)
def populate_callsign_location(callsign, data=None):
"""Populate the location for the callsign.
if data is passed in, then we have the location already from
an APRS packet. If data is None, then we need to fetch the
location from aprs.fi or REPEAT.
"""
global socketio
"""Fetch the location for the callsign."""
LOG.debug(f"populate_callsign_location {callsign}")
if data:
location_data = _calculate_location_data(data)
callsign_locations[callsign] = location_data
send_location_data_to_browser(location_data)
return
# First we are going to try to get the location from aprs.fi
# if there is no internets, then this will fail and we will
# fallback to calling REPEAT for the location for the callsign.
fallback = False
if not CONF.aprs_fi.apiKey:
LOG.warning(
"Config aprs_fi.apiKey is not set. Can't get location from aprs.fi "
" falling back to sending REPEAT to get location.",
)
fallback = True
else:
try:
aprs_data = plugin_utils.get_aprs_fi(CONF.aprs_fi.apiKey, callsign)
if not len(aprs_data["entries"]):
LOG.error("Didn't get any entries from aprs.fi")
return
lat = float(aprs_data["entries"][0]["lat"])
lon = float(aprs_data["entries"][0]["lng"])
try: # altitude not always provided
alt = float(aprs_data["entries"][0]["altitude"])
except Exception:
alt = 0
location_data = {
"callsign": callsign,
"lat": lat,
"lon": lon,
"altitude": alt,
"lasttime": int(aprs_data["entries"][0]["lasttime"]),
"course": float(aprs_data["entries"][0].get("course", 0)),
"speed": float(aprs_data["entries"][0].get("speed", 0)),
}
location_data = _calculate_location_data(location_data)
callsign_locations[callsign] = location_data
send_location_data_to_browser(location_data)
return
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi '{ex}'")
LOG.error(ex)
fallback = True
if fallback:
# We don't have the location data
# and we can't get it from aprs.fi
# Send a special message to REPEAT to get the location data
LOG.info(f"Sending REPEAT to get location for callsign {callsign}.")
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
to_call="REPEAT",
message_text=f"ld {callsign}",
),
)
class WebChatProcessPacketThread(rx.APRSDProcessPacketThread):
"""Class that handles packets being sent to us."""
def __init__(self, packet_queue, socketio):
self.socketio = socketio
self.connected = False
super().__init__(packet_queue)
def process_ack_packet(self, packet: packets.AckPacket):
super().process_ack_packet(packet)
ack_num = packet.get("msgNo")
SentMessages().ack(ack_num)
msg = SentMessages().get(ack_num)
if msg:
self.socketio.emit(
"ack", msg,
namespace="/sendmsg",
)
self.got_ack = True
def process_our_message_packet(self, packet: packets.MessagePacket):
global callsign_locations
# ok lets see if we have the location for the
# person we just sent a message to.
from_call = packet.get("from_call").upper()
if from_call == "REPEAT":
# We got a message from REPEAT. Is this a location message?
message = packet.get("message_text")
if message.startswith("^ld^"):
location_data = _build_location_from_repeat(message)
callsign = location_data["callsign"]
location_data = _calculate_location_data(location_data)
callsign_locations[callsign] = location_data
send_location_data_to_browser(location_data)
return
elif (
from_call not in callsign_locations
and from_call not in callsign_no_track
):
# We have to ask aprs for the location for the callsign
# We send a message packet to wb4bor-11 asking for location.
populate_callsign_location(from_call)
# Send the packet to the browser.
self.socketio.emit(
"new", packet.__dict__,
namespace="/sendmsg",
)
class LocationProcessingThread(aprsd_threads.APRSDThread):
"""Class to handle the location processing."""
def __init__(self):
super().__init__("LocationProcessingThread")
def loop(self):
pass
def set_config():
global users
def _get_transport(stats):
if CONF.aprs_network.enabled:
transport = "aprs-is"
aprs_connection = (
"APRS-IS Server: <a href='http://status.aprs2.net' >"
"{}</a>".format(stats["APRSClientStats"]["server_string"])
)
elif kiss.KISSClient.is_enabled():
transport = kiss.KISSClient.transport()
if transport == client.TRANSPORT_TCPKISS:
aprs_connection = (
"TCPKISS://{}:{}".format(
CONF.kiss_tcp.host,
CONF.kiss_tcp.port,
)
)
elif transport == client.TRANSPORT_SERIALKISS:
# for pep8 violation
aprs_connection = (
"SerialKISS://{}@{} baud".format(
CONF.kiss_serial.device,
CONF.kiss_serial.baudrate,
),
)
elif CONF.fake_client.enabled:
transport = client.TRANSPORT_FAKE
aprs_connection = "Fake Client"
return transport, aprs_connection
@flask_app.route("/location/<callsign>", methods=["POST"])
def location(callsign):
LOG.debug(f"Fetch location for callsign {callsign}")
populate_callsign_location(callsign)
@auth.login_required
@flask_app.route("/")
def index():
stats = _stats()
# For development
html_template = "index.html"
LOG.debug(f"Template {html_template}")
transport, aprs_connection = _get_transport(stats["stats"])
LOG.debug(f"transport {transport} aprs_connection {aprs_connection}")
stats["transport"] = transport
stats["aprs_connection"] = aprs_connection
LOG.debug(f"initial stats = {stats}")
latitude = CONF.webchat.latitude
if latitude:
latitude = float(CONF.webchat.latitude)
longitude = CONF.webchat.longitude
if longitude:
longitude = float(longitude)
return flask.render_template(
html_template,
initial_stats=stats,
aprs_connection=aprs_connection,
callsign=CONF.callsign,
version=aprsd.__version__,
latitude=latitude,
longitude=longitude,
)
@auth.login_required
@flask_app.route("/send-message-status")
def send_message_status():
LOG.debug(request)
msgs = SentMessages()
info = msgs.get_all()
return json.dumps(info)
def _stats():
now = datetime.datetime.now()
time_format = "%m-%d-%Y %H:%M:%S"
stats_dict = stats.stats_collector.collect(serializable=True)
# Webchat doesn't need these
if "WatchList" in stats_dict:
del stats_dict["WatchList"]
if "SeenList" in stats_dict:
del stats_dict["SeenList"]
if "APRSDThreadList" in stats_dict:
del stats_dict["APRSDThreadList"]
if "PacketList" in stats_dict:
del stats_dict["PacketList"]
if "EmailStats" in stats_dict:
del stats_dict["EmailStats"]
if "PluginManager" in stats_dict:
del stats_dict["PluginManager"]
result = {
"time": now.strftime(time_format),
"stats": stats_dict,
}
return result
@flask_app.route("/stats")
def get_stats():
return json.dumps(_stats())
class SendMessageNamespace(Namespace):
"""Class to handle the socketio interactions."""
got_ack = False
reply_sent = False
msg = None
request = None
def __init__(self, namespace=None, config=None):
super().__init__(namespace)
def on_connect(self):
global socketio
LOG.debug("Web socket connected")
socketio.emit(
"connected", {"data": "/sendmsg Connected"},
namespace="/sendmsg",
)
def on_disconnect(self):
LOG.debug("WS Disconnected")
def on_send(self, data):
global socketio
LOG.debug(f"WS: on_send {data}")
self.request = data
data["from"] = CONF.callsign
path = data.get("path", None)
if not path:
path = []
elif "," in path:
path_opts = path.split(",")
path = [x.strip() for x in path_opts]
else:
path = [path]
pkt = packets.MessagePacket(
from_call=data["from"],
to_call=data["to"].upper(),
message_text=data["message"],
path=path,
)
pkt.prepare()
self.msg = pkt
msgs = SentMessages()
msgs.add(pkt)
tx.send(pkt)
msgs.set_status(pkt.msgNo, "Sending")
obj = msgs.get(pkt.msgNo)
socketio.emit(
"sent", obj,
namespace="/sendmsg",
)
def on_gps(self, data):
LOG.debug(f"WS on_GPS: {data}")
lat = data["latitude"]
long = data["longitude"]
LOG.debug(f"Lat {lat}")
LOG.debug(f"Long {long}")
path = data.get("path", None)
if not path:
path = []
elif "," in path:
path_opts = path.split(",")
path = [x.strip() for x in path_opts]
else:
path = [path]
tx.send(
packets.BeaconPacket(
from_call=CONF.callsign,
to_call="APDW16",
latitude=lat,
longitude=long,
comment="APRSD WebChat Beacon",
path=path,
),
direct=True,
)
def handle_message(self, data):
LOG.debug(f"WS Data {data}")
def handle_json(self, data):
LOG.debug(f"WS json {data}")
def on_get_callsign_location(self, data):
LOG.debug(f"on_callsign_location {data}")
populate_callsign_location(data["callsign"])
@trace.trace
def init_flask(loglevel, quiet):
global socketio, flask_app
socketio = SocketIO(
flask_app, logger=False, engineio_logger=False,
async_mode="threading",
)
socketio.on_namespace(
SendMessageNamespace(
"/sendmsg",
),
)
return socketio
# main() ###
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.option(
"-f",
"--flush",
"flush",
is_flag=True,
show_default=True,
default=False,
help="Flush out all old aged messages on disk.",
)
@click.option(
"-p",
"--port",
"port",
show_default=True,
default=None,
help="Port to listen to web requests. This overrides the config.webchat.web_port setting.",
)
@click.pass_context
@cli_helper.process_standard_options
def webchat(ctx, flush, port):
"""Web based HAM Radio chat program!"""
loglevel = ctx.obj["loglevel"]
quiet = ctx.obj["quiet"]
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGTERM, signal_handler)
level, msg = utils._check_version()
if level:
LOG.warning(msg)
else:
LOG.info(msg)
LOG.info(f"APRSD Started version: {aprsd.__version__}")
CONF.log_opt_values(logging.getLogger(), logging.DEBUG)
user = CONF.admin.user
users[user] = generate_password_hash(CONF.admin.password)
if not port:
port = CONF.webchat.web_port
# Initialize the client factory and create
# The correct client object ready for use
# Make sure we have 1 client transport enabled
if not client_factory.is_client_enabled():
LOG.error("No Clients are enabled in config.")
sys.exit(-1)
if not client_factory.is_client_configured():
LOG.error("APRS client is not properly configured in config file.")
sys.exit(-1)
packets.PacketList()
packets.PacketTrack()
packets.WatchList()
packets.SeenList()
keepalive = keep_alive.KeepAliveThread()
LOG.info("Start KeepAliveThread")
keepalive.start()
socketio = init_flask(loglevel, quiet)
rx_thread = rx.APRSDPluginRXThread(
packet_queue=threads.packet_queue,
)
rx_thread.start()
process_thread = WebChatProcessPacketThread(
packet_queue=threads.packet_queue,
socketio=socketio,
)
process_thread.start()
LOG.info("Start socketio.run()")
socketio.run(
flask_app,
# This is broken for now after removing cryptography
# and pyopenssl
# ssl_context="adhoc",
host=CONF.webchat.web_ip,
port=port,
allow_unsafe_werkzeug=True,
)
LOG.info("WebChat exiting!!!! Bye.")
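The ^ld^ reply format parsed by _build_location_from_repeat() above (callsign, then lat, lon, altitude, course, speed and a timestamp, comma separated) is easiest to see with a made-up message. The callsign and values below are purely illustrative, and importing the module assumes the webchat dependencies (flask, flask-socketio, and friends) are installed:

from aprsd.cmds.webchat import _build_location_from_repeat

raw = "^ld^KD7XXX-9:45.5231,-122.6765,100.0,180.0,25.0,1716400000"
data = _build_location_from_repeat(raw)
# -> {"callsign": "KD7XXX-9", "lat": 45.5231, "lon": -122.6765,
#     "altitude": 100.0, "course": 180.0, "speed": 25.0, "lasttime": 1716400000}
print(data)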

56
aprsd/conf/__init__.py Normal file

@ -0,0 +1,56 @@
from oslo_config import cfg
from aprsd.conf import client, common, log, plugin_common, plugin_email
CONF = cfg.CONF
log.register_opts(CONF)
common.register_opts(CONF)
client.register_opts(CONF)
# plugins
plugin_common.register_opts(CONF)
plugin_email.register_opts(CONF)
def set_lib_defaults():
"""Update default value for configuration options from other namespace.
Example, oslo lib config options. This is needed for
config generator tool to pick these default value changes.
https://docs.openstack.org/oslo.config/latest/cli/
generator.html#modifying-defaults-from-other-namespaces
"""
# Update default value of oslo_log default_log_levels and
# logging_context_format_string config option.
set_log_defaults()
def set_log_defaults():
# log.set_defaults(default_log_levels=log.get_default_log_levels())
pass
def conf_to_dict():
"""Convert the CONF options to a single level dictionary."""
entries = {}
def _sanitize(opt, value):
"""Obfuscate values of options declared secret."""
return value if not opt.secret else "*" * 4
for opt_name in sorted(CONF._opts):
opt = CONF._get_opt_info(opt_name)["opt"]
val = str(_sanitize(opt, getattr(CONF, opt_name)))
entries[str(opt)] = val
for group_name in list(CONF._groups):
group_attr = CONF.GroupAttr(CONF, CONF._get_group(group_name))
for opt_name in sorted(CONF._groups[group_name]._opts):
opt = CONF._get_opt_info(opt_name, group_name)["opt"]
val = str(_sanitize(opt, getattr(group_attr, opt_name)))
gname_opt_name = f"{group_name}.{opt_name}"
entries[gname_opt_name] = val
return entries
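conf_to_dict() is mostly useful for dumping the effective configuration with secrets masked; a small sketch, where the callsign override just satisfies the required option so the dump can run without a config file:

from oslo_config import cfg
from aprsd import conf

cfg.CONF.set_override("callsign", "N0CALL")       # illustrative placeholder
flat = conf.conf_to_dict()
# Grouped options are keyed as "<group>.<option>"; secret ones come back as "****".
print(flat["aprs_network.password"])              # '****'
print(flat["aprs_network.host"])                  # 'noam.aprs2.net' by default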

131
aprsd/conf/client.py Normal file

@ -0,0 +1,131 @@
"""
The options for log setup
"""
from oslo_config import cfg
DEFAULT_LOGIN = "NOCALL"
aprs_group = cfg.OptGroup(
name="aprs_network",
title="APRS-IS Network settings",
)
kiss_serial_group = cfg.OptGroup(
name="kiss_serial",
title="KISS Serial device connection",
)
kiss_tcp_group = cfg.OptGroup(
name="kiss_tcp",
title="KISS TCP/IP Device connection",
)
fake_client_group = cfg.OptGroup(
name="fake_client",
title="Fake Client settings",
)
aprs_opts = [
cfg.BoolOpt(
"enabled",
default=True,
help="Set enabled to False if there is no internet connectivity. "
"This is useful for a direwolf KISS aprs connection only.",
),
cfg.StrOpt(
"login",
default=DEFAULT_LOGIN,
help="APRS Username",
),
cfg.StrOpt(
"password",
secret=True,
help="APRS Password "
"Get the passcode for your callsign here: "
"https://apps.magicbug.co.uk/passcode",
),
cfg.HostAddressOpt(
"host",
default="noam.aprs2.net",
help="The APRS-IS hostname",
),
cfg.PortOpt(
"port",
default=14580,
help="APRS-IS port",
),
]
kiss_serial_opts = [
cfg.BoolOpt(
"enabled",
default=False,
help="Enable Serial KISS interface connection.",
),
cfg.StrOpt(
"device",
help="Serial device file to use, e.g. /dev/ttyS0",
),
cfg.IntOpt(
"baudrate",
default=9600,
help="The Serial device baud rate for communication",
),
cfg.ListOpt(
"path",
default=["WIDE1-1", "WIDE2-1"],
help="The APRS path to use for wide area coverage.",
),
]
kiss_tcp_opts = [
cfg.BoolOpt(
"enabled",
default=False,
help="Enable KISS TCP interface connection.",
),
cfg.HostAddressOpt(
"host",
help="The KISS TCP Host to connect to.",
),
cfg.PortOpt(
"port",
default=8001,
help="The KISS TCP/IP network port",
),
cfg.ListOpt(
"path",
default=["WIDE1-1", "WIDE2-1"],
help="The APRS path to use for wide area coverage.",
),
]
fake_client_opts = [
cfg.BoolOpt(
"enabled",
default=False,
help="Enable fake client connection.",
),
]
def register_opts(config):
config.register_group(aprs_group)
config.register_opts(aprs_opts, group=aprs_group)
config.register_group(kiss_serial_group)
config.register_group(kiss_tcp_group)
config.register_opts(kiss_serial_opts, group=kiss_serial_group)
config.register_opts(kiss_tcp_opts, group=kiss_tcp_group)
config.register_group(fake_client_group)
config.register_opts(fake_client_opts, group=fake_client_group)
def list_opts():
return {
aprs_group.name: aprs_opts,
kiss_serial_group.name: kiss_serial_opts,
kiss_tcp_group.name: kiss_tcp_opts,
fake_client_group.name: fake_client_opts,
}

302
aprsd/conf/common.py Normal file

@ -0,0 +1,302 @@
from pathlib import Path
from oslo_config import cfg
home = str(Path.home())
DEFAULT_CONFIG_DIR = f"{home}/.config/aprsd/"
APRSD_DEFAULT_MAGIC_WORD = "CHANGEME!!!"
admin_group = cfg.OptGroup(
name="admin",
title="Admin web interface settings",
)
watch_list_group = cfg.OptGroup(
name="watch_list",
title="Watch List settings",
)
webchat_group = cfg.OptGroup(
name="webchat",
title="Settings specific to the webchat command",
)
registry_group = cfg.OptGroup(
name="aprs_registry",
title="APRS Registry settings",
)
aprsd_opts = [
cfg.StrOpt(
"callsign",
required=True,
help="Callsign to use for messages sent by APRSD",
),
cfg.BoolOpt(
"enable_save",
default=True,
help="Enable saving of watch list, packet tracker between restarts.",
),
cfg.StrOpt(
"save_location",
default=DEFAULT_CONFIG_DIR,
help="Save location for packet tracking files.",
),
cfg.BoolOpt(
"trace_enabled",
default=False,
help="Enable code tracing",
),
cfg.StrOpt(
"units",
default="imperial",
help="Units for display, imperial or metric",
),
cfg.IntOpt(
"ack_rate_limit_period",
default=1,
help="The wait period in seconds per Ack packet being sent. "
"1 means 1 ack packet per second allowed. "
"2 means 1 ack packet every 2 seconds allowed.",
),
cfg.IntOpt(
"msg_rate_limit_period",
default=2,
help="Wait period in seconds per non AckPacket being sent. "
"2 means 1 packet every 2 seconds allowed. "
"5 means 1 packet every 5 seconds allowed.",
),
cfg.IntOpt(
"packet_dupe_timeout",
default=300,
help="The number of seconds before a packet is not considered a duplicate.",
),
cfg.BoolOpt(
"enable_beacon",
default=False,
help="Enable sending of a GPS Beacon packet to locate this service. "
"Requires latitude and longitude to be set.",
),
cfg.IntOpt(
"beacon_interval",
default=1800,
help="The number of seconds between beacon packets.",
),
cfg.StrOpt(
"beacon_symbol",
default="/",
help="The symbol to use for the GPS Beacon packet. See: http://www.aprs.net/vm/DOS/SYMBOLS.HTM",
),
cfg.StrOpt(
"latitude",
default=None,
help="Latitude for the GPS Beacon button. If not set, the button will not be enabled.",
),
cfg.StrOpt(
"longitude",
default=None,
help="Longitude for the GPS Beacon button. If not set, the button will not be enabled.",
),
cfg.StrOpt(
"log_packet_format",
choices=["compact", "multiline", "both"],
default="compact",
help="When logging packets 'compact' will use a single line formatted for each packet. "
"'multiline' will use multiple lines for each packet and is the traditional format. "
"'both' will log both compact and multiline.",
),
cfg.IntOpt(
"default_packet_send_count",
default=3,
help="The number of times to send a non ack packet before giving up.",
),
cfg.IntOpt(
"default_ack_send_count",
default=3,
help="The number of times to send an ack packet in response to receiving a packet.",
),
cfg.IntOpt(
"packet_list_maxlen",
default=100,
help="The maximum number of packets to store in the packet list.",
),
cfg.IntOpt(
"packet_list_stats_maxlen",
default=20,
help="The maximum number of packets to send in the stats dict for admin ui.",
),
cfg.BoolOpt(
"enable_seen_list",
default=True,
help="Enable the Callsign seen list tracking feature. This allows aprsd to keep track of "
"callsigns that have been seen and when they were last seen.",
),
cfg.BoolOpt(
"enable_packet_logging",
default=True,
help="Set this to False, to disable logging of packets to the log file.",
),
]
watch_list_opts = [
cfg.BoolOpt(
"enabled",
default=False,
help="Enable the watch list feature. Still have to enable "
"the correct plugin. Built-in plugin to use is "
"aprsd.plugins.notify.NotifyPlugin",
),
cfg.ListOpt(
"callsigns",
help="Callsigns to watch for messsages",
),
cfg.StrOpt(
"alert_callsign",
help="The Ham Callsign to send messages to for watch list alerts.",
),
cfg.IntOpt(
"packet_keep_count",
default=10,
help="The number of packets to store.",
),
cfg.IntOpt(
"alert_time_seconds",
default=3600,
help="Time to wait before alert is sent on new message for "
"users in callsigns.",
),
]
admin_opts = [
cfg.BoolOpt(
"web_enabled",
default=False,
help="Enable the Admin Web Interface",
),
cfg.StrOpt(
"web_ip",
default="0.0.0.0",
help="The ip address to listen on",
),
cfg.PortOpt(
"web_port",
default=8001,
help="The port to listen on",
),
cfg.StrOpt(
"user",
default="admin",
help="The admin user for the admin web interface",
),
cfg.StrOpt(
"password",
default="password",
secret=True,
help="Admin interface password",
),
]
enabled_plugins_opts = [
cfg.ListOpt(
"enabled_plugins",
default=[
"aprsd.plugins.email.EmailPlugin",
"aprsd.plugins.fortune.FortunePlugin",
"aprsd.plugins.location.LocationPlugin",
"aprsd.plugins.ping.PingPlugin",
"aprsd.plugins.query.QueryPlugin",
"aprsd.plugins.time.TimePlugin",
"aprsd.plugins.weather.OWMWeatherPlugin",
"aprsd.plugins.version.VersionPlugin",
"aprsd.plugins.notify.NotifySeenPlugin",
],
help="Comma separated list of enabled plugins for APRSD."
"To enable installed external plugins add them here."
"The full python path to the class name must be used",
),
]
webchat_opts = [
cfg.StrOpt(
"web_ip",
default="0.0.0.0",
help="The ip address to listen on",
),
cfg.PortOpt(
"web_port",
default=8001,
help="The port to listen on",
),
cfg.StrOpt(
"latitude",
default=None,
help="Latitude for the GPS Beacon button. If not set, the button will not be enabled.",
),
cfg.StrOpt(
"longitude",
default=None,
help="Longitude for the GPS Beacon button. If not set, the button will not be enabled.",
),
cfg.BoolOpt(
"disable_url_request_logging",
default=False,
help="Disable the logging of url requests in the webchat command.",
),
]
registry_opts = [
cfg.BoolOpt(
"enabled",
default=False,
help="Enable sending aprs registry information. This will let the "
"APRS registry know about your service and it's uptime. "
"No personal information is sent, just the callsign, uptime and description. "
"The service callsign is the callsign set in [DEFAULT] section.",
),
cfg.StrOpt(
"description",
default=None,
help="Description of the service to send to the APRS registry. "
"This is what will show up in the APRS registry."
"If not set, the description will be the same as the callsign.",
),
cfg.StrOpt(
"registry_url",
default="https://aprs.hemna.com/api/v1/registry",
help="The APRS registry domain name to send the information to.",
),
cfg.StrOpt(
"service_website",
default=None,
help="The website for your APRS service to send to the APRS registry.",
),
cfg.IntOpt(
"frequency_seconds",
default=3600,
help="The frequency in seconds to send the APRS registry information.",
),
]
def register_opts(config):
config.register_opts(aprsd_opts)
config.register_opts(enabled_plugins_opts)
config.register_group(admin_group)
config.register_opts(admin_opts, group=admin_group)
config.register_group(watch_list_group)
config.register_opts(watch_list_opts, group=watch_list_group)
config.register_group(webchat_group)
config.register_opts(webchat_opts, group=webchat_group)
config.register_group(registry_group)
config.register_opts(registry_opts, group=registry_group)
def list_opts():
return {
"DEFAULT": (aprsd_opts + enabled_plugins_opts),
admin_group.name: admin_opts,
watch_list_group.name: watch_list_opts,
webchat_group.name: webchat_opts,
registry_group.name: registry_opts,
}
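
As a side note, here is a minimal sketch (not taken from this diff) of how the options above are typically registered and read with oslo.config; the override value "N0CALL" is purely illustrative:

from oslo_config import cfg

from aprsd.conf import common

CONF = cfg.CONF
common.register_opts(CONF)

# Parse with defaults only; a real deployment supplies a config file.
# "callsign" is required, so give it a value before reading it.
CONF([], project="aprsd")
CONF.set_override("callsign", "N0CALL")

print(CONF.callsign)            # N0CALL
print(CONF.watch_list.enabled)  # False, the default defined above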

aprsd/conf/log.py (new file)
@@ -0,0 +1,65 @@
"""
The options for log setup
"""
import logging
from oslo_config import cfg
LOG_LEVELS = {
"CRITICAL": logging.CRITICAL,
"ERROR": logging.ERROR,
"WARNING": logging.WARNING,
"INFO": logging.INFO,
"DEBUG": logging.DEBUG,
}
DEFAULT_DATE_FORMAT = "%m/%d/%Y %I:%M:%S %p"
DEFAULT_LOG_FORMAT = (
"[%(asctime)s] [%(threadName)-20.20s] [%(levelname)-5.5s]"
" %(message)s - [%(pathname)s:%(lineno)d]"
)
DEFAULT_LOG_FORMAT = (
"<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | "
"<yellow>{thread.name: <18}</yellow> | "
"<level>{level: <8}</level> | "
"<level>{message}</level> | "
"<cyan>{name}</cyan>:<cyan>{function:}</cyan>:<magenta>{line:}</magenta>"
)
logging_group = cfg.OptGroup(
name="logging",
title="Logging options",
)
logging_opts = [
cfg.StrOpt(
"logfile",
default=None,
help="File to log to",
),
cfg.StrOpt(
"logformat",
default=DEFAULT_LOG_FORMAT,
help="Log file format, unless rich_logging enabled.",
),
cfg.StrOpt(
"log_level",
default="INFO",
choices=LOG_LEVELS.keys(),
help="Log level for logging of events.",
),
]
def register_opts(config):
config.register_group(logging_group)
config.register_opts(logging_opts, group=logging_group)
def list_opts():
return {
logging_group.name: (
logging_opts
),
}

aprsd/conf/opts.py (new file)
@@ -0,0 +1,80 @@
# Copyright 2015 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
This is the single point of entry to generate the sample configuration
file for aprsd. It collects all the necessary info from the other modules
in this package. It is assumed that:
* every other module in this package has a 'list_opts' function which
returns a dict where
* the keys are strings which are the group names
* the value of each key is a list of config options for that group
* the aprsd.conf package doesn't have further packages with config options
* this module is only used in the context of sample file generation
"""
import collections
import importlib
import os
import pkgutil
LIST_OPTS_FUNC_NAME = "list_opts"
def _tupleize(dct):
"""Take the dict of options and convert to the 2-tuple format."""
return [(key, val) for key, val in dct.items()]
def list_opts():
opts = collections.defaultdict(list)
module_names = _list_module_names()
imported_modules = _import_modules(module_names)
_append_config_options(imported_modules, opts)
return _tupleize(opts)
def _list_module_names():
module_names = []
package_path = os.path.dirname(os.path.abspath(__file__))
for _, modname, ispkg in pkgutil.iter_modules(path=[package_path]):
if modname == "opts" or ispkg:
continue
else:
module_names.append(modname)
return module_names
def _import_modules(module_names):
imported_modules = []
for modname in module_names:
mod = importlib.import_module("aprsd.conf." + modname)
if not hasattr(mod, LIST_OPTS_FUNC_NAME):
msg = "The module 'aprsd.conf.%s' should have a '%s' "\
"function which returns the config options." % \
(modname, LIST_OPTS_FUNC_NAME)
raise Exception(msg)
else:
imported_modules.append(mod)
return imported_modules
def _append_config_options(imported_modules, config_options):
for mod in imported_modules:
configs = mod.list_opts()
for key, val in configs.items():
config_options[key].extend(val)
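
A minimal sketch of what this aggregation yields, assuming the package layout above (each entry pairs a group name with the options reported by that module's list_opts):

from aprsd.conf import opts

# Every tuple is (group_name, [Opt, ...]); "DEFAULT" holds the ungrouped options.
for group_name, group_opts in opts.list_opts():
    print(group_name, [o.name for o in group_opts])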

aprsd/conf/plugin_common.py (new file)
@@ -0,0 +1,191 @@
from oslo_config import cfg
aprsfi_group = cfg.OptGroup(
name="aprs_fi",
title="APRS.FI website settings",
)
query_group = cfg.OptGroup(
name="query_plugin",
title="Options for the Query Plugin",
)
avwx_group = cfg.OptGroup(
name="avwx_plugin",
title="Options for the AVWXWeatherPlugin",
)
owm_wx_group = cfg.OptGroup(
name="owm_weather_plugin",
title="Options for the OWMWeatherPlugin",
)
location_group = cfg.OptGroup(
name="location_plugin",
title="Options for the LocationPlugin",
)
aprsfi_opts = [
cfg.StrOpt(
"apiKey",
help="Get the apiKey from your aprs.fi account here:"
"http://aprs.fi/account",
),
]
query_plugin_opts = [
cfg.StrOpt(
"callsign",
help="The Ham callsign to allow access to the query plugin from RF.",
),
]
owm_wx_opts = [
cfg.StrOpt(
"apiKey",
help="OWMWeatherPlugin api key to OpenWeatherMap's API."
"This plugin uses the openweathermap API to fetch"
"location and weather information."
"To use this plugin you need to get an openweathermap"
"account and apikey."
"https://home.openweathermap.org/api_keys",
),
]
avwx_opts = [
cfg.StrOpt(
"apiKey",
help="avwx-api is an opensource project that has"
"a hosted service here: https://avwx.rest/"
"You can launch your own avwx-api in a container"
"by cloning the githug repo here:"
"https://github.com/avwx-rest/AVWX-API",
),
cfg.StrOpt(
"base_url",
default="https://avwx.rest",
help="The base url for the avwx API. If you are hosting your own"
"Here is where you change the url to point to yours.",
),
]
location_opts = [
cfg.StrOpt(
"geopy_geocoder",
choices=[
"ArcGIS", "AzureMaps", "Baidu", "Bing", "GoogleV3", "HERE",
"Nominatim", "OpenCage", "TomTom", "USGov", "What3Words", "Woosmap",
],
default="Nominatim",
help="The geopy geocoder to use. Default is Nominatim."
"See https://geopy.readthedocs.io/en/stable/#module-geopy.geocoders"
"for more information.",
),
cfg.StrOpt(
"user_agent",
default="APRSD",
help="The user agent to use for the Nominatim geocoder."
"See https://geopy.readthedocs.io/en/stable/#module-geopy.geocoders"
"for more information.",
),
cfg.StrOpt(
"arcgis_username",
default=None,
help="The username to use for the ArcGIS geocoder."
"See https://geopy.readthedocs.io/en/latest/#arcgis"
"for more information."
"Only used for the ArcGIS geocoder.",
),
cfg.StrOpt(
"arcgis_password",
default=None,
help="The password to use for the ArcGIS geocoder."
"See https://geopy.readthedocs.io/en/latest/#arcgis"
"for more information."
"Only used for the ArcGIS geocoder.",
),
cfg.StrOpt(
"azuremaps_subscription_key",
help="The subscription key to use for the AzureMaps geocoder."
"See https://geopy.readthedocs.io/en/latest/#azuremaps"
"for more information."
"Only used for the AzureMaps geocoder.",
),
cfg.StrOpt(
"baidu_api_key",
help="The API key to use for the Baidu geocoder."
"See https://geopy.readthedocs.io/en/latest/#baidu"
"for more information."
"Only used for the Baidu geocoder.",
),
cfg.StrOpt(
"bing_api_key",
help="The API key to use for the Bing geocoder."
"See https://geopy.readthedocs.io/en/latest/#bing"
"for more information."
"Only used for the Bing geocoder.",
),
cfg.StrOpt(
"google_api_key",
help="The API key to use for the Google geocoder."
"See https://geopy.readthedocs.io/en/latest/#googlev3"
"for more information."
"Only used for the Google geocoder.",
),
cfg.StrOpt(
"here_api_key",
help="The API key to use for the HERE geocoder."
"See https://geopy.readthedocs.io/en/latest/#here"
"for more information."
"Only used for the HERE geocoder.",
),
cfg.StrOpt(
"opencage_api_key",
help="The API key to use for the OpenCage geocoder."
"See https://geopy.readthedocs.io/en/latest/#opencage"
"for more information."
"Only used for the OpenCage geocoder.",
),
cfg.StrOpt(
"tomtom_api_key",
help="The API key to use for the TomTom geocoder."
"See https://geopy.readthedocs.io/en/latest/#tomtom"
"for more information."
"Only used for the TomTom geocoder.",
),
cfg.StrOpt(
"what3words_api_key",
help="The API key to use for the What3Words geocoder."
"See https://geopy.readthedocs.io/en/latest/#what3words"
"for more information."
"Only used for the What3Words geocoder.",
),
cfg.StrOpt(
"woosmap_api_key",
help="The API key to use for the Woosmap geocoder."
"See https://geopy.readthedocs.io/en/latest/#woosmap"
"for more information."
"Only used for the Woosmap geocoder.",
),
]
def register_opts(config):
config.register_group(aprsfi_group)
config.register_opts(aprsfi_opts, group=aprsfi_group)
config.register_group(query_group)
config.register_opts(query_plugin_opts, group=query_group)
config.register_group(owm_wx_group)
config.register_opts(owm_wx_opts, group=owm_wx_group)
config.register_group(avwx_group)
config.register_opts(avwx_opts, group=avwx_group)
config.register_group(location_group)
config.register_opts(location_opts, group=location_group)
def list_opts():
return {
aprsfi_group.name: aprsfi_opts,
query_group.name: query_plugin_opts,
owm_wx_group.name: owm_wx_opts,
avwx_group.name: avwx_opts,
location_group.name: location_opts,
}
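
For illustration only, a short sketch of reading the plugin option groups registered above (defaults shown; not part of the diff):

from oslo_config import cfg

from aprsd.conf import plugin_common

CONF = cfg.CONF
plugin_common.register_opts(CONF)
CONF([], project="aprsd")

print(CONF.location_plugin.geopy_geocoder)  # Nominatim, the default above
print(CONF.aprs_fi.apiKey)                  # None until set in the config file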

aprsd/conf/plugin_email.py (new file)
@@ -0,0 +1,105 @@
from oslo_config import cfg
email_group = cfg.OptGroup(
name="email_plugin",
title="Options for the APRSD Email plugin",
)
email_opts = [
cfg.StrOpt(
"callsign",
help="(Required) Callsign to validate for doing email commands."
"Only this callsign can check email. This is also where the "
"email notifications for new emails will be sent.",
),
cfg.BoolOpt(
"enabled",
default=False,
help="Enable the Email plugin?",
),
cfg.BoolOpt(
"debug",
default=False,
help="Enable the Email plugin Debugging?",
),
]
email_imap_opts = [
cfg.StrOpt(
"imap_login",
help="Login username/email for IMAP server",
),
cfg.StrOpt(
"imap_password",
secret=True,
help="Login password for IMAP server",
),
cfg.HostnameOpt(
"imap_host",
help="Hostname/IP of the IMAP server",
),
cfg.PortOpt(
"imap_port",
default=993,
help="Port to use for IMAP server",
),
cfg.BoolOpt(
"imap_use_ssl",
default=True,
help="Use SSL for connection to IMAP Server",
),
]
email_smtp_opts = [
cfg.StrOpt(
"smtp_login",
help="Login username/email for SMTP server",
),
cfg.StrOpt(
"smtp_password",
secret=True,
help="Login password for SMTP server",
),
cfg.HostnameOpt(
"smtp_host",
help="Hostname/IP of the SMTP server",
),
cfg.PortOpt(
"smtp_port",
default=465,
help="Port to use for SMTP server",
),
cfg.BoolOpt(
"smtp_use_ssl",
default=True,
help="Use SSL for connection to SMTP Server",
),
]
email_shortcuts_opts = [
cfg.ListOpt(
"email_shortcuts",
help="List of email shortcuts for checking/sending email "
"For Exmaple: wb=walt@walt.com,cl=cl@cl.com\n"
"Means use 'wb' to send an email to walt@walt.com",
),
]
ALL_OPTS = (
email_opts
+ email_imap_opts
+ email_smtp_opts
+ email_shortcuts_opts
)
def register_opts(config):
config.register_group(email_group)
config.register_opts(ALL_OPTS, group=email_group)
def list_opts():
return {
email_group.name: ALL_OPTS,
}
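
As a quick check of what lands in the [email_plugin] group, a small sketch (not part of the diff):

from aprsd.conf import plugin_email

# ALL_OPTS concatenates the base, IMAP, SMTP and shortcut options above.
print([opt.name for opt in plugin_email.ALL_OPTS])
# ['callsign', 'enabled', 'debug', 'imap_login', 'imap_password', 'imap_host',
#  'imap_port', 'imap_use_ssl', 'smtp_login', 'smtp_password', 'smtp_host',
#  'smtp_port', 'smtp_use_ssl', 'email_shortcuts']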

@@ -1,443 +0,0 @@
import datetime
import email
from email.mime.text import MIMEText
import imaplib
import logging
import re
import smtplib
import time
from aprsd import messaging, threads
import imapclient
from validate_email import validate_email
LOG = logging.getLogger("APRSD")
# This gets forced set from main.py prior to being used internally
CONFIG = None
def _imap_connect():
imap_port = CONFIG["imap"].get("port", 143)
use_ssl = CONFIG["imap"].get("use_ssl", False)
host = CONFIG["imap"]["host"]
msg = "{}{}:{}".format("TLS " if use_ssl else "", host, imap_port)
# LOG.debug("Connect to IMAP host {} with user '{}'".
# format(msg, CONFIG['imap']['login']))
try:
server = imapclient.IMAPClient(
CONFIG["imap"]["host"],
port=imap_port,
use_uid=True,
ssl=use_ssl,
)
except Exception:
LOG.error("Failed to connect IMAP server")
return
try:
server.login(CONFIG["imap"]["login"], CONFIG["imap"]["password"])
except (imaplib.IMAP4.error, Exception) as e:
msg = getattr(e, "message", repr(e))
LOG.error("Failed to login {}".format(msg))
return
server.select_folder("INBOX")
return server
def _smtp_connect():
host = CONFIG["smtp"]["host"]
smtp_port = CONFIG["smtp"]["port"]
use_ssl = CONFIG["smtp"].get("use_ssl", False)
msg = "{}{}:{}".format("SSL " if use_ssl else "", host, smtp_port)
LOG.debug(
"Connect to SMTP host {} with user '{}'".format(msg, CONFIG["imap"]["login"]),
)
try:
if use_ssl:
server = smtplib.SMTP_SSL(host=host, port=smtp_port)
else:
server = smtplib.SMTP(host=host, port=smtp_port)
except Exception:
LOG.error("Couldn't connect to SMTP Server")
return
LOG.debug("Connected to smtp host {}".format(msg))
try:
server.login(CONFIG["smtp"]["login"], CONFIG["smtp"]["password"])
except Exception:
LOG.error("Couldn't connect to SMTP Server")
return
LOG.debug("Logged into SMTP server {}".format(msg))
return server
def validate_shortcuts(config):
shortcuts = config.get("shortcuts", None)
if not shortcuts:
return
LOG.info(
"Validating {} Email shortcuts. This can take up to 10 seconds"
" per shortcut".format(len(shortcuts)),
)
delete_keys = []
for key in shortcuts:
LOG.info("Validating {}:{}".format(key, shortcuts[key]))
is_valid = validate_email(
email_address=shortcuts[key],
check_regex=True,
check_mx=False,
from_address=config["smtp"]["login"],
helo_host=config["smtp"]["host"],
smtp_timeout=10,
dns_timeout=10,
use_blacklist=True,
debug=False,
)
if not is_valid:
LOG.error(
"'{}' is an invalid email address. Removing shortcut".format(
shortcuts[key],
),
)
delete_keys.append(key)
for key in delete_keys:
del config["shortcuts"][key]
LOG.info("Available shortcuts: {}".format(config["shortcuts"]))
def get_email_from_shortcut(addr):
if CONFIG.get("shortcuts", False):
return CONFIG["shortcuts"].get(addr, addr)
else:
return addr
def validate_email_config(config, disable_validation=False):
"""function to simply ensure we can connect to email services.
This helps with failing early during startup.
"""
LOG.info("Checking IMAP configuration")
imap_server = _imap_connect()
LOG.info("Checking SMTP configuration")
smtp_server = _smtp_connect()
# Now validate and flag any shortcuts as invalid
if not disable_validation:
validate_shortcuts(config)
else:
LOG.info("Shortcuts email validation is Disabled!!, you were warned.")
if imap_server and smtp_server:
return True
else:
return False
def parse_email(msgid, data, server):
envelope = data[b"ENVELOPE"]
# email address match
# use raw string to avoid invalid escape sequence errors r"string here"
f = re.search(r"([\.\w_-]+@[\.\w_-]+)", str(envelope.from_[0]))
if f is not None:
from_addr = f.group(1)
else:
from_addr = "noaddr"
LOG.debug("Got a message from '{}'".format(from_addr))
m = server.fetch([msgid], ["RFC822"])
msg = email.message_from_string(m[msgid][b"RFC822"].decode(errors="ignore"))
if msg.is_multipart():
text = ""
html = None
# default in case body somehow isn't set below - happened once
body = b"* unreadable msg received"
# this uses the last text or html part in the email, phone companies often put content in an attachment
for part in msg.get_payload():
if part.get_content_charset() is None:
# or BREAK when we hit a text or html?
# We cannot know the character set,
# so return decoded "something"
LOG.debug("Email got unknown content type")
text = part.get_payload(decode=True)
continue
charset = part.get_content_charset()
if part.get_content_type() == "text/plain":
LOG.debug("Email got text/plain")
text = str(
part.get_payload(decode=True),
str(charset),
"ignore",
).encode("utf8", "replace")
if part.get_content_type() == "text/html":
LOG.debug("Email got text/html")
html = str(
part.get_payload(decode=True),
str(charset),
"ignore",
).encode("utf8", "replace")
if text is not None:
# strip removes white space fore and aft of string
body = text.strip()
else:
body = html.strip()
else: # message is not multipart
# email.uscc.net sends no charset, blows up unicode function below
LOG.debug("Email is not multipart")
if msg.get_content_charset() is None:
text = str(msg.get_payload(decode=True), "US-ASCII", "ignore").encode(
"utf8",
"replace",
)
else:
text = str(
msg.get_payload(decode=True),
msg.get_content_charset(),
"ignore",
).encode("utf8", "replace")
body = text.strip()
# FIXED: UnicodeDecodeError: 'ascii' codec can't decode byte 0xf0 in position 6: ordinal not in range(128)
# decode with errors='ignore'. be sure to encode it before we return it below, also with errors='ignore'
try:
body = body.decode(errors="ignore")
except Exception as e:
LOG.error("Unicode decode failure: " + str(e))
LOG.error("Unidoce decode failed: " + str(body))
body = "Unreadable unicode msg"
# strip all html tags
body = re.sub("<[^<]+?>", "", body)
# strip CR/LF, make it one line, .rstrip fails at this
body = body.replace("\n", " ").replace("\r", " ")
# ascii might be out of range, so encode it, removing any error characters
body = body.encode(errors="ignore")
return (body, from_addr)
# end parse_email
def send_email(to_addr, content):
global check_email_delay
shortcuts = CONFIG["shortcuts"]
email_address = get_email_from_shortcut(to_addr)
LOG.info("Sending Email_________________")
if to_addr in shortcuts:
LOG.info("To : " + to_addr)
to_addr = email_address
LOG.info(" (" + to_addr + ")")
subject = CONFIG["ham"]["callsign"]
# content = content + "\n\n(NOTE: reply with one line)"
LOG.info("Subject : " + subject)
LOG.info("Body : " + content)
# check email more often since there's activity right now
check_email_delay = 60
msg = MIMEText(content)
msg["Subject"] = subject
msg["From"] = CONFIG["smtp"]["login"]
msg["To"] = to_addr
server = _smtp_connect()
if server:
try:
server.sendmail(CONFIG["smtp"]["login"], [to_addr], msg.as_string())
except Exception as e:
msg = getattr(e, "message", repr(e))
LOG.error("Sendmail Error!!!! '{}'", msg)
server.quit()
return -1
server.quit()
return 0
# end send_email
def resend_email(count, fromcall):
global check_email_delay
date = datetime.datetime.now()
month = date.strftime("%B")[:3] # Nov, Mar, Apr
day = date.day
year = date.year
today = "{}-{}-{}".format(day, month, year)
shortcuts = CONFIG["shortcuts"]
# swap key/value
shortcuts_inverted = {v: k for k, v in shortcuts.items()}
try:
server = _imap_connect()
except Exception as e:
LOG.exception("Failed to Connect to IMAP. Cannot resend email ", e)
return
messages = server.search(["SINCE", today])
# LOG.debug("%d messages received today" % len(messages))
msgexists = False
messages.sort(reverse=True)
del messages[int(count) :] # only the latest "count" messages
for message in messages:
for msgid, data in list(server.fetch(message, ["ENVELOPE"]).items()):
# one at a time, otherwise order is random
(body, from_addr) = parse_email(msgid, data, server)
# unset seen flag, will stay bold in email client
server.remove_flags(msgid, [imapclient.SEEN])
if from_addr in shortcuts_inverted:
# reverse lookup of a shortcut
from_addr = shortcuts_inverted[from_addr]
# asterisk indicates a resend
reply = "-" + from_addr + " * " + body.decode(errors="ignore")
# messaging.send_message(fromcall, reply)
msg = messaging.TextMessage(CONFIG["aprs"]["login"], fromcall, reply)
msg.send()
msgexists = True
if msgexists is not True:
stm = time.localtime()
h = stm.tm_hour
m = stm.tm_min
s = stm.tm_sec
# append time as a kind of serial number to prevent FT1XDR from
# thinking this is a duplicate message.
# The FT1XDR pretty much ignores the aprs message number in this
# regard. The FTM400 gets it right.
reply = "No new msg {}:{}:{}".format(
str(h).zfill(2),
str(m).zfill(2),
str(s).zfill(2),
)
# messaging.send_message(fromcall, reply)
msg = messaging.TextMessage(CONFIG["aprs"]["login"], fromcall, reply)
msg.send()
# check email more often since we're resending one now
check_email_delay = 60
server.logout()
# end resend_email()
class APRSDEmailThread(threads.APRSDThread):
def __init__(self, msg_queues, config):
super().__init__("EmailThread")
self.msg_queues = msg_queues
self.config = config
def run(self):
global check_email_delay
LOG.debug("Starting")
check_email_delay = 60
past = datetime.datetime.now()
while not self.thread_stop:
time.sleep(5)
# always sleep for 5 seconds and see if we need to check email
# This allows CTRL-C to stop the execution of this loop sooner
# than check_email_delay time
now = datetime.datetime.now()
if now - past > datetime.timedelta(seconds=check_email_delay):
# It's time to check email
# slowly increase delay every iteration, max out at 300 seconds
# any send/receive/resend activity will reset this to 60 seconds
if check_email_delay < 300:
check_email_delay += 1
LOG.debug("check_email_delay is " + str(check_email_delay) + " seconds")
shortcuts = CONFIG["shortcuts"]
# swap key/value
shortcuts_inverted = {v: k for k, v in shortcuts.items()}
date = datetime.datetime.now()
month = date.strftime("%B")[:3] # Nov, Mar, Apr
day = date.day
year = date.year
today = "{}-{}-{}".format(day, month, year)
server = None
try:
server = _imap_connect()
except Exception as e:
LOG.exception("Failed to get IMAP server Can't check email.", e)
if not server:
continue
messages = server.search(["SINCE", today])
LOG.debug("{} messages received today".format(len(messages)))
for msgid, data in server.fetch(messages, ["ENVELOPE"]).items():
envelope = data[b"ENVELOPE"]
# LOG.debug('ID:%d "%s" (%s)' % (msgid, envelope.subject.decode(), envelope.date))
f = re.search(
r"'([[A-a][0-9]_-]+@[[A-a][0-9]_-\.]+)",
str(envelope.from_[0]),
)
if f is not None:
from_addr = f.group(1)
else:
from_addr = "noaddr"
# LOG.debug("Message flags/tags: " + str(server.get_flags(msgid)[msgid]))
# if "APRS" not in server.get_flags(msgid)[msgid]:
# in python3, imap tags are unicode. in py2 they're strings. so .decode them to handle both
taglist = [
x.decode(errors="ignore")
for x in server.get_flags(msgid)[msgid]
]
if "APRS" not in taglist:
# if msg not flagged as sent via aprs
server.fetch([msgid], ["RFC822"])
(body, from_addr) = parse_email(msgid, data, server)
# unset seen flag, will stay bold in email client
server.remove_flags(msgid, [imapclient.SEEN])
if from_addr in shortcuts_inverted:
# reverse lookup of a shortcut
from_addr = shortcuts_inverted[from_addr]
reply = "-" + from_addr + " " + body.decode(errors="ignore")
msg = messaging.TextMessage(
self.config["aprs"]["login"],
self.config["ham"]["callsign"],
reply,
)
self.msg_queues["tx"].put(msg)
# flag message as sent via aprs
server.add_flags(msgid, ["APRS"])
# unset seen flag, will stay bold in email client
server.remove_flags(msgid, [imapclient.SEEN])
# check email more often since we just received an email
check_email_delay = 60
# reset clock
past = datetime.datetime.now()
server.logout()
else:
# We haven't hit the email delay yet.
# LOG.debug("Delta({}) < {}".format(now - past, check_email_delay))
pass
# Remove ourselves from the global threads list
threads.APRSDThreadList().remove(self)
LOG.info("Exiting")
# end check_email()

aprsd/exception.py (new file)
@@ -0,0 +1,13 @@
class MissingConfigOptionException(Exception):
"""Missing a config option."""
def __init__(self, config_option):
self.message = f"Option '{config_option}' was not in config file"
class ConfigOptionBogusDefaultException(Exception):
"""Missing a config option."""
def __init__(self, config_option, default_fail):
self.message = (
f"Config file option '{config_option}' needs to be "
f"changed from provided default of '{default_fail}'"
)
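
A hypothetical usage sketch for these exception classes (the check_admin_password helper is invented for illustration and is not part of aprsd):

from aprsd import exception

def check_admin_password(password):
    # Refuse to run with the shipped default admin password.
    if password == "password":
        raise exception.ConfigOptionBogusDefaultException(
            "admin.password", "password",
        )

try:
    check_admin_password("password")
except exception.ConfigOptionBogusDefaultException as e:
    # The classes set .message themselves rather than passing it to
    # Exception.__init__, so read .message instead of str(e).
    print(e.message)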

@@ -1,83 +0,0 @@
import argparse
import logging
from logging.handlers import RotatingFileHandler
import socketserver
import sys
import time
from aprsd import utils
# command line args
parser = argparse.ArgumentParser()
parser.add_argument(
"--loglevel",
default="DEBUG",
choices=["CRITICAL", "ERROR", "WARNING", "INFO", "DEBUG"],
help="The log level to use for aprsd.log",
)
parser.add_argument("--quiet", action="store_true", help="Don't log to stdout")
parser.add_argument("--port", default=9099, type=int, help="The port to listen on .")
parser.add_argument("--ip", default="127.0.0.1", help="The IP to listen on ")
CONFIG = None
LOG = logging.getLogger("ARPSSERVER")
# Setup the logging faciility
# to disable logging to stdout, but still log to file
# use the --quiet option on the cmdln
def setup_logging(args):
global LOG
levels = {
"CRITICAL": logging.CRITICAL,
"ERROR": logging.ERROR,
"WARNING": logging.WARNING,
"INFO": logging.INFO,
"DEBUG": logging.DEBUG,
}
log_level = levels[args.loglevel]
LOG.setLevel(log_level)
log_format = "%(asctime)s [%(threadName)-12.12s] [%(levelname)-5.5s]" " %(message)s"
date_format = "%m/%d/%Y %I:%M:%S %p"
log_formatter = logging.Formatter(fmt=log_format, datefmt=date_format)
fh = RotatingFileHandler("aprs-server.log", maxBytes=(10248576 * 5), backupCount=4)
fh.setFormatter(log_formatter)
LOG.addHandler(fh)
if not args.quiet:
sh = logging.StreamHandler(sys.stdout)
sh.setFormatter(log_formatter)
LOG.addHandler(sh)
class MyAPRSTCPHandler(socketserver.BaseRequestHandler):
def handle(self):
# self.request is the TCP socket connected to the client
self.data = self.request.recv(1024).strip()
LOG.debug("{} wrote:".format(self.client_address[0]))
LOG.debug(self.data)
# just send back the same data, but upper-cased
self.request.sendall(self.data.upper())
def main():
global CONFIG
args = parser.parse_args()
setup_logging(args)
LOG.info("Test APRS server starting.")
time.sleep(1)
CONFIG = utils.parse_config(args)
ip = CONFIG["aprs"]["host"]
port = CONFIG["aprs"]["port"]
LOG.info("Start server listening on {}:{}".format(args.ip, args.port))
with socketserver.TCPServer((ip, port), MyAPRSTCPHandler) as server:
server.serve_forever()
if __name__ == "__main__":
main()

aprsd/log/__init__.py (new, empty file)

aprsd/log/log.py (new file)
@@ -0,0 +1,138 @@
import logging
from logging.handlers import QueueHandler
import queue
import sys
from loguru import logger
from oslo_config import cfg
from aprsd.conf import log as conf_log
CONF = cfg.CONF
# LOG = logging.getLogger("APRSD")
LOG = logger
class QueueLatest(queue.Queue):
"""Custom Queue to keep only the latest N items.
This prevents the queue from blowing up in size.
"""
def put(self, *args, **kwargs):
try:
super().put(*args, **kwargs)
except queue.Full:
self.queue.popleft()
super().put(*args, **kwargs)
logging_queue = QueueLatest(maxsize=200)
class InterceptHandler(logging.Handler):
def emit(self, record):
# get corresponding Loguru level if it exists
try:
level = logger.level(record.levelname).name
except ValueError:
level = record.levelno
# find caller from where originated the logged message
frame, depth = sys._getframe(6), 6
while frame and frame.f_code.co_filename == logging.__file__:
frame = frame.f_back
depth += 1
logger.opt(depth=depth, exception=record.exc_info).log(level, record.getMessage())
# Set up the log facility.
# To disable logging to stdout but still log to a file,
# use the --quiet option on the command line.
def setup_logging(loglevel=None, quiet=False):
if not loglevel:
log_level = CONF.logging.log_level
else:
log_level = conf_log.LOG_LEVELS[loglevel]
# intercept everything at the root logger
logging.root.handlers = [InterceptHandler()]
logging.root.setLevel(log_level)
imap_list = [
"imapclient.imaplib", "imaplib", "imapclient",
"imapclient.util",
]
aprslib_list = [
"aprslib",
"aprslib.parsing",
"aprslib.exceptions",
]
webserver_list = [
"werkzeug",
"werkzeug._internal",
"socketio",
"urllib3.connectionpool",
"chardet",
"chardet.charsetgroupprober",
"chardet.eucjpprober",
"chardet.mbcharsetprober",
]
# We don't really want to see the aprslib parsing debug output.
disable_list = imap_list + aprslib_list + webserver_list
# remove every other logger's handlers
# and propagate to root logger
for name in logging.root.manager.loggerDict.keys():
logging.getLogger(name).handlers = []
if name in disable_list:
logging.getLogger(name).propagate = False
else:
logging.getLogger(name).propagate = True
if CONF.webchat.disable_url_request_logging:
for name in webserver_list:
logging.getLogger(name).handlers = []
logging.getLogger(name).propagate = True
logging.getLogger(name).setLevel(logging.ERROR)
handlers = [
{
"sink": sys.stdout,
"serialize": False,
"format": CONF.logging.logformat,
"colorize": True,
"level": log_level,
},
]
if CONF.logging.logfile:
handlers.append(
{
"sink": CONF.logging.logfile,
"serialize": False,
"format": CONF.logging.logformat,
"colorize": False,
"level": log_level,
},
)
if CONF.email_plugin.enabled and CONF.email_plugin.debug:
for name in imap_list:
logging.getLogger(name).propagate = True
if CONF.admin.web_enabled:
qh = QueueHandler(logging_queue)
handlers.append(
{
"sink": qh, "serialize": False,
"format": CONF.logging.logformat,
"level": log_level,
"colorize": False,
},
)
# configure loguru
logger.configure(handlers=handlers)
logger.level("DEBUG", color="<fg #BABABA>")

@@ -20,455 +20,143 @@
#
# python included libs
import datetime
import importlib.metadata as imp
from importlib.metadata import version as metadata_version
import logging
from logging import NullHandler
from logging.handlers import RotatingFileHandler
import os
import queue
import signal
import sys
import threading
import time
import click
from oslo_config import cfg, generator
# local imports here
import aprsd
from aprsd import client, email, messaging, plugin, threads, utils
import aprslib
from aprslib.exceptions import LoginError
import click
import click_completion
import yaml
from aprsd import cli_helper, packets, threads, utils
from aprsd.stats import collector
# setup the global logger
# logging.basicConfig(level=logging.DEBUG) # level=10
# log.basicConfig(level=log.DEBUG) # level=10
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
LOG_LEVELS = {
"CRITICAL": logging.CRITICAL,
"ERROR": logging.ERROR,
"WARNING": logging.WARNING,
"INFO": logging.INFO,
"DEBUG": logging.DEBUG,
}
CONTEXT_SETTINGS = dict(help_option_names=["-h", "--help"])
server_event = threading.Event()
# localization, please edit:
# HOST = "noam.aprs2.net" # north america tier2 servers round robin
# USER = "KM6XXX-9" # callsign of this aprs client with SSID
# PASS = "99999" # google how to generate this
# BASECALLSIGN = "KM6XXX" # callsign of radio in the field to send email
# shortcuts = {
# "aa" : "5551239999@vtext.com",
# "cl" : "craiglamparter@somedomain.org",
# "wb" : "5553909472@vtext.com"
# }
flask_enabled = False
def custom_startswith(string, incomplete):
"""A custom completion match that supports case insensitive matching."""
if os.environ.get("_CLICK_COMPLETION_COMMAND_CASE_INSENSITIVE_COMPLETE"):
string = string.lower()
incomplete = incomplete.lower()
return string.startswith(incomplete)
click_completion.core.startswith = custom_startswith
click_completion.init()
cmd_help = """Shell completion for click-completion-command
Available shell types:
\b
%s
Default type: auto
""" % "\n ".join(
"{:<12} {}".format(k, click_completion.core.shells[k])
for k in sorted(click_completion.core.shells.keys())
)
@click.group(help=cmd_help, context_settings=CONTEXT_SETTINGS)
@click.group(cls=cli_helper.AliasedGroup, context_settings=CONTEXT_SETTINGS)
@click.version_option()
def main():
@click.pass_context
def cli(ctx):
pass
@main.command()
@click.option(
"-i",
"--case-insensitive/--no-case-insensitive",
help="Case insensitive completion",
)
@click.argument(
"shell",
required=False,
type=click_completion.DocumentedChoice(click_completion.core.shells),
)
def show(shell, case_insensitive):
"""Show the click-completion-command completion code"""
extra_env = (
{"_CLICK_COMPLETION_COMMAND_CASE_INSENSITIVE_COMPLETE": "ON"}
if case_insensitive
else {}
def load_commands():
from .cmds import ( # noqa
completion, dev, fetch_stats, healthcheck, list_plugins, listen,
send_message, server, webchat,
)
click.echo(click_completion.core.get_code(shell, extra_env=extra_env))
@main.command()
@click.option(
"--append/--overwrite",
help="Append the completion code to the file",
default=None,
)
@click.option(
"-i",
"--case-insensitive/--no-case-insensitive",
help="Case insensitive completion",
)
@click.argument(
"shell",
required=False,
type=click_completion.DocumentedChoice(click_completion.core.shells),
)
@click.argument("path", required=False)
def install(append, case_insensitive, shell, path):
"""Install the click-completion-command completion"""
extra_env = (
{"_CLICK_COMPLETION_COMMAND_CASE_INSENSITIVE_COMPLETE": "ON"}
if case_insensitive
else {}
)
shell, path = click_completion.core.install(
shell=shell,
path=path,
append=append,
extra_env=extra_env,
)
click.echo("{} completion installed in {}".format(shell, path))
def main():
# First import all the possible commands for the CLI
# The commands themselves live in the cmds directory
load_commands()
utils.load_entry_points("aprsd.extension")
cli(auto_envvar_prefix="APRSD")
def signal_handler(sig, frame):
global server_vent
global flask_enabled
LOG.info(
"Ctrl+C, Sending all threads exit! Can take up to 10 seconds to exit all threads",
)
click.echo("signal_handler: called")
threads.APRSDThreadList().stop_all()
server_event.set()
time.sleep(1)
signal.signal(signal.SIGTERM, sys.exit(0))
if "subprocess" not in str(frame):
LOG.info(
"Ctrl+C, Sending all threads exit! Can take up to 10 seconds {}".format(
datetime.datetime.now(),
),
)
time.sleep(1.5)
packets.PacketTrack().save()
packets.WatchList().save()
packets.SeenList().save()
packets.PacketList().save()
LOG.info(collector.Collector().collect())
# signal.signal(signal.SIGTERM, sys.exit(0))
# sys.exit(0)
if flask_enabled:
signal.signal(signal.SIGTERM, sys.exit(0))
# end signal_handler
# Setup the logging faciility
# to disable logging to stdout, but still log to file
# use the --quiet option on the cmdln
def setup_logging(config, loglevel, quiet):
log_level = LOG_LEVELS[loglevel]
LOG.setLevel(log_level)
log_format = "[%(asctime)s] [%(threadName)-12s] [%(levelname)-5.5s]" " %(message)s"
date_format = "%m/%d/%Y %I:%M:%S %p"
log_formatter = logging.Formatter(fmt=log_format, datefmt=date_format)
log_file = config["aprs"].get("logfile", None)
if log_file:
fh = RotatingFileHandler(log_file, maxBytes=(10248576 * 5), backupCount=4)
@cli.command()
@cli_helper.add_options(cli_helper.common_options)
@click.pass_context
@cli_helper.process_standard_options_no_config
def check_version(ctx):
"""Check this version against the latest in pypi.org."""
level, msg = utils._check_version()
if level:
click.secho(msg, fg="yellow")
else:
fh = NullHandler()
fh.setFormatter(log_formatter)
LOG.addHandler(fh)
if not quiet:
sh = logging.StreamHandler(sys.stdout)
sh.setFormatter(log_formatter)
LOG.addHandler(sh)
click.secho(msg, fg="green")
@main.command()
def sample_config():
"""This dumps the config to stdout."""
click.echo(utils.add_config_comments(yaml.dump(utils.DEFAULT_CONFIG_DICT)))
@cli.command()
@click.pass_context
def sample_config(ctx):
"""Generate a sample Config file from aprsd and all installed plugins."""
@main.command()
@click.option(
"--loglevel",
default="DEBUG",
show_default=True,
type=click.Choice(
["CRITICAL", "ERROR", "WARNING", "INFO", "DEBUG"],
case_sensitive=False,
),
show_choices=True,
help="The log level to use for aprsd.log",
)
@click.option("--quiet", is_flag=True, default=False, help="Don't log to stdout")
@click.option(
"-c",
"--config",
"config_file",
show_default=True,
default=utils.DEFAULT_CONFIG_FILE,
help="The aprsd config file to use for options.",
)
@click.option(
"--aprs-login",
envvar="APRS_LOGIN",
show_envvar=True,
help="What callsign to send the message from.",
)
@click.option(
"--aprs-password",
envvar="APRS_PASSWORD",
show_envvar=True,
help="the APRS-IS password for APRS_LOGIN",
)
@click.option(
"--no-ack",
"-n",
is_flag=True,
show_default=True,
default=False,
help="Don't wait for an ack, just sent it to APRS-IS and bail.",
)
@click.option("--raw", default=None, help="Send a raw message. Implies --no-ack")
@click.argument("tocallsign", required=False)
@click.argument("command", nargs=-1, required=False)
def send_message(
loglevel,
quiet,
config_file,
aprs_login,
aprs_password,
no_ack,
raw,
tocallsign,
command,
):
"""Send a message to a callsign via APRS_IS."""
global got_ack, got_response
config = utils.parse_config(config_file)
if not aprs_login:
click.echo("Must set --aprs_login or APRS_LOGIN")
return
if not aprs_password:
click.echo("Must set --aprs-password or APRS_PASSWORD")
return
config["aprs"]["login"] = aprs_login
config["aprs"]["password"] = aprs_password
messaging.CONFIG = config
setup_logging(config, loglevel, quiet)
LOG.info("APRSD Started version: {}".format(aprsd.__version__))
if type(command) is tuple:
command = " ".join(command)
if not quiet:
if raw:
LOG.info("L'{}' R'{}'".format(aprs_login, raw))
def _get_selected_entry_points():
import sys
if sys.version_info < (3, 10):
all = imp.entry_points()
selected = []
if "oslo.config.opts" in all:
for x in all["oslo.config.opts"]:
if x.group == "oslo.config.opts":
selected.append(x)
else:
LOG.info("L'{}' To'{}' C'{}'".format(aprs_login, tocallsign, command))
selected = imp.entry_points(group="oslo.config.opts")
got_ack = False
got_response = False
return selected
def rx_packet(packet):
global got_ack, got_response
# LOG.debug("Got packet back {}".format(packet))
resp = packet.get("response", None)
if resp == "ack":
ack_num = packet.get("msgNo")
LOG.info("We got ack for our sent message {}".format(ack_num))
messaging.log_packet(packet)
got_ack = True
else:
message = packet.get("message_text", None)
fromcall = packet["from"]
msg_number = packet.get("msgNo", "0")
messaging.log_message(
"Received Message",
packet["raw"],
message,
fromcall=fromcall,
ack=msg_number,
)
got_response = True
# Send the ack back?
ack = messaging.AckMessage(
config["aprs"]["login"],
fromcall,
msg_id=msg_number,
)
ack.send_direct()
def get_namespaces():
args = []
if got_ack and got_response:
sys.exit(0)
# selected = imp.entry_points(group="oslo.config.opts")
selected = _get_selected_entry_points()
for entry in selected:
if "aprsd" in entry.name:
args.append("--namespace")
args.append(entry.name)
return args
args = get_namespaces()
config_version = metadata_version("oslo.config")
logging.basicConfig(level=logging.WARN)
conf = cfg.ConfigOpts()
generator.register_cli_opts(conf)
try:
cl = client.Client(config)
cl.setup_connection()
except LoginError:
sys.exit(-1)
# Send a message
# then we setup a consumer to rx messages
# We should get an ack back as well as a new message
# we should bail after we get the ack and send an ack back for the
# message
if raw:
msg = messaging.RawMessage(raw)
msg.send_direct()
sys.exit(0)
else:
msg = messaging.TextMessage(aprs_login, tocallsign, command)
msg.send_direct()
if no_ack:
sys.exit(0)
try:
# This will register a packet consumer with aprslib
# When new packets come in the consumer will process
# the packet
aprs_client = client.get_client()
aprs_client.consumer(rx_packet, raw=False)
except aprslib.exceptions.ConnectionDrop:
LOG.error("Connection dropped, reconnecting")
time.sleep(5)
# Force the deletion of the client object connected to aprs
# This will cause a reconnect, next time client.get_client()
# is called
cl.reset()
conf(args, version=config_version)
except cfg.RequiredOptError:
conf.print_help()
if not sys.argv[1:]:
raise SystemExit
raise
generator.generate(conf)
return
# main() ###
@main.command()
@click.option(
"--loglevel",
default="INFO",
show_default=True,
type=click.Choice(
["CRITICAL", "ERROR", "WARNING", "INFO", "DEBUG"],
case_sensitive=False,
),
show_choices=True,
help="The log level to use for aprsd.log",
)
@click.option("--quiet", is_flag=True, default=False, help="Don't log to stdout")
@click.option(
"--disable-validation",
is_flag=True,
default=False,
help="Disable email shortcut validation. Bad email addresses can result in broken email responses!!",
)
@click.option(
"-c",
"--config",
"config_file",
show_default=True,
default=utils.DEFAULT_CONFIG_FILE,
help="The aprsd config file to use for options.",
)
@click.option(
"-f",
"--flush",
"flush",
is_flag=True,
show_default=True,
default=False,
help="Flush out all old aged messages on disk.",
)
@click.option(
"--stats-server",
is_flag=True,
default=False,
help="Run a stats web server on port 5001?",
)
def server(
loglevel,
quiet,
disable_validation,
config_file,
flush,
stats_server,
):
"""Start the aprsd server process."""
global event
event = threading.Event()
signal.signal(signal.SIGINT, signal_handler)
if not quiet:
click.echo("Load config")
config = utils.parse_config(config_file)
# Force setting the config to the modules that need it
# TODO(Walt): convert these modules to classes that can
# Accept the config as a constructor param, instead of this
# hacky global setting
email.CONFIG = config
setup_logging(config, loglevel, quiet)
LOG.info("APRSD Started version: {}".format(aprsd.__version__))
# TODO(walt): Make email processing/checking optional?
# Maybe someone only wants this to process messages with plugins only.
valid = email.validate_email_config(config, disable_validation)
if not valid:
LOG.error("Failed to validate email config options")
sys.exit(-1)
# Create the initial PM singleton and Register plugins
plugin_manager = plugin.PluginManager(config)
plugin_manager.setup_plugins()
try:
cl = client.Client(config)
cl.client
except LoginError:
sys.exit(-1)
# Now load the msgTrack from disk if any
if flush:
LOG.debug("Deleting saved MsgTrack.")
messaging.MsgTrack().flush()
else:
# Try and load saved MsgTrack list
LOG.debug("Loading saved MsgTrack object.")
messaging.MsgTrack().load()
rx_msg_queue = queue.Queue(maxsize=20)
tx_msg_queue = queue.Queue(maxsize=20)
msg_queues = {"rx": rx_msg_queue, "tx": tx_msg_queue}
rx_thread = threads.APRSDRXThread(msg_queues=msg_queues, config=config)
tx_thread = threads.APRSDTXThread(msg_queues=msg_queues, config=config)
email_thread = email.APRSDEmailThread(msg_queues=msg_queues, config=config)
email_thread.start()
rx_thread.start()
tx_thread.start()
messaging.MsgTrack().restart()
cntr = 0
while not server_event.is_set():
# to keep the log noise down
if cntr % 12 == 0:
tracker = messaging.MsgTrack()
LOG.debug("KeepAlive Tracker({}): {}".format(len(tracker), str(tracker)))
cntr += 1
time.sleep(10)
# If there are items in the msgTracker, then save them
tracker = messaging.MsgTrack()
tracker.save()
LOG.info("APRSD Exiting.")
@cli.command()
@click.pass_context
def version(ctx):
"""Show the APRSD version."""
click.echo(click.style("APRSD Version : ", fg="white"), nl=False)
click.secho(f"{aprsd.__version__}", fg="yellow", bold=True)
if __name__ == "__main__":

@@ -1,566 +1,4 @@
import abc
import datetime
import logging
from multiprocessing import RawValue
import os
import pathlib
import pickle
import re
import threading
import time
from aprsd import client, threads, utils
LOG = logging.getLogger("APRSD")
# What to return from a plugin if we have processed the message
# and it's ok, but don't send a usage string back
NULL_MESSAGE = -1
class MsgTrack:
"""Class to keep track of outstanding text messages.
This is a thread safe class that keeps track of active
messages.
When a message is asked to be sent, it is placed into this
class via it's id. The TextMessage class's send() method
automatically adds itself to this class. When the ack is
received from the radio, the message object is removed from
this class.
# TODO(hemna)
When aprsd is asked to quit this class should be serialized and
saved to disk/db to keep track of the state of outstanding messages.
When aprsd is started, it should try and fetch the saved state,
and reloaded to a live state.
"""
_instance = None
_start_time = None
lock = None
track = {}
total_messages_tracked = 0
def __new__(cls, *args, **kwargs):
if cls._instance is None:
cls._instance = super().__new__(cls)
cls._instance.track = {}
cls._start_time = datetime.datetime.now()
cls._instance.lock = threading.Lock()
return cls._instance
def add(self, msg):
with self.lock:
key = int(msg.id)
self.track[key] = msg
self.total_messages_tracked += 1
def get(self, id):
with self.lock:
if id in self.track:
return self.track[id]
def remove(self, id):
with self.lock:
key = int(id)
if key in self.track.keys():
del self.track[key]
def __len__(self):
with self.lock:
return len(self.track)
def __str__(self):
with self.lock:
result = "{"
for key in self.track.keys():
result += "{}: {}, ".format(key, str(self.track[key]))
result += "}"
return result
def save(self):
"""Save this shit to disk?"""
if len(self) > 0:
LOG.info("Saving {} tracking messages to disk".format(len(self)))
pickle.dump(self.dump(), open(utils.DEFAULT_SAVE_FILE, "wb+"))
else:
self.flush()
def dump(self):
dump = {}
with self.lock:
for key in self.track.keys():
dump[key] = self.track[key]
return dump
def load(self):
if os.path.exists(utils.DEFAULT_SAVE_FILE):
raw = pickle.load(open(utils.DEFAULT_SAVE_FILE, "rb"))
if raw:
self.track = raw
LOG.debug("Loaded MsgTrack dict from disk.")
LOG.debug(self)
def restart(self):
"""Walk the list of messages and restart them if any."""
for key in self.track.keys():
msg = self.track[key]
if msg.last_send_attempt < msg.retry_count:
msg.send()
def _resend(self, msg):
msg.last_send_attempt = 0
msg.send()
def restart_delayed(self, count=None, most_recent=True):
"""Walk the list of delayed messages and restart them if any."""
if not count:
# Send all the delayed messages
for key in self.track.keys():
msg = self.track[key]
if msg.last_send_attempt == msg.retry_count:
self._resend(msg)
else:
# They want to resend <count> delayed messages
tmp = sorted(
self.track.items(),
reverse=most_recent,
key=lambda x: x[1].last_send_time,
)
msg_list = tmp[:count]
for (_key, msg) in msg_list:
self._resend(msg)
def flush(self):
"""Nuke the old pickle file that stored the old results from last aprsd run."""
if os.path.exists(utils.DEFAULT_SAVE_FILE):
pathlib.Path(utils.DEFAULT_SAVE_FILE).unlink()
with self.lock:
self.track = {}
class MessageCounter:
"""
Global message id counter class.
This is a singleton based class that keeps
an incrementing counter for all messages to
be sent. All new Message objects gets a new
message id, which is the next number available
from the MessageCounter.
"""
_instance = None
max_count = 9999
def __new__(cls, *args, **kwargs):
"""Make this a singleton class."""
if cls._instance is None:
cls._instance = super().__new__(cls)
cls._instance.val = RawValue("i", 1)
cls._instance.lock = threading.Lock()
return cls._instance
def increment(self):
with self.lock:
if self.val.value == self.max_count:
self.val.value = 1
else:
self.val.value += 1
@property
def value(self):
with self.lock:
return self.val.value
def __repr__(self):
with self.lock:
return str(self.val.value)
def __str__(self):
with self.lock:
return str(self.val.value)
class Message(metaclass=abc.ABCMeta):
"""Base Message Class."""
# The message id to send over the air
id = 0
retry_count = 3
last_send_time = None
last_send_attempt = 0
def __init__(self, fromcall, tocall, msg_id=None):
self.fromcall = fromcall
self.tocall = tocall
if not msg_id:
c = MessageCounter()
c.increment()
msg_id = c.value
self.id = msg_id
@abc.abstractmethod
def send(self):
"""Child class must declare."""
pass
class RawMessage(Message):
"""Send a raw message.
This class is used for custom messages that contain the entire
contents of an APRS message in the message field.
"""
message = None
def __init__(self, message):
super().__init__(None, None, msg_id=None)
self.message = message
def __repr__(self):
return self.message
def __str__(self):
return self.message
def send(self):
tracker = MsgTrack()
tracker.add(self)
thread = SendMessageThread(message=self)
thread.start()
def send_direct(self):
"""Send a message without a separate thread."""
cl = client.get_client()
log_message(
"Sending Message Direct",
repr(self).rstrip("\n"),
self.message,
tocall=self.tocall,
fromcall=self.fromcall,
)
cl.sendall(repr(self))
class TextMessage(Message):
"""Send regular ARPS text/command messages/replies."""
message = None
def __init__(self, fromcall, tocall, message, msg_id=None, allow_delay=True):
super().__init__(fromcall, tocall, msg_id)
self.message = message
# do we try and save this message for later if we don't get
# an ack? Some messages we don't want to do this ever.
self.allow_delay = allow_delay
def __repr__(self):
"""Build raw string to send over the air."""
return "{}>APRS::{}:{}{{{}\n".format(
self.fromcall,
self.tocall.ljust(9),
self._filter_for_send(),
str(self.id),
)
def __str__(self):
delta = "Never"
if self.last_send_time:
now = datetime.datetime.now()
delta = now - self.last_send_time
return "{}>{} Msg({})({}): '{}'".format(
self.fromcall,
self.tocall,
self.id,
delta,
self.message,
)
def _filter_for_send(self):
"""Filter and format message string for FCC."""
# max? ftm400 displays 64, raw msg shows 74
# and ftm400-send is max 64. setting this to
# 67 displays 64 on the ftm400. (+3 {01 suffix)
# feature req: break long ones into two msgs
message = self.message[:67]
# We all miss George Carlin
return re.sub("fuck|shit|cunt|piss|cock|bitch", "****", message)
def send(self):
tracker = MsgTrack()
tracker.add(self)
LOG.debug("Length of MsgTrack is {}".format(len(tracker)))
thread = SendMessageThread(message=self)
thread.start()
def send_direct(self):
"""Send a message without a separate thread."""
cl = client.get_client()
log_message(
"Sending Message Direct",
repr(self).rstrip("\n"),
self.message,
tocall=self.tocall,
fromcall=self.fromcall,
)
cl.sendall(repr(self))
class SendMessageThread(threads.APRSDThread):
def __init__(self, message):
self.msg = message
name = self.msg.message[:5]
super().__init__("SendMessage-{}-{}".format(self.msg.id, name))
def loop(self):
"""Loop until a message is acked or it gets delayed.
We only sleep for 5 seconds between each loop run, so
that CTRL-C can exit the app in a short period. Each sleep
means the app quitting is blocked until sleep is done.
So we keep track of the last send attempt and only send if the
last send attempt is old enough.
"""
cl = client.get_client()
tracker = MsgTrack()
# lets see if the message is still in the tracking queue
msg = tracker.get(self.msg.id)
if not msg:
# The message has been removed from the tracking queue
# So it got acked and we are done.
LOG.info("Message Send Complete via Ack.")
return False
else:
send_now = False
if msg.last_send_attempt == msg.retry_count:
# we reached the send limit, don't send again
# TODO(hemna) - Need to put this in a delayed queue?
LOG.info("Message Send Complete. Max attempts reached.")
return False
# Message is still outstanding and needs to be acked.
if msg.last_send_time:
# Message has a last send time tracking
now = datetime.datetime.now()
sleeptime = (msg.last_send_attempt + 1) * 31
delta = now - msg.last_send_time
if delta > datetime.timedelta(seconds=sleeptime):
# It's time to try to send it again
send_now = True
else:
send_now = True
if send_now:
# no attempt time, so lets send it, and start
# tracking the time.
log_message(
"Sending Message",
repr(msg).rstrip("\n"),
msg.message,
tocall=self.msg.tocall,
retry_number=msg.last_send_attempt,
msg_num=msg.id,
)
cl.sendall(repr(msg))
msg.last_send_time = datetime.datetime.now()
msg.last_send_attempt += 1
time.sleep(5)
# Make sure we get called again.
return True
class AckMessage(Message):
"""Class for building Acks and sending them."""
def __init__(self, fromcall, tocall, msg_id):
super().__init__(fromcall, tocall, msg_id=msg_id)
def __repr__(self):
return "{}>APRS::{}:ack{}\n".format(
self.fromcall,
self.tocall.ljust(9),
self.id,
)
def __str__(self):
return "From({}) TO({}) Ack ({})".format(self.fromcall, self.tocall, self.id)
def send_thread(self):
"""Separate thread to send acks with retries."""
cl = client.get_client()
for i in range(self.retry_count, 0, -1):
log_message(
"Sending ack",
repr(self).rstrip("\n"),
None,
ack=self.id,
tocall=self.tocall,
retry_number=i,
)
cl.sendall(repr(self))
# aprs duplicate detection is 30 secs?
# (21 only sends first, 28 skips middle)
time.sleep(31)
# end_send_ack_thread
def send(self):
LOG.debug("Send ACK({}:{}) to radio.".format(self.tocall, self.id))
thread = SendAckThread(self)
thread.start()
# end send_ack()
def send_direct(self):
"""Send an ack message without a separate thread."""
cl = client.get_client()
log_message(
"Sending ack",
repr(self).rstrip("\n"),
None,
ack=self.id,
tocall=self.tocall,
fromcall=self.fromcall,
)
cl.sendall(repr(self))
class SendAckThread(threads.APRSDThread):
def __init__(self, ack):
self.ack = ack
super().__init__("SendAck-{}".format(self.ack.id))
def loop(self):
"""Separate thread to send acks with retries."""
send_now = False
if self.ack.last_send_attempt == self.ack.retry_count:
# we reached the send limit, don't send again
# TODO(hemna) - Need to put this in a delayed queue?
LOG.info("Ack Send Complete. Max attempts reached.")
return False
if self.ack.last_send_time:
# Message has a last send time tracking
now = datetime.datetime.now()
# aprs duplicate detection is 30 secs?
# (21 only sends first, 28 skips middle)
sleeptime = 31
delta = now - self.ack.last_send_time
if delta > datetime.timedelta(seconds=sleeptime):
# It's time to try to send it again
send_now = True
else:
LOG.debug("Still wating. {}".format(delta))
else:
send_now = True
if send_now:
cl = client.get_client()
log_message(
"Sending ack",
repr(self.ack).rstrip("\n"),
None,
ack=self.ack.id,
tocall=self.ack.tocall,
retry_number=self.ack.last_send_attempt,
)
cl.sendall(repr(self.ack))
self.ack.last_send_attempt += 1
self.ack.last_send_time = datetime.datetime.now()
time.sleep(5)
def log_packet(packet):
fromcall = packet.get("from", None)
tocall = packet.get("to", None)
response_type = packet.get("response", None)
msg = packet.get("message_text", None)
msg_num = packet.get("msgNo", None)
ack = packet.get("ack", None)
log_message(
"Packet",
packet["raw"],
msg,
fromcall=fromcall,
tocall=tocall,
ack=ack,
packet_type=response_type,
msg_num=msg_num,
)
def log_message(
header,
raw,
message,
tocall=None,
fromcall=None,
msg_num=None,
retry_number=None,
ack=None,
packet_type=None,
uuid=None,
):
"""
Log a message entry.
This builds a single long string with newlines for the log entry, so
that the entry stays together in the log. If we logged each item with a
separate log.debug() call, the message information could get
interleaved with other log messages, since each Python log call is only
synchronized on its own.
"""
log_list = [""]
if retry_number:
# LOG.info(" {} _______________(TX:{})".format(header, retry_number))
log_list.append(" {} _______________(TX:{})".format(header, retry_number))
else:
# LOG.info(" {} _______________".format(header))
log_list.append(" {} _______________".format(header))
# LOG.info(" Raw : {}".format(raw))
log_list.append(" Raw : {}".format(raw))
if packet_type:
# LOG.info(" Packet : {}".format(packet_type))
log_list.append(" Packet : {}".format(packet_type))
if tocall:
# LOG.info(" To : {}".format(tocall))
log_list.append(" To : {}".format(tocall))
if fromcall:
# LOG.info(" From : {}".format(fromcall))
log_list.append(" From : {}".format(fromcall))
if ack:
# LOG.info(" Ack : {}".format(ack))
log_list.append(" Ack : {}".format(ack))
else:
# LOG.info(" Message : {}".format(message))
log_list.append(" Message : {}".format(message))
if msg_num:
# LOG.info(" Msg number : {}".format(msg_num))
log_list.append(" Msg number : {}".format(msg_num))
if uuid:
log_list.append(" UUID : {}".format(uuid))
# LOG.info(" {} _______________ Complete".format(header))
log_list.append(" {} _______________ Complete".format(header))
LOG.info("\n".join(log_list))
# REMOVE THIS FILE
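
The loop() above retries with a simple linear backoff: each attempt waits (attempt + 1) * 31 seconds before the message is sent again. The standalone sketch below restates that check; the helper name and example values are illustrative, not part of aprsd's API.
import datetime


def should_resend(last_send_time, last_send_attempt, retry_count):
    """Return True when a tracked message is due for another send attempt."""
    if last_send_attempt >= retry_count:
        # Hit the retry limit; the caller should give up.
        return False
    if not last_send_time:
        # Never sent before, so send right away.
        return True
    # Wait a little longer after every attempt: 31s, 62s, 93s, ...
    wait = datetime.timedelta(seconds=(last_send_attempt + 1) * 31)
    return datetime.datetime.now() - last_send_time > wait


# Second attempt, last sent 70 seconds ago: the 62s backoff has expired.
print(should_resend(
    datetime.datetime.now() - datetime.timedelta(seconds=70), 1, 3,
))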

12
aprsd/packets/__init__.py Normal file

@ -0,0 +1,12 @@
from aprsd.packets.core import ( # noqa: F401
AckPacket, BeaconPacket, BulletinPacket, GPSPacket, MessagePacket,
MicEPacket, ObjectPacket, Packet, RejectPacket, StatusPacket,
ThirdPartyPacket, UnknownPacket, WeatherPacket, factory,
)
from aprsd.packets.packet_list import PacketList # noqa: F401
from aprsd.packets.seen_list import SeenList # noqa: F401
from aprsd.packets.tracker import PacketTrack # noqa: F401
from aprsd.packets.watch_list import WatchList # noqa: F401
NULL_MESSAGE = -1
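
As a quick illustration of what these package-level re-exports give a consumer, here is a hedged sketch of building an outbound message; the callsigns are placeholders and the exact raw string depends on the msgNo the counter assigns.
from aprsd import packets

pkt = packets.MessagePacket(
    from_call="KFAKE-1", to_call="KFAKE-9", message_text="hello",
)
pkt.prepare()   # assigns a msgNo and builds pkt.payload / pkt.raw
print(pkt.raw)  # e.g. KFAKE-1>APZ100::KFAKE-9  :hello{1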


@ -0,0 +1,56 @@
import logging
from typing import Callable, Protocol, runtime_checkable
from aprsd.packets import core
from aprsd.utils import singleton
LOG = logging.getLogger("APRSD")
@runtime_checkable
class PacketMonitor(Protocol):
"""Protocol for Monitoring packets in some way."""
def rx(self, packet: type[core.Packet]) -> None:
"""When we get a packet from the network."""
...
def tx(self, packet: type[core.Packet]) -> None:
"""When we send a packet out the network."""
...
@singleton
class PacketCollector:
def __init__(self):
self.monitors: list[Callable] = []
def register(self, monitor: Callable) -> None:
self.monitors.append(monitor)
def unregister(self, monitor: Callable) -> None:
self.monitors.remove(monitor)
def rx(self, packet: type[core.Packet]) -> None:
for name in self.monitors:
cls = name()
if isinstance(cls, PacketMonitor):
try:
cls.rx(packet)
except Exception as e:
LOG.error(f"Error in monitor {name} (rx): {e}")
else:
raise TypeError(f"Monitor {name} is not a PacketMonitor")
def tx(self, packet: type[core.Packet]) -> None:
for name in self.monitors:
cls = name()
if isinstance(cls, PacketMonitor):
try:
cls.tx(packet)
except Exception as e:
LOG.error(f"Error in monitor {name} (tx): {e}")
else:
raise TypeError(f"Monitor {name} is not a PacketMonitor")
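
A minimal sketch of plugging a custom monitor into the collector above. MyMonitor is hypothetical; note that register() is handed the class itself, and the collector instantiates it on every rx()/tx() call and checks it structurally against the PacketMonitor Protocol.
import logging

from aprsd.packets import collector, core

LOG = logging.getLogger("APRSD")


class MyMonitor:
    """Satisfies the PacketMonitor Protocol structurally."""

    def rx(self, packet: type[core.Packet]) -> None:
        LOG.info(f"monitor saw a packet from {packet.from_call}")

    def tx(self, packet: type[core.Packet]) -> None:
        LOG.info(f"monitor saw a packet sent to {packet.to_call}")


collector.PacketCollector().register(MyMonitor)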

823
aprsd/packets/core.py Normal file

@ -0,0 +1,823 @@
from dataclasses import dataclass, field
from datetime import datetime
import logging
import re
import time
# Due to a failure in python 3.8
from typing import Any, List, Optional, Type, TypeVar, Union
from aprslib import util as aprslib_util
from dataclasses_json import (
CatchAll, DataClassJsonMixin, Undefined, dataclass_json,
)
from loguru import logger
from aprsd.utils import counter
# For mypy to be happy
A = TypeVar("A", bound="DataClassJsonMixin")
Json = Union[dict, list, str, int, float, bool, None]
LOG = logging.getLogger()
LOGU = logger
PACKET_TYPE_BULLETIN = "bulletin"
PACKET_TYPE_MESSAGE = "message"
PACKET_TYPE_ACK = "ack"
PACKET_TYPE_REJECT = "reject"
PACKET_TYPE_MICE = "mic-e"
PACKET_TYPE_WX = "wx"
PACKET_TYPE_WEATHER = "weather"
PACKET_TYPE_OBJECT = "object"
PACKET_TYPE_UNKNOWN = "unknown"
PACKET_TYPE_STATUS = "status"
PACKET_TYPE_BEACON = "beacon"
PACKET_TYPE_THIRDPARTY = "thirdparty"
PACKET_TYPE_TELEMETRY = "telemetry-message"
PACKET_TYPE_UNCOMPRESSED = "uncompressed"
NO_DATE = datetime(1900, 10, 24)
def _init_timestamp():
"""Build a unix style timestamp integer"""
return int(round(time.time()))
def _init_send_time():
# We have to use a datetime here, or the json encoder
# fails on a NoneType.
return NO_DATE
def _init_msgNo(): # noqa: N802
"""For some reason __post__init doesn't get called.
So in order to initialize the msgNo field in the packet
we use this workaround.
"""
c = counter.PacketCounter()
c.increment()
return c.value
def _translate_fields(raw: dict) -> dict:
translate_fields = {
"from": "from_call",
"to": "to_call",
}
# First translate some fields
for key in translate_fields:
if key in raw:
raw[translate_fields[key]] = raw[key]
del raw[key]
# addresse overrides to_call
if "addresse" in raw:
raw["to_call"] = raw["addresse"]
return raw
@dataclass_json
@dataclass(unsafe_hash=True)
class Packet:
_type: str = field(default="Packet", hash=False)
from_call: Optional[str] = field(default=None)
to_call: Optional[str] = field(default=None)
addresse: Optional[str] = field(default=None)
format: Optional[str] = field(default=None)
msgNo: Optional[str] = field(default=None) # noqa: N815
ackMsgNo: Optional[str] = field(default=None) # noqa: N815
packet_type: Optional[str] = field(default=None)
timestamp: float = field(default_factory=_init_timestamp, compare=False, hash=False)
# Holds the raw text string to be sent over the wire
# or holds the raw string from input packet
raw: Optional[str] = field(default=None, compare=False, hash=False)
raw_dict: dict = field(repr=False, default_factory=lambda: {}, compare=False, hash=False)
# Built by calling prepare(). raw needs this built first.
payload: Optional[str] = field(default=None)
# Fields related to sending packets out
send_count: int = field(repr=False, default=0, compare=False, hash=False)
retry_count: int = field(repr=False, default=3, compare=False, hash=False)
last_send_time: float = field(repr=False, default=0, compare=False, hash=False)
# Do we allow this packet to be saved to send later?
allow_delay: bool = field(repr=False, default=True, compare=False, hash=False)
path: List[str] = field(default_factory=list, compare=False, hash=False)
via: Optional[str] = field(default=None, compare=False, hash=False)
def get(self, key: str, default: Optional[str] = None):
"""Emulate a getter on a dict."""
if hasattr(self, key):
return getattr(self, key)
else:
return default
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:{self.addresse}:{self.msgNo}"
def update_timestamp(self) -> None:
self.timestamp = _init_timestamp()
@property
def human_info(self) -> str:
"""Build a human readable string for this packet.
This doesn't include the from, to, and type; just
the human readable payload.
"""
self.prepare()
msg = self._filter_for_send(self.raw).rstrip("\n")
return msg
def prepare(self) -> None:
"""Do stuff here that is needed prior to sending over the air."""
# now build the raw message for sending
if not self.msgNo:
self.msgNo = _init_msgNo()
self._build_payload()
self._build_raw()
def _build_payload(self) -> None:
"""The payload is the non headers portion of the packet."""
if not self.to_call:
raise ValueError("to_call isn't set. Must set to_call before calling prepare()")
# The base packet class has no real payload
self.payload = (
f":{self.to_call.ljust(9)}"
)
def _build_raw(self) -> None:
"""Build the self.raw which is what is sent over the air."""
self.raw = "{}>APZ100:{}".format(
self.from_call,
self.payload,
)
def _filter_for_send(self, msg) -> str:
"""Filter and format message string for FCC."""
# max? ftm400 displays 64, raw msg shows 74
# and ftm400-send is max 64. setting this to
# 67 displays 64 on the ftm400. (+3 {01 suffix)
# feature req: break long ones into two msgs
if not msg:
return ""
message = msg[:67]
# We all miss George Carlin
return re.sub(
"fuck|shit|cunt|piss|cock|bitch", "****",
message, flags=re.IGNORECASE,
)
def __str__(self) -> str:
"""Show the raw version of the packet"""
self.prepare()
if not self.raw:
raise ValueError("self.raw is unset")
return self.raw
def __repr__(self) -> str:
"""Build the repr version of the packet."""
repr = (
f"{self.__class__.__name__}:"
f" From: {self.from_call} "
f" To: {self.to_call}"
)
return repr
@dataclass_json
@dataclass(unsafe_hash=True)
class AckPacket(Packet):
_type: str = field(default="AckPacket", hash=False)
def _build_payload(self):
self.payload = f":{self.to_call: <9}:ack{self.msgNo}"
@dataclass_json
@dataclass(unsafe_hash=True)
class BulletinPacket(Packet):
_type: str = "BulletinPacket"
# Holds the encapsulated packet
bid: Optional[str] = field(default="1")
message_text: Optional[str] = field(default=None)
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:BLN{self.bid}"
@property
def human_info(self) -> str:
return f"BLN{self.bid} {self.message_text}"
def _build_payload(self) -> None:
self.payload = (
f":BLN{self.bid:<9}"
f":{self.message_text}"
)
@dataclass_json
@dataclass(unsafe_hash=True)
class RejectPacket(Packet):
_type: str = field(default="RejectPacket", hash=False)
response: Optional[str] = field(default=None)
def __post__init__(self):
if self.response:
LOG.warning("Response set!")
def _build_payload(self):
self.payload = f":{self.to_call: <9}:rej{self.msgNo}"
@dataclass_json
@dataclass(unsafe_hash=True)
class MessagePacket(Packet):
_type: str = field(default="MessagePacket", hash=False)
message_text: Optional[str] = field(default=None)
@property
def human_info(self) -> str:
self.prepare()
return self._filter_for_send(self.message_text).rstrip("\n")
def _build_payload(self):
self.payload = ":{}:{}{{{}".format(
self.to_call.ljust(9),
self._filter_for_send(self.message_text).rstrip("\n"),
str(self.msgNo),
)
@dataclass_json
@dataclass(unsafe_hash=True)
class StatusPacket(Packet):
_type: str = field(default="StatusPacket", hash=False)
status: Optional[str] = field(default=None)
messagecapable: bool = field(default=False)
comment: Optional[str] = field(default=None)
raw_timestamp: Optional[str] = field(default=None)
def _build_payload(self):
self.payload = ":{}:{}{{{}".format(
self.to_call.ljust(9),
self._filter_for_send(self.status).rstrip("\n"),
str(self.msgNo),
)
@property
def human_info(self) -> str:
self.prepare()
return self.status
@dataclass_json
@dataclass(unsafe_hash=True)
class GPSPacket(Packet):
_type: str = field(default="GPSPacket", hash=False)
latitude: float = field(default=0.00)
longitude: float = field(default=0.00)
altitude: float = field(default=0.00)
rng: float = field(default=0.00)
posambiguity: int = field(default=0)
messagecapable: bool = field(default=False)
comment: Optional[str] = field(default=None)
symbol: str = field(default="l")
symbol_table: str = field(default="/")
raw_timestamp: Optional[str] = field(default=None)
object_name: Optional[str] = field(default=None)
object_format: Optional[str] = field(default=None)
alive: Optional[bool] = field(default=None)
course: Optional[int] = field(default=None)
speed: Optional[float] = field(default=None)
phg: Optional[str] = field(default=None)
phg_power: Optional[int] = field(default=None)
phg_height: Optional[float] = field(default=None)
phg_gain: Optional[int] = field(default=None)
phg_dir: Optional[str] = field(default=None)
phg_range: Optional[float] = field(default=None)
phg_rate: Optional[int] = field(default=None)
# http://www.aprs.org/datum.txt
daodatumbyte: Optional[str] = field(default=None)
def _build_time_zulu(self):
"""Build the timestamp in UTC/zulu."""
if self.timestamp:
return datetime.utcfromtimestamp(self.timestamp).strftime("%d%H%M")
def _build_payload(self):
"""The payload is the non headers portion of the packet."""
time_zulu = self._build_time_zulu()
lat = aprslib_util.latitude_to_ddm(self.latitude)
long = aprslib_util.longitude_to_ddm(self.longitude)
payload = [
"@" if self.timestamp else "!",
time_zulu,
lat,
self.symbol_table,
long,
self.symbol,
]
if self.comment:
payload.append(self._filter_for_send(self.comment))
self.payload = "".join(payload)
def _build_raw(self):
self.raw = (
f"{self.from_call}>{self.to_call},WIDE2-1:"
f"{self.payload}"
)
@property
def human_info(self) -> str:
h_str = []
h_str.append(f"Lat:{self.latitude:03.3f}")
h_str.append(f"Lon:{self.longitude:03.3f}")
if self.altitude:
h_str.append(f"Altitude {self.altitude:03.0f}")
if self.speed:
h_str.append(f"Speed {self.speed:03.0f}MPH")
if self.course:
h_str.append(f"Course {self.course:03.0f}")
if self.rng:
h_str.append(f"RNG {self.rng:03.0f}")
if self.phg:
h_str.append(f"PHG {self.phg}")
return " ".join(h_str)
@dataclass_json
@dataclass(unsafe_hash=True)
class BeaconPacket(GPSPacket):
_type: str = field(default="BeaconPacket", hash=False)
def _build_payload(self):
"""The payload is the non headers portion of the packet."""
time_zulu = self._build_time_zulu()
lat = aprslib_util.latitude_to_ddm(self.latitude)
lon = aprslib_util.longitude_to_ddm(self.longitude)
self.payload = (
f"@{time_zulu}z{lat}{self.symbol_table}"
f"{lon}"
)
if self.comment:
comment = self._filter_for_send(self.comment)
self.payload = f"{self.payload}{self.symbol}{comment}"
else:
self.payload = f"{self.payload}{self.symbol}APRSD Beacon"
def _build_raw(self):
self.raw = (
f"{self.from_call}>APZ100:"
f"{self.payload}"
)
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
if self.raw_timestamp:
return f"{self.from_call}:{self.raw_timestamp}"
else:
return f"{self.from_call}:{self.human_info.replace(' ','')}"
@property
def human_info(self) -> str:
h_str = []
h_str.append(f"Lat:{self.latitude:03.3f}")
h_str.append(f"Lon:{self.longitude:03.3f}")
h_str.append(f"{self.comment}")
return " ".join(h_str)
@dataclass_json
@dataclass(unsafe_hash=True)
class MicEPacket(GPSPacket):
_type: str = field(default="MicEPacket", hash=False)
messagecapable: bool = False
mbits: Optional[str] = None
mtype: Optional[str] = None
telemetry: Optional[dict] = field(default=None)
# in MPH
speed: float = 0.00
# 0 to 360
course: int = 0
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:{self.human_info.replace(' ', '')}"
@property
def human_info(self) -> str:
h_info = super().human_info
return f"{h_info} {self.mbits} mbits"
@dataclass_json
@dataclass(unsafe_hash=True)
class TelemetryPacket(GPSPacket):
_type: str = field(default="TelemetryPacket", hash=False)
messagecapable: bool = False
mbits: Optional[str] = None
mtype: Optional[str] = None
telemetry: Optional[dict] = field(default=None)
tPARM: Optional[list[str]] = field(default=None) # noqa: N815
tUNIT: Optional[list[str]] = field(default=None) # noqa: N815
# in MPH
speed: float = 0.00
# 0 to 360
course: int = 0
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
if self.raw_timestamp:
return f"{self.from_call}:{self.raw_timestamp}"
else:
return f"{self.from_call}:{self.human_info.replace(' ','')}"
@property
def human_info(self) -> str:
h_info = super().human_info
return f"{h_info} {self.telemetry}"
@dataclass_json
@dataclass(unsafe_hash=True)
class ObjectPacket(GPSPacket):
_type: str = field(default="ObjectPacket", hash=False)
alive: bool = True
raw_timestamp: Optional[str] = None
symbol: str = field(default="r")
# in MPH
speed: float = 0.00
# 0 to 360
course: int = 0
def _build_payload(self):
time_zulu = self._build_time_zulu()
lat = aprslib_util.latitude_to_ddm(self.latitude)
long = aprslib_util.longitude_to_ddm(self.longitude)
self.payload = (
f"*{time_zulu}z{lat}{self.symbol_table}"
f"{long}{self.symbol}"
)
if self.comment:
comment = self._filter_for_send(self.comment)
self.payload = f"{self.payload}{comment}"
def _build_raw(self):
"""
REPEAT builds packets like
reply = "{}>APZ100:;{:9s}*{}z{}r{:.3f}MHz {} {}".format(
fromcall, callsign, time_zulu, latlon, freq, uplink_tone, offset,
)
where fromcall is the callsign that is sending the packet
callsign is the station callsign for the object
The frequency, uplink_tone, offset is part of the comment
"""
self.raw = (
f"{self.from_call}>APZ100:;{self.to_call:9s}"
f"{self.payload}"
)
@property
def human_info(self) -> str:
h_info = super().human_info
return f"{h_info} {self.comment}"
@dataclass(unsafe_hash=True)
class WeatherPacket(GPSPacket, DataClassJsonMixin):
_type: str = field(default="WeatherPacket", hash=False)
symbol: str = "_"
wind_speed: float = 0.00
wind_direction: int = 0
wind_gust: float = 0.00
temperature: float = 0.00
# in inches. 1.04 means 1.04 inches
rain_1h: float = 0.00
rain_24h: float = 0.00
rain_since_midnight: float = 0.00
humidity: int = 0
pressure: float = 0.00
comment: Optional[str] = field(default=None)
luminosity: Optional[int] = field(default=None)
wx_raw_timestamp: Optional[str] = field(default=None)
course: Optional[int] = field(default=None)
speed: Optional[float] = field(default=None)
def _translate(self, raw: dict) -> dict:
for key in raw["weather"]:
raw[key] = raw["weather"][key]
# If we have the broken aprslib, then we need to
# convert the course and speed to wind_speed and wind_direction.
# aprslib issue #80
# https://github.com/rossengeorgiev/aprs-python/issues/80
# Wind speed and course are optional in the SPEC.
# For some reason aprslib multiplies the speed by 1.852.
if "wind_speed" not in raw and "wind_direction" not in raw:
# Most likely this is the broken aprslib
# So we need to convert the wind_gust speed
raw["wind_gust"] = round(raw.get("wind_gust", 0) / 0.44704, 3)
if "wind_speed" not in raw:
wind_speed = raw.get("speed")
if wind_speed:
raw["wind_speed"] = round(wind_speed / 1.852, 3)
raw["weather"]["wind_speed"] = raw["wind_speed"]
if "speed" in raw:
del raw["speed"]
# Let's adjust the rain numbers as well, since they're wrong too
raw["rain_1h"] = round((raw.get("rain_1h", 0) / .254) * .01, 3)
raw["weather"]["rain_1h"] = raw["rain_1h"]
raw["rain_24h"] = round((raw.get("rain_24h", 0) / .254) * .01, 3)
raw["weather"]["rain_24h"] = raw["rain_24h"]
raw["rain_since_midnight"] = round((raw.get("rain_since_midnight", 0) / .254) * .01, 3)
raw["weather"]["rain_since_midnight"] = raw["rain_since_midnight"]
if "wind_direction" not in raw:
wind_direction = raw.get("course")
if wind_direction:
raw["wind_direction"] = wind_direction
raw["weather"]["wind_direction"] = raw["wind_direction"]
if "course" in raw:
del raw["course"]
del raw["weather"]
return raw
@classmethod
def from_dict(cls: Type[A], kvs: Json, *, infer_missing=False) -> A:
"""Create from a dictionary that has come directly from aprslib parse"""
raw = cls._translate(cls, kvs) # type: ignore
return super().from_dict(raw)
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
if self.raw_timestamp:
return f"{self.from_call}:{self.raw_timestamp}"
elif self.wx_raw_timestamp:
return f"{self.from_call}:{self.wx_raw_timestamp}"
@property
def human_info(self) -> str:
h_str = []
h_str.append(f"Temp {self.temperature:03.0f}F")
h_str.append(f"Humidity {self.humidity}%")
h_str.append(f"Wind {self.wind_speed:03.0f}MPH@{self.wind_direction}")
h_str.append(f"Pressure {self.pressure}mb")
h_str.append(f"Rain {self.rain_24h}in/24hr")
return " ".join(h_str)
def _build_payload(self):
"""Build an uncompressed weather packet
Format =
_CSE/SPDgXXXtXXXrXXXpXXXPXXXhXXbXXXXX%type NEW FORMAT APRS793 June 97
NOT BACKWARD COMPATIBLE
Where: CSE/SPD is wind direction and sustained 1 minute speed
t is in degrees F
r is Rain per last 60 minutes
1.04 inches of rain will show as r104
p is precipitation per last 24 hours (sliding 24 hour window)
P is precip per last 24 hours since midnight
b is Baro in tenths of a mb
h is humidity in percent. 00=100
g is Gust (peak winds in last 5 minutes)
# is the raw rain counter for remote WX stations
See notes on remotes below
% shows software type d=Dos, m=Mac, w=Win, etc
type shows type of WX instrument
"""
time_zulu = self._build_time_zulu()
contents = [
f"@{time_zulu}z{self.latitude}{self.symbol_table}",
f"{self.longitude}{self.symbol}",
f"{self.wind_direction:03d}",
# Speed = sustained 1 minute wind speed in mph
f"{self.symbol_table}", f"{self.wind_speed:03.0f}",
# wind gust (peak wind speed in mph in the last 5 minutes)
f"g{self.wind_gust:03.0f}",
# Temperature in degrees F
f"t{self.temperature:03.0f}",
# Rainfall (in hundredths of an inch) in the last hour
f"r{self.rain_1h*100:03.0f}",
# Rainfall (in hundredths of an inch) in last 24 hours
f"p{self.rain_24h*100:03.0f}",
# Rainfall (in hundredths of an inch) since midnight
f"P{self.rain_since_midnight*100:03.0f}",
# Humidity
f"h{self.humidity:02d}",
# Barometric pressure (in tenths of millibars/tenths of hPascal)
f"b{self.pressure:05.0f}",
]
if self.comment:
comment = self._filter_for_send(self.comment)
contents.append(comment)
self.payload = "".join(contents)
def _build_raw(self):
self.raw = (
f"{self.from_call}>{self.to_call},WIDE1-1,WIDE2-1:"
f"{self.payload}"
)
@dataclass(unsafe_hash=True)
class ThirdPartyPacket(Packet, DataClassJsonMixin):
_type: str = "ThirdPartyPacket"
# Holds the encapsulated packet
subpacket: Optional[type[Packet]] = field(default=None, compare=True, hash=False)
def __repr__(self):
"""Build the repr version of the packet."""
repr_str = (
f"{self.__class__.__name__}:"
f" From: {self.from_call} "
f" To: {self.to_call} "
f" Subpacket: {repr(self.subpacket)}"
)
return repr_str
@classmethod
def from_dict(cls: Type[A], kvs: Json, *, infer_missing=False) -> A:
obj = super().from_dict(kvs)
obj.subpacket = factory(obj.subpacket) # type: ignore
return obj
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:{self.subpacket.key}"
@property
def human_info(self) -> str:
sub_info = self.subpacket.human_info
return f"{self.from_call}->{self.to_call} {sub_info}"
@dataclass_json(undefined=Undefined.INCLUDE)
@dataclass(unsafe_hash=True)
class UnknownPacket:
"""Catchall Packet for things we don't know about.
All of the unknown attributes are stored in the unknown_fields
"""
unknown_fields: CatchAll
_type: str = "UnknownPacket"
from_call: Optional[str] = field(default=None)
to_call: Optional[str] = field(default=None)
msgNo: str = field(default_factory=_init_msgNo) # noqa: N815
format: Optional[str] = field(default=None)
raw: Optional[str] = field(default=None)
raw_dict: dict = field(repr=False, default_factory=lambda: {}, compare=False, hash=False)
path: List[str] = field(default_factory=list, compare=False, hash=False)
packet_type: Optional[str] = field(default=None)
via: Optional[str] = field(default=None, compare=False, hash=False)
@property
def key(self) -> str:
"""Build a key for finding this packet in a dict."""
return f"{self.from_call}:{self.packet_type}:{self.to_call}"
@property
def human_info(self) -> str:
return str(self.unknown_fields)
TYPE_LOOKUP: dict[str, type[Packet]] = {
PACKET_TYPE_BULLETIN: BulletinPacket,
PACKET_TYPE_WX: WeatherPacket,
PACKET_TYPE_WEATHER: WeatherPacket,
PACKET_TYPE_MESSAGE: MessagePacket,
PACKET_TYPE_ACK: AckPacket,
PACKET_TYPE_REJECT: RejectPacket,
PACKET_TYPE_MICE: MicEPacket,
PACKET_TYPE_OBJECT: ObjectPacket,
PACKET_TYPE_STATUS: StatusPacket,
PACKET_TYPE_BEACON: BeaconPacket,
PACKET_TYPE_UNKNOWN: UnknownPacket,
PACKET_TYPE_THIRDPARTY: ThirdPartyPacket,
PACKET_TYPE_TELEMETRY: TelemetryPacket,
}
def get_packet_type(packet: dict) -> str:
"""Decode the packet type from the packet."""
pkt_format = packet.get("format")
msg_response = packet.get("response")
packet_type = PACKET_TYPE_UNKNOWN
if pkt_format == "message" and msg_response == "ack":
packet_type = PACKET_TYPE_ACK
elif pkt_format == "message" and msg_response == "rej":
packet_type = PACKET_TYPE_REJECT
elif pkt_format == "message":
packet_type = PACKET_TYPE_MESSAGE
elif pkt_format == "mic-e":
packet_type = PACKET_TYPE_MICE
elif pkt_format == "object":
packet_type = PACKET_TYPE_OBJECT
elif pkt_format == "status":
packet_type = PACKET_TYPE_STATUS
elif pkt_format == PACKET_TYPE_BULLETIN:
packet_type = PACKET_TYPE_BULLETIN
elif pkt_format == PACKET_TYPE_BEACON:
packet_type = PACKET_TYPE_BEACON
elif pkt_format == PACKET_TYPE_TELEMETRY:
packet_type = PACKET_TYPE_TELEMETRY
elif pkt_format == PACKET_TYPE_WX:
packet_type = PACKET_TYPE_WEATHER
elif pkt_format == PACKET_TYPE_UNCOMPRESSED:
if packet.get("symbol") == "_":
packet_type = PACKET_TYPE_WEATHER
elif pkt_format == PACKET_TYPE_THIRDPARTY:
packet_type = PACKET_TYPE_THIRDPARTY
if packet_type == PACKET_TYPE_UNKNOWN:
if "latitude" in packet:
packet_type = PACKET_TYPE_BEACON
else:
packet_type = PACKET_TYPE_UNKNOWN
return packet_type
def is_message_packet(packet: dict) -> bool:
return get_packet_type(packet) == PACKET_TYPE_MESSAGE
def is_ack_packet(packet: dict) -> bool:
return get_packet_type(packet) == PACKET_TYPE_ACK
def is_mice_packet(packet: dict[Any, Any]) -> bool:
return get_packet_type(packet) == PACKET_TYPE_MICE
def factory(raw_packet: dict[Any, Any]) -> type[Packet]:
"""Factory method to create a packet from a raw packet string."""
raw = raw_packet
if "_type" in raw:
cls = globals()[raw["_type"]]
return cls.from_dict(raw)
raw["raw_dict"] = raw.copy()
raw = _translate_fields(raw)
packet_type = get_packet_type(raw)
raw["packet_type"] = packet_type
packet_class = TYPE_LOOKUP[packet_type]
if packet_type == PACKET_TYPE_WX:
# the weather information is in a dict
# this brings those values out to the outer dict
packet_class = WeatherPacket
elif packet_type == PACKET_TYPE_OBJECT and "weather" in raw:
packet_class = WeatherPacket
elif packet_type == PACKET_TYPE_UNKNOWN:
# Try and figure it out here
if "latitude" in raw:
packet_class = GPSPacket
else:
# LOG.warning(raw)
packet_class = UnknownPacket
raw.get("addresse", raw.get("to_call"))
# TODO: Find a global way to enable/disable this
# LOGU.opt(colors=True).info(
# f"factory(<green>{packet_type: <8}</green>):"
# f"(<red>{packet_class.__name__: <13}</red>): "
# f"<light-blue>{raw.get('from_call'): <9}</light-blue> -> <cyan>{to: <9}</cyan>")
# LOG.info(raw.get('msgNo'))
return packet_class().from_dict(raw) # type: ignore
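
To make the factory flow above concrete, here is a rough sketch of feeding it an aprslib-style parsed dict; the field values are made up, and real dicts normally come from aprslib.parse().
from aprsd.packets import core

raw = {
    "from": "KFAKE-1",      # _translate_fields() renames this to from_call
    "addresse": "KFAKE-9",  # overrides to_call
    "format": "message",
    "message_text": "hello there",
    "msgNo": "42",
    "raw": "KFAKE-1>APZ100::KFAKE-9  :hello there{42",
}

pkt = core.factory(raw)
print(type(pkt).__name__)  # MessagePacket
print(pkt.key)             # KFAKE-1:KFAKE-9:42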

143
aprsd/packets/log.py Normal file

@ -0,0 +1,143 @@
import logging
from typing import Optional
from loguru import logger
from oslo_config import cfg
from aprsd.packets.core import AckPacket, RejectPacket
LOG = logging.getLogger()
LOGU = logger
CONF = cfg.CONF
FROM_COLOR = "fg #C70039"
TO_COLOR = "fg #D033FF"
TX_COLOR = "red"
RX_COLOR = "green"
PACKET_COLOR = "cyan"
def log_multiline(packet, tx: Optional[bool] = False, header: Optional[bool] = True) -> None:
"""LOG a packet to the logfile."""
if not CONF.enable_packet_logging:
return
if CONF.log_packet_format == "compact":
return
# asdict(packet)
logit = ["\n"]
name = packet.__class__.__name__
if isinstance(packet, AckPacket):
pkt_max_send_count = CONF.default_ack_send_count
else:
pkt_max_send_count = CONF.default_packet_send_count
if header:
if tx:
header_str = f"<{TX_COLOR}>TX</{TX_COLOR}>"
logit.append(
f"{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}> "
f"TX:{packet.send_count + 1} of {pkt_max_send_count}",
)
else:
header_str = f"<{RX_COLOR}>RX</{RX_COLOR}>"
logit.append(
f"{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)",
)
else:
header_str = ""
logit.append(f"__________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)")
# log_list.append(f" Packet : {packet.__class__.__name__}")
if packet.msgNo:
logit.append(f" Msg # : {packet.msgNo}")
if packet.from_call:
logit.append(f" From : <{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}>")
if packet.to_call:
logit.append(f" To : <{TO_COLOR}>{packet.to_call}</{TO_COLOR}>")
if hasattr(packet, "path") and packet.path:
logit.append(f" Path : {'=>'.join(packet.path)}")
if hasattr(packet, "via") and packet.via:
logit.append(f" VIA : {packet.via}")
if not isinstance(packet, AckPacket) and not isinstance(packet, RejectPacket):
msg = packet.human_info
if msg:
msg = msg.replace("<", "\\<")
logit.append(f" Info : <light-yellow><b>{msg}</b></light-yellow>")
if hasattr(packet, "comment") and packet.comment:
logit.append(f" Comment : {packet.comment}")
raw = packet.raw.replace("<", "\\<")
logit.append(f" Raw : <fg #828282>{raw}</fg #828282>")
logit.append(f"{header_str}________(<{PACKET_COLOR}>{name}</{PACKET_COLOR}>)")
LOGU.opt(colors=True).info("\n".join(logit))
LOG.debug(repr(packet))
def log(packet, tx: Optional[bool] = False, header: Optional[bool] = True) -> None:
if not CONF.enable_packet_logging:
return
if CONF.log_packet_format == "multiline":
log_multiline(packet, tx, header)
return
logit = []
name = packet.__class__.__name__
if isinstance(packet, AckPacket):
pkt_max_send_count = CONF.default_ack_send_count
else:
pkt_max_send_count = CONF.default_packet_send_count
if header:
if tx:
via_color = "red"
arrow = f"<{via_color}>-></{via_color}>"
logit.append(
f"<red>TX {arrow}</red> "
f"<cyan>{name}</cyan>"
f":{packet.msgNo}"
f" ({packet.send_count + 1} of {pkt_max_send_count})",
)
else:
via_color = "fg #828282"
arrow = f"<{via_color}>-></{via_color}>"
left_arrow = f"<{via_color}><-</{via_color}>"
logit.append(
f"<fg #1AA730>RX</fg #1AA730> {left_arrow} "
f"<cyan>{name}</cyan>"
f":{packet.msgNo}",
)
else:
via_color = "green"
arrow = f"<{via_color}>-></{via_color}>"
logit.append(
f"<cyan>{name}</cyan>"
f":{packet.msgNo}",
)
tmp = None
if packet.path:
tmp = f"{arrow}".join(packet.path) + f"{arrow} "
logit.append(
f"<{FROM_COLOR}>{packet.from_call}</{FROM_COLOR}> {arrow}"
f"{tmp if tmp else ' '}"
f"<{TO_COLOR}>{packet.to_call}</{TO_COLOR}>",
)
if not isinstance(packet, AckPacket) and not isinstance(packet, RejectPacket):
logit.append(":")
msg = packet.human_info
if msg:
msg = msg.replace("<", "\\<")
logit.append(f"<light-yellow><b>{msg}</b></light-yellow>")
LOGU.opt(colors=True).info(" ".join(logit))
log_multiline(packet, tx, header)
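
A small usage sketch for the log helpers above; it assumes the oslo config has already been loaded, since enable_packet_logging and log_packet_format decide whether and how anything is emitted.
from aprsd.packets import core
from aprsd.packets import log as packet_log

pkt = core.MessagePacket(
    from_call="KFAKE-1", to_call="KFAKE-9", message_text="hello",
)
pkt.prepare()                 # build pkt.raw / msgNo before logging
packet_log.log(pkt, tx=True)  # compact one-liner, or multiline if configured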


@ -0,0 +1,116 @@
from collections import OrderedDict
import logging
from oslo_config import cfg
from aprsd.packets import collector, core
from aprsd.utils import objectstore
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class PacketList(objectstore.ObjectStoreMixin):
"""Class to keep track of the packets we tx/rx."""
_instance = None
_total_rx: int = 0
_total_tx: int = 0
maxlen: int = 100
def __new__(cls, *args, **kwargs):
if cls._instance is None:
cls._instance = super().__new__(cls)
cls._instance.maxlen = CONF.packet_list_maxlen
cls._instance._init_data()
return cls._instance
def _init_data(self):
self.data = {
"types": {},
"packets": OrderedDict(),
}
def rx(self, packet: type[core.Packet]):
"""Add a packet that was received."""
with self.lock:
self._total_rx += 1
self._add(packet)
ptype = packet.__class__.__name__
if ptype not in self.data["types"]:
self.data["types"][ptype] = {"tx": 0, "rx": 0}
self.data["types"][ptype]["rx"] += 1
def tx(self, packet: type[core.Packet]):
"""Add a packet that was received."""
with self.lock:
self._total_tx += 1
self._add(packet)
ptype = packet.__class__.__name__
if ptype not in self.data["types"]:
self.data["types"][ptype] = {"tx": 0, "rx": 0}
self.data["types"][ptype]["tx"] += 1
def add(self, packet):
with self.lock:
self._add(packet)
def _add(self, packet):
if not self.data.get("packets"):
self._init_data()
if packet.key in self.data["packets"]:
self.data["packets"].move_to_end(packet.key)
elif len(self.data["packets"]) == self.maxlen:
self.data["packets"].popitem(last=False)
self.data["packets"][packet.key] = packet
def find(self, packet):
with self.lock:
return self.data["packets"][packet.key]
def __len__(self):
with self.lock:
return len(self.data["packets"])
def total_rx(self):
with self.lock:
return self._total_rx
def total_tx(self):
with self.lock:
return self._total_tx
def stats(self, serializable=False) -> dict:
# limit the number of packets to return to 50
with self.lock:
tmp = OrderedDict(
reversed(
list(
self.data.get("packets", OrderedDict()).items(),
),
),
)
pkts = []
count = 1
for packet in tmp:
pkts.append(tmp[packet])
count += 1
if count > CONF.packet_list_stats_maxlen:
break
stats = {
"total_tracked": self._total_rx + self._total_rx,
"rx": self._total_rx,
"tx": self._total_tx,
"types": self.data.get("types", []),
"packet_count": len(self.data.get("packets", [])),
"maxlen": self.maxlen,
"packets": pkts,
}
return stats
# Now register the PacketList with the collector
# every packet we RX and TX goes through the collector
# for processing for whatever reason is needed.
collector.PacketCollector().register(PacketList)
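
In normal operation the PacketCollector calls rx()/tx() on this list for every packet, so the snippet below is only a standalone sketch; it assumes the aprsd config has been loaded so packet_list_maxlen and the stats limits are available.
from aprsd.packets import core
from aprsd.packets.packet_list import PacketList

pl = PacketList()
pl.rx(core.MessagePacket(from_call="KFAKE-1", to_call="KFAKE-9", msgNo="1"))
print(pl.total_rx(), len(pl))  # running totals of received/stored packets
print(pl.stats()["types"])     # per-packet-type tx/rx counters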


@ -0,0 +1,54 @@
import datetime
import logging
from oslo_config import cfg
from aprsd.packets import collector, core
from aprsd.utils import objectstore
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class SeenList(objectstore.ObjectStoreMixin):
"""Global callsign seen list."""
_instance = None
data: dict = {}
def __new__(cls, *args, **kwargs):
if cls._instance is None:
cls._instance = super().__new__(cls)
cls._instance.data = {}
return cls._instance
def stats(self, serializable=False):
"""Return the stats for the PacketTrack class."""
with self.lock:
return self.data
def rx(self, packet: type[core.Packet]):
"""When we get a packet from the network, update the seen list."""
with self.lock:
callsign = None
if packet.from_call:
callsign = packet.from_call
else:
LOG.warning(f"Can't find FROM in packet {packet}")
return
if callsign not in self.data:
self.data[callsign] = {
"last": None,
"count": 0,
}
self.data[callsign]["last"] = datetime.datetime.now()
self.data[callsign]["count"] += 1
def tx(self, packet: type[core.Packet]):
"""We don't care about TX packets."""
# Register with the packet collector so we can process the packet
# when we get it off the client (network)
collector.PacketCollector().register(SeenList)
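
A quick peek at the seen list, assuming the config and object store are initialized and packets have already flowed through the collector.
from aprsd.packets.seen_list import SeenList

seen = SeenList()
for callsign, info in seen.stats().items():
    print(callsign, info["count"], info["last"])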

109
aprsd/packets/tracker.py Normal file

@ -0,0 +1,109 @@
import datetime
import logging
from oslo_config import cfg
from aprsd.packets import collector, core
from aprsd.utils import objectstore
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class PacketTrack(objectstore.ObjectStoreMixin):
"""Class to keep track of outstanding text messages.
This is a thread safe class that keeps track of active
messages.
When a message is asked to be sent, it is placed into this
class via its id. The TextMessage class's send() method
automatically adds itself to this class. When the ack is
received from the radio, the message object is removed from
this class.
"""
_instance = None
_start_time = None
data: dict = {}
total_tracked: int = 0
def __new__(cls, *args, **kwargs):
if cls._instance is None:
cls._instance = super().__new__(cls)
cls._instance._start_time = datetime.datetime.now()
cls._instance._init_store()
return cls._instance
def __getitem__(self, name):
with self.lock:
return self.data[name]
def __iter__(self):
with self.lock:
return iter(self.data)
def keys(self):
with self.lock:
return self.data.keys()
def items(self):
with self.lock:
return self.data.items()
def values(self):
with self.lock:
return self.data.values()
def stats(self, serializable=False):
with self.lock:
stats = {
"total_tracked": self.total_tracked,
}
pkts = {}
for key in self.data:
last_send_time = self.data[key].last_send_time
pkts[key] = {
"last_send_time": last_send_time,
"send_count": self.data[key].send_count,
"retry_count": self.data[key].retry_count,
"message": self.data[key].raw,
}
stats["packets"] = pkts
return stats
def rx(self, packet: type[core.Packet]) -> None:
"""When we get a packet from the network, check if we should remove it."""
if isinstance(packet, core.AckPacket):
self._remove(packet.msgNo)
elif isinstance(packet, core.RejectPacket):
self._remove(packet.msgNo)
elif hasattr(packet, "ackMsgNo"):
# Got a piggyback ack, so remove the original message
self._remove(packet.ackMsgNo)
def tx(self, packet: type[core.Packet]) -> None:
"""Add a packet that was sent."""
with self.lock:
key = packet.msgNo
packet.send_count = 0
self.data[key] = packet
self.total_tracked += 1
def remove(self, key):
self._remove(key)
def _remove(self, key):
with self.lock:
try:
del self.data[key]
except KeyError:
pass
# Now register the PacketList with the collector
# every packet we RX and TX goes through the collector
# for processing for whatever reason is needed.
collector.PacketCollector().register(PacketTrack)
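
A sketch of the track-then-ack flow described in the PacketTrack docstring; it assumes aprsd's config is loaded so the object store can initialize, and the callsigns are placeholders.
from aprsd.packets import core
from aprsd.packets.tracker import PacketTrack

tracker = PacketTrack()
msg = core.MessagePacket(
    from_call="KFAKE-1", to_call="KFAKE-9", message_text="hello", msgNo="7",
)
tracker.tx(msg)    # start tracking the outbound message
ack = core.AckPacket(from_call="KFAKE-9", to_call="KFAKE-1", msgNo="7")
tracker.rx(ack)    # a matching msgNo removes it from the tracker again
print("7" in tracker.keys())  # False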

122
aprsd/packets/watch_list.py Normal file

@ -0,0 +1,122 @@
import datetime
import logging
from oslo_config import cfg
from aprsd import utils
from aprsd.packets import collector, core
from aprsd.utils import objectstore
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class WatchList(objectstore.ObjectStoreMixin):
"""Global watch list and info for callsigns."""
_instance = None
data = {}
def __new__(cls, *args, **kwargs):
if cls._instance is None:
cls._instance = super().__new__(cls)
return cls._instance
def __init__(self):
super().__init__()
self._update_from_conf()
def _update_from_conf(self, config=None):
with self.lock:
if CONF.watch_list.enabled and CONF.watch_list.callsigns:
for callsign in CONF.watch_list.callsigns:
call = callsign.replace("*", "")
# FIXME(waboring) - we should fetch the last time we saw
# a beacon from a callsign or some other mechanism to find
# last time a message was seen by aprs-is. For now this
# is all we can do.
if call not in self.data:
self.data[call] = {
"last": None,
"packet": None,
}
def stats(self, serializable=False) -> dict:
stats = {}
with self.lock:
for callsign in self.data:
stats[callsign] = {
"last": self.data[callsign]["last"],
"packet": self.data[callsign]["packet"],
"age": self.age(callsign),
"old": self.is_old(callsign),
}
return stats
def is_enabled(self):
return CONF.watch_list.enabled
def callsign_in_watchlist(self, callsign):
with self.lock:
return callsign in self.data
def rx(self, packet: type[core.Packet]) -> None:
"""Track when we got a packet from the network."""
callsign = packet.from_call
if self.callsign_in_watchlist(callsign):
with self.lock:
self.data[callsign]["last"] = datetime.datetime.now()
self.data[callsign]["packet"] = packet
def tx(self, packet: type[core.Packet]) -> None:
"""We don't care about TX packets."""
def last_seen(self, callsign):
with self.lock:
if self.callsign_in_watchlist(callsign):
return self.data[callsign]["last"]
def age(self, callsign):
now = datetime.datetime.now()
last_seen_time = self.last_seen(callsign)
if last_seen_time:
return str(now - last_seen_time)
else:
return None
def max_delta(self, seconds=None):
if not seconds:
seconds = CONF.watch_list.alert_time_seconds
max_timeout = {"seconds": seconds}
return datetime.timedelta(**max_timeout)
def is_old(self, callsign, seconds=None):
"""Watch list callsign last seen is old compared to now?
This tests to see if the last time we saw a callsign packet,
if that is older than the allowed timeout in the config.
We put this here so any notification plugin can use this
same test.
"""
if not self.callsign_in_watchlist(callsign):
return False
age = self.age(callsign)
if age:
delta = utils.parse_delta_str(age)
d = datetime.timedelta(**delta)
max_delta = self.max_delta(seconds=seconds)
if d > max_delta:
return True
else:
return False
else:
return False
collector.PacketCollector().register(WatchList)
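
An illustrative check against the watch list; it assumes watch_list.enabled is True and that the placeholder callsign below is in CONF.watch_list.callsigns.
from aprsd.packets.watch_list import WatchList

wl = WatchList()
if wl.is_enabled() and wl.callsign_in_watchlist("KFAKE-1"):
    print("age :", wl.age("KFAKE-1"))
    print("old :", wl.is_old("KFAKE-1"))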


@ -1,46 +1,190 @@
# The base plugin class
from __future__ import annotations
import abc
import fnmatch
import importlib
import inspect
import logging
import os
import re
import textwrap
import threading
from oslo_config import cfg
import pluggy
from thesmuggler import smuggle
import aprsd
from aprsd import client, packets, threads
from aprsd.packets import watch_list
# setup the global logger
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
hookspec = pluggy.HookspecMarker("aprsd")
hookimpl = pluggy.HookimplMarker("aprsd")
CORE_PLUGINS = [
CORE_MESSAGE_PLUGINS = [
"aprsd.plugins.email.EmailPlugin",
"aprsd.plugins.fortune.FortunePlugin",
"aprsd.plugins.location.LocationPlugin",
"aprsd.plugins.ping.PingPlugin",
"aprsd.plugins.query.QueryPlugin",
"aprsd.plugins.time.TimePlugin",
"aprsd.plugins.weather.WeatherPlugin",
"aprsd.plugins.weather.USWeatherPlugin",
"aprsd.plugins.version.VersionPlugin",
]
CORE_NOTIFY_PLUGINS = [
"aprsd.plugins.notify.NotifySeenPlugin",
]
class APRSDCommandSpec:
hookspec = pluggy.HookspecMarker("aprsd")
hookimpl = pluggy.HookimplMarker("aprsd")
class APRSDPluginSpec:
"""A hook specification namespace."""
@hookspec
def run(self, fromcall, message, ack):
def filter(self, packet: type[packets.Packet]):
"""My special little hook that you can customize."""
pass
class APRSDPluginBase(metaclass=abc.ABCMeta):
def __init__(self, config):
"""The aprsd config object is stored."""
self.config = config
"""The base class for all APRSD Plugins."""
config = None
rx_count = 0
tx_count = 0
version = aprsd.__version__
# Holds the list of APRSDThreads that the plugin creates
threads = []
# Set this in setup()
enabled = False
def __init__(self):
self.message_counter = 0
self.setup()
self.threads = self.create_threads() or []
self.start_threads()
def start_threads(self) -> None:
if self.enabled and self.threads:
if not isinstance(self.threads, list):
self.threads = [self.threads]
try:
for thread in self.threads:
if isinstance(thread, threads.APRSDThread):
thread.start()
else:
LOG.error(
"Can't start thread {}:{}, Must be a child "
"of aprsd.threads.APRSDThread".format(
self,
thread,
),
)
except Exception:
LOG.error(
"Failed to start threads for plugin {}".format(
self,
),
)
@property
def message_count(self) -> int:
return self.message_counter
def help(self) -> str:
return "Help!"
@abc.abstractmethod
def setup(self):
"""Do any plugin setup here."""
self.enabled = True
def create_threads(self):
"""Gives the plugin writer the ability start a background thread."""
return []
def rx_inc(self):
self.rx_count += 1
def tx_inc(self):
self.tx_count += 1
def stop_threads(self):
"""Stop any threads this plugin might have created."""
for thread in self.threads:
if isinstance(thread, threads.APRSDThread):
thread.stop()
@abc.abstractmethod
def filter(self, packet: type[packets.Packet]) -> str | packets.MessagePacket:
pass
@abc.abstractmethod
def process(self, packet: type[packets.Packet]):
"""This is called when the filter passes."""
class APRSDWatchListPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
"""Base plugin class for all notification APRSD plugins.
All these plugins will get every packet seen from the HAM
callsigns registered in the config file's watch_list.
When you want to 'notify' something when a packet is seen
by a particular HAM callsign, write a plugin based off of
this class.
"""
def setup(self):
# if we have a watch list enabled, we need to add filtering
# to enable seeing packets from the watch list.
if CONF.watch_list.enabled:
# watch list is enabled
self.enabled = True
watch_list = CONF.watch_list.callsigns
# make sure the timeout is set or this doesn't work
if watch_list:
aprs_client = client.client_factory.create().client
filter_str = "b/{}".format("/".join(watch_list))
aprs_client.set_filter(filter_str)
else:
LOG.warning("Watch list enabled, but no callsigns set.")
@hookimpl
def filter(self, packet: type[packets.Packet]) -> str | packets.MessagePacket:
result = packets.NULL_MESSAGE
if self.enabled:
wl = watch_list.WatchList()
if wl.callsign_in_watchlist(packet.from_call):
# packet is from a callsign in the watch list
self.rx_inc()
try:
result = self.process(packet)
except Exception as ex:
LOG.error(
"Plugin {} failed to process packet {}".format(
self.__class__, ex,
),
)
if result:
self.tx_inc()
else:
LOG.warning(f"{self.__class__} plugin is not enabled")
return result
class APRSDRegexCommandPluginBase(APRSDPluginBase, metaclass=abc.ABCMeta):
"""Base Message plugin class.
When you want to search for a particular command in an
APRSD message and send a direct reply, write a plugin
based off of this class.
"""
@property
def command_name(self):
@ -52,70 +196,175 @@ class APRSDPluginBase(metaclass=abc.ABCMeta):
"""The regex to match from the caller"""
raise NotImplementedError
@property
def version(self):
"""Version"""
raise NotImplementedError
def help(self):
return "{}: {}".format(
self.command_name.lower(),
self.command_regex,
)
def setup(self):
"""Do any plugin setup here."""
self.enabled = True
@hookimpl
def run(self, fromcall, message, ack):
if re.search(self.command_regex, message):
return self.command(fromcall, message, ack)
def filter(self, packet: packets.MessagePacket) -> str | packets.MessagePacket:
LOG.debug(f"{self.__class__.__name__} called")
if not self.enabled:
result = f"{self.__class__.__name__} isn't enabled"
LOG.warning(result)
return result
@abc.abstractmethod
def command(self, fromcall, message, ack):
"""This is the command that runs when the regex matches.
if not isinstance(packet, packets.MessagePacket):
LOG.warning(f"{self.__class__.__name__} Got a {packet.__class__.__name__} ignoring")
return packets.NULL_MESSAGE
To reply with a message over the air, return a string
to send.
"""
pass
result = None
message = packet.message_text
tocall = packet.to_call
# Only process messages destined for us
# and is an APRS message format and has a message.
if (
tocall == CONF.callsign
and isinstance(packet, packets.MessagePacket)
and message
):
if re.search(self.command_regex, message, re.IGNORECASE):
self.rx_inc()
try:
result = self.process(packet)
except Exception as ex:
LOG.error(
"Plugin {} failed to process packet {}".format(
self.__class__, ex,
),
)
LOG.exception(ex)
if result:
self.tx_inc()
return result
class APRSFIKEYMixin:
"""Mixin class to enable checking the existence of the aprs.fi apiKey."""
def ensure_aprs_fi_key(self):
if not CONF.aprs_fi.apiKey:
LOG.error("Config aprs_fi.apiKey is not set")
self.enabled = False
else:
self.enabled = True
class HelpPlugin(APRSDRegexCommandPluginBase):
"""Help Plugin that is always enabled.
This plugin is in this file to prevent a circular import.
"""
command_regex = "^[hH]"
command_name = "help"
def help(self):
return "Help: send APRS help or help <plugin>"
def process(self, packet: packets.MessagePacket):
LOG.info("HelpPlugin")
# fromcall = packet.get("from")
message = packet.message_text
# ack = packet.get("msgNo", "0")
a = re.search(r"^.*\s+(.*)", message)
command_name = None
if a is not None:
command_name = a.group(1).lower()
pm = PluginManager()
if command_name and "?" not in command_name:
# user wants help for a specific plugin
reply = None
for p in pm.get_plugins():
if (
p.enabled and isinstance(p, APRSDRegexCommandPluginBase)
and p.command_name.lower() == command_name
):
reply = p.help()
if reply:
return reply
list = []
for p in pm.get_plugins():
LOG.debug(p)
if p.enabled and isinstance(p, APRSDRegexCommandPluginBase):
name = p.command_name.lower()
if name not in list and "help" not in name:
list.append(name)
list.sort()
reply = " ".join(list)
lines = textwrap.wrap(reply, 60)
replies = ["Send APRS MSG of 'help' or 'help <plugin>'"]
for line in lines:
replies.append(f"plugins: {line}")
for entry in replies:
LOG.debug(f"{len(entry)} {entry}")
LOG.debug(f"{replies}")
return replies
class PluginManager:
# The singleton instance object for this class
_instance = None
# the pluggy PluginManager
# the pluggy PluginManager for all Message plugins
_pluggy_pm = None
# the pluggy PluginManager for all WatchList plugins
_watchlist_pm = None
# aprsd config dict
config = None
lock = None
def __new__(cls, *args, **kwargs):
"""This magic turns this into a singleton."""
if cls._instance is None:
cls._instance = super().__new__(cls)
# Put any initialization here.
cls._instance.lock = threading.Lock()
cls._instance._init()
return cls._instance
def __init__(self, config=None):
self.obj_list = []
if config:
self.config = config
def _init(self):
self._pluggy_pm = pluggy.PluginManager("aprsd")
self._pluggy_pm.add_hookspecs(APRSDPluginSpec)
# For the watchlist plugins
self._watchlist_pm = pluggy.PluginManager("aprsd")
self._watchlist_pm.add_hookspecs(APRSDPluginSpec)
def load_plugins_from_path(self, module_path):
if not os.path.exists(module_path):
LOG.error("plugin path '{}' doesn't exist.".format(module_path))
return None
def stats(self, serializable=False) -> dict:
"""Collect and return stats for all plugins."""
def full_name_with_qualname(obj):
return "{}.{}".format(
obj.__class__.__module__,
obj.__class__.__qualname__,
)
dir_path = os.path.realpath(module_path)
pattern = "*.py"
plugin_stats = {}
plugins = self.get_plugins()
if plugins:
self.obj_list = []
for p in plugins:
plugin_stats[full_name_with_qualname(p)] = {
"enabled": p.enabled,
"rx": p.rx_count,
"tx": p.tx_count,
"version": p.version,
}
for path, _subdirs, files in os.walk(dir_path):
for name in files:
if fnmatch.fnmatch(name, pattern):
LOG.debug("MODULE? '{}' '{}'".format(name, path))
module = smuggle("{}/{}".format(path, name))
for mem_name, obj in inspect.getmembers(module):
if inspect.isclass(obj) and self.is_plugin(obj):
self.obj_list.append(
{"name": mem_name, "obj": obj(self.config)},
)
return self.obj_list
return plugin_stats
def is_plugin(self, obj):
for c in inspect.getmro(obj):
@ -124,19 +373,32 @@ class PluginManager:
return False
def _create_class(self, module_class_string, super_cls: type = None, **kwargs):
def _create_class(
self,
module_class_string,
super_cls: type = None,
**kwargs,
):
"""
Method to create a class from a fqn python string.
:param module_class_string: full name of the class to create an object of
:param module_class_string: full name of the class to create an object
:param super_cls: expected super class for validity, None if bypass
:param kwargs: parameters to pass
:return:
"""
module_name, class_name = module_class_string.rsplit(".", 1)
module_name = None
class_name = None
try:
module_name, class_name = module_class_string.rsplit(".", 1)
module = importlib.import_module(module_name)
# Commented out because the email thread starts in a different context
# and hence gives a different singleton for the EmailStats
# module = importlib.reload(module)
except Exception as ex:
LOG.error("Failed to load Plugin '{}' : '{}'".format(module_name, ex))
if not module_name:
LOG.error(f"Failed to load Plugin {module_class_string}")
else:
LOG.error(f"Failed to load Plugin '{module_name}' : '{ex}'")
return
assert hasattr(module, class_name), "class {} is not in {}".format(
@ -166,68 +428,114 @@ class PluginManager:
plugin_obj = self._create_class(
plugin_name,
APRSDPluginBase,
config=self.config,
)
if plugin_obj:
LOG.info(
"Registering Command plugin '{}'({}) '{}'".format(
plugin_name,
plugin_obj.version,
plugin_obj.command_regex,
),
)
self._pluggy_pm.register(plugin_obj)
if isinstance(plugin_obj, APRSDWatchListPluginBase):
if plugin_obj.enabled:
LOG.info(
"Registering WatchList plugin '{}'({})".format(
plugin_name,
plugin_obj.version,
),
)
self._watchlist_pm.register(plugin_obj)
else:
LOG.warning(f"Plugin {plugin_obj.__class__.__name__} is disabled")
elif isinstance(plugin_obj, APRSDRegexCommandPluginBase):
if plugin_obj.enabled:
LOG.info(
"Registering Regex plugin '{}'({}) -- {}".format(
plugin_name,
plugin_obj.version,
plugin_obj.command_regex,
),
)
self._pluggy_pm.register(plugin_obj)
else:
LOG.warning(f"Plugin {plugin_obj.__class__.__name__} is disabled")
elif isinstance(plugin_obj, APRSDPluginBase):
if plugin_obj.enabled:
LOG.info(
"Registering Base plugin '{}'({})".format(
plugin_name,
plugin_obj.version,
),
)
self._pluggy_pm.register(plugin_obj)
else:
LOG.warning(f"Plugin {plugin_obj.__class__.__name__} is disabled")
except Exception as ex:
LOG.exception("Couldn't load plugin '{}'".format(plugin_name), ex)
LOG.error(f"Couldn't load plugin '{plugin_name}'")
LOG.exception(ex)
def setup_plugins(self):
def reload_plugins(self):
with self.lock:
del self._pluggy_pm
self.setup_plugins()
def setup_plugins(self, load_help_plugin=True):
"""Create the plugin manager and register plugins."""
LOG.info("Loading Core APRSD Command Plugins")
enabled_plugins = self.config["aprsd"].get("enabled_plugins", None)
self._pluggy_pm = pluggy.PluginManager("aprsd")
self._pluggy_pm.add_hookspecs(APRSDCommandSpec)
LOG.info("Loading APRSD Plugins")
# Help plugin is always enabled.
if load_help_plugin:
_help = HelpPlugin()
self._pluggy_pm.register(_help)
enabled_plugins = CONF.enabled_plugins
if enabled_plugins:
for p_name in enabled_plugins:
self._load_plugin(p_name)
else:
# Enabled plugins isn't set, so we default to loading all of
# the core plugins.
for p_name in CORE_PLUGINS:
for p_name in CORE_MESSAGE_PLUGINS:
self._load_plugin(p_name)
plugin_dir = self.config["aprsd"].get("plugin_dir", None)
if plugin_dir:
LOG.info("Trying to load custom plugins from '{}'".format(plugin_dir))
plugins_list = self.load_plugins_from_path(plugin_dir)
if plugins_list:
LOG.info("Discovered {} modules to load".format(len(plugins_list)))
for o in plugins_list:
plugin_obj = None
# not setting enabled plugins means load all?
plugin_obj = o["obj"]
if plugin_obj:
LOG.info(
"Registering Command plugin '{}'({}) '{}'".format(
o["name"],
o["obj"].version,
o["obj"].command_regex,
),
)
self._pluggy_pm.register(o["obj"])
else:
LOG.info("Skipping Custom Plugins directory.")
LOG.info("Completed Plugin Loading.")
def run(self, *args, **kwargs):
"""Execute all the pluguns run method."""
return self._pluggy_pm.hook.run(*args, **kwargs)
def run(self, packet: packets.MessagePacket):
"""Execute all the plugins run method."""
with self.lock:
return self._pluggy_pm.hook.filter(packet=packet)
def register(self, obj):
def run_watchlist(self, packet: packets.Packet):
with self.lock:
return self._watchlist_pm.hook.filter(packet=packet)
def stop(self):
"""Stop all threads created by all plugins."""
with self.lock:
for p in self.get_plugins():
if hasattr(p, "stop_threads"):
p.stop_threads()
def register_msg(self, obj):
"""Register the plugin."""
self._pluggy_pm.register(obj)
with self.lock:
self._pluggy_pm.register(obj)
def get_plugins(self):
return self._pluggy_pm.get_plugins()
plugin_list = []
if self._pluggy_pm:
for plug in self._pluggy_pm.get_plugins():
plugin_list.append(plug)
if self._watchlist_pm:
for plug in self._watchlist_pm.get_plugins():
plugin_list.append(plug)
return plugin_list
def get_watchlist_plugins(self):
pl = []
if self._watchlist_pm:
for plug in self._watchlist_pm.get_plugins():
pl.append(plug)
return pl
def get_message_plugins(self):
pl = []
if self._pluggy_pm:
for plug in self._pluggy_pm.get_plugins():
pl.append(plug)
return pl
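
To show how the regex command base class above is meant to be used, here is a bare-bones plugin sketch; the EchoPlugin name, regex, and reply text are made up, and it would be activated by listing it in the enabled_plugins config option.
from aprsd import packets, plugin


class EchoPlugin(plugin.APRSDRegexCommandPluginBase):
    """Reply to any message starting with 'echo'."""

    command_regex = "^[eE]cho"
    command_name = "echo"
    short_description = "Echo back the message text"

    def setup(self):
        # Nothing to configure; just mark the plugin usable.
        self.enabled = True

    def process(self, packet: packets.MessagePacket):
        # Whatever string is returned is sent back to the caller over APRS.
        return f"echo: {packet.message_text}"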

86
aprsd/plugin_utils.py Normal file

@ -0,0 +1,86 @@
# Utilities for plugins to use
import json
import logging
import requests
LOG = logging.getLogger("APRSD")
def get_aprs_fi(api_key, callsign):
LOG.debug(f"Fetch aprs.fi location for '{callsign}'")
try:
url = (
"http://api.aprs.fi/api/get?"
"&what=loc&apikey={}&format=json"
"&name={}".format(api_key, callsign)
)
response = requests.get(url)
except Exception:
raise Exception("Failed to get aprs.fi location")
else:
response.raise_for_status()
return json.loads(response.text)
def get_weather_gov_for_gps(lat, lon):
# FIXME(hemna) This is currently BROKEN
LOG.debug(f"Fetch station at {lat}, {lon}")
headers = requests.utils.default_headers()
headers.update(
{"User-Agent": "(aprsd, waboring@hemna.com)"},
)
try:
url2 = (
"https://forecast.weather.gov/MapClick.php?lat=%s"
"&lon=%s&FcstType=json" % (lat, lon)
# f"https://api.weather.gov/points/{lat},{lon}"
)
LOG.debug(f"Fetching weather '{url2}'")
response = requests.get(url2, headers=headers)
except Exception as e:
LOG.error(e)
raise Exception("Failed to get weather")
else:
response.raise_for_status()
return json.loads(response.text)
def get_weather_gov_metar(station):
LOG.debug(f"Fetch metar for station '{station}'")
try:
url = "https://api.weather.gov/stations/{}/observations/latest".format(
station,
)
response = requests.get(url)
except Exception:
raise Exception("Failed to fetch metar")
else:
response.raise_for_status()
return json.loads(response.text)
def fetch_openweathermap(api_key, lat, lon, units="metric", exclude=None):
LOG.debug(f"Fetch openweathermap for {lat}, {lon}")
if not exclude:
exclude = "minutely,hourly,daily,alerts"
try:
url = (
"https://api.openweathermap.org/data/2.5/onecall?"
"lat={}&lon={}&appid={}&units={}&exclude={}".format(
lat,
lon,
api_key,
units,
exclude,
)
)
LOG.debug(f"Fetching OWM weather '{url}'")
response = requests.get(url)
except Exception as e:
LOG.error(e)
raise Exception("Failed to get weather")
else:
response.raise_for_status()
return json.loads(response.text)
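
Taken together these helpers give a plugin a two-call path from a callsign to weather data. A hedged sketch of how the weather plugins below chain them (the API keys and callsign are placeholders):

from aprsd import plugin_utils

aprs_fi_key = "REPLACE-ME"   # placeholder aprs.fi API key
owm_key = "REPLACE-ME"       # placeholder OpenWeatherMap API key

# Resolve the last known position of a callsign via aprs.fi.
aprs_data = plugin_utils.get_aprs_fi(aprs_fi_key, "N0CALL")
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]

# Then ask OpenWeatherMap for conditions at that position.
wx = plugin_utils.fetch_openweathermap(owm_key, lat, lon, units="metric")
print(wx["current"]["temp"])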

View File

@ -1,36 +1,165 @@
import datetime
import email
from email.mime.text import MIMEText
import imaplib
import logging
import re
import smtplib
import threading
import time
from aprsd import email, messaging, plugin
import imapclient
from oslo_config import cfg
from aprsd import packets, plugin, threads, utils
from aprsd.threads import tx
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
shortcuts_dict = None
class EmailPlugin(plugin.APRSDPluginBase):
class EmailInfo:
"""A singleton thread safe mechanism for the global check_email_delay.
This has to be done because we have 2 separate threads that access
the delay value.
1) when EmailPlugin runs from a user message and
2) when the background EmailThread runs to check email.
Access the check email delay with
EmailInfo().delay
Set it with
EmailInfo().delay = 100
or
EmailInfo().delay += 10
"""
_instance = None
def __new__(cls, *args, **kwargs):
"""This magic turns this into a singleton."""
if cls._instance is None:
cls._instance = super().__new__(cls)
cls._instance.lock = threading.Lock()
cls._instance._delay = 60
return cls._instance
@property
def delay(self):
with self.lock:
return self._delay
@delay.setter
def delay(self, val):
with self.lock:
self._delay = val
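
A quick illustration of the singleton behaviour described in the docstring: both the plugin thread and the background email thread end up reading and writing the same, lock-protected delay.

info_a = EmailInfo()
info_b = EmailInfo()
assert info_a is info_b        # every construction returns the same instance

info_a.delay = 60              # plugin thread: activity, check again soon
info_b.delay += 10             # email thread: back off a little
print(EmailInfo().delay)       # -> 70, one shared value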
@utils.singleton
class EmailStats:
"""Singleton object to store stats related to email."""
_instance = None
tx = 0
rx = 0
email_thread_last_time = None
def stats(self, serializable=False):
if CONF.email_plugin.enabled:
last_check_time = self.email_thread_last_time
if serializable and last_check_time:
last_check_time = last_check_time.isoformat()
stats = {
"tx": self.tx,
"rx": self.rx,
"last_check_time": last_check_time,
}
else:
stats = {}
return stats
def tx_inc(self):
self.tx += 1
def rx_inc(self):
self.rx += 1
def email_thread_update(self):
self.email_thread_last_time = datetime.datetime.now()
class EmailPlugin(plugin.APRSDRegexCommandPluginBase):
"""Email Plugin."""
version = "1.0"
command_regex = "^-.*"
command_name = "email"
short_description = "Send and Receive email"
# message_number:time combos so we don't resend the same email in
# five mins {int:int}
email_sent_dict = {}
enabled = False
def command(self, fromcall, message, ack):
def setup(self):
"""Ensure that email is enabled and start the thread."""
if CONF.email_plugin.enabled:
self.enabled = True
if not CONF.email_plugin.callsign:
self.enabled = False
LOG.error("email_plugin.callsign is not set.")
return
if not CONF.email_plugin.imap_login:
LOG.error("email_plugin.imap_login not set. Disabling Plugin")
self.enabled = False
return
if not CONF.email_plugin.smtp_login:
LOG.error("email_plugin.smtp_login not set. Disabling Plugin")
self.enabled = False
return
shortcuts = _build_shortcuts_dict()
LOG.info(f"Email shortcuts {shortcuts}")
else:
LOG.info("Email services not enabled.")
self.enabled = False
def create_threads(self):
if self.enabled:
return APRSDEmailThread()
@trace.trace
def process(self, packet: packets.MessagePacket):
LOG.info("Email COMMAND")
reply = None
if not self.enabled:
# Email has not been enabled
# so the plugin will just NOOP
return packets.NULL_MESSAGE
searchstring = "^" + self.config["ham"]["callsign"] + ".*"
fromcall = packet.from_call
message = packet.message_text
ack = packet.get("msgNo", "0")
reply = None
if not CONF.email_plugin.enabled:
LOG.debug("Email is not enabled in config file ignoring.")
return "Email not enabled."
searchstring = "^" + CONF.email_plugin.callsign + ".*"
# only I can do email
if re.search(searchstring, fromcall):
# digits only, first one is number of emails to resend
r = re.search("^-([0-9])[0-9]*$", message)
if r is not None:
LOG.debug("RESEND EMAIL")
email.resend_email(r.group(1), fromcall)
reply = messaging.NULL_MESSAGE
resend_email(r.group(1), fromcall)
reply = packets.NULL_MESSAGE
# -user@address.com body of email
elif re.search(r"^-([A-Za-z0-9_\-\.@]+) (.*)", message):
# (same search again)
@ -39,15 +168,17 @@ class EmailPlugin(plugin.APRSDPluginBase):
to_addr = a.group(1)
content = a.group(2)
email_address = email.get_email_from_shortcut(to_addr)
email_address = get_email_from_shortcut(to_addr)
if not email_address:
reply = "Bad email address"
return reply
# send recipient link to aprs.fi map
if content == "mapme":
content = "Click for my location: http://aprs.fi/{}".format(
self.config["ham"]["callsign"],
content = (
"Click for my location: http://aprs.fi/{}" ""
).format(
CONF.email_plugin.callsign,
)
too_soon = 0
now = time.time()
@ -59,14 +190,14 @@ class EmailPlugin(plugin.APRSDPluginBase):
if timedelta < 300: # five minutes
too_soon = 1
if not too_soon or ack == 0:
LOG.info("Send email '{}'".format(content))
send_result = email.send_email(to_addr, content)
reply = messaging.NULL_MESSAGE
LOG.info(f"Send email '{content}'")
send_result = send_email(to_addr, content)
reply = packets.NULL_MESSAGE
if send_result != 0:
reply = "-{} failed".format(to_addr)
# messaging.send_message(fromcall, "-" + to_addr + " failed")
reply = f"-{to_addr} failed"
else:
# clear email sent dictionary if somehow goes over 100
# clear email sent dictionary if somehow goes
# over 100
if len(self.email_sent_dict) > 98:
LOG.debug(
"DEBUG: email_sent_dict is big ("
@ -76,6 +207,7 @@ class EmailPlugin(plugin.APRSDPluginBase):
self.email_sent_dict.clear()
self.email_sent_dict[ack] = now
else:
reply = packets.NULL_MESSAGE
LOG.info(
"Email for message number "
+ ack
@ -83,6 +215,495 @@ class EmailPlugin(plugin.APRSDPluginBase):
)
else:
reply = "Bad email address"
# messaging.send_message(fromcall, "Bad email address")
return reply
def _imap_connect():
imap_port = CONF.email_plugin.imap_port
use_ssl = CONF.email_plugin.imap_use_ssl
try:
server = imapclient.IMAPClient(
CONF.email_plugin.imap_host,
port=imap_port,
use_uid=True,
ssl=use_ssl,
timeout=30,
)
except Exception:
LOG.exception("Failed to connect IMAP server")
return
try:
server.login(
CONF.email_plugin.imap_login,
CONF.email_plugin.imap_password,
)
except (imaplib.IMAP4.error, Exception) as e:
msg = getattr(e, "message", repr(e))
LOG.error(f"Failed to login {msg}")
return
server.select_folder("INBOX")
server.fetch = trace.trace(server.fetch)
server.search = trace.trace(server.search)
server.remove_flags = trace.trace(server.remove_flags)
server.add_flags = trace.trace(server.add_flags)
return server
def _smtp_connect():
host = CONF.email_plugin.smtp_host
smtp_port = CONF.email_plugin.smtp_port
use_ssl = CONF.email_plugin.smtp_use_ssl
msg = "{}{}:{}".format("SSL " if use_ssl else "", host, smtp_port)
LOG.debug(
"Connect to SMTP host {} with user '{}'".format(
msg,
CONF.email_plugin.smtp_login,
),
)
try:
if use_ssl:
server = smtplib.SMTP_SSL(
host=host,
port=smtp_port,
timeout=30,
)
else:
server = smtplib.SMTP(
host=host,
port=smtp_port,
timeout=30,
)
except Exception:
LOG.error("Couldn't connect to SMTP Server")
return
LOG.debug(f"Connected to smtp host {msg}")
debug = CONF.email_plugin.debug
if debug:
server.set_debuglevel(5)
server.sendmail = trace.trace(server.sendmail)
try:
server.login(
CONF.email_plugin.smtp_login,
CONF.email_plugin.smtp_password,
)
except Exception:
LOG.error("Couldn't connect to SMTP Server")
return
LOG.debug(f"Logged into SMTP server {msg}")
return server
def _build_shortcuts_dict():
global shortcuts_dict
if not shortcuts_dict:
if CONF.email_plugin.email_shortcuts:
shortcuts_dict = {}
tmp = CONF.email_plugin.email_shortcuts
for combo in tmp:
entry = combo.split("=")
shortcuts_dict[entry[0]] = entry[1]
else:
shortcuts_dict = {}
return shortcuts_dict
def get_email_from_shortcut(addr):
if CONF.email_plugin.email_shortcuts:
shortcuts = _build_shortcuts_dict()
LOG.info(f"Shortcut lookup {addr} returns {shortcuts.get(addr, addr)}")
return shortcuts.get(addr, addr)
else:
return addr
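
The email_shortcuts config option is a list of alias=address strings; a small sketch of how the two helpers above interpret it (the aliases and addresses are made up):

# Equivalent of _build_shortcuts_dict() for an illustrative config value.
shortcuts = ["wb=waboring@example.com", "mom=mom@example.com"]
table = dict(combo.split("=") for combo in shortcuts)

print(table.get("wb", "wb"))   # -> waboring@example.com
# Unknown keys pass through unchanged, mirroring get_email_from_shortcut().
print(table.get("someone@example.com", "someone@example.com"))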
def validate_email_config(disable_validation=False):
"""function to simply ensure we can connect to email services.
This helps with failing early during startup.
"""
LOG.info("Checking IMAP configuration")
imap_server = _imap_connect()
LOG.info("Checking SMTP configuration")
smtp_server = _smtp_connect()
if imap_server and smtp_server:
return True
else:
return False
@trace.trace
def parse_email(msgid, data, server):
envelope = data[b"ENVELOPE"]
# email address match
# use raw string to avoid invalid escape sequence errors r"string here"
f = re.search(r"([\.\w_-]+@[\.\w_-]+)", str(envelope.from_[0]))
if f is not None:
from_addr = f.group(1)
else:
from_addr = "noaddr"
LOG.debug(f"Got a message from '{from_addr}'")
try:
m = server.fetch([msgid], ["RFC822"])
except Exception:
LOG.exception("Couldn't fetch email from server in parse_email")
return
msg = email.message_from_string(m[msgid][b"RFC822"].decode(errors="ignore"))
if msg.is_multipart():
text = ""
html = None
# default in case body somehow isn't set below - happened once
body = b"* unreadable msg received"
# this uses the last text or html part in the email,
# phone companies often put content in an attachment
for part in msg.get_payload():
if part.get_content_charset() is None:
# or BREAK when we hit a text or html?
# We cannot know the character set,
# so return decoded "something"
LOG.debug("Email got unknown content type")
text = part.get_payload(decode=True)
continue
charset = part.get_content_charset()
if part.get_content_type() == "text/plain":
LOG.debug("Email got text/plain")
text = str(
part.get_payload(decode=True),
str(charset),
"ignore",
).encode("utf8", "replace")
if part.get_content_type() == "text/html":
LOG.debug("Email got text/html")
html = str(
part.get_payload(decode=True),
str(charset),
"ignore",
).encode("utf8", "replace")
if text is not None:
# strip removes white space fore and aft of string
body = text.strip()
else:
body = html.strip()
else: # message is not multipart
# email.uscc.net sends no charset, blows up unicode function below
LOG.debug("Email is not multipart")
if msg.get_content_charset() is None:
text = str(msg.get_payload(decode=True), "US-ASCII", "ignore").encode(
"utf8",
"replace",
)
else:
text = str(
msg.get_payload(decode=True),
msg.get_content_charset(),
"ignore",
).encode("utf8", "replace")
body = text.strip()
# FIXED: UnicodeDecodeError: 'ascii' codec can't decode byte 0xf0
# in position 6: ordinal not in range(128)
# decode with errors='ignore'. be sure to encode it before we return
# it below, also with errors='ignore'
try:
body = body.decode(errors="ignore")
except Exception:
LOG.exception("Unicode decode failure")
LOG.error(f"Unidoce decode failed: {str(body)}")
body = "Unreadable unicode msg"
# strip all html tags
body = re.sub("<[^<]+?>", "", body)
# strip CR/LF, make it one line, .rstrip fails at this
body = body.replace("\n", " ").replace("\r", " ")
# ascii might be out of range, so encode it, removing any error characters
body = body.encode(errors="ignore")
return body, from_addr
# end parse_email
@trace.trace
def send_email(to_addr, content):
shortcuts = _build_shortcuts_dict()
email_address = get_email_from_shortcut(to_addr)
LOG.info("Sending Email_________________")
if to_addr in shortcuts:
LOG.info(f"To : {to_addr}")
to_addr = email_address
LOG.info(f" ({to_addr})")
subject = CONF.email_plugin.callsign
# content = content + "\n\n(NOTE: reply with one line)"
LOG.info(f"Subject : {subject}")
LOG.info(f"Body : {content}")
# check email more often since there's activity right now
EmailInfo().delay = 60
msg = MIMEText(content)
msg["Subject"] = subject
msg["From"] = CONF.email_plugin.smtp_login
msg["To"] = to_addr
server = _smtp_connect()
if server:
try:
server.sendmail(
CONF.email_plugin.smtp_login,
[to_addr],
msg.as_string(),
)
EmailStats().tx_inc()
except Exception:
LOG.exception("Sendmail Error!!!!")
server.quit()
return -1
server.quit()
return 0
@trace.trace
def resend_email(count, fromcall):
date = datetime.datetime.now()
month = date.strftime("%B")[:3] # Nov, Mar, Apr
day = date.day
year = date.year
today = f"{day}-{month}-{year}"
shortcuts = _build_shortcuts_dict()
# swap key/value
shortcuts_inverted = {v: k for k, v in shortcuts.items()}
try:
server = _imap_connect()
except Exception:
LOG.exception("Failed to Connect to IMAP. Cannot resend email ")
return
try:
messages = server.search(["SINCE", today])
except Exception:
LOG.exception("Couldn't search for emails in resend_email ")
return
# LOG.debug("%d messages received today" % len(messages))
msgexists = False
messages.sort(reverse=True)
del messages[int(count) :] # only the latest "count" messages
for message in messages:
try:
parts = server.fetch(message, ["ENVELOPE"]).items()
except Exception:
LOG.exception("Couldn't fetch email parts in resend_email")
continue
for msgid, data in list(parts):
# one at a time, otherwise order is random
(body, from_addr) = parse_email(msgid, data, server)
# unset seen flag, will stay bold in email client
try:
server.remove_flags(msgid, [imapclient.SEEN])
except Exception:
LOG.exception("Failed to remove SEEN flag in resend_email")
if from_addr in shortcuts_inverted:
# reverse lookup of a shortcut
from_addr = shortcuts_inverted[from_addr]
# asterisk indicates a resend
reply = "-" + from_addr + " * " + body.decode(errors="ignore")
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=fromcall,
message_text=reply,
),
)
msgexists = True
if msgexists is not True:
stm = time.localtime()
h = stm.tm_hour
m = stm.tm_min
s = stm.tm_sec
# append time as a kind of serial number to prevent FT1XDR from
# thinking this is a duplicate message.
# The FT1XDR pretty much ignores the aprs message number in this
# regard. The FTM400 gets it right.
reply = "No new msg {}:{}:{}".format(
str(h).zfill(2),
str(m).zfill(2),
str(s).zfill(2),
)
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=fromcall,
message_text=reply,
),
)
# check email more often since we're resending one now
EmailInfo().delay = 60
server.logout()
# end resend_email()
class APRSDEmailThread(threads.APRSDThread):
def __init__(self):
super().__init__("EmailThread")
self.past = datetime.datetime.now()
def loop(self):
time.sleep(5)
EmailStats().email_thread_update()
# always sleep for 5 seconds and see if we need to check email
# This allows CTRL-C to stop the execution of this loop sooner
# than check_email_delay time
now = datetime.datetime.now()
if now - self.past > datetime.timedelta(seconds=EmailInfo().delay):
# It's time to check email
# slowly increase delay every iteration, max out at 300 seconds
# any send/receive/resend activity will reset this to 60 seconds
if EmailInfo().delay < 300:
EmailInfo().delay += 10
LOG.debug(
f"check_email_delay is {EmailInfo().delay} seconds ",
)
shortcuts = _build_shortcuts_dict()
# swap key/value
shortcuts_inverted = {v: k for k, v in shortcuts.items()}
date = datetime.datetime.now()
month = date.strftime("%B")[:3] # Nov, Mar, Apr
day = date.day
year = date.year
today = f"{day}-{month}-{year}"
try:
server = _imap_connect()
except Exception:
LOG.exception("IMAP Failed to connect")
return True
try:
messages = server.search(["SINCE", today])
except Exception:
LOG.exception("IMAP failed to search for messages since today.")
return True
LOG.debug(f"{len(messages)} messages received today")
try:
_msgs = server.fetch(messages, ["ENVELOPE"])
except Exception:
LOG.exception("IMAP failed to fetch/flag messages: ")
return True
for msgid, data in _msgs.items():
envelope = data[b"ENVELOPE"]
LOG.debug(
'ID:%d "%s" (%s)'
% (msgid, envelope.subject.decode(), envelope.date),
)
f = re.search(
r"'([[A-a][0-9]_-]+@[[A-a][0-9]_-\.]+)",
str(envelope.from_[0]),
)
if f is not None:
from_addr = f.group(1)
else:
from_addr = "noaddr"
# LOG.debug("Message flags/tags: " +
# str(server.get_flags(msgid)[msgid]))
# if "APRS" not in server.get_flags(msgid)[msgid]:
# in python3, imap tags are unicode. in py2 they're strings.
# so .decode them to handle both
try:
taglist = [
x.decode(errors="ignore")
for x in server.get_flags(msgid)[msgid]
]
except Exception:
LOG.error("Failed to get flags.")
break
if "APRS" not in taglist:
# if msg not flagged as sent via aprs
try:
server.fetch([msgid], ["RFC822"])
except Exception:
LOG.exception("Failed single server fetch for RFC822")
break
(body, from_addr) = parse_email(msgid, data, server)
# unset seen flag, will stay bold in email client
try:
server.remove_flags(msgid, [imapclient.SEEN])
except Exception:
LOG.exception("Failed to remove flags SEEN")
# Not much we can do here, so lets try and
# send the aprs message anyway
if from_addr in shortcuts_inverted:
# reverse lookup of a shortcut
from_addr = shortcuts_inverted[from_addr]
reply = "-" + from_addr + " " + body.decode(errors="ignore")
# Send the message to the registered user in the
# config ham.callsign
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=CONF.email_plugin.callsign,
message_text=reply,
),
)
# flag message as sent via aprs
try:
server.add_flags(msgid, ["APRS"])
# unset seen flag, will stay bold in email client
except Exception:
LOG.exception("Couldn't add APRS flag to email")
try:
server.remove_flags(msgid, [imapclient.SEEN])
except Exception:
LOG.exception("Couldn't remove seen flag from email")
# check email more often since we just received an email
EmailInfo().delay = 60
# reset clock
LOG.debug("Done looping over Server.fetch, log out.")
self.past = datetime.datetime.now()
try:
server.logout()
except Exception:
LOG.exception("IMAP failed to logout: ")
return True
else:
# We haven't hit the email delay yet.
# LOG.debug("Delta({}) < {}".format(now - past, check_email_delay))
return True
return True
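
The net effect of the delay handling in loop() is a slowly rising poll interval, capped at 300 seconds and dropped back to 60 whenever mail is sent, received, or resent. A small illustration of that schedule:

# Illustration only: the check interval grows by 10s per idle check.
delay = 60
schedule = []
while delay < 300:
    schedule.append(delay)
    delay += 10
schedule.append(delay)
# schedule -> [60, 70, 80, ..., 290, 300]; any email activity resets to 60.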

View File

@ -2,29 +2,44 @@ import logging
import shutil
import subprocess
from aprsd import plugin
from aprsd import packets, plugin
from aprsd.utils import trace
LOG = logging.getLogger("APRSD")
DEFAULT_FORTUNE_PATH = '/usr/games/fortune'
class FortunePlugin(plugin.APRSDPluginBase):
class FortunePlugin(plugin.APRSDRegexCommandPluginBase):
"""Fortune."""
version = "1.0"
command_regex = "^[fF]"
command_regex = r"^([f]|[f]\s|fortune)"
command_name = "fortune"
short_description = "Give me a fortune"
def command(self, fromcall, message, ack):
fortune_path = None
def setup(self):
self.fortune_path = shutil.which(DEFAULT_FORTUNE_PATH)
LOG.info(f"Fortune path {self.fortune_path}")
if not self.fortune_path:
self.enabled = False
else:
self.enabled = True
@trace.trace
def process(self, packet: packets.MessagePacket):
LOG.info("FortunePlugin")
# fromcall = packet.get("from")
# message = packet.get("message_text", None)
# ack = packet.get("msgNo", "0")
reply = None
fortune_path = shutil.which("fortune")
if not fortune_path:
reply = "Fortune command not installed"
return reply
try:
cmnd = [fortune_path, "-s", "-n 60"]
cmnd = [self.fortune_path, "-s", "-n 60"]
command = " ".join(cmnd)
output = subprocess.check_output(
command,
@ -32,8 +47,14 @@ class FortunePlugin(plugin.APRSDPluginBase):
timeout=3,
universal_newlines=True,
)
output = (
output.replace("\r", "")
.replace("\n", "")
.replace(" ", "")
.replace("\t", " ")
)
except subprocess.CalledProcessError as ex:
reply = "Fortune command failed '{}'".format(ex.output)
reply = f"Fortune command failed '{ex.output}'"
else:
reply = output

View File

@ -1,78 +1,179 @@
import json
import logging
import re
import time
from aprsd import plugin
import requests
from geopy.geocoders import ArcGIS, AzureMaps, Baidu, Bing, GoogleV3
from geopy.geocoders import HereV7, Nominatim, OpenCage, TomTom, What3WordsV3, Woosmap
from oslo_config import cfg
from aprsd import packets, plugin, plugin_utils
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class LocationPlugin(plugin.APRSDPluginBase):
class UsLocation:
raw = {}
def __init__(self, info):
self.info = info
def __str__(self):
return self.info
class USGov:
"""US Government geocoder that uses the geopy API.
This is a dummy class that implements the geopy reverse API,
so the factory can return an object that conforms to the API.
"""
def reverse(self, coordinates):
"""Reverse geocode a coordinate."""
LOG.info(f"USGov reverse geocode {coordinates}")
coords = coordinates.split(",")
lat = float(coords[0])
lon = float(coords[1])
result = plugin_utils.get_weather_gov_for_gps(lat, lon)
# LOG.info(f"WEATHER: {result}")
# LOG.info(f"area description {result['location']['areaDescription']}")
if 'location' in result:
loc = UsLocation(result['location']['areaDescription'])
else:
loc = UsLocation("Unknown Location")
LOG.info(f"USGov reverse geocode LOC {loc}")
return loc
def geopy_factory():
"""Factory function for geopy geocoders."""
geocoder = CONF.location_plugin.geopy_geocoder
LOG.info(f"Using geocoder: {geocoder}")
user_agent = CONF.location_plugin.user_agent
LOG.info(f"Using user_agent: {user_agent}")
if geocoder == "Nominatim":
return Nominatim(user_agent=user_agent)
elif geocoder == "USGov":
return USGov()
elif geocoder == "ArcGIS":
return ArcGIS(
username=CONF.location_plugin.arcgis_username,
password=CONF.location_plugin.arcgis_password,
user_agent=user_agent,
)
elif geocoder == "AzureMaps":
return AzureMaps(
user_agent=user_agent,
subscription_key=CONF.location_plugin.azuremaps_subscription_key,
)
elif geocoder == "Baidu":
return Baidu(user_agent=user_agent, api_key=CONF.location_plugin.baidu_api_key)
elif geocoder == "Bing":
return Bing(user_agent=user_agent, api_key=CONF.location_plugin.bing_api_key)
elif geocoder == "GoogleV3":
return GoogleV3(user_agent=user_agent, api_key=CONF.location_plugin.google_api_key)
elif geocoder == "HERE":
return HereV7(user_agent=user_agent, api_key=CONF.location_plugin.here_api_key)
elif geocoder == "OpenCage":
return OpenCage(user_agent=user_agent, api_key=CONF.location_plugin.opencage_api_key)
elif geocoder == "TomTom":
return TomTom(user_agent=user_agent, api_key=CONF.location_plugin.tomtom_api_key)
elif geocoder == "What3Words":
return What3WordsV3(user_agent=user_agent, api_key=CONF.location_plugin.what3words_api_key)
elif geocoder == "Woosmap":
return Woosmap(user_agent=user_agent, api_key=CONF.location_plugin.woosmap_api_key)
else:
raise ValueError(f"Unknown geocoder: {geocoder}")
class LocationPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
"""Location!"""
version = "1.0"
command_regex = "^[lL]"
command_regex = r"^([l]|[l]\s|location)"
command_name = "location"
short_description = "Where in the world is a CALLSIGN's last GPS beacon?"
config_items = {"apikey": "aprs.fi api key here"}
def setup(self):
self.ensure_aprs_fi_key()
def command(self, fromcall, message, ack):
@trace.trace
def process(self, packet: packets.MessagePacket):
LOG.info("Location Plugin")
# get last location of a callsign, get descriptive name from weather service
api_key = self.config["aprs.fi"]["apiKey"]
try:
# optional second argument is a callsign to search
a = re.search(r"^.*\s+(.*)", message)
if a is not None:
searchcall = a.group(1)
searchcall = searchcall.upper()
else:
# if no second argument, search for calling station
searchcall = fromcall
url = (
"http://api.aprs.fi/api/get?name="
+ searchcall
+ "&what=loc&apikey={}&format=json".format(api_key)
)
response = requests.get(url)
# aprs_data = json.loads(response.read())
aprs_data = json.loads(response.text)
LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
try: # altitude not always provided
alt = aprs_data["entries"][0]["altitude"]
except Exception:
alt = 0
altfeet = int(alt * 3.28084)
aprs_lasttime_seconds = aprs_data["entries"][0]["lasttime"]
# aprs_lasttime_seconds = aprs_lasttime_seconds.encode(
# "ascii", errors="ignore"
# ) # unicode to ascii
delta_seconds = time.time() - int(aprs_lasttime_seconds)
delta_hours = delta_seconds / 60 / 60
url2 = (
"https://forecast.weather.gov/MapClick.php?lat="
+ str(lat)
+ "&lon="
+ str(lon)
+ "&FcstType=json"
)
response2 = requests.get(url2)
wx_data = json.loads(response2.text)
fromcall = packet.from_call
message = packet.get("message_text", None)
reply = "{}: {} {}' {},{} {}h ago".format(
searchcall,
wx_data["location"]["areaDescription"],
str(altfeet),
str(lat),
str(lon),
str("%.1f" % round(delta_hours, 1)),
).rstrip()
except Exception as e:
LOG.debug("Locate failed with: " + "%s" % str(e))
reply = "Unable to find station " + searchcall + ". Sending beacons?"
api_key = CONF.aprs_fi.apiKey
# optional second argument is a callsign to search
a = re.search(r"^.*\s+(.*)", message)
if a is not None:
searchcall = a.group(1)
searchcall = searchcall.upper()
else:
# if no second argument, search for calling station
searchcall = fromcall
try:
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi '{ex}'")
return "Failed to fetch aprs.fi location"
LOG.debug(f"LocationPlugin: aprs_data = {aprs_data}")
if not len(aprs_data["entries"]):
LOG.error("Didn't get any entries from aprs.fi")
return "Failed to fetch aprs.fi location"
lat = float(aprs_data["entries"][0]["lat"])
lon = float(aprs_data["entries"][0]["lng"])
# Get some information about their location
try:
tic = time.perf_counter()
geolocator = geopy_factory()
LOG.info(f"Using GEOLOCATOR: {geolocator}")
coordinates = f"{lat:0.6f}, {lon:0.6f}"
location = geolocator.reverse(coordinates)
address = location.raw.get("address")
LOG.debug(f"GEOLOCATOR address: {address}")
toc = time.perf_counter()
if address:
LOG.info(f"Geopy address {address} took {toc - tic:0.4f}")
if address.get("country_code") == "us":
area_info = f"{address.get('county')}, {address.get('state')}"
else:
# what to do for address for non US?
area_info = f"{address.get('country'), 'Unknown'}"
else:
area_info = str(location)
except Exception as ex:
LOG.error(ex)
LOG.error(f"Failed to fetch Geopy address {ex}")
area_info = "Unknown Location"
try: # altitude not always provided
alt = float(aprs_data["entries"][0]["altitude"])
except Exception:
alt = 0
altfeet = int(alt * 3.28084)
aprs_lasttime_seconds = aprs_data["entries"][0]["lasttime"]
# aprs_lasttime_seconds = aprs_lasttime_seconds.encode(
# "ascii", errors="ignore"
# ) # unicode to ascii
delta_seconds = time.time() - int(aprs_lasttime_seconds)
delta_hours = delta_seconds / 60 / 60
reply = "{}: {} {}' {},{} {}h ago".format(
searchcall,
area_info,
str(altfeet),
f"{lat:0.2f}",
f"{lon:0.2f}",
str("%.1f" % round(delta_hours, 1)),
).rstrip()
return reply

61
aprsd/plugins/notify.py Normal file
View File

@ -0,0 +1,61 @@
import logging
from oslo_config import cfg
from aprsd import packets, plugin
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class NotifySeenPlugin(plugin.APRSDWatchListPluginBase):
"""Notification plugin to send seen message for callsign.
This plugin will track callsigns in the watch list and report
when a callsign has been seen when the last time they were
seen was older than the configured age limit.
"""
short_description = "Notify me when a CALLSIGN is recently seen on APRS-IS"
def process(self, packet: packets.MessagePacket):
LOG.info("NotifySeenPlugin")
notify_callsign = CONF.watch_list.alert_callsign
fromcall = packet.from_call
wl = packets.WatchList()
age = wl.age(fromcall)
if fromcall != notify_callsign:
if wl.is_old(fromcall):
LOG.info(
"NOTIFY {} last seen {} max age={}".format(
fromcall,
age,
wl.max_delta(),
),
)
packet_type = packet.__class__.__name__
# we shouldn't notify the alert user that they are online.
pkt = packets.MessagePacket(
from_call=CONF.callsign,
to_call=notify_callsign,
message_text=(
f"{fromcall} was just seen by type:'{packet_type}'"
),
allow_delay=False,
)
pkt.allow_delay = False
return pkt
else:
LOG.debug(
"Not old enough to notify on callsign "
f"'{fromcall}' : {age} < {wl.max_delta()}",
)
return packets.NULL_MESSAGE
else:
LOG.debug("fromcall and notify_callsign are the same, ignoring")
return packets.NULL_MESSAGE

View File

@ -2,19 +2,25 @@ import logging
import time
from aprsd import plugin
from aprsd.utils import trace
LOG = logging.getLogger("APRSD")
class PingPlugin(plugin.APRSDPluginBase):
class PingPlugin(plugin.APRSDRegexCommandPluginBase):
"""Ping."""
version = "1.0"
command_regex = "^[pP]"
command_regex = r"^([p]|[p]\s|ping)"
command_name = "ping"
short_description = "reply with a Pong!"
def command(self, fromcall, message, ack):
LOG.info("PINGPlugin")
@trace.trace
def process(self, packet):
LOG.info("PingPlugin")
# fromcall = packet.get("from")
# message = packet.get("message_text", None)
# ack = packet.get("msgNo", "0")
stm = time.localtime()
h = stm.tm_hour
m = stm.tm_min

View File

@ -1,64 +0,0 @@
import datetime
import logging
import re
from aprsd import messaging, plugin
LOG = logging.getLogger("APRSD")
class QueryPlugin(plugin.APRSDPluginBase):
"""Query command."""
version = "1.0"
command_regex = r"^\?.*"
command_name = "query"
def command(self, fromcall, message, ack):
LOG.info("Query COMMAND")
tracker = messaging.MsgTrack()
now = datetime.datetime.now()
reply = "Pending messages ({}) {}".format(
len(tracker),
now.strftime("%H:%M:%S"),
)
searchstring = "^" + self.config["ham"]["callsign"] + ".*"
# only I can do admin commands
if re.search(searchstring, fromcall):
# resend last N most recent: "?3"
r = re.search(r"^\?([0-9]).*", message)
if r is not None:
if len(tracker) > 0:
last_n = r.group(1)
reply = messaging.NULL_MESSAGE
LOG.debug(reply)
tracker.restart_delayed(count=int(last_n))
else:
reply = "No pending msgs to resend"
LOG.debug(reply)
return reply
# resend all: "?a"
r = re.search(r"^\?[aA].*", message)
if r is not None:
if len(tracker) > 0:
reply = messaging.NULL_MESSAGE
LOG.debug(reply)
tracker.restart_delayed()
else:
reply = "No pending msgs"
LOG.debug(reply)
return reply
# delete all: "?d"
r = re.search(r"^\?[dD].*", message)
if r is not None:
reply = "Deleted ALL pending msgs."
LOG.debug(reply)
tracker.flush()
return reply
return reply

View File

@ -1,28 +1,115 @@
import logging
import time
import re
from aprsd import fuzzyclock, plugin
from oslo_config import cfg
import pytz
from tzlocal import get_localzone
from aprsd import packets, plugin, plugin_utils
from aprsd.utils import fuzzy, trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class TimePlugin(plugin.APRSDPluginBase):
class TimePlugin(plugin.APRSDRegexCommandPluginBase):
"""Time command."""
version = "1.0"
command_regex = "^[tT]"
# Look for t or t<space> or T<space> or time
command_regex = r"^([t]|[t]\s|time)"
command_name = "time"
short_description = "What is the current local time."
def command(self, fromcall, message, ack):
LOG.info("TIME COMMAND")
stm = time.localtime()
h = stm.tm_hour
m = stm.tm_min
cur_time = fuzzyclock.fuzzy(h, m, 1)
reply = "{} ({}:{} PDT) ({})".format(
def _get_local_tz(self):
lz = get_localzone()
return pytz.timezone(str(lz))
def _get_utcnow(self):
return pytz.datetime.datetime.utcnow()
def build_date_str(self, localzone):
utcnow = self._get_utcnow()
gmt_t = pytz.utc.localize(utcnow)
local_t = gmt_t.astimezone(localzone)
local_short_str = local_t.strftime("%H:%M %Z")
local_hour = local_t.strftime("%H")
local_min = local_t.strftime("%M")
cur_time = fuzzy(int(local_hour), int(local_min), 1)
reply = "{} ({})".format(
cur_time,
str(h),
str(m).rjust(2, "0"),
message.rstrip(),
local_short_str,
)
return reply
@trace.trace
def process(self, packet: packets.Packet):
LOG.info("TIME COMMAND")
# So we can mock this in unit tests
localzone = self._get_local_tz()
return self.build_date_str(localzone)
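
For reference, build_date_str() leads with the fuzzy-clock phrase and appends the short local time; a sketch (the exact wording comes from aprsd.utils.fuzzy, the strings below are illustrative):

from aprsd.utils import fuzzy

cur_time = fuzzy(17, 10, 1)        # e.g. something like "ten past five"
local_short_str = "17:10 EDT"      # strftime("%H:%M %Z") of the local time
print(f"{cur_time} ({local_short_str})")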
class TimeOWMPlugin(TimePlugin, plugin.APRSFIKEYMixin):
"""OpenWeatherMap based timezone fetching."""
command_regex = r"^([t]|[t]\s|time)"
command_name = "time"
short_description = "Current time of GPS beacon's timezone. Uses OpenWeatherMap"
def setup(self):
self.ensure_aprs_fi_key()
@trace.trace
def process(self, packet: packets.MessagePacket):
fromcall = packet.from_call
message = packet.message_text
# ack = packet.get("msgNo", "0")
# optional second argument is a callsign to search
a = re.search(r"^.*\s+(.*)", message)
if a is not None:
searchcall = a.group(1)
searchcall = searchcall.upper()
else:
# if no second argument, search for calling station
searchcall = fromcall
api_key = CONF.aprs_fi.apiKey
try:
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi data {ex}")
return "Failed to fetch location"
LOG.debug(f"LocationPlugin: aprs_data = {aprs_data}")
if not len(aprs_data["entries"]):
LOG.error("Didn't get any entries from aprs.fi")
return "Failed to fetch aprs.fi location"
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
if not CONF.owm_weather_plugin.apiKey:
LOG.error("Failed to find config openweathermap:apiKey")
return "No openweathermap apiKey found"
api_key = CONF.owm_weather_plugin.apiKey
try:
results = plugin_utils.fetch_openweathermap(api_key, lat, lon)
except Exception as ex:
LOG.error(f"Couldn't fetch openweathermap api '{ex}'")
# default to UTC
localzone = pytz.timezone("UTC")
else:
tzone = results["timezone"]
localzone = pytz.timezone(tzone)
return self.build_date_str(localzone)

View File

@ -2,21 +2,30 @@ import logging
import aprsd
from aprsd import plugin
from aprsd.stats import collector
LOG = logging.getLogger("APRSD")
class VersionPlugin(plugin.APRSDPluginBase):
class VersionPlugin(plugin.APRSDRegexCommandPluginBase):
"""Version of APRSD Plugin."""
version = "1.0"
command_regex = "^[vV]"
command_regex = r"^([v]|[v]\s|version)"
command_name = "version"
short_description = "What is the APRSD Version"
# message_number:time combos so we don't resend the same email in
# five mins {int:int}
email_sent_dict = {}
def command(self, fromcall, message, ack):
def process(self, packet):
LOG.info("Version COMMAND")
return "APRSD version '{}'".format(aprsd.__version__)
# fromcall = packet.get("from")
# message = packet.get("message_text", None)
# ack = packet.get("msgNo", "0")
s = collector.Collector().collect()
return "APRSD ver:{} uptime:{}".format(
aprsd.__version__,
s["APRSDStats"]["uptime"],
)

View File

@ -1,54 +1,405 @@
import json
import logging
import re
from aprsd import plugin
from oslo_config import cfg
import requests
from aprsd import plugin, plugin_utils
from aprsd.utils import trace
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class WeatherPlugin(plugin.APRSDPluginBase):
"""Weather Command"""
class USWeatherPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
"""USWeather Command
version = "1.0"
command_regex = "^[wW]"
command_name = "weather"
Returns a weather report for the calling weather station
inside the United States only. This uses the
forecast.weather.gov API to fetch the weather.
def command(self, fromcall, message, ack):
This service does not require an apiKey.
How to Call: Send a message to aprsd
"weather" - returns weather near the calling callsign
"""
# command_regex = r"^([w][x]|[w][x]\s|weather)"
command_regex = r"^[wW]"
command_name = "USWeather"
short_description = "Provide USA only weather of GPS Beacon location"
def setup(self):
self.ensure_aprs_fi_key()
@trace.trace
def process(self, packet):
LOG.info("Weather Plugin")
api_key = self.config["aprs.fi"]["apiKey"]
fromcall = packet.from_call
message = packet.get("message_text", None)
# message = packet.get("message_text", None)
# ack = packet.get("msgNo", "0")
a = re.search(r"^.*\s+(.*)", message)
if a is not None:
searchcall = a.group(1)
searchcall = searchcall.upper()
else:
searchcall = fromcall
api_key = CONF.aprs_fi.apiKey
try:
url = (
"http://api.aprs.fi/api/get?"
"&what=loc&apikey={}&format=json"
"&name={}".format(api_key, fromcall)
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi data {ex}")
return "Failed to fetch aprs.fi location"
LOG.debug(f"LocationPlugin: aprs_data = {aprs_data}")
if not len(aprs_data["entries"]):
LOG.error("Didn't get any entries from aprs.fi")
return "Failed to fetch aprs.fi location"
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
try:
wx_data = plugin_utils.get_weather_gov_for_gps(lat, lon)
except Exception as ex:
LOG.error(f"Couldn't fetch forecast.weather.gov '{ex}'")
return "Unable to get weather"
LOG.info(f"WX data {wx_data}")
reply = (
"%sF(%sF/%sF) %s. %s, %s."
% (
wx_data["currentobservation"]["Temp"],
wx_data["data"]["temperature"][0],
wx_data["data"]["temperature"][1],
wx_data["data"]["weather"][0],
wx_data["time"]["startPeriodName"][1],
wx_data["data"]["weather"][1],
)
response = requests.get(url)
# aprs_data = json.loads(response.read())
aprs_data = json.loads(response.text)
).rstrip()
LOG.debug(f"reply: '{reply}' ")
return reply
class USMetarPlugin(plugin.APRSDRegexCommandPluginBase, plugin.APRSFIKEYMixin):
"""METAR Command
This provides a METAR weather report from a station near the caller
or callsign using the forecast.weather.gov api. This only works
for stations inside the United States.
This service does not require an apiKey.
How to Call: Send a message to aprsd
"metar" - returns metar report near the calling callsign
"metar CALLSIGN" - returns metar report near CALLSIGN
"""
command_regex = r"^([m]|[M]|[m]\s|metar)"
command_name = "USMetar"
short_description = "USA only METAR of GPS Beacon location"
def setup(self):
self.ensure_aprs_fi_key()
@trace.trace
def process(self, packet):
fromcall = packet.get("from")
message = packet.get("message_text", None)
# ack = packet.get("msgNo", "0")
LOG.info(f"WX Plugin '{message}'")
a = re.search(r"^.*\s+(.*)", message)
if a is not None:
searchcall = a.group(1)
station = searchcall.upper()
try:
resp = plugin_utils.get_weather_gov_metar(station)
except Exception as e:
LOG.debug(f"Weather failed with: {str(e)}")
reply = "Unable to find station METAR"
else:
station_data = json.loads(resp.text)
reply = station_data["properties"]["rawMessage"]
return reply
else:
# if no second argument, search for calling station
fromcall = fromcall
api_key = CONF.aprs_fi.apiKey
try:
aprs_data = plugin_utils.get_aprs_fi(api_key, fromcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi data {ex}")
return "Failed to fetch aprs.fi location"
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
if not len(aprs_data["entries"]):
LOG.error("Found no entries from aprs.fi!")
return "Failed to fetch aprs.fi location"
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
url2 = (
"https://forecast.weather.gov/MapClick.php?lat=%s"
"&lon=%s&FcstType=json" % (lat, lon)
)
response2 = requests.get(url2)
# wx_data = json.loads(response2.read())
wx_data = json.loads(response2.text)
reply = (
"%sF(%sF/%sF) %s. %s, %s."
% (
wx_data["currentobservation"]["Temp"],
wx_data["data"]["temperature"][0],
wx_data["data"]["temperature"][1],
wx_data["data"]["weather"][0],
wx_data["time"]["startPeriodName"][1],
wx_data["data"]["weather"][1],
)
).rstrip()
LOG.debug("reply: '{}' ".format(reply))
except Exception as e:
LOG.debug("Weather failed with: " + "%s" % str(e))
reply = "Unable to find you (send beacon?)"
try:
wx_data = plugin_utils.get_weather_gov_for_gps(lat, lon)
except Exception as ex:
LOG.error(f"Couldn't fetch forecast.weather.gov '{ex}'")
return "Unable to metar find station."
if wx_data["location"]["metar"]:
station = wx_data["location"]["metar"]
try:
resp = plugin_utils.get_weather_gov_metar(station)
except Exception as e:
LOG.debug(f"Weather failed with: {str(e)}")
reply = "Failed to get Metar"
else:
station_data = json.loads(resp.text)
reply = station_data["properties"]["rawMessage"]
else:
# Couldn't find a station
reply = "No Metar station found"
return reply
class OWMWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
"""OpenWeatherMap Weather Command
This provides weather near the caller or callsign.
How to Call: Send a message to aprsd
"weather" - returns the weather near the calling callsign
"weather CALLSIGN" - returns the weather near CALLSIGN
This plugin uses the openweathermap API to fetch
location and weather information.
To use this plugin you need to get an openweathermap
account and apikey.
https://home.openweathermap.org/api_keys
"""
# command_regex = r"^([w][x]|[w][x]\s|weather)"
command_regex = r"^[wW]"
command_name = "OpenWeatherMap"
short_description = "OpenWeatherMap weather of GPS Beacon location"
def setup(self):
if not CONF.owm_weather_plugin.apiKey:
LOG.error("Config.owm_weather_plugin.apiKey is not set. Disabling")
self.enabled = False
else:
self.enabled = True
def help(self):
_help = [
"openweathermap: Send {} to get weather "
"from your location".format(self.command_regex),
"openweathermap: Send {} <callsign> to get "
"weather from <callsign>".format(self.command_regex),
]
return _help
@trace.trace
def process(self, packet):
fromcall = packet.get("from_call")
message = packet.get("message_text", None)
# ack = packet.get("msgNo", "0")
LOG.info(f"OWMWeather Plugin '{message}'")
a = re.search(r"^.*\s+(.*)", message)
if a is not None:
searchcall = a.group(1)
searchcall = searchcall.upper()
else:
searchcall = fromcall
api_key = CONF.aprs_fi.apiKey
try:
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi data {ex}")
return "Failed to fetch location"
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
if not len(aprs_data["entries"]):
LOG.error("Found no entries from aprs.fi!")
return "Failed to fetch location"
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
units = CONF.units
api_key = CONF.owm_weather_plugin.apiKey
try:
wx_data = plugin_utils.fetch_openweathermap(
api_key,
lat,
lon,
units=units,
exclude="minutely,hourly",
)
except Exception as ex:
LOG.error(f"Couldn't fetch openweathermap api '{ex}'")
# default to UTC
return "Unable to get weather"
if units == "metric":
degree = "C"
else:
degree = "F"
if "wind_gust" in wx_data["current"]:
wind = "{:.0f}@{}G{:.0f}".format(
wx_data["current"]["wind_speed"],
wx_data["current"]["wind_deg"],
wx_data["current"]["wind_gust"],
)
else:
wind = "{:.0f}@{}".format(
wx_data["current"]["wind_speed"],
wx_data["current"]["wind_deg"],
)
# LOG.debug(wx_data["current"])
# LOG.debug(wx_data["daily"])
reply = "{} {:.1f}{}/{:.1f}{} Wind {} {}%".format(
wx_data["current"]["weather"][0]["description"],
wx_data["current"]["temp"],
degree,
wx_data["current"]["dew_point"],
degree,
wind,
wx_data["current"]["humidity"],
)
return reply
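
For reference, the wind token built above packs speed, bearing, and (when present) gusts into a few characters; with made-up values:

wind_speed, wind_deg, wind_gust = 5.7, 270, 9.1        # illustrative values
print("{:.0f}@{}G{:.0f}".format(wind_speed, wind_deg, wind_gust))  # -> 6@270G9
print("{:.0f}@{}".format(wind_speed, wind_deg))                    # -> 6@270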
class AVWXWeatherPlugin(plugin.APRSDRegexCommandPluginBase):
"""AVWXWeatherMap Weather Command
Fetches a METAR weather report for the nearest
weather station from the callsign
Can be called with:
metar - fetches metar for caller
metar <CALLSIGN> - fetches metar for <CALLSIGN>
This plugin requires the avwx-api service
to provide the metar for a station near
the callsign.
avwx-api is an opensource project that has
a hosted service here: https://avwx.rest/
You can launch your own avwx-api in a container
by cloning the github repo here: https://github.com/avwx-rest/AVWX-API
Then build the docker container with:
docker build -f Dockerfile -t avwx-api:master .
"""
command_regex = r"^([m]|[m]|[m]\s|metar)"
command_name = "AVWXWeather"
short_description = "AVWX weather of GPS Beacon location"
def setup(self):
if not CONF.avwx_plugin.base_url:
LOG.error("Config avwx_plugin.base_url not specified. Disabling")
return False
elif not CONF.avwx_plugin.apiKey:
LOG.error("Config avwx_plugin.apiKey not specified. Disabling")
return False
else:
return True
def help(self):
_help = [
"avwxweather: Send {} to get weather "
"from your location".format(self.command_regex),
"avwxweather: Send {} <callsign> to get "
"weather from <callsign>".format(self.command_regex),
]
return _help
@trace.trace
def process(self, packet):
fromcall = packet.get("from")
message = packet.get("message_text", None)
# ack = packet.get("msgNo", "0")
LOG.info(f"AVWXWeather Plugin '{message}'")
a = re.search(r"^.*\s+(.*)", message)
if a is not None:
searchcall = a.group(1)
searchcall = searchcall.upper()
else:
searchcall = fromcall
api_key = CONF.aprs_fi.apiKey
try:
aprs_data = plugin_utils.get_aprs_fi(api_key, searchcall)
except Exception as ex:
LOG.error(f"Failed to fetch aprs.fi data {ex}")
return "Failed to fetch location"
# LOG.debug("LocationPlugin: aprs_data = {}".format(aprs_data))
if not len(aprs_data["entries"]):
LOG.error("Found no entries from aprs.fi!")
return "Failed to fetch location"
lat = aprs_data["entries"][0]["lat"]
lon = aprs_data["entries"][0]["lng"]
api_key = CONF.avwx_plugin.apiKey
base_url = CONF.avwx_plugin.base_url
token = f"TOKEN {api_key}"
headers = {"Authorization": token}
try:
coord = f"{lat},{lon}"
url = (
"{}/api/station/near/{}?"
"n=1&airport=false&reporting=true&format=json".format(base_url, coord)
)
LOG.debug(f"Get stations near me '{url}'")
response = requests.get(url, headers=headers)
except Exception as ex:
LOG.error(ex)
raise Exception(f"Failed to get the weather '{ex}'")
else:
wx_data = json.loads(response.text)
# LOG.debug(wx_data)
station = wx_data[0]["station"]["icao"]
try:
url = (
"{}/api/metar/{}?options=info,translate,summary"
"&airport=true&reporting=true&format=json&onfail=cache".format(
base_url,
station,
)
)
LOG.debug(f"Get METAR '{url}'")
response = requests.get(url, headers=headers)
except Exception as ex:
LOG.error(ex)
raise Exception(f"Failed to get metar {ex}")
else:
metar_data = json.loads(response.text)
# LOG.debug(metar_data)
return metar_data["raw"]

20
aprsd/stats/__init__.py Normal file
View File

@ -0,0 +1,20 @@
from aprsd import plugin
from aprsd.client import stats as client_stats
from aprsd.packets import packet_list, seen_list, tracker, watch_list
from aprsd.plugins import email
from aprsd.stats import app, collector
from aprsd.threads import aprsd
# Create the collector and register all the objects
# that APRSD has that implement the stats protocol
stats_collector = collector.Collector()
stats_collector.register_producer(app.APRSDStats)
stats_collector.register_producer(packet_list.PacketList)
stats_collector.register_producer(watch_list.WatchList)
stats_collector.register_producer(tracker.PacketTrack)
stats_collector.register_producer(plugin.PluginManager)
stats_collector.register_producer(aprsd.APRSDThreadList)
stats_collector.register_producer(email.EmailStats)
stats_collector.register_producer(client_stats.APRSClientStats)
stats_collector.register_producer(seen_list.SeenList)

49
aprsd/stats/app.py Normal file
View File

@ -0,0 +1,49 @@
import datetime
import tracemalloc
from oslo_config import cfg
import aprsd
from aprsd import utils
from aprsd.log import log as aprsd_log
CONF = cfg.CONF
class APRSDStats:
"""The AppStats class is used to collect stats from the application."""
_instance = None
start_time = None
def __new__(cls, *args, **kwargs):
"""Have to override the new method to make this a singleton
instead of using the @singleton decorator, so the unit tests work.
"""
if not cls._instance:
cls._instance = super().__new__(cls)
cls._instance.start_time = datetime.datetime.now()
return cls._instance
def uptime(self):
return datetime.datetime.now() - self.start_time
def stats(self, serializable=False) -> dict:
current, peak = tracemalloc.get_traced_memory()
uptime = self.uptime()
qsize = aprsd_log.logging_queue.qsize()
if serializable:
uptime = str(uptime)
stats = {
"version": aprsd.__version__,
"uptime": uptime,
"callsign": CONF.callsign,
"memory_current": int(current),
"memory_current_str": utils.human_size(current),
"memory_peak": int(peak),
"memory_peak_str": utils.human_size(peak),
"loging_queue": qsize,
}
return stats

38
aprsd/stats/collector.py Normal file
View File

@ -0,0 +1,38 @@
import logging
from typing import Callable, Protocol, runtime_checkable
from aprsd.utils import singleton
LOG = logging.getLogger("APRSD")
@runtime_checkable
class StatsProducer(Protocol):
"""The StatsProducer protocol is used to define the interface for collecting stats."""
def stats(self, serializable=False) -> dict:
"""provide stats in a dictionary format."""
...
@singleton
class Collector:
"""The Collector class is used to collect stats from multiple StatsProducer instances."""
def __init__(self):
self.producers: list[Callable] = []
def collect(self, serializable=False) -> dict:
stats = {}
for name in self.producers:
cls = name()
if isinstance(cls, StatsProducer):
try:
stats[cls.__class__.__name__] = cls.stats(serializable=serializable).copy()
except Exception as e:
LOG.error(f"Error in producer {name} (stats): {e}")
else:
raise TypeError(f"{cls} is not an instance of StatsProducer")
return stats
def register_producer(self, producer_name: Callable):
self.producers.append(producer_name)
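
Because StatsProducer is a runtime-checkable Protocol, anything with a stats() method can be registered; a sketch with a hypothetical producer (DemoStats is not part of this change):

class DemoStats:
    def stats(self, serializable=False) -> dict:
        return {"demo_counter": 42}


Collector().register_producer(DemoStats)
# collect() instantiates each registered callable and merges the results:
# {'DemoStats': {'demo_counter': 42}, ...}
print(Collector().collect(serializable=True))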

View File

@ -1,234 +0,0 @@
import abc
import logging
import queue
import threading
import time
from aprsd import client, messaging, plugin
import aprslib
LOG = logging.getLogger("APRSD")
RX_THREAD = "RX"
TX_THREAD = "TX"
EMAIL_THREAD = "Email"
class APRSDThreadList:
"""Singleton class that keeps track of application wide threads."""
_instance = None
threads_list = []
lock = None
def __new__(cls, *args, **kwargs):
if cls._instance is None:
cls._instance = super().__new__(cls)
cls.lock = threading.Lock()
cls.threads_list = []
return cls._instance
def add(self, thread_obj):
with self.lock:
self.threads_list.append(thread_obj)
def remove(self, thread_obj):
with self.lock:
self.threads_list.remove(thread_obj)
def stop_all(self):
"""Iterate over all threads and call stop on them."""
with self.lock:
for th in self.threads_list:
th.stop()
class APRSDThread(threading.Thread, metaclass=abc.ABCMeta):
def __init__(self, name):
super().__init__(name=name)
self.thread_stop = False
APRSDThreadList().add(self)
def stop(self):
self.thread_stop = True
def run(self):
LOG.debug("Starting")
while not self.thread_stop:
can_loop = self.loop()
if not can_loop:
self.stop()
APRSDThreadList().remove(self)
LOG.debug("Exiting")
class APRSDRXThread(APRSDThread):
def __init__(self, msg_queues, config):
super().__init__("RX_MSG")
self.msg_queues = msg_queues
self.config = config
def stop(self):
self.thread_stop = True
client.get_client().stop()
def callback(self, packet):
try:
packet = aprslib.parse(packet)
print(packet)
except (aprslib.ParseError, aprslib.UnknownFormat):
pass
def loop(self):
aprs_client = client.get_client()
# setup the consumer of messages and block until a messages
try:
# This will register a packet consumer with aprslib
# When new packets come in the consumer will process
# the packet
# Do a partial here because the consumer signature doesn't allow
# For kwargs to be passed in to the consumer func we declare
# and the aprslib developer didn't want to allow a PR to add
# kwargs. :(
# https://github.com/rossengeorgiev/aprs-python/pull/56
aprs_client.consumer(self.process_packet, raw=False, blocking=False)
except aprslib.exceptions.ConnectionDrop:
LOG.error("Connection dropped, reconnecting")
time.sleep(5)
# Force the deletion of the client object connected to aprs
# This will cause a reconnect, next time client.get_client()
# is called
client.Client().reset()
# Continue to loop
return True
def process_ack_packet(self, packet):
ack_num = packet.get("msgNo")
LOG.info("Got ack for message {}".format(ack_num))
messaging.log_message(
"ACK",
packet["raw"],
None,
ack=ack_num,
fromcall=packet["from"],
)
tracker = messaging.MsgTrack()
tracker.remove(ack_num)
return
def process_mic_e_packet(self, packet):
LOG.info("Mic-E Packet detected. Currenlty unsupported.")
messaging.log_packet(packet)
return
def process_message_packet(self, packet):
fromcall = packet["from"]
message = packet.get("message_text", None)
msg_id = packet.get("msgNo", "0")
messaging.log_message(
"Received Message",
packet["raw"],
message,
fromcall=fromcall,
msg_num=msg_id,
)
found_command = False
# Get singleton of the PM
pm = plugin.PluginManager()
try:
results = pm.run(fromcall=fromcall, message=message, ack=msg_id)
for reply in results:
found_command = True
# A plugin can return a null message flag which signals
# us that they processed the message correctly, but have
# nothing to reply with, so we avoid replying with a usage string
if reply is not messaging.NULL_MESSAGE:
LOG.debug("Sending '{}'".format(reply))
msg = messaging.TextMessage(
self.config["aprs"]["login"],
fromcall,
reply,
)
self.msg_queues["tx"].put(msg)
else:
LOG.debug("Got NULL MESSAGE from plugin")
if not found_command:
plugins = pm.get_plugins()
names = [x.command_name for x in plugins]
names.sort()
# reply = "Usage: {}".format(", ".join(names))
reply = "Usage: weather, locate [call], time, fortune, ping"
msg = messaging.TextMessage(
self.config["aprs"]["login"],
fromcall,
reply,
)
self.msg_queues["tx"].put(msg)
except Exception as ex:
LOG.exception("Plugin failed!!!", ex)
reply = "A Plugin failed! try again?"
msg = messaging.TextMessage(self.config["aprs"]["login"], fromcall, reply)
self.msg_queues["tx"].put(msg)
# let any threads do their thing, then ack
# send an ack last
ack = messaging.AckMessage(
self.config["aprs"]["login"],
fromcall,
msg_id=msg_id,
)
self.msg_queues["tx"].put(ack)
LOG.debug("Packet processing complete")
def process_packet(self, packet):
"""Process a packet recieved from aprs-is server."""
try:
LOG.info("Got message: {}".format(packet))
msg = packet.get("message_text", None)
msg_format = packet.get("format", None)
msg_response = packet.get("response", None)
if msg_format == "message" and msg:
# we want to send the message through the
# plugins
self.process_message_packet(packet)
return
elif msg_response == "ack":
self.process_ack_packet(packet)
return
if msg_format == "mic-e":
# process a mic-e packet
self.process_mic_e_packet(packet)
return
except (aprslib.ParseError, aprslib.UnknownFormat) as exp:
LOG.exception("Failed to parse packet from aprs-is", exp)
class APRSDTXThread(APRSDThread):
def __init__(self, msg_queues, config):
super().__init__("TX_MSG")
self.msg_queues = msg_queues
self.config = config
def loop(self):
try:
msg = self.msg_queues["tx"].get(timeout=0.1)
msg.send()
except queue.Empty:
pass
# Continue to loop
return True

11
aprsd/threads/__init__.py Normal file
View File

@ -0,0 +1,11 @@
import queue
# Make these available to anyone importing
# aprsd.threads
from .aprsd import APRSDThread, APRSDThreadList # noqa: F401
from .rx import ( # noqa: F401
APRSDDupeRXThread, APRSDProcessPacketThread, APRSDRXThread,
)
packet_queue = queue.Queue(maxsize=20)

119
aprsd/threads/aprsd.py Normal file
View File

@ -0,0 +1,119 @@
import abc
import datetime
import logging
import threading
from typing import List
import wrapt
LOG = logging.getLogger("APRSD")
class APRSDThread(threading.Thread, metaclass=abc.ABCMeta):
"""Base class for all threads in APRSD."""
loop_count = 1
def __init__(self, name):
super().__init__(name=name)
self.thread_stop = False
APRSDThreadList().add(self)
self._last_loop = datetime.datetime.now()
def _should_quit(self):
""" see if we have a quit message from the global queue."""
if self.thread_stop:
return True
def stop(self):
self.thread_stop = True
@abc.abstractmethod
def loop(self):
pass
def _cleanup(self):
"""Add code to subclass to do any cleanup"""
def __str__(self):
out = f"Thread <{self.__class__.__name__}({self.name}) Alive? {self.is_alive()}>"
return out
def loop_age(self):
"""How old is the last loop call?"""
return datetime.datetime.now() - self._last_loop
def run(self):
LOG.debug("Starting")
while not self._should_quit():
self.loop_count += 1
can_loop = self.loop()
self._last_loop = datetime.datetime.now()
if not can_loop:
self.stop()
self._cleanup()
APRSDThreadList().remove(self)
LOG.debug("Exiting")
class APRSDThreadList:
"""Singleton class that keeps track of application wide threads."""
_instance = None
threads_list: List[APRSDThread] = []
lock = threading.Lock()
def __new__(cls, *args, **kwargs):
if cls._instance is None:
cls._instance = super().__new__(cls)
cls.threads_list = []
return cls._instance
def stats(self, serializable=False) -> dict:
stats = {}
for th in self.threads_list:
age = th.loop_age()
if serializable:
age = str(age)
stats[th.name] = {
"name": th.name,
"class": th.__class__.__name__,
"alive": th.is_alive(),
"age": th.loop_age(),
"loop_count": th.loop_count,
}
return stats
@wrapt.synchronized(lock)
def add(self, thread_obj):
self.threads_list.append(thread_obj)
@wrapt.synchronized(lock)
def remove(self, thread_obj):
self.threads_list.remove(thread_obj)
@wrapt.synchronized(lock)
def stop_all(self):
"""Iterate over all threads and call stop on them."""
for th in self.threads_list:
LOG.info(f"Stopping Thread {th.name}")
if hasattr(th, "packet"):
LOG.info(F"{th.name} packet {th.packet}")
th.stop()
@wrapt.synchronized(lock)
def info(self):
"""Go through all the threads and collect info about each."""
info = {}
for thread in self.threads_list:
alive = thread.is_alive()
age = thread.loop_age()
key = thread.__class__.__name__
info[key] = {"alive": True if alive else False, "age": age, "name": thread.name}
return info
@wrapt.synchronized(lock)
def __len__(self):
return len(self.threads_list)
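
The contract for subclasses is small: implement loop(), return True to keep running, and let the base class handle registration with APRSDThreadList. A minimal sketch (the thread name and sleep interval are arbitrary):

import time


class HeartbeatThread(APRSDThread):
    """Hypothetical thread that logs a heartbeat once a second."""

    def __init__(self):
        super().__init__("Heartbeat")

    def loop(self):
        LOG.debug("heartbeat %s", self.loop_count)
        time.sleep(1)
        return True    # returning False (or calling stop()) ends run()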

124
aprsd/threads/keep_alive.py Normal file

@ -0,0 +1,124 @@
import datetime
import logging
import time
import tracemalloc
from oslo_config import cfg
from aprsd import packets, utils
from aprsd.client import client_factory
from aprsd.log import log as aprsd_log
from aprsd.stats import collector
from aprsd.threads import APRSDThread, APRSDThreadList
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class KeepAliveThread(APRSDThread):
cntr = 0
checker_time = datetime.datetime.now()
def __init__(self):
tracemalloc.start()
super().__init__("KeepAlive")
max_timeout = {"hours": 0.0, "minutes": 2, "seconds": 0}
self.max_delta = datetime.timedelta(**max_timeout)
def loop(self):
if self.loop_count % 60 == 0:
stats_json = collector.Collector().collect()
pl = packets.PacketList()
thread_list = APRSDThreadList()
now = datetime.datetime.now()
if "EmailStats" in stats_json:
email_stats = stats_json["EmailStats"]
if email_stats.get("last_check_time"):
email_thread_time = utils.strfdelta(now - email_stats["last_check_time"])
else:
email_thread_time = "N/A"
else:
email_thread_time = "N/A"
if "APRSClientStats" in stats_json and stats_json["APRSClientStats"].get("transport") == "aprsis":
if stats_json["APRSClientStats"].get("server_keepalive"):
last_msg_time = utils.strfdelta(now - stats_json["APRSClientStats"]["server_keepalive"])
else:
last_msg_time = "N/A"
else:
last_msg_time = "N/A"
tracked_packets = stats_json["PacketTrack"]["total_tracked"]
tx_msg = 0
rx_msg = 0
if "PacketList" in stats_json:
msg_packets = stats_json["PacketList"].get("MessagePacket")
if msg_packets:
tx_msg = msg_packets.get("tx", 0)
rx_msg = msg_packets.get("rx", 0)
keepalive = (
"{} - Uptime {} RX:{} TX:{} Tracker:{} Msgs TX:{} RX:{} "
"Last:{} Email: {} - RAM Current:{} Peak:{} Threads:{} LoggingQueue:{}"
).format(
stats_json["APRSDStats"]["callsign"],
stats_json["APRSDStats"]["uptime"],
pl.total_rx(),
pl.total_tx(),
tracked_packets,
tx_msg,
rx_msg,
last_msg_time,
email_thread_time,
stats_json["APRSDStats"]["memory_current_str"],
stats_json["APRSDStats"]["memory_peak_str"],
len(thread_list),
aprsd_log.logging_queue.qsize(),
)
LOG.info(keepalive)
if "APRSDThreadList" in stats_json:
thread_list = stats_json["APRSDThreadList"]
for thread_name in thread_list:
thread = thread_list[thread_name]
alive = thread["alive"]
age = thread["age"]
key = thread["name"]
if not alive:
LOG.error(f"Thread {thread}")
LOG.info(f"{key: <15} Alive? {str(alive): <5} {str(age): <20}")
# check the APRS connection
cl = client_factory.create()
# Reset the connection if it's dead and this isn't our
# first time through the loop.
# The first time through the loop can happen at startup, where
# the keepalive thread starts before the client has a chance
# to make its connection the first time.
if not cl.is_alive() and self.cntr > 0:
LOG.error(f"{cl.__class__.__name__} is not alive!!! Resetting")
client_factory.create().reset()
# else:
# # See if we should reset the aprs-is client
# # Due to losing a keepalive from them
# delta_dict = utils.parse_delta_str(last_msg_time)
# delta = datetime.timedelta(**delta_dict)
#
# if delta > self.max_delta:
# # We haven't gotten a keepalive from aprs-is in a while
# # reset the connection.
# if not client.KISSClient.is_enabled():
# LOG.warning(f"Resetting connection to APRS-IS {delta}")
# client.factory.create().reset()
# Check version every day
delta = now - self.checker_time
if delta > datetime.timedelta(hours=24):
self.checker_time = now
level, msg = utils._check_version()
if level:
LOG.warning(msg)
self.cntr += 1
time.sleep(1)
return True
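For reference, the staleness check sketched in the commented-out block above boils down to round-tripping the keepalive age string through utils.parse_delta_str and comparing it to max_delta; an illustrative, standalone snippet (the example age string is made up):

import datetime

from aprsd import utils

max_delta = datetime.timedelta(minutes=2)        # mirrors max_timeout above
last_msg_time = "00:03:10"                       # example "HH:MM:SS" age string
delta = datetime.timedelta(**utils.parse_delta_str(last_msg_time))
if delta > max_delta:
    print(f"No server keepalive for {delta}; connection looks stale")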


@ -0,0 +1,121 @@
import datetime
import logging
import threading
from oslo_config import cfg
import requests
import wrapt
from aprsd import threads
from aprsd.log import log
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
def send_log_entries(force=False):
"""Send all of the log entries to the web interface."""
if CONF.admin.web_enabled:
if force or LogEntries().is_purge_ready():
entries = LogEntries().get_all_and_purge()
if entries:
try:
requests.post(
f"http://{CONF.admin.web_ip}:{CONF.admin.web_port}/log_entries",
json=entries,
auth=(CONF.admin.user, CONF.admin.password),
)
except Exception:
LOG.warning(f"Failed to send log entries. len={len(entries)}")
class LogEntries:
entries = []
lock = threading.Lock()
_instance = None
last_purge = datetime.datetime.now()
max_delta = datetime.timedelta(
hours=0.0, minutes=0, seconds=2,
)
def __new__(cls, *args, **kwargs):
if cls._instance is None:
cls._instance = super().__new__(cls)
return cls._instance
def stats(self) -> dict:
return {
"log_entries": self.entries,
}
@wrapt.synchronized(lock)
def add(self, entry):
self.entries.append(entry)
@wrapt.synchronized(lock)
def get_all_and_purge(self):
entries = self.entries.copy()
self.entries = []
self.last_purge = datetime.datetime.now()
return entries
def is_purge_ready(self):
now = datetime.datetime.now()
if (
now - self.last_purge > self.max_delta
and len(self.entries) > 1
):
return True
return False
@wrapt.synchronized(lock)
def __len__(self):
return len(self.entries)
class LogMonitorThread(threads.APRSDThread):
def __init__(self):
super().__init__("LogMonitorThread")
def stop(self):
send_log_entries(force=True)
super().stop()
def loop(self):
try:
record = log.logging_queue.get(block=True, timeout=2)
if isinstance(record, list):
for item in record:
entry = self.json_record(item)
LogEntries().add(entry)
else:
entry = self.json_record(record)
LogEntries().add(entry)
except Exception:
# Just ignore this
pass
send_log_entries()
return True
def json_record(self, record):
entry = {}
entry["filename"] = record.filename
entry["funcName"] = record.funcName
entry["levelname"] = record.levelname
entry["lineno"] = record.lineno
entry["module"] = record.module
entry["name"] = record.name
entry["pathname"] = record.pathname
entry["process"] = record.process
entry["processName"] = record.processName
if hasattr(record, "stack_info"):
entry["stack_info"] = record.stack_info
else:
entry["stack_info"] = None
entry["thread"] = record.thread
entry["threadName"] = record.threadName
entry["message"] = record.getMessage()
return entry
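Behaviour sketch for the LogEntries singleton defined above (illustrative only; it assumes the class is importable from wherever this module lives, since the file name is not shown in this diff):

entries_store = LogEntries()                 # same instance everywhere
entries_store.add({"levelname": "INFO", "message": "hello"})
entries_store.add({"levelname": "DEBUG", "message": "world"})
print(len(LogEntries()))                     # 2
batch = LogEntries().get_all_and_purge()     # copy entries out and reset
print(len(batch), len(LogEntries()))         # 2 0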

56
aprsd/threads/registry.py Normal file

@ -0,0 +1,56 @@
import logging
import time
from oslo_config import cfg
import requests
import aprsd
from aprsd import threads as aprsd_threads
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class APRSRegistryThread(aprsd_threads.APRSDThread):
"""This sends service information to the configured APRS Registry."""
_loop_cnt: int = 1
def __init__(self):
super().__init__("APRSRegistryThread")
self._loop_cnt = 1
if not CONF.aprs_registry.enabled:
LOG.error(
"APRS Registry is not enabled. ",
)
LOG.error(
"APRS Registry thread is STOPPING.",
)
self.stop()
LOG.info(
"APRS Registry thread is running and will send "
f"info every {CONF.aprs_registry.frequency_seconds} seconds "
f"to {CONF.aprs_registry.registry_url}.",
)
def loop(self):
# Only call the registry every N seconds
if self._loop_cnt % CONF.aprs_registry.frequency_seconds == 0:
info = {
"callsign": CONF.callsign,
"description": CONF.aprs_registry.description,
"service_website": CONF.aprs_registry.service_website,
"software": f"APRSD version {aprsd.__version__} "
"https://github.com/craigerl/aprsd",
}
try:
requests.post(
f"{CONF.aprs_registry.registry_url}",
json=info,
)
except Exception as e:
LOG.error(f"Failed to send registry info: {e}")
time.sleep(1)
self._loop_cnt += 1
return True

354
aprsd/threads/rx.py Normal file

@ -0,0 +1,354 @@
import abc
import logging
import queue
import time
import aprslib
from oslo_config import cfg
from aprsd import packets, plugin
from aprsd.client import client_factory
from aprsd.packets import collector
from aprsd.packets import log as packet_log
from aprsd.threads import APRSDThread, tx
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class APRSDRXThread(APRSDThread):
def __init__(self, packet_queue):
super().__init__("RX_PKT")
self.packet_queue = packet_queue
self._client = client_factory.create()
def stop(self):
self.thread_stop = True
if self._client:
self._client.stop()
def loop(self):
if not self._client:
self._client = client_factory.create()
time.sleep(1)
return True
# Set up the consumer of messages and handle them as they arrive
try:
# This will register a packet consumer with aprslib
# When new packets come in the consumer will process
# the packet
# Do a partial here because the consumer signature doesn't allow
# For kwargs to be passed in to the consumer func we declare
# and the aprslib developer didn't want to allow a PR to add
# kwargs. :(
# https://github.com/rossengeorgiev/aprs-python/pull/56
self._client.consumer(
self._process_packet, raw=False, blocking=False,
)
except (
aprslib.exceptions.ConnectionDrop,
aprslib.exceptions.ConnectionError,
):
LOG.error("Connection dropped, reconnecting")
# Force the deletion of the client object connected to aprs
# This will cause a reconnect, next time client.get_client()
# is called
self._client.reset()
time.sleep(5)
except Exception:
# LOG.exception(ex)
LOG.error("Resetting connection and trying again.")
self._client.reset()
time.sleep(5)
# Continue to loop
return True
def _process_packet(self, *args, **kwargs):
"""Intermediate callback so we can update the keepalive time."""
# Now call the 'real' packet processing for an RX'd packet
self.process_packet(*args, **kwargs)
@abc.abstractmethod
def process_packet(self, *args, **kwargs):
pass
class APRSDDupeRXThread(APRSDRXThread):
"""Process received packets.
This is the main APRSD Server command thread that
receives packets and makes sure the packet
hasn't been seen previously before sending it on
to be processed.
"""
def process_packet(self, *args, **kwargs):
"""This handles the processing of an inbound packet.
When a packet is received by the connected client object,
it sends the raw packet into this function. This function then
decodes the packet via the client, and then processes the packet.
Ack Packets are sent to the PluginProcessPacketThread for processing.
All other packets have to be checked as a dupe, and then only after
we haven't seen this packet before, do we send it to the
PluginProcessPacketThread for processing.
"""
packet = self._client.decode_packet(*args, **kwargs)
# LOG.debug(raw)
packet_log.log(packet)
pkt_list = packets.PacketList()
if isinstance(packet, packets.AckPacket):
# We don't need to drop AckPackets, those should be
# processed.
self.packet_queue.put(packet)
else:
# Make sure we aren't re-processing the same packet
# For RF based APRS Clients we can get duplicate packets
# So we need to track them and not process the dupes.
found = False
try:
# Find the packet in the list of already seen packets
# Based on the packet.key
found = pkt_list.find(packet)
except KeyError:
found = False
if not found:
# We haven't seen this packet before, so we process it.
collector.PacketCollector().rx(packet)
self.packet_queue.put(packet)
elif packet.timestamp - found.timestamp < CONF.packet_dupe_timeout:
# If the packet came in within N seconds of the
# Last time seeing the packet, then we drop it as a dupe.
LOG.warning(f"Packet {packet.from_call}:{packet.msgNo} already tracked, dropping.")
else:
LOG.warning(
f"Packet {packet.from_call}:{packet.msgNo} already tracked "
f"but older than {CONF.packet_dupe_timeout} seconds. processing.",
)
collector.PacketCollector().rx(packet)
self.packet_queue.put(packet)
class APRSDPluginRXThread(APRSDDupeRXThread):
""""Process received packets.
For backwards compatibility, we keep the APRSDPluginRXThread.
"""
class APRSDProcessPacketThread(APRSDThread):
"""Base class for processing received packets.
This is the base class for processing packets coming from
the consumer. This base class handles sending ack packets and
will ack a message before sending the packet to the subclass
for processing."""
def __init__(self, packet_queue):
self.packet_queue = packet_queue
super().__init__("ProcessPKT")
def process_ack_packet(self, packet):
"""We got an ack for a message, no need to resend it."""
ack_num = packet.msgNo
LOG.debug(f"Got ack for message {ack_num}")
collector.PacketCollector().rx(packet)
def process_piggyback_ack(self, packet):
"""We got an ack embedded in a packet."""
ack_num = packet.ackMsgNo
LOG.debug(f"Got PiggyBackAck for message {ack_num}")
collector.PacketCollector().rx(packet)
def process_reject_packet(self, packet):
"""We got a reject message for a packet. Stop sending the message."""
ack_num = packet.msgNo
LOG.debug(f"Got REJECT for message {ack_num}")
collector.PacketCollector().rx(packet)
def loop(self):
try:
packet = self.packet_queue.get(timeout=1)
if packet:
self.process_packet(packet)
except queue.Empty:
pass
return True
def process_packet(self, packet):
"""Process a packet received from aprs-is server."""
LOG.debug(f"ProcessPKT-LOOP {self.loop_count}")
our_call = CONF.callsign.lower()
from_call = packet.from_call
if packet.addresse:
to_call = packet.addresse
else:
to_call = packet.to_call
msg_id = packet.msgNo
# We don't put ack packets destined for us through the
# plugins.
if (
isinstance(packet, packets.AckPacket)
and packet.addresse.lower() == our_call
):
self.process_ack_packet(packet)
elif (
isinstance(packet, packets.RejectPacket)
and packet.addresse.lower() == our_call
):
self.process_reject_packet(packet)
else:
if hasattr(packet, "ackMsgNo") and packet.ackMsgNo:
# we got an ack embedded in this packet
# we need to handle the ack
self.process_piggyback_ack(packet)
# Only ack messages that were sent directly to us
if isinstance(packet, packets.MessagePacket):
if to_call and to_call.lower() == our_call:
# It's a MessagePacket and it's for us!
# let any threads do their thing, then ack
# send an ack last
tx.send(
packets.AckPacket(
from_call=CONF.callsign,
to_call=from_call,
msgNo=msg_id,
),
)
self.process_our_message_packet(packet)
else:
# Packet wasn't meant for us!
self.process_other_packet(packet, for_us=False)
else:
self.process_other_packet(
packet, for_us=(to_call.lower() == our_call),
)
LOG.debug(f"Packet processing complete for pkt '{packet.key}'")
return False
@abc.abstractmethod
def process_our_message_packet(self, packet):
"""Process a MessagePacket destined for us!"""
def process_other_packet(self, packet, for_us=False):
"""Process an APRS Packet that isn't a message or ack"""
if not for_us:
LOG.info("Got a packet not meant for us.")
else:
LOG.info("Got a non AckPacket/MessagePacket")
class APRSDPluginProcessPacketThread(APRSDProcessPacketThread):
"""Process the packet through the plugin manager.
This is the main aprsd server plugin processing thread."""
def process_other_packet(self, packet, for_us=False):
pm = plugin.PluginManager()
try:
results = pm.run_watchlist(packet)
for reply in results:
if isinstance(reply, list):
for subreply in reply:
LOG.debug(f"Sending '{subreply}'")
if isinstance(subreply, packets.Packet):
tx.send(subreply)
else:
wl = CONF.watch_list
to_call = wl["alert_callsign"]
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=to_call,
message_text=subreply,
),
)
elif isinstance(reply, packets.Packet):
# We have a message based object.
tx.send(reply)
except Exception as ex:
LOG.error("Plugin failed!!!")
LOG.exception(ex)
def process_our_message_packet(self, packet):
"""Send the packet through the plugins."""
from_call = packet.from_call
if packet.addresse:
to_call = packet.addresse
else:
to_call = None
pm = plugin.PluginManager()
try:
results = pm.run(packet)
replied = False
for reply in results:
if isinstance(reply, list):
# one of the plugins wants to send multiple messages
replied = True
for subreply in reply:
LOG.debug(f"Sending '{subreply}'")
if isinstance(subreply, packets.Packet):
tx.send(subreply)
else:
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=from_call,
message_text=subreply,
),
)
elif isinstance(reply, packets.Packet):
# We have a message based object.
tx.send(reply)
replied = True
else:
replied = True
# A plugin can return a null message flag which signals
# us that they processed the message correctly, but have
# nothing to reply with, so we avoid replying with a
# usage string
if reply is not packets.NULL_MESSAGE:
LOG.debug(f"Sending '{reply}'")
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=from_call,
message_text=reply,
),
)
# If the message was for us and we didn't have a
# response, then we send a usage statement.
if to_call == CONF.callsign and not replied:
LOG.warning("Sending help!")
message_text = "Unknown command! Send 'help' message for help"
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=from_call,
message_text=message_text,
),
)
except Exception as ex:
LOG.error("Plugin failed!!!")
LOG.exception(ex)
# Do we need to send a reply?
if to_call == CONF.callsign:
reply = "A Plugin failed! try again?"
tx.send(
packets.MessagePacket(
from_call=CONF.callsign,
to_call=from_call,
message_text=reply,
),
)
LOG.debug("Completed process_our_message_packet")

44
aprsd/threads/stats.py Normal file

@ -0,0 +1,44 @@
import logging
import threading
import time
from oslo_config import cfg
import wrapt
from aprsd.stats import collector
from aprsd.threads import APRSDThread
from aprsd.utils import objectstore
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class StatsStore(objectstore.ObjectStoreMixin):
"""Container to save the stats from the collector."""
lock = threading.Lock()
data = {}
@wrapt.synchronized(lock)
def add(self, stats: dict):
self.data = stats
class APRSDStatsStoreThread(APRSDThread):
"""Save APRSD Stats to disk periodically."""
# how often in seconds to write the file
save_interval = 10
def __init__(self):
super().__init__("StatsStore")
def loop(self):
if self.loop_count % self.save_interval == 0:
stats = collector.Collector().collect()
ss = StatsStore()
ss.add(stats)
ss.save()
time.sleep(1)
return True

255
aprsd/threads/tx.py Normal file

@ -0,0 +1,255 @@
import logging
import threading
import time
from oslo_config import cfg
from rush import quota, throttle
from rush.contrib import decorator
from rush.limiters import periodic
from rush.stores import dictionary
import wrapt
from aprsd import conf # noqa
from aprsd import threads as aprsd_threads
from aprsd.client import client_factory
from aprsd.packets import collector, core
from aprsd.packets import log as packet_log
from aprsd.packets import tracker
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
msg_t = throttle.Throttle(
limiter=periodic.PeriodicLimiter(
store=dictionary.DictionaryStore(),
),
rate=quota.Quota.per_second(
count=CONF.msg_rate_limit_period,
),
)
ack_t = throttle.Throttle(
limiter=periodic.PeriodicLimiter(
store=dictionary.DictionaryStore(),
),
rate=quota.Quota.per_second(
count=CONF.ack_rate_limit_period,
),
)
msg_throttle_decorator = decorator.ThrottleDecorator(throttle=msg_t)
ack_throttle_decorator = decorator.ThrottleDecorator(throttle=ack_t)
s_lock = threading.Lock()
@wrapt.synchronized(s_lock)
@msg_throttle_decorator.sleep_and_retry
def send(packet: core.Packet, direct=False, aprs_client=None):
"""Send a packet either in a thread or directly to the client."""
# prepare the packet for sending.
# This constructs the packet.raw
packet.prepare()
# Have to call the collector to track the packet
# After prepare, as prepare assigns the msgNo
collector.PacketCollector().tx(packet)
if isinstance(packet, core.AckPacket):
_send_ack(packet, direct=direct, aprs_client=aprs_client)
else:
_send_packet(packet, direct=direct, aprs_client=aprs_client)
@msg_throttle_decorator.sleep_and_retry
def _send_packet(packet: core.Packet, direct=False, aprs_client=None):
if not direct:
thread = SendPacketThread(packet=packet)
thread.start()
else:
_send_direct(packet, aprs_client=aprs_client)
@ack_throttle_decorator.sleep_and_retry
def _send_ack(packet: core.AckPacket, direct=False, aprs_client=None):
if not direct:
thread = SendAckThread(packet=packet)
thread.start()
else:
_send_direct(packet, aprs_client=aprs_client)
def _send_direct(packet, aprs_client=None):
if aprs_client:
cl = aprs_client
else:
cl = client_factory.create()
packet.update_timestamp()
packet_log.log(packet, tx=True)
try:
cl.send(packet)
except Exception as e:
LOG.error(f"Failed to send packet: {packet}")
LOG.error(e)
class SendPacketThread(aprsd_threads.APRSDThread):
loop_count: int = 1
def __init__(self, packet):
self.packet = packet
super().__init__(f"TX-{packet.to_call}-{self.packet.msgNo}")
def loop(self):
"""Loop until a message is acked or it gets delayed.
We only sleep for 1 second between each loop run, so
that CTRL-C can exit the app in a short period. Each sleep
means the app quitting is blocked until sleep is done.
So we keep track of the last send attempt and only send if the
last send attempt is old enough.
"""
pkt_tracker = tracker.PacketTrack()
# lets see if the message is still in the tracking queue
packet = pkt_tracker.get(self.packet.msgNo)
if not packet:
# The message has been removed from the tracking queue
# So it got acked and we are done.
LOG.info(
f"{self.packet.__class__.__name__}"
f"({self.packet.msgNo}) "
"Message Send Complete via Ack.",
)
return False
else:
send_now = False
if packet.send_count >= packet.retry_count:
# we reached the send limit, don't send again
# TODO(hemna) - Need to put this in a delayed queue?
LOG.info(
f"{packet.__class__.__name__} "
f"({packet.msgNo}) "
"Message Send Complete. Max attempts reached"
f" {packet.retry_count}",
)
pkt_tracker.remove(packet.msgNo)
return False
# Message is still outstanding and needs to be acked.
if packet.last_send_time:
# Message has a last send time tracking
now = int(round(time.time()))
sleeptime = (packet.send_count + 1) * 31
delta = now - packet.last_send_time
if delta > sleeptime:
# It's time to try to send it again
send_now = True
else:
send_now = True
if send_now:
# no attempt time, so lets send it, and start
# tracking the time.
packet.last_send_time = int(round(time.time()))
_send_direct(packet)
packet.send_count += 1
time.sleep(1)
# Make sure we get called again.
self.loop_count += 1
return True
class SendAckThread(aprsd_threads.APRSDThread):
loop_count: int = 1
max_retries = 3
def __init__(self, packet):
self.packet = packet
super().__init__(f"TXAck-{packet.to_call}-{self.packet.msgNo}")
self.max_retries = CONF.default_ack_send_count
def loop(self):
"""Separate thread to send acks with retries."""
send_now = False
if self.packet.send_count == self.max_retries:
# we reached the send limit, don't send again
# TODO(hemna) - Need to put this in a delayed queue?
LOG.debug(
f"{self.packet.__class__.__name__}"
f"({self.packet.msgNo}) "
"Send Complete. Max attempts reached"
f" {self.max_retries}",
)
return False
if self.packet.last_send_time:
# Message has a last send time tracking
now = int(round(time.time()))
# aprs duplicate detection is 30 secs?
# (21 only sends first, 28 skips middle)
sleep_time = 31
delta = now - self.packet.last_send_time
if delta > sleep_time:
# It's time to try to send it again
send_now = True
elif self.loop_count % 10 == 0:
LOG.debug(f"Still wating. {delta}")
else:
send_now = True
if send_now:
_send_direct(self.packet)
self.packet.send_count += 1
self.packet.last_send_time = int(round(time.time()))
time.sleep(1)
self.loop_count += 1
return True
class BeaconSendThread(aprsd_threads.APRSDThread):
"""Thread that sends a GPS beacon packet periodically.
Settings are in the [DEFAULT] section of the config file.
"""
_loop_cnt: int = 1
def __init__(self):
super().__init__("BeaconSendThread")
self._loop_cnt = 1
# Make sure Latitude and Longitude are set.
if not CONF.latitude or not CONF.longitude:
LOG.error(
"Latitude and Longitude are not set in the config file."
"Beacon will not be sent and thread is STOPPED.",
)
self.stop()
LOG.info(
"Beacon thread is running and will send "
f"beacons every {CONF.beacon_interval} seconds.",
)
def loop(self):
# Only dump out the stats every N seconds
if self._loop_cnt % CONF.beacon_interval == 0:
pkt = core.BeaconPacket(
from_call=CONF.callsign,
to_call="APRS",
latitude=float(CONF.latitude),
longitude=float(CONF.longitude),
comment="APRSD GPS Beacon",
symbol=CONF.beacon_symbol,
)
try:
# Only send it once
pkt.retry_count = 1
send(pkt, direct=True)
except Exception as e:
LOG.error(f"Failed to send beacon: {e}")
client_factory.create().reset()
time.sleep(5)
self._loop_cnt += 1
time.sleep(1)
return True
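To tie the tx helpers above together, a hedged usage sketch (placeholder callsigns, not part of the commit):

from aprsd import packets
from aprsd.threads import tx

pkt = packets.MessagePacket(
    from_call="N0CALL",          # placeholder callsigns
    to_call="N0CALL-1",
    message_text="hello from aprsd",
)
# Queues the packet on a SendPacketThread, which keeps resending on a
# backoff of (send_count + 1) * 31 seconds until an ack arrives or
# packet.retry_count attempts have been made.
tx.send(pkt)
# tx.send(pkt, direct=True) would skip the retry thread and hand the
# packet straight to the configured APRS client.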


@ -1,224 +0,0 @@
"""Utilities and helper functions."""
import errno
import functools
import os
from pathlib import Path
import sys
import threading
from aprsd import plugin
import click
import yaml
# an example of what should be in the ~/.aprsd/config.yml
DEFAULT_CONFIG_DICT = {
"ham": {"callsign": "CALLSIGN"},
"aprs": {
"login": "CALLSIGN",
"password": "00000",
"host": "rotate.aprs.net",
"port": 14580,
"logfile": "/tmp/aprsd.log",
},
"aprs.fi": {"apiKey": "set me"},
"shortcuts": {
"aa": "5551239999@vtext.com",
"cl": "craiglamparter@somedomain.org",
"wb": "555309@vtext.com",
},
"smtp": {
"login": "SMTP_USERNAME",
"password": "SMTP_PASSWORD",
"host": "smtp.gmail.com",
"port": 465,
"use_ssl": False,
},
"imap": {
"login": "IMAP_USERNAME",
"password": "IMAP_PASSWORD",
"host": "imap.gmail.com",
"port": 993,
"use_ssl": True,
},
"aprsd": {
"plugin_dir": "~/.config/aprsd/plugins",
"enabled_plugins": plugin.CORE_PLUGINS,
},
}
home = str(Path.home())
DEFAULT_CONFIG_DIR = "{}/.config/aprsd/".format(home)
DEFAULT_SAVE_FILE = "{}/.config/aprsd/aprsd.p".format(home)
DEFAULT_CONFIG_FILE = "{}/.config/aprsd/aprsd.yml".format(home)
def synchronized(wrapped):
lock = threading.Lock()
@functools.wraps(wrapped)
def _wrap(*args, **kwargs):
with lock:
return wrapped(*args, **kwargs)
return _wrap
def env(*vars, **kwargs):
"""This returns the first environment variable set.
if none are non-empty, defaults to '' or keyword arg default
"""
for v in vars:
value = os.environ.get(v, None)
if value:
return value
return kwargs.get("default", "")
def mkdir_p(path):
"""Make directory and have it work in py2 and py3."""
try:
os.makedirs(path)
except OSError as exc: # Python >= 2.5
if exc.errno == errno.EEXIST and os.path.isdir(path):
pass
else:
raise
def insert_str(string, str_to_insert, index):
return string[:index] + str_to_insert + string[index:]
def end_substr(original, substr):
"""Get the index of the end of the <substr>.
So you can insert a string after <substr>
"""
idx = original.find(substr)
if idx != -1:
idx += len(substr)
return idx
def add_config_comments(raw_yaml):
end_idx = end_substr(raw_yaml, "aprs.fi:")
if end_idx != -1:
# lets insert a comment
raw_yaml = insert_str(
raw_yaml,
"\n # Get the apiKey from your aprs.fi account here: http://aprs.fi/account",
end_idx,
)
return raw_yaml
def create_default_config():
"""Create a default config file."""
# make sure the directory location exists
config_file_expanded = os.path.expanduser(DEFAULT_CONFIG_FILE)
config_dir = os.path.dirname(config_file_expanded)
if not os.path.exists(config_dir):
click.echo("Config dir '{}' doesn't exist, creating.".format(config_dir))
mkdir_p(config_dir)
with open(config_file_expanded, "w+") as cf:
raw_yaml = yaml.dump(DEFAULT_CONFIG_DICT)
cf.write(add_config_comments(raw_yaml))
def get_config(config_file):
"""This tries to read the yaml config from <config_file>."""
config_file_expanded = os.path.expanduser(config_file)
if os.path.exists(config_file_expanded):
with open(config_file_expanded) as stream:
config = yaml.load(stream, Loader=yaml.FullLoader)
return config
else:
if config_file == DEFAULT_CONFIG_FILE:
click.echo(
"{} is missing, creating config file".format(config_file_expanded),
)
create_default_config()
msg = (
"Default config file created at {}. Please edit with your "
"settings.".format(config_file)
)
click.echo(msg)
else:
# The user provided a config file path different from the
# Default, so we won't try and create it, just bitch and bail.
msg = "Custom config file '{}' is missing.".format(config_file)
click.echo(msg)
sys.exit(-1)
# This method tries to parse the config yaml file
# and consume the settings.
# If the required params don't exist,
# it will look in the environment
def parse_config(config_file):
# for now we still use globals....ugh
global CONFIG
def fail(msg):
click.echo(msg)
sys.exit(-1)
def check_option(config, section, name=None, default=None, default_fail=None):
if section in config:
if name and name not in config[section]:
if not default:
fail(
"'{}' was not in '{}' section of config file".format(
name,
section,
),
)
else:
config[section][name] = default
else:
if (
default_fail
and name in config[section]
and config[section][name] == default_fail
):
# We have to fail and bail if the user hasn't edited
# this config option.
fail("Config file needs to be edited from provided defaults.")
else:
fail("'%s' section wasn't in config file" % section)
return config
config = get_config(config_file)
check_option(config, "shortcuts")
# special check here to make sure user has edited the config file
# and changed the ham callsign
check_option(
config,
"ham",
"callsign",
default_fail=DEFAULT_CONFIG_DICT["ham"]["callsign"],
)
check_option(
config,
"aprs.fi",
"apiKey",
default_fail=DEFAULT_CONFIG_DICT["aprs.fi"]["apiKey"],
)
check_option(config, "aprs", "login")
check_option(config, "aprs", "password")
# check_option(config, "aprs", "host")
# check_option(config, "aprs", "port")
check_option(config, "aprs", "logfile", "./aprsd.log")
check_option(config, "imap", "host")
check_option(config, "imap", "login")
check_option(config, "imap", "password")
check_option(config, "smtp", "host")
check_option(config, "smtp", "port")
check_option(config, "smtp", "login")
check_option(config, "smtp", "password")
return config

163
aprsd/utils/__init__.py Normal file

@ -0,0 +1,163 @@
"""Utilities and helper functions."""
import errno
import functools
import os
import re
import sys
import traceback
import update_checker
import aprsd
from .fuzzyclock import fuzzy # noqa: F401
# Make these available by anyone importing
# aprsd.utils
from .ring_buffer import RingBuffer # noqa: F401
from collections.abc import MutableMapping
def singleton(cls):
"""Make a class a Singleton class (only one instance)"""
@functools.wraps(cls)
def wrapper_singleton(*args, **kwargs):
if wrapper_singleton.instance is None:
wrapper_singleton.instance = cls(*args, **kwargs)
return wrapper_singleton.instance
wrapper_singleton.instance = None
return wrapper_singleton
def env(*vars, **kwargs):
"""This returns the first environment variable set.
if none are non-empty, defaults to '' or keyword arg default
"""
for v in vars:
value = os.environ.get(v, None)
if value:
return value
return kwargs.get("default", "")
def mkdir_p(path):
"""Make directory and have it work in py2 and py3."""
try:
os.makedirs(path)
except OSError as exc: # Python >= 2.5
if exc.errno == errno.EEXIST and os.path.isdir(path):
pass
else:
raise
def insert_str(string, str_to_insert, index):
return string[:index] + str_to_insert + string[index:]
def end_substr(original, substr):
"""Get the index of the end of the <substr>.
So you can insert a string after <substr>
"""
idx = original.find(substr)
if idx != -1:
idx += len(substr)
return idx
def rgb_from_name(name):
"""Create an rgb tuple from a string."""
hash = 0
for char in name:
hash = ord(char) + ((hash << 5) - hash)
red = hash & 255
green = (hash >> 8) & 255
blue = (hash >> 16) & 255
return red, green, blue
def human_size(bytes, units=None):
"""Returns a human readable string representation of bytes"""
if not units:
units = [" bytes", "KB", "MB", "GB", "TB", "PB", "EB"]
return str(bytes) + units[0] if bytes < 1024 else human_size(bytes >> 10, units[1:])
def strfdelta(tdelta, fmt="{hours:{width}}:{minutes:{width}}:{seconds:{width}}"):
d = {
"days": tdelta.days,
"width": "02",
}
if tdelta.days > 0:
fmt = "{days} days " + fmt
d["hours"], rem = divmod(tdelta.seconds, 3600)
d["minutes"], d["seconds"] = divmod(rem, 60)
return fmt.format(**d)
def _check_version():
# check for a newer version
try:
check = update_checker.UpdateChecker()
result = check.check("aprsd", aprsd.__version__)
if result:
# Looks like there is an updated version.
return 1, result
else:
return 0, "APRSD is up to date"
except Exception:
# probably can't get in touch with pypi for some reason
# Lets put up an error and move on. We might not
# have internet in this aprsd deployment.
return 1, "Couldn't check for new version of APRSD"
def flatten_dict(d, parent_key="", sep="."):
"""Flatten a dict to key.key.key = value."""
items = []
for k, v in d.items():
new_key = parent_key + sep + k if parent_key else k
if isinstance(v, MutableMapping):
items.extend(flatten_dict(v, new_key, sep=sep).items())
else:
items.append((new_key, v))
return dict(items)
def parse_delta_str(s):
if "day" in s:
m = re.match(
r"(?P<days>[-\d]+) day[s]*, (?P<hours>\d+):(?P<minutes>\d+):(?P<seconds>\d[\.\d+]*)",
s,
)
else:
m = re.match(r"(?P<hours>\d+):(?P<minutes>\d+):(?P<seconds>\d[\.\d+]*)", s)
if m:
return {key: float(val) for key, val in m.groupdict().items()}
else:
return {}
def load_entry_points(group):
"""Load all extensions registered to the given entry point group"""
try:
import importlib_metadata
except ImportError:
# For python 3.10 and later
import importlib.metadata as importlib_metadata
eps = importlib_metadata.entry_points(group=group)
for ep in eps:
try:
ep.load()
except Exception as e:
print(f"Extension {ep.name} of group {group} failed to load with {e}", file=sys.stderr)
print(traceback.format_exc(), file=sys.stderr)
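A few of the helpers above in action (values chosen purely for illustration):

import datetime

from aprsd import utils

delta = datetime.timedelta(hours=1, minutes=2, seconds=3)
print(utils.strfdelta(delta))                              # "01:02:03"
print(utils.parse_delta_str("01:02:03"))                   # {'hours': 1.0, 'minutes': 2.0, 'seconds': 3.0}
print(utils.flatten_dict({"aprs": {"login": "N0CALL"}}))   # {'aprs.login': 'N0CALL'}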

51
aprsd/utils/counter.py Normal file

@ -0,0 +1,51 @@
from multiprocessing import RawValue
import random
import threading
import wrapt
MAX_PACKET_ID = 9999
class PacketCounter:
"""
Global Packet id counter class.
This is a singleton based class that keeps
an incrementing counter for all packets to
be sent. All new Packet objects gets a new
message id, which is the next number available
from the PacketCounter.
"""
_instance = None
lock = threading.Lock()
def __new__(cls, *args, **kwargs):
"""Make this a singleton class."""
if cls._instance is None:
cls._instance = super().__new__(cls, *args, **kwargs)
cls._instance.val = RawValue("i", random.randint(1, MAX_PACKET_ID))
return cls._instance
@wrapt.synchronized(lock)
def increment(self):
if self.val.value == MAX_PACKET_ID:
self.val.value = 1
else:
self.val.value += 1
@property
@wrapt.synchronized(lock)
def value(self):
return str(self.val.value)
@wrapt.synchronized(lock)
def __repr__(self):
return str(self.val.value)
@wrapt.synchronized(lock)
def __str__(self):
return str(self.val.value)
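Quick sketch of the PacketCounter singleton above; note that value is returned as a string and the counter wraps at MAX_PACKET_ID:

from aprsd.utils.counter import PacketCounter

c1 = PacketCounter()
c2 = PacketCounter()
print(c1 is c2)        # True -- singleton, both names point at the same counter
print(c1.value)        # e.g. "4821" (random starting point)
c1.increment()
print(c2.value)        # same counter, one higher (wraps back to "1" after 9999)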

80
aprsd/utils/json.py Normal file

@ -0,0 +1,80 @@
import datetime
import decimal
import json
import sys
from aprsd.packets import core
class EnhancedJSONEncoder(json.JSONEncoder):
def default(self, obj):
if isinstance(obj, datetime.datetime):
args = (
"year", "month", "day", "hour", "minute",
"second", "microsecond",
)
return {
"__type__": "datetime.datetime",
"args": [getattr(obj, a) for a in args],
}
elif isinstance(obj, datetime.date):
args = ("year", "month", "day")
return {
"__type__": "datetime.date",
"args": [getattr(obj, a) for a in args],
}
elif isinstance(obj, datetime.time):
args = ("hour", "minute", "second", "microsecond")
return {
"__type__": "datetime.time",
"args": [getattr(obj, a) for a in args],
}
elif isinstance(obj, datetime.timedelta):
args = ("days", "seconds", "microseconds")
return {
"__type__": "datetime.timedelta",
"args": [getattr(obj, a) for a in args],
}
elif isinstance(obj, decimal.Decimal):
return {
"__type__": "decimal.Decimal",
"args": [str(obj)],
}
else:
return super().default(obj)
class SimpleJSONEncoder(json.JSONEncoder):
def default(self, obj):
if isinstance(obj, datetime.datetime):
return obj.isoformat()
elif isinstance(obj, datetime.date):
return str(obj)
elif isinstance(obj, datetime.time):
return str(obj)
elif isinstance(obj, datetime.timedelta):
return str(obj)
elif isinstance(obj, decimal.Decimal):
return str(obj)
elif isinstance(obj, core.Packet):
return obj.to_dict()
else:
return super().default(obj)
class EnhancedJSONDecoder(json.JSONDecoder):
def __init__(self, *args, **kwargs):
super().__init__(
*args, object_hook=self.object_hook,
**kwargs,
)
def object_hook(self, d):
if "__type__" not in d:
return d
o = sys.modules[__name__]
for e in d["__type__"].split("."):
o = getattr(o, e)
args, kwargs = d.get("args", ()), d.get("kwargs", {})
return o(*args, **kwargs)
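Round-trip sketch for the encoder/decoder pair above:

import datetime
import json

from aprsd.utils.json import EnhancedJSONDecoder, EnhancedJSONEncoder

now = datetime.datetime.now()
blob = json.dumps({"when": now}, cls=EnhancedJSONEncoder)
restored = json.loads(blob, cls=EnhancedJSONDecoder)
print(restored["when"] == now)   # True: the datetime survives the round trip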

123
aprsd/utils/objectstore.py Normal file

@ -0,0 +1,123 @@
import logging
import os
import pathlib
import pickle
import threading
from oslo_config import cfg
CONF = cfg.CONF
LOG = logging.getLogger("APRSD")
class ObjectStoreMixin:
"""Class 'MIXIN' intended to save/load object data.
The assumptions of how this mixin is used:
The using class has to have:
* data in self.data as a dictionary
* a self.lock thread lock
The save file location is derived from CONF.save_location and the
lowercased class name (see _save_filename()).
When APRSD quits, it calls save()
When APRSD starts, it calls load()
aprsd server -f (flush) will wipe all saved objects.
"""
def __init__(self):
self.lock = threading.RLock()
def __len__(self):
with self.lock:
return len(self.data)
def __iter__(self):
with self.lock:
return iter(self.data)
def get_all(self):
with self.lock:
return self.data
def get(self, key):
with self.lock:
return self.data.get(key)
def copy(self):
with self.lock:
return self.data.copy()
def _init_store(self):
if not CONF.enable_save:
return
sl = CONF.save_location
if not os.path.exists(sl):
LOG.warning(f"Save location {sl} doesn't exist")
try:
os.makedirs(sl)
except Exception as ex:
LOG.exception(ex)
def _save_filename(self):
save_location = CONF.save_location
return "{}/{}.p".format(
save_location,
self.__class__.__name__.lower(),
)
def save(self):
"""Save any queued to disk?"""
if not CONF.enable_save:
return
self._init_store()
save_filename = self._save_filename()
if len(self) > 0:
LOG.info(
f"{self.__class__.__name__}::Saving"
f" {len(self)} entries to disk at "
f"{save_filename}",
)
with self.lock:
with open(save_filename, "wb+") as fp:
pickle.dump(self.data, fp)
else:
LOG.debug(
"{} Nothing to save, flushing old save file '{}'".format(
self.__class__.__name__,
save_filename,
),
)
self.flush()
def load(self):
if not CONF.enable_save:
return
if os.path.exists(self._save_filename()):
try:
with open(self._save_filename(), "rb") as fp:
raw = pickle.load(fp)
if raw:
self.data = raw
LOG.debug(
f"{self.__class__.__name__}::Loaded {len(self)} entries from disk.",
)
else:
LOG.debug(f"{self.__class__.__name__}::No data to load.")
except Exception as ex:
LOG.error(f"Failed to UnPickle {self._save_filename()}")
LOG.error(ex)
self.data = {}
else:
LOG.debug(f"{self.__class__.__name__}::No save file found.")
def flush(self):
"""Nuke the old pickle file that stored the old results from last aprsd run."""
if not CONF.enable_save:
return
if os.path.exists(self._save_filename()):
pathlib.Path(self._save_filename()).unlink()
with self.lock:
self.data = {}
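A minimal sketch of a class using ObjectStoreMixin, mirroring how StatsStore does above; save()/load() are no-ops unless CONF.enable_save is True and CONF.save_location is usable:

import threading

from aprsd.utils import objectstore


class SeenCallsStore(objectstore.ObjectStoreMixin):
    """Hypothetical store; pickles to <save_location>/seencallsstore.p"""

    lock = threading.Lock()
    data = {}


store = SeenCallsStore()
store.data["N0CALL"] = 1
store.save()     # writes the pickle when saving is enabled
store.load()     # restores self.data on the next startup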


@ -0,0 +1,40 @@
class RingBuffer:
"""class that implements a not-yet-full buffer"""
max: int = 100
data: list = []
def __init__(self, size_max):
self.max = size_max
self.data = []
class __Full:
"""class that implements a full buffer"""
def append(self, x):
"""Append an element overwriting the oldest one."""
self.data[self.cur] = x
self.cur = (self.cur + 1) % self.max
def get(self):
"""return list of elements in correct order"""
return self.data[self.cur :] + self.data[: self.cur]
def __len__(self):
return len(self.data)
def append(self, x):
"""append an element at the end of the buffer"""
self.data.append(x)
if len(self.data) == self.max:
self.cur = 0
# Permanently change self's class from non-full to full
self.__class__ = self.__Full
def get(self):
"""Return a list of elements from the oldest to the newest."""
return self.data
def __len__(self):
return len(self.data)
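Behaviour sketch for the RingBuffer above (it is re-exported from aprsd.utils):

from aprsd.utils import RingBuffer

rb = RingBuffer(3)
for i in range(5):
    rb.append(i)
print(rb.get())    # [2, 3, 4] -- once full, the oldest entries are overwritten
print(len(rb))     # 3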

180
aprsd/utils/trace.py Normal file

@ -0,0 +1,180 @@
import abc
import functools
import inspect
import logging
import time
import types
VALID_TRACE_FLAGS = {"method", "api"}
TRACE_API = False
TRACE_METHOD = False
TRACE_ENABLED = False
LOG = logging.getLogger("APRSD")
def trace(*dec_args, **dec_kwargs):
"""Trace calls to the decorated function.
This decorator should always be defined as the outermost decorator so it
is defined last. This is important so it does not interfere
with other decorators.
Using this decorator on a function will cause its execution to be logged at
`DEBUG` level with arguments, return values, and exceptions.
:returns: a function decorator
"""
def _decorator(f):
func_name = f.__name__
@functools.wraps(f)
def trace_logging_wrapper(*args, **kwargs):
filter_function = dec_kwargs.get("filter_function")
logger = LOG
# NOTE(ameade): Don't bother going any further if DEBUG log level
# is not enabled for the logger.
if not logger.isEnabledFor(logging.DEBUG) or not TRACE_ENABLED:
return f(*args, **kwargs)
all_args = inspect.getcallargs(f, *args, **kwargs)
pass_filter = filter_function is None or filter_function(all_args)
if pass_filter:
logger.debug(
"==> %(func)s: call %(all_args)r",
{
"func": func_name,
"all_args": str(all_args),
},
)
start_time = time.time() * 1000
try:
result = f(*args, **kwargs)
except Exception as exc:
total_time = int(round(time.time() * 1000)) - start_time
logger.debug(
"<== %(func)s: exception (%(time)dms) %(exc)r",
{
"func": func_name,
"time": total_time,
"exc": exc,
},
)
raise
total_time = int(round(time.time() * 1000)) - start_time
mask_result = result
if pass_filter:
logger.debug(
"<== %(func)s: return (%(time)dms) %(result)r",
{
"func": func_name,
"time": total_time,
"result": mask_result,
},
)
return result
return trace_logging_wrapper
if len(dec_args) == 0:
# filter_function is passed and args does not contain f
return _decorator
else:
# filter_function is not passed
return _decorator(dec_args[0])
def trace_api(*dec_args, **dec_kwargs):
"""Decorates a function if TRACE_API is true."""
def _decorator(f):
@functools.wraps(f)
def trace_api_logging_wrapper(*args, **kwargs):
if TRACE_API:
return trace(f, *dec_args, **dec_kwargs)(*args, **kwargs)
return f(*args, **kwargs)
return trace_api_logging_wrapper
if len(dec_args) == 0:
# filter_function is passed and args does not contain f
return _decorator
else:
# filter_function is not passed
return _decorator(dec_args[0])
def trace_method(f):
"""Decorates a function if TRACE_METHOD is true."""
@functools.wraps(f)
def trace_method_logging_wrapper(*args, **kwargs):
if TRACE_METHOD:
return trace(f)(*args, **kwargs)
return f(*args, **kwargs)
return trace_method_logging_wrapper
class TraceWrapperMetaclass(type):
"""Metaclass that wraps all methods of a class with trace_method.
This metaclass will cause every function inside of the class to be
decorated with the trace_method decorator.
To use the metaclass you define a class like so:
class MyClass(object, metaclass=utils.TraceWrapperMetaclass):
"""
def __new__(cls, classname, bases, class_dict):
new_class_dict = {}
for attribute_name, attribute in class_dict.items():
if isinstance(attribute, types.FunctionType):
# replace it with a wrapped version
attribute = functools.update_wrapper(
trace_method(attribute),
attribute,
)
new_class_dict[attribute_name] = attribute
return type.__new__(cls, classname, bases, new_class_dict)
class TraceWrapperWithABCMetaclass(abc.ABCMeta, TraceWrapperMetaclass):
"""Metaclass that wraps all methods of a class with trace."""
def setup_tracing(trace_flags):
"""Set global variables for each trace flag.
Sets variables TRACE_METHOD and TRACE_API, which represent
whether to log methods or api traces.
:param trace_flags: a list of strings
"""
global TRACE_METHOD
global TRACE_API
global TRACE_ENABLED
try:
trace_flags = [flag.strip() for flag in trace_flags]
except TypeError: # Handle when trace_flags is None or a test mock
trace_flags = []
for invalid_flag in set(trace_flags) - VALID_TRACE_FLAGS:
LOG.warning("Invalid trace flag: %s", invalid_flag)
TRACE_METHOD = "method" in trace_flags
TRACE_API = "api" in trace_flags
TRACE_ENABLED = True
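Usage sketch for the tracing helpers above; DEBUG logging on the "APRSD" logger has to be enabled for anything to be emitted:

import logging

from aprsd.utils import trace

logging.basicConfig(level=logging.DEBUG)
trace.setup_tracing(["method", "api"])   # sets TRACE_METHOD / TRACE_API / TRACE_ENABLED


@trace.trace
def add(a, b):
    return a + b


add(1, 2)   # logs "==> add: call ..." and "<== add: return (...ms) 3" at DEBUG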

0
aprsd/web/__init__.py Normal file

@ -0,0 +1,84 @@
body {
background: #eeeeee;
margin: 2em;
text-align: center;
font-family: system-ui, sans-serif;
}
footer {
padding: 2em;
text-align: center;
height: 10vh;
}
.ui.segment {
background: #eeeeee;
}
#graphs {
display: grid;
width: 100%;
height: 300px;
grid-template-columns: 1fr 1fr;
}
#graphs_center {
display: block;
margin-top: 10px;
margin-bottom: 10px;
width: 100%;
height: 300px;
}
#left {
margin-right: 2px;
height: 300px;
}
#right {
height: 300px;
}
#center {
height: 300px;
}
#packetsChart, #messageChart, #emailChart, #memChart {
border: 1px solid #ccc;
background: #ddd;
}
#stats {
margin: auto;
width: 80%;
}
#jsonstats {
display: none;
}
#title {
font-size: 4em;
}
#version{
font-size: .5em;
}
#uptime, #aprsis {
font-size: 1em;
}
#callsign {
font-size: 1.4em;
color: #00F;
padding-top: 8px;
margin:10px;
}
#title_rx {
background-color: darkseagreen;
text-align: left;
}
#title_tx {
background-color: lightcoral;
text-align: left;
}
.aprsd_1 {
background-image: url(/static/images/aprs-symbols-16-0.png);
background-repeat: no-repeat;
background-position: -160px -48px;
width: 16px;
height: 16px;
}


@ -0,0 +1,4 @@
/* PrismJS 1.29.0
https://prismjs.com/download.html#themes=prism-tomorrow&languages=markup+css+clike+javascript+json+json5+log&plugins=show-language+toolbar */
code[class*=language-],pre[class*=language-]{color:#ccc;background:0 0;font-family:Consolas,Monaco,'Andale Mono','Ubuntu Mono',monospace;font-size:1em;text-align:left;white-space:pre;word-spacing:normal;word-break:normal;word-wrap:normal;line-height:1.5;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-hyphens:none;-moz-hyphens:none;-ms-hyphens:none;hyphens:none}pre[class*=language-]{padding:1em;margin:.5em 0;overflow:auto}:not(pre)>code[class*=language-],pre[class*=language-]{background:#2d2d2d}:not(pre)>code[class*=language-]{padding:.1em;border-radius:.3em;white-space:normal}.token.block-comment,.token.cdata,.token.comment,.token.doctype,.token.prolog{color:#999}.token.punctuation{color:#ccc}.token.attr-name,.token.deleted,.token.namespace,.token.tag{color:#e2777a}.token.function-name{color:#6196cc}.token.boolean,.token.function,.token.number{color:#f08d49}.token.class-name,.token.constant,.token.property,.token.symbol{color:#f8c555}.token.atrule,.token.builtin,.token.important,.token.keyword,.token.selector{color:#cc99cd}.token.attr-value,.token.char,.token.regex,.token.string,.token.variable{color:#7ec699}.token.entity,.token.operator,.token.url{color:#67cdcc}.token.bold,.token.important{font-weight:700}.token.italic{font-style:italic}.token.entity{cursor:help}.token.inserted{color:green}
div.code-toolbar{position:relative}div.code-toolbar>.toolbar{position:absolute;z-index:10;top:.3em;right:.2em;transition:opacity .3s ease-in-out;opacity:0}div.code-toolbar:hover>.toolbar{opacity:1}div.code-toolbar:focus-within>.toolbar{opacity:1}div.code-toolbar>.toolbar>.toolbar-item{display:inline-block}div.code-toolbar>.toolbar>.toolbar-item>a{cursor:pointer}div.code-toolbar>.toolbar>.toolbar-item>button{background:0 0;border:0;color:inherit;font:inherit;line-height:normal;overflow:visible;padding:0;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none}div.code-toolbar>.toolbar>.toolbar-item>a,div.code-toolbar>.toolbar>.toolbar-item>button,div.code-toolbar>.toolbar>.toolbar-item>span{color:#bbb;font-size:.8em;padding:0 .5em;background:#f5f2f0;background:rgba(224,224,224,.2);box-shadow:0 2px 0 0 rgba(0,0,0,.2);border-radius:.5em}div.code-toolbar>.toolbar>.toolbar-item>a:focus,div.code-toolbar>.toolbar>.toolbar-item>a:hover,div.code-toolbar>.toolbar>.toolbar-item>button:focus,div.code-toolbar>.toolbar>.toolbar-item>button:hover,div.code-toolbar>.toolbar>.toolbar-item>span:focus,div.code-toolbar>.toolbar>.toolbar-item>span:hover{color:inherit;text-decoration:none}


@ -0,0 +1,35 @@
/* Style the tab */
.tab {
overflow: hidden;
border: 1px solid #ccc;
background-color: #f1f1f1;
}
/* Style the buttons that are used to open the tab content */
.tab button {
background-color: inherit;
float: left;
border: none;
outline: none;
cursor: pointer;
padding: 14px 16px;
transition: 0.3s;
}
/* Change background color of buttons on hover */
.tab button:hover {
background-color: #ddd;
}
/* Create an active/current tablink class */
.tab button.active {
background-color: #ccc;
}
/* Style the tab content */
.tabcontent {
display: none;
padding: 6px 12px;
border: 1px solid #ccc;
border-top: none;
}

Binary file not shown (added, 37 KiB).
Binary file not shown (added, 52 KiB).
Some files were not shown because too many files have changed in this diff.