The following page describes several ideas that may be interesting for people considering or joining the project. This list is presented in no specific order.
Feature wish list
1. DHCP Degradation Canary
Stork is able to monitor Kea instances. However, administrators want additional assurance that the server is actually able to provide DHCP service. The fact that the server is up and alive does not imply that it can always serve clients: consider cases such as running out of disk space (memfile cannot store leases), database credential problems (the DB connection works but lease inserts fail), firewall rules dropping responses, a bug causing Kea to get stuck in an infinite loop or drop responses, various types of misconfiguration, etc. Therefore, a mechanism is needed that acts as a DHCP client and actually obtains a lease (and then releases it quickly, so as not to consume resources). The basic approach would be to trigger this action manually, by clicking a button in the Stork UI. An improved version would offer the ability to run such a check periodically. The difficulties are listed below.
- Figure out who will send it - the server or agent. Both approaches have pros and cons.
- Figure out if a DHCP server needs any special configuration to reply to the DHCP queries from Stork. It is particularly interesting when the agent sends queries to the DHCP server from its localhost.
- Figure out how to conduct DHCP exchange. Getting DHCP implementation in go is one option. Using an external tool, such as perfdhcp, is another.
- Figure out how to store the responses and extract as much information from the exchange as possible (maybe measure latency?).
- If this is sent by the agent, an API is needed for the server to trigger it.
- If this is sent by the server, scalability needs to be considered.
Related requirements: #158
2. Display Kea config (JSON)
Kea configurations are JSON structures; example configurations can be found in doc/examples/kea6 in the Kea sources. The Stork server already has the capability to retrieve that information using the existing API (it instructs stork-agent to call the config-get command on Kea). The retrieved information is then stored by stork-server in a PostgreSQL database.
As of March 2021, Stork can show specific elements (networks, subnets, reservations), but lacks the ability to show the whole Kea configuration. We would like to be able to display the whole configuration assigned to a specific Kea daemon (DHCP server, Control Agent, etc.). This feature can be implemented incrementally, starting with the easiest steps and moving to more sophisticated ones.
- Phase 1. In the first step, the configuration should be presented in JSON format, with the possibility to show and hide certain nodes. There is some degree of flexibility here; one way to visualize it would be a tree with expandable/collapsible sub-trees.
- Phase 2. In the second step, some of the elements should be made clickable, and clicking on them would direct the user to the specific view, e.g. shared network view, subnet view etc.
- Phase 3. Finally, in the third step, old configurations should be tracked in the Stork database, and it should be possible to view a selected configuration. It must be possible to generate a diff between any two configurations by selecting them and clicking a compare button. The diff may look like a diff between two JSON files in a version control system.
Related requirement: #43
Configuration tracking may sound a bit complicated, but Stork already stores the current configuration in the database as JSONB. We also have a mechanism to detect configuration changes. A new table needs to be added to store old configurations with appropriate metadata and a trigger in the database to move the current configuration to this new table.
2.1 Phase 1: Visualising Kea configuration
Here's a high level sketch of tasks required to complete phase 1:
- Evaluate whether the stork-server code (written in Go) provides an API to make the configuration available. The Kea config information is there, and there are API calls (such as /subnets) that return specific fragments of the configuration, but most likely a new API call will need to be implemented that returns the configuration as a whole JSON structure.
- Extend the Angular interface to visualize the Kea configuration. The visualization can be simple for now, but it should be extensible, so that the more complicated tasks become possible in the future; see Phases 2 and 3 above. The current Stork interface is implemented using Angular 9 (migration to 10 is planned), with extensive use of the PrimeNG library. Use of existing libraries with compatible licenses is encouraged.
- The solution should have unit tests (see Stork ARM Sections 5.6 and 5.7).
- The solution should have adequate code comments
- The patch will go through a normal review that applies to all existing Stork developers. The process is described here.
Phase 1 is expected to take roughly a month to complete. However, Stork is a long-term project, and ISC values code quality over rapid delivery, so there is some flexibility here.
3. Host Reservation details
Stork is able to show host reservations from Kea, but as of March 2021 its capabilities are basic: only the identifier (e.g. MAC or DUID), IP address, delegated prefix, hostname, and whether the reservation was retrieved from the config file or the host backend are shown. The missing pieces are:
- fixed fields (e.g. bootfile, sname)
- DHCP options
- client classes
Related requirement: #314
Displaying this additional information requires adding a detailed view of a host reservation. Currently, all available information is displayed in the list of host reservations, which is already wide.
- The next step is adding a button in the detailed host reservation view that checks whether the reservation is currently in use, i.e. whether the Kea server has a valid lease for it.
Related requirement: #237
Lease data changes dynamically, and we currently have no systematic way of gathering this information from the Kea servers. Gathering the information for the whole network is not a goal of this task. The aim is to issue a lease4-get or lease6-get command on demand, for a single host reservation, as a result of clicking the button. If the server returns a valid lease, the reservation should be highlighted as in use.
Potential future ideas
The following areas of interest are currently blocked for various technical and business reasons. If you are interested, please reach out to the Stork team to discuss the matter.
4. User management - Read-only user
As of March 2021, Stork has only two user roles: super-admin (can do everything) and admin (can do everything except manage other users). We need more fine-grained access control. The most basic addition would be a read-only user: for example, a junior administrator who can only observe the system but is not permitted to make any changes. In the future, the role system will become more sophisticated, so the solution must be extensible. In particular, the following use cases will need to be possible: a role to manage a single server, and a role to manage a certain subnet (including situations where it is handled by a pair of HA servers). This is currently blocked because the Stork team needs to write down the requirements; our early attempt indicates it is more complex than it looks at first glance.
Related requirement: #157
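One way the extensibility requirement could be modeled is a role built from scoped permissions, so that "read everything", "manage machine 3", or "manage subnet X" are all instances of the same structure. The type names and scope strings below are purely hypothetical, not an agreed Stork design.

```go
package main

import "fmt"

// Permission pairs an action with the scope it applies to.
type Permission struct {
	Action string // e.g. "read", "write"
	Scope  string // e.g. "*", "machine:3", "subnet:192.0.2.0/24"
}

// Role is a named set of permissions.
type Role struct {
	Name        string
	Permissions []Permission
}

// Allowed reports whether the role grants the action on the given scope.
// "*" acts as a wildcard matching any scope.
func (r Role) Allowed(action, scope string) bool {
	for _, p := range r.Permissions {
		if p.Action == action && (p.Scope == "*" || p.Scope == scope) {
			return true
		}
	}
	return false
}

func main() {
	readOnly := Role{
		Name:        "read-only",
		Permissions: []Permission{{Action: "read", Scope: "*"}},
	}
	fmt.Println(readOnly.Allowed("read", "machine:3"))  // true
	fmt.Println(readOnly.Allowed("write", "machine:3")) // false
}
```

Adding a "manage one server" role would then just mean granting write permission with a machine-scoped, rather than wildcard, scope.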
5. Address allocation details
As of March 2021, we have the ability to show statistics for networks and subnets; we can say, for example, that 30 of 250 addresses are used. These statistics are a good first approximation, but they have several flaws. First, there have been bugs in the statistics that caused them to not truly reflect the pool state, in particular when several Kea instances share the same database. Second, an overview of pool utilization is often not enough; administrators want more detailed insight. The major difficulty here is coming up with an efficient way to keep this information roughly up to date. The mechanisms currently available in Kea (e.g. lease4-get-all) are insufficient and would not scale to deployments that count devices in the millions. There is a plan to implement incremental lease updates, so Stork would retrieve all leases just once (which is acceptable) and then only fetch lease updates periodically. This is currently blocked until such a mechanism is implemented in Kea. See the lease tracking design.