
Conversation

yorick1989 (Contributor) commented Jul 27, 2025

SUMMARY

Added a subresource option to the kubernetes.core.k8s module to support operating on subresources.
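
A minimal sketch of how the new option is used, taken from the demonstration further below (the CSR name and the condition values are simply the test data used there):

- name: Approve a CSR via its approval subresource
  kubernetes.core.k8s:
    subresource: approval
    definition:
      apiVersion: certificates.k8s.io/v1
      kind: certificatesigningrequests
      metadata:
        name: yorick-test
      status:
        conditions:
        - type: Approved
          status: "True"
          reason: Approved
          message: Approval yorick test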

ISSUE TYPE
  • Feature Pull Request
ADDITIONAL INFORMATION

Below is the test I ran, approving a CSR via the 'approval' subresource of the 'certificatesigningrequest' resource.

Add the CSR

cat <<EOF | kubectl apply -f -
apiVersion: certificates.k8s.io/v1
kind: CertificateSigningRequest
metadata:
  name: yorick-test
spec:
  request: LS0tLS1CRUdJTiBDRVJUSUZJQ0FURSBSRVFVRVNULS0tLS0KTUlJRFZqQ0NBYjRDQVFBd0VURVBNQTBHQTFVRUF3d0dlVzl5YVdOck1JSUJvakFOQmdrcWhraUc5dzBCQVFFRgpBQU9DQVk4QU1JSUJpZ0tDQVlFQXZXWGlIZlc4VHVQRVZUai8xcmFaREE3K0sxWVhwZThDd0MyMDJxYnJyMlY4Ck1xSHQ3T1ZPV3RFZ1N1RlRvOHRDRTRUT0JqUTB4Y1p2K0tDSlhrT2xGTmFTcVF5M3ZWeFd5L2V4YnVMcGpza00KUWErMzFEZG5tNDgydi9RUGl5cDd3VEtIMlJwODNFNlFjSHcwdU1KNDA3YUEwTUpuTENhOG1aeVRWdEVUWWJ3YwowQnZ5M010L0NiWFlHV1FVSXg1aWdWMW1HNHJtb0ZHMWwrVVZHNzBYcHdMKys4U09MUnlqM3JTVzdNb3laSml1CjQ5a2JYTmtqTjJxZmVXTWd3T1Z5ZEpBbVIrYzQ4LzNLbndub1pjQXp5YlBiUXdRK0tITDAzekdBdTdMNGgzUUoKUDV3Y2xMYjFPMEI1VzI4V29kMDZsNUNYeFZGV1JvMFllTk16L2FZVitTSGdSd3hTeGgzbm9PbnpIZFF1QzlYTgpvQ2ZrQlBuWDMvcUVoc0syS2w5N2VjYWpMaUFaRUJ0REJNQkpqbDVFdnAySTBJNkxRY29JTHlibzNlWlVWelZxClhEM2x0cm1MQklpWGpUQ09ONmhQeUo2aWtIdWdSbi80MjZJeFp6MTByQ24zMUNUdGJTYlU4T1FoWnZrdlpPa0MKdmlYenBOYU51UEx1QWtHNENKQXhBZ01CQUFHZ0FEQU5CZ2txaGtpRzl3MEJBUXNGQUFPQ0FZRUFOK3VZcFpregpObURyc2NwYU92NWViQUlLeHNYVnRUaldBTkR3a1RSNml6RFFuRVhpQmttdEFnUkhPeFo2dmVKWFlMZ1kyMFRjClN1YlJYT2Nrdk95U1hKSzBHbkQ0U2xnYnRMNUE1cTZIWHhLN0oyMVlGbGU1M3hGQ3ZKOXRkaisrWVRlVmRKYTIKSW9kSHlJOXFNVFRuTWVhUWg3aDJtTEk0NkI0YlBua2cyUTQzSmNIcGthdnM1SGFHWmw4VWJRc3VIeDRRSkVaTgovTDhiSWs5NnFyWTQ3aE40eGtCMTV4RlJGU1loeExSTXpMd2pKdWpGL0lrblU0SlBJdFFub3k4VXlTUWNlUFpHCkkzc1hsQmkwOFkwdStuNEVQaDZZRU1jTW9QL0QycE5SSWpab3RVZXowbWNLNCt3RmYraW1ZUXRwQitMNkcyVksKLy9MTU9SY1ZLWmRnWVoyYnZkUEUrQzdxVXV4SDBGbnVUdWZucmhVY3Zaa1hBUW9SVllVRHFpaVFLTkp6QjZWRgo0Q043U1R2bnA4ck96YmRRL1hWS2ZrQ2Zib0N4NnkyRE9jVUdqOWVtNEZ0VVBpTWltQ3dqT0dsMW54NTR4dGs2CmFCNGFxcm9VMk1qR2V6V2pVZkYrZDdiR1VkWDdsK1BJcCs4eW5DVklHdVcxV3lDQlRZMWlVT3l3Ci0tLS0tRU5EIENFUlRJRklDQVRFIFJFUVVFU1QtLS0tLQo=
  signerName: kubernetes.io/kube-apiserver-client
  usages:
  - client auth
EOF

Output

certificatesigningrequest.certificates.k8s.io/yorick-test created

Pull the CSR info

cat << EOF | ansible-playbook /dev/stdin
- name: Get CSR subresource.
  hosts: localhost
  gather_facts: False
  tasks:

  - kubernetes.core.k8s_info:
      kubeconfig: '~/.kube/config'
      api_version: certificates.k8s.io/v1
      kind: certificatesigningrequests
      name: yorick-test
    register: yorick_test

  - ansible.builtin.debug:
      msg: "{{ yorick_test | json_query('resources[].{name: metadata.name, status: status}') | to_nice_json }}"
EOF

Output

PLAY [Get CSR subresource.] **********************************************************************************************************************************

TASK [kubernetes.core.k8s_info] ******************************************************************************************************************************
Sunday 27 July 2025  02:34:51 +0200 (0:00:00.010)       0:00:00.010 ***********
ok: [localhost]

TASK [ansible.builtin.debug] *********************************************************************************************************************************
Sunday 27 July 2025  02:34:55 +0200 (0:00:03.970)       0:00:03.980 ***********
ok: [localhost] => {}

MSG:

[
    {
        "name": "yorick-test",
        "status": {}
    }
]

PLAY RECAP ***************************************************************************************************************************************************
localhost                  : ok=2    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Playbook run took 0 days, 0 hours, 0 minutes, 4 seconds
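
As a side note, the debug task above relies on the json_query filter, which needs the jmespath Python library on the controller. A roughly equivalent task using only built-in filters could look like this (just a sketch, not part of this PR):

  - ansible.builtin.debug:
      msg: "{{ {'name': item.metadata.name, 'status': item.status} | to_nice_json }}"
    loop: "{{ yorick_test.resources }}"
    loop_control:
      label: "{{ item.metadata.name }}"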

Approve the CSR

cat << EOF | ansible-playbook /dev/stdin
- name: Update CSR subresource.
  hosts: localhost
  gather_facts: False
  tasks:

  - kubernetes.core.k8s:
      kubeconfig: '~/.kube/config'
      subresource: approval
      definition:
        apiVersion: certificates.k8s.io/v1
        kind: certificatesigningrequests
        metadata:
          name: yorick-test
        status:
          conditions:
          - lastTransitionTime: "2025-07-27T02:12:22Z"
            lastUpdateTime: "2025-07-27T02:12:22Z"
            message: Approval yorick test
            reason: Approved
            status: "True"
            type: Approved
    register: yorick_test

  - kubernetes.core.k8s_info:
      kubeconfig: '~/.kube/config'
      api_version: certificates.k8s.io/v1
      kind: certificatesigningrequests
      name: yorick-test
    register: yorick_test

  - ansible.builtin.debug:
      msg: "{{ yorick_test | json_query('resources[].{name: metadata.name, status: status}') | to_nice_json }}"
EOF

Output

PLAY [Update CSR subresource.] *******************************************************************************************************************************

TASK [kubernetes.core.k8s] ***********************************************************************************************************************************
Sunday 27 July 2025  02:35:35 +0200 (0:00:00.010)       0:00:00.010 ***********
--- before
+++ after
@@ -15,9 +15,44 @@
                 "manager": "kubectl-client-side-apply",
                 "operation": "Update",
                 "time": "2025-07-27T00:34:31Z"
+            },
+            {
+                "apiVersion": "certificates.k8s.io/v1",
+                "fieldsType": "FieldsV1",
+                "fieldsV1": {
+                    "f:status": {
+                        "f:conditions": {
+                            ".": {},
+                            "k:{\"type\":\"Approved\"}": {
+                                ".": {},
+                                "f:lastTransitionTime": {},
+                                "f:lastUpdateTime": {},
+                                "f:message": {},
+                                "f:reason": {},
+                                "f:status": {},
+                                "f:type": {}
+                            }
+                        }
+                    }
+                },
+                "manager": "OpenAPI-Generator",
+                "operation": "Update",
+                "subresource": "approval",
+                "time": "2025-07-27T00:35:38Z"
             }
         ],
-        "resourceVersion": "26759657"
+        "resourceVersion": "26761740"
     },
-    "status": {}
+    "status": {
+        "conditions": [
+            {
+                "lastTransitionTime": "2025-07-27T02:12:22Z",
+                "lastUpdateTime": "2025-07-27T02:12:22Z",
+                "message": "Approval yorick test",
+                "reason": "Approved",
+                "status": "True",
+                "type": "Approved"
+            }
+        ]
+    }
 }

changed: [localhost]

TASK [kubernetes.core.k8s_info] ******************************************************************************************************************************
Sunday 27 July 2025  02:35:39 +0200 (0:00:04.067)       0:00:04.078 ***********
ok: [localhost]

TASK [ansible.builtin.debug] *********************************************************************************************************************************
Sunday 27 July 2025  02:35:39 +0200 (0:00:00.814)       0:00:04.893 ***********
ok: [localhost] => {}

MSG:

[
    {
        "name": "yorick-test",
        "status": {
            "certificate": "...",
            "conditions": [
                {
                    "lastTransitionTime": "2025-07-27T02:12:22Z",
                    "lastUpdateTime": "2025-07-27T02:12:22Z",
                    "message": "Approval yorick test",
                    "reason": "Approved",
                    "status": "True",
                    "type": "Approved"
                }
            ]
        }
    }
]

PLAY RECAP ***************************************************************************************************************************************************
localhost                  : ok=3    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Playbook run took 0 days, 0 hours, 0 minutes, 4 seconds

Double-check, using the oc command

oc get csr yorick-test
NAME          AGE   SIGNERNAME                            REQUESTOR      REQUESTEDDURATION   CONDITION
yorick-test   18m   kubernetes.io/kube-apiserver-client   system:admin   <none>              Approved,Issued

Please let me know if anything more is required for this to be implemented.

Thanks!

@yorick1989 yorick1989 marked this pull request as ready for review July 27, 2025 00:57

yorick1989 (Contributor, Author) commented:

@beeankha can you please run the workflows again? I made the changes where the previous run failed:

  • removed json_query dependency
  • fixed indentation errors

Check: https://github.com/ansible-collections/kubernetes.core/compare/ecd340ec88fac16615cb4ba9fdc2c2e3cd02a098..e35bc453ce6b5742d09b915016d00e56ccdddc7b

Thanks!


@yorick1989 yorick1989 force-pushed the main branch 2 times, most recently from e35bc45 to 9674509 on August 6, 2025 20:07

yorick1989 (Contributor, Author) commented Aug 6, 2025

@beeankha I see that the workflow this time failed because of one of the commits that were merged into main while I was working on this item. I checked out the latest upstream main, cherry-picked my changes, and pushed the result back to my own main branch. Everything should be okay now.

And, maybe it's done on purpose, but why is there no ref: ${{ github.event.pull_request.head.sha }} at https://github.com/ansible-collections/kubernetes.core/blob/c48778d70926993d31a2e35551c6d0067a3d4f5c/.github/workflows/integration-tests.yaml#L24C1-L28C27? Without it, newly added integration tests are not exercised during the PR phase, because the splitter does not find them in the main branch (of course :)). The reverse is also true (which is what I experienced): when someone's branch is behind, tests that were added to main in the meantime cause failures in the workflow runs of open pull requests. Just a thought and maybe something to improve :). Unless there is a good reason why it works like this.
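
For illustration, the suggestion boils down to a checkout step along these lines in integration-tests.yaml (just a sketch; the actions/checkout version shown is an arbitrary example, not the workflow's actual contents):

      - name: Checkout the pull request head instead of the target branch
        uses: actions/checkout@v4
        with:
          ref: ${{ github.event.pull_request.head.sha }}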

yorick1989 (Contributor, Author) commented:

@beeankha sorry to bother you with this, but could you tell me how I can get my 2 PRs reviewed and scheduled to be merged into master / the next version?

And if there are better channels for communicating about changes to community collections like this one, could you please share them with me too?

Thanks!

yurnov (Contributor) left a comment:

Besides a missing version_added, LGTM.


@yorick1989 yorick1989 requested a review from yurnov September 14, 2025 17:11

yurnov (Contributor) commented Sep 24, 2025

Hi @yorick1989, the failing sanity check (which was not related to your PR) has been resolved. Could you please rebase your PR or merge master into your branch so the CI tests pass?

yorick1989 (Contributor, Author) commented:

Hi @yorick1989, the failing sanity check (which was not related to your PR) has been resolved. Could you please rebase your PR or merge master into your branch so the CI tests pass?

Why do changes keep coming into the main branch while I have been waiting for almost 2 months now to get my pull requests into main? If the PR is not wanted in main, just reject it.

The same goes for the other, older pull requests that have been open for years now. I really don't understand why it is so hard to get such small PRs reviewed.

If there is some process or communication channel that can be used for discussing PRs, please let me know and put it in the documentation somewhere.

Thanks

@beeankha beeankha requested review from gravesm, alinabuzachis and abikouo and removed request for yurnov September 24, 2025 13:48

beeankha (Contributor) commented:

Hi @yorick1989, thank you for this PR (as well as #971), and apologies for the lack of responses to your comments and questions. After the latest releases of this collection (5.4.0 and 6.1.0), the maintainers of this repository got pulled onto different projects and we were unable to review or merge anything for a little while. The recent CI fix unblocked several open PRs that had failing sanity tests, which is why it was the first thing to get merged. Your PRs, as well as a few others, will be reviewed before our next planned 5.5.0 and 6.2.0 releases.


yorick1989 (Contributor, Author) commented:

Hi @yorick1989, thank you for this PR (as well as #971), and apologies for the lack of responses to your comments and questions. After the latest releases of this collection (5.4.0 and 6.1.0), the maintainers of this repository got pulled onto different projects and we were unable to review or merge anything for a little while. The recent CI fix unblocked several open PRs that had failing sanity tests, which is why it was the first thing to get merged. Your PRs, as well as a few others, will be reviewed before our next planned 5.5.0 and 6.2.0 releases.

Good to know, thanks. And thanks for the update as well.

I've also created a small PR with a fix for the integration test at #981.

yorick1989 (Contributor, Author) commented:

Hi @yorick1989, the failing sanity check (which was not related to your PR) has been resolved. Could you please rebase your PR or merge master into your branch so the CI tests pass?

I've pulled in the changes from main and re-applied my commit for this PR.

beeankha (Contributor) commented:

Thanks again for this pull request, @yorick1989. Some thoughts after reviewing this PR:

The k8s API's subresource design doesn't align well with the resource-oriented model that the rest of the k8s API fits into. Subresources are heterogeneous by nature (e.g., the scale subresource of deployments vs. the log and exec subresources of pods vs. the approval subresource of CSRs) and this heterogeneity requires specialized handling, which is why dedicated modules like k8s_scale, k8s_exec, and k8s_log were developed.

Rather than adding generic subresource support to the main k8s module, one of the following approaches may be preferable:

  1. Create a specialized k8s_csr module for Certificate Signing Request operations (approval, denial, etc.)
  2. Enhance documentation about using existing specialized modules for subresource operations

Please let us know what you think of the above.
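
For context, this is roughly how one of those dedicated modules already covers the scale subresource today (a minimal sketch; the deployment name and namespace are made-up values):

- name: Scale a deployment via its scale subresource
  kubernetes.core.k8s_scale:
    kubeconfig: '~/.kube/config'
    api_version: apps/v1
    kind: Deployment
    name: example-app
    namespace: default
    replicas: 3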

yorick1989 (Contributor, Author) commented:

Thanks again for this pull request, @yorick1989. Some thoughts after reviewing this PR:

The k8s API's subresource design doesn't align well with the resource-oriented model that the rest of the k8s API fits into. Subresources are heterogeneous by nature (e.g., the scale subresource of deployments vs. the log and exec subresources of pods vs. the approval subresource of CSRs) and this heterogeneity requires specialized handling, which is why dedicated modules like k8s_scale, k8s_exec, and k8s_log were developed.

Rather than adding generic subresource support to the main k8s module, one of the following approaches may be preferable:

1. Create a specialized `k8s_csr` module for Certificate Signing Request operations (approval, denial, etc.)

2. Enhance documentation about using existing specialized modules for subresource operations

Please let us know what you think of the above.

Thank you for your reply, and I agree with your comment. Do you think it's worthwhile to create a separate k8s_csr approval module, or should k8s_exec be used instead? If you think a separate module adds value, then I will work on that instead.
