From patchwork Mon Oct 28 17:49:41 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Luca Vizzarro X-Patchwork-Id: 147550 X-Patchwork-Delegate: paul.szczepanek@arm.com Return-Path: X-Original-To: patchwork@inbox.dpdk.org Delivered-To: patchwork@inbox.dpdk.org Received: from mails.dpdk.org (mails.dpdk.org [217.70.189.124]) by inbox.dpdk.org (Postfix) with ESMTP id 89CB445BFF; Mon, 28 Oct 2024 18:51:20 +0100 (CET) Received: from mails.dpdk.org (localhost [127.0.0.1]) by mails.dpdk.org (Postfix) with ESMTP id A7D78427C9; Mon, 28 Oct 2024 18:51:16 +0100 (CET) Received: from foss.arm.com (foss.arm.com [217.140.110.172]) by mails.dpdk.org (Postfix) with ESMTP id 153D74279F for ; Mon, 28 Oct 2024 18:51:14 +0100 (CET) Received: from usa-sjc-imap-foss1.foss.arm.com (unknown [10.121.207.14]) by usa-sjc-mx-foss1.foss.arm.com (Postfix) with ESMTP id 2079016F2; Mon, 28 Oct 2024 10:51:43 -0700 (PDT) Received: from localhost.localdomain (JR4XG4HTQC.cambridge.arm.com [10.1.31.47]) by usa-sjc-imap-foss1.foss.arm.com (Postfix) with ESMTPA id B55693F66E; Mon, 28 Oct 2024 10:51:12 -0700 (PDT) From: Luca Vizzarro To: dev@dpdk.org Cc: Paul Szczepanek , Patrick Robb , Luca Vizzarro Subject: [PATCH v4 1/8] dts: add pydantic dependency Date: Mon, 28 Oct 2024 17:49:41 +0000 Message-ID: <20241028174949.3283701-2-luca.vizzarro@arm.com> X-Mailer: git-send-email 2.43.0 In-Reply-To: <20241028174949.3283701-1-luca.vizzarro@arm.com> References: <20240822163941.1390326-1-luca.vizzarro@arm.com> <20241028174949.3283701-1-luca.vizzarro@arm.com> MIME-Version: 1.0 X-BeenThere: dev@dpdk.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: DPDK patches and discussions List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: dev-bounces@dpdk.org As part of configuration validation and deserialization improvements, this adds pydantic as a project dependency. 
Pydantic is a library that caters to all of the aforementioned needs, while improving the process and code. Signed-off-by: Luca Vizzarro Reviewed-by: Paul Szczepanek Reviewed-by: Nicholas Pratte --- dts/poetry.lock | 171 ++++++++++++++++++++++++++++++++++++++++++++- dts/pyproject.toml | 1 + 2 files changed, 170 insertions(+), 2 deletions(-) diff --git a/dts/poetry.lock b/dts/poetry.lock index cf5f6569c6..56c50ad52c 100644 --- a/dts/poetry.lock +++ b/dts/poetry.lock @@ -1,4 +1,4 @@ -# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand. +# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand. [[package]] name = "aenum" @@ -23,6 +23,17 @@ files = [ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"}, ] +[[package]] +name = "annotated-types" +version = "0.7.0" +description = "Reusable constraint types to use with typing.Annotated" +optional = false +python-versions = ">=3.8" +files = [ + {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"}, + {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"}, +] + [[package]] name = "attrs" version = "23.1.0" @@ -567,6 +578,16 @@ files = [ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"}, {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"}, {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"}, + {file = 
"MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"}, @@ -762,6 +783,130 @@ files = [ {file = "pycparser-2.21.tar.gz", hash = 
"sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"}, ] +[[package]] +name = "pydantic" +version = "2.9.2" +description = "Data validation using Python type hints" +optional = false +python-versions = ">=3.8" +files = [ + {file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"}, + {file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"}, +] + +[package.dependencies] +annotated-types = ">=0.6.0" +pydantic-core = "2.23.4" +typing-extensions = [ + {version = ">=4.12.2", markers = "python_version >= \"3.13\""}, + {version = ">=4.6.1", markers = "python_version < \"3.13\""}, +] + +[package.extras] +email = ["email-validator (>=2.0.0)"] +timezone = ["tzdata"] + +[[package]] +name = "pydantic-core" +version = "2.23.4" +description = "Core functionality for Pydantic validation and serialization" +optional = false +python-versions = ">=3.8" +files = [ + {file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"}, + {file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"}, + {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63e46b3169866bd62849936de036f901a9356e36376079b05efa83caeaa02ceb"}, + {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1a53de42fbe34853ba90513cea21673481cd81ed1be739f7f2efb931b24916"}, + {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cfdd16ab5e59fc31b5e906d1a3f666571abc367598e3e02c83403acabc092e07"}, + {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:255a8ef062cbf6674450e668482456abac99a5583bbafb73f9ad469540a3a232"}, + {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a7cd62e831afe623fbb7aabbb4fe583212115b3ef38a9f6b71869ba644624a2"}, + {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f09e2ff1f17c2b51f2bc76d1cc33da96298f0a036a137f5440ab3ec5360b624f"}, + {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e38e63e6f3d1cec5a27e0afe90a085af8b6806ee208b33030e65b6516353f1a3"}, + {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0dbd8dbed2085ed23b5c04afa29d8fd2771674223135dc9bc937f3c09284d071"}, + {file = "pydantic_core-2.23.4-cp310-none-win32.whl", hash = "sha256:6531b7ca5f951d663c339002e91aaebda765ec7d61b7d1e3991051906ddde119"}, + {file = "pydantic_core-2.23.4-cp310-none-win_amd64.whl", hash = "sha256:7c9129eb40958b3d4500fa2467e6a83356b3b61bfff1b414c7361d9220f9ae8f"}, + {file = "pydantic_core-2.23.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:77733e3892bb0a7fa797826361ce8a9184d25c8dffaec60b7ffe928153680ba8"}, + {file = "pydantic_core-2.23.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1b84d168f6c48fabd1f2027a3d1bdfe62f92cade1fb273a5d68e621da0e44e6d"}, + {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df49e7a0861a8c36d089c1ed57d308623d60416dab2647a4a17fe050ba85de0e"}, + {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff02b6d461a6de369f07ec15e465a88895f3223eb75073ffea56b84d9331f607"}, + {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:996a38a83508c54c78a5f41456b0103c30508fed9abcad0a59b876d7398f25fd"}, + {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:d97683ddee4723ae8c95d1eddac7c192e8c552da0c73a925a89fa8649bf13eea"}, + {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:216f9b2d7713eb98cb83c80b9c794de1f6b7e3145eef40400c62e86cee5f4e1e"}, + {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6f783e0ec4803c787bcea93e13e9932edab72068f68ecffdf86a99fd5918878b"}, + {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d0776dea117cf5272382634bd2a5c1b6eb16767c223c6a5317cd3e2a757c61a0"}, + {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5f7a395a8cf1621939692dba2a6b6a830efa6b3cee787d82c7de1ad2930de64"}, + {file = "pydantic_core-2.23.4-cp311-none-win32.whl", hash = "sha256:74b9127ffea03643e998e0c5ad9bd3811d3dac8c676e47db17b0ee7c3c3bf35f"}, + {file = "pydantic_core-2.23.4-cp311-none-win_amd64.whl", hash = "sha256:98d134c954828488b153d88ba1f34e14259284f256180ce659e8d83e9c05eaa3"}, + {file = "pydantic_core-2.23.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f3e0da4ebaef65158d4dfd7d3678aad692f7666877df0002b8a522cdf088f231"}, + {file = "pydantic_core-2.23.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f69a8e0b033b747bb3e36a44e7732f0c99f7edd5cea723d45bc0d6e95377ffee"}, + {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:723314c1d51722ab28bfcd5240d858512ffd3116449c557a1336cbe3919beb87"}, + {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb2802e667b7051a1bebbfe93684841cc9351004e2badbd6411bf357ab8d5ac8"}, + {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18ca8148bebe1b0a382a27a8ee60350091a6ddaf475fa05ef50dc35b5df6327"}, + {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:33e3d65a85a2a4a0dc3b092b938a4062b1a05f3a9abde65ea93b233bca0e03f2"}, + {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:128585782e5bfa515c590ccee4b727fb76925dd04a98864182b22e89a4e6ed36"}, + {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68665f4c17edcceecc112dfed5dbe6f92261fb9d6054b47d01bf6371a6196126"}, + {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:20152074317d9bed6b7a95ade3b7d6054845d70584216160860425f4fbd5ee9e"}, + {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9261d3ce84fa1d38ed649c3638feefeae23d32ba9182963e465d58d62203bd24"}, + {file = "pydantic_core-2.23.4-cp312-none-win32.whl", hash = "sha256:4ba762ed58e8d68657fc1281e9bb72e1c3e79cc5d464be146e260c541ec12d84"}, + {file = "pydantic_core-2.23.4-cp312-none-win_amd64.whl", hash = "sha256:97df63000f4fea395b2824da80e169731088656d1818a11b95f3b173747b6cd9"}, + {file = "pydantic_core-2.23.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7530e201d10d7d14abce4fb54cfe5b94a0aefc87da539d0346a484ead376c3cc"}, + {file = "pydantic_core-2.23.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:df933278128ea1cd77772673c73954e53a1c95a4fdf41eef97c2b779271bd0bd"}, + {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cb3da3fd1b6a5d0279a01877713dbda118a2a4fc6f0d821a57da2e464793f05"}, + {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c6dcb030aefb668a2b7009c85b27f90e51e6a3b4d5c9bc4c57631292015b0d"}, + {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:696dd8d674d6ce621ab9d45b205df149399e4bb9aa34102c970b721554828510"}, + {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:2971bb5ffe72cc0f555c13e19b23c85b654dd2a8f7ab493c262071377bfce9f6"}, + {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8394d940e5d400d04cad4f75c0598665cbb81aecefaca82ca85bd28264af7f9b"}, + {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0dff76e0602ca7d4cdaacc1ac4c005e0ce0dcfe095d5b5259163a80d3a10d327"}, + {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7d32706badfe136888bdea71c0def994644e09fff0bfe47441deaed8e96fdbc6"}, + {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ed541d70698978a20eb63d8c5d72f2cc6d7079d9d90f6b50bad07826f1320f5f"}, + {file = "pydantic_core-2.23.4-cp313-none-win32.whl", hash = "sha256:3d5639516376dce1940ea36edf408c554475369f5da2abd45d44621cb616f769"}, + {file = "pydantic_core-2.23.4-cp313-none-win_amd64.whl", hash = "sha256:5a1504ad17ba4210df3a045132a7baeeba5a200e930f57512ee02909fc5c4cb5"}, + {file = "pydantic_core-2.23.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d4488a93b071c04dc20f5cecc3631fc78b9789dd72483ba15d423b5b3689b555"}, + {file = "pydantic_core-2.23.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:81965a16b675b35e1d09dd14df53f190f9129c0202356ed44ab2728b1c905658"}, + {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffa2ebd4c8530079140dd2d7f794a9d9a73cbb8e9d59ffe24c63436efa8f271"}, + {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:61817945f2fe7d166e75fbfb28004034b48e44878177fc54d81688e7b85a3665"}, + {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29d2c342c4bc01b88402d60189f3df065fb0dda3654744d5a165a5288a657368"}, + {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:5e11661ce0fd30a6790e8bcdf263b9ec5988e95e63cf901972107efc49218b13"}, + {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d18368b137c6295db49ce7218b1a9ba15c5bc254c96d7c9f9e924a9bc7825ad"}, + {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec4e55f79b1c4ffb2eecd8a0cfba9955a2588497d96851f4c8f99aa4a1d39b12"}, + {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:374a5e5049eda9e0a44c696c7ade3ff355f06b1fe0bb945ea3cac2bc336478a2"}, + {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5c364564d17da23db1106787675fc7af45f2f7b58b4173bfdd105564e132e6fb"}, + {file = "pydantic_core-2.23.4-cp38-none-win32.whl", hash = "sha256:d7a80d21d613eec45e3d41eb22f8f94ddc758a6c4720842dc74c0581f54993d6"}, + {file = "pydantic_core-2.23.4-cp38-none-win_amd64.whl", hash = "sha256:5f5ff8d839f4566a474a969508fe1c5e59c31c80d9e140566f9a37bba7b8d556"}, + {file = "pydantic_core-2.23.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a4fa4fc04dff799089689f4fd502ce7d59de529fc2f40a2c8836886c03e0175a"}, + {file = "pydantic_core-2.23.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7df63886be5e270da67e0966cf4afbae86069501d35c8c1b3b6c168f42cb36"}, + {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcedcd19a557e182628afa1d553c3895a9f825b936415d0dbd3cd0bbcfd29b4b"}, + {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f54b118ce5de9ac21c363d9b3caa6c800341e8c47a508787e5868c6b79c9323"}, + {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86d2f57d3e1379a9525c5ab067b27dbb8a0642fb5d454e17a9ac434f9ce523e3"}, + {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de6d1d1b9e5101508cb37ab0d972357cac5235f5c6533d1071964c47139257df"}, + 
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1278e0d324f6908e872730c9102b0112477a7f7cf88b308e4fc36ce1bdb6d58c"}, + {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9a6b5099eeec78827553827f4c6b8615978bb4b6a88e5d9b93eddf8bb6790f55"}, + {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e55541f756f9b3ee346b840103f32779c695a19826a4c442b7954550a0972040"}, + {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a5c7ba8ffb6d6f8f2ab08743be203654bb1aaa8c9dcb09f82ddd34eadb695605"}, + {file = "pydantic_core-2.23.4-cp39-none-win32.whl", hash = "sha256:37b0fe330e4a58d3c58b24d91d1eb102aeec675a3db4c292ec3928ecd892a9a6"}, + {file = "pydantic_core-2.23.4-cp39-none-win_amd64.whl", hash = "sha256:1498bec4c05c9c787bde9125cfdcc63a41004ff167f495063191b863399b1a29"}, + {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f455ee30a9d61d3e1a15abd5068827773d6e4dc513e795f380cdd59932c782d5"}, + {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1e90d2e3bd2c3863d48525d297cd143fe541be8bbf6f579504b9712cb6b643ec"}, + {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e203fdf807ac7e12ab59ca2bfcabb38c7cf0b33c41efeb00f8e5da1d86af480"}, + {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e08277a400de01bc72436a0ccd02bdf596631411f592ad985dcee21445bd0068"}, + {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f220b0eea5965dec25480b6333c788fb72ce5f9129e8759ef876a1d805d00801"}, + {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d06b0c8da4f16d1d1e352134427cb194a0a6e19ad5db9161bf32b2113409e728"}, + {file = 
"pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ba1a0996f6c2773bd83e63f18914c1de3c9dd26d55f4ac302a7efe93fb8e7433"}, + {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:9a5bce9d23aac8f0cf0836ecfc033896aa8443b501c58d0602dbfd5bd5b37753"}, + {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:78ddaaa81421a29574a682b3179d4cf9e6d405a09b99d93ddcf7e5239c742e21"}, + {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:883a91b5dd7d26492ff2f04f40fbb652de40fcc0afe07e8129e8ae779c2110eb"}, + {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88ad334a15b32a791ea935af224b9de1bf99bcd62fabf745d5f3442199d86d59"}, + {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:233710f069d251feb12a56da21e14cca67994eab08362207785cf8c598e74577"}, + {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:19442362866a753485ba5e4be408964644dd6a09123d9416c54cd49171f50744"}, + {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:624e278a7d29b6445e4e813af92af37820fafb6dcc55c012c834f9e26f9aaaef"}, + {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f5ef8f42bec47f21d07668a043f077d507e5bf4e668d5c6dfe6aaba89de1a5b8"}, + {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:aea443fffa9fbe3af1a9ba721a87f926fe548d32cab71d188a6ede77d0ff244e"}, + {file = "pydantic_core-2.23.4.tar.gz", hash = "sha256:2584f7cf844ac4d970fba483a717dbe10c1c1c96a969bf65d61ffe94df1b2863"}, +] + +[package.dependencies] +typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0" + [[package]] name = "pydocstyle" version = "6.1.1" @@ -880,6 +1025,7 @@ files = [ {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"}, + {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"}, {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"}, {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"}, {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"}, @@ -887,8 +1033,16 @@ files = [ {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"}, + {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"}, {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"}, {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"}, + {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = 
"sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"}, + {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"}, + {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"}, + {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"}, + {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"}, + {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"}, + {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"}, {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"}, @@ -905,6 +1059,7 @@ files = [ {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"}, {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"}, {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"}, + {file = 
"PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"}, {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"}, {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"}, {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"}, @@ -912,6 +1067,7 @@ files = [ {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"}, {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"}, {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"}, + {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"}, {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"}, {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"}, {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"}, @@ -1327,6 +1483,17 @@ files = [ {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"}, ] +[[package]] +name = "typing-extensions" +version = "4.12.2" +description = "Backported and Experimental Type Hints for Python 3.8+" +optional = false +python-versions = ">=3.8" +files = [ + {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = 
"sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"}, + {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"}, +] + [[package]] name = "urllib3" version = "2.0.7" @@ -1362,4 +1529,4 @@ jsonschema = ">=4,<5" [metadata] lock-version = "2.0" python-versions = "^3.10" -content-hash = "6f20ce05310df93fed1d392160d1653ae5de5c6f260a5865eb3c6111a7c2b394" +content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827" diff --git a/dts/pyproject.toml b/dts/pyproject.toml index 506380ac2f..6c2d1ca8a4 100644 --- a/dts/pyproject.toml +++ b/dts/pyproject.toml @@ -28,6 +28,7 @@ scapy = "^2.5.0" pydocstyle = "6.1.1" typing-extensions = "^4.11.0" aenum = "^3.1.15" +pydantic = "^2.9.2" [tool.poetry.group.dev.dependencies] mypy = "^1.10.0" From patchwork Mon Oct 28 17:49:42 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Luca Vizzarro X-Patchwork-Id: 147551 X-Patchwork-Delegate: paul.szczepanek@arm.com Return-Path: X-Original-To: patchwork@inbox.dpdk.org Delivered-To: patchwork@inbox.dpdk.org Received: from mails.dpdk.org (mails.dpdk.org [217.70.189.124]) by inbox.dpdk.org (Postfix) with ESMTP id 4FCF645BFF; Mon, 28 Oct 2024 18:51:27 +0100 (CET) Received: from mails.dpdk.org (localhost [127.0.0.1]) by mails.dpdk.org (Postfix) with ESMTP id F032F427B7; Mon, 28 Oct 2024 18:51:19 +0100 (CET) Received: from foss.arm.com (foss.arm.com [217.140.110.172]) by mails.dpdk.org (Postfix) with ESMTP id A20404279F for ; Mon, 28 Oct 2024 18:51:14 +0100 (CET) Received: from usa-sjc-imap-foss1.foss.arm.com (unknown [10.121.207.14]) by usa-sjc-mx-foss1.foss.arm.com (Postfix) with ESMTP id E727B13D5; Mon, 28 Oct 2024 10:51:43 -0700 (PDT) Received: from localhost.localdomain (JR4XG4HTQC.cambridge.arm.com [10.1.31.47]) by usa-sjc-imap-foss1.foss.arm.com (Postfix) with ESMTPA id 9E29E3F66E; Mon, 28 Oct 2024 10:51:13 -0700 
(PDT) From: Luca Vizzarro To: dev@dpdk.org Cc: Paul Szczepanek , Patrick Robb , Luca Vizzarro Subject: [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery Date: Mon, 28 Oct 2024 17:49:42 +0000 Message-ID: <20241028174949.3283701-3-luca.vizzarro@arm.com> X-Mailer: git-send-email 2.43.0 In-Reply-To: <20241028174949.3283701-1-luca.vizzarro@arm.com> References: <20240822163941.1390326-1-luca.vizzarro@arm.com> <20241028174949.3283701-1-luca.vizzarro@arm.com> MIME-Version: 1.0 X-BeenThere: dev@dpdk.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: DPDK patches and discussions List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: dev-bounces@dpdk.org Currently there is no definition that identifies all the test suites available for testing. This change simplifies the process of discovering and identifying all the test suites. Signed-off-by: Luca Vizzarro Reviewed-by: Paul Szczepanek Reviewed-by: Nicholas Pratte --- dts/framework/runner.py | 2 +- dts/framework/test_suite.py | 189 +++++++++++++++++++--- dts/framework/testbed_model/capability.py | 12 +- 3 files changed, 177 insertions(+), 26 deletions(-) diff --git a/dts/framework/runner.py b/dts/framework/runner.py index 8bbe698eaf..195622c653 100644 --- a/dts/framework/runner.py +++ b/dts/framework/runner.py @@ -225,7 +225,7 @@ def _get_test_suites_with_cases( for test_suite_config in test_suite_configs: test_suite_class = self._get_test_suite_class(test_suite_config.test_suite) test_cases: list[type[TestCase]] = [] - func_test_cases, perf_test_cases = test_suite_class.get_test_cases( + func_test_cases, perf_test_cases = test_suite_class.filter_test_cases( test_suite_config.test_cases ) if func: diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py index cbe3b30ffc..936eb2cede 100644 --- a/dts/framework/test_suite.py +++ b/dts/framework/test_suite.py @@ -1,6 +1,7 @@ # SPDX-License-Identifier: BSD-3-Clause # Copyright(c) 2010-2014 Intel Corporation #
Copyright(c) 2023 PANTHEON.tech s.r.o. +# Copyright(c) 2024 Arm Limited """Features common to all test suites. @@ -16,13 +17,20 @@ import inspect from collections import Counter from collections.abc import Callable, Sequence +from dataclasses import dataclass from enum import Enum, auto +from functools import cached_property +from importlib import import_module from ipaddress import IPv4Interface, IPv6Interface, ip_interface +from pkgutil import iter_modules +from types import ModuleType from typing import ClassVar, Protocol, TypeVar, Union, cast +from pydantic.alias_generators import to_pascal from scapy.layers.inet import IP # type: ignore[import-untyped] from scapy.layers.l2 import Ether # type: ignore[import-untyped] from scapy.packet import Packet, Padding, raw # type: ignore[import-untyped] +from typing_extensions import Self from framework.testbed_model.capability import TestProtocol from framework.testbed_model.port import Port @@ -33,7 +41,7 @@ PacketFilteringConfig, ) -from .exception import ConfigurationError, TestCaseVerifyError +from .exception import ConfigurationError, InternalError, TestCaseVerifyError from .logger import DTSLogger, get_dts_logger from .utils import get_packet_summaries @@ -112,10 +120,24 @@ def __init__( self._tg_ip_address_ingress = ip_interface("192.168.101.3/24") @classmethod - def get_test_cases( + def get_test_cases(cls) -> list[type["TestCase"]]: + """A list of all the available test cases.""" + + def is_test_case(function: Callable) -> bool: + if inspect.isfunction(function): + # TestCase is not used at runtime, so we can't use isinstance() with `function`. + # But function.test_type exists. 
+ if hasattr(function, "test_type"): + return isinstance(function.test_type, TestCaseType) + return False + + return [test_case for _, test_case in inspect.getmembers(cls, is_test_case)] + + @classmethod + def filter_test_cases( cls, test_case_sublist: Sequence[str] | None = None ) -> tuple[set[type["TestCase"]], set[type["TestCase"]]]: - """Filter `test_case_subset` from this class. + """Filter `test_case_sublist` from this class. Test cases are regular (or bound) methods decorated with :func:`func_test` or :func:`perf_test`. @@ -129,17 +151,8 @@ def get_test_cases( as methods are bound to instances and this method only has access to the class. Raises: - ConfigurationError: If a test case from `test_case_subset` is not found. + ConfigurationError: If a test case from `test_case_sublist` is not found. """ - - def is_test_case(function: Callable) -> bool: - if inspect.isfunction(function): - # TestCase is not used at runtime, so we can't use isinstance() with `function`. - # But function.test_type exists. 
- if hasattr(function, "test_type"): - return isinstance(function.test_type, TestCaseType) - return False - if test_case_sublist is None: test_case_sublist = [] @@ -149,22 +162,22 @@ def is_test_case(function: Callable) -> bool: func_test_cases = set() perf_test_cases = set() - for test_case_name, test_case_function in inspect.getmembers(cls, is_test_case): - if test_case_name in test_case_sublist_copy: + for test_case in cls.get_test_cases(): + if test_case.name in test_case_sublist_copy: # if test_case_sublist_copy is non-empty, remove the found test case # so that we can look at the remainder at the end - test_case_sublist_copy.remove(test_case_name) + test_case_sublist_copy.remove(test_case.name) elif test_case_sublist: # the original list not being empty means we're filtering test cases - # since we didn't remove test_case_name in the previous branch, + # since we didn't remove test_case.name in the previous branch, # it doesn't match the filter and we don't want to remove it continue - match test_case_function.test_type: + match test_case.test_type: case TestCaseType.PERFORMANCE: - perf_test_cases.add(test_case_function) + perf_test_cases.add(test_case) case TestCaseType.FUNCTIONAL: - func_test_cases.add(test_case_function) + func_test_cases.add(test_case) if test_case_sublist_copy: raise ConfigurationError( @@ -536,6 +549,8 @@ class TestCase(TestProtocol, Protocol[TestSuiteMethodType]): test case function to :class:`TestCase` and sets common variables. 
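As an aside, the tagging mechanism used by the patch can be illustrated with a small stdlib-only sketch (the names here are hypothetical stand-ins, not the framework's actual classes): the decorator stores a `test_type` attribute on the function, and `inspect.getmembers` with a predicate later collects every class member that carries it.

```python
import inspect
from enum import Enum, auto


class TestCaseType(Enum):
    FUNCTIONAL = auto()
    PERFORMANCE = auto()


def func_test(func):
    # Stand-in for the framework's decorator: tag the function so that
    # class introspection can recognize it as a test case later.
    func.test_type = TestCaseType.FUNCTIONAL
    func.name = func.__name__
    return func


class ExampleSuite:
    @func_test
    def test_basic(self):
        pass

    def _helper(self):  # not tagged, so not collected
        pass


def is_test_case(fn) -> bool:
    # Mirrors the predicate in the patch: a plain function carrying test_type.
    return inspect.isfunction(fn) and hasattr(fn, "test_type")


cases = [case for _, case in inspect.getmembers(ExampleSuite, is_test_case)]
print([case.name for case in cases])  # ['test_basic']
```

Untagged helpers and inherited members are skipped because the predicate requires both a plain function and the `test_type` marker.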
""" + #: + name: ClassVar[str] #: test_type: ClassVar[TestCaseType] #: necessary for mypy so that it can treat this class as the function it's shadowing @@ -560,6 +575,7 @@ def make_decorator( def _decorator(func: TestSuiteMethodType) -> type[TestCase]: test_case = cast(type[TestCase], func) + test_case.name = func.__name__ test_case.skip = cls.skip test_case.skip_reason = cls.skip_reason test_case.required_capabilities = set() @@ -575,3 +591,136 @@ def _decorator(func: TestSuiteMethodType) -> type[TestCase]: func_test: Callable = TestCase.make_decorator(TestCaseType.FUNCTIONAL) #: The decorator for performance test cases. perf_test: Callable = TestCase.make_decorator(TestCaseType.PERFORMANCE) + + +@dataclass +class TestSuiteSpec: + """A class defining the specification of a test suite. + + Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is + provided to automatically discover all the available test suites. + + Attributes: + module_name: The name of the test suite's module. 
+ """ + + #: + TEST_SUITES_PACKAGE_NAME = "tests" + #: + TEST_SUITE_MODULE_PREFIX = "TestSuite_" + #: + TEST_SUITE_CLASS_PREFIX = "Test" + #: + TEST_CASE_METHOD_PREFIX = "test_" + #: + FUNC_TEST_CASE_REGEX = r"test_(?!perf_)" + #: + PERF_TEST_CASE_REGEX = r"test_perf_" + + module_name: str + + @cached_property + def name(self) -> str: + """The name of the test suite's module.""" + return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :] + + @cached_property + def module(self) -> ModuleType: + """A reference to the test suite's module.""" + return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}") + + @cached_property + def class_name(self) -> str: + """The name of the test suite's class.""" + return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}" + + @cached_property + def class_obj(self) -> type[TestSuite]: + """A reference to the test suite's class.""" + + def is_test_suite(obj) -> bool: + """Check whether `obj` is a :class:`TestSuite`. + + The `obj` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself. + + Args: + obj: The object to be checked. + + Returns: + :data:`True` if `obj` is a subclass of `TestSuite`. + """ + try: + if issubclass(obj, TestSuite) and obj is not TestSuite: + return True + except TypeError: + return False + return False + + for class_name, class_obj in inspect.getmembers(self.module, is_test_suite): + if class_name == self.class_name: + return class_obj + + raise InternalError( + f"Expected class {self.class_name} not found in module {self.module_name}." + ) + + @classmethod + def discover_all( + cls, package_name: str | None = None, module_prefix: str | None = None + ) -> list[Self]: + """Discover all the test suites. + + The test suites are discovered in the provided `package_name`. The full module name, + expected under that package, is prefixed with `module_prefix`. + The module name is a standard filename with words separated with underscores. 
+ For each module found, search for a :class:`TestSuite` class which starts + with :attr:`~TestSuiteSpec.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in + PascalCase. + + The PascalCase convention applies to abbreviations, acronyms, initialisms and so on:: + + OS -> Os + TCP -> Tcp + + Args: + package_name: The name of the package where to find the test suites. If :data:`None`, + the :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used. + module_prefix: The name prefix defining the test suite module. If :data:`None`, the + :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` constant is used. + + Returns: + A list containing all the discovered test suites. + """ + if package_name is None: + package_name = cls.TEST_SUITES_PACKAGE_NAME + if module_prefix is None: + module_prefix = cls.TEST_SUITE_MODULE_PREFIX + + test_suites = [] + + test_suites_pkg = import_module(package_name) + for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__): + if not module_name.startswith(module_prefix) or is_pkg: + continue + + test_suite = cls(module_name) + try: + if test_suite.class_obj: + test_suites.append(test_suite) + except InternalError as err: + get_dts_logger().warning(err) + + return test_suites + + +AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all() +"""Constant to store all the available, discovered and imported test suites. + +The test suites should be gathered from this list to avoid importing more than once. 
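The module-name-to-class-name convention described above can be sketched with the stdlib alone. The `to_pascal` helper below only mimics `pydantic.alias_generators.to_pascal` for snake_case inputs, and the module names are illustrative:

```python
# Minimal mimic of pydantic's to_pascal, sufficient for snake_case names:
# each underscore-separated word is capitalized, so "os" -> "Os", "tcp" -> "Tcp".
def to_pascal(snake: str) -> str:
    return "".join(word.capitalize() for word in snake.split("_"))


TEST_SUITE_MODULE_PREFIX = "TestSuite_"
TEST_SUITE_CLASS_PREFIX = "Test"


def expected_class_name(module_name: str) -> str:
    # Strip the module prefix, then build the expected class name,
    # mirroring TestSuiteSpec.class_name in the patch.
    name = module_name[len(TEST_SUITE_MODULE_PREFIX):]
    return f"{TEST_SUITE_CLASS_PREFIX}{to_pascal(name)}"


print(expected_class_name("TestSuite_os_udp"))       # TestOsUdp
print(expected_class_name("TestSuite_smoke_tests"))  # TestSmokeTests
```

Note how the acronym rule falls out of plain word capitalization: `os_udp` becomes `OsUdp`, never `OSUDP`.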
+""" + + +def find_by_name(name: str) -> TestSuiteSpec | None: + """Find a requested test suite by name from the available ones.""" + test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES) + return next(test_suites, None) diff --git a/dts/framework/testbed_model/capability.py b/dts/framework/testbed_model/capability.py index 2207957a7a..0d5f0e0b32 100644 --- a/dts/framework/testbed_model/capability.py +++ b/dts/framework/testbed_model/capability.py @@ -47,9 +47,9 @@ def test_scatter_mbuf_2048(self): import inspect from abc import ABC, abstractmethod -from collections.abc import MutableSet, Sequence +from collections.abc import MutableSet from dataclasses import dataclass -from typing import Callable, ClassVar, Protocol +from typing import TYPE_CHECKING, Callable, ClassVar, Protocol from typing_extensions import Self @@ -66,6 +66,9 @@ def test_scatter_mbuf_2048(self): from .sut_node import SutNode from .topology import Topology, TopologyType +if TYPE_CHECKING: + from framework.test_suite import TestCase + class Capability(ABC): """The base class for various capabilities. @@ -354,8 +357,7 @@ def set_required(self, test_case_or_suite: type["TestProtocol"]) -> None: if inspect.isclass(test_case_or_suite): if self.topology_type is not TopologyType.default: self.add_to_required(test_case_or_suite) - func_test_cases, perf_test_cases = test_case_or_suite.get_test_cases() - for test_case in func_test_cases | perf_test_cases: + for test_case in test_case_or_suite.get_test_cases(): if test_case.topology_type.topology_type is TopologyType.default: # test case topology has not been set, use the one set by the test suite self.add_to_required(test_case) @@ -446,7 +448,7 @@ class TestProtocol(Protocol): required_capabilities: ClassVar[set[Capability]] = set() @classmethod - def get_test_cases(cls, test_case_sublist: Sequence[str] | None = None) -> tuple[set, set]: + def get_test_cases(cls) -> list[type["TestCase"]]: """Get test cases. 
Should be implemented by subclasses containing test cases. Raises: From patchwork Mon Oct 28 17:49:43 2024 X-Patchwork-Submitter: Luca Vizzarro X-Patchwork-Id: 147552 X-Patchwork-Delegate: paul.szczepanek@arm.com From: Luca Vizzarro To: dev@dpdk.org Cc: Paul Szczepanek , Patrick Robb , Luca Vizzarro Subject: [PATCH v4 3/8] dts: refactor build and node info classes Date: Mon, 28 Oct 2024 17:49:43 +0000 Message-ID: <20241028174949.3283701-4-luca.vizzarro@arm.com> In-Reply-To: <20241028174949.3283701-1-luca.vizzarro@arm.com> References: <20240822163941.1390326-1-luca.vizzarro@arm.com> <20241028174949.3283701-1-luca.vizzarro@arm.com> The DPDKBuildInfo and NodeInfo classes, representing information gathered at runtime,
were erroneously placed in the configuration package. This moves them into more appropriate modules. NodeInfo, specifically, is moved to os_session instead of node, mostly as a consequence of circular dependencies. Given that os_session is the top-most module to reference it, it appears to be the most suitable place outside of node. Signed-off-by: Luca Vizzarro Reviewed-by: Paul Szczepanek --- dts/framework/config/__init__.py | 31 -------------------- dts/framework/test_result.py | 4 ++- dts/framework/testbed_model/os_session.py | 21 ++++++++++++- dts/framework/testbed_model/posix_session.py | 4 +-- dts/framework/testbed_model/sut_node.py | 18 ++++++++++-- 5 files changed, 40 insertions(+), 38 deletions(-) diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py index d0d95d00c7..7403ccbf14 100644 --- a/dts/framework/config/__init__.py +++ b/dts/framework/config/__init__.py @@ -318,24 +318,6 @@ class TGNodeConfiguration(NodeConfiguration): traffic_generator: TrafficGeneratorConfig -@dataclass(slots=True, frozen=True) -class NodeInfo: - """Supplemental node information. - - Attributes: - os_name: The name of the running operating system of - the :class:`~framework.testbed_model.node.Node`. - os_version: The version of the running operating system of - the :class:`~framework.testbed_model.node.Node`. - kernel_version: The kernel version of the running operating system of - the :class:`~framework.testbed_model.node.Node`. - """ - - os_name: str - os_version: str - kernel_version: str - - @dataclass(slots=True, frozen=True) class DPDKBuildConfiguration: """DPDK build configuration. @@ -493,19 +475,6 @@ def from_dict(cls, d: DPDKConfigurationDict) -> Self: ) -@dataclass(slots=True, frozen=True) -class DPDKBuildInfo: - """Various versions and other information about a DPDK build. - - Attributes: - dpdk_version: The DPDK version that was built. - compiler_version: The version of the compiler used to build DPDK.
- """ - - dpdk_version: str | None - compiler_version: str | None - - @dataclass(slots=True, frozen=True) class TestSuiteConfig: """Test suite configuration. diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py index 00263ad69e..d2f3a90eed 100644 --- a/dts/framework/test_result.py +++ b/dts/framework/test_result.py @@ -30,11 +30,13 @@ from framework.testbed_model.capability import Capability -from .config import DPDKBuildInfo, NodeInfo, TestRunConfiguration, TestSuiteConfig +from .config import TestRunConfiguration, TestSuiteConfig from .exception import DTSError, ErrorSeverity from .logger import DTSLogger from .settings import SETTINGS from .test_suite import TestCase, TestSuite +from .testbed_model.os_session import NodeInfo +from .testbed_model.sut_node import DPDKBuildInfo @dataclass(slots=True, frozen=True) diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py index 6194ddb989..5f087f40d6 100644 --- a/dts/framework/testbed_model/os_session.py +++ b/dts/framework/testbed_model/os_session.py @@ -24,11 +24,12 @@ """ from abc import ABC, abstractmethod from collections.abc import Iterable +from dataclasses import dataclass from ipaddress import IPv4Interface, IPv6Interface from pathlib import Path, PurePath, PurePosixPath from typing import Union -from framework.config import Architecture, NodeConfiguration, NodeInfo +from framework.config import Architecture, NodeConfiguration from framework.logger import DTSLogger from framework.remote_session import ( InteractiveRemoteSession, @@ -44,6 +45,24 @@ from .port import Port +@dataclass(slots=True, frozen=True) +class NodeInfo: + """Supplemental node information. + + Attributes: + os_name: The name of the running operating system of + the :class:`~framework.testbed_model.node.Node`. + os_version: The version of the running operating system of + the :class:`~framework.testbed_model.node.Node`. 
+ kernel_version: The kernel version of the running operating system of + the :class:`~framework.testbed_model.node.Node`. + """ + + os_name: str + os_version: str + kernel_version: str + + class OSSession(ABC): """OS-unaware to OS-aware translation API definition. diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py index 5ab7c18fb7..0d3abbc519 100644 --- a/dts/framework/testbed_model/posix_session.py +++ b/dts/framework/testbed_model/posix_session.py @@ -15,7 +15,7 @@ from collections.abc import Iterable from pathlib import Path, PurePath, PurePosixPath -from framework.config import Architecture, NodeInfo +from framework.config import Architecture from framework.exception import DPDKBuildError, RemoteCommandExecutionError from framework.settings import SETTINGS from framework.utils import ( @@ -26,7 +26,7 @@ extract_tarball, ) -from .os_session import OSSession +from .os_session import NodeInfo, OSSession class PosixSession(OSSession): diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py index e160386324..a6c42b548c 100644 --- a/dts/framework/testbed_model/sut_node.py +++ b/dts/framework/testbed_model/sut_node.py @@ -14,13 +14,12 @@ import os import time +from dataclasses import dataclass from pathlib import PurePath from framework.config import ( DPDKBuildConfiguration, - DPDKBuildInfo, DPDKLocation, - NodeInfo, SutNodeConfiguration, TestRunConfiguration, ) @@ -30,10 +29,23 @@ from framework.utils import MesonArgs, TarCompressionFormat from .node import Node -from .os_session import OSSession +from .os_session import NodeInfo, OSSession from .virtual_device import VirtualDevice +@dataclass(slots=True, frozen=True) +class DPDKBuildInfo: + """Various versions and other information about a DPDK build. + + Attributes: + dpdk_version: The DPDK version that was built. + compiler_version: The version of the compiler used to build DPDK. 
+ """ + dpdk_version: str | None + compiler_version: str | None + + class SutNode(Node): """The system under test node. From patchwork Mon Oct 28 17:49:44 2024 X-Patchwork-Submitter: Luca Vizzarro X-Patchwork-Id: 147553 X-Patchwork-Delegate: paul.szczepanek@arm.com From: Luca Vizzarro To: dev@dpdk.org Cc: Paul Szczepanek , Patrick Robb , Luca Vizzarro Subject: [PATCH v4 4/8] dts: use pydantic in the configuration Date: Mon, 28 Oct 2024 17:49:44 +0000 Message-ID: <20241028174949.3283701-5-luca.vizzarro@arm.com> In-Reply-To: <20241028174949.3283701-1-luca.vizzarro@arm.com> References: <20240822163941.1390326-1-luca.vizzarro@arm.com> <20241028174949.3283701-1-luca.vizzarro@arm.com> This change brings in pydantic
in place of warlock. Pydantic offers a built-in model validation system in the classes, which allows for a more resilient and simpler code. As a consequence of this change: - most validation is now built-in - further validation is added to verify: - cross referencing of node names and ports - test suite and test cases names - dictionaries representing the config schema are removed - the config schema is no longer used and therefore dropped - the TrafficGeneratorType enum has been changed from inheriting StrEnum to the native str and Enum. This change was necessary to enable the discriminator for object unions - the structure of the classes has been slightly changed to perfectly match the structure of the configuration files - the test suite argument catches the ValidationError that TestSuiteConfig can now raise - the DPDK location has been wrapped under another configuration mapping `dpdk_location` - the DPDK locations are now structured and enforced by classes, further simplifying the validation and handling thanks to pattern matching Bugzilla ID: 1508 Signed-off-by: Luca Vizzarro Reviewed-by: Paul Szczepanek Reviewed-by: Nicholas Pratte --- doc/api/dts/conf_yaml_schema.json | 1 - doc/api/dts/framework.config.rst | 6 - doc/api/dts/framework.config.types.rst | 8 - dts/conf.yaml | 11 +- dts/framework/config/__init__.py | 822 +++++++++--------- dts/framework/config/conf_yaml_schema.json | 459 ---------- dts/framework/config/types.py | 149 ---- dts/framework/runner.py | 57 +- dts/framework/settings.py | 124 +-- dts/framework/testbed_model/node.py | 15 +- dts/framework/testbed_model/os_session.py | 4 +- dts/framework/testbed_model/port.py | 4 +- dts/framework/testbed_model/posix_session.py | 10 +- dts/framework/testbed_model/sut_node.py | 182 ++-- dts/framework/testbed_model/topology.py | 11 +- .../traffic_generator/__init__.py | 4 +- .../traffic_generator/traffic_generator.py | 2 +- dts/framework/utils.py | 2 +- dts/tests/TestSuite_smoke_tests.py | 2 +- 19 files 
changed, 653 insertions(+), 1220 deletions(-) delete mode 120000 doc/api/dts/conf_yaml_schema.json delete mode 100644 doc/api/dts/framework.config.types.rst delete mode 100644 dts/framework/config/conf_yaml_schema.json delete mode 100644 dts/framework/config/types.py diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json deleted file mode 120000 index 5978642d76..0000000000 --- a/doc/api/dts/conf_yaml_schema.json +++ /dev/null @@ -1 +0,0 @@ -../../../dts/framework/config/conf_yaml_schema.json \ No newline at end of file diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst index 261997aefa..cc266276c1 100644 --- a/doc/api/dts/framework.config.rst +++ b/doc/api/dts/framework.config.rst @@ -6,9 +6,3 @@ config - Configuration Package .. automodule:: framework.config :members: :show-inheritance: - -.. toctree:: - :hidden: - :maxdepth: 1 - - framework.config.types diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst deleted file mode 100644 index a50a0c874a..0000000000 --- a/doc/api/dts/framework.config.types.rst +++ /dev/null @@ -1,8 +0,0 @@ -.. SPDX-License-Identifier: BSD-3-Clause - -config.types - Configuration Types -================================== - -.. automodule:: framework.config.types - :members: - :show-inheritance: diff --git a/dts/conf.yaml b/dts/conf.yaml index 8a65a481d6..2496262854 100644 --- a/dts/conf.yaml +++ b/dts/conf.yaml @@ -5,11 +5,12 @@ test_runs: # define one test run environment - dpdk_build: - # dpdk_tree: Commented out because `tarball` is defined. - tarball: dpdk-tarball.tar.xz - # Either `dpdk_tree` or `tarball` can be defined, but not both. - remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` - # is located on the SUT node, instead of the execution host. + dpdk_location: + # dpdk_tree: Commented out because `tarball` is defined. 
+ tarball: dpdk-tarball.tar.xz + # Either `dpdk_tree` or `tarball` can be defined, but not both. + remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` + # is located on the SUT node, instead of the execution host. # precompiled_build_dir: Commented out because `build_options` is defined. build_options: diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py index 7403ccbf14..c86bfaaabf 100644 --- a/dts/framework/config/__init__.py +++ b/dts/framework/config/__init__.py @@ -2,17 +2,18 @@ # Copyright(c) 2010-2021 Intel Corporation # Copyright(c) 2022-2023 University of New Hampshire # Copyright(c) 2023 PANTHEON.tech s.r.o. +# Copyright(c) 2024 Arm Limited """Testbed configuration and test suite specification. This package offers classes that hold real-time information about the testbed, hold test run configuration describing the tested testbed and a loader function, :func:`load_config`, which loads -the YAML test run configuration file -and validates it according to :download:`the schema `. +the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic +model. The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout -this package. The allowed keys and types inside this dictionary are defined in -the :doc:`types ` module. +this package. The allowed keys and types inside this dictionary map directly to the +:class:`Configuration` model, its fields and sub-models. The test run configuration has two main sections: @@ -24,39 +25,28 @@ The real-time information about testbed is supposed to be gathered at runtime. -The classes defined in this package make heavy use of :mod:`dataclasses`. -All of them use slots and are frozen: +The classes defined in this package make heavy use of :mod:`pydantic`. 
+Nearly all of them are frozen: - * Slots enables some optimizations, by pre-allocating space for the defined - attributes in the underlying data structure, * Frozen makes the object immutable. This enables further optimizations, and makes it thread safe should we ever want to move in that direction. """ -import json -import os.path import tarfile -from dataclasses import dataclass, fields -from enum import auto, unique -from pathlib import Path -from typing import Union +from enum import Enum, auto, unique +from functools import cached_property +from pathlib import Path, PurePath +from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple -import warlock # type: ignore[import-untyped] import yaml +from pydantic import BaseModel, Field, ValidationError, field_validator, model_validator from typing_extensions import Self -from framework.config.types import ( - ConfigurationDict, - DPDKBuildConfigDict, - DPDKConfigurationDict, - NodeConfigDict, - PortConfigDict, - TestRunConfigDict, - TestSuiteConfigDict, - TrafficGeneratorConfigDict, -) from framework.exception import ConfigurationError -from framework.utils import StrEnum +from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum + +if TYPE_CHECKING: + from framework.test_suite import TestSuiteSpec @unique @@ -118,15 +108,14 @@ class Compiler(StrEnum): @unique -class TrafficGeneratorType(StrEnum): +class TrafficGeneratorType(str, Enum): """The supported traffic generators.""" #: - SCAPY = auto() + SCAPY = "SCAPY" -@dataclass(slots=True, frozen=True) -class HugepageConfiguration: +class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"): r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s. Attributes: @@ -138,12 +127,10 @@ class HugepageConfiguration: force_first_numa: bool -@dataclass(slots=True, frozen=True) -class PortConfig: +class PortConfig(BaseModel, frozen=True, extra="forbid"): r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s. 
Attributes: - node: The :class:`~framework.testbed_model.node.Node` where this port exists. pci: The PCI address of the port. os_driver_for_dpdk: The operating system driver name for use with DPDK. os_driver: The operating system driver name when the operating system controls the port. @@ -152,70 +139,57 @@ class PortConfig: peer_pci: The PCI address of the port connected to this port. """ - node: str - pci: str - os_driver_for_dpdk: str - os_driver: str - peer_node: str - peer_pci: str - - @classmethod - def from_dict(cls, node: str, d: PortConfigDict) -> Self: - """A convenience method that creates the object from fewer inputs. - - Args: - node: The node where this port exists. - d: The configuration dictionary. - - Returns: - The port configuration instance. - """ - return cls(node=node, **d) - - -@dataclass(slots=True, frozen=True) -class TrafficGeneratorConfig: - """The configuration of traffic generators. - - The class will be expanded when more configuration is needed. + pci: str = Field( + description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS + ) + os_driver_for_dpdk: str = Field( + description="The driver that the kernel should bind this device to for DPDK to use it.", + examples=["vfio-pci", "mlx5_core"], + ) + os_driver: str = Field( + description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"] + ) + peer_node: str = Field(description="The name of the peer node this port is connected to.") + peer_pci: str = Field( + description="The PCI address of the peer port this port is connected to.", + pattern=REGEX_FOR_PCI_ADDRESS, + ) + + +class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"): + """A protocol required to define traffic generator types. Attributes: - traffic_generator_type: The type of the traffic generator. + type: The traffic generator type, the child class is required to define to be distinguished + among others. 
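The patch constrains the `LogicalCores` field with a regular expression; that pattern can be exercised on its own with the stdlib `re` module (the sample values below are illustrative):

```python
import re

# The lcore-list pattern from the LogicalCores field in the patch:
# comma-separated numbers or number ranges; an empty string means
# "use all lcores".
LCORE_RE = re.compile(r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$")

for value in ("1,2,3,4,5,18-22", "10-15", "", "1,,2", "1-"):
    print(f"{value!r}: {bool(LCORE_RE.fullmatch(value))}")
```

Malformed inputs such as an empty list element (`1,,2`) or a dangling range (`1-`) are rejected before any node configuration is built, which is exactly the kind of validation pydantic runs automatically on model construction.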
""" - traffic_generator_type: TrafficGeneratorType + type: TrafficGeneratorType - @staticmethod - def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig": - """A convenience method that produces traffic generator config of the proper type. - Args: - d: The configuration dictionary. +class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="forbid"): + """Scapy traffic generator specific configuration.""" - Returns: - The traffic generator configuration instance. + type: Literal[TrafficGeneratorType.SCAPY] - Raises: - ConfigurationError: An unknown traffic generator type was encountered. - """ - match TrafficGeneratorType(d["type"]): - case TrafficGeneratorType.SCAPY: - return ScapyTrafficGeneratorConfig( - traffic_generator_type=TrafficGeneratorType.SCAPY - ) - case _: - raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".') +#: A union type discriminating traffic generators by the `type` field. +TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")] -@dataclass(slots=True, frozen=True) -class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig): - """Scapy traffic generator specific configuration.""" - pass +#: A field representing logical core ranges. +LogicalCores = Annotated[ + str, + Field( + description="Comma-separated list of logical cores to use. " + "An empty string means use all lcores.", + examples=["1,2,3,4,5,18-22", "10-15"], + pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$", + ), +] -@dataclass(slots=True, frozen=True) -class NodeConfiguration: +class NodeConfiguration(BaseModel, frozen=True, extra="forbid"): r"""The configuration of :class:`~framework.testbed_model.node.Node`\s. Attributes: @@ -234,285 +208,317 @@ class NodeConfiguration: ports: The ports that can be used in testing. 
""" - name: str - hostname: str - user: str - password: str | None + name: str = Field(description="A unique identifier for this node.") + hostname: str = Field(description="The hostname or IP address of the node.") + user: str = Field(description="The login user to use to connect to this node.") + password: str | None = Field( + default=None, + description="The login password to use to connect to this node. " + "SSH keys are STRONGLY preferred, use only as last resort.", + ) arch: Architecture os: OS - lcores: str - use_first_core: bool - hugepages: HugepageConfiguration | None - ports: list[PortConfig] - - @staticmethod - def from_dict( - d: NodeConfigDict, - ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]: - """A convenience method that processes the inputs before creating a specialized instance. - - Args: - d: The configuration dictionary. - - Returns: - Either an SUT or TG configuration instance. - """ - hugepage_config = None - if "hugepages_2mb" in d: - hugepage_config_dict = d["hugepages_2mb"] - if "force_first_numa" not in hugepage_config_dict: - hugepage_config_dict["force_first_numa"] = False - hugepage_config = HugepageConfiguration(**hugepage_config_dict) - - # The calls here contain duplicated code which is here because Mypy doesn't - # properly support dictionary unpacking with TypedDicts - if "traffic_generator" in d: - return TGNodeConfiguration( - name=d["name"], - hostname=d["hostname"], - user=d["user"], - password=d.get("password"), - arch=Architecture(d["arch"]), - os=OS(d["os"]), - lcores=d.get("lcores", "1"), - use_first_core=d.get("use_first_core", False), - hugepages=hugepage_config, - ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]], - traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]), - ) - else: - return SutNodeConfiguration( - name=d["name"], - hostname=d["hostname"], - user=d["user"], - password=d.get("password"), - arch=Architecture(d["arch"]), - os=OS(d["os"]), - 
lcores=d.get("lcores", "1"), - use_first_core=d.get("use_first_core", False), - hugepages=hugepage_config, - ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]], - memory_channels=d.get("memory_channels", 1), - ) + lcores: LogicalCores = "1" + use_first_core: bool = Field( + default=False, description="DPDK won't use the first physical core if set to False." + ) + hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb") + ports: list[PortConfig] = Field(min_length=1) -@dataclass(slots=True, frozen=True) -class SutNodeConfiguration(NodeConfiguration): +class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"): """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration. Attributes: memory_channels: The number of memory channels to use when running DPDK. """ - memory_channels: int + memory_channels: int = Field( + default=1, description="Number of memory channels to use when running DPDK." + ) -@dataclass(slots=True, frozen=True) -class TGNodeConfiguration(NodeConfiguration): +class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"): """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration. Attributes: traffic_generator: The configuration of the traffic generator present on the TG node. """ - traffic_generator: TrafficGeneratorConfig + traffic_generator: TrafficGeneratorConfigTypes + + +#: Union type for all the node configuration types. +NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration -@dataclass(slots=True, frozen=True) -class DPDKBuildConfiguration: - """DPDK build configuration. +def resolve_path(path: Path) -> Path: + """Resolve a path into a real path.""" + return path.resolve() - The configuration used for building DPDK. + +class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"): + """DPDK location. + + The path to the DPDK sources, build dir and type of location. Attributes: - arch: The target architecture to build for. 
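The `lcores` field above is validated by the `LogicalCores` pattern, which admits comma-separated lcore IDs and ID ranges, or an empty string meaning all lcores. A stdlib-only sketch of what the pattern accepts:

```python
import re

# The pattern from the LogicalCores Annotated field.
LCORES_PATTERN = r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$"


def is_valid_lcores(value: str) -> bool:
    """Check a logical core list the same way the pydantic field pattern does."""
    return re.fullmatch(LCORES_PATTERN, value) is not None


print(is_valid_lcores("1,2,3,4,5,18-22"))  # True
print(is_valid_lcores(""))                 # True: empty string means use all lcores
print(is_valid_lcores("1,,2"))             # False: empty list item
```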
- os: The target os to build for. - cpu: The target CPU to build for. - compiler: The compiler executable to use. - compiler_wrapper: This string will be put in front of the compiler when - executing the build. Useful for adding wrapper commands, such as ``ccache``. - name: The name of the compiler. + remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is + located on the SUT node, instead of the execution host. """ - arch: Architecture - os: OS - cpu: CPUType - compiler: Compiler - compiler_wrapper: str - name: str + remote: bool = False - @classmethod - def from_dict(cls, d: DPDKBuildConfigDict) -> Self: - r"""A convenience method that processes the inputs before creating an instance. - `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and - `name` is constructed from `arch`, `os`, `cpu` and `compiler`. +class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"): + """Local DPDK location parent class. - Args: - d: The configuration dictionary. + This class is meant to represent any location that is present only locally. + """ - Returns: - The DPDK build configuration instance. - """ - return cls( - arch=Architecture(d["arch"]), - os=OS(d["os"]), - cpu=CPUType(d["cpu"]), - compiler=Compiler(d["compiler"]), - compiler_wrapper=d.get("compiler_wrapper", ""), - name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}", - ) + remote: Literal[False] = False -@dataclass(slots=True, frozen=True) -class DPDKLocation: - """DPDK location. +class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"): + """Local DPDK tree location. - The path to the DPDK sources, build dir and type of location. + This class makes a distinction from :class:`RemoteDPDKTreeLocation` by enforcing on the fly + validation. Attributes: - dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball` - must be provided. - tarball: The path to the DPDK tarball. 
Only one of `dpdk_tree` or `tarball` must be - provided. - remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is - located on the SUT node, instead of the execution host. - build_dir: If it's defined, DPDK has been pre-compiled and the build directory is located in - a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will be using - `build_options` from configuration to build the DPDK from source. + dpdk_tree: The path to the DPDK source tree directory. """ - dpdk_tree: str | None - tarball: str | None - remote: bool - build_dir: str | None + dpdk_tree: Path - @classmethod - def from_dict(cls, d: DPDKConfigurationDict) -> Self: - """A convenience method that processes and validates the inputs before creating an instance. + #: Resolve the local DPDK tree path + resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path) - Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if - `remote` is False. + @model_validator(mode="after") + def validate_dpdk_tree_path(self) -> Self: + """Validate the provided DPDK tree path.""" + assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem." + assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory." + return self - Args: - d: The configuration dictionary. - Returns: - The DPDK location instance. +class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"): + """Local DPDK tarball location. - Raises: - ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem or they - aren't in the right format. - """ - dpdk_tree = d.get("dpdk_tree") - tarball = d.get("tarball") - remote = d.get("remote", False) - - if not remote: - if dpdk_tree: - if not Path(dpdk_tree).exists(): - raise ConfigurationError( - f"DPDK tree '{dpdk_tree}' not found in local filesystem." 
- ) - - if not Path(dpdk_tree).is_dir(): - raise ConfigurationError(f"The DPDK tree '{dpdk_tree}' must be a directory.") - - dpdk_tree = os.path.realpath(dpdk_tree) - - if tarball: - if not Path(tarball).exists(): - raise ConfigurationError( - f"DPDK tarball '{tarball}' not found in local filesystem." - ) - - if not tarfile.is_tarfile(tarball): - raise ConfigurationError( - f"The DPDK tarball '{tarball}' must be a valid tar archive." - ) - - return cls( - dpdk_tree=dpdk_tree, - tarball=tarball, - remote=remote, - build_dir=d.get("precompiled_build_dir"), - ) + This class makes a distinction from :class:`RemoteDPDKTarballLocation` by enforcing on the fly + validation. + + Attributes: + tarball: The path to the DPDK tarball. + """ + tarball: Path -@dataclass -class DPDKConfiguration: - """The configuration of the DPDK build. + #: Resolve the local tarball path + resolve_tarball_path = field_validator("tarball")(resolve_path) - The configuration contain the location of the DPDK and configuration used for - building it. + @model_validator(mode="after") + def validate_tarball_path(self) -> Self: + """Validate the provided tarball.""" + assert self.tarball.exists(), "DPDK tarball not found in local filesystem." + assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive." + return self + + +class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"): + """Remote DPDK location parent class. + + This class is meant to represent any location that is present only remotely. + """ + + remote: Literal[True] = True + + +class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"): + """Remote DPDK tree location. + + This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on the fly validation. + + Attributes: + dpdk_tree: The path to the DPDK source tree directory. 
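The local-location classes above replace the manual `ConfigurationError` checks with a reusable `field_validator` (path resolution) plus an after-mode `model_validator` whose failed assertions pydantic wraps into a `ValidationError`. A standalone sketch of the pattern, assuming an illustrative `TreeLocation` model:

```python
from pathlib import Path

from pydantic import BaseModel, ValidationError, field_validator, model_validator


def resolve_path(path: Path) -> Path:
    """Resolve a path into a real path."""
    return path.resolve()


class TreeLocation(BaseModel, frozen=True, extra="forbid"):
    dpdk_tree: Path

    # Reuse a plain function as a field validator, as the patch does.
    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)

    @model_validator(mode="after")
    def validate_dpdk_tree_path(self) -> "TreeLocation":
        # Failed assertions become ValidationError entries, not AssertionError.
        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
        return self


loc = TreeLocation(dpdk_tree=Path("."))
print(loc.dpdk_tree.is_absolute())  # True: the field validator resolved it
```

Validation happens at model construction, so an invalid path is rejected as soon as the configuration is parsed rather than when the path is first used.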
+ """ + + dpdk_tree: PurePath + + +class RemoteDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"): + """Remote DPDK tarball location. + + This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on the fly + validation. + + Attributes: + tarball: The path to the DPDK tarball. + """ + + tarball: PurePath + + +#: Union type for different DPDK locations +DPDKLocation = ( + LocalDPDKTreeLocation + | LocalDPDKTarballLocation + | RemoteDPDKTreeLocation + | RemoteDPDKTarballLocation +) + + +class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"): + """The base configuration for different types of build. + + The configuration contain the location of the DPDK and configuration used for building it. Attributes: dpdk_location: The location of the DPDK tree. - dpdk_build_config: A DPDK build configuration to test. If :data:`None`, - DTS will use pre-built DPDK from `build_dir` in a :class:`DPDKLocation`. """ dpdk_location: DPDKLocation - dpdk_build_config: DPDKBuildConfiguration | None - @classmethod - def from_dict(cls, d: DPDKConfigurationDict) -> Self: - """A convenience method that processes the inputs before creating an instance. - Args: - d: The configuration dictionary. +class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"): + """DPDK precompiled build configuration. - Returns: - The DPDK configuration. - """ - return cls( - dpdk_location=DPDKLocation.from_dict(d), - dpdk_build_config=( - DPDKBuildConfiguration.from_dict(d["build_options"]) - if d.get("build_options") - else None - ), - ) + Attributes: + precompiled_build_dir: If it's defined, DPDK has been pre-compiled and the build directory + is located in a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will + be using `dpdk_build_config` from configuration to build the DPDK from source. 
+ """ + + precompiled_build_dir: str = Field(min_length=1) + + +class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"): + """DPDK build options configuration. + + The build options used for building DPDK. + + Attributes: + arch: The target architecture to build for. + os: The target os to build for. + cpu: The target CPU to build for. + compiler: The compiler executable to use. + compiler_wrapper: This string will be put in front of the compiler when executing the build. + Useful for adding wrapper commands, such as ``ccache``. + """ + + arch: Architecture + os: OS + cpu: CPUType + compiler: Compiler + compiler_wrapper: str = "" + @cached_property + def name(self) -> str: + """The name of the compiler.""" + return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}" -@dataclass(slots=True, frozen=True) -class TestSuiteConfig: + +class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"): + """DPDK uncompiled build configuration. + + Attributes: + build_options: The build options to compile DPDK. + """ + + build_options: DPDKBuildOptionsConfiguration + + +#: Union type for different build configurations +DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration + + +class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"): """Test suite configuration. - Information about a single test suite to be executed. + Information about a single test suite to be executed. This can also be represented as a string + instead of a mapping, example: + + .. code:: yaml + + test_runs: + - test_suites: + # As string representation: + - hello_world # test all of `hello_world`, or + - hello_world hello_world_single_core # test only `hello_world_single_core` + # or as model fields: + - test_suite: hello_world + test_cases: [hello_world_single_core] # without this field all test cases are run Attributes: - test_suite: The name of the test suite module without the starting ``TestSuite_``. 
- test_cases: The names of test cases from this test suite to execute. + test_suite_name: The name of the test suite module without the starting ``TestSuite_``. + test_cases_names: The names of test cases from this test suite to execute. If empty, all test cases will be executed. """ - test_suite: str - test_cases: list[str] - + test_suite_name: str = Field( + title="Test suite name", + description="The identifying module name of the test suite without the prefix.", + alias="test_suite", + ) + test_cases_names: list[str] = Field( + default_factory=list, + title="Test cases by name", + description="The identifying name of the test cases of the test suite.", + alias="test_cases", + ) + + @cached_property + def test_suite_spec(self) -> "TestSuiteSpec": + """The specification of the requested test suite.""" + from framework.test_suite import find_by_name + + test_suite_spec = find_by_name(self.test_suite_name) + assert ( + test_suite_spec is not None + ), f"{self.test_suite_name} is not a valid test suite module name." + return test_suite_spec + + @model_validator(mode="before") @classmethod - def from_dict( - cls, - entry: str | TestSuiteConfigDict, - ) -> Self: - """Create an instance from two different types. + def convert_from_string(cls, data: Any) -> Any: + """Convert the string representation of the model into a valid mapping.""" + if isinstance(data, str): + [test_suite, *test_cases] = data.split() + return dict(test_suite=test_suite, test_cases=test_cases) + return data + + @model_validator(mode="after") + def validate_names(self) -> Self: + """Validate the supplied test suite and test cases names. + + This validator relies on the cached property `test_suite_spec` to run for the first + time in this call, therefore triggering the assertions if needed. 
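The `convert_from_string` before-mode validator above is what lets a test suite be written either as a mapping or as a bare string in the YAML: before field validation runs, a string is split into the suite name and its test cases. A minimal standalone sketch of that conversion:

```python
from typing import Any

from pydantic import BaseModel, Field, model_validator


class SuiteConfig(BaseModel, frozen=True):
    test_suite_name: str = Field(alias="test_suite")
    test_cases_names: list[str] = Field(default_factory=list, alias="test_cases")

    @model_validator(mode="before")
    @classmethod
    def convert_from_string(cls, data: Any) -> Any:
        # "hello_world case_a" -> {"test_suite": "hello_world", "test_cases": ["case_a"]}
        if isinstance(data, str):
            [test_suite, *test_cases] = data.split()
            return dict(test_suite=test_suite, test_cases=test_cases)
        return data


cfg = SuiteConfig.model_validate("hello_world hello_world_single_core")
print(cfg.test_suite_name)   # hello_world
print(cfg.test_cases_names)  # ['hello_world_single_core']
```

Because the conversion returns a plain mapping, the normal field validation (aliases, defaults, types) still applies to both input shapes.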
+ """ + available_test_cases = map( + lambda t: t.name, self.test_suite_spec.class_obj.get_test_cases() + ) + for requested_test_case in self.test_cases_names: + assert requested_test_case in available_test_cases, ( + f"{requested_test_case} is not a valid test case " + f"of test suite {self.test_suite_name}." + ) - Args: - entry: Either a suite name or a dictionary containing the config. + return self - Returns: - The test suite configuration instance. - """ - if isinstance(entry, str): - return cls(test_suite=entry, test_cases=[]) - elif isinstance(entry, dict): - return cls(test_suite=entry["suite"], test_cases=entry["cases"]) - else: - raise TypeError(f"{type(entry)} is not valid for a test suite config.") +class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"): + """The SUT node configuration of a test run. -@dataclass(slots=True, frozen=True) -class TestRunConfiguration: + Attributes: + node_name: The SUT node to use in this test run. + vdevs: The names of virtual devices to test. + """ + + node_name: str + vdevs: list[str] = Field(default_factory=list) + + +class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"): """The configuration of a test run. The configuration contains testbed information, what tests to execute @@ -524,144 +530,130 @@ class TestRunConfiguration: func: Whether to run functional tests. skip_smoke_tests: Whether to skip smoke tests. test_suites: The names of test suites and/or test cases to execute. - system_under_test_node: The SUT node to use in this test run. - traffic_generator_node: The TG node to use in this test run. - vdevs: The names of virtual devices to test. + system_under_test_node: The SUT node configuration to use in this test run. + traffic_generator_node: The TG node name to use in this test run. random_seed: The seed to use for pseudo-random generation. 
""" - dpdk_config: DPDKConfiguration - perf: bool - func: bool - skip_smoke_tests: bool - test_suites: list[TestSuiteConfig] - system_under_test_node: SutNodeConfiguration - traffic_generator_node: TGNodeConfiguration - vdevs: list[str] - random_seed: int | None - - @classmethod - def from_dict( - cls, - d: TestRunConfigDict, - node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration], - ) -> Self: - """A convenience method that processes the inputs before creating an instance. - - The DPDK build and the test suite config are transformed into their respective objects. - SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes - are just stored. - - Args: - d: The test run configuration dictionary. - node_map: A dictionary mapping node names to their config objects. - - Returns: - The test run configuration instance. - """ - test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"])) - sut_name = d["system_under_test_node"]["node_name"] - skip_smoke_tests = d.get("skip_smoke_tests", False) - assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}" - system_under_test_node = node_map[sut_name] - assert isinstance( - system_under_test_node, SutNodeConfiguration - ), f"Invalid SUT configuration {system_under_test_node}" - - tg_name = d["traffic_generator_node"] - assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}" - traffic_generator_node = node_map[tg_name] - assert isinstance( - traffic_generator_node, TGNodeConfiguration - ), f"Invalid TG configuration {traffic_generator_node}" - - vdevs = ( - d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else [] - ) - random_seed = d.get("random_seed", None) - return cls( - dpdk_config=DPDKConfiguration.from_dict(d["dpdk_build"]), - perf=d["perf"], - func=d["func"], - skip_smoke_tests=skip_smoke_tests, - test_suites=test_suites, - system_under_test_node=system_under_test_node, - 
traffic_generator_node=traffic_generator_node, - vdevs=vdevs, - random_seed=random_seed, - ) - - def copy_and_modify(self, **kwargs) -> Self: - """Create a shallow copy with any of the fields modified. + dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build") + perf: bool = Field(description="Enable performance testing.") + func: bool = Field(description="Enable functional testing.") + skip_smoke_tests: bool = False + test_suites: list[TestSuiteConfig] = Field(min_length=1) + system_under_test_node: TestRunSUTNodeConfiguration + traffic_generator_node: str + random_seed: int | None = None - The only new data are those passed to this method. - The rest are copied from the object's fields calling the method. - Args: - **kwargs: The names and types of keyword arguments are defined - by the fields of the :class:`TestRunConfiguration` class. +class TestRunWithNodesConfiguration(NamedTuple): + """Tuple containing the configuration of the test run and its associated nodes.""" - Returns: - The copied and modified test run configuration. - """ - new_config = {} - for field in fields(self): - if field.name in kwargs: - new_config[field.name] = kwargs[field.name] - else: - new_config[field.name] = getattr(self, field.name) - - return type(self)(**new_config) + #: + test_run_config: TestRunConfiguration + #: + sut_node_config: SutNodeConfiguration + #: + tg_node_config: TGNodeConfiguration -@dataclass(slots=True, frozen=True) -class Configuration: +class Configuration(BaseModel, extra="forbid"): """DTS testbed and test configuration. - The node configuration is not stored in this object. Rather, all used node configurations - are stored inside the test run configuration where the nodes are actually used. - Attributes: test_runs: Test run configurations. + nodes: Node configurations. 
""" - test_runs: list[TestRunConfiguration] + test_runs: list[TestRunConfiguration] = Field(min_length=1) + nodes: list[NodeConfigurationTypes] = Field(min_length=1) - @classmethod - def from_dict(cls, d: ConfigurationDict) -> Self: - """A convenience method that processes the inputs before creating an instance. + @cached_property + def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]: + """List of test runs with the associated nodes.""" + test_runs_with_nodes = [] - DPDK build and test suite config are transformed into their respective objects. - SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes - are just stored. + for test_run_no, test_run in enumerate(self.test_runs): + sut_node_name = test_run.system_under_test_node.node_name + sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None) - Args: - d: The configuration dictionary. + assert sut_node is not None, ( + f"test_runs.{test_run_no}.sut_node_config.node_name " + f"({test_run.system_under_test_node.node_name}) is not a valid node name" + ) + assert isinstance(sut_node, SutNodeConfiguration), ( + f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, " + "but it is not a valid SUT node" + ) - Returns: - The whole configuration instance. 
- """ - nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list( - map(NodeConfiguration.from_dict, d["nodes"]) - ) - assert len(nodes) > 0, "There must be a node to test" + tg_node_name = test_run.traffic_generator_node + tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None) - node_map = {node.name: node for node in nodes} - assert len(nodes) == len(node_map), "Duplicate node names are not allowed" + assert tg_node is not None, ( + f"test_runs.{test_run_no}.tg_node_name " + f"({test_run.traffic_generator_node}) is not a valid node name" + ) + assert isinstance(tg_node, TGNodeConfiguration), ( + f"test_runs.{test_run_no}.tg_node_name is a valid node name, " + "but it is not a valid TG node" + ) - test_runs: list[TestRunConfiguration] = list( - map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d]) - ) + test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node)) + + return test_runs_with_nodes + + @field_validator("nodes") + @classmethod + def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]: + """Validate that the node names are unique.""" + nodes_by_name: dict[str, int] = {} + for node_no, node in enumerate(nodes): + assert node.name not in nodes_by_name, ( + f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} " + f"({node.name})" + ) + nodes_by_name[node.name] = node_no + + return nodes + + @model_validator(mode="after") + def validate_ports(self) -> Self: + """Validate that the ports are all linked to valid ones.""" + port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = { + (node.name, port.pci): False for node in self.nodes for port in node.ports + } + + for node_no, node in enumerate(self.nodes): + for port_no, port in enumerate(node.ports): + peer_port_identifier = (port.peer_node, port.peer_pci) + peer_port = port_links.get(peer_port_identifier, None) + assert peer_port is not None, ( + "invalid peer port 
specified for " f"nodes.{node_no}.ports.{port_no}" + ) + assert peer_port is False, ( + f"the peer port specified for nodes.{node_no}.ports.{port_no} " + f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}" + ) + port_links[peer_port_identifier] = (node_no, port_no) - return cls(test_runs=test_runs) + return self + + @model_validator(mode="after") + def validate_test_runs_with_nodes(self) -> Self: + """Validate the test runs to nodes associations. + + This validator relies on the cached property `test_runs_with_nodes` to run for the first + time in this call, therefore triggering the assertions if needed. + """ + if self.test_runs_with_nodes: + pass + return self def load_config(config_file_path: Path) -> Configuration: """Load DTS test run configuration from a file. - Load the YAML test run configuration file - and :download:`the configuration file schema `, - validate the test run configuration file, and create a test run configuration object. + Load the YAML test run configuration file, validate it, and create a test run configuration + object. The YAML test run configuration file is specified in the :option:`--config-file` command line argument or the :envvar:`DTS_CFG_FILE` environment variable. @@ -671,14 +663,14 @@ def load_config(config_file_path: Path) -> Configuration: Returns: The parsed test run configuration. + + Raises: + ConfigurationError: If the supplied configuration file is invalid. 
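The `validate_node_names` field validator above rejects duplicate node names at parse time, pointing at both offending indices. A standalone sketch of the pattern — `Node` and `Testbed` are simplified stand-ins for the real configuration models:

```python
from pydantic import BaseModel, Field, ValidationError, field_validator


class Node(BaseModel, frozen=True):
    name: str


class Testbed(BaseModel):
    nodes: list[Node] = Field(min_length=1)

    @field_validator("nodes")
    @classmethod
    def validate_node_names(cls, nodes: list[Node]) -> list[Node]:
        """Validate that the node names are unique."""
        nodes_by_name: dict[str, int] = {}
        for node_no, node in enumerate(nodes):
            assert node.name not in nodes_by_name, (
                f"node {node_no} cannot have the same name as node "
                f"{nodes_by_name[node.name]} ({node.name})"
            )
            nodes_by_name[node.name] = node_no
        return nodes


Testbed(nodes=[Node(name="sut1"), Node(name="tg1")])  # accepted
try:
    Testbed(nodes=[Node(name="sut1"), Node(name="sut1")])
except ValidationError:
    print("duplicate node name rejected")
```

As with the other validators in the patch, the assertion message surfaces inside the `ValidationError`, so a bad configuration file fails with a precise, indexed diagnostic instead of a generic failure later in the run.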
""" with open(config_file_path, "r") as f: config_data = yaml.safe_load(f) - schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json") - - with open(schema_path, "r") as f: - schema = json.load(f) - config = warlock.model_factory(schema, name="_Config")(config_data) - config_obj: Configuration = Configuration.from_dict(dict(config)) # type: ignore[arg-type] - return config_obj + try: + return Configuration.model_validate(config_data) + except ValidationError as e: + raise ConfigurationError("failed to load the supplied configuration") from e diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json deleted file mode 100644 index cc3e78cef5..0000000000 --- a/dts/framework/config/conf_yaml_schema.json +++ /dev/null @@ -1,459 +0,0 @@ -{ - "$schema": "https://json-schema.org/draft-07/schema", - "title": "DTS Config Schema", - "definitions": { - "node_name": { - "type": "string", - "description": "A unique identifier for a node" - }, - "NIC": { - "type": "string", - "enum": [ - "ALL", - "ConnectX3_MT4103", - "ConnectX4_LX_MT4117", - "ConnectX4_MT4115", - "ConnectX5_MT4119", - "ConnectX5_MT4121", - "I40E_10G-10G_BASE_T_BC", - "I40E_10G-10G_BASE_T_X722", - "I40E_10G-SFP_X722", - "I40E_10G-SFP_XL710", - "I40E_10G-X722_A0", - "I40E_1G-1G_BASE_T_X722", - "I40E_25G-25G_SFP28", - "I40E_40G-QSFP_A", - "I40E_40G-QSFP_B", - "IAVF-ADAPTIVE_VF", - "IAVF-VF", - "IAVF_10G-X722_VF", - "ICE_100G-E810C_QSFP", - "ICE_25G-E810C_SFP", - "ICE_25G-E810_XXV_SFP", - "IGB-I350_VF", - "IGB_1G-82540EM", - "IGB_1G-82545EM_COPPER", - "IGB_1G-82571EB_COPPER", - "IGB_1G-82574L", - "IGB_1G-82576", - "IGB_1G-82576_QUAD_COPPER", - "IGB_1G-82576_QUAD_COPPER_ET2", - "IGB_1G-82580_COPPER", - "IGB_1G-I210_COPPER", - "IGB_1G-I350_COPPER", - "IGB_1G-I354_SGMII", - "IGB_1G-PCH_LPTLP_I218_LM", - "IGB_1G-PCH_LPTLP_I218_V", - "IGB_1G-PCH_LPT_I217_LM", - "IGB_1G-PCH_LPT_I217_V", - "IGB_2.5G-I354_BACKPLANE_2_5GBPS", - "IGC-I225_LM", - 
"IGC-I226_LM", - "IXGBE_10G-82599_SFP", - "IXGBE_10G-82599_SFP_SF_QP", - "IXGBE_10G-82599_T3_LOM", - "IXGBE_10G-82599_VF", - "IXGBE_10G-X540T", - "IXGBE_10G-X540_VF", - "IXGBE_10G-X550EM_A_SFP", - "IXGBE_10G-X550EM_X_10G_T", - "IXGBE_10G-X550EM_X_SFP", - "IXGBE_10G-X550EM_X_VF", - "IXGBE_10G-X550T", - "IXGBE_10G-X550_VF", - "brcm_57414", - "brcm_P2100G", - "cavium_0011", - "cavium_a034", - "cavium_a063", - "cavium_a064", - "fastlinq_ql41000", - "fastlinq_ql41000_vf", - "fastlinq_ql45000", - "fastlinq_ql45000_vf", - "hi1822", - "virtio" - ] - }, - - "ARCH": { - "type": "string", - "enum": [ - "x86_64", - "arm64", - "ppc64le" - ] - }, - "OS": { - "type": "string", - "enum": [ - "linux" - ] - }, - "cpu": { - "type": "string", - "description": "Native should be the default on x86", - "enum": [ - "native", - "armv8a", - "dpaa2", - "thunderx", - "xgene1" - ] - }, - "compiler": { - "type": "string", - "enum": [ - "gcc", - "clang", - "icc", - "mscv" - ] - }, - "build_options": { - "type": "object", - "properties": { - "arch": { - "type": "string", - "enum": [ - "ALL", - "x86_64", - "arm64", - "ppc64le", - "other" - ] - }, - "os": { - "$ref": "#/definitions/OS" - }, - "cpu": { - "$ref": "#/definitions/cpu" - }, - "compiler": { - "$ref": "#/definitions/compiler" - }, - "compiler_wrapper": { - "type": "string", - "description": "This will be added before compiler to the CC variable when building DPDK. Optional." - } - }, - "additionalProperties": false, - "required": [ - "arch", - "os", - "cpu", - "compiler" - ] - }, - "dpdk_build": { - "type": "object", - "description": "DPDK source and build configuration.", - "properties": { - "dpdk_tree": { - "type": "string", - "description": "The path to the DPDK source tree directory to test. Only one of `dpdk_tree` or `tarball` must be provided." - }, - "tarball": { - "type": "string", - "description": "The path to the DPDK source tarball to test. Only one of `dpdk_tree` or `tarball` must be provided." 
- }, - "remote": { - "type": "boolean", - "description": "Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host." - }, - "precompiled_build_dir": { - "type": "string", - "description": "If it's defined, DPDK has been pre-built and the build directory is located in a subdirectory of DPDK tree root directory. Otherwise, will be using a `build_options` to build the DPDK from source. Either this or `build_options` must be defined, but not both." - }, - "build_options": { - "$ref": "#/definitions/build_options", - "description": "Either this or `precompiled_build_dir` must be defined, but not both. DPDK build configuration supported by DTS." - } - }, - "allOf": [ - { - "oneOf": [ - { - "required": [ - "dpdk_tree" - ] - }, - { - "required": [ - "tarball" - ] - } - ] - }, - { - "oneOf": [ - { - "required": [ - "precompiled_build_dir" - ] - }, - { - "required": [ - "build_options" - ] - } - ] - } - ], - "additionalProperties": false - }, - "hugepages_2mb": { - "type": "object", - "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.", - "properties": { - "number_of": { - "type": "integer", - "description": "The number of hugepages to configure. Hugepage size will be the system default." - }, - "force_first_numa": { - "type": "boolean", - "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False." 
- } - }, - "additionalProperties": false, - "required": [ - "number_of" - ] - }, - "mac_address": { - "type": "string", - "description": "A MAC address", - "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$" - }, - "pci_address": { - "type": "string", - "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$" - }, - "port_peer_address": { - "description": "Peer is a TRex port, and IXIA port or a PCI address", - "oneOf": [ - { - "description": "PCI peer port", - "$ref": "#/definitions/pci_address" - } - ] - }, - "test_suite": { - "type": "string", - "enum": [ - "hello_world", - "os_udp", - "pmd_buffer_scatter", - "vlan" - ] - }, - "test_target": { - "type": "object", - "properties": { - "suite": { - "$ref": "#/definitions/test_suite" - }, - "cases": { - "type": "array", - "description": "If specified, only this subset of test suite's test cases will be run.", - "items": { - "type": "string" - }, - "minimum": 1 - } - }, - "required": [ - "suite" - ], - "additionalProperties": false - } - }, - "type": "object", - "properties": { - "nodes": { - "type": "array", - "items": { - "type": "object", - "properties": { - "name": { - "type": "string", - "description": "A unique identifier for this node" - }, - "hostname": { - "type": "string", - "description": "A hostname from which the node running DTS can access this node. This can also be an IP address." - }, - "user": { - "type": "string", - "description": "The user to access this node with." - }, - "password": { - "type": "string", - "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred." - }, - "arch": { - "$ref": "#/definitions/ARCH" - }, - "os": { - "$ref": "#/definitions/OS" - }, - "lcores": { - "type": "string", - "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$", - "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores." 
- }, - "use_first_core": { - "type": "boolean", - "description": "Indicate whether DPDK should use the first physical core. It won't be used by default." - }, - "memory_channels": { - "type": "integer", - "description": "How many memory channels to use. Optional, defaults to 1." - }, - "hugepages_2mb": { - "$ref": "#/definitions/hugepages_2mb" - }, - "ports": { - "type": "array", - "items": { - "type": "object", - "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.", - "properties": { - "pci": { - "$ref": "#/definitions/pci_address", - "description": "The local PCI address of the port" - }, - "os_driver_for_dpdk": { - "type": "string", - "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)" - }, - "os_driver": { - "type": "string", - "description": "The driver normally used by this port (ex: i40e)" - }, - "peer_node": { - "type": "string", - "description": "The name of the node the peer port is on" - }, - "peer_pci": { - "$ref": "#/definitions/pci_address", - "description": "The PCI address of the peer port" - } - }, - "additionalProperties": false, - "required": [ - "pci", - "os_driver_for_dpdk", - "os_driver", - "peer_node", - "peer_pci" - ] - }, - "minimum": 1 - }, - "traffic_generator": { - "oneOf": [ - { - "type": "object", - "description": "Scapy traffic generator. 
Used for functional testing.", - "properties": { - "type": { - "type": "string", - "enum": [ - "SCAPY" - ] - } - } - } - ] - } - }, - "additionalProperties": false, - "required": [ - "name", - "hostname", - "user", - "arch", - "os" - ] - }, - "minimum": 1 - }, - "test_runs": { - "type": "array", - "items": { - "type": "object", - "properties": { - "dpdk_build": { - "$ref": "#/definitions/dpdk_build" - }, - "perf": { - "type": "boolean", - "description": "Enable performance testing." - }, - "func": { - "type": "boolean", - "description": "Enable functional testing." - }, - "test_suites": { - "type": "array", - "items": { - "oneOf": [ - { - "$ref": "#/definitions/test_suite" - }, - { - "$ref": "#/definitions/test_target" - } - ] - } - }, - "skip_smoke_tests": { - "description": "Optional field that allows you to skip smoke testing", - "type": "boolean" - }, - "system_under_test_node": { - "type":"object", - "properties": { - "node_name": { - "$ref": "#/definitions/node_name" - }, - "vdevs": { - "description": "Optional list of names of vdevs to be used in the test run", - "type": "array", - "items": { - "type": "string" - } - } - }, - "required": [ - "node_name" - ] - }, - "traffic_generator_node": { - "$ref": "#/definitions/node_name" - }, - "random_seed": { - "type": "integer", - "description": "Optional field. Allows you to set a seed for pseudo-random generation." - } - }, - "additionalProperties": false, - "required": [ - "dpdk_build", - "perf", - "func", - "test_suites", - "system_under_test_node", - "traffic_generator_node" - ] - }, - "minimum": 1 - } - }, - "required": [ - "test_runs", - "nodes" - ], - "additionalProperties": false -} diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py deleted file mode 100644 index 02e738a61e..0000000000 --- a/dts/framework/config/types.py +++ /dev/null @@ -1,149 +0,0 @@ -# SPDX-License-Identifier: BSD-3-Clause -# Copyright(c) 2023 PANTHEON.tech s.r.o. 
- -"""Configuration dictionary contents specification. - -These type definitions serve as documentation of the configuration dictionary contents. - -The definitions use the built-in :class:`~typing.TypedDict` construct. -""" - -from typing import TypedDict - - -class PortConfigDict(TypedDict): - """Allowed keys and values.""" - - #: - pci: str - #: - os_driver_for_dpdk: str - #: - os_driver: str - #: - peer_node: str - #: - peer_pci: str - - -class TrafficGeneratorConfigDict(TypedDict): - """Allowed keys and values.""" - - #: - type: str - - -class HugepageConfigurationDict(TypedDict): - """Allowed keys and values.""" - - #: - number_of: int - #: - force_first_numa: bool - - -class NodeConfigDict(TypedDict): - """Allowed keys and values.""" - - #: - hugepages_2mb: HugepageConfigurationDict - #: - name: str - #: - hostname: str - #: - user: str - #: - password: str - #: - arch: str - #: - os: str - #: - lcores: str - #: - use_first_core: bool - #: - ports: list[PortConfigDict] - #: - memory_channels: int - #: - traffic_generator: TrafficGeneratorConfigDict - - -class DPDKBuildConfigDict(TypedDict): - """Allowed keys and values.""" - - #: - arch: str - #: - os: str - #: - cpu: str - #: - compiler: str - #: - compiler_wrapper: str - - -class DPDKConfigurationDict(TypedDict): - """Allowed keys and values.""" - - #: - dpdk_tree: str | None - #: - tarball: str | None - #: - remote: bool - #: - precompiled_build_dir: str | None - #: - build_options: DPDKBuildConfigDict - - -class TestSuiteConfigDict(TypedDict): - """Allowed keys and values.""" - - #: - suite: str - #: - cases: list[str] - - -class TestRunSUTConfigDict(TypedDict): - """Allowed keys and values.""" - - #: - node_name: str - #: - vdevs: list[str] - - -class TestRunConfigDict(TypedDict): - """Allowed keys and values.""" - - #: - dpdk_build: DPDKConfigurationDict - #: - perf: bool - #: - func: bool - #: - skip_smoke_tests: bool - #: - test_suites: TestSuiteConfigDict - #: - system_under_test_node: 
TestRunSUTConfigDict - #: - traffic_generator_node: str - #: - random_seed: int - - -class ConfigurationDict(TypedDict): - """Allowed keys and values.""" - - #: - nodes: list[NodeConfigDict] - #: - test_runs: list[TestRunConfigDict] diff --git a/dts/framework/runner.py b/dts/framework/runner.py index 195622c653..c3d9a27a8c 100644 --- a/dts/framework/runner.py +++ b/dts/framework/runner.py @@ -30,7 +30,15 @@ from framework.testbed_model.sut_node import SutNode from framework.testbed_model.tg_node import TGNode -from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config +from .config import ( + Configuration, + DPDKPrecompiledBuildConfiguration, + SutNodeConfiguration, + TestRunConfiguration, + TestSuiteConfig, + TGNodeConfiguration, + load_config, +) from .exception import ( BlockingTestSuiteError, ConfigurationError, @@ -133,11 +141,10 @@ def run(self) -> None: self._result.update_setup(Result.PASS) # for all test run sections - for test_run_config in self._configuration.test_runs: + for test_run_with_nodes_config in self._configuration.test_runs_with_nodes: + test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config self._logger.set_stage(DtsStage.test_run_setup) - self._logger.info( - f"Running test run with SUT '{test_run_config.system_under_test_node.name}'." 
- ) + self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.") self._init_random_seed(test_run_config) test_run_result = self._result.add_test_run(test_run_config) # we don't want to modify the original config, so create a copy @@ -145,7 +152,7 @@ def run(self) -> None: SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites ) if not test_run_config.skip_smoke_tests: - test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")] + test_run_test_suites[:0] = [TestSuiteConfig(test_suite="smoke_tests")] try: test_suites_with_cases = self._get_test_suites_with_cases( test_run_test_suites, test_run_config.func, test_run_config.perf @@ -161,6 +168,8 @@ def run(self) -> None: self._connect_nodes_and_run_test_run( sut_nodes, tg_nodes, + sut_node_config, + tg_node_config, test_run_config, test_run_result, test_suites_with_cases, @@ -223,10 +232,10 @@ def _get_test_suites_with_cases( test_suites_with_cases = [] for test_suite_config in test_suite_configs: - test_suite_class = self._get_test_suite_class(test_suite_config.test_suite) + test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name) test_cases: list[type[TestCase]] = [] func_test_cases, perf_test_cases = test_suite_class.filter_test_cases( - test_suite_config.test_cases + test_suite_config.test_cases_names ) if func: test_cases.extend(func_test_cases) @@ -305,6 +314,8 @@ def _connect_nodes_and_run_test_run( self, sut_nodes: dict[str, SutNode], tg_nodes: dict[str, TGNode], + sut_node_config: SutNodeConfiguration, + tg_node_config: TGNodeConfiguration, test_run_config: TestRunConfiguration, test_run_result: TestRunResult, test_suites_with_cases: Iterable[TestSuiteWithCases], @@ -319,24 +330,26 @@ def _connect_nodes_and_run_test_run( Args: sut_nodes: A dictionary storing connected/to be connected SUT nodes. tg_nodes: A dictionary storing connected/to be connected TG nodes. + sut_node_config: The test run's SUT node configuration. 
+ tg_node_config: The test run's TG node configuration. test_run_config: A test run configuration. test_run_result: The test run's result. test_suites_with_cases: The test suites with test cases to run. """ - sut_node = sut_nodes.get(test_run_config.system_under_test_node.name) - tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name) + sut_node = sut_nodes.get(sut_node_config.name) + tg_node = tg_nodes.get(tg_node_config.name) try: if not sut_node: - sut_node = SutNode(test_run_config.system_under_test_node) + sut_node = SutNode(sut_node_config) sut_nodes[sut_node.name] = sut_node if not tg_node: - tg_node = TGNode(test_run_config.traffic_generator_node) + tg_node = TGNode(tg_node_config) tg_nodes[tg_node.name] = tg_node except Exception as e: - failed_node = test_run_config.system_under_test_node.name + failed_node = test_run_config.system_under_test_node.node_name if sut_node: - failed_node = test_run_config.traffic_generator_node.name + failed_node = test_run_config.traffic_generator_node self._logger.exception(f"The Creation of node {failed_node} failed.") test_run_result.update_setup(Result.FAIL, e) @@ -369,14 +382,22 @@ def _run_test_run( ConfigurationError: If the DPDK sources or build is not set up from config or settings. """ self._logger.info( - f"Running test run with SUT '{test_run_config.system_under_test_node.name}'." + f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'." 
) test_run_result.add_sut_info(sut_node.node_info) try: - dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_config.dpdk_location - sut_node.set_up_test_run(test_run_config, dpdk_location) + dpdk_build_config = test_run_config.dpdk_config + if new_location := SETTINGS.dpdk_location: + dpdk_build_config = dpdk_build_config.model_copy( + update={"dpdk_location": new_location} + ) + if dir := SETTINGS.precompiled_build_dir: + dpdk_build_config = DPDKPrecompiledBuildConfiguration( + dpdk_location=dpdk_build_config.dpdk_location, precompiled_build_dir=dir + ) + sut_node.set_up_test_run(test_run_config, dpdk_build_config) test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info()) - tg_node.set_up_test_run(test_run_config, dpdk_location) + tg_node.set_up_test_run(test_run_config, dpdk_build_config) test_run_result.update_setup(Result.PASS) except Exception as e: self._logger.exception("Test run setup failed.") diff --git a/dts/framework/settings.py b/dts/framework/settings.py index a452319b90..1253ed86ac 100644 --- a/dts/framework/settings.py +++ b/dts/framework/settings.py @@ -60,9 +60,8 @@ .. option:: --precompiled-build-dir .. envvar:: DTS_PRECOMPILED_BUILD_DIR - Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are - located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with - --dpdk-tree or --tarball. + Define the subdirectory under the DPDK tree root directory or tarball where the pre-compiled + binaries are located. .. option:: --test-suite .. 
envvar:: DTS_TEST_SUITES @@ -95,13 +94,21 @@ import argparse import os import sys -import tarfile from argparse import Action, ArgumentDefaultsHelpFormatter, _get_action_name from dataclasses import dataclass, field from pathlib import Path from typing import Callable -from .config import DPDKLocation, TestSuiteConfig +from pydantic import ValidationError + +from .config import ( + DPDKLocation, + LocalDPDKTarballLocation, + LocalDPDKTreeLocation, + RemoteDPDKTarballLocation, + RemoteDPDKTreeLocation, + TestSuiteConfig, +) @dataclass(slots=True) @@ -122,6 +129,8 @@ class Settings: #: dpdk_location: DPDKLocation | None = None #: + precompiled_build_dir: str | None = None + #: compile_timeout: float = 1200 #: test_suites: list[TestSuiteConfig] = field(default_factory=list) @@ -383,13 +392,11 @@ def _get_parser() -> _DTSArgumentParser: action = dpdk_build.add_argument( "--precompiled-build-dir", - help="Define the subdirectory under the DPDK tree root directory where the pre-compiled " - "binaries are located. If set, DTS will build DPDK under the `build` directory instead. " - "Can only be used with --dpdk-tree or --tarball.", + help="Define the subdirectory under the DPDK tree root directory or tarball where the " + "pre-compiled binaries are located.", metavar="DIR_NAME", ) _add_env_var_to_action(action) - _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path") action = parser.add_argument( "--compile-timeout", @@ -442,61 +449,61 @@ def _get_parser() -> _DTSArgumentParser: def _process_dpdk_location( + parser: _DTSArgumentParser, dpdk_tree: str | None, tarball: str | None, remote: bool, - build_dir: str | None, -): +) -> DPDKLocation | None: """Process and validate DPDK build arguments. Ensures that either `dpdk_tree` or `tarball` is provided. Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if `remote` is False. 
Constructs and returns - the :class:`DPDKLocation` with the provided parameters if validation is successful. + any valid :class:`DPDKLocation` with the provided parameters if validation is successful. Args: - dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball` - must be provided. - tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be - provided. + dpdk_tree: The path to the DPDK source tree directory. + tarball: The path to the DPDK tarball. remote: If :data:`True`, `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host. - build_dir: If it's defined, DPDK has been pre-built and the build directory is located in a - subdirectory of `dpdk_tree` or `tarball` root directory. Returns: A DPDK location if construction is successful, otherwise None. - - Raises: - argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` not found in local filesystem or - they aren't in the right format. """ - if not (dpdk_tree or tarball): - return None - - if not remote: - if dpdk_tree: - if not Path(dpdk_tree).exists(): - raise argparse.ArgumentTypeError( - f"DPDK tree '{dpdk_tree}' not found in local filesystem." - ) - - if not Path(dpdk_tree).is_dir(): - raise argparse.ArgumentTypeError(f"DPDK tree '{dpdk_tree}' must be a directory.") - - dpdk_tree = os.path.realpath(dpdk_tree) - - if tarball: - if not Path(tarball).exists(): - raise argparse.ArgumentTypeError( - f"DPDK tarball '{tarball}' not found in local filesystem." - ) - - if not tarfile.is_tarfile(tarball): - raise argparse.ArgumentTypeError( - f"DPDK tarball '{tarball}' must be a valid tar archive." 
- ) - - return DPDKLocation(dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir) + if dpdk_tree: + action = parser.find_action("dpdk_tree", _is_from_env) + + try: + if remote: + return RemoteDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree}) + else: + return LocalDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree}) + except ValidationError as e: + print( + "An error has occurred while validating the DPDK tree supplied in the " + f"{'environment variable' if action else 'arguments'}:", + file=sys.stderr, + ) + print(e, file=sys.stderr) + sys.exit(1) + + if tarball: + action = parser.find_action("tarball", _is_from_env) + + try: + if remote: + return RemoteDPDKTarballLocation.model_validate({"tarball": tarball}) + else: + return LocalDPDKTarballLocation.model_validate({"tarball": tarball}) + except ValidationError as e: + print( + "An error has occurred while validating the DPDK tarball supplied in the " + f"{'environment variable' if action else 'arguments'}:", + file=sys.stderr, + ) + print(e, file=sys.stderr) + sys.exit(1) + + return None def _process_test_suites( @@ -512,11 +519,24 @@ def _process_test_suites( Returns: A list of test suite configurations to execute. """ - if parser.find_action("test_suites", _is_from_env): + action = parser.find_action("test_suites", _is_from_env) + if action: # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..." 
args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")] - return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args] + try: + return [ + TestSuiteConfig(test_suite=test_suite, test_cases=test_cases) + for [test_suite, *test_cases] in args + ] + except ValidationError as e: + print( + "An error has occurred while validating the test suites supplied in the " + f"{'environment variable' if action else 'arguments'}:", + file=sys.stderr, + ) + print(e, file=sys.stderr) + sys.exit(1) def get_settings() -> Settings: @@ -536,7 +556,7 @@ def get_settings() -> Settings: args = parser.parse_args() args.dpdk_location = _process_dpdk_location( - args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.precompiled_build_dir + parser, args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source ) args.test_suites = _process_test_suites(parser, args.test_suites) diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py index 62867fd80c..6031eaf937 100644 --- a/dts/framework/testbed_model/node.py +++ b/dts/framework/testbed_model/node.py @@ -17,7 +17,12 @@ from ipaddress import IPv4Interface, IPv6Interface from typing import Union -from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration +from framework.config import ( + OS, + DPDKBuildConfiguration, + NodeConfiguration, + TestRunConfiguration, +) from framework.exception import ConfigurationError from framework.logger import DTSLogger, get_dts_logger @@ -89,13 +94,15 @@ def __init__(self, node_config: NodeConfiguration): self._init_ports() def _init_ports(self) -> None: - self.ports = [Port(port_config) for port_config in self.config.ports] + self.ports = [Port(self.name, port_config) for port_config in self.config.ports] self.main_session.update_ports(self.ports) for port in self.ports: self.configure_port_state(port) def set_up_test_run( - self, test_run_config: TestRunConfiguration, dpdk_location: 
DPDKLocation + self, + test_run_config: TestRunConfiguration, + dpdk_build_config: DPDKBuildConfiguration, ) -> None: """Test run setup steps. @@ -105,7 +112,7 @@ def set_up_test_run( Args: test_run_config: A test run configuration according to which the setup steps will be taken. - dpdk_location: The target source of the DPDK tree. + dpdk_build_config: The build configuration of DPDK. """ self._setup_hugepages() diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py index 5f087f40d6..42ab4bb8fd 100644 --- a/dts/framework/testbed_model/os_session.py +++ b/dts/framework/testbed_model/os_session.py @@ -364,7 +364,7 @@ def extract_remote_tarball( """ @abstractmethod - def is_remote_dir(self, remote_path: str) -> bool: + def is_remote_dir(self, remote_path: PurePath) -> bool: """Check if the `remote_path` is a directory. Args: @@ -375,7 +375,7 @@ def is_remote_dir(self, remote_path: str) -> bool: """ @abstractmethod - def is_remote_tarfile(self, remote_tarball_path: str) -> bool: + def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool: """Check if the `remote_tarball_path` is a tar archive. Args: diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py index 82c84cf4f8..817405bea4 100644 --- a/dts/framework/testbed_model/port.py +++ b/dts/framework/testbed_model/port.py @@ -54,7 +54,7 @@ class Port: mac_address: str = "" logical_name: str = "" - def __init__(self, config: PortConfig): + def __init__(self, node_name: str, config: PortConfig): """Initialize the port from `node_name` and `config`. Args: @@ -62,7 +62,7 @@ def __init__(self, config: PortConfig): config: The test run configuration of the port. 
""" self.identifier = PortIdentifier( - node=config.node, + node=node_name, pci=config.pci, ) self.os_driver = config.os_driver diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py index 0d3abbc519..6b66f33e22 100644 --- a/dts/framework/testbed_model/posix_session.py +++ b/dts/framework/testbed_model/posix_session.py @@ -201,12 +201,12 @@ def extract_remote_tarball( if expected_dir: self.send_command(f"ls {expected_dir}", verify=True) - def is_remote_dir(self, remote_path: str) -> bool: + def is_remote_dir(self, remote_path: PurePath) -> bool: """Overrides :meth:`~.os_session.OSSession.is_remote_dir`.""" result = self.send_command(f"test -d {remote_path}") return not result.return_code - def is_remote_tarfile(self, remote_tarball_path: str) -> bool: + def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool: """Overrides :meth:`~.os_session.OSSession.is_remote_tarfile`.""" result = self.send_command(f"tar -tvf {remote_tarball_path}") return not result.return_code @@ -393,4 +393,8 @@ def get_node_info(self) -> NodeInfo: SETTINGS.timeout, ).stdout.split("\n") kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout - return NodeInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version) + return NodeInfo( + os_name=os_release_info[0].strip(), + os_version=os_release_info[1].strip(), + kernel_version=kernel_version, + ) diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py index a6c42b548c..57337c8e7d 100644 --- a/dts/framework/testbed_model/sut_node.py +++ b/dts/framework/testbed_model/sut_node.py @@ -15,11 +15,17 @@ import os import time from dataclasses import dataclass -from pathlib import PurePath +from pathlib import Path, PurePath from framework.config import ( DPDKBuildConfiguration, - DPDKLocation, + DPDKBuildOptionsConfiguration, + DPDKPrecompiledBuildConfiguration, + DPDKUncompiledBuildConfiguration, + 
LocalDPDKTarballLocation, + LocalDPDKTreeLocation, + RemoteDPDKTarballLocation, + RemoteDPDKTreeLocation, SutNodeConfiguration, TestRunConfiguration, ) @@ -178,7 +184,9 @@ def get_dpdk_build_info(self) -> DPDKBuildInfo: return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version) def set_up_test_run( - self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation + self, + test_run_config: TestRunConfiguration, + dpdk_build_config: DPDKBuildConfiguration, ) -> None: """Extend the test run setup with vdev config and DPDK build set up. @@ -188,12 +196,12 @@ def set_up_test_run( Args: test_run_config: A test run configuration according to which the setup steps will be taken. - dpdk_location: The target source of the DPDK tree. + dpdk_build_config: The build configuration of DPDK. """ - super().set_up_test_run(test_run_config, dpdk_location) - for vdev in test_run_config.vdevs: + super().set_up_test_run(test_run_config, dpdk_build_config) + for vdev in test_run_config.system_under_test_node.vdevs: self.virtual_devices.append(VirtualDevice(vdev)) - self._set_up_dpdk(dpdk_location, test_run_config.dpdk_config.dpdk_build_config) + self._set_up_dpdk(dpdk_build_config) def tear_down_test_run(self) -> None: """Extend the test run teardown with virtual device teardown and DPDK teardown.""" @@ -202,7 +210,8 @@ def tear_down_test_run(self) -> None: self._tear_down_dpdk() def _set_up_dpdk( - self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None + self, + dpdk_build_config: DPDKBuildConfiguration, ) -> None: """Set up DPDK the SUT node and bind ports. @@ -211,21 +220,26 @@ def _set_up_dpdk( are bound to those that DPDK needs. Args: - dpdk_location: The location of the DPDK tree. - dpdk_build_config: A DPDK build configuration to test. If :data:`None`, - DTS will use pre-built DPDK from a :dataclass:`DPDKLocation`. + dpdk_build_config: A DPDK build configuration to test. 
""" - self._set_remote_dpdk_tree_path(dpdk_location.dpdk_tree, dpdk_location.remote) - if not self._remote_dpdk_tree_path: - if dpdk_location.dpdk_tree: - self._copy_dpdk_tree(dpdk_location.dpdk_tree) - elif dpdk_location.tarball: - self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote) - - self._set_remote_dpdk_build_dir(dpdk_location.build_dir) - if not self.remote_dpdk_build_dir and dpdk_build_config: - self._configure_dpdk_build(dpdk_build_config) - self._build_dpdk() + match dpdk_build_config.dpdk_location: + case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree): + self._set_remote_dpdk_tree_path(dpdk_tree) + case LocalDPDKTreeLocation(dpdk_tree=dpdk_tree): + self._copy_dpdk_tree(dpdk_tree) + case RemoteDPDKTarballLocation(tarball=tarball): + self._validate_remote_dpdk_tarball(tarball) + self._prepare_and_extract_dpdk_tarball(tarball) + case LocalDPDKTarballLocation(tarball=tarball): + remote_tarball = self._copy_dpdk_tarball_to_remote(tarball) + self._prepare_and_extract_dpdk_tarball(remote_tarball) + + match dpdk_build_config: + case DPDKPrecompiledBuildConfiguration(precompiled_build_dir=build_dir): + self._set_remote_dpdk_build_dir(build_dir) + case DPDKUncompiledBuildConfiguration(build_options=build_options): + self._configure_dpdk_build(build_options) + self._build_dpdk() self.bind_ports_to_driver() @@ -238,37 +252,29 @@ def _tear_down_dpdk(self) -> None: self.compiler_version = None self.bind_ports_to_driver(for_dpdk=False) - def _set_remote_dpdk_tree_path(self, dpdk_tree: str | None, remote: bool): + def _set_remote_dpdk_tree_path(self, dpdk_tree: PurePath): """Set the path to the remote DPDK source tree based on the provided DPDK location. - If :data:`dpdk_tree` and :data:`remote` are defined, check existence of :data:`dpdk_tree` - on SUT node and sets the `_remote_dpdk_tree_path` property. Otherwise, sets nothing. 
- Verify DPDK source tree existence on the SUT node, if exists sets the `_remote_dpdk_tree_path` property, otherwise sets nothing. Args: dpdk_tree: The path to the DPDK source tree directory. - remote: Indicates whether the `dpdk_tree` is already on the SUT node, instead of the - execution host. Raises: RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but is not found. """ - if remote and dpdk_tree: - if not self.main_session.remote_path_exists(dpdk_tree): - raise RemoteFileNotFoundError( - f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node." - ) - if not self.main_session.is_remote_dir(dpdk_tree): - raise ConfigurationError( - f"Remote DPDK source tree '{dpdk_tree}' must be a directory." - ) - - self.__remote_dpdk_tree_path = PurePath(dpdk_tree) - - def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None: + if not self.main_session.remote_path_exists(dpdk_tree): + raise RemoteFileNotFoundError( + f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node." + ) + if not self.main_session.is_remote_dir(dpdk_tree): + raise ConfigurationError(f"Remote DPDK source tree '{dpdk_tree}' must be a directory.") + + self.__remote_dpdk_tree_path = dpdk_tree + + def _copy_dpdk_tree(self, dpdk_tree_path: Path) -> None: """Copy the DPDK source tree to the SUT. Args: @@ -288,25 +294,45 @@ def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None: self._remote_tmp_dir, PurePath(dpdk_tree_path).name ) - def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None: - """Ensure the DPDK tarball is available on the SUT node and extract it. + def _validate_remote_dpdk_tarball(self, dpdk_tarball: PurePath) -> None: + """Validate the DPDK tarball on the SUT node. - This method ensures that the DPDK source tree tarball is available on the - SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the - `dpdk_tarball` is already on the SUT node, it verifies its existence. 
- The `dpdk_tarball` is then extracted on the SUT node. + Args: + dpdk_tarball: The path to the DPDK tarball on the SUT node. - This method sets the `_remote_dpdk_tree_path` property to the path of the - extracted DPDK tree on the SUT node. + Raises: + RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but is + not found. + ConfigurationError: If the `dpdk_tarball` is a valid path but not a valid tar archive. + """ + if not self.main_session.remote_path_exists(dpdk_tarball): + raise RemoteFileNotFoundError(f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT.") + if not self.main_session.is_remote_tarfile(dpdk_tarball): + raise ConfigurationError(f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive.") + + def _copy_dpdk_tarball_to_remote(self, dpdk_tarball: Path) -> PurePath: + """Copy the local DPDK tarball to the SUT node. Args: - dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node. - remote: Indicates whether the `dpdk_tarball` is already on the SUT node, instead of the - execution host. + dpdk_tarball: The local path to the DPDK tarball. - Raises: - RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but - is not found. + Returns: + The path of the copied tarball on the SUT node. + """ + self._logger.info( + f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'." + ) + self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir) + return self.main_session.join_remote_path(self._remote_tmp_dir, dpdk_tarball.name) + + def _prepare_and_extract_dpdk_tarball(self, remote_tarball_path: PurePath) -> None: + """Prepare the remote DPDK tree path and extract the tarball. + + This method extracts the remote tarball and sets the `_remote_dpdk_tree_path` property to + the path of the extracted DPDK tree on the SUT node. + + Args: + remote_tarball_path: The path to the DPDK tarball on the SUT node. 
""" def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath: @@ -324,30 +350,9 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath: return PurePath(str(remote_tarball_path).replace(suffixes_to_remove, "")) return remote_tarball_path.with_suffix("") - if remote: - if not self.main_session.remote_path_exists(dpdk_tarball): - raise RemoteFileNotFoundError( - f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT." - ) - if not self.main_session.is_remote_tarfile(dpdk_tarball): - raise ConfigurationError( - f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive." - ) - - remote_tarball_path = PurePath(dpdk_tarball) - else: - self._logger.info( - f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'." - ) - self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir) - - remote_tarball_path = self.main_session.join_remote_path( - self._remote_tmp_dir, PurePath(dpdk_tarball).name - ) - tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path) self.__remote_dpdk_tree_path = self.main_session.join_remote_path( - PurePath(remote_tarball_path).parent, + remote_tarball_path.parent, tarball_top_dir or remove_tarball_suffix(remote_tarball_path), ) @@ -360,33 +365,32 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath: self._remote_dpdk_tree_path, ) - def _set_remote_dpdk_build_dir(self, build_dir: str | None): + def _set_remote_dpdk_build_dir(self, build_dir: str): """Set the `remote_dpdk_build_dir` on the SUT. - If :data:`build_dir` is defined, check existence on the SUT node and sets the + Check existence on the SUT node and sets the `remote_dpdk_build_dir` property by joining the `_remote_dpdk_tree_path` and `build_dir`. Otherwise, sets nothing. Args: - build_dir: If it's defined, DPDK has been pre-built and the build directory is located + build_dir: DPDK has been pre-built and the build directory is located in a subdirectory of `dpdk_tree` or `tarball` root directory. 
Raises: RemoteFileNotFoundError: If the `build_dir` is expected but does not exist on the SUT node. """ - if build_dir: - remote_dpdk_build_dir = self.main_session.join_remote_path( - self._remote_dpdk_tree_path, build_dir + remote_dpdk_build_dir = self.main_session.join_remote_path( + self._remote_dpdk_tree_path, build_dir + ) + if not self.main_session.remote_path_exists(remote_dpdk_build_dir): + raise RemoteFileNotFoundError( + f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node." ) - if not self.main_session.remote_path_exists(remote_dpdk_build_dir): - raise RemoteFileNotFoundError( - f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node." - ) - self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir) + self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir) - def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None: + def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildOptionsConfiguration) -> None: """Populate common environment variables and set the DPDK build related properties. 
This method sets `compiler_version` for additional information and `remote_dpdk_build_dir` diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py index d38ae36c2a..17b333e76a 100644 --- a/dts/framework/testbed_model/topology.py +++ b/dts/framework/testbed_model/topology.py @@ -99,7 +99,16 @@ def __init__(self, sut_ports: Iterable[Port], tg_ports: Iterable[Port]): port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port)) self.type = TopologyType.get_from_value(len(port_links)) - dummy_port = Port(PortConfig("", "", "", "", "", "")) + dummy_port = Port( + "", + PortConfig( + pci="0000:00:00.0", + os_driver_for_dpdk="", + os_driver="", + peer_node="", + peer_pci="0000:00:00.0", + ), + ) self.tg_port_egress = dummy_port self.sut_port_ingress = dummy_port self.sut_port_egress = dummy_port diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py index a319fa5320..945f6bbbbb 100644 --- a/dts/framework/testbed_model/traffic_generator/__init__.py +++ b/dts/framework/testbed_model/traffic_generator/__init__.py @@ -38,6 +38,4 @@ def create_traffic_generator( case ScapyTrafficGeneratorConfig(): return ScapyTrafficGenerator(tg_node, traffic_generator_config, privileged=True) case _: - raise ConfigurationError( - f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}" - ) + raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}") diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py index 469a12a780..5ac61cd4e1 100644 --- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py @@ -45,7 +45,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig, **kwargs): """ self._config = config self._tg_node = tg_node - 
self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}") + self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}") super().__init__(tg_node, **kwargs) def send_packet(self, packet: Packet, port: Port) -> None: diff --git a/dts/framework/utils.py b/dts/framework/utils.py index 78a39e32c7..e862e3ac66 100644 --- a/dts/framework/utils.py +++ b/dts/framework/utils.py @@ -28,7 +28,7 @@ from .exception import InternalError -REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/" +REGEX_FOR_PCI_ADDRESS: str = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}" _REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC: str = r"(?:[\da-fA-F]{2}[:-]){5}[\da-fA-F]{2}" _REGEX_FOR_DOT_SEP_MAC: str = r"(?:[\da-fA-F]{4}.){2}[\da-fA-F]{4}" REGEX_FOR_MAC_ADDRESS: str = rf"{_REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC}|{_REGEX_FOR_DOT_SEP_MAC}" diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py index d7870bd40f..bc3a2a6bf9 100644 --- a/dts/tests/TestSuite_smoke_tests.py +++ b/dts/tests/TestSuite_smoke_tests.py @@ -127,7 +127,7 @@ def test_device_bound_to_driver(self) -> None: path_to_devbind = self.sut_node.path_to_devbind_script all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command( - f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'", + f"{path_to_devbind} --status | awk '/{REGEX_FOR_PCI_ADDRESS}/'", SETTINGS.timeout, ).stdout From patchwork Mon Oct 28 17:49:45 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Luca Vizzarro X-Patchwork-Id: 147554 X-Patchwork-Delegate: paul.szczepanek@arm.com Return-Path: X-Original-To: patchwork@inbox.dpdk.org Delivered-To: patchwork@inbox.dpdk.org Received: from mails.dpdk.org (mails.dpdk.org [217.70.189.124]) by inbox.dpdk.org (Postfix) with ESMTP id B541445BFF; Mon, 28 Oct 2024 18:51:48 +0100 (CET) Received: from mails.dpdk.org (localhost [127.0.0.1]) by 
mails.dpdk.org (Postfix) with ESMTP id 35FC2427E6; Mon, 28 Oct 2024 18:51:23 +0100 (CET) Received: from foss.arm.com (foss.arm.com [217.140.110.172]) by mails.dpdk.org (Postfix) with ESMTP id 25C21427CB for ; Mon, 28 Oct 2024 18:51:17 +0100 (CET) Received: from usa-sjc-imap-foss1.foss.arm.com (unknown [10.121.207.14]) by usa-sjc-mx-foss1.foss.arm.com (Postfix) with ESMTP id 6E32A16F2; Mon, 28 Oct 2024 10:51:46 -0700 (PDT) Received: from localhost.localdomain (JR4XG4HTQC.cambridge.arm.com [10.1.31.47]) by usa-sjc-imap-foss1.foss.arm.com (Postfix) with ESMTPA id 274803F66E; Mon, 28 Oct 2024 10:51:16 -0700 (PDT) From: Luca Vizzarro To: dev@dpdk.org Cc: Paul Szczepanek , Patrick Robb , Luca Vizzarro Subject: [PATCH v4 5/8] dts: remove warlock dependency Date: Mon, 28 Oct 2024 17:49:45 +0000 Message-ID: <20241028174949.3283701-6-luca.vizzarro@arm.com> X-Mailer: git-send-email 2.43.0 In-Reply-To: <20241028174949.3283701-1-luca.vizzarro@arm.com> References: <20240822163941.1390326-1-luca.vizzarro@arm.com> <20241028174949.3283701-1-luca.vizzarro@arm.com> MIME-Version: 1.0 X-BeenThere: dev@dpdk.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: DPDK patches and discussions List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: dev-bounces@dpdk.org Since pydantic has completely replaced warlock, there is no more need to keep it as a dependency. This removes it. 
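To illustrate the replacement this message describes: where warlock validated configuration dicts against a separate JSON schema at runtime, pydantic declares the schema as typed Python classes and validates on construction. The model below is a minimal illustrative sketch only — it is not the actual DTS configuration class, and the field names are borrowed loosely from the port configuration for the example's sake.

```python
from pydantic import BaseModel, Field, ValidationError

# Hypothetical stand-in for a DTS-style config model; the real models
# live in dts/framework/config and differ from this sketch.
class ExamplePortConfig(BaseModel):
    # The pattern constraint plays the role a JSON schema "pattern"
    # keyword played under warlock/jsonschema.
    pci: str = Field(pattern=r"^[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}\.[0-9]$")
    os_driver_for_dpdk: str
    os_driver: str

# Valid input deserializes straight into a typed object.
port = ExamplePortConfig(
    pci="0000:00:08.0", os_driver_for_dpdk="vfio-pci", os_driver="i40e"
)
print(port.pci)

# Invalid input raises a structured ValidationError — no external
# schema file or warlock object factory involved.
try:
    ExamplePortConfig(
        pci="not-a-pci-address", os_driver_for_dpdk="vfio-pci", os_driver="i40e"
    )
except ValidationError:
    print("rejected invalid PCI address")
```

Because validation, typing, and deserialization all live in one class definition, the jsonschema/jsonpatch/warlock dependency chain removed by this patch has no remaining role.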
Signed-off-by: Luca Vizzarro Reviewed-by: Paul Szczepanek Reviewed-by: Nicholas Pratte --- dts/poetry.lock | 227 +-------------------------------------------- dts/pyproject.toml | 1 - 2 files changed, 1 insertion(+), 227 deletions(-) diff --git a/dts/poetry.lock b/dts/poetry.lock index 56c50ad52c..9f7db60793 100644 --- a/dts/poetry.lock +++ b/dts/poetry.lock @@ -34,24 +34,6 @@ files = [ {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"}, ] -[[package]] -name = "attrs" -version = "23.1.0" -description = "Classes Without Boilerplate" -optional = false -python-versions = ">=3.7" -files = [ - {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"}, - {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"}, -] - -[package.extras] -cov = ["attrs[tests]", "coverage[toml] (>=5.3)"] -dev = ["attrs[docs,tests]", "pre-commit"] -docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"] -tests = ["attrs[tests-no-zope]", "zope-interface"] -tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"] - [[package]] name = "babel" version = "2.13.1" @@ -491,66 +473,6 @@ MarkupSafe = ">=2.0" [package.extras] i18n = ["Babel (>=2.7)"] -[[package]] -name = "jsonpatch" -version = "1.33" -description = "Apply JSON-Patches (RFC 6902)" -optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*" -files = [ - {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"}, - {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"}, -] - -[package.dependencies] -jsonpointer = ">=1.9" - 
-[[package]] -name = "jsonpointer" -version = "2.4" -description = "Identify specific nodes in a JSON document (RFC 6901)" -optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*" -files = [ - {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"}, - {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"}, -] - -[[package]] -name = "jsonschema" -version = "4.18.4" -description = "An implementation of JSON Schema validation for Python" -optional = false -python-versions = ">=3.8" -files = [ - {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"}, - {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"}, -] - -[package.dependencies] -attrs = ">=22.2.0" -jsonschema-specifications = ">=2023.03.6" -referencing = ">=0.28.4" -rpds-py = ">=0.7.1" - -[package.extras] -format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"] -format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"] - -[[package]] -name = "jsonschema-specifications" -version = "2023.7.1" -description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry" -optional = false -python-versions = ">=3.8" -files = [ - {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"}, - {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"}, -] - -[package.dependencies] -referencing = ">=0.28.0" - [[package]] name = "markupsafe" version = "2.1.3" @@ 
-1073,21 +995,6 @@ files = [ {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"}, ] -[[package]] -name = "referencing" -version = "0.30.0" -description = "JSON Referencing + Python" -optional = false -python-versions = ">=3.8" -files = [ - {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"}, - {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"}, -] - -[package.dependencies] -attrs = ">=22.2.0" -rpds-py = ">=0.7.0" - [[package]] name = "requests" version = "2.31.0" @@ -1109,112 +1016,6 @@ urllib3 = ">=1.21.1,<3" socks = ["PySocks (>=1.5.6,!=1.5.7)"] use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] -[[package]] -name = "rpds-py" -version = "0.9.2" -description = "Python bindings to Rust's persistent data structures (rpds)" -optional = false -python-versions = ">=3.8" -files = [ - {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"}, - {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"}, - {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"}, - {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"}, - {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"}, - {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"}, - {file = 
"rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"}, - {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"}, - {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"}, - {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"}, - {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"}, - {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"}, - {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"}, - {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"}, - {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"}, - {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"}, - {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"}, - {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"}, - {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"}, - 
{file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"}, - {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"}, - {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"}, - {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"}, - {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"}, - {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"}, - {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"}, - {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"}, - {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"}, - {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"}, - {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"}, - {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"}, - {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"}, - {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"}, - {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"}, - {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"}, - {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"}, - {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"}, - {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"}, - {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"}, - {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"}, - {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"}, - {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"}, - {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"}, - {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"}, - {file = 
"rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"}, - {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"}, - {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"}, - {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"}, - {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"}, - {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"}, - {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"}, - {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"}, - {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"}, - {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"}, - {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"}, - {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"}, - {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"}, - {file = 
"rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"}, - {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"}, - {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"}, - {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"}, - {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"}, - {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"}, - {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"}, - {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = 
"sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"}, - {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"}, - {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"}, -] - [[package]] name = "scapy" version = "2.5.0" @@ -1472,17 +1273,6 @@ files = [ {file = "types_PyYAML-6.0.12.11-py3-none-any.whl", hash = "sha256:a461508f3096d1d5810ec5ab95d7eeecb651f3a15b71959999988942063bf01d"}, ] -[[package]] -name = "typing-extensions" -version = "4.11.0" -description = "Backported and Experimental Type Hints for Python 3.8+" -optional = false -python-versions = ">=3.8" -files = [ - {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"}, - {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"}, -] - [[package]] name = "typing-extensions" version = "4.12.2" @@ -1511,22 +1301,7 @@ secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17. socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] zstd = ["zstandard (>=0.18.0)"] -[[package]] -name = "warlock" -version = "2.0.1" -description = "Python object model built on JSON schema and JSON patch." 
-optional = false -python-versions = ">=3.7,<4.0" -files = [ - {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"}, - {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"}, -] - -[package.dependencies] -jsonpatch = ">=1,<2" -jsonschema = ">=4,<5" - [metadata] lock-version = "2.0" python-versions = "^3.10" -content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827" +content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f" diff --git a/dts/pyproject.toml b/dts/pyproject.toml index 6c2d1ca8a4..9a3fb02ee9 100644 --- a/dts/pyproject.toml +++ b/dts/pyproject.toml @@ -20,7 +20,6 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html" [tool.poetry.dependencies] python = "^3.10" -warlock = "^2.0.1" PyYAML = "^6.0" types-PyYAML = "^6.0.8" fabric = "^2.7.1" From patchwork Mon Oct 28 17:49:46 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 8bit X-Patchwork-Submitter: Luca Vizzarro X-Patchwork-Id: 147555 X-Patchwork-Delegate: paul.szczepanek@arm.com Return-Path: X-Original-To: patchwork@inbox.dpdk.org Delivered-To: patchwork@inbox.dpdk.org Received: from mails.dpdk.org (mails.dpdk.org [217.70.189.124]) by inbox.dpdk.org (Postfix) with ESMTP id 50D7D45BFF; Mon, 28 Oct 2024 18:51:55 +0100 (CET) Received: from mails.dpdk.org (localhost [127.0.0.1]) by mails.dpdk.org (Postfix) with ESMTP id 6248B42831; Mon, 28 Oct 2024 18:51:24 +0100 (CET) Received: from foss.arm.com (foss.arm.com [217.140.110.172]) by mails.dpdk.org (Postfix) with ESMTP id E9E37427CE for ; Mon, 28 Oct 2024 18:51:17 +0100 (CET) Received: from usa-sjc-imap-foss1.foss.arm.com (unknown [10.121.207.14]) by usa-sjc-mx-foss1.foss.arm.com (Postfix) with ESMTP id 3DFAD497; Mon, 28 Oct 2024 10:51:47 -0700 (PDT) Received: from localhost.localdomain (JR4XG4HTQC.cambridge.arm.com [10.1.31.47]) by 
usa-sjc-imap-foss1.foss.arm.com (Postfix) with ESMTPA id EBCA23F66E; Mon, 28 Oct 2024 10:51:16 -0700 (PDT) From: Luca Vizzarro To: dev@dpdk.org Cc: Paul Szczepanek , Patrick Robb , Luca Vizzarro Subject: [PATCH v4 6/8] dts: add autodoc pydantic Date: Mon, 28 Oct 2024 17:49:46 +0000 Message-ID: <20241028174949.3283701-7-luca.vizzarro@arm.com> X-Mailer: git-send-email 2.43.0 In-Reply-To: <20241028174949.3283701-1-luca.vizzarro@arm.com> References: <20240822163941.1390326-1-luca.vizzarro@arm.com> <20241028174949.3283701-1-luca.vizzarro@arm.com> MIME-Version: 1.0 X-BeenThere: dev@dpdk.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: DPDK patches and discussions List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: dev-bounces@dpdk.org Add and enable the autodoc-pydantic sphinx extension. Pydantic models are not correctly recognised by autodoc, causing the generated docs to lack all the actual model information. The autodoc-pydantic sphinx extension fixes the original behaviour by correctly formatting them. Signed-off-by: Luca Vizzarro Reviewed-by: Paul Szczepanek Reviewed-by: Nicholas Pratte --- doc/guides/conf.py | 13 +++ doc/guides/tools/dts.rst | 187 ++------------------------------------- dts/poetry.lock | 59 +++++++++++- dts/pyproject.toml | 1 + 4 files changed, 79 insertions(+), 181 deletions(-) diff --git a/doc/guides/conf.py b/doc/guides/conf.py index b553d9d5bf..71fed45b3d 100644 --- a/doc/guides/conf.py +++ b/doc/guides/conf.py @@ -60,6 +60,19 @@ # DTS API docs additional configuration if environ.get('DTS_DOC_BUILD'): extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx'] + + # Pydantic models require autodoc_pydantic for the right formatting + try: + import sphinxcontrib.autodoc_pydantic + + extensions.append("sphinxcontrib.autodoc_pydantic") + except ImportError: + print( + "The DTS API doc dependencies are missing. 
The generated output won't be " + "as intended, and autodoc may throw unexpected warnings.", + file=stderr, + ) + # Napoleon enables the Google format of Python docstrings. napoleon_numpy_docstring = False napoleon_attr_annotations = True diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst index c52de1808c..7ccca63ae8 100644 --- a/doc/guides/tools/dts.rst +++ b/doc/guides/tools/dts.rst @@ -204,9 +204,10 @@ node, and then run the tests with the newly built binaries. Configuring DTS ~~~~~~~~~~~~~~~ -DTS configuration is split into nodes and test runs and build targets within test runs, -and follows a defined schema as described in `Configuration Schema`_. -By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file `, +DTS configuration is split into nodes and test runs, and must respect the model definitions as +documented in the DTS API docs under the ``config`` page. The root of the configuration is +represented by the ``Configuration`` model. +By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file `, which is a template that illustrates what can be configured in DTS. The user must have :ref:`administrator privileges ` @@ -470,184 +471,10 @@ The output is generated in ``build/doc/api/dts/html``. Make sure to fix any Sphinx warnings when adding or updating docstrings. +.. _configuration_example: -Configuration Schema -------------------- - -Definitions -~~~~~~~~~~~ - -_`Node name` - *string* – A unique identifier for a node. - **Examples**: ``SUT1``, ``TG1``. - -_`ARCH` - *string* – The CPU architecture. - **Supported values**: ``x86_64``, ``arm64``, ``ppc64le``. - -_`CPU` - *string* – The CPU microarchitecture. Use ``native`` for x86. - **Supported values**: ``native``, ``armv8a``, ``dpaa2``, ``thunderx``, ``xgene1``. - -_`OS` - *string* – The operating system. **Supported values**: ``linux``. - -_`Compiler` - *string* – The compiler used for building DPDK.
- **Supported values**: ``gcc``, ``clang``, ``icc``, ``mscv``. - -_`Build target` - *mapping* – Build targets supported by DTS for building DPDK, described as: - - ==================== ================================================================= - ``arch`` See `ARCH`_ - ``os`` See `OS`_ - ``cpu`` See `CPU`_ - ``compiler`` See `Compiler`_ - ``compiler_wrapper`` *string* – Value prepended to the CC variable for the DPDK build. - - **Example**: ``ccache`` - ==================== ================================================================= - -_`hugepages_2mb` - *mapping* – hugepages_2mb described as: - - ==================== ================================================================ - ``number_of`` *integer* – The number of 2MB hugepages to configure. - - Hugepage size will be the system default. - ``force_first_numa`` (*optional*, defaults to ``false``) – If ``true``, it forces the - - configuration of hugepages on the first NUMA node. - ==================== ================================================================ - -_`Network port` - *mapping* – the NIC port described as: - - ====================== ================================================================================= - ``pci`` *string* – the local PCI address of the port. **Example**: ``0000:00:08.0`` - ``os_driver_for_dpdk`` | *string* – this port's device driver when using with DPDK - | When setting up the SUT, DTS will bind the network device to this driver - | for compatibility with DPDK. - - **Examples**: ``vfio-pci``, ``mlx5_core`` - ``os_driver`` | *string* – this port's device driver when **not** using with DPDK - | When tearing down the tests on the SUT, DTS will bind the network device - | *back* to this driver. This driver is meant to be the one that the SUT would - | normally use for this device, or whichever driver it is preferred to leave the - | device bound to after testing. 
- | This also represents the driver that is used in conjunction with the traffic - | generator software. - - **Examples**: ``i40e``, ``mlx5_core`` - ``peer_node`` *string* – the name of the peer node connected to this port. - ``peer_pci`` *string* – the PCI address of the peer node port. **Example**: ``000a:01:00.1`` - ====================== ================================================================================= - -_`Test suite` - *string* – name of the test suite to run. **Examples**: ``hello_world``, ``os_udp`` - -_`Test target` - *mapping* – selects specific test cases to run from a test suite. Mapping is described as follows: - - ========= =============================================================================================== - ``suite`` See `Test suite`_ - ``cases`` (*optional*) *sequence* of *string* – list of the selected test cases in the test suite to run. - - Unknown test cases will be silently ignored. - ========= =============================================================================================== - - -Properties -~~~~~~~~~~ - -The configuration requires listing all the test run environments and nodes -involved in the testing. These can be defined with the following mappings: - -``test runs`` - `sequence `_ listing - the test run environments. Each entry is described as per the following - `mapping `_: - - +----------------------------+-------------------------------------------------------------------+ - | ``build_targets`` | *sequence* of `Build target`_ | - +----------------------------+-------------------------------------------------------------------+ - | ``perf`` | *boolean* – Enable performance testing. | - +----------------------------+-------------------------------------------------------------------+ - | ``func`` | *boolean* – Enable functional testing. 
| - +----------------------------+-------------------------------------------------------------------+ - | ``test_suites`` | *sequence* of **one of** `Test suite`_ **or** `Test target`_ | - +----------------------------+-------------------------------------------------------------------+ - | ``skip_smoke_tests`` | (*optional*) *boolean* – Allows you to skip smoke testing | - | | if ``true``. | - +----------------------------+-------------------------------------------------------------------+ - | ``system_under_test_node`` | System under test node specified with: | - | +---------------+---------------------------------------------------+ - | | ``node_name`` | See `Node name`_ | - | +---------------+---------------------------------------------------+ - | | ``vdevs`` | (*optional*) *sequence* of *string* | - | | | | - | | | List of virtual devices passed with the ``--vdev``| - | | | argument to DPDK. **Example**: ``crypto_openssl`` | - +----------------------------+---------------+---------------------------------------------------+ - | ``traffic_generator_node`` | Node name for the traffic generator node. | - +----------------------------+-------------------------------------------------------------------+ - | ``random_seed`` | (*optional*) *int* – Set a seed for pseudo-random generation. | - +----------------------------+-------------------------------------------------------------------+ - -``nodes`` - `sequence `_ listing - the nodes. Each entry is described as per the following - `mapping `_: - - +-----------------------+---------------------------------------------------------------------------------------+ - | ``name`` | See `Node name`_ | - +-----------------------+---------------------------------------------------------------------------------------+ - | ``hostname`` | *string* – The network hostname or IP address of this node. 
| - +-----------------------+---------------------------------------------------------------------------------------+ - | ``user`` | *string* – The SSH user credential to use to login to this node. | - +-----------------------+---------------------------------------------------------------------------------------+ - | ``password`` | (*optional*) *string* – The SSH password credential for this node. | - | | | - | | **NB**: Use only as last resort. SSH keys are **strongly** preferred. | - +-----------------------+---------------------------------------------------------------------------------------+ - | ``arch`` | The architecture of this node. See `ARCH`_ for supported values. | - +-----------------------+---------------------------------------------------------------------------------------+ - | ``os`` | The operating system of this node. See `OS`_ for supported values. | - +-----------------------+---------------------------------------------------------------------------------------+ - | ``lcores`` | | (*optional*, defaults to 1) *string* – Comma-separated list of logical | - | | | cores to use. An empty string means use all lcores. | - | | | - | | **Example**: ``1,2,3,4,5,18-22`` | - +-----------------------+---------------------------------------------------------------------------------------+ - | ``use_first_core`` | (*optional*, defaults to ``false``) *boolean* | - | | | - | | Indicates whether DPDK should use only the first physical core or not. | - +-----------------------+---------------------------------------------------------------------------------------+ - | ``memory_channels`` | (*optional*, defaults to 1) *integer* | - | | | - | | The number of the memory channels to use. | - +-----------------------+---------------------------------------------------------------------------------------+ - | ``hugepages_2mb`` | (*optional*) See `hugepages_2mb`_. If unset, 2MB hugepages won't be configured | - | | | - | | in favour of the system configuration. 
| - +-----------------------+---------------------------------------------------------------------------------------+ - | ``ports`` | | *sequence* of `Network port`_ – Describe ports that are **directly** paired with | - | | | other nodes used in conjunction with this one. Both ends of the links must be | - | | | described. If there any inconsistencies DTS won't run. | - | | | - | | **Example**: port 1 of node ``SUT1`` is connected to port 1 of node ``TG1`` etc. | - +-----------------------+---------------------------------------------------------------------------------------+ - | ``traffic_generator`` | (*optional*) Traffic generator, if any, setup on this node described as: | - | +----------+----------------------------------------------------------------------------+ - | | ``type`` | *string* – **Supported values**: *SCAPY* | - +-----------------------+----------+----------------------------------------------------------------------------+ - - -.. _configuration_schema_example: - -Example -~~~~~~~ +Configuration Example +--------------------- The following example (which can be found in ``dts/conf.yaml``) sets up two nodes: diff --git a/dts/poetry.lock b/dts/poetry.lock index 9f7db60793..ee564676b4 100644 --- a/dts/poetry.lock +++ b/dts/poetry.lock @@ -34,6 +34,29 @@ files = [ {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"}, ] +[[package]] +name = "autodoc-pydantic" +version = "2.2.0" +description = "Seamlessly integrate pydantic models in your Sphinx documentation." 
+optional = false +python-versions = "<4.0.0,>=3.8.1" +files = [ + {file = "autodoc_pydantic-2.2.0-py3-none-any.whl", hash = "sha256:8c6a36fbf6ed2700ea9c6d21ea76ad541b621fbdf16b5a80ee04673548af4d95"}, +] + +[package.dependencies] +pydantic = ">=2.0,<3.0.0" +pydantic-settings = ">=2.0,<3.0.0" +Sphinx = ">=4.0" + +[package.extras] +docs = ["myst-parser (>=3.0.0,<4.0.0)", "sphinx-copybutton (>=0.5.0,<0.6.0)", "sphinx-rtd-theme (>=2.0.0,<3.0.0)", "sphinx-tabs (>=3,<4)", "sphinxcontrib-mermaid (>=0.9.0,<0.10.0)"] +erdantic = ["erdantic (<2.0)"] +linting = ["ruff (>=0.4.0,<0.5.0)"] +security = ["pip-audit (>=2.7.2,<3.0.0)"] +test = ["coverage (>=7,<8)", "defusedxml (>=0.7.1)", "pytest (>=8.0.0,<9.0.0)", "pytest-sugar (>=1.0.0,<2.0.0)"] +type-checking = ["mypy (>=1.9,<2.0)", "types-docutils (>=0.20,<0.21)", "typing-extensions (>=4.11,<5.0)"] + [[package]] name = "babel" version = "2.13.1" @@ -829,6 +852,26 @@ files = [ [package.dependencies] typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0" +[[package]] +name = "pydantic-settings" +version = "2.6.0" +description = "Settings management using Pydantic" +optional = false +python-versions = ">=3.8" +files = [ + {file = "pydantic_settings-2.6.0-py3-none-any.whl", hash = "sha256:4a819166f119b74d7f8c765196b165f95cc7487ce58ea27dec8a5a26be0970e0"}, + {file = "pydantic_settings-2.6.0.tar.gz", hash = "sha256:44a1804abffac9e6a30372bb45f6cafab945ef5af25e66b1c634c01dd39e0188"}, +] + +[package.dependencies] +pydantic = ">=2.7.0" +python-dotenv = ">=0.21.0" + +[package.extras] +azure-key-vault = ["azure-identity (>=1.16.0)", "azure-keyvault-secrets (>=4.8.0)"] +toml = ["tomli (>=2.0.1)"] +yaml = ["pyyaml (>=6.0.1)"] + [[package]] name = "pydocstyle" version = "6.1.1" @@ -935,6 +978,20 @@ cffi = ">=1.4.1" docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"] tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"] +[[package]] +name = "python-dotenv" +version = "1.0.1" +description = "Read key-value pairs from a .env file and set them as 
environment variables" +optional = false +python-versions = ">=3.8" +files = [ + {file = "python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca"}, + {file = "python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a"}, +] + +[package.extras] +cli = ["click (>=5.0)"] + [[package]] name = "pyyaml" version = "6.0.1" @@ -1304,4 +1361,4 @@ zstd = ["zstandard (>=0.18.0)"] [metadata] lock-version = "2.0" python-versions = "^3.10" -content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f" +content-hash = "fe9a9fdf7b43e8dce2fb5ee600921d4047fef2f4037a78bbd150f71df202493e" diff --git a/dts/pyproject.toml b/dts/pyproject.toml index 9a3fb02ee9..f69c70877a 100644 --- a/dts/pyproject.toml +++ b/dts/pyproject.toml @@ -44,6 +44,7 @@ optional = true sphinx = "<=7" sphinx-rtd-theme = ">=1.2.2" pyelftools = "^0.31" +autodoc-pydantic = "^2.2.0" [build-system] requires = ["poetry-core>=1.0.0"] From patchwork Mon Oct 28 17:49:47 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Luca Vizzarro X-Patchwork-Id: 147556 X-Patchwork-Delegate: paul.szczepanek@arm.com Return-Path: X-Original-To: patchwork@inbox.dpdk.org Delivered-To: patchwork@inbox.dpdk.org Received: from mails.dpdk.org (mails.dpdk.org [217.70.189.124]) by inbox.dpdk.org (Postfix) with ESMTP id 48BA745BFF; Mon, 28 Oct 2024 18:52:07 +0100 (CET) Received: from mails.dpdk.org (localhost [127.0.0.1]) by mails.dpdk.org (Postfix) with ESMTP id 9B2D742D76; Mon, 28 Oct 2024 18:51:30 +0100 (CET) Received: from foss.arm.com (foss.arm.com [217.140.110.172]) by mails.dpdk.org (Postfix) with ESMTP id DE315427CE for ; Mon, 28 Oct 2024 18:51:18 +0100 (CET) Received: from usa-sjc-imap-foss1.foss.arm.com (unknown [10.121.207.14]) by usa-sjc-mx-foss1.foss.arm.com (Postfix) with ESMTP id 0DAFD13D5; Mon, 28 Oct 2024 10:51:48 -0700 (PDT) 
Received: from localhost.localdomain (JR4XG4HTQC.cambridge.arm.com [10.1.31.47]) by usa-sjc-imap-foss1.foss.arm.com (Postfix) with ESMTPA id BBCD43F66E; Mon, 28 Oct 2024 10:51:17 -0700 (PDT) From: Luca Vizzarro To: dev@dpdk.org Cc: Paul Szczepanek , Patrick Robb , Luca Vizzarro Subject: [PATCH v4 7/8] dts: improve configuration API docs Date: Mon, 28 Oct 2024 17:49:47 +0000 Message-ID: <20241028174949.3283701-8-luca.vizzarro@arm.com> X-Mailer: git-send-email 2.43.0 In-Reply-To: <20241028174949.3283701-1-luca.vizzarro@arm.com> References: <20240822163941.1390326-1-luca.vizzarro@arm.com> <20241028174949.3283701-1-luca.vizzarro@arm.com> MIME-Version: 1.0 X-BeenThere: dev@dpdk.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: DPDK patches and discussions List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: dev-bounces@dpdk.org Pydantic models are not treated the same way as dataclasses by autodoc. As a consequence, the docstrings need to be applied directly to each field. Otherwise, the generated API documentation page would present two entries for each field, each with its own differences. Signed-off-by: Luca Vizzarro Reviewed-by: Paul Szczepanek Reviewed-by: Nicholas Pratte --- doc/guides/tools/dts.rst | 5 +- dts/framework/config/__init__.py | 253 +++++++++++-------------------- 2 files changed, 88 insertions(+), 170 deletions(-) diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst index 7ccca63ae8..ac12c5c4fa 100644 --- a/doc/guides/tools/dts.rst +++ b/doc/guides/tools/dts.rst @@ -1,5 +1,6 @@ .. SPDX-License-Identifier: BSD-3-Clause Copyright(c) 2022-2023 PANTHEON.tech s.r.o. + Copyright(c) 2024 Arm Limited DPDK Test Suite =============== @@ -327,8 +328,8 @@ where we deviate or where some additional clarification is helpful: * The ``dataclass.dataclass`` decorator changes how the attributes are processed.
The dataclass attributes which result in instance variables/attributes should also be recorded in the ``Attributes:`` section. - * Class variables/attributes, on the other hand, should be documented with ``#:`` - above the type annotated line. + * Class variables/attributes and Pydantic model fields, on the other hand, should be documented + with ``#:`` above the type annotated line. The description may be omitted if the meaning is obvious. * The ``Enum`` and ``TypedDict`` also process the attributes in particular ways and should be documented with ``#:`` as well. diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py index c86bfaaabf..d7d3907a33 100644 --- a/dts/framework/config/__init__.py +++ b/dts/framework/config/__init__.py @@ -116,54 +116,34 @@ class TrafficGeneratorType(str, Enum): class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"): - r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s. - - Attributes: - number_of: The number of hugepages to allocate. - force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node. - """ + r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.""" + #: The number of hugepages to allocate. number_of: int + #: If :data:`True`, the hugepages will be configured on the first NUMA node. force_first_numa: bool class PortConfig(BaseModel, frozen=True, extra="forbid"): - r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s. - - Attributes: - pci: The PCI address of the port. - os_driver_for_dpdk: The operating system driver name for use with DPDK. - os_driver: The operating system driver name when the operating system controls the port. - peer_node: The :class:`~framework.testbed_model.node.Node` of the port - connected to this port. - peer_pci: The PCI address of the port connected to this port. 
- """ + r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.""" - pci: str = Field( - description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS - ) - os_driver_for_dpdk: str = Field( - description="The driver that the kernel should bind this device to for DPDK to use it.", - examples=["vfio-pci", "mlx5_core"], - ) - os_driver: str = Field( - description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"] - ) - peer_node: str = Field(description="The name of the peer node this port is connected to.") - peer_pci: str = Field( - description="The PCI address of the peer port this port is connected to.", - pattern=REGEX_FOR_PCI_ADDRESS, - ) + #: The PCI address of the port. + pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS) + #: The driver that the kernel should bind this device to for DPDK to use it. + os_driver_for_dpdk: str = Field(examples=["vfio-pci", "mlx5_core"]) + #: The operating system driver name when the operating system controls the port. + os_driver: str = Field(examples=["i40e", "ice", "mlx5_core"]) + #: The name of the peer node this port is connected to. + peer_node: str + #: The PCI address of the peer port connected to this port. + peer_pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS) class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"): - """A protocol required to define traffic generator types. - - Attributes: - type: The traffic generator type, the child class is required to define to be distinguished - among others. - """ + """A protocol required to define traffic generator types.""" + #: The traffic generator type the child class is required to define to be distinguished among + #: others. type: TrafficGeneratorType @@ -176,13 +156,10 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="fo #: A union type discriminating traffic generators by the `type` field. 
TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")] - -#: A field representing logical core ranges. +#: Comma-separated list of logical cores to use. An empty string means use all lcores. LogicalCores = Annotated[ str, Field( - description="Comma-separated list of logical cores to use. " - "An empty string means use all lcores.", examples=["1,2,3,4,5,18-22", "10-15"], pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$", ), @@ -190,61 +167,41 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="fo class NodeConfiguration(BaseModel, frozen=True, extra="forbid"): - r"""The configuration of :class:`~framework.testbed_model.node.Node`\s. - - Attributes: - name: The name of the :class:`~framework.testbed_model.node.Node`. - hostname: The hostname of the :class:`~framework.testbed_model.node.Node`. - Can be an IP or a domain name. - user: The name of the user used to connect to - the :class:`~framework.testbed_model.node.Node`. - password: The password of the user. The use of passwords is heavily discouraged. - Please use keys instead. - arch: The architecture of the :class:`~framework.testbed_model.node.Node`. - os: The operating system of the :class:`~framework.testbed_model.node.Node`. - lcores: A comma delimited list of logical cores to use when running DPDK. - use_first_core: If :data:`True`, the first logical core won't be used. - hugepages: An optional hugepage configuration. - ports: The ports that can be used in testing. - """ - - name: str = Field(description="A unique identifier for this node.") - hostname: str = Field(description="The hostname or IP address of the node.") - user: str = Field(description="The login user to use to connect to this node.") - password: str | None = Field( - default=None, - description="The login password to use to connect to this node. 
" - "SSH keys are STRONGLY preferred, use only as last resort.", - ) + r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.""" + + #: The name of the :class:`~framework.testbed_model.node.Node`. + name: str + #: The hostname of the :class:`~framework.testbed_model.node.Node`. Can also be an IP address. + hostname: str + #: The name of the user used to connect to the :class:`~framework.testbed_model.node.Node`. + user: str + #: The password of the user. The use of passwords is heavily discouraged, please use SSH keys. + password: str | None = None + #: The architecture of the :class:`~framework.testbed_model.node.Node`. arch: Architecture + #: The operating system of the :class:`~framework.testbed_model.node.Node`. os: OS + #: A comma delimited list of logical cores to use when running DPDK. lcores: LogicalCores = "1" - use_first_core: bool = Field( - default=False, description="DPDK won't use the first physical core if set to False." - ) + #: If :data:`True`, the first logical core won't be used. + use_first_core: bool = False + #: An optional hugepage configuration. hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb") + #: The ports that can be used in testing. ports: list[PortConfig] = Field(min_length=1) class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"): - """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration. + """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.""" - Attributes: - memory_channels: The number of memory channels to use when running DPDK. - """ - - memory_channels: int = Field( - default=1, description="Number of memory channels to use when running DPDK." - ) + #: The number of memory channels to use when running DPDK. + memory_channels: int = 1 class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"): - """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration. 
- - Attributes: - traffic_generator: The configuration of the traffic generator present on the TG node. - """ + """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.""" + #: The configuration of the traffic generator present on the TG node. traffic_generator: TrafficGeneratorConfigTypes @@ -258,20 +215,18 @@ def resolve_path(path: Path) -> Path: class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"): - """DPDK location. + """DPDK location base class. - The path to the DPDK sources, build dir and type of location. - - Attributes: - remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is - located on the SUT node, instead of the execution host. + The path to the DPDK sources and type of location. """ + #: Specifies whether to find DPDK on the SUT node or on the local host. These are respectively + #: represented by :class:`RemoteDPDKLocation` and :class:`LocalDPDKTreeLocation`. remote: bool = False class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"): - """Local DPDK location parent class. + """Local DPDK location base class. This class is meant to represent any location that is present only locally. """ @@ -284,14 +239,12 @@ class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"): This class makes a distinction from :class:`RemoteDPDKTreeLocation` by enforcing on the fly validation. - - Attributes: - dpdk_tree: The path to the DPDK source tree directory. """ + #: The path to the DPDK source tree directory on the local host passed as string. dpdk_tree: Path - #: Resolve the local DPDK tree path + #: Resolve the local DPDK tree path. resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path) @model_validator(mode="after") @@ -307,14 +260,12 @@ class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"): This class makes a distinction from :class:`RemoteDPDKTarballLocation` by enforcing on the fly validation.
- - Attributes: - tarball: The path to the DPDK tarball. """ + #: The path to the DPDK tarball on the local host passed as string. tarball: Path - #: Resolve the local tarball path + #: Resolve the local tarball path. resolve_tarball_path = field_validator("tarball")(resolve_path) @model_validator(mode="after") @@ -326,7 +277,7 @@ def validate_tarball_path(self) -> Self: class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"): - """Remote DPDK location parent class. + """Remote DPDK location base class. This class is meant to represent any location that is present only remotely. """ @@ -338,11 +289,9 @@ class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"): """Remote DPDK tree location. This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on the fly validation. - - Attributes: - dpdk_tree: The path to the DPDK source tree directory. """ + #: The path to the DPDK source tree directory on the remote node passed as string. dpdk_tree: PurePath @@ -351,11 +300,9 @@ class RemoteDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"): This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on the fly validation. - - Attributes: - tarball: The path to the DPDK tarball. """ + #: The path to the DPDK tarball on the remote node passed as string. tarball: PurePath @@ -372,23 +319,17 @@ class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"): """The base configuration for different types of build. The configuration contain the location of the DPDK and configuration used for building it. - - Attributes: - dpdk_location: The location of the DPDK tree. """ + #: The location of the DPDK tree. dpdk_location: DPDKLocation class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"): - """DPDK precompiled build configuration. 
- - Attributes: - precompiled_build_dir: If it's defined, DPDK has been pre-compiled and the build directory - is located in a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will - be using `dpdk_build_config` from configuration to build the DPDK from source. - """ + """DPDK precompiled build configuration.""" + #: If it's defined, DPDK has been pre-compiled and the build directory is located in a + #: subdirectory of `~dpdk_location.dpdk_tree` or `~dpdk_location.tarball` root directory. precompiled_build_dir: str = Field(min_length=1) @@ -396,20 +337,18 @@ class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"): """DPDK build options configuration. The build options used for building DPDK. - - Attributes: - arch: The target architecture to build for. - os: The target os to build for. - cpu: The target CPU to build for. - compiler: The compiler executable to use. - compiler_wrapper: This string will be put in front of the compiler when executing the build. - Useful for adding wrapper commands, such as ``ccache``. """ + #: The target architecture to build for. arch: Architecture + #: The target OS to build for. os: OS + #: The target CPU to build for. cpu: CPUType + #: The compiler executable to use. compiler: Compiler + #: This string will be put in front of the compiler when executing the build. Useful for adding + #: wrapper commands, such as ``ccache``. compiler_wrapper: str = "" @cached_property @@ -419,12 +358,9 @@ def name(self) -> str: class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"): - """DPDK uncompiled build configuration. - - Attributes: - build_options: The build options to compile DPDK. - """ + """DPDK uncompiled build configuration.""" + #: The build options to compile DPDK with.
build_options: DPDKBuildOptionsConfiguration @@ -448,24 +384,13 @@ class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"): # or as model fields: - test_suite: hello_world test_cases: [hello_world_single_core] # without this field all test cases are run - - Attributes: - test_suite_name: The name of the test suite module without the starting ``TestSuite_``. - test_cases_names: The names of test cases from this test suite to execute. - If empty, all test cases will be executed. """ - test_suite_name: str = Field( - title="Test suite name", - description="The identifying module name of the test suite without the prefix.", - alias="test_suite", - ) - test_cases_names: list[str] = Field( - default_factory=list, - title="Test cases by name", - description="The identifying name of the test cases of the test suite.", - alias="test_cases", - ) + #: The name of the test suite module without the starting ``TestSuite_``. + test_suite_name: str = Field(alias="test_suite") + #: The names of test cases from this test suite to execute. If empty, all test cases will be + #: executed. + test_cases_names: list[str] = Field(default_factory=list, alias="test_cases") @cached_property def test_suite_spec(self) -> "TestSuiteSpec": @@ -507,14 +432,11 @@ def validate_names(self) -> Self: class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"): - """The SUT node configuration of a test run. - - Attributes: - node_name: The SUT node to use in this test run. - vdevs: The names of virtual devices to test. - """ + """The SUT node configuration of a test run.""" + #: The SUT node to use in this test run. node_name: str + #: The names of virtual devices to test. vdevs: list[str] = Field(default_factory=list) @@ -523,25 +445,23 @@ class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"): The configuration contains testbed information, what tests to execute and with what DPDK build. - - Attributes: - dpdk_config: The DPDK configuration used to test. 
- perf: Whether to run performance tests. - func: Whether to run functional tests. - skip_smoke_tests: Whether to skip smoke tests. - test_suites: The names of test suites and/or test cases to execute. - system_under_test_node: The SUT node configuration to use in this test run. - traffic_generator_node: The TG node name to use in this test run. - random_seed: The seed to use for pseudo-random generation. """ + #: The DPDK configuration used to test. dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build") - perf: bool = Field(description="Enable performance testing.") - func: bool = Field(description="Enable functional testing.") + #: Whether to run performance tests. + perf: bool + #: Whether to run functional tests. + func: bool + #: Whether to skip smoke tests. skip_smoke_tests: bool = False + #: The names of test suites and/or test cases to execute. test_suites: list[TestSuiteConfig] = Field(min_length=1) + #: The SUT node configuration to use in this test run. system_under_test_node: TestRunSUTNodeConfiguration + #: The TG node name to use in this test run. traffic_generator_node: str + #: The seed to use for pseudo-random generation. random_seed: int | None = None @@ -557,14 +477,11 @@ class TestRunWithNodesConfiguration(NamedTuple): class Configuration(BaseModel, extra="forbid"): - """DTS testbed and test configuration. - - Attributes: - test_runs: Test run configurations. - nodes: Node configurations. - """ + """DTS testbed and test configuration.""" + #: Test run configurations. test_runs: list[TestRunConfiguration] = Field(min_length=1) + #: Node configurations. 
nodes: list[NodeConfigurationTypes] = Field(min_length=1) @cached_property From patchwork Mon Oct 28 17:49:48 2024 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Luca Vizzarro X-Patchwork-Id: 147557 X-Patchwork-Delegate: paul.szczepanek@arm.com Return-Path: X-Original-To: patchwork@inbox.dpdk.org Delivered-To: patchwork@inbox.dpdk.org Received: from mails.dpdk.org (mails.dpdk.org [217.70.189.124]) by inbox.dpdk.org (Postfix) with ESMTP id 1A0C545BFF; Mon, 28 Oct 2024 18:52:13 +0100 (CET) Received: from mails.dpdk.org (localhost [127.0.0.1]) by mails.dpdk.org (Postfix) with ESMTP id ACB5742D83; Mon, 28 Oct 2024 18:51:31 +0100 (CET) Received: from foss.arm.com (foss.arm.com [217.140.110.172]) by mails.dpdk.org (Postfix) with ESMTP id AAB434279F for ; Mon, 28 Oct 2024 18:51:19 +0100 (CET) Received: from usa-sjc-imap-foss1.foss.arm.com (unknown [10.121.207.14]) by usa-sjc-mx-foss1.foss.arm.com (Postfix) with ESMTP id CE0E316F2; Mon, 28 Oct 2024 10:51:48 -0700 (PDT) Received: from localhost.localdomain (JR4XG4HTQC.cambridge.arm.com [10.1.31.47]) by usa-sjc-imap-foss1.foss.arm.com (Postfix) with ESMTPA id 8AC6F3F66E; Mon, 28 Oct 2024 10:51:18 -0700 (PDT) From: Luca Vizzarro To: dev@dpdk.org Cc: Paul Szczepanek , Patrick Robb , Luca Vizzarro Subject: [PATCH v4 8/8] dts: use TestSuiteSpec class imports Date: Mon, 28 Oct 2024 17:49:48 +0000 Message-ID: <20241028174949.3283701-9-luca.vizzarro@arm.com> X-Mailer: git-send-email 2.43.0 In-Reply-To: <20241028174949.3283701-1-luca.vizzarro@arm.com> References: <20240822163941.1390326-1-luca.vizzarro@arm.com> <20241028174949.3283701-1-luca.vizzarro@arm.com> MIME-Version: 1.0 X-BeenThere: dev@dpdk.org X-Mailman-Version: 2.1.29 Precedence: list List-Id: DPDK patches and discussions List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Errors-To: dev-bounces@dpdk.org The introduction of TestSuiteSpec adds auto-discovery of test suites, which are 
also automatically imported. This causes double imports as the runner loads the
test suites. This changes the behaviour of the runner to load the imported
classes from TestSuiteSpec instead of importing them again.

Signed-off-by: Luca Vizzarro
Reviewed-by: Paul Szczepanek
Reviewed-by: Nicholas Pratte
---
 dts/framework/runner.py | 84 ++++------------------------------------
 1 file changed, 7 insertions(+), 77 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index c3d9a27a8c..5f5837a132 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -2,6 +2,7 @@
 # Copyright(c) 2010-2019 Intel Corporation
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
+# Copyright(c) 2024 Arm Limited

 """Test suite runner module.

@@ -17,8 +18,6 @@
 and the test case stage runs test cases individually.
 """

-import importlib
-import inspect
 import os
 import random
 import sys
@@ -39,12 +38,7 @@
     TGNodeConfiguration,
     load_config,
 )
-from .exception import (
-    BlockingTestSuiteError,
-    ConfigurationError,
-    SSHTimeoutError,
-    TestCaseVerifyError,
-)
+from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
@@ -215,11 +209,10 @@ def _get_test_suites_with_cases(
         func: bool,
         perf: bool,
     ) -> list[TestSuiteWithCases]:
-        """Test suites with test cases discovery.
+        """Get test suites with selected cases.

-        The test suites with test cases defined in the user configuration are discovered
-        and stored for future use so that we don't import the modules twice and so that
-        the list of test suites with test cases is available for recording right away.
+        The test suites with test cases defined in the user configuration are selected
+        and the corresponding functions and classes are gathered.

         Args:
             test_suite_configs: Test suite configurations.
@@ -227,12 +220,12 @@ def _get_test_suites_with_cases(
             perf: Whether to include performance test cases in the final list.

         Returns:
-            The discovered test suites, each with test cases.
+            The test suites, each with test cases.
         """
         test_suites_with_cases = []

         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
+            test_suite_class = test_suite_config.test_suite_spec.class_obj
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases_names
@@ -245,71 +238,8 @@ def _get_test_suites_with_cases(
             test_suites_with_cases.append(
                 TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
             )
-
         return test_suites_with_cases

-    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
-        """Find the :class:`TestSuite` class in `module_name`.
-
-        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
-        The module name is a standard filename with words separated with underscores.
-        Search the `module_name` for a :class:`TestSuite` class which starts
-        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
-        The first matching class is returned.
-
-        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
-
-            OS -> Os
-            TCP -> Tcp
-
-        Args:
-            module_name: The module name without prefix where to search for the test suite.
-
-        Returns:
-            The found test suite class.
-
-        Raises:
-            ConfigurationError: If the corresponding module is not found or
-                a valid :class:`TestSuite` is not found in the module.
-        """
-
-        def is_test_suite(object) -> bool:
-            """Check whether `object` is a :class:`TestSuite`.
-
-            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
-
-            Args:
-                object: The object to be checked.
-
-            Returns:
-                :data:`True` if `object` is a subclass of `TestSuite`.
- """ - try: - if issubclass(object, TestSuite) and object is not TestSuite: - return True - except TypeError: - return False - return False - - testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}" - try: - test_suite_module = importlib.import_module(testsuite_module_path) - except ModuleNotFoundError as e: - raise ConfigurationError( - f"Test suite module '{testsuite_module_path}' not found." - ) from e - - camel_case_suite_name = "".join( - [suite_word.capitalize() for suite_word in module_name.split("_")] - ) - full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}" - for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite): - if class_name == full_suite_name_to_find: - return class_obj - raise ConfigurationError( - f"Couldn't find any valid test suites in {test_suite_module.__name__}." - ) - def _connect_nodes_and_run_test_run( self, sut_nodes: dict[str, SutNode],