[v4,4/8] dts: use pydantic in the configuration

Message ID 20241028174949.3283701-5-luca.vizzarro@arm.com (mailing list archive)
State New
Delegated to: Paul Szczepanek
Series: dts: Pydantic configuration

Checks

Context Check Description
ci/checkpatch success coding style OK

Commit Message

Luca Vizzarro Oct. 28, 2024, 5:49 p.m. UTC
This change brings in pydantic in place of warlock. Pydantic offers
a built-in model validation system in the classes, which allows for
more resilient and simpler code. As a consequence of this change:

- most validation is now built-in
- further validation is added to verify:
  - cross-referencing of node names and ports
  - test suite and test case names
- dictionaries representing the config schema are removed
- the config schema is no longer used and therefore dropped
- the TrafficGeneratorType enum now inherits from the native str and
  Enum instead of StrEnum. This change was necessary to enable the
  discriminator for object unions (see the first sketch below)
- the structure of the classes has been slightly changed to perfectly
  match the structure of the configuration files
- the test suite argument handling catches the ValidationError that
  TestSuiteConfig can now raise
- the DPDK location has been wrapped under another configuration
  mapping `dpdk_location`
- the DPDK locations are now structured and enforced by classes,
  further simplifying validation and handling thanks to pattern
  matching (see the second sketch below)
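
For illustration, a minimal sketch of the discriminated-union technique
(assuming pydantic v2; the OTHER generator type is hypothetical, added
only to make the union visible):

    from enum import Enum
    from typing import Annotated, Literal, Union

    from pydantic import BaseModel, Field, TypeAdapter

    class TrafficGeneratorType(str, Enum):
        # (str, Enum) rather than StrEnum, so that members can be used
        # as Literal[...] discriminator values
        SCAPY = "SCAPY"
        OTHER = "OTHER"  # hypothetical, for demonstration only

    class ScapyConfig(BaseModel, frozen=True):
        type: Literal[TrafficGeneratorType.SCAPY]

    class OtherConfig(BaseModel, frozen=True):
        type: Literal[TrafficGeneratorType.OTHER]

    TGConfig = Annotated[
        Union[ScapyConfig, OtherConfig], Field(discriminator="type")
    ]

    # the `type` key alone selects which model validates the mapping
    tg = TypeAdapter(TGConfig).validate_python({"type": "SCAPY"})
    assert isinstance(tg, ScapyConfig)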
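
Second, a sketch of the dispatch that the structured location classes
enable via pattern matching (class names as introduced by this patch;
the function and its messages are hypothetical):

    from framework.config import (
        LocalDPDKTarballLocation,
        LocalDPDKTreeLocation,
        RemoteDPDKTarballLocation,
        RemoteDPDKTreeLocation,
    )

    def describe(location) -> str:
        # class patterns match on type and capture the relevant field
        match location:
            case LocalDPDKTreeLocation(dpdk_tree=path):
                return f"copy local source tree {path} to the SUT"
            case LocalDPDKTarballLocation(tarball=path):
                return f"copy local tarball {path} to the SUT"
            case RemoteDPDKTreeLocation(dpdk_tree=path):
                return f"use source tree {path} already on the SUT"
            case RemoteDPDKTarballLocation(tarball=path):
                return f"extract tarball {path} already on the SUT"
            case _:
                raise ValueError(f"unsupported location: {location!r}")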

Bugzilla ID: 1508

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 822 +++++++++---------
 dts/framework/config/conf_yaml_schema.json    | 459 ----------
 dts/framework/config/types.py                 | 149 ----
 dts/framework/runner.py                       |  57 +-
 dts/framework/settings.py                     | 124 +--
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |   4 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |  10 +-
 dts/framework/testbed_model/sut_node.py       | 182 ++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 19 files changed, 653 insertions(+), 1220 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py
  

Comments

Nicholas Pratte Oct. 31, 2024, 8:20 p.m. UTC | #1
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
> diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
> deleted file mode 120000
> index 5978642d76..0000000000
> --- a/doc/api/dts/conf_yaml_schema.json
> +++ /dev/null
> @@ -1 +0,0 @@
> -../../../dts/framework/config/conf_yaml_schema.json
> \ No newline at end of file
> diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
> index 261997aefa..cc266276c1 100644
> --- a/doc/api/dts/framework.config.rst
> +++ b/doc/api/dts/framework.config.rst
> @@ -6,9 +6,3 @@ config - Configuration Package
>  .. automodule:: framework.config
>     :members:
>     :show-inheritance:
> -
> -.. toctree::
> -   :hidden:
> -   :maxdepth: 1
> -
> -   framework.config.types
> diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
> deleted file mode 100644
> index a50a0c874a..0000000000
> --- a/doc/api/dts/framework.config.types.rst
> +++ /dev/null
> @@ -1,8 +0,0 @@
> -.. SPDX-License-Identifier: BSD-3-Clause
> -
> -config.types - Configuration Types
> -==================================
> -
> -.. automodule:: framework.config.types
> -   :members:
> -   :show-inheritance:
> diff --git a/dts/conf.yaml b/dts/conf.yaml
> index 8a65a481d6..2496262854 100644
> --- a/dts/conf.yaml
> +++ b/dts/conf.yaml
> @@ -5,11 +5,12 @@
>  test_runs:
>    # define one test run environment
>    - dpdk_build:
> -      # dpdk_tree: Commented out because `tarball` is defined.
> -      tarball: dpdk-tarball.tar.xz
> -      # Either `dpdk_tree` or `tarball` can be defined, but not both.
> -      remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
> -                    # is located on the SUT node, instead of the execution host.
> +      dpdk_location:
> +        # dpdk_tree: Commented out because `tarball` is defined.
> +        tarball: dpdk-tarball.tar.xz
> +        # Either `dpdk_tree` or `tarball` can be defined, but not both.
> +        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
> +                      # is located on the SUT node, instead of the execution host.
>
>        # precompiled_build_dir: Commented out because `build_options` is defined.
>        build_options:
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index 7403ccbf14..c86bfaaabf 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -2,17 +2,18 @@
>  # Copyright(c) 2010-2021 Intel Corporation
>  # Copyright(c) 2022-2023 University of New Hampshire
>  # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2024 Arm Limited
>
>  """Testbed configuration and test suite specification.
>
>  This package offers classes that hold real-time information about the testbed, hold test run
>  configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
> -the YAML test run configuration file
> -and validates it according to :download:`the schema <conf_yaml_schema.json>`.
> +the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
> +model.
>
>  The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
> -this package. The allowed keys and types inside this dictionary are defined in
> -the :doc:`types <framework.config.types>` module.
> +this package. The allowed keys and types inside this dictionary map directly to the
> +:class:`Configuration` model, its fields and sub-models.
>
>  The test run configuration has two main sections:
>
> @@ -24,39 +25,28 @@
>
>  The real-time information about testbed is supposed to be gathered at runtime.
>
> -The classes defined in this package make heavy use of :mod:`dataclasses`.
> -All of them use slots and are frozen:
> +The classes defined in this package make heavy use of :mod:`pydantic`.
> +Nearly all of them are frozen:
>
> -    * Slots enables some optimizations, by pre-allocating space for the defined
> -      attributes in the underlying data structure,
>      * Frozen makes the object immutable. This enables further optimizations,
>        and makes it thread safe should we ever want to move in that direction.
>  """
>
> -import json
> -import os.path
>  import tarfile
> -from dataclasses import dataclass, fields
> -from enum import auto, unique
> -from pathlib import Path
> -from typing import Union
> +from enum import Enum, auto, unique
> +from functools import cached_property
> +from pathlib import Path, PurePath
> +from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple
>
> -import warlock  # type: ignore[import-untyped]
>  import yaml
> +from pydantic import BaseModel, Field, ValidationError, field_validator, model_validator
>  from typing_extensions import Self
>
> -from framework.config.types import (
> -    ConfigurationDict,
> -    DPDKBuildConfigDict,
> -    DPDKConfigurationDict,
> -    NodeConfigDict,
> -    PortConfigDict,
> -    TestRunConfigDict,
> -    TestSuiteConfigDict,
> -    TrafficGeneratorConfigDict,
> -)
>  from framework.exception import ConfigurationError
> -from framework.utils import StrEnum
> +from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
> +
> +if TYPE_CHECKING:
> +    from framework.test_suite import TestSuiteSpec
>
>
>  @unique
> @@ -118,15 +108,14 @@ class Compiler(StrEnum):
>
>
>  @unique
> -class TrafficGeneratorType(StrEnum):
> +class TrafficGeneratorType(str, Enum):
>      """The supported traffic generators."""
>
>      #:
> -    SCAPY = auto()
> +    SCAPY = "SCAPY"
>
>
> -@dataclass(slots=True, frozen=True)
> -class HugepageConfiguration:
> +class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
>      r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
>
>      Attributes:
> @@ -138,12 +127,10 @@ class HugepageConfiguration:
>      force_first_numa: bool
>
>
> -@dataclass(slots=True, frozen=True)
> -class PortConfig:
> +class PortConfig(BaseModel, frozen=True, extra="forbid"):
>      r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
>
>      Attributes:
> -        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
>          pci: The PCI address of the port.
>          os_driver_for_dpdk: The operating system driver name for use with DPDK.
>          os_driver: The operating system driver name when the operating system controls the port.
> @@ -152,70 +139,57 @@ class PortConfig:
>          peer_pci: The PCI address of the port connected to this port.
>      """
>
> -    node: str
> -    pci: str
> -    os_driver_for_dpdk: str
> -    os_driver: str
> -    peer_node: str
> -    peer_pci: str
> -
> -    @classmethod
> -    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
> -        """A convenience method that creates the object from fewer inputs.
> -
> -        Args:
> -            node: The node where this port exists.
> -            d: The configuration dictionary.
> -
> -        Returns:
> -            The port configuration instance.
> -        """
> -        return cls(node=node, **d)
> -
> -
> -@dataclass(slots=True, frozen=True)
> -class TrafficGeneratorConfig:
> -    """The configuration of traffic generators.
> -
> -    The class will be expanded when more configuration is needed.
> +    pci: str = Field(
> +        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
> +    )
> +    os_driver_for_dpdk: str = Field(
> +        description="The driver that the kernel should bind this device to for DPDK to use it.",
> +        examples=["vfio-pci", "mlx5_core"],
> +    )
> +    os_driver: str = Field(
> +        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
> +    )
> +    peer_node: str = Field(description="The name of the peer node this port is connected to.")
> +    peer_pci: str = Field(
> +        description="The PCI address of the peer port this port is connected to.",
> +        pattern=REGEX_FOR_PCI_ADDRESS,
> +    )
> +
> +
> +class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"):
> +    """A protocol required to define traffic generator types.
>
>      Attributes:
> -        traffic_generator_type: The type of the traffic generator.
> +        type: The traffic generator type. Child classes are required to define it,
> +            so that they can be distinguished from one another.
>      """
>
> -    traffic_generator_type: TrafficGeneratorType
> +    type: TrafficGeneratorType
>
> -    @staticmethod
> -    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
> -        """A convenience method that produces traffic generator config of the proper type.
>
> -        Args:
> -            d: The configuration dictionary.
> +class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="forbid"):
> +    """Scapy traffic generator specific configuration."""
>
> -        Returns:
> -            The traffic generator configuration instance.
> +    type: Literal[TrafficGeneratorType.SCAPY]
>
> -        Raises:
> -            ConfigurationError: An unknown traffic generator type was encountered.
> -        """
> -        match TrafficGeneratorType(d["type"]):
> -            case TrafficGeneratorType.SCAPY:
> -                return ScapyTrafficGeneratorConfig(
> -                    traffic_generator_type=TrafficGeneratorType.SCAPY
> -                )
> -            case _:
> -                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
>
> +#: A union type discriminating traffic generators by the `type` field.
> +TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
>
> -@dataclass(slots=True, frozen=True)
> -class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
> -    """Scapy traffic generator specific configuration."""
>
> -    pass
> +#: A field representing logical core ranges.
> +LogicalCores = Annotated[
> +    str,
> +    Field(
> +        description="Comma-separated list of logical cores to use. "
> +        "An empty string means use all lcores.",
> +        examples=["1,2,3,4,5,18-22", "10-15"],
> +        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
> +    ),
> +]
>
>
> -@dataclass(slots=True, frozen=True)
> -class NodeConfiguration:
> +class NodeConfiguration(BaseModel, frozen=True, extra="forbid"):
>      r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
>
>      Attributes:
> @@ -234,285 +208,317 @@ class NodeConfiguration:
>          ports: The ports that can be used in testing.
>      """
>
> -    name: str
> -    hostname: str
> -    user: str
> -    password: str | None
> +    name: str = Field(description="A unique identifier for this node.")
> +    hostname: str = Field(description="The hostname or IP address of the node.")
> +    user: str = Field(description="The login user to use to connect to this node.")
> +    password: str | None = Field(
> +        default=None,
> +        description="The login password to use to connect to this node. "
> +        "SSH keys are STRONGLY preferred, use only as last resort.",
> +    )
>      arch: Architecture
>      os: OS
> -    lcores: str
> -    use_first_core: bool
> -    hugepages: HugepageConfiguration | None
> -    ports: list[PortConfig]
> -
> -    @staticmethod
> -    def from_dict(
> -        d: NodeConfigDict,
> -    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
> -        """A convenience method that processes the inputs before creating a specialized instance.
> -
> -        Args:
> -            d: The configuration dictionary.
> -
> -        Returns:
> -            Either an SUT or TG configuration instance.
> -        """
> -        hugepage_config = None
> -        if "hugepages_2mb" in d:
> -            hugepage_config_dict = d["hugepages_2mb"]
> -            if "force_first_numa" not in hugepage_config_dict:
> -                hugepage_config_dict["force_first_numa"] = False
> -            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
> -
> -        # The calls here contain duplicated code which is here because Mypy doesn't
> -        # properly support dictionary unpacking with TypedDicts
> -        if "traffic_generator" in d:
> -            return TGNodeConfiguration(
> -                name=d["name"],
> -                hostname=d["hostname"],
> -                user=d["user"],
> -                password=d.get("password"),
> -                arch=Architecture(d["arch"]),
> -                os=OS(d["os"]),
> -                lcores=d.get("lcores", "1"),
> -                use_first_core=d.get("use_first_core", False),
> -                hugepages=hugepage_config,
> -                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
> -                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
> -            )
> -        else:
> -            return SutNodeConfiguration(
> -                name=d["name"],
> -                hostname=d["hostname"],
> -                user=d["user"],
> -                password=d.get("password"),
> -                arch=Architecture(d["arch"]),
> -                os=OS(d["os"]),
> -                lcores=d.get("lcores", "1"),
> -                use_first_core=d.get("use_first_core", False),
> -                hugepages=hugepage_config,
> -                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
> -                memory_channels=d.get("memory_channels", 1),
> -            )
> +    lcores: LogicalCores = "1"
> +    use_first_core: bool = Field(
> +        default=False, description="DPDK won't use the first physical core if set to False."
> +    )
> +    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
> +    ports: list[PortConfig] = Field(min_length=1)
>
>
> -@dataclass(slots=True, frozen=True)
> -class SutNodeConfiguration(NodeConfiguration):
> +class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
>      """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
>
>      Attributes:
>          memory_channels: The number of memory channels to use when running DPDK.
>      """
>
> -    memory_channels: int
> +    memory_channels: int = Field(
> +        default=1, description="Number of memory channels to use when running DPDK."
> +    )
>
>
> -@dataclass(slots=True, frozen=True)
> -class TGNodeConfiguration(NodeConfiguration):
> +class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
>      """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
>
>      Attributes:
>          traffic_generator: The configuration of the traffic generator present on the TG node.
>      """
>
> -    traffic_generator: TrafficGeneratorConfig
> +    traffic_generator: TrafficGeneratorConfigTypes
> +
> +
> +#: Union type for all the node configuration types.
> +NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
>
>
> -@dataclass(slots=True, frozen=True)
> -class DPDKBuildConfiguration:
> -    """DPDK build configuration.
> +def resolve_path(path: Path) -> Path:
> +    """Resolve a path into a real path."""
> +    return path.resolve()
>
> -    The configuration used for building DPDK.
> +
> +class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"):
> +    """DPDK location.
> +
> +    The path to the DPDK sources and the type of location.
>
>      Attributes:
> -        arch: The target architecture to build for.
> -        os: The target os to build for.
> -        cpu: The target CPU to build for.
> -        compiler: The compiler executable to use.
> -        compiler_wrapper: This string will be put in front of the compiler when
> -            executing the build. Useful for adding wrapper commands, such as ``ccache``.
> -        name: The name of the compiler.
> +        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
> +            located on the SUT node, instead of the execution host.
>      """
>
> -    arch: Architecture
> -    os: OS
> -    cpu: CPUType
> -    compiler: Compiler
> -    compiler_wrapper: str
> -    name: str
> +    remote: bool = False
>
> -    @classmethod
> -    def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
> -        r"""A convenience method that processes the inputs before creating an instance.
>
> -        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
> -        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
> +class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
> +    """Local DPDK location parent class.
>
> -        Args:
> -            d: The configuration dictionary.
> +    This class is meant to represent any location that is present only locally.
> +    """
>
> -        Returns:
> -            The DPDK build configuration instance.
> -        """
> -        return cls(
> -            arch=Architecture(d["arch"]),
> -            os=OS(d["os"]),
> -            cpu=CPUType(d["cpu"]),
> -            compiler=Compiler(d["compiler"]),
> -            compiler_wrapper=d.get("compiler_wrapper", ""),
> -            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
> -        )
> +    remote: Literal[False] = False
>
>
> -@dataclass(slots=True, frozen=True)
> -class DPDKLocation:
> -    """DPDK location.
> +class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
> +    """Local DPDK tree location.
>
> -    The path to the DPDK sources, build dir and type of location.
> +    This class is distinct from :class:`RemoteDPDKTreeLocation` in that it enforces
> +    on-the-fly validation.
>
>      Attributes:
> -        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
> -            must be provided.
> -        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
> -            provided.
> -        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
> -            located on the SUT node, instead of the execution host.
> -        build_dir: If it's defined, DPDK has been pre-compiled and the build directory is located in
> -            a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will be using
> -            `build_options` from configuration to build the DPDK from source.
> +        dpdk_tree: The path to the DPDK source tree directory.
>      """
>
> -    dpdk_tree: str | None
> -    tarball: str | None
> -    remote: bool
> -    build_dir: str | None
> +    dpdk_tree: Path
>
> -    @classmethod
> -    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
> -        """A convenience method that processes and validates the inputs before creating an instance.
> +    #: Resolve the local DPDK tree path
> +    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
>
> -        Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if
> -        `remote` is False.
> +    @model_validator(mode="after")
> +    def validate_dpdk_tree_path(self) -> Self:
> +        """Validate the provided DPDK tree path."""
> +        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
> +        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
> +        return self
>
> -        Args:
> -            d: The configuration dictionary.
>
> -        Returns:
> -            The DPDK location instance.
> +class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
> +    """Local DPDK tarball location.
>
> -        Raises:
> -            ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem or they
> -                aren't in the right format.
> -        """
> -        dpdk_tree = d.get("dpdk_tree")
> -        tarball = d.get("tarball")
> -        remote = d.get("remote", False)
> -
> -        if not remote:
> -            if dpdk_tree:
> -                if not Path(dpdk_tree).exists():
> -                    raise ConfigurationError(
> -                        f"DPDK tree '{dpdk_tree}' not found in local filesystem."
> -                    )
> -
> -                if not Path(dpdk_tree).is_dir():
> -                    raise ConfigurationError(f"The DPDK tree '{dpdk_tree}' must be a directory.")
> -
> -                dpdk_tree = os.path.realpath(dpdk_tree)
> -
> -            if tarball:
> -                if not Path(tarball).exists():
> -                    raise ConfigurationError(
> -                        f"DPDK tarball '{tarball}' not found in local filesystem."
> -                    )
> -
> -                if not tarfile.is_tarfile(tarball):
> -                    raise ConfigurationError(
> -                        f"The DPDK tarball '{tarball}' must be a valid tar archive."
> -                    )
> -
> -        return cls(
> -            dpdk_tree=dpdk_tree,
> -            tarball=tarball,
> -            remote=remote,
> -            build_dir=d.get("precompiled_build_dir"),
> -        )
> +    This class is distinct from :class:`RemoteDPDKTarballLocation` in that it enforces
> +    on-the-fly validation.
> +
> +    Attributes:
> +        tarball: The path to the DPDK tarball.
> +    """
>
> +    tarball: Path
>
> -@dataclass
> -class DPDKConfiguration:
> -    """The configuration of the DPDK build.
> +    #: Resolve the local tarball path
> +    resolve_tarball_path = field_validator("tarball")(resolve_path)
>
> -    The configuration contain the location of the DPDK and configuration used for
> -    building it.
> +    @model_validator(mode="after")
> +    def validate_tarball_path(self) -> Self:
> +        """Validate the provided tarball."""
> +        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
> +        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
> +        return self
> +
> +
> +class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
> +    """Remote DPDK location parent class.
> +
> +    This class is meant to represent any location that is present only remotely.
> +    """
> +
> +    remote: Literal[True] = True
> +
> +
> +class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
> +    """Remote DPDK tree location.
> +
> +    This class is distinct from :class:`LocalDPDKTreeLocation`, which enforces
> +    on-the-fly validation.
> +
> +    Attributes:
> +        dpdk_tree: The path to the DPDK source tree directory.
> +    """
> +
> +    dpdk_tree: PurePath
> +
> +
> +class RemoteDPDKTarballLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
> +    """Remote DPDK tarball location.
> +
> +    This class is distinct from :class:`LocalDPDKTarballLocation`, which enforces
> +    on-the-fly validation.
> +
> +    Attributes:
> +        tarball: The path to the DPDK tarball.
> +    """
> +
> +    tarball: PurePath
> +
> +
> +#: Union type for different DPDK locations
> +DPDKLocation = (
> +    LocalDPDKTreeLocation
> +    | LocalDPDKTarballLocation
> +    | RemoteDPDKTreeLocation
> +    | RemoteDPDKTarballLocation
> +)
> +
> +
> +class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"):
> +    """The base configuration for different types of build.
> +
> +    The configuration contains the location of the DPDK and the configuration used for
> +    building it.
>
>      Attributes:
>          dpdk_location: The location of the DPDK tree.
> -        dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
> -            DTS will use pre-built DPDK from `build_dir` in a :class:`DPDKLocation`.
>      """
>
>      dpdk_location: DPDKLocation
> -    dpdk_build_config: DPDKBuildConfiguration | None
>
> -    @classmethod
> -    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
> -        """A convenience method that processes the inputs before creating an instance.
>
> -        Args:
> -            d: The configuration dictionary.
> +class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
> +    """DPDK precompiled build configuration.
>
> -        Returns:
> -            The DPDK configuration.
> -        """
> -        return cls(
> -            dpdk_location=DPDKLocation.from_dict(d),
> -            dpdk_build_config=(
> -                DPDKBuildConfiguration.from_dict(d["build_options"])
> -                if d.get("build_options")
> -                else None
> -            ),
> -        )
> +    Attributes:
> +        precompiled_build_dir: The directory of a pre-compiled DPDK build, located in a
> +            subdirectory of the `dpdk_tree` or `tarball` root directory.
> +    """
> +
> +    precompiled_build_dir: str = Field(min_length=1)
> +
> +
> +class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"):
> +    """DPDK build options configuration.
> +
> +    The build options used for building DPDK.
> +
> +    Attributes:
> +        arch: The target architecture to build for.
> +        os: The target os to build for.
> +        cpu: The target CPU to build for.
> +        compiler: The compiler executable to use.
> +        compiler_wrapper: This string will be put in front of the compiler when executing the build.
> +            Useful for adding wrapper commands, such as ``ccache``.
> +    """
> +
> +    arch: Architecture
> +    os: OS
> +    cpu: CPUType
> +    compiler: Compiler
> +    compiler_wrapper: str = ""
>
> +    @cached_property
> +    def name(self) -> str:
> +        """The name of the compiler."""
> +        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
>
> -@dataclass(slots=True, frozen=True)
> -class TestSuiteConfig:
> +
> +class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
> +    """DPDK uncompiled build configuration.
> +
> +    Attributes:
> +        build_options: The build options to compile DPDK.
> +    """
> +
> +    build_options: DPDKBuildOptionsConfiguration
> +
> +
> +#: Union type for different build configurations
> +DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
> +
> +
> +class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"):
>      """Test suite configuration.
>
> -    Information about a single test suite to be executed.
> +    Information about a single test suite to be executed. It can also be represented as a string
> +    instead of a mapping, for example:
> +
> +    .. code:: yaml
> +
> +        test_runs:
> +        - test_suites:
> +            # As string representation:
> +            - hello_world # test all of `hello_world`, or
> +            - hello_world hello_world_single_core # test only `hello_world_single_core`
> +            # or as model fields:
> +            - test_suite: hello_world
> +              test_cases: [hello_world_single_core] # without this field all test cases are run
>
>      Attributes:
> -        test_suite: The name of the test suite module without the starting ``TestSuite_``.
> -        test_cases: The names of test cases from this test suite to execute.
> +        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
> +        test_cases_names: The names of test cases from this test suite to execute.
>              If empty, all test cases will be executed.
>      """
>
> -    test_suite: str
> -    test_cases: list[str]
> -
> +    test_suite_name: str = Field(
> +        title="Test suite name",
> +        description="The identifying module name of the test suite without the prefix.",
> +        alias="test_suite",
> +    )
> +    test_cases_names: list[str] = Field(
> +        default_factory=list,
> +        title="Test cases by name",
> +        description="The identifying name of the test cases of the test suite.",
> +        alias="test_cases",
> +    )
> +
> +    @cached_property
> +    def test_suite_spec(self) -> "TestSuiteSpec":
> +        """The specification of the requested test suite."""
> +        from framework.test_suite import find_by_name
> +
> +        test_suite_spec = find_by_name(self.test_suite_name)
> +        assert (
> +            test_suite_spec is not None
> +        ), f"{self.test_suite_name} is not a valid test suite module name."
> +        return test_suite_spec
> +
> +    @model_validator(mode="before")
>      @classmethod
> -    def from_dict(
> -        cls,
> -        entry: str | TestSuiteConfigDict,
> -    ) -> Self:
> -        """Create an instance from two different types.
> +    def convert_from_string(cls, data: Any) -> Any:
> +        """Convert the string representation of the model into a valid mapping."""
> +        if isinstance(data, str):
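> +            # e.g. "hello_world" or "hello_world hello_world_single_core":
> +            # the first token is the suite, the rest are its test cases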
> +            [test_suite, *test_cases] = data.split()
> +            return dict(test_suite=test_suite, test_cases=test_cases)
> +        return data
> +
> +    @model_validator(mode="after")
> +    def validate_names(self) -> Self:
> +        """Validate the supplied test suite and test cases names.
> +
> +        This validator relies on the cached property `test_suite_spec` to run for the first
> +        time in this call, therefore triggering the assertions if needed.
> +        """
> +        # A list, not a one-shot iterator: membership is tested repeatedly below.
> +        available_test_cases = [
> +            t.name for t in self.test_suite_spec.class_obj.get_test_cases()
> +        ]
> +        for requested_test_case in self.test_cases_names:
> +            assert requested_test_case in available_test_cases, (
> +                f"{requested_test_case} is not a valid test case "
> +                f"of test suite {self.test_suite_name}."
> +            )
>
> -        Args:
> -            entry: Either a suite name or a dictionary containing the config.
> +        return self
>
> -        Returns:
> -            The test suite configuration instance.
> -        """
> -        if isinstance(entry, str):
> -            return cls(test_suite=entry, test_cases=[])
> -        elif isinstance(entry, dict):
> -            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
> -        else:
> -            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
>
> +class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"):
> +    """The SUT node configuration of a test run.
>
> -@dataclass(slots=True, frozen=True)
> -class TestRunConfiguration:
> +    Attributes:
> +        node_name: The SUT node to use in this test run.
> +        vdevs: The names of virtual devices to test.
> +    """
> +
> +    node_name: str
> +    vdevs: list[str] = Field(default_factory=list)
> +
> +
> +class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"):
>      """The configuration of a test run.
>
>      The configuration contains testbed information, what tests to execute
> @@ -524,144 +530,130 @@ class TestRunConfiguration:
>          func: Whether to run functional tests.
>          skip_smoke_tests: Whether to skip smoke tests.
>          test_suites: The names of test suites and/or test cases to execute.
> -        system_under_test_node: The SUT node to use in this test run.
> -        traffic_generator_node: The TG node to use in this test run.
> -        vdevs: The names of virtual devices to test.
> +        system_under_test_node: The SUT node configuration to use in this test run.
> +        traffic_generator_node: The TG node name to use in this test run.
>          random_seed: The seed to use for pseudo-random generation.
>      """
>
> -    dpdk_config: DPDKConfiguration
> -    perf: bool
> -    func: bool
> -    skip_smoke_tests: bool
> -    test_suites: list[TestSuiteConfig]
> -    system_under_test_node: SutNodeConfiguration
> -    traffic_generator_node: TGNodeConfiguration
> -    vdevs: list[str]
> -    random_seed: int | None
> -
> -    @classmethod
> -    def from_dict(
> -        cls,
> -        d: TestRunConfigDict,
> -        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
> -    ) -> Self:
> -        """A convenience method that processes the inputs before creating an instance.
> -
> -        The DPDK build and the test suite config are transformed into their respective objects.
> -        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
> -        are just stored.
> -
> -        Args:
> -            d: The test run configuration dictionary.
> -            node_map: A dictionary mapping node names to their config objects.
> -
> -        Returns:
> -            The test run configuration instance.
> -        """
> -        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
> -        sut_name = d["system_under_test_node"]["node_name"]
> -        skip_smoke_tests = d.get("skip_smoke_tests", False)
> -        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
> -        system_under_test_node = node_map[sut_name]
> -        assert isinstance(
> -            system_under_test_node, SutNodeConfiguration
> -        ), f"Invalid SUT configuration {system_under_test_node}"
> -
> -        tg_name = d["traffic_generator_node"]
> -        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
> -        traffic_generator_node = node_map[tg_name]
> -        assert isinstance(
> -            traffic_generator_node, TGNodeConfiguration
> -        ), f"Invalid TG configuration {traffic_generator_node}"
> -
> -        vdevs = (
> -            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
> -        )
> -        random_seed = d.get("random_seed", None)
> -        return cls(
> -            dpdk_config=DPDKConfiguration.from_dict(d["dpdk_build"]),
> -            perf=d["perf"],
> -            func=d["func"],
> -            skip_smoke_tests=skip_smoke_tests,
> -            test_suites=test_suites,
> -            system_under_test_node=system_under_test_node,
> -            traffic_generator_node=traffic_generator_node,
> -            vdevs=vdevs,
> -            random_seed=random_seed,
> -        )
> -
> -    def copy_and_modify(self, **kwargs) -> Self:
> -        """Create a shallow copy with any of the fields modified.
> +    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
> +    perf: bool = Field(description="Enable performance testing.")
> +    func: bool = Field(description="Enable functional testing.")
> +    skip_smoke_tests: bool = False
> +    test_suites: list[TestSuiteConfig] = Field(min_length=1)
> +    system_under_test_node: TestRunSUTNodeConfiguration
> +    traffic_generator_node: str
> +    random_seed: int | None = None
>
> -        The only new data are those passed to this method.
> -        The rest are copied from the object's fields calling the method.
>
> -        Args:
> -            **kwargs: The names and types of keyword arguments are defined
> -                by the fields of the :class:`TestRunConfiguration` class.
> +class TestRunWithNodesConfiguration(NamedTuple):
> +    """Tuple containing the configuration of the test run and its associated nodes."""
>
> -        Returns:
> -            The copied and modified test run configuration.
> -        """
> -        new_config = {}
> -        for field in fields(self):
> -            if field.name in kwargs:
> -                new_config[field.name] = kwargs[field.name]
> -            else:
> -                new_config[field.name] = getattr(self, field.name)
> -
> -        return type(self)(**new_config)
> +    #:
> +    test_run_config: TestRunConfiguration
> +    #:
> +    sut_node_config: SutNodeConfiguration
> +    #:
> +    tg_node_config: TGNodeConfiguration
>
>
> -@dataclass(slots=True, frozen=True)
> -class Configuration:
> +class Configuration(BaseModel, extra="forbid"):
>      """DTS testbed and test configuration.
>
> -    The node configuration is not stored in this object. Rather, all used node configurations
> -    are stored inside the test run configuration where the nodes are actually used.
> -
>      Attributes:
>          test_runs: Test run configurations.
> +        nodes: Node configurations.
>      """
>
> -    test_runs: list[TestRunConfiguration]
> +    test_runs: list[TestRunConfiguration] = Field(min_length=1)
> +    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
>
> -    @classmethod
> -    def from_dict(cls, d: ConfigurationDict) -> Self:
> -        """A convenience method that processes the inputs before creating an instance.
> +    @cached_property
> +    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
> +        """List of test runs with the associated nodes."""
> +        test_runs_with_nodes = []
>
> -        DPDK build and test suite config are transformed into their respective objects.
> -        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
> -        are just stored.
> +        for test_run_no, test_run in enumerate(self.test_runs):
> +            sut_node_name = test_run.system_under_test_node.node_name
> +            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
>
> -        Args:
> -            d: The configuration dictionary.
> +            assert sut_node is not None, (
> +                f"test_runs.{test_run_no}.sut_node_config.node_name "
> +                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
> +            )
> +            assert isinstance(sut_node, SutNodeConfiguration), (
> +                f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, "
> +                "but it is not a valid SUT node"
> +            )
>
> -        Returns:
> -            The whole configuration instance.
> -        """
> -        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
> -            map(NodeConfiguration.from_dict, d["nodes"])
> -        )
> -        assert len(nodes) > 0, "There must be a node to test"
> +            tg_node_name = test_run.traffic_generator_node
> +            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
>
> -        node_map = {node.name: node for node in nodes}
> -        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
> +            assert tg_node is not None, (
> +                f"test_runs.{test_run_no}.tg_node_name "
> +                f"({test_run.traffic_generator_node}) is not a valid node name"
> +            )
> +            assert isinstance(tg_node, TGNodeConfiguration), (
> +                f"test_runs.{test_run_no}.tg_node_name is a valid node name, "
> +                "but it is not a valid TG node"
> +            )
>
> -        test_runs: list[TestRunConfiguration] = list(
> -            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
> -        )
> +            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
> +
> +        return test_runs_with_nodes
> +
> +    @field_validator("nodes")
> +    @classmethod
> +    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
> +        """Validate that the node names are unique."""
> +        nodes_by_name: dict[str, int] = {}
> +        for node_no, node in enumerate(nodes):
> +            assert node.name not in nodes_by_name, (
> +                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
> +                f"({node.name})"
> +            )
> +            nodes_by_name[node.name] = node_no
> +
> +        return nodes
> +
> +    @model_validator(mode="after")
> +    def validate_ports(self) -> Self:
> +        """Validate that the ports are all linked to valid ones."""
> +        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
> +            (node.name, port.pci): False for node in self.nodes for port in node.ports
> +        }
> +
> +        for node_no, node in enumerate(self.nodes):
> +            for port_no, port in enumerate(node.ports):
> +                peer_port_identifier = (port.peer_node, port.peer_pci)
> +                peer_port = port_links.get(peer_port_identifier, None)
> +                assert peer_port is not None, (
> +                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
> +                )
> +                assert peer_port is False, (
> +                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
> +                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
> +                )
> +                port_links[peer_port_identifier] = (node_no, port_no)
>
> -        return cls(test_runs=test_runs)
> +        return self
> +
> +    @model_validator(mode="after")
> +    def validate_test_runs_with_nodes(self) -> Self:
> +        """Validate the test runs to nodes associations.
> +
> +        This validator relies on the cached property `test_runs_with_nodes` to run for the first
> +        time in this call, therefore triggering the assertions if needed.
> +        """
> +        if self.test_runs_with_nodes:
> +            pass
> +        return self
>
>
>  def load_config(config_file_path: Path) -> Configuration:
>      """Load DTS test run configuration from a file.
>
> -    Load the YAML test run configuration file
> -    and :download:`the configuration file schema <conf_yaml_schema.json>`,
> -    validate the test run configuration file, and create a test run configuration object.
> +    Load the YAML test run configuration file, validate it, and create a test run configuration
> +    object.
>
>      The YAML test run configuration file is specified in the :option:`--config-file` command line
>      argument or the :envvar:`DTS_CFG_FILE` environment variable.
> @@ -671,14 +663,14 @@ def load_config(config_file_path: Path) -> Configuration:
>
>      Returns:
>          The parsed test run configuration.
> +
> +    Raises:
> +        ConfigurationError: If the supplied configuration file is invalid.
>      """
>      with open(config_file_path, "r") as f:
>          config_data = yaml.safe_load(f)
>
> -    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
> -
> -    with open(schema_path, "r") as f:
> -        schema = json.load(f)
> -    config = warlock.model_factory(schema, name="_Config")(config_data)
> -    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
> -    return config_obj
> +    try:
> +        return Configuration.model_validate(config_data)
> +    except ValidationError as e:
> +        raise ConfigurationError("failed to load the supplied configuration") from e
> diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
> deleted file mode 100644
> index cc3e78cef5..0000000000
> --- a/dts/framework/config/conf_yaml_schema.json
> +++ /dev/null
> @@ -1,459 +0,0 @@
> -{
> -  "$schema": "https://json-schema.org/draft-07/schema",
> -  "title": "DTS Config Schema",
> -  "definitions": {
> -    "node_name": {
> -      "type": "string",
> -      "description": "A unique identifier for a node"
> -    },
> -    "NIC": {
> -      "type": "string",
> -      "enum": [
> -        "ALL",
> -        "ConnectX3_MT4103",
> -        "ConnectX4_LX_MT4117",
> -        "ConnectX4_MT4115",
> -        "ConnectX5_MT4119",
> -        "ConnectX5_MT4121",
> -        "I40E_10G-10G_BASE_T_BC",
> -        "I40E_10G-10G_BASE_T_X722",
> -        "I40E_10G-SFP_X722",
> -        "I40E_10G-SFP_XL710",
> -        "I40E_10G-X722_A0",
> -        "I40E_1G-1G_BASE_T_X722",
> -        "I40E_25G-25G_SFP28",
> -        "I40E_40G-QSFP_A",
> -        "I40E_40G-QSFP_B",
> -        "IAVF-ADAPTIVE_VF",
> -        "IAVF-VF",
> -        "IAVF_10G-X722_VF",
> -        "ICE_100G-E810C_QSFP",
> -        "ICE_25G-E810C_SFP",
> -        "ICE_25G-E810_XXV_SFP",
> -        "IGB-I350_VF",
> -        "IGB_1G-82540EM",
> -        "IGB_1G-82545EM_COPPER",
> -        "IGB_1G-82571EB_COPPER",
> -        "IGB_1G-82574L",
> -        "IGB_1G-82576",
> -        "IGB_1G-82576_QUAD_COPPER",
> -        "IGB_1G-82576_QUAD_COPPER_ET2",
> -        "IGB_1G-82580_COPPER",
> -        "IGB_1G-I210_COPPER",
> -        "IGB_1G-I350_COPPER",
> -        "IGB_1G-I354_SGMII",
> -        "IGB_1G-PCH_LPTLP_I218_LM",
> -        "IGB_1G-PCH_LPTLP_I218_V",
> -        "IGB_1G-PCH_LPT_I217_LM",
> -        "IGB_1G-PCH_LPT_I217_V",
> -        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
> -        "IGC-I225_LM",
> -        "IGC-I226_LM",
> -        "IXGBE_10G-82599_SFP",
> -        "IXGBE_10G-82599_SFP_SF_QP",
> -        "IXGBE_10G-82599_T3_LOM",
> -        "IXGBE_10G-82599_VF",
> -        "IXGBE_10G-X540T",
> -        "IXGBE_10G-X540_VF",
> -        "IXGBE_10G-X550EM_A_SFP",
> -        "IXGBE_10G-X550EM_X_10G_T",
> -        "IXGBE_10G-X550EM_X_SFP",
> -        "IXGBE_10G-X550EM_X_VF",
> -        "IXGBE_10G-X550T",
> -        "IXGBE_10G-X550_VF",
> -        "brcm_57414",
> -        "brcm_P2100G",
> -        "cavium_0011",
> -        "cavium_a034",
> -        "cavium_a063",
> -        "cavium_a064",
> -        "fastlinq_ql41000",
> -        "fastlinq_ql41000_vf",
> -        "fastlinq_ql45000",
> -        "fastlinq_ql45000_vf",
> -        "hi1822",
> -        "virtio"
> -      ]
> -    },
> -
> -    "ARCH": {
> -      "type": "string",
> -      "enum": [
> -        "x86_64",
> -        "arm64",
> -        "ppc64le"
> -      ]
> -    },
> -    "OS": {
> -      "type": "string",
> -      "enum": [
> -        "linux"
> -      ]
> -    },
> -    "cpu": {
> -      "type": "string",
> -      "description": "Native should be the default on x86",
> -      "enum": [
> -        "native",
> -        "armv8a",
> -        "dpaa2",
> -        "thunderx",
> -        "xgene1"
> -      ]
> -    },
> -    "compiler": {
> -      "type": "string",
> -      "enum": [
> -        "gcc",
> -        "clang",
> -        "icc",
> -        "mscv"
> -      ]
> -    },
> -    "build_options": {
> -      "type": "object",
> -      "properties": {
> -        "arch": {
> -          "type": "string",
> -          "enum": [
> -            "ALL",
> -            "x86_64",
> -            "arm64",
> -            "ppc64le",
> -            "other"
> -          ]
> -        },
> -        "os": {
> -          "$ref": "#/definitions/OS"
> -        },
> -        "cpu": {
> -          "$ref": "#/definitions/cpu"
> -        },
> -        "compiler": {
> -          "$ref": "#/definitions/compiler"
> -        },
> -        "compiler_wrapper": {
> -          "type": "string",
> -          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
> -        }
> -      },
> -      "additionalProperties": false,
> -      "required": [
> -        "arch",
> -        "os",
> -        "cpu",
> -        "compiler"
> -      ]
> -    },
> -    "dpdk_build": {
> -      "type": "object",
> -      "description": "DPDK source and build configuration.",
> -      "properties": {
> -        "dpdk_tree": {
> -          "type": "string",
> -          "description": "The path to the DPDK source tree directory to test. Only one of `dpdk_tree` or `tarball` must be provided."
> -        },
> -        "tarball": {
> -          "type": "string",
> -          "description": "The path to the DPDK source tarball to test. Only one of `dpdk_tree` or `tarball` must be provided."
> -        },
> -        "remote": {
> -          "type": "boolean",
> -          "description": "Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host."
> -        },
> -        "precompiled_build_dir": {
> -          "type": "string",
> -          "description": "If it's defined, DPDK has been pre-built and the build directory is located in a subdirectory of DPDK tree root directory. Otherwise, will be using a `build_options` to build the DPDK from source. Either this or `build_options` must be defined, but not both."
> -        },
> -        "build_options": {
> -          "$ref": "#/definitions/build_options",
> -          "description": "Either this or `precompiled_build_dir` must be defined, but not both. DPDK build configuration supported by DTS."
> -        }
> -      },
> -      "allOf": [
> -        {
> -          "oneOf": [
> -            {
> -            "required": [
> -              "dpdk_tree"
> -              ]
> -            },
> -            {
> -              "required": [
> -                "tarball"
> -              ]
> -            }
> -          ]
> -        },
> -        {
> -          "oneOf": [
> -            {
> -              "required": [
> -                "precompiled_build_dir"
> -              ]
> -            },
> -            {
> -              "required": [
> -                "build_options"
> -              ]
> -            }
> -          ]
> -        }
> -      ],
> -      "additionalProperties": false
> -    },
> -    "hugepages_2mb": {
> -      "type": "object",
> -      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
> -      "properties": {
> -        "number_of": {
> -          "type": "integer",
> -          "description": "The number of hugepages to configure. Hugepage size will be the system default."
> -        },
> -        "force_first_numa": {
> -          "type": "boolean",
> -          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
> -        }
> -      },
> -      "additionalProperties": false,
> -      "required": [
> -        "number_of"
> -      ]
> -    },
> -    "mac_address": {
> -      "type": "string",
> -      "description": "A MAC address",
> -      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
> -    },
> -    "pci_address": {
> -      "type": "string",
> -      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
> -    },
> -    "port_peer_address": {
> -      "description": "Peer is a TRex port, and IXIA port or a PCI address",
> -      "oneOf": [
> -        {
> -          "description": "PCI peer port",
> -          "$ref": "#/definitions/pci_address"
> -        }
> -      ]
> -    },
> -    "test_suite": {
> -      "type": "string",
> -      "enum": [
> -        "hello_world",
> -        "os_udp",
> -        "pmd_buffer_scatter",
> -        "vlan"
> -      ]
> -    },
> -    "test_target": {
> -      "type": "object",
> -      "properties": {
> -        "suite": {
> -          "$ref": "#/definitions/test_suite"
> -        },
> -        "cases": {
> -          "type": "array",
> -          "description": "If specified, only this subset of test suite's test cases will be run.",
> -          "items": {
> -            "type": "string"
> -          },
> -          "minimum": 1
> -        }
> -      },
> -      "required": [
> -        "suite"
> -      ],
> -      "additionalProperties": false
> -    }
> -  },
> -  "type": "object",
> -  "properties": {
> -    "nodes": {
> -      "type": "array",
> -      "items": {
> -        "type": "object",
> -        "properties": {
> -          "name": {
> -            "type": "string",
> -            "description": "A unique identifier for this node"
> -          },
> -          "hostname": {
> -            "type": "string",
> -            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
> -          },
> -          "user": {
> -            "type": "string",
> -            "description": "The user to access this node with."
> -          },
> -          "password": {
> -            "type": "string",
> -            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
> -          },
> -          "arch": {
> -            "$ref": "#/definitions/ARCH"
> -          },
> -          "os": {
> -            "$ref": "#/definitions/OS"
> -          },
> -          "lcores": {
> -            "type": "string",
> -            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
> -            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
> -          },
> -          "use_first_core": {
> -            "type": "boolean",
> -            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
> -          },
> -          "memory_channels": {
> -            "type": "integer",
> -            "description": "How many memory channels to use. Optional, defaults to 1."
> -          },
> -          "hugepages_2mb": {
> -            "$ref": "#/definitions/hugepages_2mb"
> -          },
> -          "ports": {
> -            "type": "array",
> -            "items": {
> -              "type": "object",
> -              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
> -              "properties": {
> -                "pci": {
> -                  "$ref": "#/definitions/pci_address",
> -                  "description": "The local PCI address of the port"
> -                },
> -                "os_driver_for_dpdk": {
> -                  "type": "string",
> -                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
> -                },
> -                "os_driver": {
> -                  "type": "string",
> -                  "description": "The driver normally used by this port (ex: i40e)"
> -                },
> -                "peer_node": {
> -                  "type": "string",
> -                  "description": "The name of the node the peer port is on"
> -                },
> -                "peer_pci": {
> -                  "$ref": "#/definitions/pci_address",
> -                  "description": "The PCI address of the peer port"
> -                }
> -              },
> -              "additionalProperties": false,
> -              "required": [
> -                "pci",
> -                "os_driver_for_dpdk",
> -                "os_driver",
> -                "peer_node",
> -                "peer_pci"
> -              ]
> -            },
> -            "minimum": 1
> -          },
> -          "traffic_generator": {
> -            "oneOf": [
> -              {
> -                "type": "object",
> -                "description": "Scapy traffic generator. Used for functional testing.",
> -                "properties": {
> -                  "type": {
> -                    "type": "string",
> -                    "enum": [
> -                      "SCAPY"
> -                    ]
> -                  }
> -                }
> -              }
> -            ]
> -          }
> -        },
> -        "additionalProperties": false,
> -        "required": [
> -          "name",
> -          "hostname",
> -          "user",
> -          "arch",
> -          "os"
> -        ]
> -      },
> -      "minimum": 1
> -    },
> -    "test_runs": {
> -      "type": "array",
> -      "items": {
> -        "type": "object",
> -        "properties": {
> -          "dpdk_build": {
> -            "$ref": "#/definitions/dpdk_build"
> -          },
> -          "perf": {
> -            "type": "boolean",
> -            "description": "Enable performance testing."
> -          },
> -          "func": {
> -            "type": "boolean",
> -            "description": "Enable functional testing."
> -          },
> -          "test_suites": {
> -            "type": "array",
> -            "items": {
> -              "oneOf": [
> -                {
> -                  "$ref": "#/definitions/test_suite"
> -                },
> -                {
> -                  "$ref": "#/definitions/test_target"
> -                }
> -              ]
> -            }
> -          },
> -          "skip_smoke_tests": {
> -            "description": "Optional field that allows you to skip smoke testing",
> -            "type": "boolean"
> -          },
> -          "system_under_test_node": {
> -            "type":"object",
> -            "properties": {
> -              "node_name": {
> -                "$ref": "#/definitions/node_name"
> -              },
> -              "vdevs": {
> -                "description": "Optional list of names of vdevs to be used in the test run",
> -                "type": "array",
> -                "items": {
> -                  "type": "string"
> -                }
> -              }
> -            },
> -            "required": [
> -              "node_name"
> -            ]
> -          },
> -          "traffic_generator_node": {
> -            "$ref": "#/definitions/node_name"
> -          },
> -          "random_seed": {
> -            "type": "integer",
> -            "description": "Optional field. Allows you to set a seed for pseudo-random generation."
> -          }
> -        },
> -        "additionalProperties": false,
> -        "required": [
> -          "dpdk_build",
> -          "perf",
> -          "func",
> -          "test_suites",
> -          "system_under_test_node",
> -          "traffic_generator_node"
> -        ]
> -      },
> -      "minimum": 1
> -    }
> -  },
> -  "required": [
> -    "test_runs",
> -    "nodes"
> -  ],
> -  "additionalProperties": false
> -}
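
The two `oneOf` blocks above (either `dpdk_tree` or `tarball`, and either
`precompiled_build_dir` or `build_options`) do not simply disappear with the schema;
in pydantic the same mutual exclusion is typically written as a model-level validator.
A minimal sketch, with illustrative names rather than the patch's actual classes:

    from pydantic import BaseModel, model_validator


    class DPDKBuildSketch(BaseModel, frozen=True):
        dpdk_tree: str | None = None
        tarball: str | None = None

        @model_validator(mode="after")
        def check_location(self):
            # Mirrors the schema's first `oneOf`: exactly one source must be set.
            if (self.dpdk_tree is None) == (self.tarball is None):
                raise ValueError("exactly one of `dpdk_tree` or `tarball` must be set")
            return self
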
> diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
> deleted file mode 100644
> index 02e738a61e..0000000000
> --- a/dts/framework/config/types.py
> +++ /dev/null
> @@ -1,149 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -
> -"""Configuration dictionary contents specification.
> -
> -These type definitions serve as documentation of the configuration dictionary contents.
> -
> -The definitions use the built-in :class:`~typing.TypedDict` construct.
> -"""
> -
> -from typing import TypedDict
> -
> -
> -class PortConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    pci: str
> -    #:
> -    os_driver_for_dpdk: str
> -    #:
> -    os_driver: str
> -    #:
> -    peer_node: str
> -    #:
> -    peer_pci: str
> -
> -
> -class TrafficGeneratorConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    type: str
> -
> -
> -class HugepageConfigurationDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    number_of: int
> -    #:
> -    force_first_numa: bool
> -
> -
> -class NodeConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    hugepages_2mb: HugepageConfigurationDict
> -    #:
> -    name: str
> -    #:
> -    hostname: str
> -    #:
> -    user: str
> -    #:
> -    password: str
> -    #:
> -    arch: str
> -    #:
> -    os: str
> -    #:
> -    lcores: str
> -    #:
> -    use_first_core: bool
> -    #:
> -    ports: list[PortConfigDict]
> -    #:
> -    memory_channels: int
> -    #:
> -    traffic_generator: TrafficGeneratorConfigDict
> -
> -
> -class DPDKBuildConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    arch: str
> -    #:
> -    os: str
> -    #:
> -    cpu: str
> -    #:
> -    compiler: str
> -    #:
> -    compiler_wrapper: str
> -
> -
> -class DPDKConfigurationDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    dpdk_tree: str | None
> -    #:
> -    tarball: str | None
> -    #:
> -    remote: bool
> -    #:
> -    precompiled_build_dir: str | None
> -    #:
> -    build_options: DPDKBuildConfigDict
> -
> -
> -class TestSuiteConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    suite: str
> -    #:
> -    cases: list[str]
> -
> -
> -class TestRunSUTConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    node_name: str
> -    #:
> -    vdevs: list[str]
> -
> -
> -class TestRunConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    dpdk_build: DPDKConfigurationDict
> -    #:
> -    perf: bool
> -    #:
> -    func: bool
> -    #:
> -    skip_smoke_tests: bool
> -    #:
> -    test_suites: TestSuiteConfigDict
> -    #:
> -    system_under_test_node: TestRunSUTConfigDict
> -    #:
> -    traffic_generator_node: str
> -    #:
> -    random_seed: int
> -
> -
> -class ConfigurationDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    nodes: list[NodeConfigDict]
> -    #:
> -    test_runs: list[TestRunConfigDict]
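
The TypedDicts in this module only documented the expected dictionary shape; they never
validated anything at runtime. The pydantic models replacing them carry the same shape
but reject bad input on construction. A rough sketch of the difference, using a
hypothetical two-field subset:

    from pydantic import BaseModel, ValidationError


    class PortConfigSketch(BaseModel, frozen=True, extra="forbid"):
        pci: str
        os_driver: str


    try:
        # Unlike a TypedDict, construction checks types and rejects unknown keys.
        PortConfigSketch(pci="0000:00:08.0", os_driver="i40e", bogus_key=1)
    except ValidationError as e:
        print(e)
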
> diff --git a/dts/framework/runner.py b/dts/framework/runner.py
> index 195622c653..c3d9a27a8c 100644
> --- a/dts/framework/runner.py
> +++ b/dts/framework/runner.py
> @@ -30,7 +30,15 @@
>  from framework.testbed_model.sut_node import SutNode
>  from framework.testbed_model.tg_node import TGNode
>
> -from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config
> +from .config import (
> +    Configuration,
> +    DPDKPrecompiledBuildConfiguration,
> +    SutNodeConfiguration,
> +    TestRunConfiguration,
> +    TestSuiteConfig,
> +    TGNodeConfiguration,
> +    load_config,
> +)
>  from .exception import (
>      BlockingTestSuiteError,
>      ConfigurationError,
> @@ -133,11 +141,10 @@ def run(self) -> None:
>              self._result.update_setup(Result.PASS)
>
>              # for all test run sections
> -            for test_run_config in self._configuration.test_runs:
> +            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
> +                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
>                  self._logger.set_stage(DtsStage.test_run_setup)
> -                self._logger.info(
> -                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
> -                )
> +                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
>                  self._init_random_seed(test_run_config)
>                  test_run_result = self._result.add_test_run(test_run_config)
>                  # we don't want to modify the original config, so create a copy
> @@ -145,7 +152,7 @@ def run(self) -> None:
>                      SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
>                  )
>                  if not test_run_config.skip_smoke_tests:
> -                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
> +                    test_run_test_suites[:0] = [TestSuiteConfig(test_suite="smoke_tests")]
>                  try:
>                      test_suites_with_cases = self._get_test_suites_with_cases(
>                          test_run_test_suites, test_run_config.func, test_run_config.perf
> @@ -161,6 +168,8 @@ def run(self) -> None:
>                      self._connect_nodes_and_run_test_run(
>                          sut_nodes,
>                          tg_nodes,
> +                        sut_node_config,
> +                        tg_node_config,
>                          test_run_config,
>                          test_run_result,
>                          test_suites_with_cases,
> @@ -223,10 +232,10 @@ def _get_test_suites_with_cases(
>          test_suites_with_cases = []
>
>          for test_suite_config in test_suite_configs:
> -            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
> +            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
>              test_cases: list[type[TestCase]] = []
>              func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
> -                test_suite_config.test_cases
> +                test_suite_config.test_cases_names
>              )
>              if func:
>                  test_cases.extend(func_test_cases)
> @@ -305,6 +314,8 @@ def _connect_nodes_and_run_test_run(
>          self,
>          sut_nodes: dict[str, SutNode],
>          tg_nodes: dict[str, TGNode],
> +        sut_node_config: SutNodeConfiguration,
> +        tg_node_config: TGNodeConfiguration,
>          test_run_config: TestRunConfiguration,
>          test_run_result: TestRunResult,
>          test_suites_with_cases: Iterable[TestSuiteWithCases],
> @@ -319,24 +330,26 @@ def _connect_nodes_and_run_test_run(
>          Args:
>              sut_nodes: A dictionary storing connected/to be connected SUT nodes.
>              tg_nodes: A dictionary storing connected/to be connected TG nodes.
> +            sut_node_config: The test run's SUT node configuration.
> +            tg_node_config: The test run's TG node configuration.
>              test_run_config: A test run configuration.
>              test_run_result: The test run's result.
>              test_suites_with_cases: The test suites with test cases to run.
>          """
> -        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
> -        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
> +        sut_node = sut_nodes.get(sut_node_config.name)
> +        tg_node = tg_nodes.get(tg_node_config.name)
>
>          try:
>              if not sut_node:
> -                sut_node = SutNode(test_run_config.system_under_test_node)
> +                sut_node = SutNode(sut_node_config)
>                  sut_nodes[sut_node.name] = sut_node
>              if not tg_node:
> -                tg_node = TGNode(test_run_config.traffic_generator_node)
> +                tg_node = TGNode(tg_node_config)
>                  tg_nodes[tg_node.name] = tg_node
>          except Exception as e:
> -            failed_node = test_run_config.system_under_test_node.name
> +            failed_node = test_run_config.system_under_test_node.node_name
>              if sut_node:
> -                failed_node = test_run_config.traffic_generator_node.name
> +                failed_node = test_run_config.traffic_generator_node
>              self._logger.exception(f"The Creation of node {failed_node} failed.")
>              test_run_result.update_setup(Result.FAIL, e)
>
> @@ -369,14 +382,22 @@ def _run_test_run(
>              ConfigurationError: If the DPDK sources or build is not set up from config or settings.
>          """
>          self._logger.info(
> -            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
> +            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
>          )
>          test_run_result.add_sut_info(sut_node.node_info)
>          try:
> -            dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_config.dpdk_location
> -            sut_node.set_up_test_run(test_run_config, dpdk_location)
> +            dpdk_build_config = test_run_config.dpdk_config
> +            if new_location := SETTINGS.dpdk_location:
> +                dpdk_build_config = dpdk_build_config.model_copy(
> +                    update={"dpdk_location": new_location}
> +                )
> +            if dir := SETTINGS.precompiled_build_dir:
> +                dpdk_build_config = DPDKPrecompiledBuildConfiguration(
> +                    dpdk_location=dpdk_build_config.dpdk_location, precompiled_build_dir=dir
> +                )
> +            sut_node.set_up_test_run(test_run_config, dpdk_build_config)
>              test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
> -            tg_node.set_up_test_run(test_run_config, dpdk_location)
> +            tg_node.set_up_test_run(test_run_config, dpdk_build_config)
>              test_run_result.update_setup(Result.PASS)
>          except Exception as e:
>              self._logger.exception("Test run setup failed.")
> diff --git a/dts/framework/settings.py b/dts/framework/settings.py
> index a452319b90..1253ed86ac 100644
> --- a/dts/framework/settings.py
> +++ b/dts/framework/settings.py
> @@ -60,9 +60,8 @@
>  .. option:: --precompiled-build-dir
>  .. envvar:: DTS_PRECOMPILED_BUILD_DIR
>
> -    Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
> -    located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with
> -    --dpdk-tree or --tarball.
> +    Define the subdirectory under the DPDK tree root directory or tarball where the pre-compiled
> +    binaries are located.
>
>  .. option:: --test-suite
>  .. envvar:: DTS_TEST_SUITES
> @@ -95,13 +94,21 @@
>  import argparse
>  import os
>  import sys
> -import tarfile
>  from argparse import Action, ArgumentDefaultsHelpFormatter, _get_action_name
>  from dataclasses import dataclass, field
>  from pathlib import Path
>  from typing import Callable
>
> -from .config import DPDKLocation, TestSuiteConfig
> +from pydantic import ValidationError
> +
> +from .config import (
> +    DPDKLocation,
> +    LocalDPDKTarballLocation,
> +    LocalDPDKTreeLocation,
> +    RemoteDPDKTarballLocation,
> +    RemoteDPDKTreeLocation,
> +    TestSuiteConfig,
> +)
>
>
>  @dataclass(slots=True)
> @@ -122,6 +129,8 @@ class Settings:
>      #:
>      dpdk_location: DPDKLocation | None = None
>      #:
> +    precompiled_build_dir: str | None = None
> +    #:
>      compile_timeout: float = 1200
>      #:
>      test_suites: list[TestSuiteConfig] = field(default_factory=list)
> @@ -383,13 +392,11 @@ def _get_parser() -> _DTSArgumentParser:
>
>      action = dpdk_build.add_argument(
>          "--precompiled-build-dir",
> -        help="Define the subdirectory under the DPDK tree root directory where the pre-compiled "
> -        "binaries are located. If set, DTS will build DPDK under the `build` directory instead. "
> -        "Can only be used with --dpdk-tree or --tarball.",
> +        help="Define the subdirectory under the DPDK tree root directory or tarball where the "
> +        "pre-compiled binaries are located.",
>          metavar="DIR_NAME",
>      )
>      _add_env_var_to_action(action)
> -    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
>
>      action = parser.add_argument(
>          "--compile-timeout",
> @@ -442,61 +449,61 @@ def _get_parser() -> _DTSArgumentParser:
>
>
>  def _process_dpdk_location(
> +    parser: _DTSArgumentParser,
>      dpdk_tree: str | None,
>      tarball: str | None,
>      remote: bool,
> -    build_dir: str | None,
> -):
> +) -> DPDKLocation | None:
>      """Process and validate DPDK build arguments.
>
>      Ensures that either `dpdk_tree` or `tarball` is provided. Validate existence and format of
>      `dpdk_tree` or `tarball` on local filesystem, if `remote` is False. Constructs and returns
> -    the :class:`DPDKLocation` with the provided parameters if validation is successful.
> +    any valid :class:`DPDKLocation` with the provided parameters if validation is successful.
>
>      Args:
> -        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
> -            must be provided.
> -        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
> -            provided.
> +        dpdk_tree: The path to the DPDK source tree directory.
> +        tarball: The path to the DPDK tarball.
>          remote: If :data:`True`, `dpdk_tree` or `tarball` is located on the SUT node, instead of the
>              execution host.
> -        build_dir: If it's defined, DPDK has been pre-built and the build directory is located in a
> -            subdirectory of `dpdk_tree` or `tarball` root directory.
>
>      Returns:
>          A DPDK location if construction is successful, otherwise None.
> -
> -    Raises:
> -        argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` not found in local filesystem or
> -            they aren't in the right format.
>      """
> -    if not (dpdk_tree or tarball):
> -        return None
> -
> -    if not remote:
> -        if dpdk_tree:
> -            if not Path(dpdk_tree).exists():
> -                raise argparse.ArgumentTypeError(
> -                    f"DPDK tree '{dpdk_tree}' not found in local filesystem."
> -                )
> -
> -            if not Path(dpdk_tree).is_dir():
> -                raise argparse.ArgumentTypeError(f"DPDK tree '{dpdk_tree}' must be a directory.")
> -
> -            dpdk_tree = os.path.realpath(dpdk_tree)
> -
> -        if tarball:
> -            if not Path(tarball).exists():
> -                raise argparse.ArgumentTypeError(
> -                    f"DPDK tarball '{tarball}' not found in local filesystem."
> -                )
> -
> -            if not tarfile.is_tarfile(tarball):
> -                raise argparse.ArgumentTypeError(
> -                    f"DPDK tarball '{tarball}' must be a valid tar archive."
> -                )
> -
> -    return DPDKLocation(dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir)
> +    if dpdk_tree:
> +        action = parser.find_action("dpdk_tree", _is_from_env)
> +
> +        try:
> +            if remote:
> +                return RemoteDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
> +            else:
> +                return LocalDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
> +        except ValidationError as e:
> +            print(
> +                "An error has occurred while validating the DPDK tree supplied in the "
> +                f"{'environment variable' if action else 'arguments'}:",
> +                file=sys.stderr,
> +            )
> +            print(e, file=sys.stderr)
> +            sys.exit(1)
> +
> +    if tarball:
> +        action = parser.find_action("tarball", _is_from_env)
> +
> +        try:
> +            if remote:
> +                return RemoteDPDKTarballLocation.model_validate({"tarball": tarball})
> +            else:
> +                return LocalDPDKTarballLocation.model_validate({"tarball": tarball})
> +        except ValidationError as e:
> +            print(
> +                "An error has occurred while validating the DPDK tarball supplied in the "
> +                f"{'environment variable' if action else 'arguments'}:",
> +                file=sys.stderr,
> +            )
> +            print(e, file=sys.stderr)
> +            sys.exit(1)
> +
> +    return None
>
>
>  def _process_test_suites(
> @@ -512,11 +519,24 @@ def _process_test_suites(
>      Returns:
>          A list of test suite configurations to execute.
>      """
> -    if parser.find_action("test_suites", _is_from_env):
> +    action = parser.find_action("test_suites", _is_from_env)
> +    if action:
>          # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
>          args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
>
> -    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
> +    try:
> +        return [
> +            TestSuiteConfig(test_suite=test_suite, test_cases=test_cases)
> +            for [test_suite, *test_cases] in args
> +        ]
> +    except ValidationError as e:
> +        print(
> +            "An error has occurred while validating the test suites supplied in the "
> +            f"{'environment variable' if action else 'arguments'}:",
> +            file=sys.stderr,
> +        )
> +        print(e, file=sys.stderr)
> +        sys.exit(1)
>
>
>  def get_settings() -> Settings:
> @@ -536,7 +556,7 @@ def get_settings() -> Settings:
>      args = parser.parse_args()
>
>      args.dpdk_location = _process_dpdk_location(
> -        args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.precompiled_build_dir
> +        parser, args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source
>      )
>      args.test_suites = _process_test_suites(parser, args.test_suites)
>
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index 62867fd80c..6031eaf937 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -17,7 +17,12 @@
>  from ipaddress import IPv4Interface, IPv6Interface
>  from typing import Union
>
> -from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration
> +from framework.config import (
> +    OS,
> +    DPDKBuildConfiguration,
> +    NodeConfiguration,
> +    TestRunConfiguration,
> +)
>  from framework.exception import ConfigurationError
>  from framework.logger import DTSLogger, get_dts_logger
>
> @@ -89,13 +94,15 @@ def __init__(self, node_config: NodeConfiguration):
>          self._init_ports()
>
>      def _init_ports(self) -> None:
> -        self.ports = [Port(port_config) for port_config in self.config.ports]
> +        self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
>          self.main_session.update_ports(self.ports)
>          for port in self.ports:
>              self.configure_port_state(port)
>
>      def set_up_test_run(
> -        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
> +        self,
> +        test_run_config: TestRunConfiguration,
> +        dpdk_build_config: DPDKBuildConfiguration,
>      ) -> None:
>          """Test run setup steps.
>
> @@ -105,7 +112,7 @@ def set_up_test_run(
>          Args:
>              test_run_config: A test run configuration according to which
>                  the setup steps will be taken.
> -            dpdk_location: The target source of the DPDK tree.
> +            dpdk_build_config: The build configuration of DPDK.
>          """
>          self._setup_hugepages()
>
> diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
> index 5f087f40d6..42ab4bb8fd 100644
> --- a/dts/framework/testbed_model/os_session.py
> +++ b/dts/framework/testbed_model/os_session.py
> @@ -364,7 +364,7 @@ def extract_remote_tarball(
>          """
>
>      @abstractmethod
> -    def is_remote_dir(self, remote_path: str) -> bool:
> +    def is_remote_dir(self, remote_path: PurePath) -> bool:
>          """Check if the `remote_path` is a directory.
>
>          Args:
> @@ -375,7 +375,7 @@ def is_remote_dir(self, remote_path: str) -> bool:
>          """
>
>      @abstractmethod
> -    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
> +    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
>          """Check if the `remote_tarball_path` is a tar archive.
>
>          Args:
> diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
> index 82c84cf4f8..817405bea4 100644
> --- a/dts/framework/testbed_model/port.py
> +++ b/dts/framework/testbed_model/port.py
> @@ -54,7 +54,7 @@ class Port:
>      mac_address: str = ""
>      logical_name: str = ""
>
> -    def __init__(self, config: PortConfig):
> +    def __init__(self, node_name: str, config: PortConfig):
>          """Initialize the port from `node_name` and `config`.
>
>          Args:
> @@ -62,7 +62,7 @@ def __init__(self, config: PortConfig):
>              config: The test run configuration of the port.
>          """
>          self.identifier = PortIdentifier(
> -            node=config.node,
> +            node=node_name,
>              pci=config.pci,
>          )
>          self.os_driver = config.os_driver
> diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
> index 0d3abbc519..6b66f33e22 100644
> --- a/dts/framework/testbed_model/posix_session.py
> +++ b/dts/framework/testbed_model/posix_session.py
> @@ -201,12 +201,12 @@ def extract_remote_tarball(
>          if expected_dir:
>              self.send_command(f"ls {expected_dir}", verify=True)
>
> -    def is_remote_dir(self, remote_path: str) -> bool:
> +    def is_remote_dir(self, remote_path: PurePath) -> bool:
>          """Overrides :meth:`~.os_session.OSSession.is_remote_dir`."""
>          result = self.send_command(f"test -d {remote_path}")
>          return not result.return_code
>
> -    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
> +    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
>          """Overrides :meth:`~.os_session.OSSession.is_remote_tarfile`."""
>          result = self.send_command(f"tar -tvf {remote_tarball_path}")
>          return not result.return_code
> @@ -393,4 +393,8 @@ def get_node_info(self) -> NodeInfo:
>              SETTINGS.timeout,
>          ).stdout.split("\n")
>          kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
> -        return NodeInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
> +        return NodeInfo(
> +            os_name=os_release_info[0].strip(),
> +            os_version=os_release_info[1].strip(),
> +            kernel_version=kernel_version,
> +        )
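
The switch to keyword arguments here is not only cosmetic: pydantic models accept
keyword arguments only, so positional construction would raise a TypeError. Assuming a
model along these lines:

    from pydantic import BaseModel


    class NodeInfoSketch(BaseModel, frozen=True):
        os_name: str
        os_version: str
        kernel_version: str


    # Works; NodeInfoSketch("Ubuntu", "22.04", "5.15.0") would raise TypeError.
    info = NodeInfoSketch(os_name="Ubuntu", os_version="22.04", kernel_version="5.15.0")
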
> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> index a6c42b548c..57337c8e7d 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -15,11 +15,17 @@
>  import os
>  import time
>  from dataclasses import dataclass
> -from pathlib import PurePath
> +from pathlib import Path, PurePath
>
>  from framework.config import (
>      DPDKBuildConfiguration,
> -    DPDKLocation,
> +    DPDKBuildOptionsConfiguration,
> +    DPDKPrecompiledBuildConfiguration,
> +    DPDKUncompiledBuildConfiguration,
> +    LocalDPDKTarballLocation,
> +    LocalDPDKTreeLocation,
> +    RemoteDPDKTarballLocation,
> +    RemoteDPDKTreeLocation,
>      SutNodeConfiguration,
>      TestRunConfiguration,
>  )
> @@ -178,7 +184,9 @@ def get_dpdk_build_info(self) -> DPDKBuildInfo:
>          return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
>
>      def set_up_test_run(
> -        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
> +        self,
> +        test_run_config: TestRunConfiguration,
> +        dpdk_build_config: DPDKBuildConfiguration,
>      ) -> None:
>          """Extend the test run setup with vdev config and DPDK build set up.
>
> @@ -188,12 +196,12 @@ def set_up_test_run(
>          Args:
>              test_run_config: A test run configuration according to which
>                  the setup steps will be taken.
> -            dpdk_location: The target source of the DPDK tree.
> +            dpdk_build_config: The build configuration of DPDK.
>          """
> -        super().set_up_test_run(test_run_config, dpdk_location)
> -        for vdev in test_run_config.vdevs:
> +        super().set_up_test_run(test_run_config, dpdk_build_config)
> +        for vdev in test_run_config.system_under_test_node.vdevs:
>              self.virtual_devices.append(VirtualDevice(vdev))
> -        self._set_up_dpdk(dpdk_location, test_run_config.dpdk_config.dpdk_build_config)
> +        self._set_up_dpdk(dpdk_build_config)
>
>      def tear_down_test_run(self) -> None:
>          """Extend the test run teardown with virtual device teardown and DPDK teardown."""
> @@ -202,7 +210,8 @@ def tear_down_test_run(self) -> None:
>          self._tear_down_dpdk()
>
>      def _set_up_dpdk(
> -        self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None
> +        self,
> +        dpdk_build_config: DPDKBuildConfiguration,
>      ) -> None:
>          """Set up DPDK the SUT node and bind ports.
>
> @@ -211,21 +220,26 @@ def _set_up_dpdk(
>          are bound to those that DPDK needs.
>
>          Args:
> -            dpdk_location: The location of the DPDK tree.
> -            dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
> -                DTS will use pre-built DPDK from a :dataclass:`DPDKLocation`.
> +            dpdk_build_config: A DPDK build configuration to test.
>          """
> -        self._set_remote_dpdk_tree_path(dpdk_location.dpdk_tree, dpdk_location.remote)
> -        if not self._remote_dpdk_tree_path:
> -            if dpdk_location.dpdk_tree:
> -                self._copy_dpdk_tree(dpdk_location.dpdk_tree)
> -            elif dpdk_location.tarball:
> -                self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote)
> -
> -        self._set_remote_dpdk_build_dir(dpdk_location.build_dir)
> -        if not self.remote_dpdk_build_dir and dpdk_build_config:
> -            self._configure_dpdk_build(dpdk_build_config)
> -            self._build_dpdk()
> +        match dpdk_build_config.dpdk_location:
> +            case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree):
> +                self._set_remote_dpdk_tree_path(dpdk_tree)
> +            case LocalDPDKTreeLocation(dpdk_tree=dpdk_tree):
> +                self._copy_dpdk_tree(dpdk_tree)
> +            case RemoteDPDKTarballLocation(tarball=tarball):
> +                self._validate_remote_dpdk_tarball(tarball)
> +                self._prepare_and_extract_dpdk_tarball(tarball)
> +            case LocalDPDKTarballLocation(tarball=tarball):
> +                remote_tarball = self._copy_dpdk_tarball_to_remote(tarball)
> +                self._prepare_and_extract_dpdk_tarball(remote_tarball)
> +
> +        match dpdk_build_config:
> +            case DPDKPrecompiledBuildConfiguration(precompiled_build_dir=build_dir):
> +                self._set_remote_dpdk_build_dir(build_dir)
> +            case DPDKUncompiledBuildConfiguration(build_options=build_options):
> +                self._configure_dpdk_build(build_options)
> +                self._build_dpdk()
>
>          self.bind_ports_to_driver()
>
> @@ -238,37 +252,29 @@ def _tear_down_dpdk(self) -> None:
>          self.compiler_version = None
>          self.bind_ports_to_driver(for_dpdk=False)
>
> -    def _set_remote_dpdk_tree_path(self, dpdk_tree: str | None, remote: bool):
> +    def _set_remote_dpdk_tree_path(self, dpdk_tree: PurePath):
>          """Set the path to the remote DPDK source tree based on the provided DPDK location.
>
> -        If :data:`dpdk_tree` and :data:`remote` are defined, check existence of :data:`dpdk_tree`
> -        on SUT node and sets the `_remote_dpdk_tree_path` property. Otherwise, sets nothing.
> -
>          Verify DPDK source tree existence on the SUT node, if exists sets the
>          `_remote_dpdk_tree_path` property, otherwise sets nothing.
>
>          Args:
>              dpdk_tree: The path to the DPDK source tree directory.
> -            remote: Indicates whether the `dpdk_tree` is already on the SUT node, instead of the
> -                execution host.
>
>          Raises:
>              RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but
>                  is not found.
>          """
> -        if remote and dpdk_tree:
> -            if not self.main_session.remote_path_exists(dpdk_tree):
> -                raise RemoteFileNotFoundError(
> -                    f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
> -                )
> -            if not self.main_session.is_remote_dir(dpdk_tree):
> -                raise ConfigurationError(
> -                    f"Remote DPDK source tree '{dpdk_tree}' must be a directory."
> -                )
> -
> -            self.__remote_dpdk_tree_path = PurePath(dpdk_tree)
> -
> -    def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
> +        if not self.main_session.remote_path_exists(dpdk_tree):
> +            raise RemoteFileNotFoundError(
> +                f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
> +            )
> +        if not self.main_session.is_remote_dir(dpdk_tree):
> +            raise ConfigurationError(f"Remote DPDK source tree '{dpdk_tree}' must be a directory.")
> +
> +        self.__remote_dpdk_tree_path = dpdk_tree
> +
> +    def _copy_dpdk_tree(self, dpdk_tree_path: Path) -> None:
>          """Copy the DPDK source tree to the SUT.
>
>          Args:
> @@ -288,25 +294,45 @@ def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
>              self._remote_tmp_dir, PurePath(dpdk_tree_path).name
>          )
>
> -    def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None:
> -        """Ensure the DPDK tarball is available on the SUT node and extract it.
> +    def _validate_remote_dpdk_tarball(self, dpdk_tarball: PurePath) -> None:
> +        """Validate the DPDK tarball on the SUT node.
>
> -        This method ensures that the DPDK source tree tarball is available on the
> -        SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the
> -        `dpdk_tarball` is already on the SUT node, it verifies its existence.
> -        The `dpdk_tarball` is then extracted on the SUT node.
> +        Args:
> +            dpdk_tarball: The path to the DPDK tarball on the SUT node.
>
> -        This method sets the `_remote_dpdk_tree_path` property to the path of the
> -        extracted DPDK tree on the SUT node.
> +        Raises:
> +            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but is
> +                not found.
> +            ConfigurationError: If the `dpdk_tarball` is a valid path but not a valid tar archive.
> +        """
> +        if not self.main_session.remote_path_exists(dpdk_tarball):
> +            raise RemoteFileNotFoundError(f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT.")
> +        if not self.main_session.is_remote_tarfile(dpdk_tarball):
> +            raise ConfigurationError(f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive.")
> +
> +    def _copy_dpdk_tarball_to_remote(self, dpdk_tarball: Path) -> PurePath:
> +        """Copy the local DPDK tarball to the SUT node.
>
>          Args:
> -            dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node.
> -            remote: Indicates whether the `dpdk_tarball` is already on the SUT node, instead of the
> -                execution host.
> +            dpdk_tarball: The local path to the DPDK tarball.
>
> -        Raises:
> -            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but
> -                is not found.
> +        Returns:
> +            The path of the copied tarball on the SUT node.
> +        """
> +        self._logger.info(
> +            f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
> +        )
> +        self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
> +        return self.main_session.join_remote_path(self._remote_tmp_dir, dpdk_tarball.name)
> +
> +    def _prepare_and_extract_dpdk_tarball(self, remote_tarball_path: PurePath) -> None:
> +        """Prepare the remote DPDK tree path and extract the tarball.
> +
> +        This method extracts the remote tarball and sets the `_remote_dpdk_tree_path` property to
> +        the path of the extracted DPDK tree on the SUT node.
> +
> +        Args:
> +            remote_tarball_path: The path to the DPDK tarball on the SUT node.
>          """
>
>          def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
> @@ -324,30 +350,9 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
>                      return PurePath(str(remote_tarball_path).replace(suffixes_to_remove, ""))
>              return remote_tarball_path.with_suffix("")
>
> -        if remote:
> -            if not self.main_session.remote_path_exists(dpdk_tarball):
> -                raise RemoteFileNotFoundError(
> -                    f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT."
> -                )
> -            if not self.main_session.is_remote_tarfile(dpdk_tarball):
> -                raise ConfigurationError(
> -                    f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive."
> -                )
> -
> -            remote_tarball_path = PurePath(dpdk_tarball)
> -        else:
> -            self._logger.info(
> -                f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
> -            )
> -            self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
> -
> -            remote_tarball_path = self.main_session.join_remote_path(
> -                self._remote_tmp_dir, PurePath(dpdk_tarball).name
> -            )
> -
>          tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path)
>          self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
> -            PurePath(remote_tarball_path).parent,
> +            remote_tarball_path.parent,
>              tarball_top_dir or remove_tarball_suffix(remote_tarball_path),
>          )
>
> @@ -360,33 +365,32 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
>              self._remote_dpdk_tree_path,
>          )
>
> -    def _set_remote_dpdk_build_dir(self, build_dir: str | None):
> +    def _set_remote_dpdk_build_dir(self, build_dir: str):
>          """Set the `remote_dpdk_build_dir` on the SUT.
>
> -        If :data:`build_dir` is defined, check existence on the SUT node and sets the
> +        Check the existence of `build_dir` on the SUT node and set the
>          `remote_dpdk_build_dir` property by joining the `_remote_dpdk_tree_path` and `build_dir`.
>          Otherwise, sets nothing.
>
>          Args:
> -            build_dir: If it's defined, DPDK has been pre-built and the build directory is located
> +            build_dir: DPDK has been pre-built and the build directory is located
>                  in a subdirectory of `dpdk_tree` or `tarball` root directory.
>
>          Raises:
>              RemoteFileNotFoundError: If the `build_dir` is expected but does not exist on the SUT
>                  node.
>          """
> -        if build_dir:
> -            remote_dpdk_build_dir = self.main_session.join_remote_path(
> -                self._remote_dpdk_tree_path, build_dir
> +        remote_dpdk_build_dir = self.main_session.join_remote_path(
> +            self._remote_dpdk_tree_path, build_dir
> +        )
> +        if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
> +            raise RemoteFileNotFoundError(
> +                f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
>              )
> -            if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
> -                raise RemoteFileNotFoundError(
> -                    f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
> -                )
>
> -            self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
> +        self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
>
> -    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
> +    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildOptionsConfiguration) -> None:
>          """Populate common environment variables and set the DPDK build related properties.
>
>          This method sets `compiler_version` for additional information and `remote_dpdk_build_dir`
> diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
> index d38ae36c2a..17b333e76a 100644
> --- a/dts/framework/testbed_model/topology.py
> +++ b/dts/framework/testbed_model/topology.py
> @@ -99,7 +99,16 @@ def __init__(self, sut_ports: Iterable[Port], tg_ports: Iterable[Port]):
>                      port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
>
>          self.type = TopologyType.get_from_value(len(port_links))
> -        dummy_port = Port(PortConfig("", "", "", "", "", ""))
> +        dummy_port = Port(
> +            "",
> +            PortConfig(
> +                pci="0000:00:00.0",
> +                os_driver_for_dpdk="",
> +                os_driver="",
> +                peer_node="",
> +                peer_pci="0000:00:00.0",
> +            ),
> +        )
>          self.tg_port_egress = dummy_port
>          self.sut_port_ingress = dummy_port
>          self.sut_port_egress = dummy_port
> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> index a319fa5320..945f6bbbbb 100644
> --- a/dts/framework/testbed_model/traffic_generator/__init__.py
> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> @@ -38,6 +38,4 @@ def create_traffic_generator(
>          case ScapyTrafficGeneratorConfig():
>              return ScapyTrafficGenerator(tg_node, traffic_generator_config, privileged=True)
>          case _:
> -            raise ConfigurationError(
> -                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
> -            )
> +            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
> diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> index 469a12a780..5ac61cd4e1 100644
> --- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> @@ -45,7 +45,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig, **kwargs):
>          """
>          self._config = config
>          self._tg_node = tg_node
> -        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
> +        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
>          super().__init__(tg_node, **kwargs)
>
>      def send_packet(self, packet: Packet, port: Port) -> None:
> diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> index 78a39e32c7..e862e3ac66 100644
> --- a/dts/framework/utils.py
> +++ b/dts/framework/utils.py
> @@ -28,7 +28,7 @@
>
>  from .exception import InternalError
>
> -REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
> +REGEX_FOR_PCI_ADDRESS: str = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"
>  _REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC: str = r"(?:[\da-fA-F]{2}[:-]){5}[\da-fA-F]{2}"
>  _REGEX_FOR_DOT_SEP_MAC: str = r"(?:[\da-fA-F]{4}.){2}[\da-fA-F]{4}"
>  REGEX_FOR_MAC_ADDRESS: str = rf"{_REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC}|{_REGEX_FOR_DOT_SEP_MAC}"
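
The old constant baked awk's `/.../` delimiters into the Python string, which made it
unusable as a Python regular expression just as this patch starts reusing it as a
pydantic `Field(pattern=...)`. Keeping the bare pattern in Python and adding the slashes
at the awk call site (see the smoke-test hunk below) serves both consumers:

    import re

    REGEX_FOR_PCI_ADDRESS = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"

    # Now directly usable from Python:
    assert re.search(REGEX_FOR_PCI_ADDRESS, "0000:00:08.0 'Virtio network device'")

    # And wrapped in slashes only where awk needs them:
    awk_filter = f"awk '/{REGEX_FOR_PCI_ADDRESS}/'"
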
> diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
> index d7870bd40f..bc3a2a6bf9 100644
> --- a/dts/tests/TestSuite_smoke_tests.py
> +++ b/dts/tests/TestSuite_smoke_tests.py
> @@ -127,7 +127,7 @@ def test_device_bound_to_driver(self) -> None:
>          path_to_devbind = self.sut_node.path_to_devbind_script
>
>          all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
> -            f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'",
> +            f"{path_to_devbind} --status | awk '/{REGEX_FOR_PCI_ADDRESS}/'",
>              SETTINGS.timeout,
>          ).stdout
>
> --
> 2.43.0
>
  

Patch

diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
deleted file mode 120000
index 5978642d76..0000000000
--- a/doc/api/dts/conf_yaml_schema.json
+++ /dev/null
@@ -1 +0,0 @@ 
-../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
index 261997aefa..cc266276c1 100644
--- a/doc/api/dts/framework.config.rst
+++ b/doc/api/dts/framework.config.rst
@@ -6,9 +6,3 @@  config - Configuration Package
 .. automodule:: framework.config
    :members:
    :show-inheritance:
-
-.. toctree::
-   :hidden:
-   :maxdepth: 1
-
-   framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
deleted file mode 100644
index a50a0c874a..0000000000
--- a/doc/api/dts/framework.config.types.rst
+++ /dev/null
@@ -1,8 +0,0 @@ 
-.. SPDX-License-Identifier: BSD-3-Clause
-
-config.types - Configuration Types
-==================================
-
-.. automodule:: framework.config.types
-   :members:
-   :show-inheritance:
diff --git a/dts/conf.yaml b/dts/conf.yaml
index 8a65a481d6..2496262854 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -5,11 +5,12 @@ 
 test_runs:
   # define one test run environment
   - dpdk_build:
-      # dpdk_tree: Commented out because `tarball` is defined.
-      tarball: dpdk-tarball.tar.xz
-      # Either `dpdk_tree` or `tarball` can be defined, but not both.
-      remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
-                    # is located on the SUT node, instead of the execution host.
+      dpdk_location:
+        # dpdk_tree: Commented out because `tarball` is defined.
+        tarball: dpdk-tarball.tar.xz
+        # Either `dpdk_tree` or `tarball` can be defined, but not both.
+        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
+                      # is located on the SUT node, instead of the execution host.
 
       # precompiled_build_dir: Commented out because `build_options` is defined.
       build_options:
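
For comparison, the same `dpdk_location` mapping pointing at a DPDK tree that already
lives on the SUT node might look like this (hypothetical path):

    test_runs:
      - dpdk_build:
          dpdk_location:
            dpdk_tree: /opt/dpdk  # Located on the SUT node because `remote` is true.
            remote: true
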
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 7403ccbf14..c86bfaaabf 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -2,17 +2,18 @@ 
 # Copyright(c) 2010-2021 Intel Corporation
 # Copyright(c) 2022-2023 University of New Hampshire
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Testbed configuration and test suite specification.
 
 This package offers classes that hold real-time information about the testbed, hold test run
 configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
-the YAML test run configuration file
-and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
+model.
 
 The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
-this package. The allowed keys and types inside this dictionary are defined in
-the :doc:`types <framework.config.types>` module.
+this package. The allowed keys and types inside this dictionary map directly to the
+:class:`Configuration` model, its fields and sub-models.
 
 The test run configuration has two main sections:
 
@@ -24,39 +25,28 @@ 
 
 The real-time information about testbed is supposed to be gathered at runtime.
 
-The classes defined in this package make heavy use of :mod:`dataclasses`.
-All of them use slots and are frozen:
+The classes defined in this package make heavy use of :mod:`pydantic`.
+Nearly all of them are frozen:
 
-    * Slots enables some optimizations, by pre-allocating space for the defined
-      attributes in the underlying data structure,
     * Frozen makes the object immutable. This enables further optimizations,
       and makes it thread safe should we ever want to move in that direction.
 """
 
-import json
-import os.path
 import tarfile
-from dataclasses import dataclass, fields
-from enum import auto, unique
-from pathlib import Path
-from typing import Union
+from enum import Enum, auto, unique
+from functools import cached_property
+from pathlib import Path, PurePath
+from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple
 
-import warlock  # type: ignore[import-untyped]
 import yaml
+from pydantic import BaseModel, Field, ValidationError, field_validator, model_validator
 from typing_extensions import Self
 
-from framework.config.types import (
-    ConfigurationDict,
-    DPDKBuildConfigDict,
-    DPDKConfigurationDict,
-    NodeConfigDict,
-    PortConfigDict,
-    TestRunConfigDict,
-    TestSuiteConfigDict,
-    TrafficGeneratorConfigDict,
-)
 from framework.exception import ConfigurationError
-from framework.utils import StrEnum
+from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
+
+if TYPE_CHECKING:
+    from framework.test_suite import TestSuiteSpec
 
 
 @unique
@@ -118,15 +108,14 @@  class Compiler(StrEnum):
 
 
 @unique
-class TrafficGeneratorType(StrEnum):
+class TrafficGeneratorType(str, Enum):
     """The supported traffic generators."""
 
     #:
-    SCAPY = auto()
+    SCAPY = "SCAPY"
 
 
-@dataclass(slots=True, frozen=True)
-class HugepageConfiguration:
+class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
     r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -138,12 +127,10 @@  class HugepageConfiguration:
     force_first_numa: bool
 
 
-@dataclass(slots=True, frozen=True)
-class PortConfig:
+class PortConfig(BaseModel, frozen=True, extra="forbid"):
     r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
-        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
         pci: The PCI address of the port.
         os_driver_for_dpdk: The operating system driver name for use with DPDK.
         os_driver: The operating system driver name when the operating system controls the port.
@@ -152,70 +139,57 @@  class PortConfig:
         peer_pci: The PCI address of the port connected to this port.
     """
 
-    node: str
-    pci: str
-    os_driver_for_dpdk: str
-    os_driver: str
-    peer_node: str
-    peer_pci: str
-
-    @classmethod
-    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
-        """A convenience method that creates the object from fewer inputs.
-
-        Args:
-            node: The node where this port exists.
-            d: The configuration dictionary.
-
-        Returns:
-            The port configuration instance.
-        """
-        return cls(node=node, **d)
-
-
-@dataclass(slots=True, frozen=True)
-class TrafficGeneratorConfig:
-    """The configuration of traffic generators.
-
-    The class will be expanded when more configuration is needed.
+    pci: str = Field(
+        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
+    )
+    os_driver_for_dpdk: str = Field(
+        description="The driver that the kernel should bind this device to for DPDK to use it.",
+        examples=["vfio-pci", "mlx5_core"],
+    )
+    os_driver: str = Field(
+        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
+    )
+    peer_node: str = Field(description="The name of the peer node this port is connected to.")
+    peer_pci: str = Field(
+        description="The PCI address of the peer port this port is connected to.",
+        pattern=REGEX_FOR_PCI_ADDRESS,
+    )
+
+
+class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"):
+    """A protocol required to define traffic generator types.
 
     Attributes:
-        traffic_generator_type: The type of the traffic generator.
+        type: The traffic generator type. Each child class is required to define it so that
+            the concrete configurations can be distinguished from one another.
     """
 
-    traffic_generator_type: TrafficGeneratorType
+    type: TrafficGeneratorType
 
-    @staticmethod
-    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
-        """A convenience method that produces traffic generator config of the proper type.
 
-        Args:
-            d: The configuration dictionary.
+class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="forbid"):
+    """Scapy traffic generator specific configuration."""
 
-        Returns:
-            The traffic generator configuration instance.
+    type: Literal[TrafficGeneratorType.SCAPY]
 
-        Raises:
-            ConfigurationError: An unknown traffic generator type was encountered.
-        """
-        match TrafficGeneratorType(d["type"]):
-            case TrafficGeneratorType.SCAPY:
-                return ScapyTrafficGeneratorConfig(
-                    traffic_generator_type=TrafficGeneratorType.SCAPY
-                )
-            case _:
-                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
 
+#: A union type discriminating traffic generators by the `type` field.
+TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
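For illustration, a minimal sketch of the discriminated field in use, assuming pydantic v2 (which `model_validate` elsewhere in this patch implies) and that the models above are importable:

    from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType

    tg = ScapyTrafficGeneratorConfig.model_validate({"type": "SCAPY"})
    assert tg.type == TrafficGeneratorType.SCAPY

With more than one generator type, the `type` field lets pydantic pick the matching class out of the union directly instead of trying each member in turn.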
 
-@dataclass(slots=True, frozen=True)
-class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
-    """Scapy traffic generator specific configuration."""
 
-    pass
+#: A field representing logical core ranges.
+LogicalCores = Annotated[
+    str,
+    Field(
+        description="Comma-separated list of logical cores to use. "
+        "An empty string means use all lcores.",
+        examples=["1,2,3,4,5,18-22", "10-15"],
+        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+    ),
+]
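A rough sketch of what this pattern accepts, using only the standard `re` module with the regex copied verbatim from above:

    import re

    LCORE_PATTERN = r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$"

    assert re.match(LCORE_PATTERN, "1,2,3,4,5,18-22")
    assert re.match(LCORE_PATTERN, "")  # empty string: use all lcores
    assert not re.match(LCORE_PATTERN, "1,,2")  # malformed lists are rejected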
 
 
-@dataclass(slots=True, frozen=True)
-class NodeConfiguration:
+class NodeConfiguration(BaseModel, frozen=True, extra="forbid"):
     r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -234,285 +208,317 @@  class NodeConfiguration:
         ports: The ports that can be used in testing.
     """
 
-    name: str
-    hostname: str
-    user: str
-    password: str | None
+    name: str = Field(description="A unique identifier for this node.")
+    hostname: str = Field(description="The hostname or IP address of the node.")
+    user: str = Field(description="The login user to use to connect to this node.")
+    password: str | None = Field(
+        default=None,
+        description="The login password to use to connect to this node. "
+        "SSH keys are STRONGLY preferred, use only as last resort.",
+    )
     arch: Architecture
     os: OS
-    lcores: str
-    use_first_core: bool
-    hugepages: HugepageConfiguration | None
-    ports: list[PortConfig]
-
-    @staticmethod
-    def from_dict(
-        d: NodeConfigDict,
-    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
-        """A convenience method that processes the inputs before creating a specialized instance.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            Either an SUT or TG configuration instance.
-        """
-        hugepage_config = None
-        if "hugepages_2mb" in d:
-            hugepage_config_dict = d["hugepages_2mb"]
-            if "force_first_numa" not in hugepage_config_dict:
-                hugepage_config_dict["force_first_numa"] = False
-            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
-
-        # The calls here contain duplicated code which is here because Mypy doesn't
-        # properly support dictionary unpacking with TypedDicts
-        if "traffic_generator" in d:
-            return TGNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
-            )
-        else:
-            return SutNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                memory_channels=d.get("memory_channels", 1),
-            )
+    lcores: LogicalCores = "1"
+    use_first_core: bool = Field(
+        default=False, description="DPDK won't use the first physical core if set to False."
+    )
+    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    ports: list[PortConfig] = Field(min_length=1)
 
 
-@dataclass(slots=True, frozen=True)
-class SutNodeConfiguration(NodeConfiguration):
+class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
     """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
 
     Attributes:
         memory_channels: The number of memory channels to use when running DPDK.
     """
 
-    memory_channels: int
+    memory_channels: int = Field(
+        default=1, description="Number of memory channels to use when running DPDK."
+    )
 
 
-@dataclass(slots=True, frozen=True)
-class TGNodeConfiguration(NodeConfiguration):
+class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
     """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
 
     Attributes:
         traffic_generator: The configuration of the traffic generator present on the TG node.
     """
 
-    traffic_generator: TrafficGeneratorConfig
+    traffic_generator: TrafficGeneratorConfigTypes
+
+
+#: Union type for all the node configuration types.
+NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildConfiguration:
-    """DPDK build configuration.
+def resolve_path(path: Path) -> Path:
+    """Resolve a path into a real path."""
+    return path.resolve()
 
-    The configuration used for building DPDK.
+
+class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"):
+    """DPDK location.
+
+    The path to the DPDK sources, build dir and type of location.
 
     Attributes:
-        arch: The target architecture to build for.
-        os: The target os to build for.
-        cpu: The target CPU to build for.
-        compiler: The compiler executable to use.
-        compiler_wrapper: This string will be put in front of the compiler when
-            executing the build. Useful for adding wrapper commands, such as ``ccache``.
-        name: The name of the compiler.
+        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
+            located on the SUT node, instead of the execution host.
     """
 
-    arch: Architecture
-    os: OS
-    cpu: CPUType
-    compiler: Compiler
-    compiler_wrapper: str
-    name: str
+    remote: bool = False
 
-    @classmethod
-    def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
-        r"""A convenience method that processes the inputs before creating an instance.
 
-        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
-        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK location parent class.
 
-        Args:
-            d: The configuration dictionary.
+    This class is meant to represent any location that is present only locally.
+    """
 
-        Returns:
-            The DPDK build configuration instance.
-        """
-        return cls(
-            arch=Architecture(d["arch"]),
-            os=OS(d["os"]),
-            cpu=CPUType(d["cpu"]),
-            compiler=Compiler(d["compiler"]),
-            compiler_wrapper=d.get("compiler_wrapper", ""),
-            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
-        )
+    remote: Literal[False] = False
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKLocation:
-    """DPDK location.
+class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK tree location.
 
-    The path to the DPDK sources, build dir and type of location.
+    Unlike :class:`RemoteDPDKTreeLocation`, this class validates the provided path on the fly.
 
     Attributes:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
-        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
-            located on the SUT node, instead of the execution host.
-        build_dir: If it's defined, DPDK has been pre-compiled and the build directory is located in
-            a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will be using
-            `build_options` from configuration to build the DPDK from source.
+        dpdk_tree: The path to the DPDK source tree directory.
     """
 
-    dpdk_tree: str | None
-    tarball: str | None
-    remote: bool
-    build_dir: str | None
+    dpdk_tree: Path
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes and validates the inputs before creating an instance.
+    #: Resolve the local DPDK tree path
+    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
 
-        Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if
-        `remote` is False.
+    @model_validator(mode="after")
+    def validate_dpdk_tree_path(self) -> Self:
+        """Validate the provided DPDK tree path."""
+        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
+        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
+        return self
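The `resolve_dpdk_tree_path` assignment above registers an existing plain function as a field validator; a standalone sketch of the same pattern with a hypothetical `Example` model:

    from pathlib import Path
    from pydantic import BaseModel, field_validator

    def resolve_path(path: Path) -> Path:
        return path.resolve()

    class Example(BaseModel):
        tree: Path
        # Registers the existing function as a validator for `tree`.
        resolve_tree = field_validator("tree")(resolve_path)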
 
-        Args:
-            d: The configuration dictionary.
 
-        Returns:
-            The DPDK location instance.
+class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK tarball location.
 
-        Raises:
-            ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem or they
-                aren't in the right format.
-        """
-        dpdk_tree = d.get("dpdk_tree")
-        tarball = d.get("tarball")
-        remote = d.get("remote", False)
-
-        if not remote:
-            if dpdk_tree:
-                if not Path(dpdk_tree).exists():
-                    raise ConfigurationError(
-                        f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                    )
-
-                if not Path(dpdk_tree).is_dir():
-                    raise ConfigurationError(f"The DPDK tree '{dpdk_tree}' must be a directory.")
-
-                dpdk_tree = os.path.realpath(dpdk_tree)
-
-            if tarball:
-                if not Path(tarball).exists():
-                    raise ConfigurationError(
-                        f"DPDK tarball '{tarball}' not found in local filesystem."
-                    )
-
-                if not tarfile.is_tarfile(tarball):
-                    raise ConfigurationError(
-                        f"The DPDK tarball '{tarball}' must be a valid tar archive."
-                    )
-
-        return cls(
-            dpdk_tree=dpdk_tree,
-            tarball=tarball,
-            remote=remote,
-            build_dir=d.get("precompiled_build_dir"),
-        )
+    Unlike :class:`RemoteDPDKTarballLocation`, this class validates the provided tarball on the fly.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
 
+    tarball: Path
 
-@dataclass
-class DPDKConfiguration:
-    """The configuration of the DPDK build.
+    #: Resolve the local tarball path
+    resolve_tarball_path = field_validator("tarball")(resolve_path)
 
-    The configuration contain the location of the DPDK and configuration used for
-    building it.
+    @model_validator(mode="after")
+    def validate_tarball_path(self) -> Self:
+        """Validate the provided tarball."""
+        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
+        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
+        return self
+
+
+class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK location parent class.
+
+    This class is meant to represent any location that is present only remotely.
+    """
+
+    remote: Literal[True] = True
+
+
+class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK tree location.
+
+    Unlike :class:`LocalDPDKTreeLocation`, this class performs no on-the-fly validation.
+
+    Attributes:
+        dpdk_tree: The path to the DPDK source tree directory.
+    """
+
+    dpdk_tree: PurePath
+
+
+class RemoteDPDKTarballLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK tarball location.
+
+    Unlike :class:`LocalDPDKTarballLocation`, this class performs no on-the-fly validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
+
+    tarball: PurePath
+
+
+#: Union type for different DPDK locations
+DPDKLocation = (
+    LocalDPDKTreeLocation
+    | LocalDPDKTarballLocation
+    | RemoteDPDKTreeLocation
+    | RemoteDPDKTarballLocation
+)
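A sketch of how a plain mapping resolves to a concrete class under this union, using pydantic's `TypeAdapter`; the remote variant is chosen here so that no local filesystem checks fire:

    from pydantic import TypeAdapter

    loc = TypeAdapter(DPDKLocation).validate_python(
        {"dpdk_tree": "/usr/src/dpdk", "remote": True}
    )
    assert isinstance(loc, RemoteDPDKTreeLocation)

The `remote` literals together with `extra="forbid"` make each union member unambiguous, so no explicit discriminator is needed.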
+
+
+class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """The base configuration for different types of build.
+
+    The configuration contain the location of the DPDK and configuration used for building it.
 
     Attributes:
         dpdk_location: The location of the DPDK tree.
-        dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-            DTS will use pre-built DPDK from `build_dir` in a :class:`DPDKLocation`.
     """
 
     dpdk_location: DPDKLocation
-    dpdk_build_config: DPDKBuildConfiguration | None
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
 
-        Args:
-            d: The configuration dictionary.
+class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
+    """DPDK precompiled build configuration.
 
-        Returns:
-            The DPDK configuration.
-        """
-        return cls(
-            dpdk_location=DPDKLocation.from_dict(d),
-            dpdk_build_config=(
-                DPDKBuildConfiguration.from_dict(d["build_options"])
-                if d.get("build_options")
-                else None
-            ),
-        )
+    Attributes:
+        precompiled_build_dir: The name of the subdirectory of the `dpdk_tree` or `tarball` root
+            directory that contains the pre-compiled binaries. Building from source instead is
+            covered by :class:`DPDKUncompiledBuildConfiguration`.
+    """
+
+    precompiled_build_dir: str = Field(min_length=1)
+
+
+class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """DPDK build options configuration.
+
+    The build options used for building DPDK.
+
+    Attributes:
+        arch: The target architecture to build for.
+        os: The target OS to build for.
+        cpu: The target CPU to build for.
+        compiler: The compiler executable to use.
+        compiler_wrapper: This string will be put in front of the compiler when executing the build.
+            Useful for adding wrapper commands, such as ``ccache``.
+    """
+
+    arch: Architecture
+    os: OS
+    cpu: CPUType
+    compiler: Compiler
+    compiler_wrapper: str = ""
 
+    @cached_property
+    def name(self) -> str:
+        """The name of the compiler."""
+        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
 
-@dataclass(slots=True, frozen=True)
-class TestSuiteConfig:
+
+class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
+    """DPDK uncompiled build configuration.
+
+    Attributes:
+        build_options: The build options to compile DPDK.
+    """
+
+    build_options: DPDKBuildOptionsConfiguration
+
+
+#: Union type for different build configurations
+DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
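Likewise for the build configuration union, a sketch using `TypeAdapter` (field names as defined by the models above):

    from pydantic import TypeAdapter

    build = TypeAdapter(DPDKBuildConfiguration).validate_python(
        {
            "dpdk_location": {"dpdk_tree": "/usr/src/dpdk", "remote": True},
            "precompiled_build_dir": "build",
        }
    )
    assert isinstance(build, DPDKPrecompiledBuildConfiguration)

Supplying `build_options` instead of `precompiled_build_dir` would select :class:`DPDKUncompiledBuildConfiguration`.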
+
+
+class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"):
     """Test suite configuration.
 
-    Information about a single test suite to be executed.
+    Information about a single test suite to be executed. It can also be represented as a string
+    instead of a mapping, for example:
+
+    .. code:: yaml
+
+        test_runs:
+        - test_suites:
+            # As string representation:
+            - hello_world # test all of `hello_world`, or
+            - hello_world hello_world_single_core # test only `hello_world_single_core`
+            # or as model fields:
+            - test_suite: hello_world
+              test_cases: [hello_world_single_core] # without this field all test cases are run
 
     Attributes:
-        test_suite: The name of the test suite module without the starting ``TestSuite_``.
-        test_cases: The names of test cases from this test suite to execute.
+        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
+        test_cases_names: The names of test cases from this test suite to execute.
             If empty, all test cases will be executed.
     """
 
-    test_suite: str
-    test_cases: list[str]
-
+    test_suite_name: str = Field(
+        title="Test suite name",
+        description="The identifying module name of the test suite without the prefix.",
+        alias="test_suite",
+    )
+    test_cases_names: list[str] = Field(
+        default_factory=list,
+        title="Test cases by name",
+        description="The identifying name of the test cases of the test suite.",
+        alias="test_cases",
+    )
+
+    @cached_property
+    def test_suite_spec(self) -> "TestSuiteSpec":
+        """The specification of the requested test suite."""
+        from framework.test_suite import find_by_name
+
+        test_suite_spec = find_by_name(self.test_suite_name)
+        assert (
+            test_suite_spec is not None
+        ), f"{self.test_suite_name} is not a valid test suite module name."
+        return test_suite_spec
+
+    @model_validator(mode="before")
     @classmethod
-    def from_dict(
-        cls,
-        entry: str | TestSuiteConfigDict,
-    ) -> Self:
-        """Create an instance from two different types.
+    def convert_from_string(cls, data: Any) -> Any:
+        """Convert the string representation of the model into a valid mapping."""
+        if isinstance(data, str):
+            [test_suite, *test_cases] = data.split()
+            return dict(test_suite=test_suite, test_cases=test_cases)
+        return data
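A sketch of the normalization this enables, mirroring the docstring example above (assumes the `hello_world` suite and its `hello_world_single_core` case are discoverable):

    config = TestSuiteConfig.model_validate("hello_world hello_world_single_core")
    assert config.test_suite_name == "hello_world"
    assert config.test_cases_names == ["hello_world_single_core"]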
+
+    @model_validator(mode="after")
+    def validate_names(self) -> Self:
+        """Validate the supplied test suite and test cases names.
+
+        This validator relies on the cached property `test_suite_spec` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        # Materialize into a list: a `map` iterator would be exhausted after the
+        # first membership check below.
+        available_test_cases = [
+            t.name for t in self.test_suite_spec.class_obj.get_test_cases()
+        ]
+        for requested_test_case in self.test_cases_names:
+            assert requested_test_case in available_test_cases, (
+                f"{requested_test_case} is not a valid test case "
+                f"of test suite {self.test_suite_name}."
+            )
 
-        Args:
-            entry: Either a suite name or a dictionary containing the config.
+        return self
 
-        Returns:
-            The test suite configuration instance.
-        """
-        if isinstance(entry, str):
-            return cls(test_suite=entry, test_cases=[])
-        elif isinstance(entry, dict):
-            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
-        else:
-            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
 
+class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """The SUT node configuration of a test run.
 
-@dataclass(slots=True, frozen=True)
-class TestRunConfiguration:
+    Attributes:
+        node_name: The SUT node to use in this test run.
+        vdevs: The names of virtual devices to test.
+    """
+
+    node_name: str
+    vdevs: list[str] = Field(default_factory=list)
+
+
+class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"):
     """The configuration of a test run.
 
     The configuration contains testbed information, what tests to execute
@@ -524,144 +530,130 @@  class TestRunConfiguration:
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
         test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node to use in this test run.
-        traffic_generator_node: The TG node to use in this test run.
-        vdevs: The names of virtual devices to test.
+        system_under_test_node: The SUT node configuration to use in this test run.
+        traffic_generator_node: The TG node name to use in this test run.
         random_seed: The seed to use for pseudo-random generation.
     """
 
-    dpdk_config: DPDKConfiguration
-    perf: bool
-    func: bool
-    skip_smoke_tests: bool
-    test_suites: list[TestSuiteConfig]
-    system_under_test_node: SutNodeConfiguration
-    traffic_generator_node: TGNodeConfiguration
-    vdevs: list[str]
-    random_seed: int | None
-
-    @classmethod
-    def from_dict(
-        cls,
-        d: TestRunConfigDict,
-        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
-    ) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
-
-        The DPDK build and the test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
-
-        Args:
-            d: The test run configuration dictionary.
-            node_map: A dictionary mapping node names to their config objects.
-
-        Returns:
-            The test run configuration instance.
-        """
-        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
-        sut_name = d["system_under_test_node"]["node_name"]
-        skip_smoke_tests = d.get("skip_smoke_tests", False)
-        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
-        system_under_test_node = node_map[sut_name]
-        assert isinstance(
-            system_under_test_node, SutNodeConfiguration
-        ), f"Invalid SUT configuration {system_under_test_node}"
-
-        tg_name = d["traffic_generator_node"]
-        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
-        traffic_generator_node = node_map[tg_name]
-        assert isinstance(
-            traffic_generator_node, TGNodeConfiguration
-        ), f"Invalid TG configuration {traffic_generator_node}"
-
-        vdevs = (
-            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
-        )
-        random_seed = d.get("random_seed", None)
-        return cls(
-            dpdk_config=DPDKConfiguration.from_dict(d["dpdk_build"]),
-            perf=d["perf"],
-            func=d["func"],
-            skip_smoke_tests=skip_smoke_tests,
-            test_suites=test_suites,
-            system_under_test_node=system_under_test_node,
-            traffic_generator_node=traffic_generator_node,
-            vdevs=vdevs,
-            random_seed=random_seed,
-        )
-
-    def copy_and_modify(self, **kwargs) -> Self:
-        """Create a shallow copy with any of the fields modified.
+    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
+    perf: bool = Field(description="Enable performance testing.")
+    func: bool = Field(description="Enable functional testing.")
+    skip_smoke_tests: bool = False
+    test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    system_under_test_node: TestRunSUTNodeConfiguration
+    traffic_generator_node: str
+    random_seed: int | None = None
 
-        The only new data are those passed to this method.
-        The rest are copied from the object's fields calling the method.
 
-        Args:
-            **kwargs: The names and types of keyword arguments are defined
-                by the fields of the :class:`TestRunConfiguration` class.
+class TestRunWithNodesConfiguration(NamedTuple):
+    """Tuple containing the configuration of the test run and its associated nodes."""
 
-        Returns:
-            The copied and modified test run configuration.
-        """
-        new_config = {}
-        for field in fields(self):
-            if field.name in kwargs:
-                new_config[field.name] = kwargs[field.name]
-            else:
-                new_config[field.name] = getattr(self, field.name)
-
-        return type(self)(**new_config)
+    #:
+    test_run_config: TestRunConfiguration
+    #:
+    sut_node_config: SutNodeConfiguration
+    #:
+    tg_node_config: TGNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class Configuration:
+class Configuration(BaseModel, extra="forbid"):
     """DTS testbed and test configuration.
 
-    The node configuration is not stored in this object. Rather, all used node configurations
-    are stored inside the test run configuration where the nodes are actually used.
-
     Attributes:
         test_runs: Test run configurations.
+        nodes: Node configurations.
     """
 
-    test_runs: list[TestRunConfiguration]
+    test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
-    @classmethod
-    def from_dict(cls, d: ConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
+    @cached_property
+    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
+        """List of test runs with the associated nodes."""
+        test_runs_with_nodes = []
 
-        DPDK build and test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
+        for test_run_no, test_run in enumerate(self.test_runs):
+            sut_node_name = test_run.system_under_test_node.node_name
+            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
 
-        Args:
-            d: The configuration dictionary.
+            assert sut_node is not None, (
+                f"test_runs.{test_run_no}.sut_node_config.node_name "
+                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
+            )
+            assert isinstance(sut_node, SutNodeConfiguration), (
+                f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, "
+                "but it is not a valid SUT node"
+            )
 
-        Returns:
-            The whole configuration instance.
-        """
-        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
-            map(NodeConfiguration.from_dict, d["nodes"])
-        )
-        assert len(nodes) > 0, "There must be a node to test"
+            tg_node_name = test_run.traffic_generator_node
+            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
 
-        node_map = {node.name: node for node in nodes}
-        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
+            assert tg_node is not None, (
+                f"test_runs.{test_run_no}.tg_node_name "
+                f"({test_run.traffic_generator_node}) is not a valid node name"
+            )
+            assert isinstance(tg_node, TGNodeConfiguration), (
+                f"test_runs.{test_run_no}.tg_node_name is a valid node name, "
+                "but it is not a valid TG node"
+            )
 
-        test_runs: list[TestRunConfiguration] = list(
-            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
-        )
+            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
+
+        return test_runs_with_nodes
+
+    @field_validator("nodes")
+    @classmethod
+    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
+        """Validate that the node names are unique."""
+        nodes_by_name: dict[str, int] = {}
+        for node_no, node in enumerate(nodes):
+            assert node.name not in nodes_by_name, (
+                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
+                f"({node.name})"
+            )
+            nodes_by_name[node.name] = node_no
+
+        return nodes
+
+    @model_validator(mode="after")
+    def validate_ports(self) -> Self:
+        """Validate that the ports are all linked to valid ones."""
+        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
+            (node.name, port.pci): False for node in self.nodes for port in node.ports
+        }
+
+        for node_no, node in enumerate(self.nodes):
+            for port_no, port in enumerate(node.ports):
+                peer_port_identifier = (port.peer_node, port.peer_pci)
+                peer_port = port_links.get(peer_port_identifier, None)
+                assert peer_port is not None, (
+                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
+                )
+                assert peer_port is False, (
+                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
+                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
+                )
+                port_links[peer_port_identifier] = (node_no, port_no)
 
-        return cls(test_runs=test_runs)
+        return self
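A minimal standalone rendition of the existence half of this check, with placeholder node names and PCI addresses:

    links = {
        ("sut", "0000:00:08.0"): ("tg", "0000:00:08.0"),
        ("tg", "0000:00:08.0"): ("sut", "0000:00:08.0"),
    }
    for port, peer in links.items():
        assert peer in links, f"invalid peer port specified for {port}"

Every port names its peer, and the peer must exist and not be claimed twice, preserving the old schema's requirement that each link be described on both sides.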
+
+    @model_validator(mode="after")
+    def validate_test_runs_with_nodes(self) -> Self:
+        """Validate the test runs to nodes associations.
+
+        This validator relies on the cached property `test_runs_with_nodes` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        _ = self.test_runs_with_nodes
+        return self
 
 
 def load_config(config_file_path: Path) -> Configuration:
     """Load DTS test run configuration from a file.
 
-    Load the YAML test run configuration file
-    and :download:`the configuration file schema <conf_yaml_schema.json>`,
-    validate the test run configuration file, and create a test run configuration object.
+    Load the YAML test run configuration file, validate it, and create a test run configuration
+    object.
 
     The YAML test run configuration file is specified in the :option:`--config-file` command line
     argument or the :envvar:`DTS_CFG_FILE` environment variable.
@@ -671,14 +663,14 @@  def load_config(config_file_path: Path) -> Configuration:
 
     Returns:
         The parsed test run configuration.
+
+    Raises:
+        ConfigurationError: If the supplied configuration file is invalid.
     """
     with open(config_file_path, "r") as f:
         config_data = yaml.safe_load(f)
 
-    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
-
-    with open(schema_path, "r") as f:
-        schema = json.load(f)
-    config = warlock.model_factory(schema, name="_Config")(config_data)
-    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
-    return config_obj
+    try:
+        return Configuration.model_validate(config_data)
+    except ValidationError as e:
+        raise ConfigurationError("failed to load the supplied configuration") from e
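Hypothetical usage, assuming a valid configuration file at the given path:

    from pathlib import Path

    from framework.config import load_config

    config = load_config(Path("conf.yaml"))
    for test_run, sut, tg in config.test_runs_with_nodes:
        print(test_run.test_suites, sut.name, tg.name)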
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
deleted file mode 100644
index cc3e78cef5..0000000000
--- a/dts/framework/config/conf_yaml_schema.json
+++ /dev/null
@@ -1,459 +0,0 @@ 
-{
-  "$schema": "https://json-schema.org/draft-07/schema",
-  "title": "DTS Config Schema",
-  "definitions": {
-    "node_name": {
-      "type": "string",
-      "description": "A unique identifier for a node"
-    },
-    "NIC": {
-      "type": "string",
-      "enum": [
-        "ALL",
-        "ConnectX3_MT4103",
-        "ConnectX4_LX_MT4117",
-        "ConnectX4_MT4115",
-        "ConnectX5_MT4119",
-        "ConnectX5_MT4121",
-        "I40E_10G-10G_BASE_T_BC",
-        "I40E_10G-10G_BASE_T_X722",
-        "I40E_10G-SFP_X722",
-        "I40E_10G-SFP_XL710",
-        "I40E_10G-X722_A0",
-        "I40E_1G-1G_BASE_T_X722",
-        "I40E_25G-25G_SFP28",
-        "I40E_40G-QSFP_A",
-        "I40E_40G-QSFP_B",
-        "IAVF-ADAPTIVE_VF",
-        "IAVF-VF",
-        "IAVF_10G-X722_VF",
-        "ICE_100G-E810C_QSFP",
-        "ICE_25G-E810C_SFP",
-        "ICE_25G-E810_XXV_SFP",
-        "IGB-I350_VF",
-        "IGB_1G-82540EM",
-        "IGB_1G-82545EM_COPPER",
-        "IGB_1G-82571EB_COPPER",
-        "IGB_1G-82574L",
-        "IGB_1G-82576",
-        "IGB_1G-82576_QUAD_COPPER",
-        "IGB_1G-82576_QUAD_COPPER_ET2",
-        "IGB_1G-82580_COPPER",
-        "IGB_1G-I210_COPPER",
-        "IGB_1G-I350_COPPER",
-        "IGB_1G-I354_SGMII",
-        "IGB_1G-PCH_LPTLP_I218_LM",
-        "IGB_1G-PCH_LPTLP_I218_V",
-        "IGB_1G-PCH_LPT_I217_LM",
-        "IGB_1G-PCH_LPT_I217_V",
-        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
-        "IGC-I225_LM",
-        "IGC-I226_LM",
-        "IXGBE_10G-82599_SFP",
-        "IXGBE_10G-82599_SFP_SF_QP",
-        "IXGBE_10G-82599_T3_LOM",
-        "IXGBE_10G-82599_VF",
-        "IXGBE_10G-X540T",
-        "IXGBE_10G-X540_VF",
-        "IXGBE_10G-X550EM_A_SFP",
-        "IXGBE_10G-X550EM_X_10G_T",
-        "IXGBE_10G-X550EM_X_SFP",
-        "IXGBE_10G-X550EM_X_VF",
-        "IXGBE_10G-X550T",
-        "IXGBE_10G-X550_VF",
-        "brcm_57414",
-        "brcm_P2100G",
-        "cavium_0011",
-        "cavium_a034",
-        "cavium_a063",
-        "cavium_a064",
-        "fastlinq_ql41000",
-        "fastlinq_ql41000_vf",
-        "fastlinq_ql45000",
-        "fastlinq_ql45000_vf",
-        "hi1822",
-        "virtio"
-      ]
-    },
-
-    "ARCH": {
-      "type": "string",
-      "enum": [
-        "x86_64",
-        "arm64",
-        "ppc64le"
-      ]
-    },
-    "OS": {
-      "type": "string",
-      "enum": [
-        "linux"
-      ]
-    },
-    "cpu": {
-      "type": "string",
-      "description": "Native should be the default on x86",
-      "enum": [
-        "native",
-        "armv8a",
-        "dpaa2",
-        "thunderx",
-        "xgene1"
-      ]
-    },
-    "compiler": {
-      "type": "string",
-      "enum": [
-        "gcc",
-        "clang",
-        "icc",
-        "mscv"
-      ]
-    },
-    "build_options": {
-      "type": "object",
-      "properties": {
-        "arch": {
-          "type": "string",
-          "enum": [
-            "ALL",
-            "x86_64",
-            "arm64",
-            "ppc64le",
-            "other"
-          ]
-        },
-        "os": {
-          "$ref": "#/definitions/OS"
-        },
-        "cpu": {
-          "$ref": "#/definitions/cpu"
-        },
-        "compiler": {
-          "$ref": "#/definitions/compiler"
-        },
-        "compiler_wrapper": {
-          "type": "string",
-          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "arch",
-        "os",
-        "cpu",
-        "compiler"
-      ]
-    },
-    "dpdk_build": {
-      "type": "object",
-      "description": "DPDK source and build configuration.",
-      "properties": {
-        "dpdk_tree": {
-          "type": "string",
-          "description": "The path to the DPDK source tree directory to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "tarball": {
-          "type": "string",
-          "description": "The path to the DPDK source tarball to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "remote": {
-          "type": "boolean",
-          "description": "Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host."
-        },
-        "precompiled_build_dir": {
-          "type": "string",
-          "description": "If it's defined, DPDK has been pre-built and the build directory is located in a subdirectory of DPDK tree root directory. Otherwise, will be using a `build_options` to build the DPDK from source. Either this or `build_options` must be defined, but not both."
-        },
-        "build_options": {
-          "$ref": "#/definitions/build_options",
-          "description": "Either this or `precompiled_build_dir` must be defined, but not both. DPDK build configuration supported by DTS."
-        }
-      },
-      "allOf": [
-        {
-          "oneOf": [
-            {
-            "required": [
-              "dpdk_tree"
-              ]
-            },
-            {
-              "required": [
-                "tarball"
-              ]
-            }
-          ]
-        },
-        {
-          "oneOf": [
-            {
-              "required": [
-                "precompiled_build_dir"
-              ]
-            },
-            {
-              "required": [
-                "build_options"
-              ]
-            }
-          ]
-        }
-      ],
-      "additionalProperties": false
-    },
-    "hugepages_2mb": {
-      "type": "object",
-      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
-      "properties": {
-        "number_of": {
-          "type": "integer",
-          "description": "The number of hugepages to configure. Hugepage size will be the system default."
-        },
-        "force_first_numa": {
-          "type": "boolean",
-          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "number_of"
-      ]
-    },
-    "mac_address": {
-      "type": "string",
-      "description": "A MAC address",
-      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
-    },
-    "pci_address": {
-      "type": "string",
-      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
-    },
-    "port_peer_address": {
-      "description": "Peer is a TRex port, and IXIA port or a PCI address",
-      "oneOf": [
-        {
-          "description": "PCI peer port",
-          "$ref": "#/definitions/pci_address"
-        }
-      ]
-    },
-    "test_suite": {
-      "type": "string",
-      "enum": [
-        "hello_world",
-        "os_udp",
-        "pmd_buffer_scatter",
-        "vlan"
-      ]
-    },
-    "test_target": {
-      "type": "object",
-      "properties": {
-        "suite": {
-          "$ref": "#/definitions/test_suite"
-        },
-        "cases": {
-          "type": "array",
-          "description": "If specified, only this subset of test suite's test cases will be run.",
-          "items": {
-            "type": "string"
-          },
-          "minimum": 1
-        }
-      },
-      "required": [
-        "suite"
-      ],
-      "additionalProperties": false
-    }
-  },
-  "type": "object",
-  "properties": {
-    "nodes": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "name": {
-            "type": "string",
-            "description": "A unique identifier for this node"
-          },
-          "hostname": {
-            "type": "string",
-            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
-          },
-          "user": {
-            "type": "string",
-            "description": "The user to access this node with."
-          },
-          "password": {
-            "type": "string",
-            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
-          },
-          "arch": {
-            "$ref": "#/definitions/ARCH"
-          },
-          "os": {
-            "$ref": "#/definitions/OS"
-          },
-          "lcores": {
-            "type": "string",
-            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
-            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
-          },
-          "use_first_core": {
-            "type": "boolean",
-            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
-          },
-          "memory_channels": {
-            "type": "integer",
-            "description": "How many memory channels to use. Optional, defaults to 1."
-          },
-          "hugepages_2mb": {
-            "$ref": "#/definitions/hugepages_2mb"
-          },
-          "ports": {
-            "type": "array",
-            "items": {
-              "type": "object",
-              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
-              "properties": {
-                "pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The local PCI address of the port"
-                },
-                "os_driver_for_dpdk": {
-                  "type": "string",
-                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
-                },
-                "os_driver": {
-                  "type": "string",
-                  "description": "The driver normally used by this port (ex: i40e)"
-                },
-                "peer_node": {
-                  "type": "string",
-                  "description": "The name of the node the peer port is on"
-                },
-                "peer_pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The PCI address of the peer port"
-                }
-              },
-              "additionalProperties": false,
-              "required": [
-                "pci",
-                "os_driver_for_dpdk",
-                "os_driver",
-                "peer_node",
-                "peer_pci"
-              ]
-            },
-            "minimum": 1
-          },
-          "traffic_generator": {
-            "oneOf": [
-              {
-                "type": "object",
-                "description": "Scapy traffic generator. Used for functional testing.",
-                "properties": {
-                  "type": {
-                    "type": "string",
-                    "enum": [
-                      "SCAPY"
-                    ]
-                  }
-                }
-              }
-            ]
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "name",
-          "hostname",
-          "user",
-          "arch",
-          "os"
-        ]
-      },
-      "minimum": 1
-    },
-    "test_runs": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "dpdk_build": {
-            "$ref": "#/definitions/dpdk_build"
-          },
-          "perf": {
-            "type": "boolean",
-            "description": "Enable performance testing."
-          },
-          "func": {
-            "type": "boolean",
-            "description": "Enable functional testing."
-          },
-          "test_suites": {
-            "type": "array",
-            "items": {
-              "oneOf": [
-                {
-                  "$ref": "#/definitions/test_suite"
-                },
-                {
-                  "$ref": "#/definitions/test_target"
-                }
-              ]
-            }
-          },
-          "skip_smoke_tests": {
-            "description": "Optional field that allows you to skip smoke testing",
-            "type": "boolean"
-          },
-          "system_under_test_node": {
-            "type":"object",
-            "properties": {
-              "node_name": {
-                "$ref": "#/definitions/node_name"
-              },
-              "vdevs": {
-                "description": "Optional list of names of vdevs to be used in the test run",
-                "type": "array",
-                "items": {
-                  "type": "string"
-                }
-              }
-            },
-            "required": [
-              "node_name"
-            ]
-          },
-          "traffic_generator_node": {
-            "$ref": "#/definitions/node_name"
-          },
-          "random_seed": {
-            "type": "integer",
-            "description": "Optional field. Allows you to set a seed for pseudo-random generation."
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "dpdk_build",
-          "perf",
-          "func",
-          "test_suites",
-          "system_under_test_node",
-          "traffic_generator_node"
-        ]
-      },
-      "minimum": 1
-    }
-  },
-  "required": [
-    "test_runs",
-    "nodes"
-  ],
-  "additionalProperties": false
-}
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
deleted file mode 100644
index 02e738a61e..0000000000
--- a/dts/framework/config/types.py
+++ /dev/null
@@ -1,149 +0,0 @@ 
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-"""Configuration dictionary contents specification.
-
-These type definitions serve as documentation of the configuration dictionary contents.
-
-The definitions use the built-in :class:`~typing.TypedDict` construct.
-"""
-
-from typing import TypedDict
-
-
-class PortConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    pci: str
-    #:
-    os_driver_for_dpdk: str
-    #:
-    os_driver: str
-    #:
-    peer_node: str
-    #:
-    peer_pci: str
-
-
-class TrafficGeneratorConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    type: str
-
-
-class HugepageConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    number_of: int
-    #:
-    force_first_numa: bool
-
-
-class NodeConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    hugepages_2mb: HugepageConfigurationDict
-    #:
-    name: str
-    #:
-    hostname: str
-    #:
-    user: str
-    #:
-    password: str
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    lcores: str
-    #:
-    use_first_core: bool
-    #:
-    ports: list[PortConfigDict]
-    #:
-    memory_channels: int
-    #:
-    traffic_generator: TrafficGeneratorConfigDict
-
-
-class DPDKBuildConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    cpu: str
-    #:
-    compiler: str
-    #:
-    compiler_wrapper: str
-
-
-class DPDKConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_tree: str | None
-    #:
-    tarball: str | None
-    #:
-    remote: bool
-    #:
-    precompiled_build_dir: str | None
-    #:
-    build_options: DPDKBuildConfigDict
-
-
-class TestSuiteConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    suite: str
-    #:
-    cases: list[str]
-
-
-class TestRunSUTConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    node_name: str
-    #:
-    vdevs: list[str]
-
-
-class TestRunConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_build: DPDKConfigurationDict
-    #:
-    perf: bool
-    #:
-    func: bool
-    #:
-    skip_smoke_tests: bool
-    #:
-    test_suites: TestSuiteConfigDict
-    #:
-    system_under_test_node: TestRunSUTConfigDict
-    #:
-    traffic_generator_node: str
-    #:
-    random_seed: int
-
-
-class ConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    nodes: list[NodeConfigDict]
-    #:
-    test_runs: list[TestRunConfigDict]
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 195622c653..c3d9a27a8c 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -30,7 +30,15 @@ 
 from framework.testbed_model.sut_node import SutNode
 from framework.testbed_model.tg_node import TGNode
 
-from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config
+from .config import (
+    Configuration,
+    DPDKPrecompiledBuildConfiguration,
+    SutNodeConfiguration,
+    TestRunConfiguration,
+    TestSuiteConfig,
+    TGNodeConfiguration,
+    load_config,
+)
 from .exception import (
     BlockingTestSuiteError,
     ConfigurationError,
@@ -133,11 +141,10 @@  def run(self) -> None:
             self._result.update_setup(Result.PASS)
 
             # for all test run sections
-            for test_run_config in self._configuration.test_runs:
+            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
+                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
                 self._logger.set_stage(DtsStage.test_run_setup)
-                self._logger.info(
-                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
-                )
+                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
                 self._init_random_seed(test_run_config)
                 test_run_result = self._result.add_test_run(test_run_config)
                 # we don't want to modify the original config, so create a copy
@@ -145,7 +152,7 @@  def run(self) -> None:
                     SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
                 )
                 if not test_run_config.skip_smoke_tests:
-                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
+                    test_run_test_suites[:0] = [TestSuiteConfig(test_suite="smoke_tests")]
                 try:
                     test_suites_with_cases = self._get_test_suites_with_cases(
                         test_run_test_suites, test_run_config.func, test_run_config.perf
@@ -161,6 +168,8 @@  def run(self) -> None:
                     self._connect_nodes_and_run_test_run(
                         sut_nodes,
                         tg_nodes,
+                        sut_node_config,
+                        tg_node_config,
                         test_run_config,
                         test_run_result,
                         test_suites_with_cases,
@@ -223,10 +232,10 @@  def _get_test_suites_with_cases(
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
+            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
-                test_suite_config.test_cases
+                test_suite_config.test_cases_names
             )
             if func:
                 test_cases.extend(func_test_cases)
@@ -305,6 +314,8 @@  def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
         tg_nodes: dict[str, TGNode],
+        sut_node_config: SutNodeConfiguration,
+        tg_node_config: TGNodeConfiguration,
         test_run_config: TestRunConfiguration,
         test_run_result: TestRunResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
@@ -319,24 +330,26 @@  def _connect_nodes_and_run_test_run(
         Args:
             sut_nodes: A dictionary storing connected/to be connected SUT nodes.
             tg_nodes: A dictionary storing connected/to be connected TG nodes.
+            sut_node_config: The test run's SUT node configuration.
+            tg_node_config: The test run's TG node configuration.
             test_run_config: A test run configuration.
             test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
-        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
+        sut_node = sut_nodes.get(sut_node_config.name)
+        tg_node = tg_nodes.get(tg_node_config.name)
 
         try:
             if not sut_node:
-                sut_node = SutNode(test_run_config.system_under_test_node)
+                sut_node = SutNode(sut_node_config)
                 sut_nodes[sut_node.name] = sut_node
             if not tg_node:
-                tg_node = TGNode(test_run_config.traffic_generator_node)
+                tg_node = TGNode(tg_node_config)
                 tg_nodes[tg_node.name] = tg_node
         except Exception as e:
-            failed_node = test_run_config.system_under_test_node.name
+            failed_node = test_run_config.system_under_test_node.node_name
             if sut_node:
-                failed_node = test_run_config.traffic_generator_node.name
+                failed_node = test_run_config.traffic_generator_node
             self._logger.exception(f"The Creation of node {failed_node} failed.")
             test_run_result.update_setup(Result.FAIL, e)
 
@@ -369,14 +382,22 @@  def _run_test_run(
             ConfigurationError: If the DPDK sources or build is not set up from config or settings.
         """
         self._logger.info(
-            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
+            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
         )
         test_run_result.add_sut_info(sut_node.node_info)
         try:
-            dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_config.dpdk_location
-            sut_node.set_up_test_run(test_run_config, dpdk_location)
+            dpdk_build_config = test_run_config.dpdk_config
+            if new_location := SETTINGS.dpdk_location:
+                dpdk_build_config = dpdk_build_config.model_copy(
+                    update={"dpdk_location": new_location}
+                )
+            if build_dir := SETTINGS.precompiled_build_dir:
+                dpdk_build_config = DPDKPrecompiledBuildConfiguration(
+                    dpdk_location=dpdk_build_config.dpdk_location, precompiled_build_dir=build_dir
+                )
+            sut_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
-            tg_node.set_up_test_run(test_run_config, dpdk_location)
+            tg_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.update_setup(Result.PASS)
         except Exception as e:
             self._logger.exception("Test run setup failed.")
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index a452319b90..1253ed86ac 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -60,9 +60,8 @@ 
 .. option:: --precompiled-build-dir
 .. envvar:: DTS_PRECOMPILED_BUILD_DIR
 
-    Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
-    located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with
-    --dpdk-tree or --tarball.
+    Define the subdirectory under the DPDK tree root directory or tarball where the pre-compiled
+    binaries are located.
 
 .. option:: --test-suite
 .. envvar:: DTS_TEST_SUITES
@@ -95,13 +94,21 @@ 
 import argparse
 import os
 import sys
-import tarfile
 from argparse import Action, ArgumentDefaultsHelpFormatter, _get_action_name
 from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Callable
 
-from .config import DPDKLocation, TestSuiteConfig
+from pydantic import ValidationError
+
+from .config import (
+    DPDKLocation,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
+    TestSuiteConfig,
+)
 
 
 @dataclass(slots=True)
@@ -122,6 +129,8 @@  class Settings:
     #:
     dpdk_location: DPDKLocation | None = None
     #:
+    precompiled_build_dir: str | None = None
+    #:
     compile_timeout: float = 1200
     #:
     test_suites: list[TestSuiteConfig] = field(default_factory=list)
@@ -383,13 +392,11 @@  def _get_parser() -> _DTSArgumentParser:
 
     action = dpdk_build.add_argument(
         "--precompiled-build-dir",
-        help="Define the subdirectory under the DPDK tree root directory where the pre-compiled "
-        "binaries are located. If set, DTS will build DPDK under the `build` directory instead. "
-        "Can only be used with --dpdk-tree or --tarball.",
+        help="Define the subdirectory under the DPDK tree root directory or tarball where the "
+        "pre-compiled binaries are located.",
         metavar="DIR_NAME",
     )
     _add_env_var_to_action(action)
-    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
 
     action = parser.add_argument(
         "--compile-timeout",
@@ -442,61 +449,61 @@  def _get_parser() -> _DTSArgumentParser:
 
 
 def _process_dpdk_location(
+    parser: _DTSArgumentParser,
     dpdk_tree: str | None,
     tarball: str | None,
     remote: bool,
-    build_dir: str | None,
-):
+) -> DPDKLocation | None:
     """Process and validate DPDK build arguments.
 
     Ensures that either `dpdk_tree` or `tarball` is provided. Validate existence and format of
     `dpdk_tree` or `tarball` on local filesystem, if `remote` is False. Constructs and returns
-    the :class:`DPDKLocation` with the provided parameters if validation is successful.
+    any valid :class:`DPDKLocation` with the provided parameters if validation is successful.
 
     Args:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
+        dpdk_tree: The path to the DPDK source tree directory.
+        tarball: The path to the DPDK tarball.
         remote: If :data:`True`, `dpdk_tree` or `tarball` is located on the SUT node, instead of the
             execution host.
-        build_dir: If it's defined, DPDK has been pre-built and the build directory is located in a
-            subdirectory of `dpdk_tree` or `tarball` root directory.
 
     Returns:
         A DPDK location if construction is successful, otherwise None.
-
-    Raises:
-        argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` not found in local filesystem or
-            they aren't in the right format.
     """
-    if not (dpdk_tree or tarball):
-        return None
-
-    if not remote:
-        if dpdk_tree:
-            if not Path(dpdk_tree).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                )
-
-            if not Path(dpdk_tree).is_dir():
-                raise argparse.ArgumentTypeError(f"DPDK tree '{dpdk_tree}' must be a directory.")
-
-            dpdk_tree = os.path.realpath(dpdk_tree)
-
-        if tarball:
-            if not Path(tarball).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' not found in local filesystem."
-                )
-
-            if not tarfile.is_tarfile(tarball):
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' must be a valid tar archive."
-                )
-
-    return DPDKLocation(dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir)
+    if dpdk_tree:
+        action = parser.find_action("dpdk_tree", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+            else:
+                return LocalDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tree supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    if tarball:
+        action = parser.find_action("tarball", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTarballLocation.model_validate({"tarball": tarball})
+            else:
+                return LocalDPDKTarballLocation.model_validate({"tarball": tarball})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tarball supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    return None
 
 
 def _process_test_suites(
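
The filesystem checks that used to live here have moved into the location models themselves, so this function only has to pick the right model and report any `ValidationError`. A rough sketch of how such a model-side check can be expressed, using a simplified stand-in (the real validators live on the `*DPDKTreeLocation` and `*DPDKTarballLocation` classes):

    from pathlib import Path

    from pydantic import BaseModel, ValidationError, field_validator

    class LocalTreeLocation(BaseModel):  # illustrative stand-in
        dpdk_tree: Path

        @field_validator("dpdk_tree")
        @classmethod
        def _validate_tree(cls, value: Path) -> Path:
            # Same checks the argparse layer used to perform by hand.
            if not value.exists():
                raise ValueError(f"DPDK tree '{value}' not found in local filesystem.")
            if not value.is_dir():
                raise ValueError(f"DPDK tree '{value}' must be a directory.")
            return value.resolve()

    try:
        LocalTreeLocation.model_validate({"dpdk_tree": "/no/such/dir"})
    except ValidationError as e:
        print(e)  # pydantic reports the failing field along with the message
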
@@ -512,11 +519,24 @@  def _process_test_suites(
     Returns:
         A list of test suite configurations to execute.
     """
-    if parser.find_action("test_suites", _is_from_env):
+    action = parser.find_action("test_suites", _is_from_env)
+    if action:
         # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
         args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
 
-    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    try:
+        return [
+            TestSuiteConfig(test_suite=test_suite, test_cases=test_cases)
+            for [test_suite, *test_cases] in args
+        ]
+    except ValidationError as e:
+        print(
+            "An error has occurred while validating the test suites supplied in the "
+            f"{'environment variable' if action else 'arguments'}:",
+            file=sys.stderr,
+        )
+        print(e, file=sys.stderr)
+        sys.exit(1)
 
 
 def get_settings() -> Settings:
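
Constructing `TestSuiteConfig` now requires keyword arguments: pydantic's `BaseModel.__init__` only accepts fields by name, unlike the dataclass-based config it replaces (the keyword-style `NodeInfo` construction in posix_session.py below is likely there for the same reason). For example, with a simplified stand-in model:

    from pydantic import BaseModel

    class TestSuiteConfig(BaseModel):  # simplified stand-in
        test_suite: str
        test_cases: list[str] = []

    # TestSuiteConfig("hello_world", [])  # TypeError: no positional arguments
    TestSuiteConfig(test_suite="hello_world", test_cases=["test_case_one"])
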
@@ -536,7 +556,7 @@  def get_settings() -> Settings:
     args = parser.parse_args()
 
     args.dpdk_location = _process_dpdk_location(
-        args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.precompiled_build_dir
+        parser, args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source
     )
     args.test_suites = _process_test_suites(parser, args.test_suites)
 
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 62867fd80c..6031eaf937 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -17,7 +17,12 @@ 
 from ipaddress import IPv4Interface, IPv6Interface
 from typing import Union
 
-from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration
+from framework.config import (
+    OS,
+    DPDKBuildConfiguration,
+    NodeConfiguration,
+    TestRunConfiguration,
+)
 from framework.exception import ConfigurationError
 from framework.logger import DTSLogger, get_dts_logger
 
@@ -89,13 +94,15 @@  def __init__(self, node_config: NodeConfiguration):
         self._init_ports()
 
     def _init_ports(self) -> None:
-        self.ports = [Port(port_config) for port_config in self.config.ports]
+        self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
         self.main_session.update_ports(self.ports)
         for port in self.ports:
             self.configure_port_state(port)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Test run setup steps.
 
@@ -105,7 +112,7 @@  def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
         self._setup_hugepages()
 
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 5f087f40d6..42ab4bb8fd 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -364,7 +364,7 @@  def extract_remote_tarball(
         """
 
     @abstractmethod
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Check if the `remote_path` is a directory.
 
         Args:
@@ -375,7 +375,7 @@  def is_remote_dir(self, remote_path: str) -> bool:
         """
 
     @abstractmethod
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Check if the `remote_tarball_path` is a tar archive.
 
         Args:
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 82c84cf4f8..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -54,7 +54,7 @@  class Port:
     mac_address: str = ""
     logical_name: str = ""
 
-    def __init__(self, config: PortConfig):
+    def __init__(self, node_name: str, config: PortConfig):
         """Initialize the port from `node_name` and `config`.
 
         Args:
@@ -62,7 +62,7 @@  def __init__(self, config: PortConfig):
             config: The test run configuration of the port.
         """
         self.identifier = PortIdentifier(
-            node=config.node,
+            node=node_name,
             pci=config.pci,
         )
         self.os_driver = config.os_driver
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 0d3abbc519..6b66f33e22 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -201,12 +201,12 @@  def extract_remote_tarball(
         if expected_dir:
             self.send_command(f"ls {expected_dir}", verify=True)
 
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_dir`."""
         result = self.send_command(f"test -d {remote_path}")
         return not result.return_code
 
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_tarfile`."""
         result = self.send_command(f"tar -tvf {remote_tarball_path}")
         return not result.return_code
@@ -393,4 +393,8 @@  def get_node_info(self) -> NodeInfo:
             SETTINGS.timeout,
         ).stdout.split("\n")
         kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
-        return NodeInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
+        return NodeInfo(
+            os_name=os_release_info[0].strip(),
+            os_version=os_release_info[1].strip(),
+            kernel_version=kernel_version,
+        )
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index a6c42b548c..57337c8e7d 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,11 +15,17 @@ 
 import os
 import time
 from dataclasses import dataclass
-from pathlib import PurePath
+from pathlib import Path, PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
-    DPDKLocation,
+    DPDKBuildOptionsConfiguration,
+    DPDKPrecompiledBuildConfiguration,
+    DPDKUncompiledBuildConfiguration,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
@@ -178,7 +184,9 @@  def get_dpdk_build_info(self) -> DPDKBuildInfo:
         return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Extend the test run setup with vdev config and DPDK build set up.
 
@@ -188,12 +196,12 @@  def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
-        super().set_up_test_run(test_run_config, dpdk_location)
-        for vdev in test_run_config.vdevs:
+        super().set_up_test_run(test_run_config, dpdk_build_config)
+        for vdev in test_run_config.system_under_test_node.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
-        self._set_up_dpdk(dpdk_location, test_run_config.dpdk_config.dpdk_build_config)
+        self._set_up_dpdk(dpdk_build_config)
 
     def tear_down_test_run(self) -> None:
         """Extend the test run teardown with virtual device teardown and DPDK teardown."""
@@ -202,7 +210,8 @@  def tear_down_test_run(self) -> None:
         self._tear_down_dpdk()
 
     def _set_up_dpdk(
-        self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None
+        self,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Set up DPDK the SUT node and bind ports.
 
@@ -211,21 +220,26 @@  def _set_up_dpdk(
         are bound to those that DPDK needs.
 
         Args:
-            dpdk_location: The location of the DPDK tree.
-            dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-                DTS will use pre-built DPDK from a :dataclass:`DPDKLocation`.
+            dpdk_build_config: A DPDK build configuration to test.
         """
-        self._set_remote_dpdk_tree_path(dpdk_location.dpdk_tree, dpdk_location.remote)
-        if not self._remote_dpdk_tree_path:
-            if dpdk_location.dpdk_tree:
-                self._copy_dpdk_tree(dpdk_location.dpdk_tree)
-            elif dpdk_location.tarball:
-                self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote)
-
-        self._set_remote_dpdk_build_dir(dpdk_location.build_dir)
-        if not self.remote_dpdk_build_dir and dpdk_build_config:
-            self._configure_dpdk_build(dpdk_build_config)
-            self._build_dpdk()
+        match dpdk_build_config.dpdk_location:
+            case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._set_remote_dpdk_tree_path(dpdk_tree)
+            case LocalDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._copy_dpdk_tree(dpdk_tree)
+            case RemoteDPDKTarballLocation(tarball=tarball):
+                self._validate_remote_dpdk_tarball(tarball)
+                self._prepare_and_extract_dpdk_tarball(tarball)
+            case LocalDPDKTarballLocation(tarball=tarball):
+                remote_tarball = self._copy_dpdk_tarball_to_remote(tarball)
+                self._prepare_and_extract_dpdk_tarball(remote_tarball)
+
+        match dpdk_build_config:
+            case DPDKPrecompiledBuildConfiguration(precompiled_build_dir=build_dir):
+                self._set_remote_dpdk_build_dir(build_dir)
+            case DPDKUncompiledBuildConfiguration(build_options=build_options):
+                self._configure_dpdk_build(build_options)
+                self._build_dpdk()
 
         self.bind_ports_to_driver()
 
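
The `match` above uses class patterns with keyword captures: each `case` checks the concrete location type and binds the field of interest in one step, which is what lets the four location variants replace the old `if`/`elif` chains over optional attributes. A condensed illustration of the mechanics, with stand-in models:

    from pathlib import Path

    from pydantic import BaseModel

    class RemoteTreeLocation(BaseModel):  # stand-ins, not the real classes
        dpdk_tree: Path

    class LocalTarballLocation(BaseModel):
        tarball: Path

    def describe(location: BaseModel) -> str:
        match location:
            # Class pattern: match on the type, then bind the named field.
            case RemoteTreeLocation(dpdk_tree=tree):
                return f"use the remote tree at {tree}"
            case LocalTarballLocation(tarball=tarball):
                return f"copy and extract {tarball}"
            case _:
                return "unknown location"

    print(describe(LocalTarballLocation(tarball=Path("dpdk.tar.xz"))))
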
@@ -238,37 +252,29 @@  def _tear_down_dpdk(self) -> None:
         self.compiler_version = None
         self.bind_ports_to_driver(for_dpdk=False)
 
-    def _set_remote_dpdk_tree_path(self, dpdk_tree: str | None, remote: bool):
+    def _set_remote_dpdk_tree_path(self, dpdk_tree: PurePath):
         """Set the path to the remote DPDK source tree based on the provided DPDK location.
 
-        If :data:`dpdk_tree` and :data:`remote` are defined, check existence of :data:`dpdk_tree`
-        on SUT node and sets the `_remote_dpdk_tree_path` property. Otherwise, sets nothing.
-
         Verify DPDK source tree existence on the SUT node, if exists sets the
         `_remote_dpdk_tree_path` property, otherwise sets nothing.
 
         Args:
             dpdk_tree: The path to the DPDK source tree directory.
-            remote: Indicates whether the `dpdk_tree` is already on the SUT node, instead of the
-                execution host.
 
         Raises:
             RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but
                 is not found.
         """
-        if remote and dpdk_tree:
-            if not self.main_session.remote_path_exists(dpdk_tree):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
-                )
-            if not self.main_session.is_remote_dir(dpdk_tree):
-                raise ConfigurationError(
-                    f"Remote DPDK source tree '{dpdk_tree}' must be a directory."
-                )
-
-            self.__remote_dpdk_tree_path = PurePath(dpdk_tree)
-
-    def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
+        if not self.main_session.remote_path_exists(dpdk_tree):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
+            )
+        if not self.main_session.is_remote_dir(dpdk_tree):
+            raise ConfigurationError(f"Remote DPDK source tree '{dpdk_tree}' must be a directory.")
+
+        self.__remote_dpdk_tree_path = dpdk_tree
+
+    def _copy_dpdk_tree(self, dpdk_tree_path: Path) -> None:
         """Copy the DPDK source tree to the SUT.
 
         Args:
@@ -288,25 +294,45 @@  def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
             self._remote_tmp_dir, PurePath(dpdk_tree_path).name
         )
 
-    def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None:
-        """Ensure the DPDK tarball is available on the SUT node and extract it.
+    def _validate_remote_dpdk_tarball(self, dpdk_tarball: PurePath) -> None:
+        """Validate the DPDK tarball on the SUT node.
 
-        This method ensures that the DPDK source tree tarball is available on the
-        SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the
-        `dpdk_tarball` is already on the SUT node, it verifies its existence.
-        The `dpdk_tarball` is then extracted on the SUT node.
+        Args:
+            dpdk_tarball: The path to the DPDK tarball on the SUT node.
 
-        This method sets the `_remote_dpdk_tree_path` property to the path of the
-        extracted DPDK tree on the SUT node.
+        Raises:
+            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but is
+                not found.
+            ConfigurationError: If the `dpdk_tarball` is a valid path but not a valid tar archive.
+        """
+        if not self.main_session.remote_path_exists(dpdk_tarball):
+            raise RemoteFileNotFoundError(f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT.")
+        if not self.main_session.is_remote_tarfile(dpdk_tarball):
+            raise ConfigurationError(f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive.")
+
+    def _copy_dpdk_tarball_to_remote(self, dpdk_tarball: Path) -> PurePath:
+        """Copy the local DPDK tarball to the SUT node.
 
         Args:
-            dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node.
-            remote: Indicates whether the `dpdk_tarball` is already on the SUT node, instead of the
-                execution host.
+            dpdk_tarball: The local path to the DPDK tarball.
 
-        Raises:
-            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but
-                is not found.
+        Returns:
+            The path of the copied tarball on the SUT node.
+        """
+        self._logger.info(
+            f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
+        )
+        self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
+        return self.main_session.join_remote_path(self._remote_tmp_dir, dpdk_tarball.name)
+
+    def _prepare_and_extract_dpdk_tarball(self, remote_tarball_path: PurePath) -> None:
+        """Prepare the remote DPDK tree path and extract the tarball.
+
+        This method extracts the remote tarball and sets the `_remote_dpdk_tree_path` property to
+        the path of the extracted DPDK tree on the SUT node.
+
+        Args:
+            remote_tarball_path: The path to the DPDK tarball on the SUT node.
         """
 
         def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
@@ -324,30 +350,9 @@  def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
                     return PurePath(str(remote_tarball_path).replace(suffixes_to_remove, ""))
             return remote_tarball_path.with_suffix("")
 
-        if remote:
-            if not self.main_session.remote_path_exists(dpdk_tarball):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT."
-                )
-            if not self.main_session.is_remote_tarfile(dpdk_tarball):
-                raise ConfigurationError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive."
-                )
-
-            remote_tarball_path = PurePath(dpdk_tarball)
-        else:
-            self._logger.info(
-                f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
-            )
-            self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
-
-            remote_tarball_path = self.main_session.join_remote_path(
-                self._remote_tmp_dir, PurePath(dpdk_tarball).name
-            )
-
         tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path)
         self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
-            PurePath(remote_tarball_path).parent,
+            remote_tarball_path.parent,
             tarball_top_dir or remove_tarball_suffix(remote_tarball_path),
         )
 
@@ -360,33 +365,32 @@  def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
             self._remote_dpdk_tree_path,
         )
 
-    def _set_remote_dpdk_build_dir(self, build_dir: str | None):
+    def _set_remote_dpdk_build_dir(self, build_dir: str):
         """Set the `remote_dpdk_build_dir` on the SUT.
 
-        If :data:`build_dir` is defined, check existence on the SUT node and sets the
+        Check the build directory's existence on the SUT node and set the
         `remote_dpdk_build_dir` property by joining the `_remote_dpdk_tree_path` and `build_dir`.
         Otherwise, sets nothing.
 
         Args:
-            build_dir: If it's defined, DPDK has been pre-built and the build directory is located
+            build_dir: DPDK has been pre-built and the build directory is located
                 in a subdirectory of `dpdk_tree` or `tarball` root directory.
 
         Raises:
             RemoteFileNotFoundError: If the `build_dir` is expected but does not exist on the SUT
                 node.
         """
-        if build_dir:
-            remote_dpdk_build_dir = self.main_session.join_remote_path(
-                self._remote_dpdk_tree_path, build_dir
+        remote_dpdk_build_dir = self.main_session.join_remote_path(
+            self._remote_dpdk_tree_path, build_dir
+        )
+        if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
             )
-            if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
-                )
 
-            self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
+        self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
 
-    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
+    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildOptionsConfiguration) -> None:
         """Populate common environment variables and set the DPDK build related properties.
 
         This method sets `compiler_version` for additional information and `remote_dpdk_build_dir`
diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
index d38ae36c2a..17b333e76a 100644
--- a/dts/framework/testbed_model/topology.py
+++ b/dts/framework/testbed_model/topology.py
@@ -99,7 +99,16 @@  def __init__(self, sut_ports: Iterable[Port], tg_ports: Iterable[Port]):
                     port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
 
         self.type = TopologyType.get_from_value(len(port_links))
-        dummy_port = Port(PortConfig("", "", "", "", "", ""))
+        dummy_port = Port(
+            "",
+            PortConfig(
+                pci="0000:00:00.0",
+                os_driver_for_dpdk="",
+                os_driver="",
+                peer_node="",
+                peer_pci="0000:00:00.0",
+            ),
+        )
         self.tg_port_egress = dummy_port
         self.sut_port_ingress = dummy_port
         self.sut_port_egress = dummy_port
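
The dummy port can no longer be built from empty strings because `PortConfig` is now a validated model; the `0000:00:00.0` placeholders suggest the PCI fields are pattern-checked. A sketch of what such a constraint might look like (the regex here is illustrative, not copied from the schema):

    from pydantic import BaseModel, Field

    PCI_PATTERN = r"^[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}\.[0-9]$"

    class PortConfig(BaseModel):  # illustrative subset of the fields
        pci: str = Field(pattern=PCI_PATTERN)
        peer_pci: str = Field(pattern=PCI_PATTERN)

    PortConfig(pci="0000:00:00.0", peer_pci="0000:00:00.0")  # passes
    PortConfig(pci="", peer_pci="")  # raises ValidationError
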
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index a319fa5320..945f6bbbbb 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -38,6 +38,4 @@  def create_traffic_generator(
         case ScapyTrafficGeneratorConfig():
             return ScapyTrafficGenerator(tg_node, traffic_generator_config, privileged=True)
         case _:
-            raise ConfigurationError(
-                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
-            )
+            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 469a12a780..5ac61cd4e1 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -45,7 +45,7 @@  def __init__(self, tg_node: Node, config: TrafficGeneratorConfig, **kwargs):
         """
         self._config = config
         self._tg_node = tg_node
-        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
         super().__init__(tg_node, **kwargs)
 
     def send_packet(self, packet: Packet, port: Port) -> None:
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index 78a39e32c7..e862e3ac66 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -28,7 +28,7 @@ 
 
 from .exception import InternalError
 
-REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
+REGEX_FOR_PCI_ADDRESS: str = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"
 _REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC: str = r"(?:[\da-fA-F]{2}[:-]){5}[\da-fA-F]{2}"
 _REGEX_FOR_DOT_SEP_MAC: str = r"(?:[\da-fA-F]{4}.){2}[\da-fA-F]{4}"
 REGEX_FOR_MAC_ADDRESS: str = rf"{_REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC}|{_REGEX_FOR_DOT_SEP_MAC}"
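
The old pattern carried awk's `/.../` delimiters inside the Python string, so a Python-side `re` search for a bare PCI address would not match; the delimiters now belong only to the awk invocation (adjusted in the smoke test below). A quick demonstration of the difference:

    import re

    OLD = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
    NEW = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"

    print(re.search(OLD, "0000:00:08.0"))  # None: the slashes must appear literally
    print(re.search(NEW, "0000:00:08.0"))  # <re.Match ...> as intended
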
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index d7870bd40f..bc3a2a6bf9 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -127,7 +127,7 @@  def test_device_bound_to_driver(self) -> None:
         path_to_devbind = self.sut_node.path_to_devbind_script
 
         all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
-            f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'",
+            f"{path_to_devbind} --status | awk '/{REGEX_FOR_PCI_ADDRESS}/'",
             SETTINGS.timeout,
         ).stdout