
Add new fixture cache_result param (default True) to ignore fixture cache and re-execute on each usage of the fixture #12814

Open
niroshaimos opened this issue Sep 14, 2024 · 5 comments

Comments

@niroshaimos

What's the problem this feature will solve?

Reduce code duplication. Allow leveraging the setup/teardown features of pytest fixtures while making fixtures a bit more reusable, like functions.

Describe the solution you'd like

The ability to declare that a fixture should not use its stored cache, and should instead run and recalculate on every reference.

For example:
Say I were testing my backend server, which manages a database with two object definitions: Projects and ProjectConfigurations.
When testing basic get functionality, I would declare a fixture that creates a project, and another that creates a project configuration:

@pytest.fixture
async def project(
    client: AsyncClient,
) -> AsyncGenerator[Project, typing.Any]:
    project = await tools.create(
        client=client,
        project_create=models_factory.ProjectCreateFactory.build(),
    )

    yield project

    await delete_project(project_id=project.id)

@pytest.fixture
async def project_config(
    project: Project,
    client: AsyncClient,
    engine
) -> AsyncGenerator[ProjectConfig, typing.Any]:
    project_config = await tools.create(client=client, project_id=project.id)

    yield project_config

    async with engine.connect() as connection:
        res = await connection.execute(
            delete(ProjectConfig).where(
                project_config.id == ProjectConfig.id
            )
        )
        assert res.rowcount == 1

        await connection.commit()

This is already a lot of code, but fair enough. These are functionalities we must declare.

Now, when moving on to testing basic list functionality, I want more than one object of each type in the db during the test. I have to duplicate the fixtures above, or create a brand new fixture that creates X objects in my db as setup.
For example:

@pytest.fixture
async def project(
    client: AsyncClient,
) -> AsyncGenerator[Project, typing.Any]:
    project = await tools.create(
        client=client,
        project_create=models_factory.ProjectCreateFactory.build(),
    )

    yield project

    await delete_project(project_id=project.id)

@pytest.fixture
async def project_config(
    project: Project,
    client: AsyncClient,
    engine
) -> AsyncGenerator[ProjectConfig, typing.Any]:
    project_config = await tools.create(client=client, project_id=project.id)

    yield project_config

    async with engine.connect() as connection:
        res = await connection.execute(
            delete(ProjectConfig).where(
                project_config.id == ProjectConfig.id
            )
        )
        assert res.rowcount == 1

        await connection.commit()

@pytest.fixture
async def project2(
    client: AsyncClient,
) -> AsyncGenerator[Project, typing.Any]:
    project = await tools.create(
        client=client,
        project_create=models_factory.ProjectCreateFactory.build(),
    )

    yield project

    await delete_project(project_id=project.id)

@pytest.fixture
async def project_config2(
    project: Project,
    client: AsyncClient,
    engine
) -> AsyncGenerator[ProjectConfig, typing.Any]:
    project_config = await tools.create(client=client, project_id=project.id)

    yield project_config

    async with engine.connect() as connection:
        res = await connection.execute(
            delete(ProjectConfig).where(
                project_config.id == ProjectConfig.id
            )
        )
        assert res.rowcount == 1

        await connection.commit()

Now my list test will use project_config2 and project2.
Here the code starts getting cluttered and confusing. And as tests pile on and my fixtures grow, I have to be careful that two fixtures I am using are not coupled to each other when writing tests, as this could lead to unexpected behavior. This makes for a very large test code base that can be hard to manage.
In my opinion, there could be a better way!

At the end of the day, I want to create objects in the DB. I care about them being there, and about them being removed when my test finishes so that other tests won't be affected. In my case, I do not care that all references access the same object, as I usually have a linear usage for each fixture.

If I could declare that a fixture should not use its cache, my issue would be solved: I could reference a fixture as many times as I want without duplicating code and still get new objects, while still enjoying the benefits of pytest's reliable setup/teardown flows. I would not need to change my existing fixture infrastructure when testing new components of the same objects in my backend.

It would look something like this:

@pytest.fixture(cache_result=False)
async def project(
    client: AsyncClient,
) -> AsyncGenerator[Project, typing.Any]:
    project = await tools.create(
        client=client,
        project_create=models_factory.ProjectCreateFactory.build(),
    )

    yield project

    await delete_project(project_id=project.id)

@pytest.fixture(cache_result=False)
async def project_config(
    project: Project,
    client: AsyncClient,
    engine
) -> AsyncGenerator[ProjectConfig, typing.Any]:
    project_config = await tools.create(client=client, project_id=project.id)

    yield project_config

    async with engine.connect() as connection:
        res = await connection.execute(
            delete(ProjectConfig).where(
                project_config.id == ProjectConfig.id
            )
        )
        assert res.rowcount == 1

        await connection.commit()


@pytest.fixture
def project_config2(project_config) -> ProjectConfig:
    return project_config


def test1(project_config, project):
    ...


def test_list(project_config, project_config2):
    assert project_config != project_config2
    assert len(list_configs()) == 2

There are of course many more capabilities that come with this feature; this is just one useful usage.
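
For comparison, the closest workaround I'm aware of today is the factory-as-fixture pattern, where the fixture yields a creation function and tracks its own teardown. A minimal sketch for projects, reusing the tools.create / delete_project helpers from my examples above (project_factory itself is just an illustrative name):

@pytest.fixture
async def project_factory(
    client: AsyncClient,
) -> AsyncGenerator[Callable[[], Awaitable[Project]], typing.Any]:
    created: list[Project] = []

    async def _create() -> Project:
        # Each call builds and persists a fresh project.
        project = await tools.create(
            client=client,
            project_create=models_factory.ProjectCreateFactory.build(),
        )
        created.append(project)
        return project

    yield _create

    # Cleanup has to be tracked by hand, which is exactly the bookkeeping
    # pytest already does for me with cached fixtures.
    for project in created:
        await delete_project(project_id=project.id)

It works, but every test has to call the factory explicitly and every factory has to re-implement its own cleanup tracking, which is the duplication cache_result=False would remove.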

@RonnyPfannschmidt
Member

What you are asking for was coined "invocation scope" back in 2016, when we tried to work on enabling it at the sprint in Freiburg.

Back then we failed due to tech debt.

While some of it is resolved, it's not clear whether invocation scope can be done easily.

@niroshaimos
Author

niroshaimos commented Sep 14, 2024

While some of it is resolved, it's not clear if invocation scope can be done easily

I have implemented the feature in the fork below, at the very least as a POC with tests:
https://github.com/niroshaimos/pytest-no-cache-fixture
I'd be happy if you could take a look, and if it's not good enough, maybe give me some pointers on what criteria it would have to meet so I could implement it as desired.

I am currently in a state where I have a very large infrastructure of many coupled fixtures that are hard to maintain, and my team is really feeling the pain on this one, so I am determined to do what I can to advance this :)

@RonnyPfannschmidt
Member

Can you open a draft pull request for this? Seeing a diff and potentially some tests will make this much easier to discuss.

And those discussions will be the starting point for more.

@niroshaimos
Author

Sure! Opened: #12816

@niroshaimos
Author

Hey, any news on this?🙏
