
Conversation


@kdexd kdexd commented Dec 21, 2016

Third Phase (Extended) PR on #411 (replacement of #460, succeeding #465)

This PR refactors the tests in test_memory.py to adopt the pytest style.

  • Create two new fixtures, memory and no_memoize_memory, since they are used by almost all of the tests.
  • Remove memory-clearing calls in places where the temporary directory used to be created and destroyed manually; pytest's tmpdir fixture now ensures isolation from other tests.
  • Parametrize tests wherever it seems straightforward.

@kdexd

kdexd commented Dec 21, 2016

Strange: undefined name 'func_with_kwonly_args'. This flake8 error was never raised earlier, so I wonder why it appears now. The same has happened in some other module as well.

@kdexd kdexd left a comment

@lesteve: I introduced a new design in 3e2878d. Last time we both agreed that indirect parametrization looks a little weird, so I didn't use it here.

Pytest handles the @parametrize decorator during test collection, while fixtures are executed and provided only when first requested. Hence using a fixture (custom or provided by pytest) inside the @parametrize decorator is not possible.

So for such requirements, or any other case where even setting up the test cases needs precomputation, all of it can be done in a fixture. This way of parametrization is known as parametrizing through a fixture: here I set params=range(4), meaning that any test using this fixture will be invoked four times, each with a different setup provided by the fixture.

On each invocation the fixture returns one tuple from the list of test cases; all of this is handled by pytest's own request fixture. So across the four invocations, request.param takes the values from the params argument (i.e. 0, 1, 2, 3).

I don't know whether such a feature existed in nose, so this design might look new to people who have been using nose so far, although pytest users would consider it standard.

Reference: http://doc.pytest.org/en/latest/fixture.html#parametrizing-fixtures
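
A minimal illustrative sketch of this pattern (CASES, toy_case and test_square are hypothetical names, not from this PR):

import pytest

# Hypothetical (input, expected) pairs standing in for the real test cases.
CASES = [(1, 1), (2, 4), (3, 9), (4, 16)]

@pytest.fixture(params=range(4))
def toy_case(request):
    # request.param takes the values 0, 1, 2, 3 over the four invocations,
    # so every test using this fixture runs once per test case.
    return CASES[request.param]

def test_square(toy_case):
    value, expected = toy_case
    assert value ** 2 == expected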

What are your thoughts about introducing such a design in other tests? I am in favour of keeping things uniform everywhere, either this new design or the older one: there are a lot of places where this design could be applied, and if we decide against it, it should be used nowhere.

# Module-level functions and fixture, for tests
@fixture(scope='function')
def memory(tmpdir):
    return Memory(cachedir=tmpdir.strpath, mmap_mode=None, verbose=0)
Member

Not convinced by the memory and no_memoize_memory fixtures: they only replace a one-line statement and add a layer of indirection. I feel it is better in this case to be explicit and go for simplicity.

Author

@lesteve Reverting back.

g = memory.cache(f)
gp = pickle.loads(pickle.dumps(g))
gp(1)


def test_call_and_shelve(tmpdir):
@fixture(scope='function', params=range(4))
Member

I find it a bit disappointing that there is no better way than this. The thing is that you could very easily mess up by adding elements to func and Result and thinking that your tests pass, when in fact you just forgot to update params.
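
To make the caveat concrete, a hypothetical sketch (TEST_CASES and known_case are made-up names, not the actual func / Result lists); deriving params from the length of the data is one way to avoid the silent mismatch:

import pytest

# Made-up test cases; because params is derived from len(TEST_CASES),
# appending an entry here automatically adds one more fixture invocation.
TEST_CASES = ['case_a', 'case_b', 'case_c']

@pytest.fixture(params=range(len(TEST_CASES)))
def known_case(request):
    return TEST_CASES[request.param]

def test_known_case(known_case):
    assert known_case in TEST_CASES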

Author

True, there is one way through indirect parametrization, but I guess we'll have to wait for a while. Pytest recently introduced a feature for fixtures that yield, but they can only yield once right now. Multiple yields from a fixture, and direct parametrization through multiple yields, should arrive in the near future.
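
For reference, indirect parametrization looks roughly like this (a hypothetical sketch; toy_memory and the parameter values are illustrative):

import pytest

@pytest.fixture
def toy_memory(request, tmpdir):
    # With indirect=True below, request.param carries the value listed in the
    # parametrize call, so the fixture can do any expensive setup itself.
    return {'num_inputs': request.param, 'cachedir': tmpdir.strpath}

@pytest.mark.parametrize('toy_memory', [5, 10], indirect=True)
def test_toy_memory(toy_memory):
    assert toy_memory['num_inputs'] in (5, 10)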

Author

I'll be opening an issue in pytest later today about this.

Author

@lesteve: Well, another obvious solution, until we get a better feature in pytest, is to manually create a temporary directory and remove it in such places. I am quite fond of the tmpdir fixture because we don't need to care about isolation, creation, deletion or leakage at any point. Although, if it is suitable, we can use the @parametrize decorator directly by not using tmpdir for this test only. (At best I can add a TODO comment there. Or is the current design fine?)

Member

I think in this case we should give up on parametrize and keep the for loop. I am afraid that the advantage of parametrize (one test for each input value) does not compensate for the disadvantages (strong obfuscation of what exactly gets tested, plus the caveat when we want to add one input value to the test).

Member

When I said parametrize I actually meant a fixture with a params parameter, but you get the point...

Author

Sure 👍

verbose=0)

@mem.cache()
def _setup_temporary_cache_folder(memory, num_inputs=10):
Member

Rename this to _setup_toy_cache or something better

Author

Sure.

@kdexd

kdexd commented Dec 21, 2016

I had one more question: I saw a few smoke tests in test_memorized_repr. Earlier they printed directly to stdout and nosetests had a higher verbosity flag (in setup.cfg), so the output was simply visible in the CI logs. Can we introduce some string comparisons through capsys there?
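
For illustration, a capsys-based check could look roughly like this (the printed string and assertion are hypothetical stand-ins for the real repr output):

def test_memorized_repr_output(capsys):
    # capsys captures whatever the code under test writes to stdout/stderr,
    # so the smoke test can assert on the output instead of relying on
    # verbose CI logs.
    print('MemorizedResult(...)')  # stand-in for the real call under test
    out, _ = capsys.readouterr()
    assert 'MemorizedResult' in out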

result.clear() # Do nothing if there is no cache.


@fixture(scope='function', params=range(2))
Member

Same comment here about using a for loop.

Author

Okay, I'll keep the for loops until pytest offers something more interesting than indirect parametrization.

@lesteve

lesteve commented Jan 5, 2017

The coverage drop is a bit weird here too ...

@lesteve

lesteve commented Jan 5, 2017

Why is it WIP by the way?

@kdexd

kdexd commented Jan 5, 2017

@lesteve I was experimenting with a bit of monkeypatching in places. I am usually not in favour of tampering with sys variables when pytest's monkeypatch fixture is at our disposal.
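
For illustration, a monkeypatch-based version could look roughly like this (the test body is hypothetical; tmp_joblib_ is the module name mentioned further down the thread):

import sys

def test_without_cached_module(monkeypatch):
    # Remove the module from sys.modules for this test only; monkeypatch
    # restores the original entry automatically at teardown, so there is no
    # need to fiddle with sys state by hand.
    monkeypatch.delitem(sys.modules, 'tmp_joblib_', raising=False)
    assert 'tmp_joblib_' not in sys.modules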

@lesteve lesteve force-pushed the pytestify-test-memory branch from 2652056 to 3717577 on January 9, 2017
@lesteve

lesteve commented Jan 9, 2017

Oops sorry I didn't realize I broke the tests with my last commit. This should be fixed now.

@kdexd kdexd force-pushed the pytestify-test-memory branch from 3717577 to 1740e3a on January 10, 2017
@lesteve

lesteve commented Jan 10, 2017

@karandesai-96 I think we should try not to be too perfectionist about the remaining pytest-related PRs and merge them reasonably soon. Yes, there may be pytest functionality that we could use to write the tests in a marginally nicer way, but it is not crucial enough to spend too much time on.

In your last commit, you need to keep the sys.modules.pop('tmp_joblib_') in order to reload the module. If you want to be perfectionist you can actually use import imp and tmp = imp.reload(tmp) or something like this.
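
A minimal sketch of both options, using a standard-library module purely for illustration (the real code deals with the generated tmp_joblib_ module; Python 3 assumed for imp.reload):

import imp
import sys

import json  # stand-in for the dynamically created module

# Option 1: drop the cached entry so the next import re-executes the module.
sys.modules.pop('json', None)
import json  # noqa: E402

# Option 2: reload the already-imported module object in place.
json = imp.reload(json)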

@kdexd kdexd changed the title from "[WIP] Refactor test_memory.py as per pytest design." to "[MRG] Refactor test_memory.py as per pytest design." on Jan 10, 2017
@kdexd

kdexd commented Jan 10, 2017

I think we should try not to be too perfectionist about remaining pytest-related PRs and merge them reasonably soon.

@lesteve This makes perfect sense.

In your last commit, you need to keep the sys.modules.pop('tmp_joblib_') in order to reload the module.

I'll probably restore this. It looks more straightforward to me.

memory.clear()


def func_with_kwonly_args(a, b, kw1='kw1', kw2='kw2'):
Member

I am guessing you added these functions only to get rid of some flake8 warnings. This is a bit misleading, so I removed them. Basically the test is for Python 3 only, and defining a function whose name does not match its behaviour just adds confusion.

Author

Yes, I added them because the flake8 job repeatedly gave a red at some point. I am puzzled why it doesn't now. Anyway, this is good 👍

Author

It gave a red on the flake8 job again.

Member

For completeness, flake8 is run only on the diff that the PR introduces. The main reason is historical: there are flake8 violations in some files and we don't want to fix them all because that would introduce conflicts in ongoing PRs. On the other hand, we don't want PRs to introduce additional flake8 violations.

Because flake8 is run on the diff, Travis stays green if you don't change a line that already has a flake8 violation. In this case we changed a line that had a flake8 violation, so it is in the diff and Travis is red.

Author

Okay, I understand now. Thanks for the clear explanation 👍

@lesteve

lesteve commented Jan 10, 2017

OK the Travis failure is the expected flake8 failure. I am going to merge this one, thanks a lot @karandesai-96.

@lesteve lesteve merged commit bab7c05 into joblib:master Jan 10, 2017