[MRG] Refactor test_memory.py as per pytest design. #466
Conversation
Strange, …
@lesteve: I introduced a new design in 3e2878d. Last time both of us agreed that indirect parametrization looks a little weird, so I didn't do it here.
Pytest handles the @parametrize decorator during test collection, while fixtures are executed and provided only when first requested. Hence using a fixture (custom or provided by pytest) inside the @parametrize decorator is not permitted.
So for such requirements, or any other case where even setting up the test cases needs precomputation, all of it can be done in a fixture. This way of parametrization is known as parametrizing through a fixture: here I set params=range(4), meaning that any test using this fixture will be invoked four times, each time with a different setup provided by the fixture.
On each invocation, this fixture returns one tuple from the list of test cases; all of this is handled by pytest's own request fixture. Across the four invocations, request.param takes successive values from the params argument (i.e. 0, 1, 2, 3).
I don't know whether such a feature existed in nose, so this design might look new to people who have been using nose so far, although pytest users would consider it standard.
Reference: http://doc.pytest.org/en/latest/fixture.html#parametrizing-fixtures
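For illustration, a minimal self-contained sketch of this pattern (the test cases and names below are hypothetical, not the ones from test_memory.py):

```python
from pytest import fixture

TEST_CASES = [(1, 1), (2, 4), (3, 9), (4, 16)]  # (input, expected) pairs

@fixture(scope='function', params=range(4))
def case(request):
    # pytest invokes this fixture once per value in params;
    # request.param takes the values 0, 1, 2, 3 in turn.
    return TEST_CASES[request.param]

def test_square(case):
    # Collected as four separate tests, one per fixture invocation.
    value, expected = case
    assert value ** 2 == expected
```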
What are your thoughts on introducing this design to other tests? I am in favour of keeping things uniform everywhere, either this new design or the older one: there are many places where this design could be applied, and if we decide against it, it should be used nowhere.
# Module-level functions and fixture, for tests
@fixture(scope='function')
def memory(tmpdir):
    return Memory(cachedir=tmpdir.strpath, mmap_mode=None, verbose=0)
Not convinced by the memory and no_memoize_memory fixtures: they only replace a one-line statement and add a layer of indirection. I feel it is better in this case to be explicit and go for simplicity.
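Concretely, the explicit version being advocated would simply inline the statement from the fixture body (a sketch; test_example is a hypothetical name):

```python
from joblib import Memory

def test_example(tmpdir):
    # The one-liner the fixture would otherwise hide behind indirection.
    memory = Memory(cachedir=tmpdir.strpath, mmap_mode=None, verbose=0)
    ...
```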
@lesteve Reverting back.
g = memory.cache(f)
gp = pickle.loads(pickle.dumps(g))
gp(1)
def test_call_and_shelve(tmpdir):
@fixture(scope='function', params=range(4))
I find it a bit disappointing that there is no better way than this. The thing is that you could very easily mess up by adding elements to func and Result and thinking that your tests pass, when in fact you simply forgot to update params.
True, there is one approach via indirect parametrization, but I guess we'll have to wait for a while. Pytest recently introduced support for fixtures that "yield", but they can only yield once right now. Multiple yields from a fixture, and direct parametrization through multiple yields, should arrive in the near future.
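For reference, a hedged sketch of the indirect-parametrization approach being alluded to (function and test names are illustrative, not from the PR):

```python
import pytest
from pytest import fixture
from joblib import Memory

def _double(x):
    return 2 * x

def _square(x):
    return x * x

@fixture
def cached_func(request, tmpdir):
    # request.param is supplied by the indirect parametrize marker below,
    # so the parameter list lives next to the test rather than in params=.
    memory = Memory(cachedir=tmpdir.strpath, verbose=0)
    return memory.cache(request.param)

@pytest.mark.parametrize('cached_func', [_double, _square], indirect=True)
def test_cached_call(cached_func):
    assert cached_func(3) in (6, 9)
```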
I'll be opening an issue in pytest later today about this.
@lesteve: Another obvious solution, until we get a better feature in pytest, is to manually create a tempdir and remove it in such places. I am quite fond of the tmpdir fixture because we don't need to worry about isolation, creation, deletion, or leakage at any point. Although, if it is acceptable, we could use the @parametrize decorator directly by not using tmpdir for this test only. (At best I can add a TODO comment there. Or is the current design fine?)
I think in this case we should give up on parametrize and keep the for loop. I am afraid that the advantage of parametrize (one test for each input value) does not compensate for the disadvantages (strong obfuscation of what exactly gets tested, plus the caveat when we want to add one input value to the test).
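For context, a minimal sketch of the for-loop style being advocated (the functions and test name are hypothetical, not the actual test body):

```python
from joblib import Memory

def _double(x):
    return 2 * x

def _square(x):
    return x * x

def test_cached_functions(tmpdir):
    memory = Memory(cachedir=tmpdir.strpath, verbose=0)
    # The inputs sit right next to the assertions, so adding a case
    # cannot silently fall out of sync with a params=range(n) count.
    for func, expected in [(_double, 6), (_square, 9)]:
        assert memory.cache(func)(3) == expected
```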
When I said parametrize, I rather meant a fixture with a params parameter, but you get the point...
Sure 👍
verbose=0)

@mem.cache()
def _setup_temporary_cache_folder(memory, num_inputs=10):
Rename this to _setup_toy_cache or something better.
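For illustration, the renamed helper might look like this sketch (the body shown, including get_1000_bytes, is hypothetical, not the actual PR code):

```python
def _setup_toy_cache(memory, num_inputs=10):
    # Cache a trivial function and call it on a few inputs so the
    # cache folder contains something to test against.
    @memory.cache
    def get_1000_bytes(arg):
        return 'a' * 1000

    for arg in range(num_inputs):
        get_1000_bytes(arg)
    return get_1000_bytes
```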
Sure.
I had one more question: I saw a few smoke tests in …
(force-pushed from 18a9baf to a31b952)
result.clear()  # Do nothing if there is no cache.

@fixture(scope='function', params=range(2))
Same comment here about using a for loop.
Okay, I'll keep for loops until there's something more interesting than indirect parametrization on pytest's side.
The coverage drop is a bit weird here too ...
Why is it WIP by the way?
@lesteve I was experimenting with a little monkeypatching in places. I am usually not in favour of tampering with …
(force-pushed from 2652056 to 3717577)
Oops, sorry, I didn't realize I broke the tests with my last commit. This should be fixed now.
(force-pushed from 3717577 to 1740e3a)
@karandesai-96 I think we should try not to be too perfectionist about the remaining pytest-related PRs and merge them reasonably soon. Yes, there may be pytest functionality that we could use to write the tests in a marginally nicer way, but that is not crucial enough to spend too much time on. In your last commit, you need to keep the …
@lesteve This makes perfect sense.
I'll probably restore this. It looks more straightforward to me.
memory.clear()

def func_with_kwonly_args(a, b, kw1='kw1', kw2='kw2'):
I am guessing you added these functions only to get rid of some flake8 warnings. This is a bit misleading, so I removed them. Basically the test is Python 3 only, and defining a function whose name does not match its behaviour just adds confusion.
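For reference, a signature that genuinely has keyword-only arguments uses the bare `*` separator, which is Python 3 only syntax; the test body below is a hedged sketch, not the PR's actual test:

```python
from joblib import Memory

def func_with_kwonly_args(a, b, *, kw1='kw1', kw2='kw2'):
    # kw1 and kw2 can only be passed by keyword, hence "kwonly".
    return a, b, kw1, kw2

def test_cache_kwonly(tmpdir):
    memory = Memory(cachedir=tmpdir.strpath, verbose=0)
    cached = memory.cache(func_with_kwonly_args)
    assert cached(1, 2, kw1='a') == (1, 2, 'a', 'kw2')
```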
Yes, I added them because the flake8 job repeatedly went red at some point. I am puzzled why it doesn't now. Anyway, this is good 👍
It gave a red on the flake8 job again.
For completeness, flake8 is run on the diff that the PR introduces. The main reason is historical: there are some flake8 violations in some files, and we don't want to fix them all because that would introduce conflicts in ongoing PRs. On the other hand, we don't want PRs to introduce additional flake8 violations.
Because flake8 is run on the diff, if you don't change a line that already has a flake8 violation, Travis will be green. In this case we changed a line that had a flake8 violation, so it is in the diff and Travis is red.
Okay, I understand now. Thanks for the clear explanation 👍
OK, the Travis failure is the expected flake8 failure. I am going to merge this one, thanks a lot @karandesai-96.
Third Phase (Extended) PR on #411 (replacement of #460, succeeding #465)
This PR deals with refactoring the tests in test_memory.py to adopt the pytest flavor. It introduces the memory and no_memoize_memory fixtures, as they are used by almost all of the tests.