
Conversation

@sdb9696 sdb9696 (Collaborator) commented Nov 8, 2024

No description provided.

codecov bot commented Nov 8, 2024

Codecov Report

Attention: Patch coverage is 76.92308% with 3 lines in your changes missing coverage. Please review.

Project coverage is 92.57%. Comparing base (32671da) to head (420f428).
Report is 193 commits behind head on master.

Files with missing lines     Patch %   Lines
kasa/smart/smartdevice.py    66.66%    1 Missing and 1 partial ⚠️
kasa/iot/iotdevice.py        75.00%    1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #1235      +/-   ##
==========================================
- Coverage   92.60%   92.57%   -0.04%     
==========================================
  Files         101      101              
  Lines        6644     6657      +13     
  Branches      706      707       +1     
==========================================
+ Hits         6153     6163      +10     
- Misses        370      372       +2     
- Partials      121      122       +1     


@rytilahti rytilahti (Member) left a comment

Tested on L530 and it seemingly worked (i.e., I saw the state of the bulb changing), but I got some errors from a completely unrelated model:

FAILED kasa/tests/test_bulb.py::test_bulb_sysinfo[L510B(EU)_3.0_1.0.5.json-SMART] - voluptuous.error.MultipleInvalid: expected float for dictionary value @ data['longitude']
FAILED kasa/tests/test_bulb.py::test_hsv_on_non_color[L510B(EU)_3.0_1.0.5.json-SMART] - assert not True
FAILED kasa/tests/test_bulb.py::test_non_variable_temp[L510B(EU)_3.0_1.0.5.json-SMART] - Failed: DID NOT RAISE <class 'kasa.exceptions.KasaException'>
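
For the first failure, the schema seems to expect a float where the fixture supplies an int. A minimal voluptuous illustration of the error and the usual `Coerce` fix (hypothetical, not the project's actual sysinfo schema):

```python
import voluptuous as vol

# A strict schema raises "expected float" when the value is an int.
strict = vol.Schema({"longitude": float})
try:
    strict({"longitude": 5})
except vol.Invalid as exc:
    print(exc)  # expected float for dictionary value @ data['longitude']

# Coerce accepts ints and converts them to float, which is the usual fix.
lenient = vol.Schema({"longitude": vol.Coerce(float)})
assert lenient({"longitude": 5}) == {"longitude": 5.0}
```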

@ryenitcher ryenitcher (Contributor) commented Nov 9, 2024

I can confirm that the tests no longer crash for the EP40M; however, the following errors remain:

======================================================================================== short test summary info ========================================================================================= 
FAILED tests\test_childdevice.py::test_child_time[P300(EU)_1.0_1.0.13.json-SMART] - AssertionError: assert FakeDatetime(2024, 11, 8, 17, 56, 32, tzinfo=zoneinfo.ZoneInfo(key='America/Denver')) != FakeDatetime(2024, 11, 8, 17, 56, 32, tzinfo=tzlocal())
FAILED tests\test_readme_examples.py::test_tutorial_examples - ValueError: Cannot find module=docs/tutorial.py

Results (103.99s (0:01:43)):
     458 passed
       2 failed
         - kasa\tests/test_childdevice.py:127 test_child_time[P300(EU)_1.0_1.0.13.json-SMART]
         - kasa\tests/test_readme_examples.py:140 test_tutorial_examples
    13290 skipped

I believe that the docs failure is due to the test being run from within the ./kasa directory, not the top-level directory where the docs folder is. The Python path would need to be updated to include the top-level directory for this to run properly from the ./kasa or ./kasa/tests directory. (I forgot to activate the venv before running pytest, but the test fails even with the venv activated.)
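
A minimal sketch of one way to make the lookup location-independent, resolving the path from the test file rather than the working directory (names here are illustrative, not the project's actual code):

```python
from pathlib import Path

# Resolve the repo root from this test file instead of the CWD, so that
# pytest can be invoked from any directory. Assumes tests/ sits directly
# under the project root.
PROJECT_ROOT = Path(__file__).resolve().parents[1]
TUTORIAL = PROJECT_ROOT / "docs" / "tutorial.py"

def test_tutorial_examples():
    # Hypothetical guard; the real test presumably loads and runs the module.
    assert TUTORIAL.is_file(), f"Cannot find module={TUTORIAL}"
```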

Meanwhile, it appears that the time test may be failing because the fallback time is the same as the current time, which may be the same as the actual device time. I suggest using the Unix epoch instead as the fallback time. Running the tests multiple times in a row shows that it is an intermittent failure, meaning it depends on whether you get lucky and on how bad your network latency is.

======================================================================================== short test summary info ========================================================================================= 
FAILED tests\test_readme_examples.py::test_tutorial_examples - ValueError: Cannot find module=docs/tutorial.py

Results (109.10s (0:01:49)):
     459 passed
       1 failed
         - kasa\tests/test_readme_examples.py:140 test_tutorial_examples
    13290 skipped
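
The epoch fallback I have in mind would look roughly like this (an illustrative sketch, not the project's actual code):

```python
from datetime import datetime, timezone

# A fallback of "now" can coincide with the real device clock, making the
# "did we actually fetch the time?" assertion pass or fail by luck.
# The Unix epoch can never match a live clock.
FALLBACK_TIME = datetime(1970, 1, 1, tzinfo=timezone.utc)

def child_time_or_fallback(child):
    device_time = getattr(child, "time", None)  # hypothetical accessor
    return device_time if device_time is not None else FALLBACK_TIME
```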

@sdb9696 sdb9696 (Collaborator, Author) commented Nov 11, 2024

Tested on L530 and it seemingly worked (i.e., I saw the state of the bulb changing), but I got some errors from a completely unrelated model:

This should now be fixed.

I believe that the docs failure is due to the test being run from within the ./kasa directory, not the top-level directory where the docs folder is.

That's correct. You should now be able to run this with `uv run pytest --ip=<ip address> --username=<username> --password=<password>` from the root project directory. Remember to include the = between the option and its value. Works OK for me on Windows.

Meanwhile, it appears that the time test may be failing because the fallback time is the same as the current time, which may be the same as the actual device time.

I've now excluded this from physical device testing, so it should be skipped.
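
Roughly along these lines (hypothetical fixture name; the actual change may differ):

```python
import pytest

def test_child_time(dev, device_is_fake):
    # device_is_fake: hypothetical fixture distinguishing fixture data
    # from real hardware reached via --ip.
    if not device_is_fake:
        pytest.skip("Fallback-time comparison only makes sense on fixtures")
    ...
```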

@sdb9696 sdb9696 requested a review from rytilahti November 11, 2024 15:28
@rytilahti rytilahti (Member) commented

A couple of errors remain, but the model is now correct. Copying a few lines from the test source for context:

FAILED tests/smart/modules/test_lighttransition.py::test_module_v1[L530E(EU)_3.0_1.0.6.json-SMART] - AssertionError: assert 'smooth_transitions' in {'smooth_transition_off': Smooth transition off (smooth_transition_off): 0 (range: 0-60), 'smooth_transition_on': Smooth transition on (smooth_transition_on): 0 (range...
>       assert "smooth_transitions" in light_transition._module_features
E       AssertionError: assert 'smooth_transitions' in {'smooth_transition_off': Smooth transition off (smooth_transition_off): 0 (range: 0-60), 'smooth_transition_on': Smooth transition on (smooth_transition_on): 0 (range: 0-60)}
E        +  where {'smooth_transition_off': Smooth transition off (smooth_transition_off): 0 (range: 0-60), 'smooth_transition_on': Smooth transition on (smooth_transition_on): 0 (range: 0-60)} = <Module LightTransition (on_off_gradually) for 

FAILED tests/test_common_modules.py::test_light_effect_brightness[L530E(EU)_3.0_1.0.6.json-SMART] - assert 1 == 50
        await light_module.set_brightness(50)
        await dev.update()
        assert light_effect.effect == light_effect.LIGHT_EFFECTS_OFF
>       assert light_module.brightness == 50
E       assert 1 == 50
E        +  where 1 = <Module Light (light) for 192.168.xx.xx>.brightness

@sdb9696 sdb9696 requested a review from rytilahti November 11, 2024 17:22
@sdb9696 sdb9696 added the maintenance Project improvements and maintenance label Nov 11, 2024
@sdb9696 sdb9696 added this to the 0.8.0 milestone Nov 11, 2024
@sdb9696 sdb9696 merged commit 71ae06f into master Nov 11, 2024
26 of 28 checks passed
@sdb9696 sdb9696 deleted the janitor/fix_test_ip branch November 11, 2024 17:41
@ryenitcher ryenitcher (Contributor) commented

I'm a bit late on the follow-up, but for the record, I can confirm that all device tests now pass when run against the EP40M for the final merged commit 71ae06f (including when run from the top-level directory).

Results (78.55s (0:01:18)):
     459 passed
    14331 skipped

@sdb9696 sdb9696 (Collaborator, Author) commented Nov 14, 2024

Many thanks @ryenitcher!
