
[bugfix] Add commands to turn HRC off to SCS 107 #344

Merged: 4 commits from scs107_hrc_off into master on Feb 14, 2025

Conversation

@jzuhone (Collaborator) commented Jan 2, 2025

Description

The DEC2324 load was interrupted by a radiation shutdown, in the middle of an HRC-S observation. SCS-107 should power the HRC off.

The cea_check run for the JAN0325 return-to-science loads incorrectly indicated that the HRC was on at the beginning of the load, which resulted in a thermal violation. This PR ensures that the HRC is turned off from the perspective of kadi states.
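A minimal sketch of checking this from Python via kadi states. The HRC state keys (hrc_15v, hrc_24v, hrc_i, hrc_s) are assumed from the kadi states documentation, and the dates are illustrative, not the actual shutdown times:

# Hedged sketch: query kadi states around the SCS-107 run and confirm
# that the HRC states flip to off. Dates here are illustrative.
from kadi.commands.states import get_states

states = get_states(
    start="2024:358",  # illustrative: before the DEC2324 interruption
    stop="2025:003",   # illustrative: start of the JAN0325 loads
    state_keys=["hrc_15v", "hrc_24v", "hrc_i", "hrc_s"],
)
print(states["datestart", "hrc_15v", "hrc_24v", "hrc_i", "hrc_s"])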

This also includes an unrelated fix to one regression test, test_get_starcats_each_year, which was failing in 2025 because the first 4 days of the year include an SCS-107 run. I think testing a fixed set of years, 2003 to 2024 inclusive, is fine; a rough sketch follows.
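A rough sketch of that test fix, assuming the test's existing parametrize pattern (the exact change is in the diff):

import pytest

# Hedged sketch: pin the parametrized years to 2003-2024 inclusive so a
# fresh SCS-107 interval early in the current year cannot break the test.
years = list(range(2003, 2025))  # 2003 .. 2024 inclusive

@pytest.mark.parametrize("year", years)
def test_get_starcats_each_year(year):
    ...  # test body unchanged from the existing test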

Deployment

This should NOT be installed to flight prior to Feb 15; otherwise it will duplicate the same commands that are already in the Command Events sheet. Kadi command processing has a 30-day lookback, so wait until Feb 15 to be sure.

Interface impacts

New HRC commands result from any SCS-107 run (including NSM and Safe Mode). The command timing differs because the new commands are added with 1.025 sec waits between them, as sketched below.
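A minimal sketch of that spacing, assuming CxoTime arithmetic with astropy units and an illustrative SCS-107 run time (the four TLMSIDs are the ones in this PR's diff):

import astropy.units as u
from cxotime import CxoTime

# Hedged sketch: space the new HRC power-off commands 1.025 s apart
# following the SCS-107 run time. The run time here is illustrative.
tlmsids = ["224PCAOF", "2IMHVOF", "2SPHVOF", "2S2HVOF"]
t0 = CxoTime("2024:358:12:00:00")
for ii, tlmsid in enumerate(tlmsids):
    print((t0 + ii * 1.025 * u.s).date, f"COMMAND_HW TLMSID={tlmsid}")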

Testing

Unit tests

Jean

  • Linux
jeanconn-fido> pytest
============================= test session starts ==============================
platform linux -- Python 3.12.8, pytest-8.3.4, pluggy-1.5.0
rootdir: /proj/sot/ska/jeanproj/git
configfile: pytest.ini
plugins: anyio-4.7.0, timeout-2.3.1
collected 183 items                                                            

kadi/commands/tests/test_commands.py ................................... [ 19%]
...............................................                                                [ 44%]
kadi/commands/tests/test_states.py .......................x............. [ 65%]
.............                                                                                  [ 72%]
kadi/commands/tests/test_validate.py ...................                                       [ 82%]
kadi/tests/test_events.py ..........                                                           [ 87%]
kadi/tests/test_occweb.py ......................                                               [100%]

========================================== warnings summary ==========================================
kadi/kadi/commands/tests/test_commands.py: 88 warnings
kadi/kadi/commands/tests/test_validate.py: 9 warnings
kadi/kadi/tests/test_occweb.py: 19 warnings
  /proj/sot/ska3/test/lib/python3.12/site-packages/bs4/builder/_lxml.py:124: DeprecationWarning: The 'strip_cdata' option of HTMLParser() has never done anything and will eventually be removed.
    parser = parser(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
====================== 182 passed, 1 xfailed, 116 warnings in 225.76s (0:03:45) ======================
jeanconn-fido> git rev-parse HEAD
54e6f15c1e9e4e17e48106fe107b63b272b0dfe5

Independent check of unit tests by JZ:

  • HEAD Linux

Functional tests

Without this change, the JAN0325 thermal model run through cea_check indicated that the HRC was still on at the beginning of the load, even though it had been turned off by the SCS-107 run. With the change included, the HRC is correctly marked as off at the beginning of the thermal model run.

@jzuhone (Collaborator, Author) commented Jan 2, 2025

@taldcroft @jeanconn @javierggt I suspect that I do not have the correct settings to run the kadi unit tests, as I get a lot of failures even when I run them on the master branch rather than this one.

@jeanconn (Contributor) commented Jan 2, 2025

We should probably figure out what is going on with the tests then. @jzuhone are you running the tests on HEAD or local? And what is failing?

@jzuhone (Collaborator, Author) commented Jan 2, 2025

@jeanconn I'm running the tests on HEAD. I am getting quite a number of failures; I can send a summary later.

@jeanconn (Contributor) commented Jan 2, 2025

Thanks! For me, against flight ska3, I'm seeing one test failure on master just now that looks to be a hiccup related to the SCS-107: test_get_starcats_each_year is failing in year 2025 because the fid lights are (properly) off and therefore not identified.

dict(type="COMMAND_HW", tlmsid="224PCAOF"),
dict(type="COMMAND_HW", tlmsid="2IMHVOF"),
dict(type="COMMAND_HW", tlmsid="2SPHVOF"),
dict(type="COMMAND_HW", tlmsid="2S2HVOF"),
@jeanconn (Contributor) commented Jan 2, 2025

For the hiccup today, HRC, FOT MP, and Tom settled on using these 8 commands:

COMMAND_HW | TLMSID=224PCAOF
COMMAND_HW | TLMSID=2IMHVOF
COMMAND_HW | TLMSID=2SPHVOF
COMMAND_HW | TLMSID=215PCAOF
COMMAND_HW | TLMSID=2S2STHV
COMMAND_HW | TLMSID=2S1STHV
COMMAND_HW | TLMSID=2S2HVOF
COMMAND_HW | TLMSID=2S1HVOF

so it sounds like there's a little remaining work figuring out which set makes the most sense.
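For reference, the same eight commands transcribed into the dict style of the diff hunk above (this is just the list as written; which subset kadi ultimately adopts is the open question noted here):

# The eight HRC-safing commands from the list above, in the dict style
# used in the diff hunk.
hrc_safing_commands = [
    dict(type="COMMAND_HW", tlmsid="224PCAOF"),
    dict(type="COMMAND_HW", tlmsid="2IMHVOF"),
    dict(type="COMMAND_HW", tlmsid="2SPHVOF"),
    dict(type="COMMAND_HW", tlmsid="215PCAOF"),
    dict(type="COMMAND_HW", tlmsid="2S2STHV"),
    dict(type="COMMAND_HW", tlmsid="2S1STHV"),
    dict(type="COMMAND_HW", tlmsid="2S2HVOF"),
    dict(type="COMMAND_HW", tlmsid="2S1HVOF"),
]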

@taldcroft (Member) commented:

I'm seeing one failure in master on Mac. This was passing when we merged #342. The error means that a star went missing since then. Could it be related to the new bad stars? Doesn't entirely make sense but that is the only thing I can think of that changed.

(ska3) ➜  kadi git:(master) git rev-parse --short HEAD
7a350cd

(ska3) ➜  kadi git:(master) pytest      
============================================= test session starts ==============================================
platform darwin -- Python 3.11.8, pytest-7.4.4, pluggy-1.4.0
rootdir: /Users/aldcroft/git
configfile: pytest.ini
plugins: cov-5.0.0, timeout-2.2.0, anyio-4.3.0
collected 183 items                                                                                            

kadi/commands/tests/test_commands.py ..............................................F.................... [ 36%]
................                                                                                         [ 45%]
kadi/commands/tests/test_states.py .......................x.........................                     [ 72%]
kadi/commands/tests/test_validate.py ...................                                                 [ 82%]
kadi/tests/test_events.py ..........                                                                     [ 87%]
kadi/tests/test_occweb.py ......................                                                         [100%]

=================================================== FAILURES ===================================================
_____________________________________ test_get_starcats_each_year[year22] ______________________________________

year = 2025

    @pytest.mark.parametrize("year", years)
    def test_get_starcats_each_year(year):
        starcats = get_starcats(start=f"{year}:001", stop=f"{year}:004", scenario="flight")
        assert len(starcats) > 2
        for starcat in starcats:
            # Make sure fids and stars are all ID'd
            ok = starcat["type"] != "MON"
>           assert np.all(starcat["id"][ok] != -999)
E           AssertionError: assert False
E            +  where False = <function all at 0x1054b1330>(<Column name='id' dtype='int64' length=11>\n      -999\n      -999\n      -999\n1006782456\n1006774552\n1006778248\n1006781496\n1006779528\n1006771416\n1006637024\n1006784024 != -999)
E            +    where <function all at 0x1054b1330> = np.all

kadi/commands/tests/test_commands.py:903: AssertionError
=========================================== short test summary info ============================================
FAILED kadi/commands/tests/test_commands.py::test_get_starcats_each_year[year22] - AssertionError: assert False
============================= 1 failed, 181 passed, 1 xfailed in 66.30s (0:01:06) ==============================

@jeanconn (Contributor) commented Jan 2, 2025

I took it to mean that the fid lights aren't identified for our recent SCS-107 interval. That test looks to run over a chunk of time in each year, and the new thing is that we're in a new year.

@taldcroft (Member) commented:

This change ended up being a bit involved because adding new commands to SCS-107 impacts a bunch of regression testing.

I needed to add a configuration hook to disable adding the new commands. This is required for comparing to existing commands in the archive, and it is also convenient for some other tests to just make the pain go away, though at least a few tests were updated to reflect the new commanding. A sketch of how such a hook might be used is below.
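kadi.commands exposes an astropy-style configuration namespace (conf), so a hook like this can be used as a context manager in tests. The option name here is hypothetical, not necessarily what this PR added:

from kadi.commands import conf

# Hedged sketch: temporarily disable the new SCS-107 HRC commands while
# regenerating commands for comparison against the archive. The option
# name "disable_hrc_scs107_commanding" is hypothetical.
with conf.set_temp("disable_hrc_scs107_commanding", True):
    ...  # regenerate commands and compare to the archived commands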

@taldcroft taldcroft requested a review from jeanconn January 3, 2025 11:33
@taldcroft (Member) commented:

@jzuhone - can you re-run your functional testing and unit tests on HEAD with the new commit?

@taldcroft (Member) commented:

I added a note about deployment, which should be no earlier than Feb 15.

@jeanconn (Contributor) commented Jan 3, 2025

I like the deployment plan idea, though do we want to defer until ska3-matlab is in fact ready for this? Also, does deferred deployment suggest we should push that unrelated test fix in another PR, or just put up with it?

@jzuhone (Collaborator, Author) commented Jan 3, 2025

@taldcroft I ran the tests on HEAD and everything passes except for the one xfail you also got.

@johnny1up commented:

I said during TWG that I'd take care of the remaining coding necessary for this, but it seems that work has happened between then and now. Has it been completed, or do you still need me to finalize the ordering and timing?

@jeanconn (Contributor) commented:

Thanks @johnny1up! I think the approach here of just separating the commands by 1.025 s is sufficient for the use cases in kadi, but if you have suggestions they are obviously welcome.

@jeanconn jeanconn requested a review from johnny1up February 11, 2025 15:50
@jeanconn (Contributor) commented:

The ruff formatting issues appear unrelated to this PR.

@johnny1up left a comment:

Changes look good! (Aside from the ruff failure, which you already noted)

@jzuhone (Collaborator, Author) commented Feb 12, 2025

I'll fix the ruff failure tonight.

@jeanconn (Contributor) commented:

Looks to me like the ruff formatting fixes are already in master so we could rebase or ignore here.

@jzuhone (Collaborator, Author) commented Feb 12, 2025

@jeanconn OK, ruff errors fixed by a rebase.

@jeanconn (Contributor) left a comment:

I re-ran the unit tests and updated the output in the description. Looks good to me.

@jeanconn jeanconn requested a review from taldcroft February 14, 2025 16:31
@taldcroft (Member) left a comment:

Tomorrow will be Feb 15, which was the NLT time for deployment, so we've waited long enough since this definitely won't get released before then. Even in master this will be fine, since there was margin in the Feb 15 date.

@taldcroft taldcroft merged commit 5aefbdd into master Feb 14, 2025
4 checks passed
@taldcroft taldcroft deleted the scs107_hrc_off branch February 14, 2025 18:04
@jeanconn (Contributor) commented:

As a tiny go-back, I think Feb 15 was the not-sooner-than time (not not-later-than), so we're still OK.
