Conversation

d-kamath (Collaborator) commented Sep 16, 2024

Closes #703
Closes #598

Description

This PR introduces three test files that validate the correctness of the serialization process, plus a bash script to drive them:

  1. SerializeFullFileTest.cpp: Runs a full simulation (e.g., 10 epochs) and saves the serialized output.
  2. SerializationFirstHalfTest.cpp: Runs the first half of the simulation (e.g., 5 epochs) and saves the serialized output.
  3. SerializationSecondHalfTest.cpp: Runs the second half of the simulation (e.g., 5 epochs), resuming from the first half's serialized output, and compares the result with the full simulation's serialized output (a sketch of this comparison follows the list).
  4. run_serial_test.sh: A bash script that automates the testing process.
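
In rough outline, the comparison in the third test amounts to something like the sketch below (GoogleTest assumed; the file names are illustrative, not the actual paths the tests use):

```cpp
#include <gtest/gtest.h>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

// Read an entire file into memory so two serialized outputs can be compared byte for byte.
static std::vector<char> readAll(const std::string &path)
{
   std::ifstream in(path, std::ios::binary);
   return std::vector<char>(std::istreambuf_iterator<char>(in),
                            std::istreambuf_iterator<char>());
}

TEST(SerializationTest, HalfPlusHalfMatchesFull)
{
   // Output of SerializeFullFileTest: all 10 epochs run in one go.
   std::vector<char> full = readAll("full-simulation-serialized.xml");
   // Output of SerializationSecondHalfTest: 5 epochs resumed from the first half's output.
   std::vector<char> resumed = readAll("second-half-serialized.xml");

   ASSERT_FALSE(full.empty());
   EXPECT_EQ(resumed, full);
}
```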

Checklist (Mandatory for new features)

  • Added Documentation
  • Added Unit Tests

Testing (Mandatory for all changes)

  • GPU Test: test-medium-connected.xml Passed
  • GPU Test: test-large-long.xml Passed

d-kamath self-assigned this Sep 16, 2024
d-kamath requested a review from stiber September 16, 2024 19:50
stiber (Contributor) commented Sep 21, 2024

Can’t the full-length simulation results just be cached? Or is that too much like a regression test?

d-kamath (Collaborator, Author) commented:

> Can’t the full-length simulation results just be cached? Or is that too much like a regression test?

It is possible, but the cached result would need to be updated whenever a serialized member variable changes. Shall I cache the result?

stiber (Contributor) commented Sep 26, 2024

> Can’t the full-length simulation results just be cached? Or is that too much like a regression test?
>
> It is possible, but the cached result would need to be updated whenever a serialized member variable changes. Shall I cache the result?

This is actually an interesting issue. In principle, we should check whether changes to things like member variables represent a breaking change to serialization/deserialization, because simulator users may have saved serialization files they want to re-use, and they need to know when that won't be possible across certain updates.

Are we using Cereal's versioning capability? Maybe we need a separate test that compares a cached serialization file with a newly generated one: a test that fails if the two have the same version but don't match (and passes if they have different versions). I don't think the versioning capability is anything more than embedded documentation, so a breaking change would need to be caught by a test like that.
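
For reference, Cereal's class-versioning hook looks roughly like the sketch below (the class and its members are hypothetical, not Graphitti code). The number given to CEREAL_CLASS_VERSION is written into the archive and passed back to serialize() on load, so the code can at least branch on it when members are added or removed:

```cpp
#include <cereal/cereal.hpp>
#include <cereal/archives/binary.hpp>
#include <cereal/types/vector.hpp>
#include <cstdint>
#include <vector>

// Hypothetical class standing in for any serialized simulator component.
class RecorderState {
public:
   std::vector<float> history_;
   std::vector<float> weights_;   // imagine this member was added later

   template <class Archive>
   void serialize(Archive &archive, std::uint32_t const version)
   {
      archive(history_);
      // Archives written before weights_ existed carry version < 2.
      if (version >= 2)
         archive(weights_);
   }
};

// Bump this whenever a member-variable change breaks old archives.
CEREAL_CLASS_VERSION(RecorderState, 2);
```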

stiber (Contributor) left a comment

OK, just open a new issue for the additional ideas.

d-kamath merged commit d29e471 into development Dec 17, 2024
d-kamath deleted the issue-703-serialization-test branch December 17, 2024 21:54