I've REVISED the MAIN P/L "Time stamp corrections vs Unique Frame Count" file (this replaces the one I sent out yesterday).
I've attached a time stamp correction file for each payload.
The MS-Word file contains a description of the steps taken, definitions, etc.
There are 5 Excel files. This time they are for the COUGAR GPS receiver 1pps only. (They have all been extracted using the correct frame interval during the "Extract Process").
There is an apparent 1.003-millisecond delay (offset) in the COUGAR GPS 1pps (one of the two GPS receivers on the MAIN P/L has a 1pps offset).
However, we have to use the COUGAR GPS 1pps since it is common to all 5 payloads.
I've also attached an MS-Word file with 3 tables summarizing the time stamps for all 5 payloads (one at T-0, one at apogee, and one near the end of flight).
These tables should demonstrate why the MAIN & PFF time stamps have to be handled in a different way.
The MAIN & PFFs do not run at exactly the nominal bit rate, so there is a frequency difference from nominal. Therefore, one cannot just take the time stamp for the "Unique Frame Counter" @ T-0 and project the time for the rest of the flight. On the MAIN payload, doing so would produce a 5.375 ms error by the end of the flight.
MAIN P/L Unique Frame Count = 12716193 @ T -0.000122 (beginning of minor frame)
MAIN P/L Unique Frame Count = 18796150 @ T +759.999900 (beginning of minor frame)
This results in a difference of 6,079,957 minor frames, so 6,079,957 x 125 microseconds = 759.994625 seconds (an error of 5.375 ms relative to the nominal 760-second flight time).
One needs to know that the "actual" bit rate is closer to 9,999,932 bps. The "actual" word rate is therefore 999,993.2 words/second (10-bit words), which gives a minor frame rate of 7999.9456 frames/second (frame period = 125.00085 µs).
Then: 6,079,957 frames x 125.00085 µs = 759.999793 seconds (vs the true corrected time stamp difference of 760.000022 seconds). Now the error is only about 229 microseconds. Still, this is a non-trivial error.
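The arithmetic above can be checked in a few lines; every number comes from this memo. The 5.375 ms figure is measured against the nominal 760-second flight time, while the 229 µs figure is measured against the corrected time stamp difference:

```python
# Quick check of the frame-count arithmetic (all numbers from this memo).
N0, T0 = 12_716_193, -0.000122    # unique frame count / time at T-0
N1, T1 = 18_796_150, 759.999900   # unique frame count / time near end of flight

frames = N1 - N0                  # 6,079,957 minor frames
true_elapsed = T1 - T0            # 760.000022 s from the corrected time stamps

nominal = frames * 125.0e-6       # assumes the nominal 125 us frame period
actual = frames * 125.00085e-6    # period implied by the ~9,999,932 bps rate

print(f"nominal period: {nominal:.6f} s ({(760.0 - nominal) * 1e3:.3f} ms short of 760 s)")
print(f"actual period : {actual:.6f} s ({(true_elapsed - actual) * 1e6:.0f} us short of {true_elapsed:.6f} s)")
```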
The PCM crystal oscillator will still have some actual drift vs time, so the exact "frame period" is not constant.
For example, observe the MAIN Payload bit rate measurements that were performed during TM testing (Each test was only a few minutes long):
Bit rate (bps)    +24V        +28V        +32V
Pre-Vib        9,999,934   9,999,936   9,999,935
Post-Vib       9,999,936   9,999,938   9,999,937
Field          9,999,930   9,999,932   9,999,932
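To put those measurements in flight terms, each bit rate can be converted to a frame period and to the offset that would accumulate over a ~760-second flight if a constant nominal rate were assumed. This sketch assumes 10-bit words and 125 words per minor frame, consistent with the word and frame rates derived above; the three rates shown are taken from the table:

```python
# Each measured bit rate (bps) implies a minor-frame period and an
# accumulated offset from the nominal 125 us period over a ~760 s flight.
# Assumes 10-bit words and 125 words per minor frame, consistent with the
# word and frame rates derived earlier in this memo.
BITS_PER_FRAME = 10 * 125     # 1250 bits per minor frame
FLIGHT_S = 760.0

offsets_ms = {}
for label, bps in [("Field +28V", 9_999_932), ("Pre-Vib +24V", 9_999_934),
                   ("Post-Vib +28V", 9_999_938)]:
    period_us = 1e6 * BITS_PER_FRAME / bps                    # actual frame period
    offset_ms = 1e3 * FLIGHT_S * (period_us - 125.0) / period_us
    offsets_ms[label] = offset_ms
    print(f"{label}: {bps} bps -> {period_us:.5f} us, ~{offset_ms:.2f} ms over {FLIGHT_S:.0f} s")
```

Even the spread between the field and post-vib measurements amounts to roughly half a millisecond by the end of the flight, which is why a single "actual" rate is still not good enough.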
I suggest that data analysis (for the MAIN & PFFs) be performed using each of the 1pps-corrected time stamps, rather than assuming a constant bit rate and a constant frame rate.
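One way to sketch that approach: time-stamp any frame count by interpolating linearly between the bracketing 1pps-corrected points, instead of multiplying by a fixed frame period. The `frame_time` helper is hypothetical and the 1pps sample points below are illustrative, not flight data:

```python
import bisect

def frame_time(frame, pps):
    """Interpolate a time stamp for an arbitrary unique frame count.

    pps: list of (unique_frame_count, corrected_time_s) pairs from the
    1pps correction files, sorted by frame count.
    """
    counts = [c for c, _ in pps]
    i = bisect.bisect_right(counts, frame) - 1
    i = max(0, min(i, len(pps) - 2))          # clamp so we can always bracket
    (c0, t0), (c1, t1) = pps[i], pps[i + 1]
    return t0 + (frame - c0) * (t1 - t0) / (c1 - c0)

# Illustrative 1pps points roughly one second apart (not flight data):
pps = [(12_716_193, -0.000122), (12_724_193, 0.999878), (12_732_192, 1.999878)]
t = frame_time(12_720_193, pps)   # halfway between the first two points
```

Because each one-second segment gets its own effective frame period, slow oscillator drift is absorbed automatically, with no assumed bit rate at all.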
One last item:
PFF #2 experienced a large number of telemetry dropouts and lost navigation from T+23 to T+167.
The telemetry dropouts make it very messy to distinguish real data from false data.
The loss of GPS navigation makes the 1pps data invalid. PFF #2 did not regain navigation until about T+209 seconds.
Therefore we should avoid any PFF #2 1pps measurements during the T+23 to T+209 second period.
Let me know if there are questions. I'm hoping that someone will come up with the same answers.
Building F10 Room N207
NSROC Electrical Engineering
Northrop Grumman Technical Services, Inc.
NASA Sounding Rocket Operations Contract (NSROC)
P.O. Box 99
NASA/Wallops Flight Facility
Wallops Island, VA 23337
Telephone (757) 824-1413
FAX (757) 824-2423