Optical Array Probe (OAP) Software
Inter-comparison Project
Synthetic Data
Aaron Bansemer has generated synthetic data to simulate the response
of the 2D-S, CIP and PIP, and these data are now available. The
simulations all use the same assumed exponential size distribution,
and there are simulations for both drops and columns. An extra set of
synthetic data for the HVPS will be available in the near future
(they are taking longer to run because of the large sample volume),
and Aaron will add them to the folder when they are complete. Another
set of simulations assuming irregular ice particles will also be
generated so that there is a range of particle types suited to each
instrument. These will be added once the HVPS simulations are
complete.
In order to make the comparisons most meaningful, it is important
that we all follow the same instructions for processing the data. As
some of us did a similar exercise 6 years ago for the 2018
University of British Columbia workshop on evaluation of cloud probe
processing software, I recommend that we follow the same
instructions for this inter-comparison. We will compare both
particle by particle (PbP) and bulk parameter data. Note that
I am attaching a poster presented at the 2018 AMS Cloud Physics
Conference that describes the results of our previous
inter-comparison.
PROCESSING INSTRUCTIONS
First, please process the data and provide the PbP data for each
particle, including the following: time, particle count in buffer,
width (W), length (L), projected area (A), particle maximum
dimension (D), perimeter (P), and a flag for whether the end diodes
are shadowed (i.e., 0, 1, or 2 depending on whether no, one, or both
end photodiodes are shadowed). For the first step we will only use
the length of the particle along the photodiode array (i.e., the
width W) in the comparison. However, to make the intercomparison
easier at later steps, please include all the aforementioned
parameters in the file. Note that there are different definitions of
particle maximum dimension (e.g., time direction, diode direction,
maximum of the two, minimum circle completely enclosing the particle,
area-equivalent diameter, etc.; see Wu and McFarquhar 2016 for a list
of possible metrics that can be used to characterize the dimension of
a particle): feel free to use whatever dimension you typically use.
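For concreteness, a minimal sketch of one possible per-particle
record layout is given below, assuming NumPy. The field names, data
types, and units are illustrative only; any layout that carries the
parameters listed above is fine.

    # Minimal sketch of a per-particle (PbP) record layout (illustrative).
    import numpy as np

    pbp_dtype = np.dtype([
        ("time",      "f8"),  # image time from the buffer header (s)
        ("count",     "i4"),  # particle count within the buffer
        ("width",     "f4"),  # W: size along the photodiode array (um)
        ("length",    "f4"),  # L: size along the time (flight) direction (um)
        ("area",      "f4"),  # A: projected (shadowed) area (um^2)
        ("max_dim",   "f4"),  # D: maximum dimension, by your usual definition (um)
        ("perimeter", "f4"),  # P: particle perimeter (um)
        ("edge_flag", "i1"),  # 0, 1, or 2 end photodiodes shadowed
    ])

    # Allocate space for one buffer's worth of particles (size illustrative)
    records = np.zeros(256, dtype=pbp_dtype)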
Thus, in summary the instructions for the files to provide for the
intercomparison are as follows (repeat for each file):
1. Extract information from the header. Generate a
PbP file that contains the image time from header for each
particle, count in buffer, W, L, A, D, P, end diode shadow flag.
2. Determine counts recorded by probe as a function of time before
any corrections for particle size are made or before the sample
volume is computed.
   a. Complete images only
   b. Partial images only
   c. Complete and partial images
3. Determine counts recorded by probe as a function of time after
corrections for particle size based on out-of-focus particles are
made.
   a. Complete images only
   b. Partial images only
   c. Complete and partial images
4. Number distribution functions before corrections for particle
size (we will use counts per bin per second so that assumptions about
the sample volume dependence on maximum dimension don't affect the
results); a minimal binning sketch is given after this list.
   a. Use 1-second resolution for generating counts;
   b. Bins should have widths corresponding to 1 pixel
   c. Entries should correspond to the number of counts in that bin
for a given second (should be an integer)
   d. Files should cover the entire length of time in the synthetic
data files
   e. Should include complete and partial images
5. Number distribution function after corrections for particle
sizes (we will specify a priori the sample volume)
   a. Use same instructions as in step 4 and include complete and
partial images
6. Number distribution function after removal of
artifacts (based on particle morphology)
7. Number distribution function after removal of artifacts (based
on particle morphology) and shattered particles (based on
interarrival time); a minimal interarrival-time sketch is given after
this list.
8. Number distribution function after removal of
artifacts (based on particle morphology) and shattered particles
(based on interarrival time) and reacceptance of spuriously removed
particles.
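As a concrete illustration of step 4, a minimal binning sketch is
given below, assuming NumPy. The names times, widths, pixel_res,
n_bins, t0 and t1 are illustrative placeholders for quantities your
own PbP output and probe specifications would provide; this is a
sketch, not a prescribed implementation.

    import numpy as np

    def counts_per_bin_per_second(times, widths, pixel_res, n_bins, t0, t1):
        """Integer counts per 1-pixel-wide size bin at 1-second resolution.

        times     : particle times in seconds (from the PbP file)
        widths    : particle size along the photodiode array, in microns
        pixel_res : probe pixel resolution, in microns
        t0, t1    : start and end times of the synthetic data file (s)
        """
        n_seconds = int(np.ceil(t1 - t0))
        time_edges = t0 + np.arange(n_seconds + 1)              # 1-s bins
        size_edges = pixel_res * (np.arange(n_bins + 1) + 0.5)  # 1-pixel bins
        counts, _, _ = np.histogram2d(times, widths,
                                      bins=[time_edges, size_edges])
        return counts.astype(int)  # counts in each bin for each second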
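Similarly, for step 7, below is a minimal sketch of an
interarrival-time filter for suspected shattered particles, again
assuming NumPy. The 0.1 ms threshold and the choice to also reject
the particle preceding a short gap are illustrative only; please
apply whatever criterion your software normally uses.

    import numpy as np

    def interarrival_keep_mask(times, threshold=1.0e-4):
        """Boolean mask: True for kept particles, False for suspected shattering.

        times     : particle arrival times in seconds (monotonically increasing)
        threshold : interarrival-time cutoff in seconds (illustrative value)
        """
        dt = np.diff(times, prepend=times[0])  # interarrival times; dt[0] = 0
        short = dt < threshold
        keep = ~short
        keep[0] = True            # first particle has no interarrival time
        keep[:-1] &= ~short[1:]   # also drop the particle before a short gap
        return keep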
It is possible we might not have time to complete all steps before
the workshop. Please send your files as each step is completed so
that we can intercompare the distributions on a step-by-step basis.
We anticipate that these comparisons will take place continuously
before we meet in Jeju to discuss the final comparisons, so it will
be very helpful to start receiving files soon.
It would be best if you could provide netCDF files with the output
from your code. As there are two sets of files for each probe (2DS,
HVPS-3, CIP, PIP), one with simulated round drops and another with
simulated columns, please supply separate files for each probe and
particle shape. To allow us to more easily interpret who supplied
what files, we recommend the following naming convention for your
supplied files:
XXX_PPP_Z_S#_YY.cdf
where XXX is a unique 3-letter code (e.g., UIO for University of
Illinois/Oklahoma software), PPP represents the probe (i.e., 2DS,
HVP, CIP or PIP), Z represents shape (i.e., either C for columns or
R for round particles), S# represents the step number (i.e., use S1
for step 1 giving counts along photodiode array) and YY represents
type of particles included (i.e., CO for complete, PA for partial
and CP for complete + partial).
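To illustrate the convention, a minimal sketch of writing one such
size-distribution file is given below, assuming the Python netCDF4
package. The 3-letter code UIO, the variable names, and the dimension
sizes are illustrative only.

    from netCDF4 import Dataset

    # Example: step 4, 2DS, round drops, complete + partial images
    # (all components illustrative).
    code, probe, shape, step, ptype = "UIO", "2DS", "R", "S4", "CP"
    fname = f"{code}_{probe}_{shape}_{step}_{ptype}.cdf"

    with Dataset(fname, "w") as nc:
        nc.createDimension("time", None)   # one entry per second
        nc.createDimension("bin", 128)     # 1-pixel-wide size bins
        time = nc.createVariable("time", "f8", ("time",))
        bin_min = nc.createVariable("bin_min", "f4", ("bin",))
        counts = nc.createVariable("counts", "i4", ("time", "bin"))
        counts.long_name = "particle counts per size bin per second"
        # ... fill time, bin_min, and counts from your processing output ...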
Ideally we would like the PbP files by the end of March, and the
size distribution files by the end of April, so that there is plenty
of time to perform the comparison before the Jeju workshop (and to
get supplementary information if needed). Once we start receiving
files, we will start generating the intercomparison plots and let you
know any pertinent information as we move to subsequent steps, and
there is always a possibility that additional information might be
needed.