MEETING MINUTES (10/8/98 draft)

GLI WORKSHOP

Tokyo Fashion Town

September 9-11, 1998

TOKYO, JAPAN

 

September 9, 1998; morning session

 

Dr. Vu Saito introduced Dr. Iigarashi, who introduced Dr. Ogawa, director of EORC.

I. Opening Address from EORC  Toshihiro Ogawa, EORC/NASDA

  1. ADEOS-II Launch date in November, 2000.

 

II. GLI Project: Current Status Vu Saito GLI Manager

  1. Report on upper management and activities from June 1997 to the present.
  2. Project schedule presented.
  3. Standard algorithm development ends December 31, 1998 for overseas PIs; March, 1999 for Japanese PIs.
  4. Launch date: November, 2000
  5. 1999-Launch: develop/integrate S/W onto EORC system
  6. 1999.1 - 2001.1 at EOC: install S/W developed by EORC
  7. Next RA concept: continue for year before launch and 2 years after launch.
  8. Second RA (research announcement) objectives: development of research algorithms, cal./val., scientific application to ADEOS-II mission that includes:
      1. SeaWinds by NASA
      2. POLDER-II by CNES
      3. ILAS-II by ESA
  9. GLI meetings are held every 2 months. Attendees at this workshop can pick up copies of the principal handouts from the last GLI seminar (December 1997) at the reception desk.
  10. Proto-flight GLI model construction to be completed December 1998.
  11. Ocean and land scientists will propose operation patterns, including tilt operations, that satisfy both groups' objectives.
  12. GLI 250 m data acquisition difficulties.
  13. Capacity of the optical disk recorder: 7 minutes (4 min for each scene).
  14. Scientific validation plan drafted by PIs; this plan will be merged with the NASDA Cal/Val plan.
  15. Standard high-level products and research high-level products.
  16. GLI Simulator provides synthetic data for all 36 channels (see CD-ROM handout).
  17. A final list of research products is expected by the end of this meeting.

Questions or comments/Answers (Q&A):

Q: Ian Barton: The 250 m bands are important for cloud clearing. Will these data be provided to ground stations? Will there be direct broadcast of channel data?

Q: Nakajima: PIs should push for direct broadcast.

 

IIIa. Current Status of GLI Development Kazuhiro Tanaka EOS/NASDA

  1. Block diagram and schedule presented.
  2. Currently manufacturing the PFM (proto-flight model).
  3. Final integration of GLI in October, 1998.

 

IIIb. GLI-PFM Expected Performance Hirokazu Ohmae Fujitsu

  1. Adjusting gain, dynamic range, linearity, polarization, spectral characteristics
  2. Polarization: channels 1, 10, 18 don't meet specs.

Questions or comments/Answers (Q&A):

Q: Terry Nakajima. Non-linearity of the GLI sensor response is a problem. Can we compensate for the non-linearity?

A.: No compensation. However, we measure it and can provide a basis for correcting the non-linearity. The non-linearity is stable.

Q: Terry Nakajima: Can we trace the source of non-linearity? What if optics degrade?

A.: This is a problem to discuss later.

 

 

IV. GLI Data Processing System Satoko Horiyama EOSD/NASDA

  1. Processing time depends on whether or not we are using DRTS.

Horiyama (to Terry Nakajima): DRTS will be operational 70-80% of the time.

 

 

V. Enhancement for GLI Data Distribution System Shigemitsu Fukui EOIS/NASDA

  1. HDF format for GLI
  2. Toolkit functions are under investigation
  3. JPEG for browse at EOIS
  4. EOIS catalog interoperability with NASA

Questions or comments/Answers (Q&A):

Q: Rosenfeld: Looks like the GLI data system is designed after TRMM. The TRMM Toolkit is important to users for accessing TRMM files.

A.: Saito: We will investigate function of Toolkit. We need your input.

A.: Rosenfeld: I will provide input for TRMM toolkit.

Q: Kishino: Where is decoding performed?

A.: Horiyama: at level 0

Q: Kishino: What is the format of an empty packet (bit-stream structure)? Empty packets are not defined by CCSDS.

A.: At level 1A, empty level 0 packets are filled with a dummy bit string; the 13-bit SWIR samples of level 0 are transformed into 16-bit (byte) units.
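As a rough illustration of the byte alignment described in this answer (the exact GLI level 1A packet layout is not specified in these minutes, so the zero-padding scheme below is an assumption), each 13-bit SWIR sample can be carried in a byte-aligned 16-bit word:

```python
def expand_13bit_to_16bit(samples):
    """Illustrative sketch only: place each 13-bit SWIR sample into a
    byte-aligned 16-bit word, leaving the top 3 bits zero. The actual
    GLI level 1A packing may differ."""
    words = []
    for s in samples:
        if not 0 <= s < (1 << 13):
            raise ValueError("sample out of 13-bit range")
        words.append(s & 0xFFFF)  # value already fits; top 3 bits are zero
    return words
```

Each output word occupies exactly two bytes, which is one plausible reading of "16 bit (byte unit)" above.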

Q: Shimoda: What is reason of not considering DVD?

A.: Fukui: We have no plan to use DVD in the near future. If need arises after launch for DVD, we will develop interface to DVD.

 

 

VI. High Level Product Evaluation System and Support for PIs Kazuo Yoshida EORC/NASDA

  1. showed slide delineating role of EOC and EORC.
  2. EOC to furnish machine to EORC.

Questions or comments/Answers (Q&A):

Q: Abbott: What is the limiting factor of three months for 250 m data?

Q: Nakajima: What is the bottleneck for the 250 m band downlink? Is it hardwired at design? Can scientists propose changes to the design?

A.: Horiyama: We will use ODR. We have a hardware limitation for 250 m data.

Q: Abbott: What volume of level 3 product will come out from GLI?

A.: Yoshida: Not much space or budget; we hope to produce level 3 standard products for testing. Not sure what amount of data can be accommodated.

A.: Nakajima: Only 25% of total data volume will be produced in the 1st year.

Q: Abbott: How is the first 25% to be processed selected?

A.: Every 4th pixel; we can discuss this later.

 

 

Morning session issues:

1) 250 m direct reception

2) sensor performance: non-linearity

3) data volume of data products

4) use of DVD

 

 

GLI WORKSHOP

Report Session

Afternoon of September 9, 1998

 

I. Science Team Activities 1997/98 Teruyuki Nakajima (Leader of GLI PI Team)

Discussion:

GAIT: Twelve (12) of twenty-three (23) standard algorithms have been submitted.

Information is available on-line at http://www.eorc.nasda.go.jp/ADEOS-II/GLI/gli.html

 

II. Report from the Algorithm Scientist Hajime Fukushima Tokai U.

Discussion:

Q: Isaka: Are there plans to include 3-D radiative transfer in the GLI Simulator? We need to coordinate activities between the PIs and the GAIT. Where do we discuss PI requirements?

A.: Terry Nakajima: We'll arrange a meeting on the third day.

Q: Cota: How do you evaluate algorithms?

A.: It depends on the algorithm. When two algorithms have the same objective, we should evaluate both. We'll see after launch. No evaluation generally, due to time criticality.

 

III. Report from GAIT Activities Takashi Nakajima EORC/NASDA, Fujitsu, SED

Discussion:

Q: Isaka: The cloud mask, for example, is based on some given criteria; someone else may want different criteria. We need some cooperation and coordination to know the criteria. How do you proceed to evaluate the performance of the algorithm, especially when you combine algorithms?

Q: Pinker: Can we get input errors that enter the next algorithm? That error will propagate into other parameters.

A.: We need some kind of discussion.

A.: Terry Nakajima: Give us what kind of problems you can think of to serve as a basis of discussion.

Q: Barton: What is the mechanism and the timing for feedback of algorithm testing to the PIs?

A.: Takashi Nakajima: It depends on the algorithm. If you want to modify the algorithm, tell us as soon as possible. When finished, EORC will send feedback as soon as possible. Implementation takes 4 months.

Q: Barton: Testing our algorithms with synthetic GLI data is good, but in 4 months, my contract may be finished.

A.: Check the home page of EORC for your algorithm status.

Q: Pinker: Does the GSD (GLI Simulator Data) use the GLI response function? Is the GLI response function from the Engineering Model?

A.: Yes.

Q: Kishino: The current GSD is for clouds. Will the next generation GSD include snow and clouds?

A.: Yes.

Q: What are the future plans for the GSD?

A.: The data will be generated based on cloud microphysics. It will include snow grain size and impurity.

Q: When will the EOC toolkit be ready?

A.: This October.

Q: Terry Nakajima: A plan describing the simulation data should be published.

A.: We need to generate this product. It will take half a year hopefully.

 

IV. Report from Calibration Scientist Kohei Arai Saga U.

  1. Round-robin test flow
  2. Setting and measuring radiance from the SIS (small integrating sphere)
  3. The difference in SIS radiance as measured by Fujitsu (F) and Saga U. (S) at 1400 nm is almost 40%.
  4. (F-S)/F is between 0 and -6% for SWIR bands.
  5. (F-S)/F is between 0 and -1% for VNIR bands.
  6. Saga U. uses the average of 128 measurements.
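The round-robin comparison metric quoted in items 4 and 5 is the relative difference of the two radiance measurements; a minimal sketch:

```python
def relative_difference(f, s):
    """(F - S) / F, where F is the Fujitsu-measured SIS radiance and
    S the Saga U. measurement; negative values mean S exceeds F."""
    return (f - s) / f
```

For example, F = 100 and S = 106 gives -0.06, i.e. the -6% SWIR-band extreme quoted above.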

 

V. Report from Cal./Val. Working Group Masanobu Shimada EORC/NASDA

Discussion:

Q: Abbott: What are the plans for making calibration or match-up data available to the community for analysis, for example, buoy data?

A.: We would like to open these data sets to researchers, but some require owner's approval.

Q: Terry Nakajima: Where does the change in signal as a function of tilt angle originate? Radiance should not depend on tilt angle.

A.: It's raw data from the Blackbody.

Q: Terry Nakajima: The NASDA budget requires that we maximize validation efficiency. We ask you to consider what instruments to deploy at the validation sites. Please discuss on the third day of this meeting.

A.: Abbott: I think to get maximum efficiency for the validation activities, we have to address this issue of data ownership.

 

VI. Report from Validation Scientist Satoru Taguchi Soka U.

  1. Data products listed.
  2. Fourteen (14) ocean products and priorities established.
  3. Priorities set for atmosphere, land, and cryosphere.
  4. Scientists need to set priorities.
  5. Quantities for validation listed.
  6. Validation sites listed.

 

VII. Report from Data Management Working Group Yoshiaki Honda Chiba U.

Discussion:

Q: Nakajima: Do you have a tilt plan? Please give us a plan. We need to know how many times a tilt can occur per orbit.

A: Tilt operation will be decided by the land and the ocean working groups after mutual agreement.

 

 

GLI WORKSHOP

Common Session

Afternoon of September 9, 1998

 

IX. AROP Algorithm: Aerosol Optical Properties Over Land  Ken Knapp (on behalf of Vonder Haar)  Colorado State U./CIRA

  1. Overview of algorithm, sensitivity, current status, future work, suggestion for further improvement.
  2. Channel retrievals admit multiple solutions.
  3. Channel retrieval: solutions are non-orthogonal, hence sensitive to small errors.
  4. Kaufman technique 1997: PI wants to improve over MODIS and use multiple channels.
  5. Retrieval requires 3 channels to avoid multiple solutions. We'll need to include the correction for ozone or water absorption. Too computationally expensive to build a database that stores surface reflectances.
  6. Early version of algorithm submitted last December.

Discussion:

Q: Verstraete: Can you give us some more information about how you select the dense, dark targets for the measurement of surface reflectance?

A.: Knapp: From Dr. Kaufman's paper, surface reflectances are measured at 2.2 and 3.7 microns and, in turn, are related to the reflectances at visible wavelengths.

Q: How do you characterize the surface? Do you assume a black surface?

For non-black surface, do you have directional reflectance? The reflectance of the surface can change by a factor of two depending on the angle.

A.: We assume a green vegetation surface. We use a linear relation between bands. The directional reflectances are both from the same angle for the same time period.

Q: Nakajima: Do you have an algorithm for getting radiance or critical reflectance from a larger area that would demonstrate a change in the radiative budget?

A.: Not now; that would require some sort of multi-temporal comparison of reflectance. We are not interested in doing multi-time analysis. The use of a "mosaic" of previous satellite passes would be computationally expensive.

Q: Have you looked at TOMS products (experimental tropospheric aerosol product using the UV channel)?

A.: TOMS estimates aerosol absorption. These are initial estimates, recently quantified. These would be useful as an a priori model for our retrieval.

Q: Have you considered using SeaWiFS/POLDER data?

A.: No, but other data may be helpful.

Q: Pinker: I want to get some information about aerosol climatology of biomass burning.

A.: We use Kaufman's published climatology.

 

X. GLI Algorithm for Retrieving Precipitation from Clouds with Tops Warmer than 245 K Daniel Rosenfeld Hebrew U.

  1. Rainfall retrieval for GLI: use GLI for clouds with tops warmer than 245 K; use optical depth to complement AMSR; low-level and warm clouds are the least detectable by AMSR but are detected by GLI; assign a rain rate based on the geometrical depth of the cloud, then average to 100 x 100 km.
  2. Day algorithm: a threshold on visible reflectance plus a split-window threshold between 11 and 12 microns. For these optically thick clouds, we determine which ones are precipitating by applying an optical depth threshold of 14.
  3. Recommend combining this algorithm with the AMSR rain algorithm.
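The day-time screening steps listed above can be sketched as follows. The optical-depth cutoff of 14 is quoted in the minutes; the visible and split-window threshold values, and the function structure itself, are placeholders for illustration, not Rosenfeld's delivered code:

```python
def is_precipitating_day(vis_reflectance, t11, t12, optical_depth,
                         vis_thresh=0.6, split_thresh=1.0, tau_thresh=14.0):
    """Hypothetical sketch of the day algorithm described above:
    a visible-reflectance threshold plus an 11-12 um split-window test
    select optically thick clouds; an optical-depth threshold then
    flags the precipitating ones."""
    optically_thick = (vis_reflectance > vis_thresh
                       and (t11 - t12) < split_thresh)
    return optically_thick and optical_depth >= tau_thresh
```

A bright, spectrally flat, optically deep cloud passes all three tests; a thin or dark pixel is rejected at the first step.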

Discussion:

Q: Nakajima: Why does the passive radiometer always over-estimate rainfall relative to the PR (Precipitation Radar)?

A.: This is an inherent problem with TRMM. TRMM data will be reprocessed. I expect to provide correction coefficient for TMI rain.

Q: Nakajima: In your data, the PR appears to be over-estimated.

A.: The data is trained with the PR, so there should not be an overestimate. I don't see any significant deviation.

Q: Isaka: No relation between PR and Rosenfeld technique for T<245K cold clouds.

A.: This is exactly what I want to point out. I am not suggesting using these cold clouds.

Q: Isaka: AMSR will have problem with cold clouds.

A.: The passive microwave rainfall algorithm relies on the backscatter of the ice. The radiometer works better with ice crystals for scattering.

Q: Barton: How to determine algorithm outside TRMM coverage?

A.: At those high latitudes, use radar networks outside tropics that provide precipitation information; use radar in US.

 

XI. Activities of GLI PI Science Team- 1997/98 Teruyuki Nakajima U. Tokyo

  1. Redefinition of products.
  2. Algorithm development status.
  3. Radiation budget at the surface requires ground albedo from the Land group. Land people need the atmospheric correction to get accurate ground albedo and BRDF. Big interaction!
  4. List of validation sites.
  5. Need quantitative method to validate cloud fractions. Cloud top and bottom heights will require a huge effort.

 

XII. High Latitude Bio-Optical Algorithms Glenn Cota Old Dominion U.

  1. High latitudes are interesting: sea ice is decreasing.
  2. Ten cruises made.

 

XIII. Inversion Schemes to Retrieve Atmospheric and Oceanic Parameters from GLI Data Robert Frouin Scripps Institution of Oceanography/UCSD

  1. Whitecap correction algorithm described.
  2. Normalized Difference Phytoplankton Index (NDPI) code, relating NDPI to phytoplankton concentration, has been delivered to NASDA.
  3. Multi-layer perceptrons model the transfer function between spectral marine reflectance and phytoplankton pigment concentration.

Discussion:

Q: Takashi Nakajima: There is no real-time wind data for the whitecap correction. We need to use SeaWinds. ECMWF is not real-time.

A.: I recommend using ECMWF winds. It's a drawback because it's not real wind, but it is better than ignoring whitecap effects.

Q: Cota: How far down did you go for your Chlorophyll concentration?

A.: It goes down to 0.02.

 

 

GLI WORKSHOP

Common Session

September 10, 1998

 

I. Monitoring Wetlands Yoshifumi Yasuoka U. Tokyo

  1. Developed a new vegetation index (VSW = Vegetation Soil Water) because the usual NDVI does not work well in wetland areas. The VSW index is calculated from the ratios among HV, HS, and HW, the perpendicular distances from the measurement point to the lines forming the Vegetation-Soil-Water triangle.
  2. Mixed decomposition with hyperspectral GLI image to assess vegetation mixture conditions.
  3. Scaling between GLI and high spatial resolution sensors.

Conclusions:

  1. VSW can be used as a new VI for GLI.
  2. Unmixing with subspace method is useful for hyperspectral GLI.
  3. Scaling is necessary between GLI and high spatial resolution sensors.
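A geometric sketch of the VSW construction described above, under the assumption that HV, HS, and HW are the perpendicular distances from a pixel's position in a two-band feature space to the triangle edges opposite the Vegetation, Soil, and Water vertices (the exact feature space and normalization are not specified in these minutes):

```python
import math

def perp_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * (px - ax) - (bx - ax) * (py - ay))
    den = math.hypot(bx - ax, by - ay)
    return num / den

def vsw_ratios(p, v, s, w):
    """Hypothetical sketch of the VSW construction: distances from the
    measurement point p to the edges opposite the Vegetation (v), Soil
    (s), and Water (w) vertices, normalized to sum to 1."""
    hv = perp_distance(p, s, w)  # edge opposite the Vegetation vertex
    hs = perp_distance(p, v, w)  # edge opposite the Soil vertex
    hw = perp_distance(p, v, s)  # edge opposite the Water vertex
    total = hv + hs + hw
    return hv / total, hs / total, hw / total
```

A pixel near the Water vertex yields a large HW-free pair and a small distance to the opposite edge, so the ratios locate the measurement within the triangle.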

Questions/Comments and Answers (Q&A):

Q. Terry Nakajima: What is the effect of tilt on VSW calculation?

A.: Tilt has a big effect on the algorithm. We may modify the algorithm to take the sun angle into account. We are interested in wetlands, and specular reflection is important.

Q.: Are you using only 2 channels?

A.: Yes.

Q.: What are the channels?

A.: That is not decided yet. But for unmixing, we will use all channels.

 

II. Mosaicking, Atmospheric Correction and VI Algorithms Alfredo Huete U. Arizona

  1. Generate coarse resolution (1 km) and moderate resolution (250 m) VI maps.
  2. Composite data before performing atmospheric correction.
  3. Composite twice/month.
  4. Goal of LTSK10 is creating a cloud-free reflectance map.

Questions or comments/Answers (Q&A):

Q: Kishino: Have you evaluated the effect of band-to-band mis-registration?

A.: We found that the band to band mis-registration is 10 times worse than pixel to pixel mis-registration. A 10% error in band to band can lead to almost 50% classification error.

Q: Verstraete: The fact that atmospheric correction is performed after mosaicking is dangerous.

A.: Regardless of how it's corrected, if we can get access to the input (e.g., TOMS), then after we composite, part of the compositing output contains the normalized radiances. Also, we can preserve the pixel information such as sun angle, day of year, etc., so we can do atmospheric correction for any time.

Q: Verstraete: I want to advocate that there is no such a thing as universal NDVI. The bands are different for AVHRR, MODIS and GLI, and other parameters that have to do with the biosphere, such as LAI, FPAR, or biomass are what we really seek. I want to point out that the NDVI is not an optimal index for monitoring changes.

A.: Nakajima: Michel, you have proposed something provocative; you should bring this up in the land group discussion.
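As a minimal illustration of the sensor-dependence point raised in this exchange: NDVI is computed from each instrument's own red and near-infrared bands, so differing band definitions (AVHRR vs. MODIS vs. GLI) give different values for the same scene. A sketch of the standard form:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red and near-infrared
    reflectances. The numerical value depends on each sensor's band
    definitions, which is why there is no universal NDVI."""
    return (nir - red) / (nir + red)
```

The same surface viewed through different band responses yields different `red`/`nir` reflectances and hence a different index.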

 

 

III. Research Status of Cryosphere Yomohiko Oishi Tokai Univ.

  1. Use Aoki models and neural network (NN) for the inversion from radiance data to impurity concentration and grain size.
  2. Made field campaign snow reflectance measurements.
  3. Reflectance of larger grains is smaller longward of approximately 1 micron.
  4. CTSK2b3 snow cover area algorithm is ready.

Questions or comments/Answers (Q&A):

Q: Kishino: What is the reason for the number of neurons and layers in your NN?

A: The most important factor is the number of layers; 2 is most commonly used.

Q: Zege: Have you measured angular dependence of the reflection coefficient?

A: (shows graph) This is the BRDF at the snow surface. I have made no correction, but we can use this angular BRDF to address your question.
 
 
 

Parallel Sessions: September 10, 1998

Atmosphere and Ocean Session

 

Atmosphere 1. Retrieval of Cloud Parameters from GLI and POLDER Data Harumi Isaka U. Blaise Pascal

  1. The retrieval algorithm is defined and tested for cloud parameters in inhomogeneous clouds. We retrieve average optical depth, effective radius, and relative cloud inhomogeneity. We can probably extend the retrieval algorithm to fractional clouds.

Questions or comments/Answers (Q&A):

Q: Nakajima: You assumed 1-D and 2-D homogeneity. In the case of vertical inhomogeneity, can you use your technique?

A.: The average effective radius depends on cloud height. A UK group uses an along-track radiometer; their retrieval showed that the dependence on viewing angle, over the range from 60 degrees to nadir, could produce about a 3% error.

Q: Nakajima: You have one more unknown.

A.: We have three parameters. You can define the size at the top of the cloud.

Q: Nakajima: Rather than use texture analysis, is there another way to go?

A.: Maybe. You define one more parameter. Vertical height is not an important problem. If necessary, we could make improvements in that area. The problem is fractional clouds. Monte Carlo computation takes 3 months.

Q: Have you considered polarization?

A.: No. You are asking about orientation effects. We cannot retrieve orientation information from GLI. Cloud inhomogeneity comes from POLDER. POLDER provides polarization information and, probably, cloud type. But cloud retrieval comes from GLI.

Q: Melnikova: Do you take into account cloud absorption of radiation? This factor may affect the retrieval of cloud optical depth. You may underestimate optical thickness.

A.: We do not include cloud absorption of radiation, but isn't this effect less than the effect of cloud inhomogeneity?

 

 

A2. Evaluation of Absorption Properties of Atmospheric Aerosols Based on Chemical Characterization Sachio Ohta Hokkaido U.

  1. Atmospheric correction based on chemical characterization of atmospheric aerosols: phase function and single scattering albedo.
  2. In situ collections of aerosols.
  3. Plans for surface-based aerosol measurement system.
  4. Plans for daily in-situ collection of aerosols.

Questions or comments/Answers (Q&A):

Q: Rosenfeld: Such data are useful for getting cloud condensation nuclei. Can we connect these observations to this problem?

A.: Yes in principle, but no time or support in practice.

Q: Isaka: Chemical composition determines size distribution. Is there no separation by particle size?

A.: I used a size distribution according to our measurements and the cascading model. I used a cascade impactor to check the size distribution. I assume a 3-layer aerosol model.

Q: Melnikova: Why is the external mixture less absorbing than the internal mixture?

A. This is because of the model: in the internal mixture, half of the aerosols have the absorptivity.

Q: Terry Nakajima: Atmosphere and ocean groups are using AFGL model. Can you modify their model?

A.: Yes.

 

 

A3. Fundamental Analysis for Algorithm Construction on Cloud-Aerosol Parameters Derivation Tamio Takamura Chiba U.

  1. Ground-based observation program
  2. Lidar provides vertical distribution of aerosols.
  3. Other instruments are Sun-photometer, Aureolemeter, microwave radiometer, pyranometer, pyrheliometer.
  4. Aerosols typically have a tri-modal distribution in volume spectrum.
  5. Future: identify the complex index of refraction.

Questions or comments/Answers (Q&A):

Q: Knapp: Do you account for the effect of relative humidity?

A.: We do not include a relative humidity dependence. You can see the shift of the maximum as a function of height. Maybe we can check the dependence on height.

 

 

A4. Ground-Based Measurements of Aerosol Parameters Guang-Yu Shi Chinese Academy of Sciences

Observations of spectral direct solar radiation (spectral Sun-photometer), spectral diffuse solar radiation (aureolemeter), direct solar radiation (pyrheliometer), global solar radiation (pyranometer), reflected solar radiation, terrestrial radiation (pyrgeometer), and the concentration and size distribution of aerosols (optical particle counter).

 

 

A5. Retrieval of Cloud Geometrical Thickness Makoto Kuji Nara Women's U.

  1. Can retrieve cloud top height and cloud geometrical thickness simultaneously.

Questions or comments/Answers (Q&A):

Q: Terry Nakajima: What kind of response function did you use? The response function can really affect the results. What if you use 2 or 3 layer clouds? What geometric thickness will you get?

A.: We work with MCR only.

Q: You have four variables; it is difficult to get a stable answer.

A.: Why do you think so?

Q: For example, stability depends on the look-up table. You need a lot of memory for the LUT; I worry about memory usage.

 

 

A6. Improvements to the GEWEX and GCIP SRB Algorithms Based on Observations from the GLI on ADEOS-II Rachel Pinker U. Maryland

  1. The shortwave radiation budget satellite algorithm is ready.
  2. At longwave, HARTCODE was upgraded, allowing new water continuum parameterization.
  3. Retrieving surface temperature from GLI. Delivered code to NASDA.
  4. Ground-truth data set will be used for validation.

Questions or comments/Answers (Q&A):

Q: Terry Nakajima: You are getting the effective surface temperature?

A.: Yes.

Q: You have 2 options in the shortwave, but are not pursuing option 2?

A.: Option 2 is worth pursuing. The motivation is the need for surface albedo. Version one, already delivered, assumes that surface albedo will come from independent sources. Version two, in addition to downward shortwave fluxes, will also allow derivation of the surface albedo, namely both downward and upward fluxes.

 

A7. Development of Remote Algorithm for Atmospheric and Terrestrial Surface Parameters Using Multi-Channel Data Takahisa Kobayashi for Akihiro Uchiyama Meteorological Research Institute

  1. Cloud type classification algorithm
  2. Surface absorbed flux algorithm
  3. Volcanic aerosol retrieval method developed.

Questions or comments/Answers (Q&A):

Q: Rosenfeld: Did you examine the shady side of the cloud? It should be symmetrical to prove your point.

A.: We did not do that. The shadowing effect is significant, so I do not think it will be symmetrical.

Q: Nakajima: What constitutes a clear sky (443 nm)? Aerosol or molecule?

A.: Aerosol is more important.

 

 

Opening Remarks for the Ocean Group Hiroshi Murakami and Yongje Park GAIT

 

Ocean 1. Development of Underwater Algorithms for ADEOS-II/GLI Motoaki Kishino Institute of Physical and Chemical Research

  1. Neural network approach to obtain concentrations of water constituents.

Questions or comments/Answers (Q&A):

Q: Cota: How do you handle the chlorophyll package effect? Do you assume that it is constant?

A.: Yes. This is important for modeling.

Q: Can you give computational time for pixel-wise retrieval?

A.: About 5 minutes for the image I showed, on a Sun workstation.

 

 

O2. MODIS Ocean Algorithms and GLI Mark Abbott Oregon State U.

  1. MODIS issue: Electronics crosstalk between channels
  2. Software delivered to NASDA
  3. MODIS launch mid-1999
  4. MODIS:GLI sensor differences- number of bands, wavelength and bandpasses, calibration, GLI scan angle dependence, atmospheric correction
  5. Fluorescence daily time-dependence observed.

Questions or comments/Answers (Q&A):

Q: Did you use continuous culture?

A.: Yes.

Q: How do you get the nutrient?

A.: These were all chemostat cultures, not batch cultures, but we change the flow rate.

 

 

O3. Algorithm Development for Ocean Color: Asian Dust Aerosol Correction and Cloud-affected Pixel Screening Hajime Fukushima Tokai U.

  1. Atmospheric correction code submitted.
  2. Evaluation of “new aerosol model set” to be performed shortly
  3. Cloud screening and near-cloud pixel screening algorithms submitted.

 

 

O4. GLI Satellite Algorithm Development and Application for the California Coastal Zone Robert Frouin for B. Greg Mitchell Scripps Institution of Oceanography, UCSD

  1. Using the CalCOFI bio-optical database for algorithm development.
  2. Cubic spline interpolation to GLI wavelengths.
  3. Algorithm for chlorophyll and K490 developed.
  4. Developed supporting data.
  5. Algorithm to distinguish red tide (more absorption in UV).

Questions or comments/Answers (Q&A):

Q: What is the real difference when you have red tide? Species?

A.: Abbott: There is a compound (MAA?) released that absorbs in the UV.

Q: Are you sure you could detect a bloom?

A.: It is a possibility.

 

 

O5. Variability in the Spatial and Temporal Distribution of the Spring Bloom of Phytoplankton in the Western Subarctic Pacific  Satoru Taguchi  Soka U.

  1. Trying to relate fluorescence values in a water column to GLI determinations.
  2. Developing algorithms to relate natural fluorescence and chlorophyll a concentration, to relate natural fluorescence and primary production, and to relate instantaneous primary production and daily primary production.
  3. Optical observations of PAR and natural fluorescence made.
  4. Biological observations made.

Questions or comments/Answers (Q&A):

Q: Cota: Absorption characteristics with depth are regular. Surface biomass is related to integral biomass. Your OCTS data works well at high latitude because of the pigment-rich conditions there. Any comment?

A.: I have no measurements at high latitude. Phytoplankton absorption is small in the ocean, so the package effect is small.

Q: Cota: Does the data show a large package effect?

A.: In warm water the chlorophyll concentration is very low, and we think the particles are small. The size of phytoplankton changes with season in Japanese waters. In the spring there are large chlorophyll concentrations, and the cells are strongly packaged.

 

 

O6. Pre-Launch Algorithm Development for GLI Data- SST, Atmospheric Water Vapor, and Cloud Detection Techniques Ian Barton CSIRO

  1. Radiometric field measurements and bulk temperature devices for validation.
  2. Four SST products at high spatial resolution (1 km data)
  3. A CD-ROM containing SST programs and data files has been delivered to NASDA.

Questions or comments/Answers (Q&A):

Q: Cota: Some improvement in rms is artificial as you add channels. Can you discern real improvement?

A.: Yes, by looking at the values of the coefficients themselves.

 

 

O7. Development of Algorithm and Schemes to Retrieve SST from GLI Data

Masao Moriyama for Hiroshi Kawamura Tohoku U.

  1. Did similar work for OCTS.
  2. For GLI, working on SST, cloud detection, skin temperature algorithms, match-up data set, and sea surface shape effects.

Questions or comments/Answers (Q&A):

Q: Ian Barton: Channel 30 has better transmission than channels 35 and 36 and could improve your rms for SST.

A.: The 3.7 um channel is more transparent, but it is very noisy. We need more study of aerosol effects to do the correction.

Q: Barton: For the AVHRR algorithm, they use the 3.7 um channel and get better results than the split channel technique.

A.: We will study this further.

Q: Nakajima: Do you have a conversion from skin temperature to bulk sea surface temperature?

A.: SST depends on surface emissivity. We need parameters such as surface wind.
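For reference, split-window SST algorithms of the kind discussed in this session generally take the form SST = a0 + a1*T11 + a2*(T11 - T12), where the 11-12 um brightness-temperature difference corrects for atmospheric water vapor. The coefficients below are placeholders for illustration, not the GLI or AVHRR values:

```python
def split_window_sst(t11, t12, a0=1.0, a1=1.0, a2=2.5):
    """Generic split-window SST form. t11 and t12 are brightness
    temperatures (K) in the 11 and 12 um channels; a0, a1, a2 are
    placeholder coefficients, normally fit against match-up data."""
    return a0 + a1 * t11 + a2 * (t11 - t12)
```

In practice the coefficients are regressed against a match-up data set such as the one mentioned in item 2 above.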

 

Parallel Sessions: September 10, 1998

Cryosphere and Land Session

 

 

Opening remarks: Research Status of Cryosphere Masahiro Hori GAIT

  1. List of various products dealing with the cryosphere.
  2. Many research and standard products.
  3. Since NASDA will choose modules, please emphasize your priorities.

 

Cryosphere 1. The Retrieval of the Effective Radius of Snow Grains and Control of Snow Pollution With GLI Data Eleonora Zege Academy of Sciences of Belarus

  1. Snow optical model.
  2. Most snow models use independent spherical scatterers. We will use a non-spherical, close-packed medium.
  3. System of asymptotic formulae.
  4. Geometrical optics to describe single scattering.
  5. Results reflect the appropriateness of new techniques.
  6. Simulator designed to evaluate the snow size algorithm.
  7. Atmospheric effects on the retrieved snow grain size and pollution amount are estimated.
  8. The developed algorithm mostly provides reliable estimations without atmospheric correction.

Questions or comments/Answers (Q&A):

Q: Oishi: What is the physical meaning of similarity parameter?

A.: It gives the effect of absorption in the diffusion process.

 

 

C2. Research Report for the Third GLI Workshop: Remote Sensing of Cloud and Surface Properties in Polar Regions from GLI Measurements On Board ADEOS-II Knut Stamnes U. of Alaska

  1. Cloudy/clear discriminator and snow/sea ice discriminator have been delivered to NASDA.
  2. Radiative transfer model for coupled surface-atmosphere. Spherical properties for surface particles of snow. Surface is treated as semi-infinite medium. A better theory could replace the assumptions of surface properties.
  3. Cloud detection algorithm is designed to work independently of the surface.
  4. Snow/sea-ice discriminator works only during daytime. This is not an issue as we have enough daylight during half of the year when this (sea-ice) is an issue.
  5. I strongly recommend changing the 1.64 micrometer channel to 1 km spatial resolution rather than 250 m.

Questions or comments/Answers (Q&A):

Q: Verstraete: Can you not just average 250 m to 1 km?

A.: Maybe, but it is an inconsistency compared to the spatial resolution of the other channels we use.

Q: The coefficients to discriminate surface types should be optimized for the application. May or may not need more channels. What is the impact of mis-classification on the other algorithms? If not much impact, then OK. Otherwise, we need to test for this effect.

A.: Yes, I agree. I am open to suggestions.

 

 

C3. Development of Retrieval Algorithm of Snow Grain Size With ADEOS-II/GLI Teruo Aoki Meteorological Research Institute

  1. Algorithm to calculate NDSI, NDII, and NDVI for land surfaces with and without forest cover, and for the sea surface.
  2. Algorithm for snow covered area.
  3. Cloud detection requires an algorithm, e.g., Ackerman's or Stamnes's.
  4. Calculated look-up tables for surface albedo as a function of snow grain size and impurities, and for TOA radiance; retrieval of snow properties; and BRDF look-up tables.
  5. Comparison between observed and modeled BRDF show good agreement.
  6. Experiment in low temperature facility to compare spherical and dendrite shape of snow crystals.
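The indices in item 1 are normalized band differences; a minimal sketch follows. The reflectance values and band pairings are illustrative only, not GLI's actual channel assignments:

```python
# Hedged sketch of the normalized-difference indices mentioned above.
# Reflectance values and band choices are illustrative only; they are not
# GLI's actual channel assignments.
def normalized_difference(a, b):
    """Generic normalized-difference index: (a - b) / (a + b)."""
    return (a - b) / (a + b)

# NDVI contrasts near-infrared and red reflectance (vegetation is bright
# in NIR, dark in red); NDSI contrasts visible and shortwave-infrared
# reflectance (snow is bright in the visible, dark in SWIR).
r_red, r_nir = 0.08, 0.45
r_vis, r_swir = 0.80, 0.10

ndvi = normalized_difference(r_nir, r_red)
ndsi = normalized_difference(r_vis, r_swir)
```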

Questions or comments/Answers (Q&A):

Q: Stamnes: What is the message from the measurements in the cold facility?

A.: We expected a difference in the spectral albedo due to differences in the shape of snow particles. However, we observed quite similar spectral albedo. The crystal sizes are the same, except for the difference in shape.

 

 

Opening remarks of the Land portion Muhtar Qiong GAIT

Current topics in the Land Group include:

  1. In the vicinity of ground receiving stations we can get 250 m data. In other regions, 250 m data may not be possible.
  2. Capacity of ODR = 7 minutes. One scene size is 4 minutes and 10 seconds.
  3. Differences in observation patterns: ocean science needs the tilt, while land science does not want a tilt.

 

Land 1: Global Map of Vegetation Using GLI Data Yoshiaki Honda Chiba U.

  1. Global vegetation map using GLI data.
  2. Stable classification.
  3. Shortest distance method from target pixel to supervised vegetation pattern.
  4. Tilt data will be used along with BRDF correction; pattern recognition between observations and supervised data.
  5. Cloud free data from non-tilt is not enough, hence we have to use the tilt data with BRDF correction.
  6. Stable point: the annual profile is almost the same each year at a given meteorological station.
  7. Peak adjustment.
  8. Vegetation class check; 3 years data is examined to see the consistency of the interpreted vegetation class in order to remove any obvious mistakes using our vegetation class hierarchy.
  9. The method is simple.
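The "shortest distance" step in item 3 can be sketched as follows; the class patterns and pixel reflectances are invented for illustration:

```python
import math

# Hedged sketch of shortest-distance classification: assign a pixel to the
# supervised class whose reference spectral pattern is nearest in Euclidean
# distance. Patterns and reflectances below are invented for illustration.
def classify(pixel, class_patterns):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(class_patterns, key=lambda name: dist(pixel, class_patterns[name]))

patterns = {"forest": [0.05, 0.40], "bare soil": [0.20, 0.25]}
label = classify([0.06, 0.38], patterns)  # nearest pattern wins
```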

Questions or comments/Answers (Q&A):

Q: What kind of vegetation index will you use? NDVI? Or all channel index?

A.: In my method I can apply any vegetation index. I want to compute the standard product.

Q: Verstraete: The last slide: Vegetation class check. You keep tropical forest as it does not become evergreen. What if you had evergreen, tropical, evergreen?

A.: This will result in tropical forest.

Q: If vegetation class is based on a-priori information, why do you need GLI data?

A.: I need it as it forms the first basis for computation of land cover.

Q: The methodology is sensitive to the order; do you have full confidence in the way your algorithm is working?

A.: We can use such class check for 2 classes, e.g., tropical forest, evergreen. If we have more than 2, there will be problems.

Q: Chin: In tropical forests, how do you quantify effects of deforestation?

A.: I do not know how the succession will be constructed when this situation occurs.

Q: How do you classify data of each pixel?

A.: There could be complications from regrowth in deforested areas, e.g., secondary forests.

 

 

L2. Shaobo Huang (for Dr. Tateishi) Chiba U.

  1. Each pixel has the land cover class and the information source.
  2. Land cover classification was developed.
  3. Ground truth data is added to the classification; the NDVI and surface temperature was used.

Questions or comments/Answers (Q&A):

Q: Verstraete: What are the measurements used to classify vegetation on the ground?

A.: Arbitrary

Q: Is there any relation between the criterion to classify vegetation on the ground and the satellite measurements to do the same?

 

 

L3. Vegetation and Land-Cover Classification Based On Simulated GLI Data Set Nguyen Dinh Duong Institute for Geography

  1. GASC algorithm has been tested on Landsat TM (Thematic Mapper) data.
  2. Each land surface category has a unique modulation of the spectral reflectance curve.
  3. Simulated GLI data for Vietnam.
  4. Total Reflected Radiance (TRR) will be used for land cover classification.
  5. System implemented for various platforms using the simulated GLI data over Vietnam. The classification agreement was 90%.
  6. Topography and cloud shadow cause errors in classification.

Questions or comments/Answers (Q&A):

Q: Gobron: How were the TRR values for classification decided?

A.: Based on field trips and TM data and statistical methods.

 

 

L4. Satoshi Tsuyuki (for Dr. Awaya) Tokyo U.

  1. Biomass estimation.
  2. Test sites: (I) Hokkaido: evergreen conifers, (II) foot of Mt Fuji: deciduous conifers.
  3. Shadow disturbs the estimation of biomass using vegetation indices.
  4. Rapid seasonal changes in leaf spectra lead to less accurate estimation of vegetation biomass.

Questions or comments/Answers (Q&A):

Q: Verstraete: As the tree grows, the red reflectance decreases. As the canopy biomass increases, the NIR reflectance should increase. Why do you need to examine the ratio? Why not just consider the NIR reflectance?

A.: The ratio reduces the topographic effect because we have steep terrain.

 

 

L5. Atmospheric Correction for Land and Estimation of Biomass and Carbon Amount in the Humid Tropics for ADEOS-II/GLI Liew Soo Chin (for Dr. Hock) National U. of Singapore

  1. Atmospheric effects on satellite observed radiances.
  2. Full RT may be computationally expensive; approximate using single scattering.
  3. Simulation over Indonesia and atmospheric correction.
  4. Biomass algorithm; Jambi and Singapore test sites; regression relationships.

 

 

L6. Topographic Corrections for GLI Craig Trotter Landcare Research

  1. Effect of topography on reflectance of forests.
  2. Initial analysis of nadir-viewed radiance data for conifer and broadleaf canopies at different sun elevations has been completed.
  3. Corrections for topography for nadir views have been obtained.
  4. Off-nadir results are not good; reflectance approximations must be corrected for off-nadir view angles.

Questions or comments/Answers (Q&A):

Q: Verstraete: Reflectance range?

A.: Yes, we have a range for reflectance. The equation is insensitive to the diffuse radiation.

Q: Why does it not work for off-nadir?

A.: Reflectance ratio has to be changed.

 

 

L7. Development of New Vegetation Indices and Algorithms for Detecting Vegetation Changes Motomasa Daigo Nara Women's U.

  1. Vegetation maps, net primary production and vegetation change maps using GLI.
  2. Pattern decomposition method.
  3. Composite color image.
  4. Vegetation Index based on Pattern Decomposition (VIPD) is more sensitive to vegetation types than NDVI.

Questions or comments/Answers (Q&A):

Q: Huete: Consider the comparison graph between VIPD and NDVI. Concrete, asphalt and yellow leaf also have a range of VIPD.

Q: Trotter: How is quantum efficiency defined, and how was it measured?

A.: Arbitrary.

Q: Was reflectance measured at the same time as quantum efficiency?

 

 

L8. Surface Radiation Budget Products from the ADEOS-II GLI-Proposal G16 Fred Prata CSIRO Division of Atmospheric Research

  1. Global land surface temperature from GLI and clear surface albedo.
  2. Use radiative transfer physics and linearize to regression with coefficients determined off-line so that computation is fast.
  3. Zenith angle effects are weakly non-linear.
  4. Vegetation and land cover maps from GLI, but not using channels 35 and 36 (as this model for surface temperature uses them).
  5. Retrieved surface temperature is within +/- 0.5 to +/- 2.0K.
  6. Water vapor in the atmosphere is least important over ice, therefore surface temperature retrievals over ice are very accurate.
  7. In the absence of BRDF and NDVI, TOA radiances can be used to construct surface albedo.
  8. Validation sites in Australia in tropical monsoon climate.
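Item 2's "linearize to regression" approach can be illustrated with a generic split-window form; the coefficients below are invented for illustration, not Prata's actual off-line values:

```python
# Hedged illustration of a linearized (regression) surface-temperature
# retrieval of the kind described above: Ts = a0 + a1*T11 + a2*(T11 - T12),
# where T11 and T12 are brightness temperatures in two thermal channels.
# Coefficients are invented for illustration only.
def surface_temperature(t11_k, t12_k, a0=1.0, a1=1.0, a2=2.0):
    # The (T11 - T12) term corrects for water-vapor absorption.
    return a0 + a1 * t11_k + a2 * (t11_k - t12_k)

ts = surface_temperature(290.0, 288.5)
```

With the coefficients fixed off-line, each pixel costs only a few multiplications, which is the speed advantage the item describes.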

Questions or comments/Answers (Q&A):

Q: Stamnes: To compare albedo with that of AVHRR, we need a footprint of 1km. Is the area of measurement spatially homogeneous?

A.: Yes, it is. Also, albedo is measured at various places. The measurements are consistent.

 

 

L9. Development of a Spectral Index Optimized for the GLI Instrument Michel Verstraete Space Application Institute

  1. New perspectives on vegetation indices and spectral indices to estimate FAPAR.
  2. Index optimized for the application.
  3. No justification for using GLI to estimate NDVI.
  4. NDVI is dependent on bands, gain, spectral response etc. Therefore, NDVI from different instruments cannot be compared.
  5. Optimization of spectral index for GLI.
  6. GLI VI is a better estimator of FAPAR than NDVI.
  7. NDVI predicted and GLI VI predicted FAPAR has different values for some regions.

Questions or comments/Answers (Q&A):

Q: Prata: Why use the term vegetation index? Others may use it in the future to estimate vegetation fraction.

A.: Good question; the word index is used to signify a parametric approach.

Q: What is NDVI good for?

A.: If you are interested in a variable, then design an index for estimation of the variable.

Q: Stamnes: How do you validate FAPAR in the field?

A.: We do need FAPAR. However, it cannot be measured in the field directly. One approach would be detailed measurements of the physical characteristics of the plant combined with a model.

Q: Could PAR measurement suffice?

A.: No, as PAR varies with depth in the canopy and with multiple reflections in the canopy.

Q: How about aerosol effects on atmospheric correction?

A.: They can be used in the forward models and a look-up table is generated.

Q: Huete: Iso-lines cannot be simulated. How sensitive is your method to a different RT model, BRDF model?

A.: If we used a ray-tracing model we would get a better answer. The understanding of the environment is all contained in the data.

Q: There is no reason why the NDVI-derived FAPAR is correct and the GLI VI-derived FAPAR is incorrect. A measurement would have settled this matter.

A.: I have not measured the FAPAR. I have shown that there is a rational way to test these differences.

Q: Pinker: How can you get more than one piece of information in the case of FAPAR (amount absorbed and amount incoming) from one index?

A.: discussion off-line.

 

 

L10. Geometric Correction of GLI Images Toshiaki Hashimoto Chiba U.

  1. Geometric correction of GLI images for errors in satellite position, attitude, and alignment.
  2. Accuracy of GLI is as good or better than OCTS.
  3. Final product contains channel data, solar zenith and azimuth angles.

Questions or comments/Answers (Q&A):

Q: Prata: A request: Please use nearest neighbor so as to preserve radiometric accuracy.

A.: Shimoda: The nearest neighbor may not be the best. Band-to-band registration is important. The effect increases with tilt operation. We may need more simulations to evaluate the sampling method.
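The trade-off under discussion can be sketched minimally: nearest-neighbor resampling returns only original pixel values (preserving radiometry), whereas interpolation blends neighboring pixels. The image and coordinates below are invented:

```python
# Hedged sketch: nearest-neighbor resampling keeps original radiometric
# values, since each output sample is an unmodified input pixel.
def nearest_neighbor(image, row_f, col_f):
    r = int(round(row_f))
    c = int(round(col_f))
    return image[r][c]

def bilinear_2x2(image, row_f, col_f):
    # Bilinear interpolation over a 2x2 neighborhood mixes pixel values.
    r, c = int(row_f), int(col_f)
    dr, dc = row_f - r, col_f - c
    return ((1 - dr) * (1 - dc) * image[r][c] + (1 - dr) * dc * image[r][c + 1]
            + dr * (1 - dc) * image[r + 1][c] + dr * dc * image[r + 1][c + 1])

img = [[10, 20], [30, 40]]
nn = nearest_neighbor(img, 0.4, 0.6)   # always one of the original values
bl = bilinear_2x2(img, 0.4, 0.6)       # generally a new, blended value
```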

 

GLI WORKSHOP

September 11, 1998

 

  1. Atmosphere Science Group

Session chaired by Terry Nakajima

  1. Reviewed algorithm flowchart.
  2. L1B (radiometrically-corrected radiance) produces L1B match-up, L2-pp, and L2A products.

    A pixel mapping matrix will be prepared to provide addresses of pixels in the L2A_AO data set. We don't need to process a whole orbit in order to get longitude and latitude.

    Q: Isaka: Is that the separated data or nearest neighbor?

    A.: It's nearest neighbor data.

    L2-pp is a special, user-ordered pixel-by-pixel scene. It is limited to <10% of the total processing.

    NASDA cannot process all data; process 25% in the first year. L2A is a reduced-size AO and LC data set, 380 MB/orbit, 4x4 km resolution.

    L2A produces L2-op, the global operational, pixel by pixel product that is based on physical parameters.

    Q: Rosenfeld: Process 25% of L2A?

    A.: Yes.

    EOC performs L1, L2A processing.

    EORC does L2, L3 processing.

  3. Schedule
  4. Year 1: Process 25% of L2A data.

    Version 1: Launch + 6 months (debugged)

    Version 1.1: Launch + 12 months (calibrated)

    Discussion:

    Q: Isaka: 25% of 4 km data?

    A.: Yes, 25% of L2A data.

    Q: Isaka: L2_pp is 10% of total data or 10% of processed data?

    A.: Maybe 10% of the total computing time.

    Rosenfeld: We can learn from TRMM. They concentrate full data in special areas of interest, e.g., where ground validation data exist (<1% of all data).

    Nakajima: Good idea. Match-up data is processed automatically. There is ground validation in L1B match-up, point-wise (approximately 50 x 50 km). ILAC orders (L2-pp, 1000 km or 500 km x 500 km) will be standard for coastal studies.

    Rosenfeld: Special areas should be flexible. All groups may want ILAC.

    Nakajima: Good idea. The Ocean group wants ILAC for coastal study.

    Pinker: It would be interesting to have a site in common with EOS for validation.

    Nakajima: We need to have input for EOS sites.

    Isaka: We need to have a clear idea of what we can do with that area.

    Nakajima: Yes, we need to discuss that.

    Isaka: For example, aerosol:cloud interaction. We have to carefully select our specific area.

  5. ILAC areas:
  6. Aerosol: cloud interactions on a 500 x 500 km scale are of interest for radiative and precipitation applications.

    Thailand/Indonesia, China/Japan, Amazon, Europe/Atlantic, open ocean (southern and northern hemispheres).

    Discussion:

    Pinker: Choose an area where we can perform ground validation

    Rosenfeld: ADEOS-II is best equipped to study the relation between aerosols and heavy rain. What is the impact of biomass burning by mankind on precipitation and on pollution? For precipitation, I recommend areas that are strongly polluted.

    Nakajima: Open ocean is one issue, but aerosol will not be detected.

    Isaka: You cannot detect the type of aerosol.

    Pinker: Use EOS sites on ADEOS-II data, so we can validate our retrieval algorithms.

    Rosenfeld: ADEOS-II is best equipped to deal with cloud: aerosol interaction and the impact on precipitation. We can study how mankind is affecting precipitation globally.

    Nakajima: Heavy rain or drizzle?

    Rosenfeld: Heavy rain.

    Nakajima: Why don't we use TRMM, SSMI, or AMSR instead?

    Rosenfeld: TRMM's weakness is that it can't get the aerosols.

    Nakajima: But they can get that data from GLI.

    Rosenfeld: But it would have to be coincident. The most important contribution to society and to welfare is to determine the impact of mankind's pollution on precipitation.

    Nakajima: The data management working group asked us for tool requirements; the EOC toolkit will be released this year.

    Rosenfeld: TRMM produces Toolkits. We need data interface toolkits for L1B and L2A.

    Nakajima: The data handling toolkit will be provided by EOC.

    The data management working group requests toolkit requirements and standing order requirements (e.g., entire U.S. L1B or L2A)

  7. (Makoto Kuji? or Kaji Kajiwara?) Data Management: Fujitsu

Handed out a list of Confirmation Items for PI Science Groups (dated 9/9/98).

Discussion:

Terry Nakajima: We may have objections to NCSA-HDF file format.

Isaka: Please clarify L2A_OA file configuration.

Nakajima: Ocean people want to do analysis along the orbit. L2 is along the orbit.

L2-binned corresponds to the atmosphere group's binning. The atmosphere people will go directly to the latitude:longitude grid to do analysis. Hence the L2A_OA segment grid definition.

Q: Knapp: How is orbit overlap at high latitudes handled?

A.: They will know the orbit at that time. There will not be any data loss.

Nakajima: Temporal binning (L3) is for the benefit of the ocean people, not the atmosphere group.

 

General Discussion:

Isaka: The inhomogeneous cloud parameter algorithm works off of the Water Cloud Parameters algorithm.

Terry Nakajima: Cloud screening is every pixel (1 km). If Isaka needs texture analysis, information must go in CFLG.

Q: Rosenfeld: Can we apply to ILAC for a case study?

A.: Yes.

Q: On the topic of clear/cloudy water vapor (1.1 um)- is anyone doing this?

A.: No. If unavailable, we should ask for it in the second R.A.

Q: Nakajima: How can we provide the interface for Dr. Pinker's radiation budget parameter program? A simple one-layer cloud input is currently used. Should we go to a multi-layer input?

Pinker: If you go to a multi-layer input, you have to know where the clouds are.

Nakajima: You produce skin temperature. That is useful for the surface radiation budget. We will propose skin temperature as a new standard product.

Q: Nakajima: Do you need albedo?

A.: Pinker: Version 1 needs it

Isaka: We need that information too.

Q: Nakajima to Knapp: Can you provide albedo? Can you get an albedo product?

A.: I think land people are already doing it.

Unknown: Land people have it as a research product.

Nakajima: If we get BRDF from the land people, we can tune it.

Isaka: We should do that.

Rosenfeld: For PRCP (1 km), we need the brightness temperature for every pixel. We need to eliminate pixels next to other pixels having certain properties.

Q: Nakajima: Do land people have a database for wavelength-dependent albedo and BRDF?

Nakajima: Our biggest problem is that for bright targets, we have no onboard calibration. The lamp is out of range. The blackbody can only be examined in the nadir-looking geometry. For forty (40) minutes we do not have blackbody calibration. We need to strengthen the cal/val effort.

Also of some concern is the small non-linearity due to saturation.

Polarization in channel 1 is < 3%; channel 10 is < 5%, channel 18 is < 3%, other channels are < 2%.

Rosenfeld and Nakajima: Use ILAC to look at the desert.

Q: Nakajima (speaking about the global data acquisition pattern): Do we need to generate a 250 m database (currently a research product)? Should we increase the frequency of the 250 m channel data to one month?

Isaka: The atmosphere group does not need the tilt operation.

Nakajima: For aerosol remote sensing, we need the tilt.

Rosenfeld (speaking about validation): Lidar is useful for validation of cloud top and bottom heights.

Nakajima: Plus we can use cloud radar and sonde. We need to gather such data for validating the cloud model.

Isaka: I can get 8 mm radar data.

Nakajima: Lidar and Cloud Precipitation Radar data will produce particle effective radius. This would be good to mention for the next R.A.

Nakajima: Data flow goes from the EOC to the EORC. L2A plus 10% of L1A data will be available on-line.

Q: Rosenfeld: Final versions of algorithms need to be compatible with system resources. We need to know system resources. What kind of resolution can we use?

A.: Nakajima: Assume every fourth pixel, unless told otherwise. We can discuss this later.

 

  2. Ocean Science Group

Session chaired by M. Kishino

 

  1. Kishino presented algorithm status and pointed out that all algorithms have been submitted and accepted, except K490 algorithm (PI Kishino).
  2. He next presented data flow. He pointed out that Level 1A data has a resolution of 1 km, and L2A is 1 km pixels subsampled at 4 km. EOC will do the processing, and 10% of the data will be transferred to EORC.
  3. Program update:

For Level 1:

  1. V0 should be ready by launch. This version shall be submitted to EOC for routine processing.
  2. V1: ready at L+6months; program goes to EOC.
  3. V1.2 will be processed at EORC.
  4. Version 2 ready by L+18m; program goes to EOC.
  5. Version 3 ready by L+30m; program goes to EOC.

For Level 2 and 3 generation:

  1. 25% of research and cal/val products will be processed at EORC at L + 18m
  2. 50% at L + 24m
  3. 100% at L + 30m

Discussion:

Q: Barton: How is the 25% data selected, geographically or temporally?

A.: Every fourth pixel.

Q: Abbott: How about the other sensors, will they follow the pattern for processing?

A.: AMSR data is much smaller. All data are processed.

Q: Abbott: The experience with SeaWiFS is that they are doing a lot of reprocessing with coefficient changes. What is the definition of the versions?

A.: Details are not decided yet. Sometimes a new version can be just a change in calibration coefficients.

Q: Abbott: We need to size for a lot of reprocessing, especially when the match-up data will be available after launch. If we stay with the original SeaWiFS processing, the data is useless now, due to sensor degradation.

A.: OCTS is doing the same thing. EORC is supporting OCTS research. EOC is doing the standard products, and EORC is performing the third reprocessing.

Q: Abbott: We need to make sure that the data we release is well documented with respect to data quality.

 

  1. Moriyama presented the data format and QC flag. Currently, the QC flag is 2 bytes. He also presented Barton's proposed QC flag. Barton said his proposed flag is only an initial guess, and the follow-on discussion focused on the flag proposed by Moriyama.
  2. Discussion:

    Q: When do we need to fix the format?

    A.: Not sure. Critical point is data size.

    Fukushima: Fujitsu has to fix the data format and binning algorithm by March 1999. I proposed that we fix the Level 2 format by December 1998.

    Q: Abbott: The flags are 16 bits. Each PI needs to have input to the QC flags. To do the final QC flag specification by December is ambitious.

    Kishino: For atmospheric correction, 16 bits are needed.

    Abbott: There are other QC flags which depend on the specific algorithm; for example, K490 and chlorophyll may need 2-3 flags each. For MODIS, there are step-one flags, which carry QC quantities during run time. The step-two QC flags are post-processing. We should adopt the MODIS philosophy to get the data out quickly, but with QC flags, so that the data will not be misused.

    Frouin: I am not sure people will look at the QC flags.

    Abbott: That is not my problem. I have a student who uses SeaWIFS data who looks at the flags.

     

    Tilt Operation

  3. Fukushima presented a sheet summarizing the questions on tilt strategy. He pointed out that EOC understands that the tilt will operate as a standard pattern, to be decided by the Land and Ocean groups. EOC needs to know how often "exceptional" tilts will be conducted. It was agreed that GLI scientist T. Nakajima should be in charge of changes/adjustments.
  4. Kishino showed four scenarios for tilt
  1. land mode (100% no tilt)
  2. ocean mode (100% tilt)
  3. mixed mode (25% non-tilt, 75% tilt)
  4. Campaign mode (to support field experiments)

GLI is designed for a maximum of 4 tilts per orbit, and a tilt operation takes 23 seconds. The data during a tilt operation cannot be used, as the sensor is looking at the inside of the instrument. This means that about 600 km of satellite path will be missed during tilt operations.
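The ~600 km figure is consistent with rough orbital arithmetic; the ground-track speed below is an assumed typical low-Earth-orbit value, not an ADEOS-II specification:

```python
# Hedged arithmetic check of the data loss during tilt operations.
# ground_speed_km_s is an assumed typical LEO ground-track speed.
ground_speed_km_s = 6.7
tilt_duration_s = 23        # per tilt, from the minutes
max_tilts_per_orbit = 4     # from the minutes

lost_per_tilt_km = ground_speed_km_s * tilt_duration_s       # roughly 150 km
lost_per_orbit_km = lost_per_tilt_km * max_tilts_per_orbit   # roughly 600 km
```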

Discussion:

Abbott: Huete indicated he has no problem doing VI with tilt. We should ask the land group to reconsider their needs for no tilt.

Fukushima: The Japanese land group gave us 10 reasons for no tilt.

Shimoda: For the 250 m pixels, registration is a big problem for the land group.

 

  1. Moriyama reported on some problem on non-black body observation under tilt operation. Since there is no internal calibration during tilt, it is expected that there will be bias between tilt and non-tilt operation for Level 1B and 2A products. The error will propagate from L1B/2A. It is therefore important to establish ground validation for bias correction. He requested that the scope of the sensor and calibration team be clearly defined.
  2. Discussion:

    Barton: I requested that NASDA engineers perform a full thermal analysis of the tilt and non-tilt modes, and that during tilts occurring near the equator, black-body calibrations be made for 4-5 seconds during the equatorial crossing.

    Fujitsu engineer: For this type of operation, there may be a problem for ground reception sites. The downlinked data may be difficult to process.

    Barton: You still need to do calibration.

    Q: Park: How many ground stations do we have?

    A.: about 3-4.

    Barton went up to the board and drew a calibration scheme for use during tilt.

    Fukushima: We will ask Fujitsu whether they can accommodate this kind of processing.

    Tanaka: I want to make a plea for the IOCCG (International Ocean Colour Coordinating Group). We have heard comments that there will be too many ocean color instruments flying by the year 2000: at least MODIS, SeaWiFS, GLI, and MODIS-PM afterwards. Three satellites can cover 60% of the globe in 3 days. There is therefore no problem with the tilt; you can use data from a different satellite. Please consider using data from different satellites.

     

  3. Shimoda reported on Cal/Val activities. He distributed a sheet on OCTS processing and specifics on version changes and showed some evaluation results.

Discussion:

Q: Tanaka: Are the validation points on your graph taken at the same location?

A.: No. They are distributed geographically.

Abbott: Your results are good. There is a SIMBIOS meeting in September that will report some comparison results. There will be more reprocessing. The results show agreement to about 20% in pigment concentration and 10% in radiance. MOBY (Marine Optical Buoy) will still be collecting data that can be used for validation.

Fukushima: I have received a copy of the new version of SeaDAS. My student tested it and found that the results changed substantially.

Abbott: The new version accounts for sea foam, and Howard Gordon (U. Miami) thinks they can extend the retrieval to higher wind speed regions.

Frouin: We will take a look and have more information later.

Abbott: The new version includes foam, sensor degradation, and breaking waves corrections.

 

  1. The cal/val plan was presented. It includes Case 1 (open ocean) water sites in the global ocean and coastal sites, mostly near Japan.
  2. Discussion:

    Abbott: There are additional validation sites such as MOBY, and North Africa to study dust which should be included. I want to talk about the data policy (ownership issue).

    Frouin: There is an effort to put instrumentation on merchant ships. This is a nice addition to research vessels and fixed buoys for cal/val measurements.

    Taguchi: Should include time series data.

    Barton: People have deployed instruments for MODIS validation. We had a radiometer round-robin in March in Miami. The data are on the web site.

    In Australia, we have an on-going validation campaign that we plan to continue. We get daily data from three sites. Bill Emery of Colorado has a contract from the Navy to develop instruments for ships of opportunity.

    Fukushima: I want to stress the importance of buoy data. I rely only on buoy data and need MOBY for GLI post-launch.

    Abbott: MOBY is funded through Dennis Clark under the MODIS effort through year 2001. JGOFS has an additional mooring site. NASA is interested in developing cheaper drifters.

    Taguchi: Can I summarize as follows? We need a shipping plan, coordination with ships of opportunity, cheap instrumentation, and cooperation with MODIS.

    Frouin: POLDER is also an ocean color instrument on ADEOS-II. There is not enough coordination between GLI and POLDER cal/val activities.

    Fukushima: We should recommend more interaction.

    Abbott: On data policy, it is important that the ground-truth people who contribute GT data get the satellite data back quickly.

     

  3. Kishino presented problems with sensor characterization. GLI has non-linearity and polarization. For channel 1 (389 nm), the error is <3%; <5% for channel 2 (625 nm); <3% for channel 18 (865 nm); and <2% for other channels.

Discussion:

Q: Frouin: These values are scary. Is there a plan in the atmospheric group to correct for polarization?

A.: Fukushima: Right now there is no plan. I asked Fujitsu for the sensor filter functions and got them. I will talk to Howard Gordon. I have passed Howard's paper on MODIS calibration to T. Nakajima and asked Fujitsu to learn about the correction.

Abbott: These numbers are troublesome. We need to watch carefully how they are corrected. We have a cross-talk problem in MODIS. We should have Fujitsu engineers talk to MODIS engineers.

Kishino: We recommend a study of the black-body target and the cross-talk problem. We will discuss the request for 250 m channel data, the request for receiving stations, and the global data acquisition pattern in the general meeting.

 

  3. Land Science Group

 

Honda

 

  1. Nakajima's discussion points:
    1. Flow chart of land surface products
    2. Sensor characterization
      1. small non-linearity due to saturation
      2. polarization
      3. calibration
    3. Requests for 250m data
      1. requests for receiving stations: 40 scenes/day; 4 global images/year
      2. global data acquisition pattern: one month? 250m products?
    4. Tilt operation pattern: combination of tilt and non-tilt
      1. when mirror is moving, no data acquisition
      2. coastal zone, not resolved
    5. Validation plan
  2. Algorithm WG discussion
  3. Data management WG
  4. Program updates

Discussion:

Verstraete: In 2000, there will be a lot of sensors in space. If GLI does not produce 250m data or if there are technical arguments against using 250m data, no one will use the data.

Chin: 250m data is useful for coastal zone studies

Huete: 250m data is essential for advanced studies.

Verstraete: Availability? Broadcasting antenna so investigators can acquire data locally. Like AVHRR. If data not acquired, it is lost forever.

Chin: Receiving station in Singapore can acquire all SE Asia

Saito: Power restrictions on board the satellite dictate some of these concerns.

Verstraete: Some problems from European satellites have been overcome using on-board memory.

Honda: Request for receiving stations.

Highest priority target areas should go to calibration and validation sites. Second priority will go to field experiment sites beyond 2000. Third priority: tropical forests, coastal zones. Fourth priority: Other missions. Need to make contacts with IGBP.

 

Direct comparison between vegetation indices derived from GLI and other sensors with sensitivity to vegetation: MERIS, ASTER, VEGETATION (1km), LANDSAT-7 etc.

 

Team should prepare a journal article on the use of 250m data.

The land-cover change community has determined that 250m is a threshold; data coarser than 250m will not be as useful for such studies.

Channels have been designed to correspond to LANDSAT TM.

 

Products for 250m: All land surface products. Critical: Land-cover change.

Continuity from historical TM results.

 

NASDA can still change channels. It is difficult to keep all 36 channels.

35 GB for one 10-channel global composite at 1 km spatial resolution.

 

Discussion on how the data can be saved so that atmospheric corrections can be applied after compositing: the atmospheric data must be preserved so that, after compositing, corrections can still be carried out.

Two modes of operation: (1) global (2) on-demand processing.

Standard products are in the global operation.

Research products will follow on-demand flow.

Atmospheric correction is still not decided. Many products need atmospheric corrections. Either carry out the atmospheric corrections completely or not at all. Partial corrections may be dangerous.

Apply atmospheric corrections and mosaicking after the research and standard products.

The user would desire a good product irrespective of how the land surface team member designed her/his algorithm with/without atmospheric corrections.

Validation plan is very important.

Dr. Taguchi will send workshop attendees a copy of the validation plan.

Please return it with your modifications to Drs. Taguchi, Tateishi, and Saito.

NASDA will distribute the RA for validation in November. At that time, we will incorporate the modified validation plan into the RA.

 

  1. Cryosphere

See Reports of the Science Groups in Joint Session (below)

 

GAIT Session

 

Takashi Nakajima:

  1. General flow of data and algorithm modules: handout provided.
  2. A draft plan exists for implementation of algorithms.
  3. It will take 3 or 4 months to implement algorithms.
  4. Optimization, local testing, installation into the EORC analysis system will occur July-October, 1999.

Inoue (Fujitsu): The program will proceed by:

  1. Analysis of module structure, module interface, and variables.
  2. Output data and performance will be checked.
  3. Code will be returned to PI.

Discussion:

Fukushima: If there is no need for the PI to change code, then there will be no further interaction.

Q: Prata: Will the GAIT rewrite any PI code?

A.: We won't change the computer language. C or Fortran are acceptable.

Q: Fukushima: Do you change the name of variables?

A.: No.

Q: Long Chiu: Is the input/output fixed for the algorithm?

A.: Fukushima: The core part is mostly the same. I/O is initially the same. Later, the toolkit may change the I/O.

Q: Fukushima: Are there any other interactions between the PI and the implementation team?

A.: Fukushima and Inoue: No. The PI does not have to worry about the AVS.

Q: Barton: I would like the guidance from NASDA. Spatially averaged products are useful. What averaging? For example, if I make a water vapor product, I want to know the best way to interface with AMSR.

A.: That question is outside the implementation team's scope.

Q: Can we get product information from all ADEOS-II instruments, including spatial and temporal resolution?

A.: Takashi Nakajima: Spatial resolution should be defined by the PI, not by the GAIT.

Barton: We PIs would like more information in the future.

A.: Fukushima: I acknowledge your suggestion.

Q: Verstraete: We depend on other algorithms and want to know their make-up. Can we access other ATBDs?

Takashi Nakajima, Fukushima, and Saito: Can we post your algorithm on the home page?

A.: PIs: Sure.

Saito: We will post it on our home page.

Fukushima: The home page has a PI's door (password protection). We will open documents in a few months. If algorithm descriptions are compiled by Fujitsu, we will discuss making them available.

Q: Prata: Can the GAIT provide software modules common to many PIs?

Verstraete: For example, we may have different applications, but the input data is processed the same way. This is an integration issue.

Fukushima: For what phase do you need this?

Verstraete: We need to view inputs.

Takashi Nakajima: To view inputs, see Fujitsu documentation. Tell us your toolkit requirements.

Verstraete: Data formats and product specification descriptions.

Fukushima: The L1B product description is available.

Honda: The Japanese language version is ready. We will translate it.

PIs: Put it on the web.

Honda: Yes.

Q: Prata: What is in L1B?

A.: Takashi Nakajima: It includes tilt, radiance and calibration information.

Q: Zege: Noise information?

A.: Fukushima: Specifications for noise are available on the web. Real noise is not available.

Pinker: GAIT testing: PIs should provide the GAIT with input requirements for our modules. Then the GAIT should prepare one interface for all modules to run everything. You may be interested in running modules at different spatial scales.

Fukushima: You suggest each PI submit I/O parameters. Fujitsu did a questionnaire for ancillary data, but that doesn't cover everything. I foresee the GAIT will do such work.

Verstraete: Land and atmosphere groups need more interaction.

Fukushima: We encourage interaction. The GAIT is not involved in implementation issues, so it is hard to include that interaction now. It should be a future activity.

Isaka: For example, cloud parameter retrieval may need albedo or BRDF, so that interaction is important.

Fukushima: GAIT just received your documents and is beginning to study them.

 

 

Reports of the Science Groups in Joint Session:

 

  1. Atmosphere Summary
  1. Algorithm development chart presented.
  2. Data Management Working Group requests tool requirements for L1B, L2A
  3. ILAC areas needed for cloud-aerosol interaction studies.
  4. Can we reduce sensor non-linearity?
  5. Lack of calibration for bright targets; a large cal/val effort is needed, e.g., desert and sky measurements.
  6. Tilting geometry and scenario must be planned. Atmosphere group prefers nadir view, but has no significant problem with tilt mode.
  7. Data flow
  1. PP (pixel-to-pixel) cloud screening uses Earth texture analysis.
  2. PRCP needs fast segment analysis.
  3. ERB produces skin temperature.
  4. ERB and aerosol need BRDF from the land, ground albedo, and a wavelength-dependent albedo database.
  5. Cloud requires cloud-top height and cloud-bottom height.
  1. Want version 2 in launch + 12 months.
  2. Validation site handout was provided.
  3. Implementation of validation results was described in the handout.

Discussion:

Q: Frouin: All of these sites are land sites or coastal sites. What is the plan over oceans?

A.: You are right. Murai research vessels and commercial vessels carry sky radiometers, mostly for aerosol studies.

 

  1. Ocean Summary

Murakami:

  1. Presented algorithm status.
  2. Sensor polarization for channels 1, 10, and 18 is large. We will study whether there is an impact for the ocean color atmospheric correction.
  3. Bright target recovery is a concern due to the calibration limitation.
  4. Crosstalk between sensors is an issue.
  5. The blackbody cannot be viewed during tilt operation. We suggest using the blackbody just before and just after tilt operation. We suggest stopping at nadir as the tilt changes between -20 and +20 degrees.
  6. Shimada presented OCTS cal/val status.
  7. Abbott presented SeaWiFS status.
  8. Validation sites for open and coastal waters have been defined and coordinated with the IOCCG and with the Cal/val teams for POLDER, MODIS.

 

  1. Land Summary

Alfredo Huete

  1. Reviewed flowchart; added bands 19, 28, 29.
  2. Four standard products. Some research products were moved to the standard product track.

Q: Terry Nakajima: No impact?

A.: No, we reviewed it with the GAIT.

  1. Problem: Non-linearity must be minimized and/or characterized.
  2. Problem: Lamp calibration limitation causes a bright area problem.
  3. Want 250 m channel data as much as resources allow.
  4. Target area and priorities: want to collaborate with atmosphere group to determine sites.
  5. Want to increase the frequency of the 250 m channel data to once a month. Land cover change is our most important product.
  6. Certain coastal zone areas need nadir (non-tilt) observations.
  7. Each PI will make a validation plan by the end of October.
  8. Standing orders will be submitted in October, 1998.
  9. Validation collaboration with the MODIS land group has been arranged.
  10. The land group will provide the cloud-free ground albedo.

Q: Terry Nakajima: No algorithm problems?

A.: No, we reviewed the flow chart.

 

 

  1. Cryosphere Summary
  1. The large radiances from ice and snow and the calibration limitations imposed by the standard lamp require that we make ground truth measurements periodically to trace changing sensor characteristics.
  2. Tilt operation is not important. We will follow the land group.
  3. Polarization characteristics are not a significant concern.
  4. We want 250 m data from channels 23, 28, and 29 for cloud discrimination. After receipt, these data are averaged to 1 km; then we apply the snow grain-size algorithm.
  5. Receiving stations in Alaska, Kiruna, and Syowa.
  6. We want a global data set once a month. Once every three months is too long.
  7. Pre-launch, we will validate the algorithm using other sensors and our own numerical simulator. We also request performing validation with a different simulator for inter-comparison.
  8. Starting in spring, 2001, we will validate the sensor calibration in Barrow, Alaska at least once a year for three years.
  9. We cannot specify the 1 km channel for L2 now. We will have further discussions with Fujitsu and NASDA.

 
 

  1. Algorithm Development
  2. Discussion:

    Fukushima: Nakajima explained the draft implementation-schedule control plan.

    Inoue: Algorithm file format documents should be available. Interaction in the integration of algorithms is encouraged. Input/output parameter descriptions of each PI's modules should be made available by the GAIT.

    Verstraete: Product descriptions would be more useful.

    Inoue: OK.

    Yoshio Awaya: L1B NCSA HDF data format; 1236 pixels x 1656 lines, 26 scenes per path. Details will be available in November, 1998.
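As a quick illustration of the stated scene geometry, the arithmetic below only restates the numbers Awaya gave (1236 pixels x 1656 lines, 26 scenes per path); nothing further is implied about the format:

```python
# Per-scene and per-path pixel counts implied by the stated L1B geometry.
pixels_per_line = 1236
lines_per_scene = 1656
scenes_per_path = 26

pixels_per_scene = pixels_per_line * lines_per_scene   # 2,046,816
pixels_per_path = pixels_per_scene * scenes_per_path   # 53,217,216
print(pixels_per_scene, pixels_per_path)
```

So each path carries on the order of 50 million pixels per channel at L1B, before any multi-channel or multi-byte expansion.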

    Q: Barton: What happens if there is a tilt operation in the scene?

    A.: The scene will be divided into two segments.

    Rosenfeld: We'll need some interface to read this scene.

    Q: Prata: It takes 23 seconds to tilt. What happens to this data?

    A.: Image data will not be available.

    Q: What is PCD?

    A.: Payload Correction Data

    Inoue: Presented definition of L2A_OA (path coverage) and L2A_OC (global coverage).

    Q: Verstraete: No L2A over land?

    A.: Terry Nakajima: L2A covers the land. You cannot get 1 km resolution for all channels. There is linear interpolation.

    Q: Barton: One pixel every 4 km?

    A.: Terry Nakajima: L2A_OA has 4 km resolution.

    Q: Prata: You mean 1 km pixels are sampled every 4 km?

    A.: Yes

    Q: Prata: Why do people talk about 250 m resolution or 2 km?

    A.: Nakajima: Because of on-board memory constraints, 250 m channel data is stored every 2 km.
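The subsampling described in the answers above (L2A_OA retaining one 1 km pixel every 4 km) can be sketched as follows; the toy grid here is hypothetical and stands in for real GLI data:

```python
# Sketch of the sampling pattern discussed above; toy data only.
import numpy as np

grid_1km = np.arange(16 * 16).reshape(16, 16)  # hypothetical 1 km grid

# L2A_OA: keep one 1 km pixel every 4 km (stride-4 sampling, not averaging).
l2a_oa = grid_1km[::4, ::4]
print(l2a_oa.shape)  # (4, 4)
```

Note the distinction raised in the discussion: this is sampling (one pixel kept per interval), not averaging, which is why the PIs asked for a document describing both schemes.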

    Q: Verstraete: Can we have a standing order for L1B data?

    A.: I think so if we request it.

    Q: Verstraete: Averaging and sampling seem complicated. Is there a document we can look at?

    A.: In Japanese. Maybe we should make a document and describe it in the GLI Handbook that will be released soon.

    Q: Frouin: We need GLI simulator data in the same form as the satellite data to test algorithms. Is there such a provision?

    A.: Use the CD-ROM for GLI simulator data (GSD) distributed at this meeting.

    Q: Prata: It's L1B HDF?

    A.: Not exactly.

    A.: It would be a good idea to generate GSD to test the toolkit and everything.

    Q: Barton: Confirm the number of lines in the data format. Confirm that the data module refers to one complete orbit.

    A.: L1B unit is 1600 x 1600 sections. For L2A, this becomes one orbit.

    Q: Rosenfeld: Contracts terminate at the end of 1998. R.A.s come late in 1999. What is the mechanism for keeping foreign PIs involved? We can't just freeze things until the next R.A.

    A.: Saito: Consult with SAIC.

    Q: Nakajima: How about the budget?

    A.: It hasn't been determined yet. The second R.A. will cover the period from one year prior to launch to two years after launch.

    Q: Terry Nakajima: Can the GLI team keep running?

    A.: Saito: We hope we will have an answer soon.

    Q: Terry Nakajima: There may be a one year gap. That's no good. Do we have to judge for ourselves whether we pass the algorithm cut?

    A.: I understand your concern.

    Q: Terry Nakajima: When do you give the evaluation results and the list of people who can continue?

    A.: Saito: At the end of November. I will tell you informally as soon as possible.

     

  3. Closing Remarks
  1. Professor Sumi: ADEOS-II Scientist

Some hardware and software concerns exist. The project is challenging. GLI is an extension of our experience with these types of sensors. We need to think about collaboration with international programs.

  1. Okuda: Director of EORC

EORC will distribute the scientific data to the scientists.