libRadtran discussion page

On this page you may complain about bugs, about mysterious or needless error messages or warnings etc. There is a good chance that a libRadtran developer will fix the problem ;-)

Discussion

2008/04/17 12h

ok, somebody has to be the first one:

Here is one example of a number of warnings which might have a good intention but may leave the user with the wrong impression: I use a thermal_bands_file and output sum. uvspec complains:

   Warning, 'output sum' might not be appropriate for spectral calculations
   the sum of the output over wavelength is only the integrated flux 
   if there are no gaps between the (wavelength or wavenumber) bands.
 
   Please make sure that this is what you really want!

Yes, this is what I really want. But as a standard user I would be unnerved a bit. I probably would try to use e.g. “output integrate” and the warning would vanish. My conclusion would be that “output integrate” is more correct, which is certainly the wrong conclusion. So here I originally selected the only really correct option, but I am discouraged from using it.

A similar warning is e.g. “integration in unknown space” or so, which the user gets when he tries to integrate spectral irradiance. As in the above example I am not sure if I can do the calculation correctly without getting a warning. I also think that the average user does not really know what he is supposed to do when the model tells him he is in “unknown space”. 1D space instead of 3D? Parallel worlds?

2008/04/20 18h

I think the first one is a useful warning, as the user might make serious errors here. There was no checking at all in the first place; e.g. the user was allowed to use output integrate in the thermal range, where the output is in W/(m2 cm-1), and other combinations which produced wrong results without any warning or error message. And if the user reads the warning, it is quite clear what it is about. The warning can be switched off by saying quiet. Or the test could be written more precisely, as the wavelength bands are known.

The second message was only there by accident. libRadtran was updated in order to be able to integrate in wavenumber space. Before submitting, the following message
“debugging: integration in unknown space”
was by mistake not deleted. It has now been changed to a verbose message with the following text:
*** wavelength or wavenumber integration of the simulation result

2008/04/17 12h
 Warning: nphase not found in netcdf file ../data/mie/ch016.001.mie.cdf. Assume old 
          dataformat, where only phase function moments are included. 
Hmm, am I doing something wrong here? I am using only stuff which comes with the model, but uvspec suggests that I need to upgrade something. Is it worth spending a lot of effort on upgrading?

Again, this is a somewhat discouraging warning. Shouldn't it only be issued if a polarization-dependent solver has been chosen?

2008/05/19 13h

Fixed. Now the warning is only displayed if polarization is calculated.

2008/04/17 12h
  WARNING: the z-grid of ../data/atmmod/tigr/tigr0420.dat
           shows larger deviation concerning the hydrostatic equation!
Hmm, what am I supposed to do? I mean, yes, there are some deviations from the hydrostatic equation but uvspec currently doesn't even assume hydrostatic equilibrium within layers (rather it assumes exponential pressure and density and linear temperature). As this has no consequence for the radiative transfer, the warning should probably only occur when verbose is switched on.

Along these lines: uvspec currently checks if the ideal gas law is fulfilled (which is necessary because uvspec uses both pressure and density and therefore both need to be consistent); the hydrostatic equilibrium is also checked, but a violation currently has no effect on the result. If we go on, we could also check if the atmosphere is in radiative equilibrium (heating rates == 0) or in radiative-convective equilibrium or …
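To make the two existing checks a bit more concrete, here is a minimal sketch; variable names and units are illustrative only, not the actual uvspec code:

 #include <math.h>

 /* sketch only: ideal gas law, p = n k T, with n the air number density in m-3 */
 double ideal_gas_deviation (double p_Pa, double n_m3, double T_K)
 {
   const double boltzmann = 1.3806e-23;         /* J/K */
   double p_ideal = n_m3 * boltzmann * T_K;     /* Pa  */

   return fabs (p_Pa - p_ideal) / p_Pa;         /* relative deviation */
 }

 /* sketch only: hydrostatic equation, dp/dz = -rho g, checked layer by layer */
 double hydrostatic_deviation (double p_bottom_Pa, double p_top_Pa,
                               double rho_mean, double dz_m)
 {
   const double g = 9.81;                       /* m/s2 */
   double dp_hydro = rho_mean * g * dz_m;       /* expected pressure decrease */

   return fabs ((p_bottom_Pa - p_top_Pa) - dp_hydro) / dp_hydro;
 }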

2008/04/20 18h

> Hmm, what am I supposed to do?
Just use the radiosonde option, and let libRadtran calculate the z-grid for you.

> As this has no consequence for the radiative transfer
Not for the radiative transfer, but for one result: heating rates. As the layer height is used to calculate the extinction coefficients, it is critical to have the right layer height. This is a useful check.

2008/04/19 15h

And yet another one on the topic of overly critical warnings:

Hello Uli,

my MYSTIC user in Utah complains that the model occasionally aborts because the 1D and 3D levels are supposedly not identical. But they are. The reason for the abort is probably the following:


revision 1.32 date: 2008-03-05 12:17:51 +0100; author: ulrich; state: Exp; lines: +10 -3

Additional check for Monte Carlo clouds: previously it was only checked whether the number of levels of the atmosphere and the 3D clouds were identical. Now the levels themselves are also checked for equality.


That is actually not necessary, since it is checked in MYSTIC anyway. Now it is checked twice, and differently at that. In MYSTIC I test whether the relative difference between the 1D and 3D levels is smaller than 1e-6, and that works. In cloud3d.c you test whether the absolute difference is smaller than 1 mm, and that does not work well. At z=120 km, 1 mm corresponds to a relative difference of about 1e-8, which unfortunately is below the precision of floats - so the test is bound to fail rather often.

Can the test be removed again, or is there a particular reason why it is there?
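For reference, a relative comparison of the kind used in MYSTIC looks roughly like this (just a sketch with illustrative names, not the actual code):

 #include <math.h>

 /* sketch only: compare two levels with a relative rather than an absolute
    tolerance, so that float precision at z=120 km is not exceeded */
 int levels_differ (float z1d, float z3d)
 {
   const float reltol = 1e-6;   /* relative tolerance, as used in MYSTIC */

   return fabsf (z1d - z3d) > reltol * fabsf (z1d);
 }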

2008/04/20 18h

Test removed. There was a reason why the test was at this place: tests should be performed as early as possible in the code, so that not too much is computed unnecessarily before aborting. As you pointed out above, however, the way the test was carried out was not very well done.

2008/05/20 10h

I cannot quite agree with the “as early as possible” argument:

1. A typical MYSTIC run takes several seconds to hours - so it does not really matter whether the run is aborted after 1/100 or after 1/10 s.

2. In this case it is even essential to test only directly in MYSTIC, because after the test in cloud3d.c additional levels might still be added by some regridding, and that would be fatal in MYSTIC.

2008/06/05 12h

OK

2008/04/21 08h

The twostream upward flux (eup) deviates by 17% (solar integrated: 190 W/m2) from the same calculation with disort if the albedo is 1.0 (and deviates only slightly when the albedo is zero). 190 W/m2 is quite a lot; maybe there is a hidden bug here.

2008/06/13 09h

Arve checked the twostr code. The deviations are due to the simplifications made by the two-stream approximation and probably cannot be avoided. Be aware that they might be of the order of 10 to 15% of the fluxes.

2008/04/21 09h

Usually disort with 2 streams is not allowed, as it is not optimised for calculation speed. But if this check is removed, the combination produces only NaN as output. (As this is not a usual combination, it is not so serious, but the possibility would be nice in order to check the twostream issue one entry above.)

2008/04/21 09h

The independent pixel mode for 1D solvers with the option wc_ipa does not work, and the equivalent option for ice clouds, ic_ipa, does not exist.

2008/04/22 10h

A Windows XP user reported that empty lines added in the input file (independent of the editor: Editor, WordPad or Editpad Lite) produce empty lines in the output file. This is probably due to a conversion problem for the line-break character. Looking at the output file with emacs, the empty lines are displayed as ^M, which is read by the tool integrate as an additional column in the first line and causes an error message.

2008/04/23 15h

Yes, I know. Claudia had the same report from a user and we were able to trace it down, but not to fix it: The uvspec input file is processed through uvspec_lex.l which basically takes away everything “it understands” and passes the remaining stuff to stdout. Unix systems represent line breaks with the ASCII linefeed character. DOS/Windows based systems combine both the ASCII carriage-return and linefeed characters. Obviously, under those Windows systems, uvspec_lex.l digests the linefeed but leaves the carriage-return untouched which thus goes to stdout.

Probably somewhere in uvspec_lex.l or uvspec_lex.c there is a list of ASCII characters which uvspec “understands”, which probably includes all letters, numbers, blanks, tabs, linefeeds, etc.; everything which is not included is not interpreted but copied to stdout (remember that the input file is piped to uvspec, which takes everything it knows and leaves the remainder untouched). That way, one carriage return is left over from every line in the input file.

I guess if everything is generated under Windows, including flex .l → .c, the carriage return / linefeed issue should be properly addressed. Is it possible that the problem exists only when flex is done under Linux while the following compilation is done under Windows?
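One possible fix (untested, only a sketch) would be an explicit rule in uvspec_lex.l that swallows the carriage return instead of passing it on to stdout, along the lines of

 \r      { /* silently discard DOS/Windows carriage returns */ }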

2008/04/25 10h

Perhaps we should put this issue on the libRadtran homepage under 'known bugs'. Maybe some user would like to fix it?

2008/05/09 15h

Every time I run make check I get about 917 uvspec tests that fail on mac (about 189 belong to sos). This is rather annoying. Now, by setting $limit=0.01 (default 0.00001) and $maxdiff=0.02 (default 0.001) all error messages are gone. But these values are a bit large. It is rather difficult to write proper tests that ignore all unimportant results (who cares about upwelling radiation for albedo=0.0?), while numerics may give numbers that differ in the tests (well, probably not for this case, but for similar cases). It would be nice to have a better and less worrying display of test results. Any ideas?

2008/05/20 10h

This is a good and important point because users are often afraid that libRadtran does not run correctly on their machines while it actually does. What about changing the test to the values Arve suggested and leaving the limits as they are for the “verbosecheck”? For developers it is important to see even small changes, and I use the “verbosecheck” by default anyway.

For the latter, however, we could also think of a different way of testing. The point is that one gets different results on every machine, and what I would really like is a test which produces 0's unless something has changed. Now, I get 0's only on the one specific machine where I created the output files. Couldn't one have an additional test, “make localcheck” or so, which creates local output files and in subsequent calls compares the model output with the local ones? That way each developer could easily monitor changes in the output. Specifically: 'make localcheck' creates examples/local/UVSPEC_SIMPLE.OUT etc. if they do not yet exist and otherwise compares the model output with the local files. Arve, what about a small extension to your test/test.pl?

2008/05/20 19h

A 'make localcheck' would sort of solve the problem for the developers. However, as a developer, I probably have fewer problems with the failed tests than the typical user. Hence, we should first get the user problem out of the way. As such, changing the values of $limit and $maxdiff would sort of solve the problem for the users. I think, however, we should have individual values for each test since some tests are more robust than others. That is a bit of work, but probably the safest way to do it.

Otherwise I have some libRadtran “administrative” issues I think we should address after the ESASLight meeting in June:

* mv executables to bin directory? COMMENT: This should be done to make libRadtran more standard in folder naming. Who does what?

* mv everything in tools to src_c? and rm tools? COMMENT: Once again to make things more standard and familiar. Who does what?

* get a 'make install' and 'make uninstall' working.

* doxygen. COMMENT: This we should do ASAP in order to get into the doxygen way of working. Who and when?

* GUI: python? with Tk? COMMENT: python seems like a nice way to go, and we could build something around uvspec. We need to discuss this and agree on what to do. Well, if a GUI is not needed we have a lot of time…

* cvs → go to svn? COMMENT: Nothing urgent here as cvs is working well. In the long run svn may take over.

And then, to make this forum a multilingual adventure, a little sentence in Norwegian: Været er fint og solen skinner (the weather is fine and the sun is shining).

2008/06/05 12h

When using correlated_k fu, source thermal, twostr, and wavelength_index 4 (I know, there is hardly any thermal radiation), uvspec produces a segmentation fault in hopsol, twostr, solve_rte.c(3207). Can we avoid this segmentation fault?

2008/06/12 10h

I have been playing around a bit in order to create a user-friendly uvspec_error() function, but gave up. My feeling is that it is close to impossible to create something which looks good in the code and creates meaningful output. The main difficulty is that we want to retain the full functionality of printf in the arguments to uvspec_error(). That is, in the code something like the following should be possible:

 uvspec_error (
  "Error %d - altitude %f smaller than 0\n", status, z,
  "Please provide a positive number instead!\n");

which would then be passed to several printf's in the uvspec_error() function. With some acrobatics this should be possible. stdarg.h allows functions with a variable number of arguments (printf is actually an example of such a function) - if you are curious, check “man va_arg”. And somehow one might even be able to guess which argument to uvspec_error() is a format string and which is an argument to printf (this is not trivial because there is no way in the stdarg functions to determine the type of an argument). Even if one succeeded in doing so, one would lose one very valuable thing: the compiler gives a warning if the number of arguments of printf is not correct, but I have no idea how one could keep this functionality for uvspec_error() - the consequence would be segmentation faults and weird output instead of compiler warnings.

Interestingly, stuff like

   printf("This is a very very very %s "
   "long string going over %d lines\n", "very", 2);

works, even if compiled with -pedantic -ansi -Wall ! That way, passing everything to uvspec_error() would be simpler. In uvspec_error() one could then do something like (this is only pseudo-code):

 ... count the number of arguments, n
 switch (n) {
 case 1:
   printf (arg[0]);
   break;
 case 2:
   printf (arg[0], arg[1]);
   break;
 case 3:
   printf (arg[0], arg[1], arg[2]);
   break;
 ...
 case infty:
   printf (arg[0], arg[1], arg[2], ...);
   break;
 }

Not very nice but possible (and it might generate big source code which is good once we calculate our efficiency in terms of number of new source code lines / ESA money spent :-). But again, it would not check the number of arguments during compile time. Is there a way?

An alternative is of course to

 sprintf (errortext, "...");
 uvspec_error (errortext);

This is not much of an improvement over saying

 fprintf (stderr, "...");
 assert(0);
 return status;

We probably should stay with the last version unless one of you comes up with a really good idea.
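Just for the record, if we ever restrict uvspec_error() to a single printf-style format string, a sketch could look like the following (illustrative only, not actual libRadtran code); with gcc the format attribute would even keep the compile-time check of the argument list:

 #include <stdio.h>
 #include <stdarg.h>

 /* sketch only: uvspec_error() restricted to one printf-style format string;
    the gcc format attribute restores the compile-time argument check */
 void uvspec_error (const char *fmt, ...) __attribute__ ((format (printf, 1, 2)));

 void uvspec_error (const char *fmt, ...)
 {
   va_list args;

   va_start (args, fmt);
   fprintf  (stderr, "*** Error: ");
   vfprintf (stderr, fmt, args);    /* forward the variable argument list */
   va_end   (args);
 }

 /* usage: uvspec_error ("altitude %f smaller than 0, please provide a positive number\n", z); */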

2008/06/18 14h

Funny idea to reply to one's own discussion point :-) Anyway:

I tried the first implementation of assert(). It works, but it requires careful programmers. The point is that assert(0) not only writes an error message to stderr, but it also stops the program by calling abort(). If one does something like

 if (status!=0) {
   fprintf (stderr, "Error %d\n", status);
   assert(0);
 }

the program will stop if an error is encountered. If, however, one does #define NDEBUG to produce the distribution, the program will only produce the error message but won't stop anymore. Therefore one needs

 if (status!=0) {
   fprintf (stderr, "Error %d\n", status);
   assert(0);
   return status;
 }

The “return status” is of course easy to forget. In some cases where the programmer forgot the “return status” uvspec might go on and produce wrong numbers.

I also don't really like the idea of aborting the program immediately whenever an error is encountered (they don't like it either: https://wci.llnl.gov/codes/smartlibs/necdc_2004_paper_30Nov04_htmlmaps_2.17.html ). It also takes away the possibility to trace which function called the function where the error was produced. Assume e.g. we had the assert() in arbwvn() - then there would be no chance to find out which function called arbwvn - whether it was the interpolation of the spectral albedo or … whatever. Currently this is easier to trace.

So, what I would really like to have is an assert() which does not abort() but only reports where the error occurred. Seems that this is possible with some preprocessor macros which are available in all modern compilers (C99 standard):

__LINE__, __FILE__, __func__

An example can be seen in libsrc_c/miecalc.c. The macros might not be available in some compilers, therefore configure checks if they are available.
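Roughly, such a report could look like this (only a sketch with an illustrative macro name, not the actual code in miecalc.c):

 #include <stdio.h>

 /* sketch only: report where an error occurred without calling abort(),
    so that the caller can still return an error status */
 #define err_location() fprintf (stderr, "      (in %s, file %s, line %d)\n", \
                                 __func__, __FILE__, __LINE__)

 /* usage:
     if (status!=0) {
       fprintf (stderr, "Error %d returned by arbwvn()\n", status);
       err_location();
       return status;
     }
 */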

2008/06/23 16h

Yeah, that's nice. If you use netCDF functions and get error numbers, there is also a very handy function nc_strerror(status) that converts the number into a human-readable text message.

2008/06/18 14h

Some thoughts on input/output structures:

In uvspec_lex.l there are quite a lot of default settings, too. They might also be input dependent, e.g. if time and lat/lon are given, the atmosphere file is chosen. The atmosphere file name is written into the input structure, which is only possible in uvspec_lex.l. This can't be moved to uvspec_check.

Yes, and I guess that was the reason for having some of this stuff in uvspec_lex.l, because that's the last place where it is allowed to modify the input structure. In principle one should never modify the input structure, and we had only a few exceptions. E.g. nstr was set to 16 if radiances were to be calculated. I didn't check for a while, but as Uli indicates above, a lot of stuff is modified in the input structure now, and I personally think this might actually not belong there. Things like

 if ( fabs (Input.latitude) < 23.0 )
   strcpy (Input.filename[FN_ATMOSPHERE], "tropics");

is no longer a basic change but already involves some educated decisions. And the result might easily depend on other parameters. E.g. one could have a better resolved set of atmospheres (say, one for each degree of latitude) which might be selected by the user. For such things we had the output structure. The original philosophy was to have the input structure, which contained everything defined in the input file, and the output structure, which contained all data that already required some interpretation or interaction between parameters. nstr was one of the first exceptions, and I should probably never have done it that way.

We might think about strictly forbidding any change to the input structure. E.g. if nstr were to be modified, one would have to introduce an output→nstr. We have done it that way for all other parameters. I think nstr was one of the few parameters where we did it differently, probably because I considered it basically a real “default” parameter, in the sense that nstr defaults to 16 for radiances and the user is only allowed

(a) to increase it
(b) to reduce it in case that only fluxes are to be calculated. 

(or maybe it was simply laziness because I didn't want to introduce

output->nstr ;-)

When we think that far, one should go one step further: Now we pass a mixture of “input” and “output” to the solver, and for the programmer it is not necessarily obvious which parameters to use. E.g. we pass to disort

 ...
 &output->atm.nlyr,
 &output->atm.nzout,
 &input.rte.maxumu,
 &input.rte.nstr,
 &input.rte.maxphi,
 ...

atm.nlyr can only be in output because the number of layers is only known after reading all input files (atmosphere_file, wc_file, …). But what about nzout? nzout is already known in the input file and there is the respective parameter nzout in the input structure. How am I supposed to know that I have to use output→atm.nzout and not input.nzout? Why should the model decide that I actually need the output at 4 levels if I asked for 3? This is only to illustrate that it has become rather complex to know which parameter actually describes what. A somewhat safer way would be to have all solver input in the output structure after all setup_???() calls and use only “output” as model input. The advantage would be that, at least at the time when the solver is called, there would be much less ambiguity for the developer. In principle, one should even have three structures:

 user_input (this is the current input structure which contains
             everything defined in the input file without
             interpretation)
 input      (this is the current output structure which contains the
             complete description of the atmosphere)
 output     (this is the part of the current output structure which
             contains the model output and derived results).

That is,

 uvspec_lex -> user_input
 user_input -> setup_???() -> input
 input -> solve_rte, multiply_extraterrestrial() -> output

Wouldn't that be more logical and better to interpret? And I think that's actually not too much work, if carefully done!

There is at least one problem with having all error checking in one place, and that is what to do about files needed by the model, for example atmospheric files. To fully check that they are ok requires reading, checking and closing these files in uvspec_check before they are used. Today this is done in individual routines. Personally I think we should keep that part the way it is today; stuff in uvspec_lex.l should be mv'ed to uvspec_check, which then checks limits and whether files exist.

Yes, please leave it as it is!!! You certainly wouldn't like to read e.g. Mie files or 3D cloud files twice. I don't see a problem with having some checks occur later in the code. Some of them even have to, because for example you first have to read the wc_file before you can check if the wc_properties file contains all the radii which you need. And there is also no point in checking everything right at the beginning, except that in some cases we still know at the beginning which command caused the problem, and the error message can be more user-friendly. We should also avoid avoidable error messages AFTER the solver (e.g. after a 48-hour MYSTIC calculation an error like “Error, cannot multiply with extraterrestrial irradiance at 900nm because spectrum stops at 800nm” :-) But the main thing is to do the check somewhere before the solver call (e.g. in setup_extraterrestrial), not necessarily at the very beginning.

2008/06/19 07h

At the moment the input and output are a bit of a mess. Sometimes input is input, sometimes output is input, output is also output, which it should be, and I would not be surprised to find input as output. The three structures suggested by Bernhard should resolve all this and make life much easier for developers. Such a structure would also make it a lot easier to build a GUI. And for uvspec_check, input is user_input and output is the checked user_input, that is, good old input (got that one??). After uvspec_check we thus only see input as input. I am quite positive about this suggestion. But it requires some dedicated work from someone :-) Any volunteers?

Stuff like:

 if ( fabs (Input.latitude) < 23.0 )
   strcpy (Input.filename[FN_ATMOSPHERE], "tropics");

should never make it into uvspec. This takes control away from the user and makes non-transparent assumptions about what the user actually wants. These constructs belong to user codes that call uvspec. So we should get rid of these things. That also includes the nstr=16 or larger for radiances. Yes, it is stupid to calculate radiances for nstr<16 and to use arctic atmospheres for latitudes < 23. But if the user wants to be stupid, that should be allowed after a warning has been issued. For all we know, the user might have other information and ideas that he/she wants to investigate. In such cases uvspec should allow the user to abuse (in our opinion) the code.

2008/06/23 16h
In principle, one should even have three structures:
/* user_input, input, output */

This sounds like a good structure.

2008/06/19 13h

I agree to nearly everything, except:

Even though uvspec should leave the user the freedom to do whatever he wants to do, it should try to prevent the user from doing complete nonsense, especially if avoiding the nonsense would require quite a lot of knowledge about how the individual solvers work. To know that you need to set a variable nstr to 16 if you want radiances implies that you understand quite a bit about the disort solver. It is probably a bit too strict not to allow nstr<16 for radiances, but it is safer that way.

Concerning latitude < 23 …: There is more than one place where the model decides what's good for the user without telling the user what this may be. E.g. if one switches on “aerosol_default” then uvspec decides that Shettle spring-summer … might be a good choice, without telling the user. There are many more examples like that, and I think it is similar that the model decides to use a tropical profile by default if the user defined a latitude between 0 and 23 degrees. If the user is clever enough to provide a profile, then he is of course free to choose arctic or even Martian profiles for the equator. But if he doesn't, the model might as well make an educated guess, which however must be clearly documented! One disadvantage is of course that there will be discontinuities if the user e.g. calculates irradiance as a function of latitude.

I have a general problem with these latitude/longitude/time options which can only be solved by careful documentation. In principle the user might assume that the complete atmospheric setup is done by latitude, longitude and time, because when these three are known the model could in principle retrieve all required parameters from a huge database. In fact, however, these options have only a few consequences: they only affect the solar position and some default profiles (like 'tropics' above), and they are used to select values from some explicitly defined databases. They do not retrieve the total ozone for this location from the TOMS server, nor do they obtain cloud or aerosol parameters from MODIS, MSG, or whatever. Nevertheless, I am quite sure that we have users who think that

latitude 48
longitude 11
time 2008 06 19 12 00
wavelength 300 400

will provide the correct ultraviolet spectrum for today at noon. The documentation clearly states what these parameters actually do, but who reads documentation anyway?

2008/06/20 06h

Maybe the problem of assigning default values and making assumptions about the user's ideas/plans already starts with UVSPEC_SIMPLE.INP. Yes, it is all more or less in the documentation, but we cannot force anyone to read it. However, we may of course hide behind it and blame any faulty results on the fact that it was all explained in the docs….

To me the above discussion is a rather strong point in favor of a GUI. A well-designed GUI will give the user a better overview, hopefully, without having to read all the nitty-gritty in the documentation.

And I still believe we should go for the user_input, input, output structure suggested by Bernhard.

2008/06/23 15h
but we cannot force anyone to read it.

We can't ?!?!? Quite a lot in libRadtran is very intuitive, and we are improving that more and more. And I think it is intuitive that a tropical profile is chosen when I specify lat<23 degrees. Well, if there are new options like lat/lon/time, one should at least read the manual once before using them, right? ;-) It is clearly stated in the manual what happens when these options are used. (And this is more than for some other default options.) It is also written in the verbose output. So there should be no problem with it.

If the user specifies an atmosphere_file, then the model does not have to make a guess and the specified atmosphere file is of course considered. This means there is still all the flexibility you are used to.

2008/07/14 16h

The default wavelength grid in uvspec is not really optimized, in particular when used together with the “correlated_k lowtran” option. A 1 nm wavelength step is certainly not appropriate, in particular in the thermal IR. The lowtran resolution is 5cm-1, see ~pa2h/develop/new/libRadtran/lowtran.resolution/test.pdf (test0.gle - test9.gle), which shows that the lowtran absorption coefficient is constant over 5cm-1 intervals in the UV/VIS/NIR while it seems to be somehow Planck-weighted in the thermal IR. In the thermal IR the appropriate solution would be to use the center of each 5cm-1 interval,

lambda_i = 1E7/[5(i+0.5)] nm 

where i is an integer. However, the spacing should not be too large because the Planck function also varies within the interval.
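As a small standalone illustration (a sketch, not part of uvspec), these band centers would be computed like this:

 #include <stdio.h>

 /* sketch: wavelengths at the centers of the 5 cm-1 LOWTRAN intervals,
    lambda_i [nm] = 1E7 / (5 (i+0.5)) [cm-1] */
 int main (void)
 {
   const double resolution = 5.0;   /* cm-1 */
   int i;

   for (i=400; i<=405; i++)         /* around 2000 cm-1, i.e. about 5 micrometer */
     printf ("i = %4d   lambda = %9.3f nm\n", i, 1e7 / (resolution * (i + 0.5)));

   return 0;
 }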

In the UV/VIS/NIR this stepping is not practical because e.g. at 300nm the step would be about 0.05nm, which would give too many data points, and the actual resolution of the absorption cross sections is not that high anyway. Also, the resolution certainly depends on the application. While for 99% of the applications a step of 10nm between 350 and 450nm is perfectly ok, that resolution is far too coarse to estimate e.g. the effect of NO2 or for the development of NO2 retrievals. The same is true for all other absorption bands as well.

How could one determine a reasonable grid where “reasonable” means as few grid points as possible but as many as necessary?

2008/08/20 13h

This is a rather difficult one I feel. In principle one needs to know the application of the user to decide the wavelength grid. To some extent that may be guessed from the input file, but not always. We probably could do something more intelligent, but at some stage the user has to start thinking about what he/she is doing and act correspondingly.

2008/07/23 09h

An idea to speed up phase function interpolation and to save a lot of memory, which is a big issue for ic_properties baum_detailed etc., where the computational time and memory are dominated by phase function interpolation: uvspec reads a set of optical property files. Only the files covering the user-defined wavelength range are read, and the data is interpolated to the internal wavelength grid. But at the moment data for all effective radii are read and interpolated, irrespective of whether they are needed or not. MYSTIC already calculates phase tables only for the required effective radii in libsrc_c/phasetable.c. The same should be implemented in the respective read_optprop functions in uvspec, which should also read and interpolate only those effective radii which are actually needed. To determine the range of effective radii, one needs to check 1D cloud files, 3D cloud files, and IPA files. Sounds complicated but actually isn't, because all these data are read in setup_wcloud / setup_icloud and the information is already nearly available. MYSTIC does (in cloud3d.c)

    /* combine 1D and 3D rmin/rmax */
    rmax = (wcld3d->reffmax > output->wc.microphys.effrmax ? wcld3d->reffmax : output->wc.microphys.effrmax);

This should go along with some cleaning of the source code. Currently we have

if (1D cloud) 
  read_1D_file (in cloud.c)
  read_optprop

if (3D file)
  read_3D_file (in cloud3d.c)
  read_optprop_if_not_already_having_read_them_above

if (IPA files)
  read_IPA_files (in ipa.c)
  read_optprop

This should be restructured into

read_1D_file
read_3D_file
read_IPA_files

determine rmin, rmax (is already available in read_1D and read_3D)

read_optprop

(not many changes, but needs careful testing). To be implemented by Bernhard or Claudia.
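The rmin/rmax determination over all cloud inputs could then be as simple as the following sketch (names are illustrative, not the actual uvspec variables):

 #include <math.h>

 /* sketch only: combine the effective radius ranges of 1D, 3D, and IPA
    cloud inputs before reading the optical property files */
 void combine_reff_range (int n, const double *reffmin, const double *reffmax,
                          double *rmin, double *rmax)
 {
   int i;

   *rmin = reffmin[0];
   *rmax = reffmax[0];

   for (i=1; i<n; i++) {
     *rmin = fmin (*rmin, reffmin[i]);   /* smallest radius needed anywhere */
     *rmax = fmax (*rmax, reffmax[i]);   /* largest radius needed anywhere  */
   }
 }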


 