The Denver X-ray conference was quite excellent, mostly because of the people to talk to during the coffee breaks and the dinners (as always, there should be more of these, and longer ones!). Here’s my quick review of the conference:

# The value of years: good stories and advice from M. H. J. Koch

One paper I managed to miss for my review paper on data corrections is by M. H. J. Koch from Hamburg. It is written in a nice, informal way, replete with good quotes, and recounts his experiences in instrument development for SAXS and WAXS beamlines. In particular, the paper details the development of delay-line (wire) detectors, and may serve as a good introduction to the topic. It seems that wire detectors may still have their uses: their rapid response to incoming photons still puts them among the fastest 2D detectors out there.

# A review of data collection and correction procedures

Following my previous work in progress detailing the data correction steps needed to obtain good data, I finally had the chance to write this down in a review article. The review article (open access) was submitted on Monday. After it has been reviewed and (hopefully) published in the journal, I will ensure that the latest version is available as an open-access paper (thanks to funds from NIMS/ICYS). Until then, please enjoy the pre-submission version and, as always, feel free to comment!

*[Sep. 22 edit: the arXiv link has been replaced with a link to the journal, where the paper is available under an open-access license]*

# Comments on Deschamps (2011), and a clarification on Guinier for polydisperse systems

After browsing through a recent issue of the Journal of Applied Crystallography, I came across a paper by Deschamps. It suggests to me that some aspects of SAXS are not being communicated as well as they could be.

Firstly, the introduction mentions that the advanced (Fourier-transform-based) SAXS analysis methods cannot “extract simultaneously the precipitate form factor and the precipitate size distribution”. Indeed they cannot, but neither can the classical methods (see for example page 147 of [1]). It is only when we assume a scatterer shape in the classical methods that the number of possible size distributions reduces to a single solution. Conversely, if we assume a size distribution, only one general form factor will match. This can lead to the erroneous conclusion that the classical methods allow a simultaneous determination of size *and* polydispersity.
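To make the argument concrete, here is a minimal sketch (my own Python illustration, with made-up parameter values, not taken from either paper) of the classical polydisperse scattering calculation once a spherical shape has been assumed. At that point the number distribution n(r) is the only remaining unknown:

```python
import numpy as np

def sphere_ff(q, r):
    """Sphere form factor amplitude F(q, r), normalised so F(0, r) = 1."""
    qr = np.outer(q, r)
    return 3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr**3

def polydisperse_intensity(q, r, n):
    """I(q) = sum_i n_i V_i^2 F^2(q, r_i): linear in the distribution n."""
    v = (4.0 / 3.0) * np.pi * r**3
    return sphere_ff(q, r)**2 @ (n * v**2)

# made-up example: Gaussian number distribution of sphere radii
q = np.linspace(0.01, 1.0, 200)          # 1/nm
r = np.linspace(1.0, 20.0, 100)          # nm
n = np.exp(-0.5 * ((r - 8.0) / 2.0)**2)  # number distribution n(r)
I = polydisperse_intensity(q, r, n)
```

Because I(q) depends linearly on n(r) once the shape is fixed, recovering the distribution becomes (in principle) a linear inversion; without the shape assumption, form factor and distribution cannot both be pinned down.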

My main point, however, concerns the behaviour of the Guinier method in polydisperse systems. I, too, have been looking at this, and found that others have trodden this path before me. My results agree with theirs: the radius of gyration obtained from the Guinier method in polydisperse systems is the volume-squared-weighted radius of gyration of the distribution [2]. The limits of applicability shift accordingly.

The work by Deschamps on the Porod behaviour in polydisperse systems is, to my knowledge, unique, although I have not looked into it in detail.

[1] O. Glatter and O. Kratky, *Small Angle X-ray Scattering* (Academic Press, 1982), available online

[2] G. Beaucage, H. Kammler, and S. Pratsinis, Journal of Applied Crystallography 37, 523 (2004).

# Notes on Guinier

…well, his famous SAXS analysis method.

This document (Guinier_short, copyright Brian Pauw) gives a short description and review of the applicability of the Guinier method to polydisperse systems.

It also shows, through analysis of simulated data, what q-range should be measured for the Guinier method to be valid. In short, the rule of qmax = 1.3/Rg still holds, but in polydisperse systems Rg is the volume-squared-weighted Rg of the distribution.

This implies that the Guinier method quickly becomes unusable for polydisperse samples, as the required qmax cannot be reached with anything but USAXS instruments.
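The weighting can be illustrated numerically (a Python sketch with a hypothetical Gaussian number distribution; this is my own example, not taken from the linked document). For a sphere of radius r, Rg = sqrt(3/5)·r, and the Guinier fit sees the average of Rg² weighted by n(r)V(r)²:

```python
import numpy as np

# hypothetical Gaussian number distribution of sphere radii (nm)
r = np.linspace(1.0, 30.0, 300)
n = np.exp(-0.5 * ((r - 10.0) / 4.0)**2)

v = (4.0 / 3.0) * np.pi * r**3        # sphere volume
rg = np.sqrt(3.0 / 5.0) * r           # Rg of a single sphere of radius r

# Rg as seen by a Guinier fit: weighted by n(r) V(r)^2
w = n * v**2
rg_guinier = np.sqrt(np.sum(w * rg**2) / np.sum(w))

# number-averaged Rg, for comparison (always smaller)
rg_number = np.sqrt(np.sum(n * rg**2) / np.sum(n))

qmax = 1.3 / rg_guinier               # valid q-range shrinks accordingly
```

Since the weighting emphasises the largest particles, the observed Rg exceeds the number average, and the admissible qmax shrinks correspondingly; for wide distributions it quickly drops into USAXS territory.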

This text (the linked PDF) is released under copyright (Brian R. Pauw, 2011), as I may want to include some of it in a later publication. I hope you understand…

# Read Ruland and more free reading (textbooks!)

Catching up on current affairs, I stumbled across this beauty. I find the paper starts a little chaotically, but we quickly come across some very useful equations indeed, along with a link between the equation used for their analysis of phase transitions in fluids and various other equations, such as the Ornstein-Zernike structure factor and the Debye-Bueche equation. The equations published in this paper appear ready to be applied to a wide variety of amorphous scattering patterns, capable of extracting quite a few physical parameters! There will likely be much more on this topic as I get to apply them. To top it all off, the data used in the paper were “extracted” from published graphics by Ms. A. Höhle. I can see her sitting there now with a ruler and paper, meticulously noting down her estimates for the q and S values of each datapoint… Perhaps this would be a good starting point for publishing some of our best data online, so others can have a go at analysing it?
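For reference, the two classical scattering functions mentioned above have simple closed forms: Ornstein-Zernike gives I(q) = I0/(1 + q²ξ²), and Debye-Bueche gives I(q) = I0/(1 + q²ξ²)². A quick Python sketch (the parameter values are made up):

```python
import numpy as np

def ornstein_zernike(q, i0, xi):
    """Ornstein-Zernike (Lorentzian): I(q) = I0 / (1 + q^2 xi^2)."""
    return i0 / (1.0 + (q * xi)**2)

def debye_bueche(q, i0, xi):
    """Debye-Bueche: I(q) = I0 / (1 + q^2 xi^2)^2, falling as q^-4."""
    return i0 / (1.0 + (q * xi)**2)**2

q = np.linspace(0.01, 2.0, 100)     # made-up q-range (1/nm)
oz = ornstein_zernike(q, 1.0, 5.0)  # correlation length xi = 5 nm
db = debye_bueche(q, 1.0, 5.0)
```

The two differ in their high-q behaviour (q⁻² versus q⁻⁴), which is often the easiest way to tell which one a measured amorphous pattern follows.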

On another note, I want to point you towards the horror of textbooks. How fitting, then, that the next special issue of the J. of Appl. Cryst. is about teaching! As you may have read on Slashdot a few weeks ago, textbooks are still very expensive, even for topics as stagnant as elementary mathematics. Prof. Feynman also had a few words to say on the topic of textbooks. And there are now some alternatives popping up, offering your students a way around ridiculously expensive textbooks: free textbooks.

http://www.curriki.org/xwiki/bin/view/Main/WebHome

http://about.ck12.org/

http://www.lightandmatter.com/

http://www.wikibooks.org/

If you know more, let me know and I will add them to the list!

# Holiday reading, watching and writing

Hi all,

I found some interesting papers for you, and a talk. Let me start with the talk. It is a TED talk (naturally) about TED talks. This nice introspective talk is of interest to all of us, as it gives a few pointers on the make-up of excellent (and terrible) talks, with a fascinating slide on the colours used to evoke certain responses from the audience. Funny, and applicable to making our talks better (and we know we need it, right?). The talk is here.

Then there are some papers, two of which I found to be closely related to my own work. One discusses the stretching of voids in tensile experiments, simulating the 2D patterns with cylinders (unfortunately not using a 2D fit, but 1D slices to arrive at a solution). That paper is here (yes, you have probably already read it, since it is in J. Appl. Cryst., but just in case you have been too busy, like me, to read the table of contents…). Another is similar, but I must admit I have not managed to read it completely yet. It looks interesting, though.

Also, it is not every day that you see a new diffractometer geometry being suggested. I wish these guys good luck in the further development of their instrument, and I hope they publish some fantastic results when they get to it.

At the moment, I am conducting a literature survey (something I should have done much, much earlier) on in-situ particle growth studies using SAXS. I have come up with quite a few references by now, but if you know of an excellent study, do drop me a line at brian at stack dot nl, and I will be eternally grateful. If you are, on the other hand, interested in co-authoring a small review paper on the topic, I am always open to collaboration!

# Bayesian Inverse Fourier Transforms

Although I cannot say I completely grasp the underlying theory, the Bayesian approach to the inverse Fourier transformation of (isotropic) small-angle scattering patterns certainly appeals to me. The idea is that the small-angle scattering pattern can be transformed (back) into real space, resulting in either a distance distribution function *p(r)*, a correlation function *gamma(r)* (related through *p(r) = r^2 gamma(r)*), or, by twice differentiating the correlation function, a chord length distribution (CLD). The Bayesian approach removes the user-defined input requirements of the standard IFT.

So what can you do with all this real-space information? Well, first of all, being in real space means that one’s intuition can once more be applied (because intuition does not work in reciprocal space). For example: a maximum probability at a certain radius really may indicate that this is a characteristic length scale in the system. Secondly, the real-space p(r) may be a lot easier to fit than the scattering pattern itself, especially for odd shapes for which no analytical scattering function exists. Lastly, it shows exactly the amount of information inherent in the SA(X)S pattern, making it easier for those new in the field to understand the limits of the number of extractable parameters.
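For isotropic scattering, the transform at the heart of the IFT is just a sine transform, p(r) = (r/2π²) ∫ q I(q) sin(qr) dq. A naive direct evaluation on simulated sphere data (my own Python sketch; the Bayesian regularisation of the references below is not included) looks like this:

```python
import numpy as np

def sphere_intensity(q, radius):
    """Scattering of a homogeneous sphere (unit contrast)."""
    qr = q * radius
    v = (4.0 / 3.0) * np.pi * radius**3
    return (v * 3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr**3)**2

def pofr(q, intensity, r):
    """p(r) = (r / 2 pi^2) * integral of q I(q) sin(qr) dq (rectangle rule)."""
    dq = q[1] - q[0]                 # uniform q-grid assumed
    qr = np.outer(r, q)
    return r / (2.0 * np.pi**2) * (q * intensity * np.sin(qr)).sum(axis=1) * dq

radius = 5.0                          # hypothetical sphere radius (nm)
q = np.linspace(0.001, 10.0, 4000)    # dense grid to limit truncation ripple
r = np.linspace(0.1, 15.0, 150)
p = pofr(q, sphere_intensity(q, radius), r)
# p peaks near 1.05 * radius and vanishes beyond the maximum dimension 2 * radius
```

The direct transform amplifies noise and truncation effects, which is exactly why the IFT (and its Bayesian variant) regularises the problem instead of transforming blindly.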

More about this in future posts. If you’re interested, I can point towards the following references:

Hansen, Bayesian estimation of hyperparameters for indirect Fourier transformation in small-angle scattering, Journal of Applied Crystallography 33 (2000) 1415–1421

Hansen, Estimation of chord length distributions from small-angle scattering using indirect Fourier transformation, Journal of Applied Crystallography 36 (2003) 1190–1196

Hansen, Simultaneous estimation of the form factor and structure factor for globular particles in small-angle scattering, Journal of Applied Crystallography 41 (2008) 436–445

Pons et al., Modeling of chord length distributions, Chemical Engineering Science 61 (2006) 3962–3973

# Particle size distribution: review of “Small-Angle X-ray and Neutron Scattering of Polydisperse Systems: Determination of the Scattering-Particle-Size Distribution”

The determination of the particle size distribution from small-angle scattering curves is usually achieved by assuming a particular statistical size distribution model (e.g. a Schultz, Gaussian, or log-normal distribution), and fitting this to the data using a non-linear least-squares optimisation method.
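A minimal sketch of this standard procedure (Python with scipy; the lognormal parameters and q-range are made up, and noiseless synthetic data stands in for a measurement):

```python
import numpy as np
from scipy.optimize import least_squares

def sphere_ff2(q, r):
    """Squared sphere form factor, normalised to 1 at q = 0."""
    qr = np.outer(q, r)
    return (3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr**3)**2

def model(params, q, r):
    """Scattered intensity for a lognormal number distribution of spheres."""
    scale, median, sigma = params
    n = np.exp(-0.5 * (np.log(r / median) / sigma)**2) / r   # lognormal shape
    v2 = ((4.0 / 3.0) * np.pi * r**3)**2                     # volume squared
    return scale * (sphere_ff2(q, r) @ (n * v2))

q = np.linspace(0.02, 2.0, 150)    # hypothetical q-range (1/nm)
r = np.linspace(0.5, 30.0, 200)    # radius grid for the distribution (nm)

true = (1.0, 6.0, 0.25)            # known parameters for the synthetic “data”
data = model(true, q, r)

fit = least_squares(lambda p: model(p, q, r) - data,
                    x0=(0.5, 4.0, 0.4),                    # deliberately off
                    bounds=([0.0, 0.6, 0.01], [np.inf, 29.0, 2.0]),
                    x_scale='jac')
```

Note the parameter bounds: as discussed for SASFIT further down, leaving these open invites the optimiser to wander into “impossible” values.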

Fitting multimodal distributions then implies adding multiple contributions, each with its own set of parameters. This increase in the number of parameters may make the fit unstable and the results unreliable.

Retrieval of size information independent of a distribution model would therefore be of great benefit to the experimentalist. One problem is that the scattering intensity of a particle scales with the square of its volume (i.e., for spherical particles, with the sixth power of the radius). This causes information on the small particle sizes to be drowned out by the signal of the larger particles.
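To put a number on this drowning-out, take two sphere radii differing by a factor of ten (say 0.5 nm and 5 nm): per particle, the forward-scattered intensity differs by that factor to the sixth power:

```python
# per-particle forward-scattering ratio for two sphere radii (nm)
r_small, r_large = 0.5, 5.0
ratio = (r_large / r_small)**6
print(ratio)  # prints 1000000.0: one large particle outshines a million small ones
```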

A method to retrieve this information is presented in the 1996 paper entitled “Small-Angle X-ray and Neutron Scattering of Polydisperse Systems: Determination of the Scattering-Particle-Size Distribution” (M. Mulato and I. Chambouleyron, J. Appl. Cryst. 1996, 29, 29–36). The paper presents an iterative method for retrieving this information, and compares it to existing methods such as that implemented in the GNOM package. A particularly challenging bimodal size distribution, with one mode at 0.5 nm and another at 5 nm, shows that the newly presented method is capable of retrieving the distribution in good agreement.

This is thus a very interesting approach to the problem of determining polydispersity information for systems of hard spheres. Personally, I will certainly implement it. In addition, the paper provides good insight into the challenges associated with scattering problems of a polydisperse nature. Lastly, its clear writing makes it recommended reading material.

All in all, an interesting paper worth reading. I will let you know how it works for me if I can get it implemented.

# SASFIT software

The famous question of the uninitiated in small-angle scattering is: “Do you have a bit of software which will give me an answer from my data?”. After a lengthy explanation (coloured with some anecdotes) about why small-angle scattering is not a uniquely defined problem with a unique answer, as wide-angle diffraction might be, the new user is then left with a copy of Matlab or Fit2D and asked not to return until he has a more “sensible” question. I guess this is because there are not many alternative treatments for these users. These days, I may also give them a copy of the most recent SAXSGUI, but since this lacks quite a few fitting functions, it is only really useful for users who already know what they are doing and can program their own 1D or 2D fitting functions. An all-in-one package that is not only good for beginning users but also remains a useful tool for advanced users is not something I had seen so far. Until now, that is.

I’d like to draw your attention to SASFIT, a very useful software package for anyone working with isotropic small-angle X-ray or neutron scattering data, which can be found here. I only noticed this package last month, during the SAS09 conference in Oxford, but I have been told that it has been under development for quite some years now by Joachim Kohlbrecher and Ingo Bressler. It comes precompiled for OS X (Macintosh) as well as for some other unmentionable operating systems, so colour me happy.

What struck me as particularly good about this package is its completeness. I think you will be hard pressed to find any other package containing as many form factors and structure factors for fitting your 1D data. On top of that, all the functions are described in the documentation, so if nothing else, do take a look at that (and ste– recode the functions in your language of choice if you must).

The code is open source (as opposed to some other SAS fitting software, which is more like a black box) and is written in C. While it is not a language I am very familiar with, it is a very good choice: much more readable than FORTRAN (in which much of the other software is written), compilable on a plethora of platforms, and the backbone of much good software (i.e. it does not change much over time, it has been around for a while, and it will be around for a while).

To get going, you must have your data in a format readable by SASFIT; the easiest is an ASCII file with three columns: q, I, and the error in I (although the last is optional). That means you must preprocess your data with your own tools to obtain it in this form, for example using the binning method described two weeks ago, which supplies data-points with equal error. Once that is done, you can use your binned data as input to SASFIT.
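Producing such a three-column ASCII file is a one-liner with numpy (a sketch; the data and the filename are of course placeholders):

```python
import numpy as np

# hypothetical rebinned data: q (1/nm), intensity, and its uncertainty
q = np.linspace(0.01, 1.0, 50)
I = 1.0 / (1.0 + (10.0 * q)**2)
dI = 0.01 * I

# three whitespace-separated columns: q, I, error in I
np.savetxt("for_sasfit.dat", np.column_stack((q, I, dI)))
```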

In use, I must say, it is not the most intuitive software I have ever handled. The TCL/TK interface is partly to blame, as it is not very well integrated with OS X, and therefore appears awkward and occasionally “hangs”. Reading the manual is therefore recommended, to prevent premature baldness. Once you get going, though, it is very good. Initially, applying some random fitting functions to my data resulted in many (not very straightforward) errors, which were resolved by setting the boundaries of the fitting parameters correctly. If you do not do that, the fit may optimise parameters to “impossible” values, which subsequently results in impossible intensities, impossible residuals, and impossible optimisation criteria. Once you set the limits correctly, it works much better, and good results can be obtained.

During my initial tests, I found the optimisation algorithm not particularly robust. This is likely due to my choice of fitting function, but from my experience with the Matlab “fminsearch” and “fminsearchbnd” functions, I had expected it to work a little better. The fitting is also rather slow, which may be due to the continuous updating of the fitted curve in the graphical display.

These experiences of an inexperienced user aside, I would very much recommend this package to anyone working with 1D SAS data. It holds much promise, contains a hell of a lot of fitting functions, is well documented and open source, and is in my eyes essential software for all levels of users. If it becomes slightly easier to use off the bat (i.e. with sensible preset parameter limits for each fitting function, some pre-loaded example data, and a good welcome screen pointing you at the place to start), our lives would become a lot easier: “Take this package, little man, and don’t come back until you have a sensible answer!”.