12 corrections in a row

2014/04/07 // 0 Comments

Just a couple of housekeeping notes: I’m giving talks in Europe in a month, at the Unité Matériaux et Transformations (UMET) in Lille on May 16, hosted by Grégory Stoclet; at Birmingham University, Birmingham, on May 20, hosted by Zoe Schnepp; and at Nottingham University, Nottingham, on May 23, hosted by Philip Moriarty. Between these dates, I’ll also be joining Zoe and Martin for beamtime at the Diamond synchrotron (beamline I11) from May 21 to May 23. Please feel free to stop me at these locations and say hi! For today, I’ve got another bit of data correction to show. I thought it might be interesting to put all the corrections together and show you what difference they make to an integrated scattering pattern. Many of the implemented data corrections are quite straightforward shifts and scalings, but some are more involved and have a greater effect on the scattering pattern.
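
To give a flavour of what the simpler “shift and scale” corrections look like, here is a minimal Python sketch of a small correction chain (dark-current subtraction, then normalisation to exposure time, transmission and sample thickness). This is only an illustration under my own assumptions, not my actual correction code; the function name, the example numbers and the selection of corrections are placeholders.

```python
import numpy as np

def correct_pattern(measured, dark, transmission, thickness_m, exposure_s):
    """Apply a few simple shift-and-scale corrections to a 2D detector image.

    measured, dark : 2D arrays of counts (sample frame and dark/background frame)
    transmission   : measured sample transmission factor (0..1)
    thickness_m    : sample thickness in metres
    exposure_s     : exposure time in seconds
    """
    # Shift: subtract the dark-current / background counts
    corrected = measured.astype(float) - dark.astype(float)
    # Scale: normalise to time, transmission and thickness. Absolute scaling
    # would additionally need a flux or calibration-standard factor (omitted).
    corrected /= (exposure_s * transmission * thickness_m)
    return corrected

# Hypothetical usage with made-up numbers:
measured = np.random.poisson(100, size=(256, 256)).astype(float)
dark = np.full((256, 256), 2.0)
pattern = correct_pattern(measured, dark, transmission=0.85,
                          thickness_m=1e-3, exposure_s=600.0)
```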


Once more: writing data correction software

2013/12/02 // 1 Comment

With the Bonse-Hart instrument still out of commission (yet another failure in the X-ray generator target assembly), I decided it might be time to have another look at the data correction procedures. Some of you may remember that I wrote a comprehensive review of all conceivable corrections in a recent open-access review paper; while I implemented some of the corrections mentioned therein, it may now be time to implement most of them.


Monte Carlo; now getting a GUI!

2013/11/05 // 0 Comments

There have been many developments to the Monte Carlo program that I have been rather silent about. For those who forgot: this is the method that allows the determination of a particle size distribution (or, rather, a scatterer size distribution) from a small-angle scattering pattern once you select a shape. It is described in detail here, but the supplementary information shows the real strength of the method: it can retrieve a wide variety of realistic size distributions.
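
For those curious what the core of such a Monte Carlo retrieval looks like, here is a heavily reduced Python sketch of the accept-only-improvements idea (spheres only, no uncertainty estimation, and my own choice of bounds, scaling and stopping criterion). It is not the actual program, just an illustration of the principle.

```python
import numpy as np

def sphere_intensity(q, r):
    """Unscaled scattered intensity of homogeneous spheres of radii r.
    q (1/m) must not contain zeros; returns an array of shape (len(q), len(r))."""
    qr = np.outer(q, r)
    amp = 3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr**3   # sphere form factor amplitude
    volume = (4.0 / 3.0) * np.pi * r**3
    return (volume**2) * amp**2

def mc_fit(q, i_meas, i_err, n_spheres=200, r_min=1e-10, r_max=1e-7,
           max_iter=20000, rng=np.random.default_rng(0)):
    """Keep a set of sphere radii, randomly replace one at a time and accept
    the change only if the (scaled) fit to the data improves."""
    radii = rng.uniform(r_min, r_max, n_spheres)
    contrib = sphere_intensity(q, radii)

    def chi_squared(model):
        # least-squares scaling factor between model and data, then reduced chi-squared
        scale = np.sum(i_meas * model / i_err**2) / np.sum(model**2 / i_err**2)
        return np.sum(((i_meas - scale * model) / i_err)**2) / len(q)

    model = contrib.sum(axis=1)
    chi2 = chi_squared(model)
    for _ in range(max_iter):
        k = rng.integers(n_spheres)
        r_new = rng.uniform(r_min, r_max)
        new_col = sphere_intensity(q, np.array([r_new]))[:, 0]
        trial = model - contrib[:, k] + new_col
        trial_chi2 = chi_squared(trial)
        if trial_chi2 < chi2:                 # accept only improvements
            radii[k], contrib[:, k], model, chi2 = r_new, new_col, trial, trial_chi2
        if chi2 <= 1.0:                       # stop once the fit matches the data uncertainty
            break
    return radii, chi2                        # the retained radii form the size distribution
```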


Programming your own data reduction? Check out fabIO and pyFAI

2013/08/26 // 0 Comments

Reading in the detector data, and programming methods for doing so, is one of the more tedious tasks of any data reduction program. Those of you who write your own data correction programs know this all too well. Many detector systems store their data slightly differently from the others, despite the availability of decent standards for storing images and their metadata (fortunately, some manufacturers are now using standard data storage formats). For example, the NIKA manual shows a rather lengthy list of formats for which support had to be written.
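
As a taster, here is a minimal sketch of what reading and integrating an image with these two packages can look like. The file names and the calibration are placeholders of my own choosing; in practice you would point fabIO at one of your detector frames and pyFAI at a calibration (PONI) file from pyFAI-calib.

```python
import fabio
import pyFAI

# fabIO works out the detector format from the file itself
img = fabio.open("example_frame.edf")         # placeholder file name
data = img.data                               # 2D numpy array of counts
print(img.header.get("ExposureTime", "n/a"))  # metadata, if the format provides it

# pyFAI needs the experiment geometry; normally you load a calibration file
# produced by pyFAI-calib rather than typing the numbers in by hand:
ai = pyFAI.load("example_calibration.poni")   # placeholder calibration file
# ai = pyFAI.AzimuthalIntegrator(dist=1.0, pixel1=172e-6, pixel2=172e-6,
#                                wavelength=1.54e-10)  # manual alternative

# Azimuthal integration to a 1D curve: q (nm^-1) versus intensity
q, intensity = ai.integrate1d(data, npt=500, unit="q_nm^-1")
```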


Free Code! McSAS: A Monte-Carlo way for retrieving particle size distributions.

2013/01/21 // 1 Comment

Good news for those of you on the hunt for a way to get polydispersity (size distribution) information from your scattering patterns. Two pieces of good news, to be precise! Firstly, the paper that describes my implementation of the method that does exactly this was accepted earlier this month for publication in J. Appl. Cryst., though it will probably not make it into the February issue. With a bit of luck, I will be able to make it open access, though! I have talked about the method before (e.g. here), so I will not spend more words on it. The second piece of good news is that the Python code with the fitting procedure is now available in an online repository here, thanks to Pawel Kwasniew at ESRF for his efforts in setting up the repository. The code comes complete with a quick-start guide with several pictures and some test data. If you are reasonably familiar with Python, why not grab a copy and try the method on your data? Reports from early testers have been positive, and everyone is encouraged to comment or send me an e-mail so the code can be improved. License-wise, the code is released under a Creative Commons Attribution-ShareAlike license. Lastly, if you want to contribute to the code, you are more than welcome to. Currently, the code is being rewritten in object-oriented form to improve flexibility, with the first release of the OO version expected later this month. Afterwards, a smearing function will be implemented for directly fitting slit-smeared data, and more shape functions should be included. As it is intended to be integrated into existing SAS analysis GUIs (of which there are quite a few), there is no graphical user interface; the focus is on getting the base functionality implemented right. As usual, drop me a line or leave a comment!


Monte-Carlo fitting of isotropic scattering and observability on arXiv

2012/11/05 // 3 Comments

We have now entered the final two weeks before the Scattering-Bonanza that is SAS2012… Are you feeling the pressure yet? With many good talks lined up for that conference, and many good people attending, I am honestly quite excited! I too will be giving my small contribution in the form of two talks, one at the Bonse-Hart satellite meeting, and another at the SAS2012 conference itself (Tuesday, 16:00, in the D4 session). While there are many good things in preparation (publications, presentations and the like), also for this website, none of them are quite ready yet. To give you a taste of the upcoming publications, there are two pre-publications of mine available on arXiv. The first considers the 1D Monte-Carlo model I have been talking about every now and then. Development of the analysis method has been ongoing for quite some time, and it is now in a very usable form. While not all of the latest modifications are in this pre-submission version of the manuscript, some details can be gleaned. It is available here: arXiv:1210.5304. The second paper applies this Monte-Carlo model to study the ageing-induced growth of rod-like precipitates in MgZn alloys, comparing the resulting radius distributions to the distributions found using TEM. This collaborative work between Dr. Julian Rosalie (TEM expert) and me shows the symbiotic relationship between the two techniques very well: TEM provides the morphological details, so that the Monte-Carlo method can be applied to extract a size distribution. A very young version of that manuscript can be found here: arXiv:1210.5366. I hope you enjoy some of the results in there, and please talk to me at the SAS2012 conference!


Data processing flowchart and news on an old publication

2012/02/29 // 1 Comment

Short news first: by going through the motions and waiting for Elsevier to get back to me, I have obtained permission (for the royal sum of 0.00 eurodollars) to repost one more paper from Polymer on my site. That has now gone onto the 2010 publications page here. Then it is time to give you something. For those who have to do their own data processing and would like to see my way of doing it, I have attached my data processing flowchart to this post. It is not a perfect method, but as far as I can tell it works quite well. If you are interested in getting the actual Python code that does all this work, drop me a line. Since the code is quite new, it does not support many strange detectors, so if support needs to be built for a particular detector, I’ll be happy to spend some of my time looking at whether it can be done. So here it is: imp_imagecorrect_and_imgint. Let me know about improvements or obscurities, or if you have any other comments on this.


Detail-preserving 2D binning, part 1: the appetizer

2011/12/10 // 0 Comments

(Sorry about the hiatus; there’s been a period filled with that noblest of Japanese traditions: paperwork!) If you want to fit a 2D image, you want to preserve the information in the entire image. However, 2D fitting is quite computationally intensive, so you also want to reduce the number of pixels in your images. Methods I have seen published are occasionally quite poor at preserving detail, but I’ve played with a type of binning (quite similar to the mathematical concept of k-d trees) that does preserve it. Behold the coolness in the following plots, which are different zoom levels of the same scattering pattern. On the vertical axis is the azimuthal angle, on the horizontal axis q (in reciprocal meters). An explanation will follow in the near future, but you’re welcome to write to me for the scoop. Pretty cool, huh? And since the uncertainties on the intensity of each bin are known, fitting is not affected by the differences in bin area!
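
Since the full explanation has not been written up yet, here is only a rough Python sketch of the general idea: a recursive, quadtree/k-d-tree-style subdivision that keeps splitting wherever the pattern still varies, and bins wherever it is smooth. The split criterion, thresholds and example data below are my own placeholders, not necessarily what was used for the plots above.

```python
import numpy as np

def adaptive_bin(image, rel_threshold=0.1, min_size=2):
    """Recursively split a 2D image into bins: a block is kept as a single bin
    only when its pixels agree to within rel_threshold of their mean, otherwise
    it is split into four sub-blocks. Returns (row_slice, col_slice, mean,
    standard_error) tuples, one per bin."""
    bins = []

    def recurse(r0, r1, c0, c1):
        block = image[r0:r1, c0:c1]
        mean, spread = block.mean(), block.std()
        small = (r1 - r0) <= min_size or (c1 - c0) <= min_size
        smooth = spread <= rel_threshold * max(abs(mean), 1e-30)
        if small or smooth:
            sem = spread / np.sqrt(block.size)       # standard error of the bin intensity
            bins.append((slice(r0, r1), slice(c0, c1), mean, sem))
        else:
            rm, cm = (r0 + r1) // 2, (c0 + c1) // 2  # split into four sub-blocks
            recurse(r0, rm, c0, cm)
            recurse(r0, rm, cm, c1)
            recurse(rm, r1, c0, cm)
            recurse(rm, r1, cm, c1)

    recurse(0, image.shape[0], 0, image.shape[1])
    return bins

# Hypothetical example: a flat background with one sharp feature
img = np.fromfunction(lambda y, x: np.exp(-((x - 200)**2 + (y - 60)**2) / 50.0)
                      + 0.01, (256, 256))
bins = adaptive_bin(img)
print(len(bins), "bins instead of", img.size, "pixels")
```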


Live Fourier Transform for Windows

2011/08/16 // 6 Comments

[Ed: a new, completely rewritten version of this code can be found here, with a precompiled Windows version available at the Bitbucket site, courtesy of Joachim Kohlbrecher] Since entering paternity leave, I have had little time to come up with something new to post here. However, one colleague was so kind as to send me his Windows version of the live Fourier transform program discussed before. His version runs on his Lenovo laptop (but may be more widely applicable) and uses Matlab’s built-in webcam support on Windows. Frame rates are markedly better than with my OS X code, but the memory requirements are significant, and reinitialisation of the camera every few seconds generates a flash. The code is available here, with many thanks to Jakob R. Eltzholtz for making it available. If you have suggested improvements or changes, please do not hesitate to contact Jakob and/or me.
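
For readers without Matlab, the idea itself fits in a few lines of Python with OpenCV and numpy. This is not Jakob’s code, just a minimal sketch of the same principle: grab a webcam frame, take its 2D Fourier transform, and display the log-scaled magnitude alongside the camera image.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                    # first available webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(float)
    # 2D Fourier transform, zero frequency shifted to the image centre
    ft = np.fft.fftshift(np.fft.fft2(gray))
    # log-scaled magnitude, normalised to 0..1 for display
    mag = np.log1p(np.abs(ft))
    mag /= mag.max()
    cv2.imshow("camera", frame)
    cv2.imshow("live FT", mag)
    if cv2.waitKey(1) & 0xFF == ord("q"):    # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```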


Live FT video

2011/02/04 // 1 Comment

A demonstration of the live Fourier transform showing scattering patterns can be seen here.
