# Cross-correlations in the high-redshift sky

A few colleagues and I organised a workshop at UCL last week, entitled ‘Cross-correlations in the high-z sky’. The website is xcorr2014.co.uk, in case you want to see the programme and the talks that were given. We were extremely lucky to have so many good participants and presentations, and we are very grateful to the invited speakers for their excellent lectures. I am including a few photos below (notice the nice weather and venues!). The conference dinner took place in the Design Museum, with a gorgeous view of London Bridge.

On behalf of the organising committee: thank you so much for making this workshop so interesting and lively!

# Statistical Challenges in 21st Century Cosmology

I was lucky enough to be part of this very interesting conference in Lisbon, bringing together researchers from cosmology, astrophysics, statistics and signal processing. The discussions were very lively, and the talks were of high quality! I also gave a talk myself, which was very stressful because it was the first time I presented the new work on primordial non-Gaussianity.

I took a few days off after the conference to visit Lisbon and the surroundings. A few pics below!

# PONT d’Avignon 2014

Last week I was at the PONT (Progress on Old and New Themes in Cosmology) conference in Avignon. It was a great conference, with many interesting talks, in a wonderful venue and location! Although the conference was mostly theoretical, there were a few observational presentations (including mine, on quasars), and I am really glad I attended.

# Python and Healpy with Mavericks/Xcode 5.1

I am quite satisfied with my Python installation. I simply followed one of the numerous tutorials on the web and installed a virtual environment with Brew, easy_install and pip. However, I migrated to Mavericks recently and started experiencing serious problems, especially with Healpy. Strangely, I migrated three machines with the exact same Python installation, and only one had these problems. Anyway, the reason is that Apple Xcode 5.1 now throws errors when used with unknown compiler flags. The easy fix that worked for me is to define the following symbols before running “pip install healpy”:

export CFLAGS=-Qunused-arguments
export CPPFLAGS=-Qunused-arguments
export ARCHFLAGS=-Wno-error=unused-command-line-argument-hard-error-in-future

Sources: [1], [2], [3].

# Primordial gravitational waves: a backstage story

This is a copy of an article I wrote for the newsletter of the T.I.M.E. Alumni Association. For the record, T.I.M.E. is a network of top European universities organising international joint degrees in engineering. I was lucky enough to be part of this programme, and concurrently studied at the University of Mons (Mons, Belgium), Supélec (Paris, France), and Univ. Orsay Paris 11 (France).

A few weeks ago, as I was having a meeting with my supervisor, we received an intriguing call from a BBC journalist. He asked us if we had heard about a rumour that started to spread in the scientific community just a few hours earlier. A surprise press conference was organised at Harvard University, to announce a “major discovery that would change our understanding of the universe”. We were very sceptical at first, but when the journalist mentioned the words “Primordial B-modes”, I saw a spark in my supervisor’s eyes. It turns out that if such B-modes were detected, they would be the first observational evidence of a process called inflation, a crucial part of the Big Bang Theory. Cosmologists think that this inflation generated microscopic fluctuations (so-called “quantum” because ruled by quantum mechanics) and almost instantaneously stretched them to macroscopic scales, creating the seeds for all the stuff in the universe.

After answering the journalist’s questions and hanging up, my supervisor stared at me, and I immediately knew that our meeting was rescheduled. She has worked on inflation and the Big Bang Theory for most of her career, and these rumours were too important to ignore. The rest of the day (and most of the weekend, in fact) involved calculations on the whiteboard, brainstorming sessions, and video calls with other cosmologists, in an attempt to make sense of the rumours. We wanted to be prepared for one of the most important cosmological discoveries of the decade.

The press conference took place three days later, on March 19, and was broadcast online. The BICEP collaboration, a team of scientists operating a telescope at the South Pole, publicly announced their findings. After extensive analysis of their three years of data, they believed they had finally detected a signal that cosmologists had craved for decades. They found that the polarisation of the microwave background light had a specific “curly” pattern, which was not known to be produced by any astrophysical object. However, it was a good match for the so-called primordial B-modes, produced at the very beginning of the Big Bang during inflation. BICEP kept the secret for several months, but was now confident enough to announce the detection and to release the data so that all of us could play with it. A new Facebook group quickly became a vibrant arena where hundreds of experts and young researchers like me started to comment, analyse and share the excitement of this amazing discovery (not your average Facebook group, given the technical level of the comments!).

Figure: BICEP2 studied for over three years the cosmic microwave background (CMB), a faint radiation afterglow left over from when the universe was just 380,000 years old. The polarisation of the light is characteristic of gravitational waves produced even earlier, at the very beginning of the Big Bang. Credit: BICEP-KECK.

The BICEP detection has stunning implications for our understanding of the universe and how it started. Primordial B-modes are thought to be the imprints of gravitational waves, ripples in the fabric of space-time, produced at the very beginning of the Big Bang. In fact, this is due to the violence of inflation, which perturbed space-time itself, according to modern theories of gravity such as Einstein’s General Relativity. Dozens of versions of this process of inflation have now been tested against the BICEP measurements. Unfortunately for theoretical physicists, the simplest versions of string theory struggle to predict what BICEP observed. This is a bad surprise, because string theory is our best attempt so far to unify quantum physics and general relativity, and to explain the origin of the Big Bang. Our avenues of research will therefore have to change. But history tells us that scientific breakthroughs often arise from conflicts between theory and observation. In any case, we now have evidence that all the stuff in the universe (galaxies, stars, planets) originated from microscopic fluctuations which were stretched during inflation. But how did inflation start in the first place? Well, that we don’t know yet, but we’re working on it.

Figure: Inflation creates and stretches quantum fluctuations to large scales and generates gravitational waves. These waves create primordial B-modes, a specific pattern in the polarisation of the background light emitted when the universe was 380,000 years old. This light travels for 13.7 billion years to reach our microwave telescopes, and creates extreme excitement among cosmologists. Credit: WMAP.

# Rencontres de Moriond 2014

This year, I was lucky enough to attend the Rencontres de Moriond, which is a prominent cosmology conference in Europe (organised every two years), often attended by some of the most influential names in the field. Also, since it is organised by the French groups, it was my first contact with the French cosmology community!

Moriond is very intense: the lectures and talks take place from 8.30 until 12.30, and then from 16.30 to 20.30. Obviously, the 12.30 to 16.30 break is a ski break, since the hotel is located in La Thuile, a wonderful ski resort (see photo below!). I was skiing with my colleagues from UCL and also with new friends made at the conference. I think the take-away point here is that we all survived and didn’t break a bone; my boss was quite happy about that.

Also, it was a pretty important conference for me because I was presenting, for the first time, the work I have been doing over the past year. It was a challenge and a test, since I am supposed to present this work again at several conferences this year. Thanks to all the feedback from my colleagues back at UCL, I had really polished the speech and the slides, and this paid off: I received excellent feedback on the talk. The introduction and conclusion slides are below!

Frankly, Moriond was an amazing conference. So many interesting talks, great times skiing, new friends and contacts. And the cherry on the cake: Clem Pryke, from the BICEP2 collaboration, was there and gave a lecture about their findings. It was incredibly exciting to be there in person, talk to him, and witness the huge excitement of the audience before and after the talk, and all the interesting questions. It recharged my batteries and reminded me why research in physics is so amazing, both scientifically and on a human level.

# Seminar in Oxford

This week I gave a seminar at the Astrophysics Department in Oxford. It was quite exciting (and stressful…) because it was my first one-hour seminar in front of such a large group of cosmologists. Such a long talk was a challenge, and I tried to keep the level progressive, mixing theory, methods and data. I think it went well; there were a lot of very interesting questions afterwards, especially on the gory details of the analysis of photometric quasars. It was very stimulating to hear people ask very detailed and relevant questions. I was also showing some new results involving new data and techniques, so it was even more stressful. Thanks to Min-Su Shin and Johannes Noller for inviting me!

# Visit in University of Barcelona

After the DES collaboration meeting I stopped by Barcelona to work with collaborators at the Institute of Physics (on a project started recently). I also gave a seminar on the work I’ve been doing on photometric quasars with Hiranya, Daniel, Andrew and Aurélien at UCL. Thank you Licia Verde and Raul Jimenez for receiving me there, I had a wonderful time and I think we got an awful lot of work done!

# DES Collaboration meeting

Last week I attended my second Dark Energy Survey (aka DES) collaboration meeting. This time it took place in St Feliu de Guixols, not too far from Barcelona. It was a wonderful week, I must say. I gave a couple of (formal and informal) presentations to explain the work we have been doing at UCL and how it can benefit the whole collaboration. I’m afraid I cannot explain more or show any slides, since this work is confidential (until the first round of papers in summer 2014), but I can say that the venue was absolutely stunning! Ideal for meeting collaborators and getting a lot of work done (for real!) near the swimming pool and in one of the numerous relaxing areas. I am looking forward to the next meeting, which will take place in Illinois, USA, in May 2014. Hopefully we will have the first science results by then.

# A few tips for oral presentations

I really enjoy giving oral presentations. Not because I like making them (which usually takes an awful lot of time, basically not spent on research), but because the outcome of a well-prepared, polished presentation can be amazingly positive for one’s morale and career. In addition, making a good presentation is always a challenge, and I love challenges*. So it is always good to keep the basics in mind, unless you are naturally gifted and can make a perfect keynote on the first attempt (which I cannot; it always takes me hours). Below are a few zero-order tips for avoiding the most common mistakes: the ones that turn your presentation into a torture for the audience.

* In fact, I take it so seriously because my first presentations at Uni were truly horrible.

And frankly, if I had to summarise all my experience (from scientific, technical and outreach presentations) into one piece of advice, I would simply say: preparation is key. I always rehearse several times, think carefully about what I want to say, use a timer, and get feedback from as many people as possible. Counter-tip: do not rehearse like a machine; leave space for creativity and spontaneity.

Good luck! And remember: the impact of a good, well-prepared presentation in front of an interested audience can be massive.

# Conference: Wavelet & Sparsity XV in San Diego (USA)

The Wavelet & Sparsity XV meeting in San Diego is now coming to an end. It was an extremely interesting meeting, which made me discover the kind of large conferences that take place in the engineering and signal processing communities (Wavelet & Sparsity occurs within the SPIE conference, bringing together thousands of researchers, engineers, companies, around the fields of Optics and Photonics).

I was in fact invited to talk at this meeting about the work I’ve been doing with colleagues at UCL on 3D wavelets and their applications in cosmology. The talk went well, the audience seemed interested, and I had really interesting questions and discussions afterwards. I am looking forward to going to more conferences like this. It was good to meet the wavelet community again.

Little anecdote: during a (friendly) fight between two speakers, I heard this funny catch phrase: “What you’re saying is wrong! Try to square a Dirac and a mathematician will punch you. It’s heavy Russian mathematics here, Sir.”

San Diego Convention Center

Waterfront

Dallas suburbs from the plane (London->Dallas->San Diego)

Logo of the workshop, also on the T-shirts we received!

First slide of my talk

# San Diego Zoo

The San Diego Zoo has an excellent reputation. I must admit I was dubious, but I have now changed my mind. I definitely recommend this zoo; it was not only huge and very well organised, but the animals there were quite active and seemed happy. I had never been that close to a tiger or a polar bear!

# Conference: LSS13 in Ascona (Switzerland)

I have just returned from an amazing conference in Ascona, Switzerland, on the “Challenges for studying the large-scale structure of the universe with next generation surveys” (abbreviated as LSS13). I met many amazing people, attended several interesting talks and had a really good time! I really wish every conference was like this.

I also presented my work on photometric quasars, which was published recently. The talk went well, I received really good feedback and questions, and the resulting discussions were extremely useful too. In fact, I was very surprised to win the award for the best contribution! This wouldn’t have been possible without the support and feedback of my collaborators at UCL, whom I thank very much.

Ascona lake

Amazing location for the conference center

Best work conditions!

First slide of my talk

Award picture with the conference organisers, Professors Uros Seljak (left) and Vincent Desjacques (right), in front of the conference center in Ascona.

# Paper submitted!

We finally submitted our paper on angular power spectra and photometric quasars, entitled “Estimating the large-scale angular power spectrum in the presence of systematics: a case study of Sloan Digital Sky Survey quasars”. If you wish to cite it, the bibtex reference is below!

http://arxiv.org/abs/1306.0005

@article{leistedt:powspec_sdssqsos,
author = {Boris Leistedt
and H. V. Peiris
and D. J. Mortlock
and A. Benoit-Lévy
and A. Pontzen},
title = {Estimating the large-scale angular power spectrum
in the presence of systematics:
a case study of Sloan Digital Sky Survey quasars},
eprint = {arXiv:1306.0005}
}

# Outreach: talk at the Open University Astronomy Club

This week I gave an outreach talk about modern cosmology at the Open University Astronomy Club. It all went well (I think) and the questions and discussions were extremely interesting. I am looking forward to meeting the members of this club again! Many thanks to Jakub for inviting me.

First slide of my talk

# Brief introduction to the dark energy puzzle

Cosmology is a young and rapidly changing discipline. It is evolving at such a pace that the last decade has witnessed deep and crucial advances in our understanding of physics on large scales. In particular, the arrival of the first high-precision data of the Cosmic Microwave Background and the large-scale distribution of matter has turned Cosmology into a high-precision science.

In the era of precision Cosmology, our model of the Universe at large can finally be confronted with the real complexity of Nature. Despite the astonishing match between this model and the data, one must not forget that it is a simplified, mathematical representation that still leaves numerous questions and observations unresolved. In particular, our understanding of gravity and how it connects to quantum physics is very limited. This issue is now emerging as one of the biggest challenges in Physics of our time.

Without doubt, General Relativity, which has now been tested on a wide range of scales and energies, remains incredibly accurate for explaining gravity on large scales. This makes it one of the best models in Physics, and leads to a beautiful, consistent model of the Universe viewed as an evolving structure whose dynamics are determined by its content. However, although observational data have confirmed the GR-driven picture of an expanding flat Universe, this model is only compatible with the data when Einstein’s equations of GR include a constant term. This term, known as Dark Energy, is extremely mysterious for two reasons.
First, it must have been extremely small throughout cosmic history to avoid a ‘cosmic crunch’ and allow classical structure formation as we observe it. This is particularly unsettling since we know its value is strictly non-zero and is causing today’s Universe to enter a new accelerated phase.
Second, Dark Energy cannot simply be waved away by extending GR or fine-tuning some aspects of the current model. More precisely, it is extremely hard (i.e. theoretically irrelevant or already ruled out by current data) to design a model of the Universe which is consistent with observational data and does not include a cosmological constant.
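For reference, this constant term enters Einstein’s field equations as the cosmological constant $\Lambda$ (standard notation, quoted here for context):

```latex
% Einstein's field equations with a cosmological constant \Lambda:
% G_{\mu\nu} is the Einstein tensor, T_{\mu\nu} the stress-energy tensor.
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```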

Hence Dark Energy is likely to be a much deeper issue than just a constant term in Einstein’s equations, and it jeopardizes our understanding of Nature at large. Even worse, it seems to be part of a greater puzzle relating to our inability to unify General Relativity with Quantum Field Theory (QFT), our best model for small-scale physics. Intriguingly, the two theories differ in one crucial aspect: QFT is only sensitive to differences in energies, whereas in GR vacuum energy gravitates. If Dark Energy is indeed a vacuum energy, we do not yet understand how quantum fluctuations can source it, and for this reason we do not have a good understanding of the physics of the early universe, where Quantum Physics and General Relativity are strongly coupled.
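The connection between the constant term and the observed acceleration can be made explicit in the standard Friedmann picture: a positive $\Lambda$ behaves like a vacuum energy with negative pressure and drives acceleration (textbook equations, quoted for context):

```latex
% Acceleration equation for the scale factor a(t):
% a positive \Lambda makes \ddot{a} > 0 once matter dilutes away.
\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3},
\qquad
\rho_\Lambda = \frac{\Lambda c^2}{8\pi G}
```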

Another riddle for the GR-based standard model of the Universe was revealed by the homogeneity of the CMB (homogeneous to one part in $10^5$), which indicates that all points must have been in causal contact at some time before the CMB was emitted at the last scattering surface. One way to incorporate this observational fact into the previous picture of the Universe is to introduce a phase of extremely fast expansion called cosmological inflation. Naturally, since it takes place in the very early universe, this phase of inflation cannot be fully understood without a viable candidate for a quantum theory of gravity, and Dark Energy is again part of the picture.

String Theory is currently the best candidate for such a theory, and could explain the inflationary phase and the origin of dark energy. Unfortunately, both inflation and string theory involve an incredible (almost intractable) theoretical complexity and exist in a wide variety of versions, which complicates the task of constraining the models from observations. Fortunately, our best current data favor initial conditions which are extremely close to Gaussian. This enables us to focus on inflationary models that predict, or are at least compatible with, Gaussian initial conditions for the CMB and the large-scale structure. Even better, it appears that inflation must be highly fine-tuned in order to predict such conditions, and various models of inflation predict different amounts of non-Gaussianity. Since non-Gaussianity can be measured in a variety of observational data, it is currently one of the most powerful probes of the early universe, potentially permitting us to rule out or favor models of inflation and quantum gravity.
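As an illustration of how such measurements are parametrised, the widely used ‘local’ form writes the primordial potential $\Phi$ as a Gaussian field $\phi$ plus a quadratic correction, with the single amplitude $f_{\rm NL}$ being the quantity constrained by data:

```latex
% Local-type primordial non-Gaussianity:
% \phi is Gaussian; f_NL controls the departure from Gaussianity.
\Phi(\mathbf{x}) = \phi(\mathbf{x})
  + f_{\rm NL}\left[\phi^2(\mathbf{x}) - \langle\phi^2\rangle\right]
```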

# Conference: BASP in Villars-sur-Ollon (Switzerland)

I’ve just come back from the BASP (international biomedical and astronomical signal processing) Frontiers workshop in Villars-sur-Ollon, Switzerland. This meeting was truly amazing: amazing location, amazing people, fascinating science. The standards were really high, and the talks and posters were of excellent quality. I am glad I could participate in this meeting (I was in fact helping with the organisation too). I also presented a poster on our work on 2D and 3D wavelets and their applications in cosmology, which resulted in excellent feedback and discussions.

Needless to say, the meeting was scientifically useful, but it was also good fun with the ski expeditions during the day. It’s really nice to ski or snowboard with close collaborators; it felt like an excellent team-building activity.

My poster (during the poster session) with a wonderful landscape on the back

Amazing Swiss landscape

Andy Parker telling us about the Higgs boson and the latest news from the LHC

# Installing S2LET, FLAG & FLAGLET on Ubuntu/MacOSX

This tutorial is complementary to the generic one presented on this page and will guide you through the installation of FFTW, CFITSIO, HEALPIX, SSHT, S2LET, FLAG and FLAGLET. The procedure was initially tested on a blank Ubuntu 12 virtual machine but will also work on Mac OSX systems (Lion minimum recommended). MATLAB >2011 is also recommended because the Mex compiler is painful to configure with old versions of gcc (see below).

# Compiling and testing the C libraries

## FFTW

Download FFTW 3.x.x at the official website. Unzip it and configure/install it in your favorite software directory (/home/bl/software/fftw-3.3.3 for me) with the following commands. The flag --with-pic is required on Linux only.

./configure --prefix=/home/bl/software/fftw-3.3.3 --enable-shared --with-pic
make
make install

This should create lib and include folders in /home/bl/software/fftw-3.3.3 containing the FFTW libraries and headers.

## SSHT

SSHT can be obtained here. After unzipping the package, you should modify the makefile to use the right path to FFTW (and MATLAB, see below). In particular, you should pay attention to the following line and maybe add the explicit FFTW library:

LDFLAGS = -L$(SSHTLIB) -l$(SSHTLIBNM) $(FFTWLIB)/lib$(FFTWLIBNM).a -lm

Then you should be able to run ‘make’ and test that SSHT works properly by running the test command:

make
./bin/c/ssht_test 128 0

I am attaching the makefile that worked for me, just in case.

## CFITSIO and Healpix

To use the IO and Healpix functionalities of S2LET, CFITSIO and HEALPIX_C are required. You can skip this paragraph if you don’t need them. Download and unpack CFITSIO from the website and run

./configure
make
make install

If you don’t have it already (note that v2.x works well), download HEALPIX_3.00 from this website and run

./configure

Configure the C code only (i.e. main option 2) with the correct location for CFITSIO. In my case I had to specify /home/bl/software/cfitsio/lib and /home/bl/software/cfitsio/include during the configuration script. I didn’t apply any other changes, options, or compiler flags, since S2LET only needs a few basic functions from the Healpix C library.

## S2LET / FLAG / FLAGLET

Note that these three packages have identical setups (apart from their dependencies). S2LET/FLAG/FLAGLET can be obtained here. After unzipping the package(s), modify the makefile(s) with the right paths pointing to FFTW and the other packages (CFITSIO, SSHT, etc). I had to change the order in the linking command and add -fPIC to make it work on Linux. All the tests should run and achieve floating-point accuracy.

make
./bin/c/s2let_test

I am also attaching the makefile that worked for me.

# Setting up the MATLAB interfaces

## GCC and the Mex compiler

A common problem is that the MEX compiler is only compatible with a few versions of gcc. For example, I personally had gcc-4.7.2 installed and MEX was complaining a lot about it. So I had to download an older version of GCC, compatible with MEX, to compile the MATLAB interfaces for SSHT/S2LET/FLAG/FLAGLET. For me, with MATLAB 2012b, the newest gcc compatible with mex was gcc-4.4. Also, MEX is annoying: it not only needs this version installed, but also the right gcc and g++ symbolic links. The simplest solution was to rename /usr/bin/gcc to gcc-4.7.2 (my version) and make gcc-4.4 the default compiler by renaming it to /usr/bin/gcc. There are other ways to make Mex work with specific versions of GCC, but they are far beyond the scope of this tutorial. Contact me for further information. On Ubuntu I had to run something like:

sudo apt-get install gcc-4.4
sudo apt-get install g++-4.4
sudo mv /usr/bin/gcc /usr/bin/gcc-4.7.2
sudo cp /usr/bin/gcc-4.4 /usr/bin/gcc
sudo cp /usr/bin/g++-4.4 /usr/bin/g++

Then MEX should find the right compilers and not complain anymore.
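As an aside (not what I did, and untested here): on Ubuntu, the update-alternatives mechanism is a cleaner way to switch the default compiler than renaming binaries, roughly along these lines:

```shell
# Register both compilers; the higher priority (70) becomes the auto default
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-4.4 40
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-4.7 70
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-4.4 40
# Interactively select gcc-4.4 before building the MATLAB interfaces
sudo update-alternatives --config gcc
```

The advantage is that switching back after the build is a single --config call, instead of undoing renames in /usr/bin.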

## SSHT/S2LET/FLAG/FLAGLET

If the MATLAB location is correctly set in the respective makefiles and the previous issue with the Mex compiler is resolved, then running the following command should be sufficient:

make matlab

Most codes have demos (e.g. s2let_demo1, flag_demo1, etc.) that you can run in MATLAB once you have added the correct directories to the PATH (e.g. $S2LET/src/main/matlab).

# HEALPix, Matlab and S2LET

The S2LET code was initially intended to use exact harmonic and wavelet transforms based on the MW sampling on the sphere. However, all features were extended to support the HEALPix format, since it is very widespread in the astrophysical sciences. In Matlab, S2LET provides routines for

- Reading/writing Healpix maps
- Plotting the Mollweide projection
- Performing the spherical harmonic transform (interface to the Healpix Fortran code)
- Performing the scale-discretised wavelet transform (interface to the S2LET C code)

The third Matlab demo (s2let_demo3) in S2LET gives a good overview of the HEALPix features. The first step is to read a HEALPix map stored in a FITS file:

% Read data from FITS file
s2let_path = '.'
inputfile = strcat(s2let_path,'/data/somecmbsimu_hpx_128.fits')
[f_ini, nside] = s2let_hpx_read_real_map(inputfile);

The parameters for the spherical harmonic and wavelet transforms are defined by

% Parameters
L = 192;              % Band-limit
B = 3;                % Wavelet parameter
J_min = 2;            % First wavelet scale
J = s2let_jmax(L, B); % Last wavelet scale

The data must be band-limited using the spherical harmonic transform, and the wavelet transform can then be executed for the input parameters:

% Band limit the data
flm = s2let_hpx_map2alm(f_ini, 'L', L);
nside_recon = 128; % In case the resolution must be changed
f = s2let_hpx_alm2map(flm, nside_recon, 'L', L);
% Perform wavelet decomposition
[f_wav, f_scal] = s2let_axisym_hpx_analysis(f, 'B', B, 'L', L, 'J_min', J_min);

Finally, the wavelet coefficients can be plotted using the Mollweide projection:

% Plot
zoomfactor = 1.2;
ns = ceil(sqrt(2+J-J_min+1));
ny = ns - 1;
nx = ns;
figure('Position',[100 100 1300 1000])
subplot(nx, ny, 1);
s2let_hpx_plot_mollweide(f);
campos([0 0 -1]); camup([0 1 0]); zoom(zoomfactor)
title('Initial band-limited data')
subplot(nx, ny, 2);
s2let_hpx_plot_mollweide(f_scal);
campos([0 0 -1]); camup([0 1 0]); zoom(zoomfactor)
title('Scaling fct')
for j = J_min:J
   subplot(nx, ny, j-J_min+3);
   s2let_hpx_plot_mollweide(f_wav{j-J_min+1});
   campos([0 0 -1]); camup([0 1 0]); zoom(zoomfactor)
   title(['Wavelet scale : ', int2str(j-J_min+1)])
end

# Installing SSHT, S2LET, FLAG and FLAGLET

This is a post to guide you through the installation of the various packages required to perform exact wavelet transforms on the sphere and on the ball. The first step is of course to visit this page to receive the previous packages by email. Once they are unzipped, you may follow the simple steps below.

## Step 1 : Installing FFTW and optionally CFITSIO and HEALPIX

SSHT, and thus all the other packages, requires FFTW. The latter can be downloaded here, and the installation should be straightforward using the usual ./configure and sudo make all install commands. If you intend to use the FITS and HEALPix features of S2LET, you should also download and install CFITSIO and HEALPIX. The Fortran HEALPix library must be compiled with consistent Fortran flags; you will have to use the same flags to compile the Healpix Fortran interface in S2LET.

## Step 2 : Creating symbols in Bash or Shell

One only needs to check a few lines in each makefile to be able to build all packages. In brief, the things that potentially need to be adapted are the locations of the dependencies and the compilers with their options. But the very first step, before looking at the makefiles, is to define symbols pointing to the libraries. For example, in Bash you can modify the following pattern and copy it to your ~/.profile or ~/.bashrc:

export FFTW='~/software/fftw'
export SSHT='~/software/ssht'
export S2LET='~/software/s2let'
export FLAG='~/software/flag'
export FLAGLET='~/software/flaglet'

If you intend to use the FITS and HEALPix features of S2LET, you should also define

export CFITSIO='~/software/cfitsio'
export HEALPIX='~/software/healpix'

and make sure the libraries are located in /lib subdirectories.

## Step 3 : Setting up the makefiles

In the makefile for SSHT you must specify the location of FFTW, the C compilers and options, and optionally the Matlab Mex compiler if you intend to use the Matlab interfaces. More details can be found below. On the contrary, in the makefiles for S2LET/FLAG/FLAGLET you should only check the compilers and options, since the makefiles will find the dependencies based on the symbols defined in Bash/Shell.

All packages have Matlab interfaces that use Mex/C functions and must be built using the Matlab Mex compiler. Hopefully only the location of Matlab should need to be specified. For example, on my machine it is

    # Directory for MATLAB
    MLAB = /Applications/MATLAB_R2011b.app

If you work on a recent Mac you should be able to build all libraries by running make all. On Linux you may have to change a few options. In case you need to make these changes, below are the standard settings that work on Mac OS. The standard settings for compiling the C libraries are

    # Compilers and options for C
    CC = gcc
    OPT = -Wall -O3 -g

If you want to use HEALPix in S2LET, the following flags must agree with those used to build HEALPix:

    # Compilers and options for Fortran
    FCC = gfortran
    OPTF90 = -O3 -ffree-form
    # To be defined if LGFORTRAN cannot be found in the path
    # GFORTRANLIB = /sw/lib/gcc4.6/lib
    # Flags for Healpix
    HPXOPT = -lgfortran -DGFORTRAN -fno-second-underscore -fopenmp

The following options will be used to compile the dynamic library required for the IDL interface in S2LET:

    # Config for dynamic library
    ifeq ($(UNAME), Linux)
    DYLIBEXT = so
    DYLIBCMD = cc -flat_namespace -undefined suppress
    endif
    ifeq ($(UNAME), Darwin)
    DYLIBEXT = dylib
    DYLIBCMD = g++ -flat_namespace -dynamiclib -undefined suppress
    endif

## Step 4 : Installing the libraries and interfaces

The packages must be built in this order: SSHT, S2LET, FLAG, FLAGLET. With valid dependencies and options, the command make lib should build the C library in every package. If the compilation was successful, you should check the exactness of each transform, for example by running

$S2LET/bin/s2let_test
$FLAG/bin/flag_test
$FLAGLET/bin/flaglet_test

The command

make all

will build the C library, the Matlab interfaces and the various high-level programs provided in every package.
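Putting Steps 2 to 4 together, the whole build can be scripted; a minimal sketch in Bash, assuming the environment symbols from Step 2 are set and point at the source trees:

```shell
# Build the four packages in dependency order (SSHT must come first).
build_order="SSHT S2LET FLAG FLAGLET"
for pkg in $build_order; do
  dir=$(eval echo "\$$pkg")        # resolve e.g. SSHT -> $SSHT
  if [ -d "$dir" ]; then
    make -C "$dir" lib || exit 1   # 'make lib' builds the C library
  else
    echo "skipping $pkg (directory not set or missing)"
  fi
done
```

The per-package test binaries (s2let_test, flag_test, flaglet_test) can then be run as shown above to verify the transforms.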

# [1211.1680] S2LET: A code to perform fast wavelet analysis on the sphere

http://arxiv.org/abs/1211.1680

@article{leistedt:s2let_axisym,
author = {B.~Leistedt and J.~D.~McEwen and P.~Vandergheynst and Y.~Wiaux},
title = {S2LET: A code to perform fast wavelet analysis on the sphere},
journal = {Astron. \& Astrophys.},
year = {submitted, 2012},
pubtype = {submitted},
eprint = {arXiv:1211.1680},
pdf = {papers/s2let_axisym.pdf}
}


# [1205.0792] Exact Wavelets on the Ball

http://arxiv.org/abs/1205.0792

@ARTICLE{leistedt:flaglets,
author = {B. Leistedt and J. D. McEwen},
title = "{Exact Wavelets on the Ball}",
primaryClass = "cs.IT",
journal = {IEEE Trans. Sig. Proc.},
volume = {60},
number = {12},
year = {2012},
doi = {10.1109/TSP.2012.2215030},
pubtype = {refereed},
archivePrefix = "arXiv",
eprint = {1205.0792}
}

# The group website is out!

A few months ago my boss and I started to think that our group deserved a nice webpage. After all, visibility is an important factor for jobs, grants and outreach! We met on a regular basis to explore the immensity of the internet looking for nice blogs and websites, and we finally came up with something.

I am pleased to announce that the webpage of our group of early universe researchers at University College London is finally out!

www.earlyuniverse.org

On this platform we will post news, tutorials, codes and various resources to make our work visible and accessible at various levels of complexity.

# Memo: improved ‘top’ command

This command is actually very useful: it sorts processes by CPU usage and skips top's expensive framework and memory-region statistics, so it runs much faster on OS X:

top -F -R -o cpu

# Memo: compiling with Intel MKL

I find Intel’s Math Kernel Library (MKL) extremely powerful, but I very often struggle to find the right compilation flags.

MKL provides a wide variety of C/C++/Fortran interfaces to LAPACK, BLAS and FFTW. Multiple platforms and systems are supported (which is great!), but consequently one must carefully choose which libraries to link against (otherwise everything crashes, of course).

If you’re like me and you struggle every time you have to compile your program, then go to this page:

Hopefully this should help!
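For reference, the kind of link line the advisor produces for a common configuration (64-bit Linux, GNU compilers, LP64 interface, OpenMP threading) looks roughly like this; every path and library name below is an assumption to adapt to your own setup:

```shell
# Illustrative only -- generate the real line with Intel's advisor.
MKLROOT=/opt/intel/mkl   # assumed install location
MKL_LINK="-L${MKLROOT}/lib/intel64 -lmkl_intel_lp64 -lmkl_gnu_thread -lmkl_core -lgomp -lpthread -lm -ldl"
# Typical usage:
#   gcc myprog.c -m64 -I${MKLROOT}/include ${MKL_LINK} -o myprog
```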

# Wavelets on the sphere through harmonic tiling

The wavelet kernels, defined on the sphere through harmonic tiling, are convolved with the signal (using a rotation operator).

They have compact support in both real and harmonic space, so that the extracted wavelet coefficients retain the localised information (unlike the Fourier transform). Here the wavelet parameter is 2 and the resolution is 128, so that the transform is orthogonal and complete if one uses the kernels on the left. The signal is projected onto these kernels and no information is lost.
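The exactness relies on the way the kernels tile harmonic space; a sketch of the admissibility condition, with notation loosely following the S2LET paper (the precise normalisation lives there):

```latex
% The scaling kernel \Phi and the wavelet kernels \Psi^{(j)} must
% satisfy, for every multipole \ell below the band-limit L,
\left|\Phi_{\ell}\right|^2 \;+\; \sum_{j=J_0}^{J} \left|\Psi^{(j)}_{\ell}\right|^2 \;=\; 1,
% so that every harmonic coefficient of the signal is preserved across
% scales and the inverse transform is exact.
```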

# Wavelet decomposition of 3D seismological Earth model

This is an illustration of the exact 3D Flaglets described in Leistedt & McEwen (2012).

Wavelets are designed to extract scale-dependent features simultaneously in harmonic space and pixel space. In this example I consider Ritsema’s model of shear wavespeed perturbations in the Earth’s mantle (see e.g. Ritsema’s website and Frederik Simons’ website and publications).

The model supplies spherical harmonic coefficients in the angular dimension and radial spline coefficients in the depth dimension to define a signal on the ball, which I band-limit. Ritsema’s model does not contain a lot of structure at the smallest scales but essentially contains large-scale features, as shown in Fig. 1 (namely the initial data and its wavelet decomposition).

Fig. 1 - Wavelet decomposition of 3D seismological Earth model

Because the model mostly contains large-scale features in wavelet space, it is a good candidate for wavelet denoising. We use the method described in Leistedt & McEwen (2012), i.e. hard thresholding based on a simple noise model.

As shown in Fig. 2, realistic noise usually has small-scale structure and is consequently captured by specific wavelet coefficients that do not overlap much with the informative signal. The noise model provides the thresholding rule, which in this case is quite efficient for the reason cited previously.

Fig. 2 - Wavelet denoising
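The hard-thresholding rule can be written compactly; a sketch, where $W^{j}(\omega)$ is the wavelet coefficient at scale $j$ and $T_j$ is the scale-dependent threshold derived from the noise model:

```latex
\hat{W}^{j}(\omega) =
\begin{cases}
W^{j}(\omega) & \text{if } |W^{j}(\omega)| \geq T_j, \\
0             & \text{otherwise.}
\end{cases}
```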

# Earth tomography and wavelets

This is an illustration of the S2LET package. We analyse and decompose an Earth tomography map into scale-dependent maps using axisymmetric wavelets. The two sets of maps below are completely equivalent, since the multiresolution algorithm is exact (see explanation below).

Earth tomography: multiresolution wavelet analysis with s2let

Earth tomography: full resolution wavelet analysis with s2let

## Exact harmonic transform

We use the exact sampling theorem and the related spherical harmonic transform described in McEwen & Wiaux 2011.

Roughly speaking, one can only perform an exact transform with a sampling scheme on the sphere that guarantees that all the information (at floating point precision) is captured in a finite set of samples. In practice, if the decomposition is L-band-limited, i.e. $f_{\ell m} = 0 \ \forall \ell \geq L$, one only needs $2L(L-1)+1$ samples on the sphere to perform the following integral exactly (using Gaussian quadrature, described in the above-cited paper): $f_{\ell m} = \int_{S^2} d\omega \, f(\omega) Y^*_{\ell m}(\omega)$, where the $Y_{\ell m}(\omega)$’s are the spherical harmonics. As a consequence, one can restrict the reconstruction to these points.

Since this usually doesn’t look good on a plot (these samples are the minimal nodes you need to store the information contained in your map), it is advisable to oversample the field on a much larger number of samples, so that it looks continuous (but remains band-limited). This amounts to treating the band-limit as if it were greater. The field is reconstructed using $f(\omega) = \sum_{\ell=0}^{L-1} \sum_{m=-\ell}^{\ell} f_{\ell m} Y_{\ell m}(\omega)$.

## The transform is exact

The initial map is exactly decomposed into four wavelet scales plus a scaling part, and can be reconstructed at floating point precision. Wiaux, McEwen et al. describe the procedure to extract the wavelet scales from the spherical harmonic coefficients $f_{\ell m}$. Large-scale wavelets have lower band-limits, which enables the multiresolution algorithm described below. Only the finest scales have band-limit $L$.

## Multiresolution analysis

As an example of the oversampling remark, notice that the maps in the two plots above are completely equivalent. Because each wavelet scale has a band-limit different from (lower than) that of the main map, the first set of maps uses the minimal number of samples for each wavelet scale, whereas the second oversamples all maps onto the same large number of samples. However, each pair of maps (the low-resolution one and the oversampled one) contains the same information.

# Extract data from ArXiV

Here is a piece of code I wrote in Scala to extract data from ArXiV. The inputs are author and category names. Plugging the output into any graph library can give pretty amazing collaboration graphs.

import scala.xml._
import java.net.{URLConnection, URL}
import java.io._

object arxivio {

  // Fetch a URL and parse the response as XML
  def loadURL(s: String): Elem = {
    val url = new URL(s)
    val conn = url.openConnection
    XML.load(conn.getInputStream)
  }

  // Convenience overload that does not save the raw XML to disk
  def loadMonthCatalogue(cat: String, yrmonth: String): NodeSeq =
    loadMonthCatalogue(cat, yrmonth, false)

  // Query the ArXiV API for the entries of a category submitted in a
  // given window (yrmonth in "YYYYMM" format), optionally saving the
  // raw XML under data/
  def loadMonthCatalogue(category: String, yrmonth: String, save: Boolean): NodeSeq = {
    val startdate = yrmonth + "01"
    val enddate = yrmonth + "04"
    val url = "http://export.arxiv.org/api/query?search_query=submittedDate:[" +
      startdate + "0000+TO+" + enddate + "2359]+AND+cat:" + category +
      "&start=0&max_results=10000&sortBy=submittedDate&sortOrder=descending"
    val page = loadURL(url) \ "entry"
    if (save) {
      val outputfile = "data/" + category + "-" + yrmonth + ".xml"
      val out = new BufferedWriter(new FileWriter(outputfile))
      try {
        out.write(page.toString)
      } finally {
        out.close()
      }
    }
    page
  }
}