What is paleoclimate research?
It is well established that the Earth formed about 4.6 billion years ago. From the birth of the Earth to the present, environmental changes, such as the mass extinction of the dinosaurs at the end of the Cretaceous, have had a great impact on the organisms living on the Earth at the time. In recent years there has also been much discussion of global warming caused by human influence. Certainly, the CO2 emitted into the atmosphere by humans has increased far more rapidly since the Industrial Revolution than before it, and the theory that humans are accelerating global warming is well supported. However, looking back over the Earth's 4.6-billion-year history, we find cycles of periods when the Earth's temperature decreased (glacial periods) and periods when it increased relative to the glacial periods (interglacial periods). These cycles are known as Milankovitch cycles, and they arise because changes in the Earth's orbit around the Sun change the Earth's environment. Thus, to think about the Earth's future, it is important to understand how the Earth's environment has changed in the past (for example, whether sea temperatures rose or fell) and for what reason or reasons.
Paleoclimate models can reproduce such past climate changes by numerical simulation on a computer.
Paleoclimate model (LOVECLIM)
LOVECLIM is an EMIC (Earth system Model of Intermediate Complexity) developed in Belgium. It is often used for paleoclimate research, is available free of charge, and can integrate over 10,000 years on an ordinary PC.
In Japan, an EMIC is developed and operated by the Meteorological Research Institute of the Japan Meteorological Agency (JMA), but it is not available to individuals because it requires a license.
Therefore, I would like to walk through installing LOVECLIM, which is provided free of charge, in a Linux environment and visualizing its output with GrADS, a tool used mainly in the field of oceanography.
Installation & Experiments
· Preparing the files to use
Download LOVECLIM1.3.tar.gz, NetCDF-4.1.3.tar, and udunits-1.12.11.tar from the official LOVECLIM website, and hdf5-1.10.4.tar.gz from the Source-Gzip link on the official HDF Group website.
(The README in the Loveclim_V1.3 directory, which is created when you unpack Loveclim, lists other necessary libraries, but we will focus on these three libraries as examples.)
NetCDF, udunits, and hdf5 are libraries: collections of functions and related files that LOVECLIM needs in order to run.
Next, create a working directory for Loveclim in your home directory, and move Loveclim_V1.3, the directory extracted from LOVECLIM1.3.tar.gz, into it with the mv command.
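For example, assuming the archive was downloaded to your home directory and the working directory is called LOVECLIM (both names are just illustrative), the steps look roughly like this:
$ cd ~
$ mkdir LOVECLIM
$ tar xzf LOVECLIM1.3.tar.gz      # creates the Loveclim_V1.3 directory
$ mv Loveclim_V1.3 LOVECLIM/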
· Library setup
Extract the downloaded NetCDF-4.1.3.tar, udunits-1.12.11.tar, and hdf5-1.10.4.tar.gz, create a lib directory in your home directory, and move netcdf-4.1.3, udunits-1.12.11, and hdf5-1.10.4 into it. (I did this installation without superuser privileges; if you have package-management software such as MacPorts or Homebrew installed, you can also install these libraries that way.)
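As a rough sketch (directory names as above; the exact archive names may differ slightly on your system):
$ mkdir ~/lib
$ tar xf NetCDF-4.1.3.tar
$ tar xf udunits-1.12.11.tar
$ tar xzf hdf5-1.10.4.tar.gz
$ mv netcdf-4.1.3 udunits-1.12.11 hdf5-1.10.4 ~/lib/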
Now let's install the libraries, using udunits-1.12.11 as an example. Move into udunits-1.12.11 under the lib directory and work in its src directory.
The first step is to edit the configure script. Open configure with any editor you have installed (vi, emacs, gedit, etc.) and search for the string prefix.
ac_default_prefix=/usr/local
You should find the line above. Change /usr/local after the = to the /home/lib/udunits-1.12.11 directory you just created. Editing this value specifies the directory where the library will be installed.
Next, save and close configure, and run it. The commands are described in detail in the README in Loveclim_V1.3; for udunits-1.12.11, the command is as follows:
CC=icc CFLAGS='-Df2cFortran -fPIC' ./configure --prefix=/opt/udunits
Change the prefix here from /opt/udunits to /home/lib/udunits-1.12.11, as before, and run the command. Then build and install the library (typically make followed by make install; see the README for the exact steps). If the following directories are created at the same level as the src directory, the installation is complete.
udunits-1.12.11]$ ls
bin etc include lib man src
You can also install netcdf-4.1.3 and hdf5-1.10.4 in the same way.
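Putting the udunits steps together, the sequence looks roughly like this. The compiler and flags are the README example quoted above; the prefix corresponds to the /home/lib/udunits-1.12.11 path used in the text (written here as $HOME/lib/udunits-1.12.11), and the usual make / make install step is assumed — check the README for the exact commands for each library:
$ cd ~/lib/udunits-1.12.11/src
$ CC=icc CFLAGS='-Df2cFortran -fPIC' ./configure --prefix=$HOME/lib/udunits-1.12.11
$ make
$ make install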
· LOVECLIM setup & execution
After all the libraries are installed, the next step is to set up Loveclim.
First, navigate from your home directory to the ref directory in Loveclim_V1.3.
home/Loveclim_V1.3/RUN/V1.3/expdir/ref
Now open make.macros in your editor and edit it.
The two places to edit are near the bottom of make.macros, shown below.
# NETCDFINCLUDE = -I/opt/netcdf/include
# NETCDFLIB = -L/opt/netcdf/lib -lnetcdf -lnetcdff
#UDUNITSINCLUDE = -I/opt/udunits/include
#UDUNITSLIB = -L/opt/udunits/lib -ludunits
For the upper two lines, specify the include and lib directories created by the netcdf-4.1.3 installation, e.g. home/lib/netcdf-4.1.3/include and home/lib/netcdf-4.1.3/lib. In the same way, for the lower two lines, specify the include and lib directories of the udunits you installed. (If these lines are commented out with a leading #, remove the # so that they take effect.)
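With the paths assumed in this article, the edited lines would look roughly like this (adjust them to wherever you actually installed the libraries):
NETCDFINCLUDE = -I/home/lib/netcdf-4.1.3/include
NETCDFLIB = -L/home/lib/netcdf-4.1.3/lib -lnetcdf -lnetcdff
UDUNITSINCLUDE = -I/home/lib/udunits-1.12.11/include
UDUNITSLIB = -L/home/lib/udunits-1.12.11/lib -ludunits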
Next, go to the expdir directory.
home/Loveclim_V1.3/RUN/V1.3/expdir
Now open exp.peram with your editor and edit it. This file lets you adjust the parameters for running Loveclim. Specifically, you need to edit the following three sections.
Experiment Name is the name of the experiment, and it also determines the name of the directory where the results of the run are output. It can be anything; I chose nyc.
############### Experiment Name ################
texp=nyc
# texp=control
Next is the Scratch directory shown below, which specifies the output destination of the executable file that will be needed later.
Scratchdir=/home/Loveclim_V1.3/RUN/V1.3/scratch
Next, Location for saving of outputs files, parameters and initial conditions specifies the directory where the results are saved; I pointed it at a directory named result for clarity.
SavingPath=/home/Loveclim_V1.3/result
Location where nco binary can be found and Location where cdo binary can be found specify the locations of the installed hdf5-1.10.4 and udunits-1.12.11 libraries.
ncodir="/home/lib/hdf5-1.10.4/lib"
cdodir="/home/lib/udunits-1.12.11/lib"
################# DIRECTORIES ##################
#Scratch directory
Scratchdir=/scratch/${USER}
#Location for saving of outputs files, parameters and initial conditions"
SavingPath=/storage/${USER}/STORAGE
#Location where nco binary can be found
ncodir="/opt/nco/bin"
#Location where cdo binary can be found
cdodir="/opt/cdo/bin"
Finally, let's set the run parameters. In this experiment I changed Number of years and days, setting lengthInDays=36000, which corresponds to 100 model years at 360 days per year. There are many other parameters that can be adjusted; for example, Starting year and day sets the year in which the experiment starts.
################ RUN PARAMETER #################
#Starting year and day
start_y=1950; start_d=1
initdate=0
#Number of years and days (1yr=360days 1500yr=540000days)
lengthInDays=36000
#Number of run
NbRun=AUTO
#Multi simulations (Number of members)
iens=0
## debug mode = 1 -> scratch stay in place after run
debug_mode=1
When you have finished editing, save and close exp.peram and run it with the command ./newexp exp.peram.
When the job finishes successfully, the nyc directory you specified earlier will be created in the same directory as exp.peram. Next, go into the nyc directory and run launch_r1 with ./launch_r1.
When that job finishes successfully, move to the scratch directory you specified earlier:
home/Loveclim_V1.3/RUN/V1.3/scratch/LOVECLIM1.3/nyc/repo
If an executable file named emic.x is created in this directory, Loveclim is ready to use.
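To recap, the run sequence up to this point looks roughly like this (paths as assumed above):
$ cd /home/Loveclim_V1.3/RUN/V1.3/expdir
$ ./newexp exp.peram
$ cd nyc
$ ./launch_r1
$ cd /home/Loveclim_V1.3/RUN/V1.3/scratch/LOVECLIM1.3/nyc/repo
$ ls    # emic.x should appear here if everything succeeded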
If you get an error after this step, the most likely cause is that a path is set incorrectly and the program cannot find a library, so double-check your edits. It is also possible that LD_LIBRARY_PATH is not set properly; type "echo $LD_LIBRARY_PATH" to check it.
If the library path in $LD_LIBRARY_PATH is wrong, open the .bashrc file in your home directory with an editor and edit it as follows:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/local/bin/lib:$HOME/lib/netcdf-4.1.3/lib
(Here /local/bin/lib is just an example of an entry that may already be present; the $HOME/lib/netcdf-4.1.3/lib part is the important one.)
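A minimal sketch of checking and applying the change (the netCDF path is the one assumed throughout this article; add the udunits and HDF5 lib directories in the same way if they are also missing):
$ echo $LD_LIBRARY_PATH                # check the current setting
$ echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HOME/lib/netcdf-4.1.3/lib' >> ~/.bashrc
$ source ~/.bashrc                     # reload .bashrc in the current shell
$ echo $LD_LIBRARY_PATH                # the new entry should now appear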
· Setup to read experimental results in GrADS
If emic.x has been created by the setup so far, running ./emic.x will write results to the result directory, but GrADS cannot open those result files as they are because of how the calendar (the number of days in a year) is recorded in them. We therefore need to edit one setting in Loveclim. Go to the sources directory in Loveclim_V1.3 and find the source file containing the following lines.
c time
status=nf_def_var(ncid,'time',NF_DOUBLE,1,start,varid)
status=nf_put_att_text(ncid,varid,'calendar',7,'360_day')
Change '360_day' to '365_day' in the line status=nf_put_att_text(ncid,varid,'calendar',7,'360_day'), so that it becomes:
status=nf_put_att_text(ncid,varid,'calendar',7,'365_day')
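If you are not sure which source file contains this attribute, a recursive grep from the top of Loveclim_V1.3 should find it (the path below is just the working-directory layout assumed in this article):
$ cd /home/Loveclim_V1.3
$ grep -rn "360_day" .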
After editing, the change only takes effect once emic.x has been rebuilt (re-running ./launch_r1 should rebuild it); then run it from the scratch directory created earlier with ./emic.x.
If there are no problems, the experimental results will be written to the specified result directory. The two .nc files below are the output, where CLIO3a... contains annual means and CLIO3m... contains monthly means.
/Loveclim_V1.3/result/LOVECLIM1.3/nyc/repo/output/ocean
$ ls
CLIO3a.001950_001.nc CLIO3m.001950_001.nc
Drawing the results in GrADS
GrADS is widely used, mainly in physical oceanography, largely because it is simple to use and draws figures with reasonably high precision. Download the .gz archive from the official site and install it.
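As a rough sketch of the installation (the archive name below is only a placeholder for whichever GrADS build you downloaded, and the install location is arbitrary):
$ tar xzf grads-<version>-bin-<platform>.tar.gz -C ~/
$ export PATH=$PATH:~/grads-<version>/bin    # add this line to ~/.bashrc to make it permanent
$ grads    # start GrADS once to confirm the installation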
The .nc files created by the experiment cannot be read by GrADS as they are. Normally you read a netCDF file in GrADS with the sdfopen command, but here it gives the following error.
ga-> sdfopen CLIO3a.001950_001.nc
Scanning self-describing file: CLIO3a.001950_001.nc
SDF Error: 365 day calendars are no longer supported by sdfopen.
To open this file with GrADS, use a descriptor file with
a complete TDEF entry and OPTIONS 365_day_calendar.
Documentation is at http://iges.org/grads/gadoc/SDFdescriptorfile.html
The error message says that sdfopen no longer supports this calendar and that a descriptor file is required. So let's first use the ncdump command to look at the contents of CLIO3a.001950_001.nc.
$ ncdump CLIO3a.001950_001.nc |less
dimensions:
ptlon = 120 ;
pulon = 120 ;
ptlat = 65 ;
pulat = 65 ;
tdepth = 20 ;
wdepth = 21 ;
wedges = 22 ;
corners = 4 ;
sflat = 57 ;
sfdepth = 22 ;
sfedges = 23 ;
basidx = 4 ;
ptlonp = 121 ;
pulonp = 121 ;
ptlatp = 66 ;
pulatp = 66 ;
time = UNLIMITED ; // (29 currently)
variables:
double ptlon(ptlon) ;
ptlon:units = "degrees_east" ;
ptlon:long_name = "pseudo tracer grid longitude" ;
ptlon:modulo = "m" ;
ptlon:topology = "circular" ;
double pulon(pulon) ;
pulon:units = "degrees_east" ;
pulon:long_name = "pseudo momentum grid longitude" ;
pulon:modulo = "m" ;
pulon:topology = "circular" ;
We can now see the contents of the .nc file and the dimensions of the data.
Now that we know the file structure, we will create the GrADS control file.
· Creating a .xctl (control) file
To read a netCDF file in GrADS you normally use the sdfopen command, as described above. A netCDF file is self-describing: its header stores details about the data it contains (coordinate values in x, depth values, time values, and so on), and GrADS uses this information to draw figures. The same information can also exist as a standalone file, called a .ctl (control) file; in that case you open it in GrADS with the open command, and the .ctl file points to the netCDF (.nc) file that holds the actual data. A .xctl file is a control file that supplies only the minimal extra information GrADS needs when reading the data, so that is what we will create here.
Now let's create the .xctl file used to draw the figure.
The items to be filled in are DSET (the data to read), TITLE (title), DTYPE (data format), UNDEF (the missing-data value), XDEF (x-axis variable), YDEF (y-axis variable), ZDEF (z-axis variable), and TDEF (time axis). Once this information is specified, GrADS can read the data correctly, and you can pick it up from the ncdump output shown above. I have uploaded the .xctl file I used here. The file name is 'dclio-taxpypzp.xctl'.
dclio-taxpypzp.xctl
File Edit Options Buffers Tools Help
DSET ^CLIO3a.%ch_001.nc
TITLE tracer-related quantities in the original coordinates of CLIO
* Type 'xdfopen dclio-taxpypzp.xctl' from Grads prompt
* For details, try the following commands:
* q file
* q dims
* q ctlinfo
*
DTYPE netcdf
*
OPTIONS 365_day_calendar TEMPLATE
CHSUB 1 100 001950
CHSUB 101 200 002050
CHSUB 201 300 002150
CHSUB 301 400 002250
CHSUB 401 500 002350
CHSUB 501 600 002450
CHSUB 601 700 002550
CHSUB 701 800 002650
CHSUB 801 900 002750
CHSUB 901 1000 002850
*
UNDEF -1.0e+32 missing_value _FillValue
XDEF ptlon
YDEF ptlat
ZDEF tdepth
TDEF time 99999 LINEAR 01JAN1950 1yr
· Drawing diagrams with .xctl files
Let's open the created .xctl file with GrADS.
After launching GrADS, use the xdfopen command to open the .xctl file, then type set gxout shaded followed by d sst to draw a figure. SST stands for sea surface temperature, a basic physical quantity in oceanography.
ga-> xdfopen filename.xctl
Scanning Descriptor File: filename.xctl
SDF file CLIO3a.001950_001.nc is open as file 1
LON set to 0 360
LAT set to -79.5 112.5
LEV set to -5126.18 -5126.18
Time values set: 1950:1:1:0 1950:1:1:0
E set to 1 1
ga-> set gxout shaded
ga-> d sst
Contouring: -0 to 30 interval 3
ga->
The resulting diagram is shown in the next figure.
The figure was created successfully, but something looks off around 60°N.
So we will try to overlay the latitude and longitude built into the data. The commands are d tlon and d tlat.
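A minimal sketch of the overlay, continuing the GrADS session above (set gxout contour is my assumption for drawing the coordinate lines on top of the shaded SST; subsequent d commands overlay on the existing plot until you type clear):
ga-> set gxout contour
ga-> d tlon
ga-> d tlat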
From this figure, we can see that there is a discrepancy between the results from the model and the grid we are depicting.
In the next section, we will try to correct this discrepancy.
Fixing the drawing grid
As shown in the figure above, the model output contains both the SST values and their latitude/longitude coordinates, so the idea behind fixing the drawing grid is to place the latitude/longitude data contained in the output at the correct positions on the global map.
We will write a Fortran program to do this. I have uploaded the script I created, and I will explain the key points.
The program is named mkpdef. First, it reads tlon, tlat, and angle from the .nc file.
program mkpdef
implicit none
include 'netcdf.inc'
! original model grid in the netcdf output of CLIO
real(8),dimension(1:120,1:65) :: ntlon, ntlat, nurot
! same as above except for attachment of cyclic buffers
real(4),dimension(0:121,1:65) :: mtlon, mtlat, murot
! mapping information from original- and target-grid sets
real(4),dimension(1:120,1:57) :: gi,gj,gd,gr
real(4) :: dist, mindist, lon, lat, a, b, clat, clon, crot
integer :: i,j, ii, jj, ci,cj
integer :: err, id_file, id_tlon, id_tlat, id_urot
err = nf_open('CLIO3a.001950_001.nc', NF_NOWRITE, id_file)
write(*,*) 'step 1', err, id_file
err = nf_inq_varid(id_file, 'tlon', id_tlon)
write(*,*) 'step 2', err, id_tlon
err = nf_inq_varid(id_file, 'tlat', id_tlat)
write(*,*) 'step 3', err, id_tlat
err = nf_inq_varid(id_file, 'angle', id_urot)
write(*,*) 'step 4', err, id_urot
err = nf_get_var_double(id_file, id_tlon, ntlon)
write(*,*) 'step 5', err, ntlon(1,1), ntlon(120,1), ntlon(1,65), ntlon(120,65)
err = nf_get_var_double(id_file, id_tlat, ntlat)
write(*,*) 'step 6', err, ntlat(1,1), ntlat(120,1), ntlat(1,65), ntlat(120,65)
err = nf_get_var_double(id_file, id_urot, nurot)
write(*,*) 'step 7', err, nurot(1,1), nurot(120,1), nurot(1,65), nurot(120,65)
The next step is to map each point to its correct position by computing the distance (dist) between the latitude/longitude of the model result (the search points) and the latitude/longitude of the target grid onto which we project (the target points). The key point is that the search points are taken on a grid ten times finer than the model grid, obtained by bilinear interpolation; this is simply done to improve accuracy.
do jj = 1, 57                              ! target spherical grid (latitude)
do ii = 1, 120                             ! target spherical grid (longitude)
lon = (ii-1)*3.0e0+1.5                     ! target point on the regular 3-degree spherical grid
lat = (jj-1)*3.0e0-79.5e0                  ! target point on the regular 3-degree spherical grid
mindist = 400.0d0
do j = 1, 641                              ! model-result grid refined 10x (latitude, search points)
do i = 1, 1200                             ! model-result grid refined 10x (longitude, search points)
ci = int(1 + (i-1)*0.1)                    ! integer model grid index (corner of the enclosing grid box)
cj = int(1 + (j-1)*0.1)                    ! integer model grid index (corner of the enclosing grid box)
a = (i-1)*0.1-int((i-1)*0.1)
b = (j-1)*0.1-int((j-1)*0.1)
clon = mtlon(ci,cj)*(1.0-a)*(1.0-b) + mtlon(ci+1,cj)*a*(1.0-b) + &   ! search point in spherical coordinates
mtlon(ci,cj+1)*(1.0-a)*b + mtlon(ci+1,cj+1)*a*b
clat = mtlat(ci,cj)*(1.0-a)*(1.0-b) + mtlat(ci+1,cj)*a*(1.0-b) + &   ! search point in spherical coordinates
mtlat(ci,cj+1)*(1.0-a)*b + mtlat(ci+1,cj+1)*a*b
crot = murot(ci,cj)*(1.0-a)*(1.0-b) + murot(ci+1,cj)*a*(1.0-b) + &
murot(ci,cj+1)*(1.0-a)*b + murot(ci+1,cj+1)*a*b
dist = min((clon-lon)**2+(clat-lat)**2, &
(clon-lon-360.0d0)**2+(clat-lat)**2)       ! distance (closeness) between target point and search point
if (dist < mindist) then
mindist = dist
gi(ii,jj) = 1 + (i-1)*0.1e0
gj(ii,jj) = 1 + (j-1)*0.1e0
gr(ii,jj) = crot
gd(ii,jj) = mindist
endif
if(gi(ii,jj) == 120.0e0) then
gi(ii,jj) = 119.9999e0
endif
enddo
enddo
write(*,*) ii,jj, gi(ii,jj), gj(ii,jj), gd(ii,jj), gr(ii,jj)
enddo
enddo
That is enough of an overview of the Fortran program; let's actually run it and correct the grid.
The Fortran program I used is available here. The build and run commands are described in the Makefile, so move both files into the same directory as the model-result .nc file and execute make execp.
Note that the program is compiled with ifort (Intel Fortran, which students can install for free), and that you need to edit the netCDF location in the Makefile.
Open the Makefile in your editor. You will find the following section:
# change MYINC and MYLIB depending on your unix environment
MYINC = /opt/local/include
MYLIB = /opt/local/lib
Change MYINC and MYLIB to the include and lib directories of the netCDF you installed, for example:
MYINC = /home/lib/netcdf-4.1.3/include
MYLIB = /home/lib/netcdf-4.1.3/lib
Run "make execp", and after the executable named execp is created, run . /execp. Then, a new file dclio-xpyp.pdef will be created. This file will be the data file that replaces the latitude and longitude from the model results with the correct placement.
Drawing the corrected data
The control file we use this time is not the .xctl file we created earlier, but a .ctl file in which the z-direction variable is replaced with p and which refers to the .pdef file. Here is the .ctl file I used.
dclio-taxpypzp.ctl
DSET ^CLIO3a.%ch_001.nc
*
TITLE tracer-related quantities mapped onto standard lat-lon coordinates
* Type 'open dclio-taxpypzp.ctl' from Grads prompt
* For details, try the following commands:
* q file
* q dims
* q ctlinfo
*
DTYPE netcdf
PDEF 120 65 bilin stream binary-big ^dclio-xpyp.pdef
*
OPTIONS TEMPLATE
CHSUB 1 100 001950
CHSUB 101 200 002050
CHSUB 201 300 002150
CHSUB 301 400 002250
CHSUB 401 500 002350
CHSUB 501 600 002450
CHSUB 601 700 002550
CHSUB 701 800 002650
CHSUB 801 900 002750
CHSUB 901 1000 002850
*
UNDEF -1.0e+32 missing_value _FillValue
XDEF 120 LINEAR 1.5000 3
YDEF 57 LINEAR -79.5000 3
ZDEF 20 LEVELS -5126.18 -4385.22 -3661.11 -2963.25 -2307.36 -1717.9 -1225.11 -850.19 -588.88 -415.07 -299.29 -219.86 -163.28 -121.52 -89.75 -64.96 -45.2 -29.17 -15.98 -5
TDEF 99999 LINEAR 01JAN1950 1yr
vars 26
tlon=>tlon 0 y,x longitude
tlat=>tlat 0 y,x latitude
dxs1=>dxs1 0 y,x zonal length of grid box sides
dxs2=>dxs2 0 y,x meridional length of grid box sides
dxc1=>dxc1 0 y,x zonal width at grid box centres
dxc2=>dxc2 0 y,x meridional width at grid box centres
area=>area 0 y,x surface area of tracer grid box
tmask=>tmask 0 y,x horizontal tracer grid mask
h=>h 0 y,x bathymetry on tracer gridpoints
temp=>temp 20 t,z,y,x averaged potential temperature
salt=>salt 20 t,z,y,x averaged salinity
ssh=>ssh 0 t,y,x averaged sea surface height
sst=>sst 0 t,y,x averaged SST
sss=>sss 0 t,y,x averaged sea surface salinity
shflx=>shflx 0 t,y,x averaged surface heat flux
sfflx=>sfflx 0 t,y,x averaged surface freshwater flux
zmix=>zmix 0 t,y,x averaged depth of ocean surface mixed layer
zcnv=>zcnv 0 t,y,x averaged depth of convection
msl=>msl 0 t,y,x averaged G-M slope
hice=>hice 0 t,y,x averaged ice thickness
hicp=>hicp 0 t,y,x averaged ice production
albq=>albq 0 t,y,x averaged lead fraction
hsn=>hsn 0 t,y,x averaged snow thickness
snow=>snow 0 t,y,x averaged snow precipitation
tice=>tice 0 t,y,x averaged ice temperature
fb=>fb 0 t,y,x averaged heat flux at ice base
endvars
After preparing the .ctl file, open it in GrADS with the open command, then run set gxout shaded followed by d sst to produce the following figure.
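The session looks like this (file name as above):
ga-> open dclio-taxpypzp.ctl
ga-> set gxout shaded
ga-> d sst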
We can now create the figure shown at the beginning of this article.
As a check, let's overlay the latitude and longitude with the commands d tlon and d tlat.
The region north of 60°N is now placed correctly.
This concludes my introduction to drawing LOVECLIM SST output in GrADS.
I plan to keep introducing topics of interest related to oceanography like this in the future.