File:World map 200ma krita 2.png

From Wikimedia Commons, the free media repository

Original file (1,600 × 800 pixels, file size: 814 KB, MIME type: image/png)

Captions


World map 200 Ma

Summary

Description: English: World map 200 Ma
Date:
Source: Own work
Author: Merikanto

This image is based on PaleoDEM and ExoPlaSim. The ExoPlaSim output is downscaled with R and Python scripts to produce the channels of an RGB image.

CO2 is 1200 ppm, with current orbital parameters.

ExoPlaSim uses very basic parameters:

a_eccentricity1=0.0167022
a_obliquity1=23.441
a_lonvernaleq1=102.7
a_pCO21=1200.0e-6
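
A minimal sketch of how these parameters could be passed to the exoplasim Python package. The work directory, model name, resolution, run length and output names below are illustrative assumptions, not the actual settings used for this map:

import exoplasim

# Illustrative configuration only; only the four orbital/CO2 values above
# come from this page, everything else is a placeholder.
model = exoplasim.Model(workdir="earth200ma", modelname="earth200ma",
                        ncpus=4, resolution="T21", outputtype=".nc")
model.configure(eccentricity=0.0167022,   # a_eccentricity1
                obliquity=23.441,         # a_obliquity1
                lonvernaleq=102.7,        # a_lonvernaleq1
                pCO2=1200.0e-6)           # a_pCO21, i.e. 1200 ppm CO2
model.exportcfg()
model.run(years=50, crashifbroken=True)   # run length is a placeholder
model.finalize("earth200ma_out")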


This image is created with annual precipitation and annual mean temperature Python GAM downscaling. For the downscaling, the temperature is converted to kelvins, and values over 28 °C were set to 28 °C to avoid overflow in the downscaling (the paleotemperature was greater than today's; it exceeded 28 °C).

The colour channels are downscaled from this image:

https://earthobservatory.nasa.gov/blogs/elegantfigures/wp-content/uploads/sites/4/2011/10/land_shallow_topo_2011_8192.jpg

https://earthobservatory.nasa.gov/blogs/elegantfigures/2011/10/06/crafting-the-blue-marble/
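
A minimal sketch, with made-up values, of the temperature preparation described above (cap at 28 °C, then convert to kelvins), matching the treatment in nix1.py:

import numpy as np

# Made-up 2x2 field of annual mean surface temperature in degrees C.
temp_c = np.array([[31.2, 27.4],
                   [18.0, -3.5]])

temp_c_clamped = np.minimum(temp_c, 28.0)   # cap at 28 C to avoid overflow
temp_k = temp_c_clamped + 273.15            # convert to kelvins for downscaling
print(temp_k)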


The basic raw data for this image was simulated with ExoPlaSim.


Data for the simulation:

PaleoDEM Resource – Scotese and Wright (2018), 11 August 2018, by Sabin Zahirovic

https://www.earthbyte.org/webdav/ftp/Data_Collections/Scotese_Wright_2018_PaleoDEM/Scotese_Wright_2018_Maps_1-88_1degX1deg_PaleoDEMS_nc.zip

Data for the mask, DEM, and hillshade is the 6-minute PaleoDEM.

6-minute dataset:

@dataset{scotese_christopher_r_2018_5460860,
  author    = {Scotese, Christopher R and Wright, Nicky M},
  title     = {{PALEOMAP Paleodigital Elevation Models (PaleoDEMS) for the Phanerozoic}},
  month     = aug,
  year      = 2018,
  publisher = {Zenodo},
  doi       = {10.5281/zenodo.5460860},
  url       = {https://doi.org/10.5281/zenodo.5460860}
}

https://zenodo.org/record/5460860/files/Scotese_Wright_2018_Maps_1-88_6minX6min_PaleoDEMS_nc.zip?download=1
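
A minimal sketch of deriving a land/sea mask from one 6-minute PaleoDEM grid, as the scripts below do with gdal_calc.py. The file name and the variable name "z" are placeholders, not checked against the archive; inspect the actual NetCDF (for example with ncdump -h) before using them:

import netCDF4 as nc
import numpy as np

# Placeholder file and variable names for the 200 Ma 6-minute PaleoDEM grid.
ds = nc.Dataset("paleodem_6min_200Ma.nc")
z = np.array(ds.variables["z"][:])        # elevation in metres, negative = ocean

land_mask = (z > 0).astype(np.float32)    # 1 = land, 0 = sea
dem_land = np.where(z > 0, z, 0.0)        # land elevations only, as in earthdem.sh
print(land_mask.shape, float(land_mask.sum()))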

You need the scripts under this page:

https://commons.wikimedia.org/wiki/File:World_200ma_6.webp

And these additional scripts.

Main script "rune.sh" (sometimes slow):

# 3
# preprocess earth rgb, temperature, precipitation data
# for downscaling
# 27.6.2022 v 0000.0001

# input rasters

# inputearthmap1="./indata/earthmap1k.jpg"
inputearthmap1="./indata/erdas3.jpg"
# inputearthmap1="./indata/earthmap.jpg"

inputearthdem1="./indata/CHELSA_TraCE21k_dem_1_V1.0.tif"
# inputearthdem1="./indata/earthdemx.tif"

inputearthmeantemp1="./indata/CHELSA_bio1_1981-2010_V.2.1.tif"
inputearthannualprecip1="./indata/CHELSA_bio12_1981-2010_V.2.1.tif"

dimx=1600
dimy=800

echo "Inputs"

echo $inputearthmap1
echo $inputearthdem1
echo $inputearthmeantemp1
echo $inputearthannualprecip1

# exit -1

rm -r origo2
rm -r process
rm -r output

mkdir origo2
mkdir process
mkdir output



# georef koppenpasta file

gdal_translate -a_srs EPSG:4326 -a_ullr -180 90 180 -90 $inputearthmap1 ./origo2/earth0.tif

# exit -1
# translate to netcdf

gdalwarp -of GTiff -overwrite -ts $dimx $dimy ./origo2/earth0.tif ./origo2/earth1600.tif

gdal_translate -of NetCDF -ot Float32 ./origo2/earth1600.tif ./origo2/earth1600.nc

# gdal_calc.py -A ./origo2/earth160.tif --outfile=./origo2/earth1600.tif --calc="nan_to_num(A)"


rm ./origo2/e10.nc ./origo2/e1.nc ./origo2/e20.nc ./origo2/e2.nc ./origo2/e30.nc ./origo2/e3.nc

ncks -v Band1,lon,lat ./origo2/earth1600.nc ./origo2/e10.nc -O
ncks -v Band2,lon,lat ./origo2/earth1600.nc ./origo2/e20.nc -O
ncks -v Band3,lon,lat ./origo2/earth1600.nc ./origo2/e30.nc -O

ncrename -vBand1,e1 ./origo2/e10.nc ./origo2/e1.nc -O
ncrename -vBand2,e2 ./origo2/e20.nc ./origo2/e2.nc -O
ncrename -vBand3,e3 ./origo2/e30.nc ./origo2/e3.nc -O


gdalwarp -of GTiff -overwrite -ts $dimx $dimy $inputearthdem1 ./origo2/earthdem160.tif

gdal_translate -of GTiff -a_nodata 0 ./origo2/earthdem160.tif ./origo2/earthdem1600.tif
gdal_translate -of NetCDF ./origo2/earthdem1600.tif ./origo2/earthdem1600.nc

gdalwarp -of GTiff -overwrite -ts $dimx $dimy $inputearthmeantemp1 ./origo2/earthavgtemp160.tif
gdalwarp -of GTiff -overwrite -ts $dimx $dimy $inputearthannualprecip1 ./origo2/earthavgpr160.tif

# process CHELSA data

gdal_calc.py -A ./origo2/earthavgtemp160.tif --outfile=./origo2/earthavgtemp161.tif --calc="(A-2731.5)/10"
gdal_calc.py -A ./origo2/earthavgpr160.tif --outfile=./origo2/earthavgpr161.tif --calc="A/10"

gdal_translate -of GTiff -a_nodata 0 ./origo2/earthavgtemp161.tif ./origo2/earthavgtemp1600.tif
gdal_translate -of GTiff -a_nodata 0 ./origo2/earthavgpr161.tif ./origo2/earthavgpr1600.tif

gdal_translate -of NetCDF ./origo2/earthdem1600.tif ./origo2/earthdem1600.nc
gdal_translate -of NetCDF ./origo2/earthavgtemp1600.tif ./origo2/earthavgtemp1600.nc
gdal_translate -of NetCDF ./origo2/earthavgpr1600.tif ./origo2/earthavgpr1600.nc

bash earthdem.sh

python nix1.py

bash kart.sh


earthdem.sh

inputdem1="./origo2/earthdem1600.tif"

gdal_translate -of GTiff $inputdem1 ./origo2/dem0.tif
gdal_calc.py -A ./origo2/dem0.tif --outfile=./origo2/dem6.tif --calc="A*(A>0)"
gdal_calc.py -A ./origo2/dem0.tif --outfile=./origo2/sea6.tif --calc="A*(A<0)"
gdal_translate -of NetCDF ./origo2/dem6.tif ./origo2/dem6.nc
gdal_translate -of NetCDF ./origo2/sea6.tif ./origo2/sea6.nc
gdaldem aspect ./origo2/dem6.tif ./origo2/aspect6.tif
gdaldem slope ./origo2/dem6.tif ./origo2/slope6.tif
gdaldem TPI ./origo2/dem6.tif ./origo2/tpi60.tif
gdaldem TRI ./origo2/dem6.tif ./origo2/tri60.tif

gdal_calc.py -A ./origo2/tpi60.tif --outfile=./origo2/tpi6.tif --calc="nan_to_num(A)"
gdal_calc.py -A ./origo2/tri60.tif --outfile=./origo2/tri6.tif --calc="nan_to_num(A)"

gdal_translate -of NetCDF ./origo2/aspect6.tif ./origo2/aspect6.nc
gdal_translate -of NetCDF ./origo2/slope6.tif ./origo2/slope6.nc
gdal_translate -of NetCDF ./origo2/tpi6.tif ./origo2/tpi6.nc
gdal_translate -of NetCDF ./origo2/tri6.tif ./origo2/tri6.nc

gdal_proximity.py ./origo2/sea6.tif ./origo2/distance6.tif
gdal_translate -of NetCDF ./origo2/distance6.tif ./origo2/distance6.nc
gdal_calc.py -A ./origo2/dem6.tif --outfile=./origo2/mask6.tif --calc="256*(A>0)"
gdal_translate -of NetCDF ./origo2/mask6.tif ./origo2/mask6.nc

gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/mask6.tif ./origo2/mask1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/dem6.tif ./origo2/dem1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/sea6.tif ./origo2/sea1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/distance6.tif ./origo2/distance1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/aspect6.tif ./origo2/aspect1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/slope6.tif ./origo2/slope1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/lons6.tif ./origo2/lons1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/lats6.tif ./origo2/lats1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/tri6.tif ./origo2/tri1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/tpi6.tif ./origo2/tpi1600.tif

# gdal_translate -of png -ot Uint16 ./origo2/mask1600.tif ./origo2/mask1600.png

gdal_translate -of png -ot Uint16 ./origo2/dem1600.tif ./origo2/dem1600.png
gdal_translate -ot Int16 -of GTiff ./origo2/dem1600.tif ./origo2/dem16.tif
gdal_translate -of png -ot Uint16 ./origo2/mask1600.tif ./origo2/mask1600.png
gdal_translate -ot Int16 -of GTiff ./origo2/mask1600.tif ./origo2/mask16.tif

The downscaler "nix1.py":
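
Before the full listing, a minimal sketch of the idea behind the downscaler, using made-up arrays (the script itself also supports random forest and polynomial regression): fit a GAM that maps present-day predictors to one colour channel of the base map, then apply it to the ExoPlaSim 200 Ma fields.

import numpy as np
from pygam import LinearGAM

# Made-up training data standing in for the real rasters.
n = 1000
X_modern = np.column_stack([
    np.random.uniform(245.0, 301.0, n),    # annual mean temperature, K
    np.random.uniform(0.0, 3000.0, n),     # annual precipitation, mm
])
y_channel = np.random.uniform(0.0, 255.0, n)   # one RGB channel of the base map

gam = LinearGAM().fit(X_modern, y_channel)

X_paleo = np.column_stack([
    np.random.uniform(260.0, 301.15, n),   # simulated temperature, clamped, K
    np.random.uniform(0.0, 3000.0, n),     # simulated precipitation, mm
])
channel_paleo = gam.predict(X_paleo)       # reshape back to the 1600 x 800 grid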


# big, small raster stack downscaler
# 28.6.2021, v 0012.0001

import matplotlib.pyplot as plt
import numpy as np

import os

import netCDF4 as nc

from osgeo import gdal
from osgeo import osr
from osgeo import gdalconst

from mpl_toolkits.basemap import shiftgrid

from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import RandomForestRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn import svm, metrics
from pygam import LogisticGAM
from pygam import LinearGAM
from numpy import exp, mgrid, row_stack, column_stack, tile
from scipy import signal


def process_raster(iname1, oname1, lon1, lat1, lon2, lat2, width1, height1):
    bigrname1 = "./work/big_" + oname1 + "_r.tif"
    bigname1 = "./work/big_" + oname1 + ".tif"
    bigname2 = "./work/big_" + oname1 + ".nc"
    midiname1 = "./work/midi_" + oname1 + ".tif"
    midiname2 = "./work/midi_" + oname1 + ".nc"
    ds = gdal.Translate(bigrname1, iname1, projWin=[lon1, lat2, lon2, lat1])
    ds = 0
    ds = gdal.Warp(bigname1, bigrname1, width=width1, height=height1, resampleAlg='bilinear')
    ds = 0
    raster_reso_to_reference_raster(bigrname1, "./work/smallfile.tif", midiname1)
    ds = 0
    ds = gdal.Translate(midiname2, midiname1, format='NetCDF')
    ds = 0
    ds = gdal.Translate(bigname2, bigname1, format='NetCDF')
    ds = 0

    return()


def sea_proximity(srcname, dstname):
    src_ds = gdal.Open(srcname)
    srcband = src_ds.GetRasterBand(1)
    drv = gdal.GetDriverByName('GTiff')
    dst_ds = drv.Create(dstname, src_ds.RasterXSize, src_ds.RasterYSize, 1, gdal.GetDataTypeByName('Float32'))
    dst_ds.SetGeoTransform(src_ds.GetGeoTransform())
    dst_ds.SetProjection(src_ds.GetProjectionRef())
    dstband = dst_ds.GetRasterBand(1)
    gdal.ComputeProximity(srcband, dstband, ["VALUES='0,-1'", "DISTUNITS=GEO"])
    srcband = None
    dstband = None
    src_ds = None
    dst_ds = None
    return(0)


def create_landsea_x(neofile1, neofile2):
    print(neofile1)
    array1, lats, lons = load_raster_float32(neofile1)

    lon1 = np.min(lons)
    lat1 = np.min(lats)
    lon2 = np.max(lons)
    lat2 = np.max(lats)

    nx = len(lons)
    ny = len(lats)

    array1[array1 < 0.0] = 0.0
    array1[array1 > 0.0] = 1.0
    array2 = array1.astype(np.float32)
    array3 = np.flipud(array2)

    drivername1 = "GTiff"
    x_save_raster(neofile2, drivername1, array2, lons, lats)
    drivername2 = "NetCDF"
    neofile3 = neofile2.replace(".tif", ".nc")
    x_save_raster(neofile3, drivername2, array2, lons, lats)

    return(0)


def x_save_raster(outname1, rastype1, rasdata1, lonz1, latz1):
    ## with netcdf ok
    xmin = np.min(lonz1)
    ymin = np.min(latz1)
    xmax = np.max(lonz1)
    ymax = np.max(latz1)
    nx = len(lonz1)
    ny = len(latz1)

    xres = (xmax - xmin) / float(nx)
    yres = (ymax - ymin) / float(ny)
    geotransform = (xmin, xres, 0, ymax, 0, -yres)

    # dst_ds = gdal.GetDriverByName(rastype1).Create(outname1, nx, ny, 1, gdal.GDT_Float32, options=['COMPRESS=DEFLATE'])
    dst_ds = gdal.GetDriverByName(rastype1).Create(outname1, nx, ny, 1, gdal.GDT_Float32)
    dst_ds.SetGeoTransform(geotransform)          # specify coords
    srs = osr.SpatialReference()                  # establish encoding
    srs.ImportFromEPSG(4326)                      # lat/long
    dst_ds.SetProjection(srs.ExportToWkt())       # export coords to file
    dst_ds.GetRasterBand(1).WriteArray(rasdata1)  # write r-band to the raster
    dst_ds.FlushCache()                           # write to disk
    dst_ds = None
    return(0)


def rasterize_shapefile(shpath1, outfile1, lon1, lat1, lon2, lat2, cols, rows):
    outfile2 = outfile1.replace("tif", "nc")
    tempofile1 = "temp1.tif"
    kom1 = "gdal_rasterize -burn 255 -ot Float32"
    kom2 = " -te " + str(lon1) + " " + str(lat1) + " " + str(lon2) + " " + str(lat2)
    kom3 = " -ts " + str(cols) + " " + str(rows)
    kom4 = " " + shpath1 + " " + tempofile1
    kommandoo1 = kom1 + kom2 + kom3 + kom4
    koma2 = "gdal_calc.py -A " + tempofile1 + " --calc=\"(A>200)*99\" --outfile=temp2.tif"
    koma3 = "gdal_calc.py -A temp2.tif --calc=\"(A<1)*1\" --outfile=" + outfile1

    # koma4 = "gdal_calc.py -A temp3.tif --calc=\"(A>97)*0\" --outfile=temp4.tif"

    kommandoo2 = "gdal_translate -of netcdf -ot Float32 " + outfile1 + " " + tempofile1

    print(kommandoo1)
    # print(kommandoo2)
    os.system(kommandoo1)
    os.system(koma2)
    os.system(koma3)
    # os.system(koma4)
    # river_proximity("temp3.tif", "trivo.tif")

    os.system(kommandoo2)
    return(0)


def river_proximity(srcname, dstname):
    src_ds = gdal.Open(srcname)
    srcband = src_ds.GetRasterBand(1)
    ## warning: test code
    # srcarray = src_ds.ReadAsArray()
    # srcarray = srcarray*0
    # srcband.WriteArray(srcarray)

    drv = gdal.GetDriverByName('GTiff')
    dst_ds = drv.Create(dstname, src_ds.RasterXSize, src_ds.RasterYSize, 1, gdal.GetDataTypeByName('Float32'))
    dst_ds.SetGeoTransform(src_ds.GetGeoTransform())
    dst_ds.SetProjection(src_ds.GetProjectionRef())
    dstband = dst_ds.GetRasterBand(1)

    gdal.ComputeProximity(srcband, dstband, ["VALUES='0,-1'", "DISTUNITS=GEO"])
    srcband = None
    dstband = None
    src_ds = None
    dst_ds = None
    return(0)




def gaussian_blur_sub(in_array, size):
    ny, nx = in_array.shape
    ary = row_stack((tile(in_array[0, :], size).reshape((size, nx)),
                     in_array,
                     tile(in_array[-1, :], size).reshape((size, nx))))
    arx = column_stack((tile(ary[:, 0], size).reshape((size, ny + size*2)).T,
                        ary,
                        tile(ary[:, -1], size).reshape((size, ny + size*2)).T))
    x, y = mgrid[-size:size+1, -size:size+1]
    g = exp(-(x**2/float(size) + y**2/float(size)))
    g = (g / g.sum())  # .astype(in_array.dtype)
    return signal.convolve(arx, g, mode='valid')

def gaussian_blur(src1, dst1, size1):
    source = gdal.Open(src1)
    gt = source.GetGeoTransform()
    proj = source.GetProjection()
    band_array = source.GetRasterBand(1).ReadAsArray()
    blurredArray = gaussian_blur_sub(band_array, size1)
    # driver = gdal.GetDriverByName('GTIFF')
    driver = gdal.GetDriverByName('NetCDF')
    driver.Register()
    cols = source.RasterXSize
    rows = source.RasterYSize
    bands = source.RasterCount
    band = source.GetRasterBand(1)
    datatype = band.DataType
    output = driver.Create(dst1, cols, rows, bands, datatype)
    output.SetGeoTransform(gt)
    output.SetProjection(proj)
    outBand = output.GetRasterBand(1)
    outBand.WriteArray(blurredArray, 0, 0)
    source = None  # close raster
    output = None
    band = None
    outBand = None
    return(0)


def filter_nas_out_from_small_and_midi(smal, midis):

    smali = smal

    meduza = []

    num1 = len(midis)
    print("")
    print("staksize1 : ", num1)
    print("")

    # smala = np.ravel(smal)
    # s1 = len(smala)
    # print(s1)

    # plt.imshow(smal)
    # plt.imshow(midis[0])

    smak = np.where(smal == 9.96920997e+36, smal, smal*0 + 1)
    # smac = np.where(smak < 10000.0, smak, 1)

    smali = smak * smal

    for n in range(0, num1):
        print(n)
        m1 = smak * midis[n]
        meduza.append(m1)

    # plt.imshow(smali)

    print(smali)

    # plt.show()

    return(smali, meduza)





def rm_create_dir(path1):
    try:
        os.rmdir(path1)
    except OSError:
        print(" ")

    try:
        os.mkdir(path1)
    except OSError:
        print("Create dir %s failed" % path1)
    else:
        print("Created dir %s " % path1)

    return(0)


def create_dirs():
    rm_create_dir("work")
    # rm_create_dir("work")
    return(0)




def random_forest_multiple_vars(x1, y1, x2, feat1):
    print("RF /2")
    xa1 = np.array(x1).astype(float)
    ya1 = np.array(y1).astype(float)
    xa2 = np.array(x2).astype(float)

    # print(xa1)
    # quit(-1)

    sha1 = np.shape(x1)

    dim2 = sha1[1]

    x = xa1.reshape((-1, dim2))
    # y = ya1.reshape((-1, 1))
    y = ya1.ravel()
    xb = xa2.reshape((-1, dim2))

    # model = LinearRegression().fit(x, y)
    # degree = 3
    # polyreg = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    # model = polyreg.fit(x, y)

    sc = StandardScaler()
    xa = sc.fit_transform(x)
    xba = sc.transform(xb)

    model = RandomForestRegressor(n_estimators=100, max_features=feat1).fit(xa, y)

    y2 = model.predict(xba)
    return(y2)


def rotate_to_180(iname1, oname1):
    r1, la1, lo1 = load_raster_float32(iname1)

    ## hadcm3 grid!
    # print(lo1)
    r2, lo2 = shiftgrid(179.75, r1, lo1, start=True)
    rastype1 = "GTiff"
    lo2 = lo2 - 360.0

    cols = len(lo2)
    rows = len(la1)
    lon1 = min(lo2)
    lon2 = max(lo2)
    lat1 = min(la1)
    lat2 = max(la1)
    # print(lon1, lon2)

    save_raster_compress(oname1, "GTiff", r2, lon1, lat1, lon2, lat2, cols, rows)
    save_raster_compress("output.nc", "NetCDF", r2, lon1, lat1, lon2, lat2, cols, rows)
    return(0)


def rotate_to_360(iname1, oname1):
    r1, la1, lo1 = load_raster_float32(iname1)

    ## hadcm3 grid!
    # print(lo1)
    r2, lo2 = shiftgrid(0.0, r1, lo1, start=True)
    rastype1 = "GTiff"
    # lo2 = lo2 - 360.0

    cols = len(lo2)
    rows = len(la1)
    lon1 = min(lo2)
    lon2 = max(lo2)
    lat1 = min(la1)
    lat2 = max(la1)
    # print(lon1, lon2)

    save_raster_compress(oname1, "GTiff", r2, lon1, lat1, lon2, lat2, cols, rows)
    save_raster_compress("r360.nc", "NetCDF", r2, lon1, lat1, lon2, lat2, cols, rows)
    return(0)


def linear_regression_multiple_vars(x1, y1, x2):
    print("MLR")
    xa1 = np.array(x1)
    ya1 = np.array(y1)
    xa2 = np.array(x2)

    sha1 = np.shape(x1)

    dim2 = sha1[1]

    x = xa1.reshape((-1, dim2))
    y = ya1.reshape((-1, 1))
    xb = xa2.reshape((-1, dim2))

    # model = LinearRegression().fit(x, y)
    degree = 3
    polyreg = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model = polyreg.fit(x, y)

    y2 = model.predict(xb)
    return(y2)

def gam_multiple_vars(x1, y1, x2):
    print("GAM")
    xa1 = np.array(x1)
    ya1 = np.array(y1)
    xa2 = np.array(x2)

    # print(xa1)
    # quit(-1)

    sha1 = np.shape(x1)

    dim2 = sha1[1]

    x = xa1.reshape((-1, dim2))
    y = ya1.reshape((-1, 1))
    xb = xa2.reshape((-1, dim2))

    # sc = StandardScaler()
    # xa = sc.fit_transform(x)
    # xba = sc.transform(xb)

    # model = LogisticGAM().fit(x, y)
    model = LinearGAM().fit(x, y)

    y2 = model.predict(xb)

    return(y2)


def linear_regression_singe_var(x1, y1, x2):
    # print(x1)
    # print(y1)
    # return(0)

    xa1 = np.array(x1)
    ya1 = np.array(y1)
    xa2 = np.array(x2)

    sha1 = np.shape(x1)

    # dim2 = sha1[1]

    # x = xa1.reshape((-1, dim2))
    # y = ya1.reshape((-1, 1))
    # xb = xa2.reshape((-1, dim2))
    x = xa1.reshape((-1, 1))
    y = ya1.reshape((-1, 1))
    xb = xa2.reshape((-1, 1))

    # print(x)
    # print(y)

    # model = LinearRegression().fit(x, y)
    # model = LogisticGAM().fit(x, y)
    model = LinearGAM().fit(x, y)
    # model = RandomForestRegressor(n_estimators=10, max_features=1).fit(x, y)
    degree = 2
    polyreg = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    # model = polyreg.fit(x, y)

    ## warning not xb
    y2 = model.predict(xb)
    # print(y2)
    # print("LR")
    return(y2)

def load_raster_float32(filename):
    print(filename)
    raster = gdal.Open(filename)
    print(raster)
    geotransform = raster.GetGeoTransform()
    originX = geotransform[0]
    originY = geotransform[3]
    pixelWidth = geotransform[1]
    pixelHeight = geotransform[5]
    d1 = geotransform[2]
    d2 = geotransform[4]
    # array = np.array(raster.ReadAsArray()).astype(float)
    array0 = raster.ReadAsArray()
    # array = array0  # test
    array = array0.astype(float)
    cols = array.shape[1]
    rows = array.shape[0]
    limitX = cols * pixelWidth + originX
    limitY = rows * pixelHeight + originY
    lons = np.array(np.arange(originX, limitX, pixelWidth))
    lats = np.array(np.arange(originY, limitY, pixelHeight))

    return(array, lats, lons)

def raster_reso_to_reference_raster(inputfile, referencefile, outputfile):
    print(inputfile, referencefile, outputfile)
    input = gdal.Open(inputfile, gdalconst.GA_ReadOnly)
    inputProj = input.GetProjection()
    inputTrans = input.GetGeoTransform()
    reference = gdal.Open(referencefile)
    referenceProj = reference.GetProjection()
    referenceTrans = reference.GetGeoTransform()
    bandreference = reference.GetRasterBand(1)
    x = reference.RasterXSize
    y = reference.RasterYSize
    driver = gdal.GetDriverByName('GTiff')
    output = driver.Create(outputfile, x, y, 1, bandreference.DataType)
    output.SetGeoTransform(referenceTrans)
    output.SetProjection(referenceProj)
    gdal.ReprojectImage(input, output, inputProj, referenceProj, gdalconst.GRA_Bilinear)
    del output
    return(0)

def save_raster_compress(outname1, rastype1, rasdata1, lon1, lat1, lon2, lat2, cols, rows):
    ## tested with gtiff, ok
    xmin = lon1
    ymin = lat1
    xmax = lon2
    ymax = lat2
    nx = cols
    ny = rows
    xres = (xmax - xmin) / float(nx)
    yres = (ymax - ymin) / float(ny)
    geotransform = (xmin, xres, 0, ymax, 0, -yres)

    # dst_ds = gdal.GetDriverByName(rastype1).Create(outname1, nx, ny, 1, gdal.GDT_Float32, options=['COMPRESS=DEFLATE'])
    dst_ds = gdal.GetDriverByName(rastype1).Create(outname1, nx, ny, 1, gdal.GDT_Float32)
    dst_ds.SetGeoTransform(geotransform)          # specify coords
    srs = osr.SpatialReference()                  # establish encoding
    srs.ImportFromEPSG(4326)                      # lat/long
    dst_ds.SetProjection(srs.ExportToWkt())       # export coords to file
    dst_ds.GetRasterBand(1).WriteArray(rasdata1)  # write r-band to the raster
    dst_ds.FlushCache()                           # write to disk
    dst_ds = None
    return(0)


def load_hadcm3_slice(hadpath1, inf1, vari1, taimi1):
    ind1 = nc.Dataset(inf1)
    # print(ind1)
    hlons1 = ind1['lon'][:]
    hlats1 = ind1['lat'][:]
    hslice1 = ind1[vari1][taimi1]
    return(hlons1, hlats1, hslice1)


def load_hadcm3_avg(hadpath1, inf1, vari1, taimi1):

    slicesize = 32
    ind1 = nc.Dataset(inf1)
    # print(ind1)
    hlons1 = ind1['lon'][:]
    hlats1 = ind1['lat'][:]
    hslices = []
    for n in range(0, slicesize):
        hslices.append(ind1[vari1][taimi1 + 12*n])

    hslice = hslices[0] * 0.0

    for n in range(0, slicesize):
        # print(n)
        hslice = hslice + hslices[n]

    hslice = hslice / float(slicesize)

    print(hslice)
    return(hlons1, hlats1, hslice)


def process_hadcm3_slice(year1, month1, lon1, lat1, lon2, lat2):

    baseyear1 = 22500
    ## CURRENT baseyear1 = 2500
    vari1 = 'lwe_thickness_of_surface_snow_amount'
    inf1 = 'regrid_lwe_thickness_of_surface_snow_amount_20_22.5kyr.nc'
    ## CURRENT inf1 = 'regrid_lwe_thickness_of_surface_snow_amount_0_2.5kyr.nc'

    hadpath1 = './haddaway/'

    taimi1 = (baseyear1 - year1)*12 + month1

    inf2 = hadpath1 + inf1
    hlons1, hlats1, hslice1 = load_hadcm3_slice(hadpath1, inf2, vari1, taimi1)
    zhape1 = np.shape(hslice1)
    cols1 = zhape1[1]
    rows1 = zhape1[0]
    hslice2 = np.flipud(hslice1)
    save_raster_compress("./work/s360.nc", "NetCDF", hslice2, 0, 0, 360, 89.5, cols1, rows1)
    save_raster_compress("./work/s360.tif", "GTiff", hslice2, 0, 0, 360, 89.5, cols1, rows1)
    rotate_to_180("./work/s360.tif", "./work/s180.tif")
    ds = gdal.Translate("./work/smallfile.tif", "./work/s180.tif", projWin=[lon1, lat2, lon2, lat1])
    ds = gdal.Translate("./work/smallfile.nc", "./work/smallfile.tif", format='NetCDF')
    ds = 0
    return(hslice2, hlons1, hlats1, cols1, rows1)


def process_big_rasters(inames1, lon1, lat1, lon2, lat2, width1, height1):

    num1 = len(inames1)

    for n in range(0, num1):
        inam1 = inames1[n]
        bigrname1 = "./work/bigfile_r_" + str(n) + ".tif"
        bigname1 = "./work/bigfile_" + str(n) + ".tif"
        bigname2 = "./work/bigfile_" + str(n) + ".nc"
        midiname1 = "./work/midifile_" + str(n) + ".tif"
        midiname2 = "./work/midifile_" + str(n) + ".nc"
        ds = gdal.Translate(bigrname1, inam1, projWin=[lon1, lat2, lon2, lat1])
        ds = 0
        ds = gdal.Warp(bigname1, bigrname1, width=width1, height=height1, resampleAlg='bilinear')
        ds = 0
        raster_reso_to_reference_raster(bigrname1, "./work/smallfile.tif", midiname1)
        ds = 0
        ds = gdal.Translate(midiname2, midiname1, format='NetCDF')
        ds = 0
        ds = gdal.Translate(bigname2, bigname1, format='NetCDF')
        ds = 0

    bigstak = []
    midistak = []

    for n in range(0, num1):
        bigrname1 = "./work/bigfile_r_" + str(n) + ".tif"
        bigname1 = "./work/bigfile_" + str(n) + ".tif"
        midiname1 = "./work/midifile_" + str(n) + ".tif"
        m1, midila, midilo = load_raster_float32(midiname1)
        b1, bigla, biglo = load_raster_float32(bigname1)
        bigstak.append(b1)
        midistak.append(m1)

    return(bigstak, midistak, bigla, biglo, midila, midilo)


def downscale_bigs_and_smalls_1(metoden1, inna1, outta1):
    # lon, lat rect
    lon1 = -180
    lat1 = -90
    lon2 = 190
    lat2 = 90
    width1 = 1600
    height1 = 800
    ## Random forest max features
    feat1 = 7

    create_dirs()
    sourcename = inna1

    sorcer0, bex, bey = load_raster_float32(sourcename)
    sorcer1 = np.ravel(sorcer0).astype(float)
    # sorver1, bey, bex =

    smallstak = []
    bigstak = []

    ## temp, pr, dem, lons, lats, MLR
    # az1, ay1, ax1 = load_raster_float32("./origo2/earthavgtemp161.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)
    # az1, ay1, ax1 = load_raster_float32("./origo2/earthavgpr161.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)
    # az1, ay1, ax1 = load_raster_float32("./origo2/dem1600.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)
    # az1, ay1, ax1 = load_raster_float32("./origo/lons1600.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)
    # az1, ay1, ax1 = load_raster_float32("./origo/lats1600.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)
    # az1, ay1, ax1 = load_raster_float32("./origo2/distance1600.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)
    # az1, ay1, ax1 = load_raster_float32("./origo2/mask1600.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)

    bz1, by1, bx1 = load_raster_float32("./origo/temp_1600.tif")
    bz2 = np.ravel(bz1)
    bigstak.append(bz2)
    bz1, by1, bx1 = load_raster_float32("./origo/pr_1600.tif")
    bz2 = np.ravel(bz1)
    bigstak.append(bz2)
    # bz1, by1, bx1 = load_raster_float32("./origo/dem1600.tif")
    # bz2 = np.ravel(bz1)
    # bigstak.append(bz2)
    # bz1, by1, bx1 = load_raster_float32("./origo/lons1600.tif")
    # bz2 = np.ravel(bz1)
    # bigstak.append(bz2)
    # bz1, by1, bx1 = load_raster_float32("./origo/lats1600.tif")
    # bz2 = np.ravel(bz1)
    # bigstak.append(bz2)
    # bz1, by1, bx1 = load_raster_float32("./origo/distance1600.tif")
    # bz2 = np.ravel(bz1)
    # bigstak.append(bz2)
    # bz1, by1, bx1 = load_raster_float32("./origo/mask1600.tif")
    # bz2 = np.ravel(bz1)
    # bigstak.append(bz2)

    small_model = smallstak[0]
    big_model = bigstak[0]
    sheipi1 = np.shape(bz1)

    print(sheipi1)
    cols3 = sheipi1[1]
    rows3 = sheipi1[0]
    print(cols3, rows3)

    num1 = len(smallstak)
    bigs = []
    smalls = []

    ## WARNING data filtering!
    ## clamp NaN out!
    print(num1)
    replaceval1 = 0.0
    for n in range(0, num1):
        print("QR no:", n)
        m1 = smallstak[n]
        b1 = bigstak[n]
        m1 = m1.flatten()
        b1 = b1.flatten()
        b1[np.isinf(b1)] = replaceval1
        m1[np.isinf(m1)] = replaceval1
        b1[np.isnan(b1)] = replaceval1
        m1[np.isnan(m1)] = replaceval1
        m1[m1 == 9.96920997e+36] = replaceval1
        b1[b1 == 9.96920997e+36] = replaceval1
        m1[m1 == -32768] = replaceval1
        b1[b1 == -32768] = replaceval1
        smalls.append(m1)
        bigs.append(b1)

    sorcer3 = sorcer1.flatten()
    sorcer3[np.isinf(sorcer3)] = replaceval1
    sorcer3[np.isnan(sorcer3)] = replaceval1
    sorcer3[sorcer3 == 9.96920997e+36] = replaceval1
    sorcer3[sorcer3 == -32768] = replaceval1

    spek1 = bigs[0]
    spek2 = np.reshape(spek1, sheipi1)

    apoints1 = list(zip(*smalls))
    bpoints1 = sorcer3
    cpoints1 = list(zip(*bigs))

    if (metoden1 == 0):
        y2 = random_forest_multiple_vars(apoints1, bpoints1, cpoints1, feat1)

    if (metoden1 == 1):
        y2 = gam_multiple_vars(apoints1, bpoints1, cpoints1)

    if (metoden1 == 2):
        y2 = linear_regression_multiple_vars(apoints1, bpoints1, cpoints1)  ## nok

    ro1 = np.reshape(y2, sheipi1)

    # plt.imshow(ro1)
    # plt.show()

    save_raster_compress(outta1, "NetCDF", ro1, lon1, lat1, lon2, lat2, cols3, rows3)
    return(0)


def create_mask_from_files():

    infilenames1 = ["./origo2/mask1600.tif", "./origo2/earthavgtemp161.tif", "./origo2/earthavgpr161.tif"]

    maskii = 0

    num1 = len(infilenames1)
    for n in range(0, num1):
        inf1 = infilenames1[n]
        bz0, by0, bx0 = load_raster_float32(inf1)
        plt.imshow(bz0)
        plt.show()
        quit(-1)

    return(maskii)


def load_raster_filter_with_mask(iname1, maskname1, ofset1, koef1):
    mz1, my1, mx1 = load_raster_float32(maskname1)
    iz1, iy1, ix1 = load_raster_float32(iname1)
    # plt.imshow(mz1, vmin=-255, vmax=255)

    # plt.imshow(iz1)
    # plt.show()
    mz2 = np.copy(mz1)
    iz2 = np.copy(iz1)
    iz2 = iz2*koef1 + ofset1

    ## mz2 = np.where(mz2 == 300.0, 1, mz2)
    mz2[mz2 > 1] = 1
    mz2[mz2 == 0] = 0
    iz2 = mz2*iz2

    iz2[iz2 < 0] = 0

    # plt.imshow(mz2, vmin=-255, vmax=255)

    # plt.imshow(iz2)
    # plt.show()

    return(iz2, iy1, ix1)


def load_raster_filter_with_mask_and_ravel(iname1, maskname1, ofset1, koef1):
    iz1, iy1, ix1 = load_raster_filter_with_mask(iname1, maskname1, ofset1, koef1)
    riz1 = np.ravel(iz1)
    return(riz1, iy1, ix1)


def masked_downscale_bigs_and_smalls_1(metoden1, inna1, outta1):
    # lon, lat rect
    lon1 = -180
    lat1 = -90
    lon2 = 190
    lat2 = 90
    width1 = 1600
    height1 = 800
    ## Random forest max features
    feat1 = 7

    smallstak = []

    bigstak = []

    create_dirs()
    sourcename = inna1

    # sorcer0, bex, bey = load_raster_float32(sourcename)
    # sorcer1 = np.ravel(sorcer0).astype(float)

    sorcer1, bey, bex = load_raster_filter_with_mask_and_ravel(inna1, "./origo2/mask1600.tif", 1, 1)

    # out1 = np.reshape(sorcer1, (len(bey), len(bex)))
    sorcer2 = np.copy(sorcer1)

    # sorcer2 = sorcer2[sorcer2 != -255]
    sorcer2 = sorcer2[sorcer2 != 0]

    # plt.imshow(out1, vmin=0, vmax=255)
    # plt.show()

    az1, aey, aex = load_raster_filter_with_mask_and_ravel("./origo2/earthavgpr161.tif", "./origo2/mask1600.tif", 0, 1)
    az2 = np.copy(az1)
    az2 = az2[az2 != 0]
    smallstak.append(az2)

    az1, aey, aex = load_raster_filter_with_mask_and_ravel("./origo2/earthavgtemp161.tif", "./origo2/mask1600.tif", 273.15, 1)
    az2 = np.copy(az1)
    az2 = az2[az2 != 0]
    smallstak.append(az2)

    # az1, aey, aex = load_raster_filter_with_mask_and_ravel("./origo/lons1600.tif", "./origo2/mask1600.tif", 360, 1)
    # az2 = np.copy(az1)
    # az2 = az2[az2 != 0]
    # smallstak.append(az2)

    # az1, aey, aex = load_raster_filter_with_mask_and_ravel("./origo/lats1600.tif", "./origo2/mask1600.tif", 360, 1)
    # az2 = np.copy(az1)
    # az2 = az2[az2 != 0]
    # smallstak.append(az2)

    print(len(sorcer1))
    print(len(az1))
    print(len(sorcer2))
    print(len(az2))

    # quit(-1)

    # plt.imshow(out1, vmin=0, vmax=255)
    # plt.show()
    # print(sorcer1)

    # quit(-1)
    ## def load_raster_filter_with_mask_and_ravel(iname1, maskname1)

    ## temp, pr, dem, lons, lats, MLR
    # az1, ay1, ax1 = load_raster_float32("./origo2/earthavgtemp161.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)
    # az1, ay1, ax1 = load_raster_float32("./origo2/earthavgpr161.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)

    # bz2 = np.ravel(bz1c)
    # bigstak.append(bz2)
    bz1, by1, bx1 = load_raster_float32("./origo/pr_1600.tif")
    bz2 = np.ravel(bz1)
    bigstak.append(bz2)

    bz1, by1, bx1 = load_raster_float32("./origo/temp_1600.tif")
    bz1b = np.copy(bz1)
    bz1b[bz1b > 28.0] = 28.0
    bz1c = bz1b + 273.15
    bz2 = np.ravel(bz1c)
    bigstak.append(bz2)

    # az1, ay1, ax1 = load_raster_float32("./origo/dem1600.tif")
    # az2 = np.ravel(az1)
    # smallstak.append(az2)
    # az1, ay1, ax1 = load_raster_float32("./origo/lons1600.tif")
    # az1b = az1 + 360.0
    # az2 = np.ravel(az1b)
    # bigstak.append(az2)
    # az1, ay1, ax1 = load_raster_float32("./origo/lats1600.tif")
    # az1b = az1 + 360.0
    # az2 = np.ravel(az1b)
    # bigstak.append(az2)

    small_model = smallstak[0]
    big_model = bigstak[0]
    sheipi1 = np.shape(bz1)

    print(sheipi1)
    cols3 = sheipi1[1]
    rows3 = sheipi1[0]
    print(cols3, rows3)

    num1 = len(smallstak)
    bigs = []
    smalls = []

    ## WARNING data filtering!
    ## clamp NaN out!
    print(num1)
    replaceval1 = 0.0
    for n in range(0, num1):
        print("QR no:", n)
        m1 = smallstak[n]
        b1 = bigstak[n]
        # m1 = m1.flatten()
        # b1 = b1.flatten()
        # b1[np.isinf(b1)] = replaceval1
        # m1[np.isinf(m1)] = replaceval1
        # b1[np.isnan(b1)] = replaceval1
        # m1[np.isnan(m1)] = replaceval1
        # m1[m1 == 9.96920997e+36] = replaceval1
        # b1[b1 == 9.96920997e+36] = replaceval1
        # m1[m1 == -32768] = replaceval1
        # b1[b1 == -32768] = replaceval1
        smalls.append(m1)
        bigs.append(b1)

    # sorcer3 = sorcer1.flatten()
    # sorcer3[np.isinf(sorcer3)] = replaceval1
    # sorcer3[np.isnan(sorcer3)] = replaceval1
    # sorcer3[sorcer3 == 9.96920997e+36] = replaceval1
    # sorcer3[sorcer3 == -32768] = replaceval1

    spek1 = bigs[0]
    spek2 = np.reshape(spek1, sheipi1)

    apoints1 = list(zip(*smalls))
    bpoints1 = sorcer2
    cpoints1 = list(zip(*bigs))

    if (metoden1 == 0):
        print("RF features ", feat1)
        y2 = random_forest_multiple_vars(apoints1, bpoints1, cpoints1, feat1)

    if (metoden1 == 1):
        y2 = gam_multiple_vars(apoints1, bpoints1, cpoints1)

    if (metoden1 == 2):
        y2 = linear_regression_multiple_vars(apoints1, bpoints1, cpoints1)  ## nok

    ro1 = np.reshape(y2, sheipi1)

    # plt.imshow(ro1)
    # plt.show()

    save_raster_compress(outta1, "NetCDF", ro1, lon1, lat1, lon2, lat2, cols3, rows3)
    return(0)


# create_mask_from_files()

metod1 = 2

masked_downscale_bigs_and_smalls_1(metod1, "./origo2/e1.nc", "./output/echannel1.nc")
masked_downscale_bigs_and_smalls_1(metod1, "./origo2/e2.nc", "./output/echannel2.nc")
masked_downscale_bigs_and_smalls_1(metod1, "./origo2/e3.nc", "./output/echannel3.nc")

# load_raster_filter_with_mask(metoden1, "./origo2/e1.nc", "./output/echannel1.nc")
# metoden1 = 2
# retu1 = downscale_bigs_and_smalls_1(metoden1, "./origo2/e1.nc", "./output/echannel1.nc")
# retu2 = downscale_bigs_and_smalls_1(metoden1, "./origo2/e2.nc", "./output/echannel2.nc")
# retu3 = downscale_bigs_and_smalls_1(metoden1, "./origo2/e3.nc", "./output/echannel3.nc")

print(".")



Downscale and draw the map, after the main script has been run at least once: "opera.sh"

# downscale and draw map

gdal_translate -of GTiff ./origo/lons6.nc ./origo/lons6.tif
gdal_translate -of GTiff ./origo/lats6.nc ./origo/lats6.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo/lons6.tif ./origo/lons1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo/lats6.tif ./origo/lats1600.tif
gdal_translate -of GTiff ./origo2/lons6.nc ./origo2/lons6.tif
gdal_translate -of GTiff ./origo2/lats6.nc ./origo2/lats6.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/lons6.tif ./origo2/lons1600.tif
gdalwarp -of GTiff -overwrite -ts 1600 800 ./origo2/lats6.tif ./origo2/lats1600.tif


python nix1.py

bash kart.sh

Make map, "kart.sh"

gdalbuildvrt -separate ./output/out.vrt ./output/echannel1.nc ./output/echannel2.nc ./output/echannel3.nc
gdal_translate -of GTiff ./output/out.vrt ./output/esand.tif
gdal_translate -of png ./output/out.vrt ./output/eland.png
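
The same channel stacking can also be done from Python with the GDAL bindings; a minimal sketch, assuming the three channel files written by nix1.py exist under ./output/ (the Byte conversion for the PNG, and any scaling to 0-255, are assumptions, not part of kart.sh):

from osgeo import gdal

# Stack the three downscaled channels into one 3-band image and export it.
channels = ["./output/echannel1.nc", "./output/echannel2.nc", "./output/echannel3.nc"]
vrt = gdal.BuildVRT("./output/out.vrt", channels, separate=True)
gdal.Translate("./output/esand.tif", vrt, format="GTiff")
gdal.Translate("./output/eland.png", vrt, format="PNG",
               outputType=gdal.GDT_Byte)   # PNG needs Byte/UInt16 data
vrt = None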


Licensing

I, the copyright holder of this work, hereby publish it under the following license:
This file is licensed under the Creative Commons Attribution-Share Alike 4.0 International license.
You are free:
  • to share – to copy, distribute and transmit the work
  • to remix – to adapt the work
Under the following conditions:
  • attribution – You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
  • share alike – If you remix, transform, or build upon the material, you must distribute your contributions under the same or compatible license as the original.

File history


Date/Time                      Dimensions              User        Comment
07:47, 20 July 2022 (current)  1,600 × 800 (814 KB)    Merikanto   kuop
14:54, 19 July 2022            1,600 × 800 (838 KB)    Merikanto   new code
08:25, 28 June 2022            1,600 × 800 (1,010 KB)  Merikanto   Update
12:59, 27 June 2022            1,600 × 800 (839 KB)    Merikanto   Uploaded own work with UploadWizard

There are no pages that use this file.
