Water depth - echo sounder (Fahrentholz)

From Experimental Hydrology Wiki

Parameter to be measured:

Water depth (subsurface measurement)

Method:

echo sounder (sonar)

Equipment:

Fahrentholz BBES 30/100/200/700

Advantages:

  • no-frills, sturdy device
  • simple recording with any COM-port monitor (e.g. Telnet, Hyperterminal)
  • up to 40 measurements/sec over a depth range of 0.1 to 100 m
  • custom modifications on request
  • can be used in conjunction with (d)GPS; synchronize both records via timestamps (see scripts below)
  • rugged, bulky, Landrover-style build

Disadvantages:

  • computer connection needed, no internal logging
  • COM-port needed
  • external 12/24 V battery needed
  • no logging software provided (see scripts below)
  • no real-time display of values on the device itself

What to watch out for:

  • may occasionally lose calibration and require resetting (press a button and the device is back in seconds)
  • when attached to the stern, bubbles in the wake may produce a noisy signal
  • when used with (d)GPS, synchronize the system time of the laptop with GPS time. BEWARE: true GPS time is 15 sec ahead of UTC (as of Nov 2010), and this offset grows whenever a leap second is inserted. Check which of the two the GPS displays and which one it records (these may differ!). A sketch of how to correct for the offset follows below.
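
A minimal sketch of such a correction in R (assuming, as in the scripts below, that the echo-sounder log has already been read into a data frame echo_log with a POSIXct column time; the offset value is the Nov 2010 figure and must be verified for your survey date):

gps_utc_offset <- 15                              # GPS time minus UTC in seconds (Nov 2010 value -- verify!)
echo_log$time <- echo_log$time + gps_utc_offset   # shift the UTC-stamped log onto GPS time; POSIXct arithmetic is in seconds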

Problems/Questions:

Price:

~ 2500 € (2003, Germany)

Links

Projects that used the above equipment:


Other related web sites: http://www.fahrentholz.de/bbesneu.htm

Python script (logs timestamped values from the serial port and aggregates them): File:Echolot2.zip

References

Scripts

readme.txt:

echolot
-------
reads data from the echo-sounder (Fahrentholz Echolot-Sensor) via the serial port, aggregates it to a specified timestep, and writes the data with a timestamp to a file
Till Francke; 10.3.2010
Please don't bother me with bugs, improve the script if you can.

Possibly better alternative (not tested): http://www.geo.uib.no/polarhovercraft/index.php?n=Main.GPSEchoSounderLogging

requirements:
- Python (>=2.6.4) with packages numpy (>=1.3.0) and pyserial (>=2.5)
- laptop with serial port (or a reliable USB-COM-converter)

initial test:
- connect to the echo-sounder using a terminal program (e.g. Hyperterminal, PuTTY), choosing the right COM port and 4800 baud

settings:
- open "echolot.py" with a text editor and customize the following lines. Don't add or remove Tabs.
	time_interval=3     #averaging interval used for file output [seconds]
    log_file="e:/temp/comlog.txt"   #output file
    file_mode="w"               #file mode: "w": overwrite existing files; "a": append to existing files
	[...]
    ser.port = 1            #COM-port 0:com1; 1:com2, etc.
- if you want to synchronize the logfile with a GPS-track, verify that the system time is set correctly

operation:
- run "echolot.py". A console window will open and display current measurements, min and max values
- the specified log file should grow

postprocessing:
- you can use the R-script (needs R installed) "link2GPX.r" as a start for syncing an echo-sounder log with a GPS-track:
	- convert the GPS-log to GPX
	- adjust the filenames near "gps_log=..." and "echo_log=..."
	- adjust the time offset near "time_offsets=c(15,0)". Beware: some GPS-receivers may log a different time than they display. The offset is then the real time minus the GPS time, which is 15 sec in 2010.
	

Copy the contents of this box and save it as echolot.py. echolot.py:

#!/usr/bin/python
#reads data from echo-sounder (Fahrentholz Echolot-Sensor) via serial port (COM port), aggregates and writes to file
#Till Francke; 10.3.2010
#needs PySerial, NumPy

import csv
from datetime import datetime

import serial
from numpy import isnan, mean, nan, std

time_interval = 3           #averaging interval used for file output [seconds]
log_file = "./comlog.txt"   #output file
file_mode = "a"             #file mode: "w": overwrite; "a": append

ser = serial.Serial()
#ser.baudrate = 57600   #MRR
ser.baudrate = 4800     #Echolot
ser.port = 0            #COM-port 0:com1; 1:com2
ser.timeout = 0.5


def convertStr(s):
    #convert a received line to float; return NaN for uninterpretable lines
    try:
        return float(s)
    except ValueError:
        return nan


with open(log_file, file_mode) as f:
    spamWriter = csv.writer(f, delimiter='\t', quotechar='|', lineterminator="\n", quoting=csv.QUOTE_MINIMAL)
    if file_mode == 'w':
        spamWriter.writerow(["Time", "meandepth[cm]", "min", "max", "std"])   #write header
    ser.open()

    try:
        while True:
            now = datetime.today()
            elapsed = datetime.today() - now
            errorline = ""
            a = []

            while elapsed.seconds < time_interval:   #collect readings for one averaging interval
                line = ser.readline()                #read a '\n'-terminated line
                line = line.rstrip()
                convertedvalue = convertStr(line)
                if isnan(convertedvalue):
                    errorline = line                 #save uninterpretable line
                a.append(convertedvalue)
                elapsed = datetime.today() - now
            print(str(time_interval) + " seconds interval")

            if errorline == "":
                print('mean:{0:2f} min:{1:3f} max:{2:4f} std:{3:4f}'.format(mean(a), min(a), max(a), std(a)))
                spamWriter.writerow([datetime.today().strftime("%Y-%m-%d %H:%M:%S"), mean(a), min(a), max(a), std(a)])
                if isnan(mean(a)):
                    print("\a")                      #beep on an all-NaN interval
                f.flush()
            else:
                print("received: " + errorline + "\a")
    finally:
        ser.close()   #close port (reached e.g. on Ctrl+C)

Scripts for linking the recorded file to a GPX-track. link2GPX.r:

##join logger files by timestamp

source("join_logfiles_functions.r")   #declare necessary functions 

#import GPS-log
  gps_log=read.table(file="DGPS_Data_Bathymetrie.csv", sep=";", header=TRUE, dec=",")
  gps_log$Control=NULL
  gps_log$Method=NULL
  gps_log$Orbit=NULL
  gps_log$Projection=NULL
  gps_log$time=as.POSIXct(strptime(as.character(gps_log$Start.Time),"%Y-%m-%d %H:%M:%S"))      #,tz="GMT"
  gps_log$Start.Time=NULL
  gps_log$id=1:nrow(gps_log)
  
#import echo-log
  echo_log=read.table(file="comlog.txt", sep="\t", header=TRUE)
  echo_log$time=as.POSIXct(strptime(as.character(echo_log$Time),"%Y-%m-%d %H:%M:%S"))       #,tz="GMT"
  echo_log$Time=NULL

##adjust shift
  time_diff= diff(as.numeric(gps_log$time)) #compute time difference between successive log points
#  time_diffi=which(time_diff==0)
#  write.table(gps_log[sort(c(time_diffi,time_diffi+1,time_diffi+2,time_diffi-1)),],"duplicates.csv",row.names=F,quote=F,sep="\t")

  time_diff[time_diff > 5]=0     #discard time differences larger than 5 s

  gps_log$speed_ms=c(Inf, sqrt(diff(gps_log$Grid.Northing..m.)^2 + diff(gps_log$Grid.Easting..m.)^2) / time_diff) *3.6   #speed: the factor 3.6 converts m/s to km/h, so the values are km/h despite the column name
  gps_log$speed_ms[is.na(gps_log$speed_ms)]=Inf
  gps_log$speed_ms2=c(gps_log$speed_ms[-1],Inf) #calculate speed from previous and to next GPS-point

##clean gps-log
  gps_log$Elevation..m.[gps_log$Vertical.Precision..m.>0.5]=Inf    #remove elevation records with poor reception
   


#  
#  graphics.off()
#  firstdaystart=ISOdatetime(2010,10,3,16,17,40)    #, tz="GMT"     #day1 zoom : 14 s
#  firstdayend=ISOdatetime(2010,10,3,16,18,30)    #, tz="GMT"
# 
#  firstdaystart=ISOdatetime(2010,10,7,12,32,15)    #, tz="GMT"         #day2 zoom : 15 s
#  firstdayend=ISOdatetime(2010,10,7,12,32,50)    #, tz="GMT"
# 
#  firstdaystart=ISOdatetime(2010,10,7,15,32,0)    #, tz="GMT"         #day2 slow
#  firstdayend=  ISOdatetime(2010,10,7,15,39,0)    #, tz="GMT"
# 
#  firstdaystart=ISOdatetime(2010,10,7,17,33,40)    #, tz="GMT"         #day2 end
#  firstdayend=  ISOdatetime(2010,10,7,17,34,40)    #, tz="GMT"
#
#  firstdaystart=ISOdatetime(2010,10,3,16,36,0)    #, tz="GMT"         #day1 slow
#  firstdayend=  ISOdatetime(2010,10,3,16,46,0)    #, tz="GMT"
#
#  firstdaystart=ISOdatetime(2010,10,3,18,16,0)    #, tz="GMT"         #day1 end
#  firstdayend=  ISOdatetime(2010,10,3,20,46,0)    #, tz="GMT"
#
# 
#  first_day_echo=echo_log$time < firstdayend & echo_log$time > firstdaystart
#  first_day_gps=gps_log$time < firstdayend & gps_log$time > firstdaystart
#  
#  plot(echo_log$time[first_day_echo],echo_log$meandepth.cm[first_day_echo]/100,type="l", pch=".", ylab="depth [m] / speed [km/h]", main="day1", xaxt="n",
#  ylim=c(0, max(echo_log$meandepth.cm[first_day_echo]/100,gps_log$speed_ms[first_day_gps],na.rm=TRUE))) #
#  axis.POSIXct(1, x=echo_log$time[first_day_echo], format="%H:%M:%S")
#  lines(gps_log$time[first_day_gps],gps_log$speed_ms[first_day_gps],type="l", pch=".", col="red")
#  legend("topright",c("depth","speed"), col=c("black","red"),lty=1)
# 
## echo_log$time <- echo_log$time + 15 
#  
#  #plot map
#  windows()
#  plot(gps_log$Grid.Easting..m.,gps_log$Grid.Northing..m.,type="p",pch=".")
#  points(gps_log$Grid.Easting..m.[first_day_gps],gps_log$Grid.Northing..m.[first_day_gps],type="p",pch="o",col="red")
#  legend("topright",c("all","selection"), col=c("black","red"),lty=1)


#join datasets
  dataset_list=list(echo=echo_log,gps=gps_log)      #specify which data frames will be joined (the first in the list determines the temporal resolution)
  time_offsets=c(15,0)                               #these numbers will be added to the respective time columns to adjust for offsets
  for (i in 1:length(dataset_list))
   dataset_list[[i]]$time <- dataset_list[[i]]$time + time_offsets[i] 
   
  
  join_field="time"                                           #field, by which the joining takes place (usually time)
  
  options(warn=1)
  joined_data_set=join_datasets(dataset_list=dataset_list,join_field=join_field)  #join the data
  joined_data_set$gps.speed_ms[joined_data_set$gps.speed_ms==Inf]=NA         #this prevents nodata values from being interpolated
  joined_data_set$gps.speed_ms2[joined_data_set$gps.speed_ms2==Inf]=NA
  joined_data_set$gps.Elevation..m.[joined_data_set$gps.Elevation..m.==Inf]=NA



# analyse dGPS height, correct for water level and speed effects
  joined_data_set$speed_kmh=apply(joined_data_set[,c("gps.speed_ms","gps.speed_ms2")],1,mean,na.rm=T)

  campaigns=list(
    c(start=ISOdatetime(2010,10,3,8,00,00, tz="GMT"), end=ISOdatetime(2010,10,3,19,00,00)),
    c(start=ISOdatetime(2010,10,7,8,00,00), end=ISOdatetime(2010,10,7,17,34,00))
    )
 
  dist_antenna_sensor= 1.635   #distance between echo sensor and GPS antenna in m
  
  joined_data_set$bathy_elevation=0 #prepare column for bathymetry information
  for (i in 1:length(campaigns))
  {
    index=(joined_data_set$time >= campaigns[[i]]["start"] & joined_data_set$time < campaigns[[i]]["end"])
    antenna_elev_time=lm(gps.Elevation..m.~time,data=joined_data_set[index,])

    antenna_elev_speed=lm(gps.Elevation..m. ~ I(speed_kmh*speed_kmh)+speed_kmh,data=joined_data_set[index,])

    antenna_elev_speedtime=lm(gps.Elevation..m. ~ (speed_kmh*time)+I(speed_kmh*speed_kmh),data=joined_data_set[index,],na.action=na.exclude)

   
    #time dependence
    plot(joined_data_set$time[index],joined_data_set$gps.Elevation..m.[index],type="l")
    timespan=seq(from=min(joined_data_set$time[index]),to=max(joined_data_set$time[index]),length.out=20)
    lines(timespan, predict(antenna_elev_time,newdata=data.frame(time=timespan)),col="red") 
    antenna_elev_predicted= predict(antenna_elev_speedtime,newdata=data.frame(speed_kmh=mean(joined_data_set$speed_kmh[index],na.rm=T),time=joined_data_set$time[index]))
    lines(joined_data_set$time[index], antenna_elev_predicted,col="blue")

    windows()
    #speed dependence
    plot(joined_data_set$speed_kmh[index],joined_data_set$gps.Elevation..m.[index],type="p")
    speedspan=seq(from=min(joined_data_set$speed_kmh[index],na.rm=TRUE),to=max(joined_data_set$speed_kmh[index],na.rm=TRUE),length.out=20)
    lines(speedspan, predict(antenna_elev_speed,newdata=data.frame(speed_kmh=speedspan)),col="red")
    antenna_elev_predicted= predict(antenna_elev_speedtime,newdata=data.frame(time=mean(joined_data_set$time[index],na.rm=T),speed_kmh=joined_data_set$speed_kmh[index]))
    points(joined_data_set$speed_kmh[index], antenna_elev_predicted,col="blue",pch=".")

    windows()
    #bivariate dependence
    antenna_elev_predicted=    predict(antenna_elev_speedtime)
    #plot residuals
    plot(joined_data_set$time[index],joined_data_set$gps.Elevation..m.[index]-antenna_elev_predicted,type="l")

    joined_data_set$bathy_elevation[index]=antenna_elev_predicted - dist_antenna_sensor - joined_data_set$meandepth.cm.[index]/100  

    #correct original gps_log, too
    index=(gps_log$time >= campaigns[[i]]["start"] & gps_log$time < campaigns[[i]]["end"])
    antenna_elev_predicted=    predict(antenna_elev_speedtime,newdata=data.frame(time=gps_log$time[index], speed_kmh=apply(gps_log[index,c("speed_ms","speed_ms2")],1,mean,na.rm=T) ,na.rm=T))
    gps_log$sensor_elev[index]=antenna_elev_predicted - dist_antenna_sensor 

   }
  


#explore data
  library(fields)
  n=round(max(joined_data_set$meandepth.cm., na.rm=TRUE)/100)

  pal=tim.colors(n=n)   #one colour per metre of depth; note palette(tim.colors(n)) would return the *previous* palette, not this one
  
#  split.screen( rbind(c(0,.8,0,1), c(.8,1,0,1)))
# first image
   sset=1:nrow(joined_data_set)
  plot(joined_data_set$gps.Grid.Easting..m.[sset],joined_data_set$gps.Grid.Northing..m.[sset], col= pal[1+floor(joined_data_set$meandepth.cm.[sset]/100)],
   type="p", pch=".",cex=2)  
  if (exists('point_pairs'))
  for (i in (1:nrow(point_pairs)))
  {
    pair=which(joined_data_set$gps.id %in%  point_pairs[i,])
    if (length(pair)==2)
    {
        lines (joined_data_set$gps.Grid.Easting..m.[pair],joined_data_set$gps.Grid.Northing..m.[pair], type="l", col="black")  
        points(joined_data_set$gps.Grid.Easting..m.[pair],joined_data_set$gps.Grid.Northing..m.[pair], type="p", pch="o",cex=2,col="black")  
     }
   }
   

  
#  screen(2)
 #  plot(rep(1,n),1:n, type='p', pch=22,lwd=2, col=pal, bg=pal, xaxt="n",xlab="", ylab="depth (m)", oma=5+c(1,2,3,4))
 # image.plot(matrix(joined_data_set$meandepth.cm.,ncol=1),legend.only=TRUE, smallplot=c(0,.3, .3,.7), 
 #   col=pal,main='%')   #colorbar
  image.plot(matrix(joined_data_set$meandepth.cm.,ncol=1),legend.only=TRUE, smallplot=c(0.0,.1, .1,.7), 
    col=pal,main='%')   #colorbar
#  screen(1)
#  identify(joined_data_set$gps.Grid.Easting..m.[sset],joined_data_set$gps.Grid.Northing..m.[sset],joined_data_set$meandepth.cm.[sset])
  close.screen(all=TRUE)
   
   source("zoom_plot.r")
   zoom_plot(joined_data_set$gps.Grid.Easting..m.,joined_data_set$gps.Grid.Northing..m.,joined_data_set$meandepth.cm.) 
   identify(joined_data_set$gps.Grid.Easting..m.,joined_data_set$gps.Grid.Northing..m.,joined_data_set$meandepth.cm.)


#find adjacent points
  search_radius=1         #maximum spatial distance in m
  min_time = 2*60         # minimum temporal distance in s
  num_time= as.numeric(gps_log$time) #speed
  point_pairs=NULL
  for (i in 1:nrow(gps_log))
  {
    
    valid_points=(gps_log$id > i) & (abs(num_time[i]-num_time)>=min_time) #determine points not yet treated and with minimum temporal distance to current point
    dists2 = ( gps_log$Grid.Easting..m.[valid_points] - gps_log$Grid.Easting..m.[i] )^2 + ( gps_log$Grid.Northing..m.[valid_points] - gps_log$Grid.Northing..m.[i] )^2
    within_dist= dists2<=search_radius^2   #compare squared distances against the squared radius
    if (any(within_dist))
    {
      closest=which.min(dists2)
      point_pairs=rbind(point_pairs,c(i, which(valid_points)[closest]))   
      print(i); flush.console()
    }
  }
  save(file="pointpairs.Rdata", list="point_pairs")


 

#use replicate points only
  correlations=NULL
  gps=gps_log[unique(as.vector(point_pairs)),]

  for (offsettime in -5:20)
  {
    echo=echo_log
    echo$time=echo$time+offsettime

    echo= echo[echo$time %in% intersect(echo$time,gps$time),]

    dataset_list=list(echo=echo,gps=gps)      #specify which data frames will be joined (the first in the list determines the temporal resolution)
    
   
    join_field="time"                                           #field, by which the joining takes place (usually time)
    options(warn=1)
    joined_data_set=join_datasets(dataset_list=dataset_list,join_field=join_field)  #join the data
  
    #plot comparison of depth values of intersection points
    depths=NULL
    sd_depth=NULL
    hor_prec_gps=NULL
    nomatch=(joined_data_set$gps.id != round(joined_data_set$gps.id) | (abs(joined_data_set$gps.bathy_elevation)==Inf))
    joined_data_set=joined_data_set[!nomatch,]
    for (i in 1:nrow(point_pairs))
    {
      pair_rows=joined_data_set$gps.id %in% point_pairs[i,]
 #     coords=t(joined_data_set[pair_rows,"meandepth.cm."])
      coords=t(joined_data_set[pair_rows,c("gps.sensor_elev","meandepth.cm.")])
      if (ncol(coords) != 2) next else
      depths=rbind(depths,coords[1,]-coords[2,]/100)
      sd_depth=c(sd_depth,sum(joined_data_set[pair_rows,"std"]))
      hor_prec_gps=c(hor_prec_gps,sum(joined_data_set[pair_rows,"gps.Horizontal.Precision..m."]))
    }

#    plot(depths,xlab="depth 1st pass [cm]",ylab="depth 2nd pass [cm]",main="offset 15 s", col= gray((hor_prec_gps-min(hor_prec_gps))/(max(hor_prec_gps)-min(hor_prec_gps))), type="p", pch=".",cex=10)
    plot(depths,xlab="depth 1st pass [cm]",ylab="depth 2nd pass [cm]",main="offset 15 s, speed corrected", col= "magenta", type="p", pch="o")
    abline(0,1)
   
    correlations=rbind(correlations,c(offsettime,  cor(depths[,1],depths[,2],use="pairwise.complete.obs")))
   }
   plot(correlations[,1],correlations[,2], xlim=c(-5,20),xlab="temporal offset of echo log [s]",ylab="correlation of depths at same location")
 
 




#export
  column_order=c(   #for ordering the columns to match for import as 3d-ASCII
#  which(names(joined_data_set) == "gps.id"),
  which(names(joined_data_set) == "gps.Grid.Northing..m."),
  which(names(joined_data_set) == "gps.Grid.Easting..m."),
  which(names(joined_data_set) == "bathy_elevation")
 # which(names(joined_data_set) == "time")
  )
 
  #for import into RealWorks
  write.table(file="joined_data.asc",na.omit(joined_data_set[,column_order]),quote=FALSE,row.names=FALSE,sep="\t",col.names=F)      #write joined data to file
  #all data
  write.table(file="joined_data.txt",joined_data_set,quote=FALSE,row.names=FALSE,sep="\t")      #write joined data to file

join_logfiles_functions.r:

##auxiliary functions for joining of logfiles (e.g. by timestamp)

parse_gpx= function (gpxfile)
#extract lat, lon and time from all waypoints and trackpoints in a GPX-file
{
  #gpx=scan(file = gpxfile, what = character(), n = -1)
  #gpx=paste(gpx,collapse=" ",sep=" ")

  zz <- file(gpxfile, "rb")
  gpx=readChar(zz, 1e8, useBytes = FALSE)
  close(zz)

  extract_value=function(haystack, needle)     #extract the value from structure blabla needle=value
  {
    keyval=gregexpr(paste(needle,"[^\"]*",sep=""), haystack, ignore.case = TRUE,perl=T)[[1]]  #extract key and value
    val=sub(needle,"",substr(haystack,keyval[1],keyval[1]+attr(keyval,"match.length")-1))   #extract value
    return(val)
  }

  extract_value2=function(haystack, needle)     #extract the value from structure blabla <needle> value </needle>
  {
    keyval=gregexpr(paste("<",needle,">[^<]*",sep=""), chunk, ignore.case = TRUE,perl=T)[[1]]  #extract key and value
    val=sub(paste("<",needle,">",sep=""),"",substr(haystack,keyval[1],keyval[1]+attr(keyval,"match.length")-1))   #extract value
    return(val)
  }

  pointframe=data.frame(ptype=rep("p",0),lat=rep("p",0),lon=rep("p",0),time=rep("p",0),stringsAsFactors=F)
  pointpos=gregexpr("<wpt|<trkpt", gpx, ignore.case = TRUE)[[1]]
  pointposwithend= c(pointpos, nchar(gpx))
  for (i in 1:length(pointpos))
  {
    chunk=substr(gpx,pointpos[i],pointposwithend[i+1]-1) #treat substring only
    if (substr(chunk,1,4)=="<wpt") ptype="wpt" else ptype="tpt"    #determine point type
    pointframe=rbind(pointframe,data.frame(ptype=ptype,lat=extract_value(chunk, "lat[ ]*=[ ]*\""),lon=extract_value(chunk, "lon[ ]*=[ ]*\""),time=extract_value2(chunk, "time[ ]*"),stringsAsFactors=F))
  }
  return(pointframe)
}
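
A minimal usage sketch (the file name "track.gpx" is made up; GPX stores times as ISO 8601 strings in UTC):

pts=parse_gpx("track.gpx")    #data frame with character columns ptype, lat, lon, time
pts$time=as.POSIXct(strptime(pts$time,"%Y-%m-%dT%H:%M:%S",tz="GMT"))   #parse ISO 8601 timestamps (a trailing "Z" is ignored)
head(pts)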

join_datasets=function(dataset_list,join_field)
#join numeric columns of datasets contained in the named list "dataset_list" using the column "join_field", which must exist in all datasets
#the column "join_field" of the first set is used in the resulting output dataset, all other values are linearly interpolated
{
  for (i in 1:length(dataset_list))   #check availability of time information
    if (!(join_field %in% names(dataset_list[[i]])))
        stop(paste("Dataset",i,"doesn't have a column",join_field))

  joined_data_set=dataset_list[[1]]   #use first dataset as target resolution
  for (i in 2:length(dataset_list))   #successively join all datasets
  {
    for (j in 1:ncol(dataset_list[[i]]))
    {
      if (names(dataset_list[[i]])[j] == join_field) next  #skip join field

      if (is.numeric(dataset_list[[i]][,j])) method="linear" else next     #only numbers can be interpolated
      column = approx(x=dataset_list[[i]][,join_field], y=dataset_list[[i]][,j], joined_data_set[,join_field], rule=c(2,2), method=method)      #linear interpolation of new column
#      browser()
      joined_data_set=cbind(joined_data_set,column$y)                                                                         #join interpolated column to common dataset
      names(joined_data_set)[ncol(joined_data_set)] = paste(names(dataset_list)[i],".",names(dataset_list[[i]])[j],sep="")  #label column
    }
  }
  return(joined_data_set)
}
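
A self-contained toy example of how join_datasets resamples (the data are made up): the numeric columns of the second set are linearly interpolated onto the join_field values of the first, with constant extrapolation at the ends (rule=c(2,2)):

a=data.frame(time=1:10, depth=rnorm(10))           #fine-resolution "master" set
b=data.frame(time=c(1,4,7,10), elev=c(0,3,6,9))    #coarse set to be interpolated
joined=join_datasets(dataset_list=list(echo=a,gps=b), join_field="time")
#joined contains columns time, depth and gps.elev; gps.elev holds b$elev interpolated onto a$time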

plot_functions.r:

collect_legend=function(namestr,pch,lty,col)
{
	assign("namestr_all", cbind(namestr_all,namestr), envir = .GlobalEnv)
	assign("pch_all", cbind(pch_all,pch), envir = .GlobalEnv)
	assign("lty_all", cbind(lty_all,lty), envir = .GlobalEnv)
	assign("col_all", cbind(col_all,col), envir = .GlobalEnv)
}

reset_legend=function()
{
	assign("namestr_all", NULL, envir = .GlobalEnv)
	assign("pch_all", NULL, envir = .GlobalEnv)
	assign("lty_all", NULL, envir = .GlobalEnv)
	assign("col_all", NULL, envir = .GlobalEnv)
}

draw_legend=function(xpos,ypos)
{
	legend(xpos, ypos, namestr_all, col=col_all, pch=pch_all,lty=lty_all)
}

	
#break a string into substrings of at most maxlength characters, separated by newline chars (minlength is currently unused)
str_break = function (stri,minlength,maxlength)
{
  sparts=strsplit(stri, " ")
  sparts=sparts[[1]]
  #sapply(sparts,nchar)

  resultstr=""
  linestr=""
  while (!is.na(sparts[1]))
  {
      len=nchar(sparts[1])
      if (len+nchar(linestr)<=maxlength)     #substring still fits into the current line
      {
         linestr=paste(linestr,sparts[1]," ",sep="") #append substring to current line
         sparts=sparts[-1]                    #mark this substr as treated
      }  else if (len < maxlength)                  #substring fits into a fresh line
      {
         resultstr=paste(resultstr,linestr,"\n",sep="") #append current line to resultstring
         linestr=sparts[1]                      #empty current line and append substring to current line
         sparts=sparts[-1]                       #mark this substr as treated
      } else {                                     #substring is too long anyway, use the knife
        limit=maxlength-nchar(linestr)
        sparts=c(substr(sparts[1],1,limit),substr(sparts[1],limit+1,1000000),sparts)
        sparts=sparts[-3]
      }
      if (nchar(linestr)>=maxlength-1)  #flush the rest
      {
        resultstr=paste(resultstr,linestr,"\n",sep="") #append current line to resultstring
        linestr=""
      }
  }
  resultstr=paste(resultstr,linestr) #append current line to resultstring
  
  return (resultstr)
}
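
For illustration (made-up string):

cat(str_break("a rather long plot title that would not fit into a single line", 0, 25))
#prints the title broken into lines of at most 25 characters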

#return date-vector for reasonable scaling and labelling of time-axis
#time_range: vector containing min and max of desired timespan
dateticks=function(time_range)
{
 mintime=as.numeric(min(time_range))+strptime("1970-01-01", "%Y-%m-%d", tz="GMT")
 maxtime=as.numeric(max(time_range))+strptime("1970-01-01", "%Y-%m-%d", tz="GMT")
 time_range=difftime(maxtime,mintime,units="hours")
        if (time_range<48)
        {
          by="1 hours"
          format="%d/%m %Hh"
        } else
        if (time_range<7*24)
        {
          by="1 days"
          format="%d/%m/%y"
        } else
        if (time_range<30*24)
        {
          by="2 days"
          format="%d/%m/%y"
        } else
        if (time_range<180*24)
        {
          by="1 weeks"
          format="%d/%m/%y"
        } else
        if (time_range<366*24)
        {
          by="2 weeks"
          format="%d/%m/%y"
        } else
        if (time_range<3*366*24)
        {
          by="1 months"
          format="%m/%y"
        } else
        if (time_range<10*366*24)
        {
          by="6 months"
          format="%m/%y"
        } else
        {
          by="1 years"
          format="%y"
        }

        at = seq(mintime, maxtime, by=by)     #define tick locations
        return(list(at,format))
}
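
A short usage sketch (made-up series):

times=seq(ISOdatetime(2010,10,3,8,0,0), ISOdatetime(2010,10,3,20,0,0), by="10 min")
plot(times, rnorm(length(times)), type="l", xaxt="n", xlab="time", ylab="value")
ticks=dateticks(range(times))                      #list: tick positions, label format
axis.POSIXct(1, x=times, at=ticks[[1]], format=ticks[[2]])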

zoom_plot.r:

#use zooming/panning functions 

#2009-04-06 
#Till: dynamic labelling of time axis

################################################################################
################################################################################
################################################################################

source("plot_functions.r")

zoom <- function(plotfunc, extent, xdata,ydata,zdata){
	# (C) 2007 GPL by Huidae Cho <http://geni.ath.cx>
	# Simple R function for interactive zooming/panning
	#
	# Example:
	# data <- runif(100)*10
	# extent <- list(x=c(1, 100), y=c(0, 10))
	# plotfunc <- function(lim){
	# 	plot(data, xlim=lim$x, ylim=lim$y)
	# 	abline(mean(data), 0, col="red")
	# }
	# zoom(plotfunc, extent)

	print("Zoom in:     Click two corners", quote=FALSE)
	print("Zoom out:    Click above plot", quote=FALSE)
	print("Prev extent: Click left of plot", quote=FALSE)
	print("Next extent: Click right of plot", quote=FALSE)
	print("Full extent: Click below plot", quote=FALSE)
	print("Pan:         Double click", quote=FALSE)
	print("Quit:        Right button", quote=FALSE)

	lim <- extent
	lim.stack <- c(lim$x, lim$y)
	lim.depth <- 1
	lim.cur <- 1

	repeat{
		print(lim)
		flush.console()

		plotfunc(lim,xdata,ydata,zdata)

		options(locatorBell=FALSE)
		l <- locator(1)
		if(is.null(l))
			break
		ext <- par()$usr
		if(l$x < ext[1] || l$x > ext[2]){
			cur <- lim.cur
			lim.cur <- if(l$x < ext[1]) max(lim.cur-1, 1)
				else min(lim.cur+1, lim.depth)
			if(lim.cur != cur)
				lim <- list(x=lim.stack[lim.cur, 1:2],
					y=lim.stack[lim.cur, 3:4])
			next
		}
		if(l$y < ext[3])
			lim <- extent
		else
		if(l$y > ext[4]){
			cx <- (lim$x[1] + lim$x[2]) / 2
			cy <- (lim$y[1] + lim$y[2]) / 2
			w <- lim$x[2] - lim$x[1]
			h <- lim$y[2] - lim$y[1]
			lim <- list(x=c(cx-w, cx+w), y=c(cy-h, cy+h))
		}else{
			l2 <- locator(1)
			if(is.null(l2))
				break
			if(sum(l$x == l2$x) || sum(l$y == l2$y)){
				w <- lim$x[2] - lim$x[1]
				h <- lim$y[2] - lim$y[1]
				lim <- list(x=c(l2$x-w/2, l2$x+w/2),
					y=c(l2$y-h/2, l2$y+h/2))
			}else
				lim <- list(x=sort(c(l$x, l2$x)),
					y=sort(c(l$y, l2$y)))
		}
		if(lim.cur < lim.depth){
			lim.stack <- lim.stack[-((lim.cur+1):lim.depth),]
			lim.depth <- lim.cur
		}
		lim.stack <- rbind(lim.stack, c(lim$x, lim$y))
		lim.depth <- lim.depth + 1
		lim.cur <- lim.cur + 1
	}

	lim
}

################################################################################
################################################################################
################################################################################

# The function that plots the time series
plotfunc <- function(lim,xdata,ydata,zdata) {
 
  scales=max(diff(lim$x), diff(lim$y))
  # Plot the series
 
# first image
  #screen(1)
 # browser()

  plot(xdata,ydata, col= pal[1+floor(zdata/100)], type="p", pch=".",cex=max(2,3000/scales), xlim=lim$x, ylim=lim$y)   #index with 1+floor(...) so depths below 100 cm get a valid colour (as in the map plot of link2GPX.r)
  if (exists('point_pairs'))
  for (i in (1:nrow(point_pairs)))
  {
    pair=which(joined_data_set$gps.id %in%  point_pairs[i,])
    if (length(pair)==2)
    {
        lines (joined_data_set$gps.Grid.Easting..m.[pair],joined_data_set$gps.Grid.Northing..m.[pair], type="l", col="black")  
        points(joined_data_set$gps.Grid.Easting..m.[pair],joined_data_set$gps.Grid.Northing..m.[pair], type="p", pch="o",cex=2,col="black")  
     }
   }

}


zoom_plot = function(xdata,ydata,zdata)
{
  n=round(max(zdata, na.rm=TRUE)/100)
  pal <<- tim.colors(n=n)   #assign in the global environment, where plotfunc() looks up "pal"; palette() would return the previous palette
 
  
  # Set full plot extent
  extent= list(x=range(xdata),y=range(ydata))
  # Plot with zoom option
#  windows(width=7,height=5)
  split.screen( rbind(c(0,.8,0,1), c(.8,1,0,1)))
  screen(2)
  image.plot(matrix(joined_data_set$meandepth.cm.,ncol=1),legend.only=TRUE, smallplot=c(0,.3, .3,.7), 
    col=pal,main='%')   #colorbar
  screen(1)   
    zoom(plotfunc, extent,xdata,ydata,zdata)
  close.screen(all=TRUE)
}