
About me

I have long worked on the practical application of weather forecasting and service modeling, focusing on meteorological physical fields, observation fields, geographic information, ontology knowledge bases, and distributed meteorological content management systems. I have some experience with Barnes objective analysis, wavelets, computational neural networks, belief propagation, Bayesian inference, expert systems, and the Web Ontology Language, and have long programmed in Java, Delphi, Prolog, and SQL.

RAMADDA point data framework  

2013-03-09 07:13:57 | Category: Ramadda


This page describes how to develop a new point data file reader using the RAMADDA point data framework, which is built on a general record-file reading layer. To support a new point data format, all that is required is to define a new Java File class that creates a Record class that knows how to read one record from the file.

Getting Started

First, check out the core RAMADDA package
svn co https://ramadda.svn.sourceforge.net/svnroot/ramadda
And check out the NLAS package:
svn co https://nlas.svn.sourceforge.net/svnroot/nlas/trunk
We'll use the readers developed for IceBridge data, which live in the NLAS package. Change into the readers' directory:
cd trunk/src/org/unavco/data/lidar/icebridge/

Mcords IRMCR2 Text Format

One of the readers defined in this package supports the Mcords IRMCR2 text data. The Mcords data is available here: ftp://n4ftl01u.ecs.nasa.gov/SAN2/ICEBRIDGE_FTP/BRMCR2_MCORDSiceThickness_v01 and looks like:
LAT,LON,TIME,THICK,ELEVATION,FRAME,SURFACE,BOTTOM,QUALITY
76.807589,-48.918178,48974.2143,-9999.00,4158.4286,2007091001001, -5.87,-9999.00,0
76.807579,-48.917978,48974.2504,-9999.00,4158.5008,2007091001001, -4.63,-9999.00,0
76.807569,-48.917778,48974.2865,-9999.00,4158.5731,2007091001001, -3.40,-9999.00,0
To provide support for this data format we need to create two classes: McordsIrmcr2File and McordsIrmcr2Record. The basic structure is that the "File" classes are what get instantiated; they can do some initialization (e.g., read the header) and create a Record class that is used to read and store the values for one line or record of data.
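This File/Record split can be sketched in plain Java as follows. This is a simplified, hypothetical illustration: the class names ending in "Sketch" and their method signatures are inventions for this sketch, not the actual RAMADDA record-framework API.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class PointFileSketch {
    /** The "File" class: does initialization (here, skipping the one-line
        CSV header) and hands out Record objects that parse one line each. */
    static class McordsIrmcr2FileSketch {
        private final BufferedReader reader;
        McordsIrmcr2FileSketch(BufferedReader reader) throws IOException {
            this.reader = reader;
            reader.readLine(); // skip the LAT,LON,TIME,... header line
        }
        /** Read the next record, or return null at end of file. */
        McordsIrmcr2RecordSketch readRecord() throws IOException {
            String line = reader.readLine();
            if (line == null) return null;
            McordsIrmcr2RecordSketch record = new McordsIrmcr2RecordSketch();
            record.read(line);
            return record;
        }
    }

    /** The "Record" class: holds the values of one data line. */
    static class McordsIrmcr2RecordSketch {
        double latitude, longitude, thickness;
        void read(String line) {
            String[] toks = line.split(",");
            latitude  = Double.parseDouble(toks[0].trim());
            longitude = Double.parseDouble(toks[1].trim());
            thickness = Double.parseDouble(toks[3].trim());
        }
    }

    public static void main(String[] args) throws IOException {
        String data = "LAT,LON,TIME,THICK,ELEVATION,FRAME,SURFACE,BOTTOM,QUALITY\n"
            + "76.807589,-48.918178,48974.2143,-9999.00,4158.4286,2007091001001, -5.87,-9999.00,0\n";
        McordsIrmcr2FileSketch file =
            new McordsIrmcr2FileSketch(new BufferedReader(new StringReader(data)));
        McordsIrmcr2RecordSketch rec = file.readRecord();
        System.out.println(rec.latitude + " " + rec.thickness);
    }
}
```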
One could hand-write both the File and the Record class, but RAMADDA provides a data-dictionary-based code generation facility. In the IceBridge package there is a definerecords.tcl script containing the data dictionary that generates the Java code for the various readers. To run it:
tclsh definerecords.tcl
This script generates a self-contained McordsIrmcr2File class, which contains a generated McordsIrmcr2Record class that does the actual reading. The code is generated by the generateRecordClass procedure defined in ../..record/generate.tcl. The following arguments are used:
org.unavco.data.lidar.icebridge.McordsIrmcr2Record: generate this Java class
-lineoriented 1: this is a line-oriented text file, not a binary file
-delimiter {,}: comma delimited
-skiplines 1: skip the first line in the text file; it is a header
-makefile 1: normally generateRecordClass generates just a Record class; this says to also generate a McordsIrmcr2File class that contains the Record class, making the reader self-contained
-filesuper org.ramadda.data.point.text.TextFile: the super class of the file class
-super org.unavco.data.lidar.LidarRecord: the super class of the record
-fields
{latitude double -declare 0}: define a field called latitude of type double. The -declare 0 says not to declare the latitude attribute in the Record class; instead the latitude attribute of the base PointRecord class is used. Look at AtmIceSSNRecord in definerecords.tcl to see how to override the getLatitude methods
{longitude double -declare 0}
{time double}
{thickness double -missing "-9999.0" -chartable true}: specify a missing value and set the chartable flag, which RAMADDA uses to determine which fields can be charted
{altitude double -chartable true -declare 0}: this uses the altitude attribute of the base class
{frame int}
{surface double -chartable true -missing "-9999.0"}
{bottom double -chartable true -missing "-9999.0"}
{quality int -chartable true}
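As an illustration of what the -missing flag means for the generated reader, one plausible behavior is mapping the file's sentinel value to NaN so downstream charting skips it. This sketch is an assumption for illustration, not the generated source:

```java
public class MissingValueSketch {
    // Sentinel used in the Mcords files, per the -missing "-9999.0" flag
    static final double MISSING = -9999.0;

    /** Map the file's no-data sentinel to NaN so charts can skip it. */
    static double checkMissing(double v) {
        return v == MISSING ? Double.NaN : v;
    }

    public static void main(String[] args) {
        System.out.println(checkMissing(-9999.0)); // no data
        System.out.println(checkMissing(4158.43)); // real value, unchanged
    }
}
```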
The generated McordsIrmcr2File class has a main that can be used for testing, e.g.:
java org.unavco.data.lidar.icebridge.McordsIrmcr2File <data file>
To use the file reader within RAMADDA one has to add a new RAMADDA entry type in a plugin. The NLAS RAMADDA plugin is located here:
cd src/org/unavco/projects/nlas/ramadda
In resources/types.xml is the entry used for the Mcords file. This specifies a record.file.class property that is used to instantiate the file reader. Running ant in the plugin directory creates an nlasplugin.jar.
<type name="lidar_mccords_irmcr2"  
      description="McCords Irmcr2 Data" 
      handler="org.unavco.projects.nlas.ramadda.LidarTypeHandler" 
      super="lidar" category="LiDAR">
     <property name="icon" value="/nlas/icons/nasa.png"/>
     <property name="record.file.class" value="org.unavco.data.lidar.icebridge.McordsIrmcr2File"/>
</type>

ATM QFit Data

The ATM QFit data is a binary format. There are three different record structures: 10-word, 12-word, and 14-word. We use the code generation facility to generate a reader for each of these formats.
generateRecordClass org.unavco.data.lidar.icebridge.QFit10WordRecord  
    -super org.unavco.data.lidar.icebridge.QfitRecord  -fields  { 
    { relativeTime int -declare 0}
    { laserLatitude int -declare 0}
    { laserLongitude int -declare 0}
    { elevation int -declare 0  -unit mm}
    { startSignalStrength int }
    { reflectedSignalStrength int }
    { azimuth int -unit millidegree}
    { pitch int -unit millidegree}
    { roll int -unit millidegree}
    { gpsTime int }
}
The records all share some common fields: relativeTime, latitude, longitude, and elevation. These fields have various scaling factors. We declare them in the hand-written QfitRecord base class, which in turn implements the getLatitude, getLongitude, etc., methods, scaling the integer values accordingly. The QfitFile is not generated; it handles the logic of determining which record format a file uses and its endianness, and it pulls the base date out of the file name.
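The scaling idea in the hand-written base class might look roughly like this. The millimeter unit for elevation comes from the -unit mm flag above; the microdegree scale factors for latitude and longitude are assumptions for this sketch, not taken from the actual QfitRecord source:

```java
public class QfitRecordSketch {
    // Raw integer fields as stored in the binary file
    int laserLatitude;   // assumed microdegrees
    int laserLongitude;  // assumed microdegrees
    int elevation;       // millimeters (per the -unit mm flag)

    // Scaled accessors, analogous to the getLatitude/getLongitude
    // methods the base class is described as implementing
    double getLatitude()  { return laserLatitude  / 1.0e6; }
    double getLongitude() { return laserLongitude / 1.0e6; }
    double getAltitude()  { return elevation / 1000.0; } // meters

    public static void main(String[] args) {
        QfitRecordSketch r = new QfitRecordSketch();
        r.laserLatitude = 76807589;
        r.elevation = 4158428;
        System.out.println(r.getLatitude() + " " + r.getAltitude());
    }
}
```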

AMRC Text Files

This example is in the core RAMADDA source tree. Check it out from Sourceforge:
svn co https://ramadda.svn.sourceforge.net/svnroot/ramadda
cd src/org/ramadda/data/point/amrc
The AmrcFinalQCPointFile class reads the final QC'ed text file format:
Year: 2001  Month: 09  ID: BPT  ARGOS:  8923  Name: Bonaparte Point     
Lat: 64.78S  Lon:  64.07W  Elev:    8m
2001 244  9  1 0000   -2.5  444.0    0.2  110.0  444.0  444.0
2001 244  9  1 0010   -2.5  444.0    0.2  114.0  444.0  444.0
2001 244  9  1 0020   -2.5  444.0    0.2  110.0  444.0  444.0
2001 244  9  1 0030   -2.5  444.0    0.0    0.0  444.0  444.0
2001 244  9  1 0040   -2.5  444.0    0.0    0.0  444.0  444.0
The AmrcFinalQCPointFile does not create a specific Record class like the examples above. Rather, it uses the generic TextRecord, which is instantiated from a fields property. In the AmrcFinalQCPointFile.prepareToVisit method the two-line header is read and the fields are defined:
fields=
Site_Id\[ type="string"   value="BPT"  \],
Latitude\[ value="-64.78"  \],
Longitude\[ value="-64.07"  \],
Elevation\[ value="    8"  \],
Year\[ \],
Julian_Day\[ \],
Month\[ \],
Day\[ \],
Time\[ type="string"  \],
Temperature\[ unit="Celsius"   chartable="true"  \],
Pressure\[ unit="hPa"   chartable="true"  \],
Wind_Speed\[ unit="m/s"   chartable="true"  \],
Wind_Direction\[ unit="degrees"  \],
Relative_Humidity\[ unit="%"   chartable="true"  \],
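The idea behind fields with a fixed value attribute can be sketched as follows: a fixed field takes its value from the header, while the remaining fields consume tokens from the data line in order. FieldsSketch and its parsing logic are hypothetical illustrations of the mechanism, not the actual TextRecord implementation:

```java
import java.util.ArrayList;
import java.util.List;

public class FieldsSketch {
    /** One field definition: either read from the data line or fixed. */
    static class Field {
        final String name;
        final String fixedValue; // non-null => comes from the header, not the row
        Field(String name, String fixedValue) {
            this.name = name;
            this.fixedValue = fixedValue;
        }
    }

    /** Fill one record: fixed fields use their header value; the rest
        consume whitespace-separated tokens from the data line in order. */
    static List<String> parseRow(List<Field> fields, String line) {
        String[] toks = line.trim().split("\\s+");
        List<String> values = new ArrayList<>();
        int tok = 0;
        for (Field f : fields) {
            values.add(f.fixedValue != null ? f.fixedValue : toks[tok++]);
        }
        return values;
    }

    public static void main(String[] args) {
        List<Field> fields = new ArrayList<>();
        fields.add(new Field("Site_Id", "BPT"));     // from the header
        fields.add(new Field("Latitude", "-64.78")); // from the header
        fields.add(new Field("Year", null));         // from the data row
        fields.add(new Field("Julian_Day", null));   // from the data row
        System.out.println(parseRow(fields, "2001 244  9  1 0000"));
    }
}
```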
The fields that have the value attribute set are not actually present in the data rows. This lets us take metadata from the header (e.g., the station location) and apply it to every data record. The AmrcFinalQCPointFile class also implements the processAfterReading method to parse the date/time from the column values and set the time on the Record:
    public boolean processAfterReading(VisitInfo visitInfo, Record record) throws Exception {
        if (!super.processAfterReading(visitInfo, record)) {
            return false;
        }
        TextRecord textRecord = (TextRecord) record;
        // Assemble "year-month-day time" from the numeric column values
        String dttm = ((int) textRecord.getValue(IDX_YEAR)) + "-"
            + ((int) textRecord.getValue(IDX_MONTH)) + "-"
            + ((int) textRecord.getValue(IDX_DAY)) + " "
            + textRecord.getStringValue(IDX_TIME);
        Date date = sdf.parse(dttm);
        record.setRecordTime(date.getTime());
        return true;
    }
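The sdf formatter used above is defined elsewhere in AmrcFinalQCPointFile. A formatter compatible with the string built in processAfterReading (e.g. "2001-9-1 0000") could look like the following; the exact pattern and the UTC time zone are assumptions for this sketch:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class AmrcDateSketch {
    // Hypothetical formatter matching strings like "2001-9-1 0000";
    // the pattern in the actual AmrcFinalQCPointFile may differ.
    static final SimpleDateFormat sdf = new SimpleDateFormat("yyyy-M-d HHmm");
    static {
        sdf.setTimeZone(TimeZone.getTimeZone("UTC"));
    }

    public static void main(String[] args) throws Exception {
        Date d = sdf.parse("2001-9-1 0000");
        System.out.println(d); // 2001-09-01 00:00 UTC
    }
}
```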