Geosoft DoD UX-Process
www.geosoft.com
The software described in this manual is furnished under license and may only
be used or copied in accordance with the terms of the license.
Written by Nancy Whitehead, with special thanks to Elizabeth Baranyi. Please
send comments or questions to tech@geosoft.com.
Geosoft Incorporated
8th Floor
85 Richmond St. W.
Toronto, Ontario
M5H 2C9
Canada
Tel: (416) 369-0111
Fax: (416) 369-9599
E-mail: info@geosoft.com
Contents
UX-DataPreparation 5
UX-ParameterDetermination 7
UX-TargetManagement 8
Creating a Project 10
Setup Parameters 13
Survey Layout 16
Import 29
Import Data 29
Automatic Creation of X_UTM/Y_UTM Channels
from Lon/Lat Channels 29
Importing Magnetic Data 30
Importing EM Data 31
Importing EM61 MK2 data 33
Import EM63 data 35
Viewing Array Data 36
Array Channel Profile Viewer Tool 37
Displaying Individual Data Records for Array Data 38
Instrument Dump 38
Export 39
Static Test 46
Instrument Response 49
Dynamic Instrument Response 50
6 Line Test 56
Acceptance Criteria: Positional Accuracy +/- 35cm 56
Repeatability 60
Azimuth Test 62
Azimuth Test 63
Octant Test 66
Navigation Cross Test 67
Navigation Cross Test Summary Report: 70
Data corrections 71
Heading Correction 76
Data Display Tip 77
Path Corrections 84
Warp a Database 84
Interactively warping data 84
Manually warping data 87
Utilities 102
QC QA Tools 110
EM Data 135
Contour 155
DEFINITIONS:
In this Agreement:
"Licensed Program(s)" means the actual copy of all or any portion of Geosoft’s proprietary software technology, computer
software code, components, dynamic link libraries (DLLs) licensed through the Geosoft license server, including any
modifications, improvements or updates provided by GEOSOFT.
“Effective Date” is the date the Geosoft license is installed. This date is recorded by the Geosoft License server when the
Licensed Program(s) is installed.
"Services" means the Services described on Section 4.
"Termination" means the occurrences contemplated by Section 6 and 7.
LICENSE:
GEOSOFT grants to me a non-transferable and non-exclusive license to use the Licensed Program(s) for my own purposes
whereby the Licensed Program(s) are being used only by myself, on one computer, at any one time.
Title and all intellectual property rights in and to the Licensed Program(s), including, without limitation, copyright, trade secrets
and trade marks, shall remain with GEOSOFT. I agree to refrain from raising any objection or challenge to such intellectual
property rights, or from assisting or causing or permitting other(s) to do so, during the term of the Agreement and thereafter.
I may not assign this Agreement or any part thereof or sub-license the rights granted herein, or lend, rent, time-share, sell or
lease the software without the prior written consent of GEOSOFT.
I may not attempt to reverse engineer, de-compile or disassemble the software.
I may not make any attempt to circumvent the License Manager that controls the access to the software use.
TERM:
The Term of this Agreement shall commence on the Effective Date and shall continue until termination, as described in Section
6.
SERVICES:
(i) According to the terms of my initial purchase, GEOSOFT shall make available to me, without additional fees such
corrections and improvements to the Licensed Program(s) as may be generally incorporated into the Licensed Program(s) by
GEOSOFT. (Normally this will be for a period of twelve (12) months).
(ii) GEOSOFT has a strong commitment to customer service and product support. GEOSOFT offers me, subject to applicable
Service Charge(s), continuing support in the form of email or telephone advice and other assistance in problem diagnosis and
the correction of errors or faults in the Licensed Program(s) during the life of this License. When a problem occurs which
appears to be related to errors or faults in the Licensed Program(s), I may contact GEOSOFT and GEOSOFT will make an
honest effort to solve the problem. However, GEOSOFT cannot guarantee service results or represent or warrant that all errors
or program defects will be corrected. Also it is to be noted that each Licensed Program is designed to operate on a Windows
NT (sp 6 or later), Windows 2000 or Windows XP platform.
(iii) Further, if I request service relating to the modification of the Licensed Program(s) to meet a particular need or to conform
with a particular operating environment, GEOSOFT may, at its discretion, modify the Licensed Program(s) to meet these
particular needs, subject to applicable Services Charge(s). However, all intellectual property or other rights which may arise
from such modifications shall reside with GEOSOFT.
TERMINATION:
This agreement shall terminate upon the termination date, if any, specified in your purchase agreement with Geosoft.
This agreement may be terminated only upon thirty days' prior written notice to GEOSOFT.
GEOSOFT may terminate this Agreement upon prior written notice effective immediately if I fail to comply with any of the terms
and conditions of this Agreement.
This Agreement shall terminate automatically upon the institution, or consenting to the institution of proceedings in insolvency
or bankruptcy, or upon a trustee in bankruptcy or receiver being appointed for me/us for all or a substantial portion of my/our
assets.
WARRANTY:
GEOSOFT does not warrant that the functions contained in the Licensed Program will meet my requirements or will operate in
the combinations which may be selected for use by me, or that the operation of the Licensed Program will be uninterrupted or
error free or that all program defects will be corrected.
Each Licensed Program shall be furnished to me in accordance with the terms of this Agreement. No warranties, either
express or implied, are made to me regarding the Licensed Program.
THE FOREGOING WARRANTIES ARE IN LIEU OF ALL OTHER WARRANTIES, EXPRESSED OR IMPLIED, INCLUDING,
BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE.
LIMITATION OF REMEDIES
I agree to accept responsibility for the use of the programs to achieve my intended results, and for the results obtained from
use of said Program(s). I therefore accept complete responsibility for any decision made based on my use of the
aforementioned Licensed Program(s).
In no event shall GEOSOFT be liable for any damages arising from performance or non-performance of the Licensed
Program(s), or for any lost profits, lost savings or other consequential damages, even if GEOSOFT has been advised of the
possibility of such damages, or for any claim against me by any other party.
GENERAL:
I agree that this Agreement is a complete and exclusive statement of the agreement with GEOSOFT.
This Agreement supersedes all previous Agreements with respect to the Licensed Programs, with the exception of a current
signed Technical Service Agreement.
GEOSOFT is not responsible for failure to fulfill its obligations under the Agreement due to causes beyond its control.
Should any part of this Agreement for any reason be declared invalid, such declaration shall not affect the remaining portion,
which shall remain in full force and effect as if this Agreement had been executed without the invalid portion thereof.
The relationship between the parties is that of independent contractors. Nothing contained in this Agreement shall be deemed
to constitute or create a partnership, association, joint venture or agency.
The provisions of this Agreement shall be binding upon me and GEOSOFT and our respective successors and permitted
assigns.
This Agreement will be governed by the laws of the Province of Ontario and applicable laws of Canada.
YEAR 2000:
The Licensed Programs have been tested to conform to DISC PD2000 1:1998 Year 2000 Conformity Requirements
(www.bsi.org.uk/disc/year2000/2000.html), with the exception of clause 3.3.2, paragraph b. Section 3.3.2 paragraph b) requires
that inferences for two-digit year dates greater than or equal to 50 imply 19xx, and those with a value less than 50
imply 20xx. The Licensed Programs will recognize all two-digit years as 19xx. This is to prevent errors importing historical data
that pre-dates 1950. All dates that follow 1999 must use four-digit dates in the Licensed Programs.
The best way to find information in this system is to use the Search tab to perform a
full-text search of all help topics. If you still can’t find the information you’re looking
for, the Online Books help system contains complete Geosoft manuals and tutorials
in Adobe PDF format.
The UX-Process system enables you to set up a project, plan a survey, import and
correct data, analyze instrument tests, access quality control, quality assurance and
prove-out tools, manage progress status, determine target analysis calculations, and
display results using standard US Army Corps basemaps.
The UX-Process v6.4 system has been updated with a restructured menu system for
an improved workflow. The updated menus combine the UX-Process (previously
named USACE DoD QA/QC) and the montaj UX-Detect systems into a fully
integrated, logical and consistent workflow. The updated system
includes three main menus and is run from within the Oasis montaj core software
platform.
UX-DataPreparation
1.) Setup Parameters: This option enables you to set up basic information regarding
the project.
2.) Survey Layout This menu opens a sub-menu containing the following survey
layout options:
• Plan a UXO Survey
• Display the Index Map
• Define Cultural Mask
• Move Control Points
• Move Survey Boundary Points
• Discard Control Points/Survey Boundary Changes
3.) Import This menu opens a sub-menu containing the following import options:
• Import Data
• Dump Geonics Instrument to File
• Import Geonics Dump File to Database
4.) Export This menu opens a sub-menu containing the following export options:
• Export XYZ File
• Export To|Import From Dig Sheet
5.) USACE Instrument Tests This menu opens a sub-menu containing the following
instrument test options:
• Static Test
• Instrument Response
• Optimum Sensor Height
• 6 Line Test
• Repeatability
• Azimuth Test
• Octant Test
• Navigation Cross Test
6.) Data Corrections This menu option opens a sub-menu containing the following
data correction options:
• Base Station Correction
• Heading Correction
• Instrument Drift Correction
7.) Path Corrections This menu opens a sub-menu containing the following path
correction options:
• Warp a Database
• Sensor Offset Corrections
• Instrument Latency Correction
• Non-Systematic Lag Correction
• Navigation Error Correction
• Split Master Survey GDB
8.) Utilities This menu displays a sub-menu containing the following utility options:
• Pipeline Detection and Removal
• Interactive Pipeline Selection
• Velocity Calculation
• 60 Hz Power Lines Filter
10.) Progress Reporting This menu displays a sub-menu containing the following
progress reporting options:
• Create Progress Status Table
• Load Progress Status
• Update Progress Status
• View Audit Log
• View QC Report
11.) Other DoD Tools This menu displays a sub-menu containing other DoD tools:
• Meandering Path Density Analysis
• Automated Quality Assessment Program System
UX-ParameterDetermination
1.) EM Data This menu opens a sub-menu containing the following options for
working with EM data:
• Geophex Conductivity/Susceptibility
• Check Time Gate Data
• Time Constant Calculation
• Display Decay Curves
2.) Target Selection This menu displays a sub-menu containing the following target
selection options:
• Pick Peaks Along Profiles
• Find Peak Dipoles
3.) Target Analysis This menu displays a sub-menu containing the following target
analysis options:
• Calculate Signal Strength and SNR
• Display Target Windows
• Redefine a Target Window
• Batch Fit Targets
UX-TargetManagement
1.) Target Maps This menu opens a sub-menu containing the options for creating
target maps:
• Create US Army Corps Basemap
• Display Grid
• Contour
• Site Plan from DXF File
• 3D Colour Range Symbols
• Generate Comparative Map
• Redraw Comparative Map Colour Bars
• Proveout Map
2.) Target Utilities This menu opens a sub-menu that contains the following target
utility options:
• Generate Composite Target ID
• Display Target Windows
• Redefine a Target Window
• Statistics on Target Removed Data
• Target Density Calculation
• Calculate Distance to Corners
• Calculate Target Location
• Shortest Path
3.) Target Classification: This option enables you to classify the target data either in
increasing/decreasing order of up to 4 channels, or by providing up to 4 expressions
in order of importance.
4.) Image Manipulations This menu opens a sub-menu that contains the following
image manipulation options:
• Attach Images to Dig Sheet
• Dig/No Dig
• Display Target Attributes
• View Target Image
5.) Dig Sheet Analysis This menu opens a sub-menu that contains the following dig
sheet analysis options:
• Post Dig Verification of Unsubstantiated Pick
• Post Dig Verification of Target Signal Ranges
Creating a Project
In order to access the UX-Process menu in Oasis montaj, you must have an open
Project. An Oasis montaj "Project" encompasses every item in your working project:
from the data files in your project (databases, maps, and grids), to the tools used
(including auxiliary tools such as histograms, scatter plots, etc.), to the project setup,
including the menus you have displayed, whether you were working on a map or
profile, and the state in which you left it the last time you used it.
The project also controls your working directory. Projects are saved as (*.gpf) files. If
you open an existing project from a directory, the system assumes that all your
project files are located in the same directory. To streamline your work, as well as
keep it organized, you may wish to make sure that your project file is in the same
directory as the other files you want to use. We recommend that each project you
work on have its own project (*.gpf) file. If you use a number of applications or add-
on tools in Oasis montaj that have different menus, you can use the project to display
only the menus you require.
The Project Explorer tool enables you to browse as well as open any project item.
The Project Explorer has two tab windows: the Data window, which includes all data
files in the project, and the Tools window, which organizes and maintains the
project tools. To access the Tools window click the Tools bar on the bottom of the
Project Explorer. To return to the Data window, click the Data bar on the top of the
Project Explorer.
To Create a Project:
Note: Oasis montaj assumes that your data is in the directory containing this project
(i.e. D:\DoD).
3. Specify a name and directory for the project. For example, name the project
(tutorial) and specify the working directory as D:\DoD.
4. Click the [Save] button. The system saves the project and indicates it is open
by adding menus to the menu bar, adding buttons to the standard short-cut bar
and by displaying the Project Explorer window. These are visual clues
indicating that you are ready to start working with the system.
Before you can start working with the UX-Process system, you must load the UX-
Process menu (UX-Process.omn) to your Oasis montaj menu bar. If you require
more detailed information on modifying menus, refer to the Oasis montaj Online
Help System (Help|Help Topics).
To Load the UX-Process Menu:
1. On the GX menu, select Load Menu or click the Load Menu icon ( ) on the
main toolbar. The Load Menu dialog is displayed.
2. Select (UX-Process.omn) from the list of files and click the [Open] button.
The system adds the UX-Process menus (UX-DataPreparation, UX-
ParameterDetermination and UX-TargetManagement) on your menu bar.
Note: After completing these steps, you are now ready to start using the UX-Process
system. If you require more details about Oasis montaj capabilities, please
refer to the Oasis montaj Quick Start Tutorials, which can be found on the
Help|Manuals and Tutorials menu.
This tutorial uses sample data provided with the installation of the UX-Process
system and can be found in the “C:\Program Files\Geosoft\Oasis montaj\data\usace”
directory. The tutorial and associated data files can also be downloaded from the
Geosoft web site www.geosoft.com/resources/tutorials/. Before you begin the tutorial,
copy the data files to your working (project) directory, i.e. D:\DoD.
Setup Parameters
Whenever you begin a new UX-Process project, it is highly recommended that you
run the Setup parameters menu option. This tool stores your project information,
which can then be used elsewhere in the program. The Setup parameters dialog
enables you to specify the following information: project name, projection
coordinates, longitude and latitude (required for determining magnetic declination),
survey height, client name, contractor name, interpreter name, user comments, project
date, local coordinates, DoD and Company logos (for map display) and QC report.
The UX-Process system automatically projects your data from Latitude, Longitude to
UTM WGS 84. The specific zone your data is in will be calculated from your lat/long
coordinates. If you require your data in another projection you can click the
[Projection] button on the Setup Parameters dialog to launch the Projection Wizard.
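The zone selection follows the standard UTM convention of 6-degree longitude bands. The short sketch below (Python, not part of UX-Process; the function name and the example longitude are illustrative only) shows how a zone number can be derived from a longitude in decimal degrees:

```python
import math

def utm_zone(longitude_deg):
    """Standard UTM zone number (1-60) for a longitude in decimal degrees."""
    return int(math.floor((longitude_deg + 180.0) / 6.0)) + 1

# Example: a longitude of -76.1 (eastern Maryland) falls in zone 18,
# the zone used later in this tutorial (UTM zone 18N).
print(utm_zone(-76.1))  # -> 18
```

(The standard convention also defines special zones around Norway and Svalbard, which are ignored in this sketch.)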
Company logos, in bitmap, JPEG or TIFF format, can be added to all maps by
including them in the project setup. The DoD logo (USACELogo.jpg) is included
with the UX-Process system and can be found in the Oasis montaj|etc directory. To
include your Company logo, copy the file into your project directory (or any other
appropriate directory). All processes that generate maps of the data will look for this
logo and, if found, display it on the map legend.
To access additional technical information on the project Setup parameters, click the
Help ( ) button. The data files used in this tutorial are: USACELogo.jpg and
CompanyLogo.bmp.
The Setup Parameters dialog will set the basic information regarding the UX-Process
project. These parameters are common to numerous processes and are collected at
the onset of the project.
To Set Up the System:
2. Normally you would specify the parameters as they apply to your project and
click the [OK] button. However, for this tutorial complete the parameters as
shown in the dialog above. The following table outlines the Setup parameters,
as they are defined in this tutorial.
Project: Enter the name of the Project as (Aberdeen Proving Grounds).
Distance units: The measurement units for the project, based on the distance units of the defined projection. If no projection is defined, the default units are “Foot”.
Longitude (-180 to 180), Latitude (-90 to 90): Select a point within the survey area. This point will be used to calculate the magnetic north. The Longitude and Latitude values of this point should be supplied in decimal degrees (-72.74, 66.72).
Survey height: The pre-set height of the project. This is the default height for all GXs that require this information.
User Comments: Any comments the user would like to add to the project (User Comments).
Local Coordinate system: If there is a local coordinate system and you would like to display it on the maps then turn this option on (Yes).
DoD logo: If you want to display the USACE logo on the maps please specify the location here. The logo (USACELogo.jpg) is supplied with the installation and can be found in the “...//Oasis montaj/etc” directory.
Corporate logo: If you would like to display your own logo on the maps please enter it, otherwise you can leave this entry blank. A Geosoft logo in *.bmp format is provided with the tutorial data for use with this tutorial (CompanyLogo.bmp).
3. The default projection, UTM WGS 84, is acceptable so we can click the [OK]
button on the Setup parameters dialog to set the project parameters. For more
information on setting a projection see Tutorial 11: Projections in the Oasis
montaj Quick Start Tutorial.
2. This dialog displays your projected coordinates, which in this case are the
default UTM WGS 84 coordinates. The specific UTM zone 18N has been
calculated from the latitude, longitude coordinates.
Survey Layout
Most ordnance surveys are laid out in square or rectangular blocks, or grids. The
planning of the UXO survey includes setting the size and azimuth direction of the
blocks. When a bounding polygon exists, the survey layout dialog will automatically
generate sufficient grids to cover the area enclosed by the polygon. Alternatively the
survey layout may be imported from an external file. Interactive tools are provided
that enable you to shift the areas or alter the boundary, if you decide that the
automatic generation of the survey areas is not entirely to your satisfaction.
The coordinates of the individual grids are stored in the project and both a plan
database and map may be optionally generated for each survey grid. The summary of
the layout is saved in a text file.
To access additional technical information on the survey layout parameters, click the
Help ( ) button on the dialog of interest. The data files used in this tutorial are:
surveypolygon.dxf, moguls.ply and cistern.ply
This section describes how to lay out a UX-Process survey. Topics discussed in this
section include:
A Geosoft polygon file, an ArcView shape file, a DXF polygon file, or a DGN file
can be used to specify survey boundaries. Grids can be directly specified using the
“Control Point file” option. This option accepts *.CSV, Excel, *.DXF, Shape, and
text files. Randomly distributed grids can be supplied by specifying their coordinates
in a Control Point CSV file, or alternatively a Meandering Path can be supplied as a
text file with X & Y coordinates. The survey area may consist of multiple polygons;
if so, all polygons should be defined in the same polygon file.
You can also plan a survey when no boundary polygon or a pre-existing grid
specification file exists. With this option, you are prompted for the position of the
lower left corner of the first grid and the number of blocks in each direction.
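The following sketch (Python; illustrative only — the variable names and the rotation convention are assumptions, not the Survey Layout tool's internal method) shows how a lower left origin, block dimensions, block counts and a survey azimuth can define the lower left corner of each block:

```python
import math

def grid_origins(x0, y0, dx, dy, nx, ny, azimuth_deg):
    """Lower-left corner of each survey block, given an origin, block
    dimensions (dx, dy), block counts (nx, ny), and a survey azimuth
    measured clockwise from true north (azimuth of the grid Y-axis)."""
    a = math.radians(azimuth_deg)
    corners = []
    for j in range(ny):
        for i in range(nx):
            u, v = i * dx, j * dy            # offsets in grid coordinates
            x = x0 + u * math.cos(a) + v * math.sin(a)
            y = y0 - u * math.sin(a) + v * math.cos(a)
            corners.append((x, y))
    return corners

# Example: a 3 x 2 layout of 37.4 m x 26.2 m blocks rotated -11 degrees,
# using the origin quoted later in this tutorial.
for corner in grid_origins(402775.3, 4369597.14, 37.4, 26.2, 3, 2, -11.0):
    print("%.2f %.2f" % corner)
```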
2. The Distance units are taken from the Setup Parameters (projection
information) dialog.
3. Specify the Geographic location (1) and (2) as (Aberdeen Proving Grounds
and Maryland). Specify the Prefix, which will be added to the database and
map file names for each survey grid as (APG) and then specify the Sector
prefix (optional) that is added to the indexed grid names and enables the
inclusion of multiple sectors in the project, as (S1).
4. Using the dropdown lists, select the Instrument as (Magnetometers),
Calculate planned databases? as (Yes), Create planned maps as (No), and
Define survey boundary via (DXF file). Note that, the specifications of the
survey area may come from a number of different sources, in a number of
different formats. To facilitate your workflow we support the following
formats, Geosoft polygon file, ArcView shape file, DXF file, DGN file,
Meandering path file, or specified directly through control point files (such as,
CSV, ArcView Shape, DXF, and Excel). Alternatively, the user may select
"No boundary", and will be prompted for the lower left corner of the survey as
well as the number of grids in each direction.
5. If you already have a Cultural Mask File you can apply it here. If however,
you need to define a cultural mask you can do so in a following step. For now
we will leave the Cultural Mask File parameter blank.
6. Click the [Next>] button. The Plan a UXO survey: Grid specification dialog is
displayed.
7. Specify the Survey direction (cw angle from true north), which is the azimuth of
the grid Y-axis upward from the origin, as (-11), and specify the Grid X
dimension (the width of the survey grid) as (37.4) and the Grid Y dimension
(the height of the survey grid) as (26.2).
8. Click the [Next>] button. The Plan a UXO survey: Survey geometry dialog is
displayed.
9. Click the [Update Spacing] button to set the appropriate Line spacing and
Sample spacing for the selected instrument. If your survey specifications are
any different you can change the default values accordingly.
10. Specify the Line azimuth (cw from true north) in degrees as (79). You can also
specify the First line number (the line number located at the bottom left of the
survey) and the Line number increment. For the purposes of this tutorial we
can leave them at the default value of (1).
11. Click the [Next>] button. The displayed dialog is dependent on the selection
from the Define the survey boundary via parameter. In this case the Create
survey from DXF file dialog is displayed.
12. Using the [Browse] button locate the DXF file from your working directory
(surveypolygon.dxf).
13. The X and Y origin coordinates can be entered manually; alternatively, clicking
the [Origin] button selects the lower left point of the bounding rectangle of the
polygon, in the grid direction. In our case, we will click the [Origin]
button.
14. To preview the locations of the survey grids on the “Index map” click the
[Preview] button. The name of the map is prefixed by the project name,
followed by the word “Index”. Therefore, the name of the index map
generated in this tutorial will be APG_Index.map. This map is generated
using the parameters set in these dialogs.
15. If the layout is not correct, the origin value may be changed and the [Preview]
button can be clicked to update the index layout. To change the Grid north
azimuth or the Dimensions of the grids, click the [<Back] button, reset the
values, and click [Next>] and then, click [Preview] to update the index map.
This index layout will be drawn on all subsequent maps of the individual grids.
Try modifying the origin to (402775.3, 4369597.14) and observe the effect.
(Note that, to return to the first point in the polygon file, click the [Origin]
button.)
16. If the grid layout is correct, click the [Finish] button. The survey plan
databases and maps will be created and stored in your project directory.
17. Two summary reports will also be created and stored in your project directory.
The summary file names are prefixed with the project name and followed by
the word “summary”. The summaries generated in this tutorial are entitled
APG_Summary.txt and APG_Summary_Local.txt. The former is in the
projected ground coordinates, while the latter is in the local reference system.
The summary files are particularly useful to project the total line kilometers
and the total number of lines and grids covering the survey.
18. Because we selected (Yes) to Calculate planned databases? on the Plan a
UXO survey dialog, projected databases for each grid in the survey plan are
generated. Creation of these databases provides the line kilometres of the
survey, which is helpful in the preplanning stage. Selecting “No” will prevent
these individual survey planned databases from being generated.
19. Maps that represent the proposed coverage of each survey grid in the survey
plan can also be generated by selecting “Yes” to Create planned maps on the
Plan a UXO survey dialog. The legend of each map identifies the survey grid,
displays the geographic coordinates as well as the project description.
However, we selected (No) and these individual plan maps were not
generated in our project.
Two summary reports are created and stored in your project directory. If the user
chooses not to generate the planned grids, these summary files will only report the
total coverage and the total line kilometres, without any further breakdown. The first
report “Summary.txt” (in our case APG_Summary.txt, as we selected a Database
name prefix on the Plan a UXO Survey… dialog) displays the data in the projected
ground coordinate system, as shown below:
The project Index map can be displayed at any time to use as a reference or for a
backdrop to display project data.
1. On the UX-Process menu, select Survey Layout|Display the Index Map. The
current survey layout will be refreshed and will be displayed. Note that the
name of the index map is automatically generated to be the project name
appended by _index.map.
Cultural features can be defined on the survey Index map by applying an optional
cultural backdrop file, which can be used as a guide to define the cultural areas to be
masked. The permissible file formats are: Polygon, DXF, ArcView Shape files and
DGN files. The resulting cultural mask file is in the Geosoft polygon format and can
be subsequently used to mask the survey database files.
The data files used in this procedure are the Index map (APG_Index.map) and two
PLY files (moguls.ply and cistern.ply). Ensure that you have copied the data files
provided for the tutorial into your project directory.
2. Using the [Browse] button, locate the Map to use (it is recommended to use
the survey index map generated by the planning process, i.e.
APG_Index.map). Then click [Browse] and, while holding the <Ctrl> key,
select the two files (moguls.ply and cistern.ply) as the Cultural backdrop.
3. Specify the name of the Output cultural mask file as (Cultural Mask.ply).
The cultural backdrop files can be files containing roads and buildings or
inaccessible areas that can be used as a guide to interactively determine the
cultural backdrop. The above files have been supplied for this exercise.
4. Click the [Add] button and the UCECULTURALMASK dialog will be
displayed.
5. Click the [OK] button to define your cultural mask. Note that, the Cultural
backdrop file(s) have been displayed on the map. Holding your cursor over the
first location, press the <Enter> key; the Specify Exact Position dialog will be
displayed.
6. You can specify an exact location or accept the X and Y values as displayed
and click the [OK] button to continue. Note that, to quickly select locations
without exact positions, simply click the left mouse button at each position. For
this tutorial we will define the areas outlined by the cultural backdrop files.
7. When you have defined the cultural mask area, right-click and from the popup
menu select Done. The Add/Remove cultural mask polygons dialog will be
displayed.
8. To add the second polygon area to your Cultural Mask, select the Append
radio button and then click the [Add] button; the UCECULTURALMASK
dialog will be displayed.
9. Click the [OK] button and define the second area, then right-click and from the
popup menu select Done. The Add/Remove cultural mask polygons dialog will
again be displayed. Click [Done] and the cultural mask will be displayed on your
Index map. This mask can be used as an overlay on the project maps in order to
delineate the culturally noisy areas.
The automatic calculation of the grids may span over areas that cannot be surveyed,
or may include grids with a minimal coverage. As an adjustment, you can move the
vertices of the individual survey areas in order to optimize the number of survey
areas.
1. The Move control points option can be accessed in two ways: by selecting the
option from the UX-DataPreparation|Survey Layout menu, or by placing your
cursor over the APG_Index.map, right-clicking and from the popup menu
selecting Move Control Point. The UCEMOVEVERTEX dialog will be
displayed.
2. Once you have clicked the [OK] button on the UCEMOVEVERTEX dialog,
you then click once on the vertex you wish to move.
Note: To select the location on the vertex you can click the left mouse button or you
can place your cursor over the location and press the <Enter> key; the Specify
exact position dialog will be displayed. You can use this dialog to specify an
exact X Y location.
3. Then click on the location that you want the vertex to move to. Select the
middle of the right most edge of the grid S1B002 and move it by about 2
meters to the right.
4. This dialog is called in a loop without any further prompting. In order to exit
the loop, right-click and select Done from the popup menu (or press the <Esc>
key on the keyboard); the control point will be moved to the location you
specified.
5. The following image shows the APG_Index.map with the control points
moved to the specified location.
As a further adjustment, you can move the vertices of the survey boundary in order to
optimize your survey area. In this exercise we will move the survey boundary to
include only the area that we will focus on in this tutorial, S1A001.
If you are not satisfied with the changes you just made to the survey layout, you can
discard them, and recover the previous state of the layout. Please note that only one
level of changes is saved. We recommend using this tool as opposed to the
one available through the map menu of Oasis montaj; the latter only updates the map
but not the supporting polygon and grid layout files.
2. Using the dropdown list, select the file (survey layout or boundary file) you
want to discard the latest changes in. Select the (Survey layout file). Your
previous layout is restored and the map updated.
Import
The UX-Process system enables you to import a number of different potential field
data types into an Oasis montaj database and manipulate or view the data in the
Spreadsheet window.
To access additional technical information on importing and exporting your data files,
click the Help ( ) button on the dialog of interest. The data files used in this tutorial
include magnetic (Mag858.XYZ) and EM data files (em61.XYZ and em63.XYZ) in
XYZ format and an EM data file in R61 format (em61mk2.R61). Make sure that you
have copied the data files provided for the tutorial into your project directory.
Import Data
Different types of potential field data can be imported into an Oasis montaj database
(*.gdb) and manipulated or viewed in a spreadsheet window. The import data types
supported include: GEM-G7G, GEM3 (*.dat), GSM-19 Normal and walk (*.dmp),
Geonics raw (*.r61, *.g61) and ASCII (*.m61) formats, Geosoft’s XYZ (*.xyz)
format, Excel record-oriented files (*.xls, *.csv), and the new Foerster ASCII format.
We currently support two GPS file types: Trimble and Ashtec. If the GPS
information is already part of the data file, a "none" option can be selected from the
dropdown list.
Once a data file has been imported into a database, if (1) the project contains a default
projection, and if (2) the database contains lon/lat channels (specifically, a channel
with name starting with “lon” and a channel with name starting with “lat”), then 2
new channels (X_UTM/Y_UTM) will automatically be created and displayed in the
database. These new channels will contain the X/Y coordinates in the default
projection, derived from the existing lon/lat values.
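As an illustration of what happens behind the scenes (outside UX-Process, using the third-party pyproj library; the lon/lat values are made up), the conversion from lon/lat to projected coordinates is a straightforward coordinate transform:

```python
from pyproj import Transformer

# WGS 84 geographic (EPSG:4326) to UTM WGS 84 zone 18N (EPSG:32618),
# the zone used in this tutorial.
to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32618", always_xy=True)

longitudes = [-76.10, -76.09]   # hypothetical lon channel values
latitudes = [39.47, 39.47]      # hypothetical lat channel values

# transform() accepts sequences, so whole lon/lat channels can be converted
# in one call; the results correspond to the X_UTM / Y_UTM channel values.
x_utm, y_utm = to_utm.transform(longitudes, latitudes)
print(list(zip(x_utm, y_utm)))
```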
A variety of data types can be imported into an Oasis montaj database. For this step
we will import magnetic data in XYZ format.
2. The Import GX dialog asks if you want to import into the current database,
click the [No] button. The Create New Database dialog is displayed.
3. For the New database name, we would suggest using a qualifier either relating
to the survey ‘prefix’ you specified in the Plan a UXO survey dialog (i.e.
APG) or, if different instruments have been used, prefix the file correspondingly
(i.e. Mag). This is to assist in correlating all the data within your project. Also,
note the ‘suffix’ naming method used for the survey area grids. We would
suggest you maintain these when importing data files for these specific areas.
For example, if you are importing data for the survey area S1A001, add
(_S1A001) as the suffix.
4. You can leave the rest of the parameters to the default values, and click the
[OK] button. The Import survey data dialog is displayed.
5. Enter the values as indicated above and click the [OK] button. The data will be
imported into the database and displayed in the spreadsheet window.
6. You will notice that the Time channel in the database is in decimal units. To
make this channel more readable we have to modify its format type. Right-click
on the Time channel header cell and select Edit from the popup
menu. Using the Display/Format dropdown list, select (Time) and then click
the [OK] button. If your Time channel changes to a series of two asterisks,
then the channel is not wide enough to display the values. To widen the
channel, move your cursor over the right side of the channel header; when the
cursor changes to a double arrow, drag the channel edge to the right.
7. You will notice that markers have been added to the coordinate channel
headers to indicate which channels are currently defined to be the "current" X
and Y channels. The markers are little rectangles on the right side of the header
cell, and contain "x" and "y" in reversed display. These labels identify the
primary coordinate channels.
Importing EM Data
A variety of data types can be imported into an Oasis montaj database. For this step
we will import EM data in R61 format.
2. This dialog asks if you want to import into the current database, click the [No]
button. The Create New Database dialog is displayed.
3. For the New database name, specify (APGEM61_S1A001) and then you can
leave the remaining parameters to the default values, and click the [OK]
button. The Import survey data dialog is displayed.
4. Using the Import Data Type dropdown list, select (XYZ (mag or EM)) and
using the [Browse] button select the Data File as (em61.XYZ).
5. Click the [OK] button and the data will be imported to the database
(APGEM61_S1A001.gdb) and displayed in a spreadsheet window in your
current project.
Note: The Time channel is in decimal units and the Display/Format needs to be
changed to (Time). For more information on changing the format, see page
31.
When binary EM61 data is directly imported, a profile adjusting tool is automatically
triggered. For this exercise we will import an EM61 MK2 data set and adjust a
profile.
Note that, if the imported X and Y data requires interpolation it will automatically be
splined during import using the standard “Linear” straight-lined technique.
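As a rough illustration of what linear, straight-lined interpolation means for positions (a simplified sketch, not the actual import code; the fiducial and X values are made up), gaps in a coordinate channel are filled by straight-line interpolation between the recorded positions:

```python
import numpy as np

# Hypothetical X positions recorded only at some fiducials (NaN marks a gap).
fid = np.arange(8, dtype=float)
x = np.array([402775.3, np.nan, np.nan, 402778.3,
              np.nan, 402780.3, np.nan, 402782.3])

known = ~np.isnan(x)
x_filled = x.copy()
# Straight-line (linear) interpolation between the known positions.
x_filled[~known] = np.interp(fid[~known], fid[known], x[known])
print(x_filled)
```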
To Import EM61 MK2 Data into a Database:
3. In the New database name box, type (EM61MK2_S1A001), you can leave the
rest of the parameters to the default values, and click the [OK] button. The
Import survey data dialog is displayed.
4. Specify the Import Data Type as (R61) and then using the [Browse] button,
locate the EM61 MK2 data set (em61mk2.r61). Note that, you will need to
change the Files of type to (Files (*.r61)) on the Data File dialog, to display
the file to select.
5. The remaining parameters can be left at the intelligent defaults, as shown
above. Click the [OK] button. The Geonics Import dialog will be displayed.
6. This dialog asks, Would you like to adjust the Profiles? click the [Yes] button.
7. The UCEG61ADJUST dialog is displayed in the foreground and the map
(EM61MK2_S1A001.map) is displayed in the background. If the database
contains more than 10 lines the stacked profile map will be too cluttered. To
avoid this, the user is prompted to enter the preferred number of stacked
profiles per map.
8. The first set of stacked profiles is displayed and the user can adjust them. If
there is a second set, when the user presses “Done” the second set is displayed
and the user can go through the same mechanics of shifting the profiles. This
process will continue until all the profiles have been displayed.
The following procedure will demonstrate importing EM63 data, in XYZ format, into
a database file.
The EM63 data (em63.XYZ) included with the tutorial data has 26 time gate
channels (e.g. Ch1, Ch2, Ch3… Ch26). During the EM63 import process these time
gates can be combined into a single vector and presented as a decay curve right inside
the database.
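The decay behaviour represented by these gates is what the Time Constant Calculation option (UX-ParameterDetermination|EM Data) works from. As a simplified, hypothetical sketch only (a single-exponential model fitted in log space; the gate times and amplitudes are made up, and the actual UX-Process calculation may differ), a time constant can be estimated from the gate amplitudes like this:

```python
import numpy as np

# Hypothetical gate centre times (ms) and amplitudes (mV) for one station.
t = np.array([0.18, 0.28, 0.42, 0.63, 0.94, 1.41, 2.12, 3.18])
amp = np.array([309.0, 268.0, 219.0, 163.0, 104.0, 53.0, 19.4, 4.3])

# Fit amp ~ A * exp(-t / tau) by a straight-line fit of log(amp) against t.
slope, intercept = np.polyfit(t, np.log(amp), 1)
tau = -1.0 / slope
print("time constant ~ %.2f ms" % tau)
```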
Note: Because a default projection has been set for the current project and the EM63
dataset (em63.XYZ) includes Longitude/Latitude channels, the import of this
dataset will also include the automatic creation of projected X_UTM|Y_UTM
channels. For more information about the “Automatic creation of
X_UTM|Y_UTM channels from Lon|Lat channels”, see page 29.
2. This dialog asks if you want to import into the current database, click the [No]
button. The Create new database dialog is displayed.
3. In the New database name box, type (EM63_S1A001), you can leave the rest
of the parameters to the default values, and click the [OK] button. The Import
survey data dialog is displayed.
4. Specify the Import Data Type as (XYZ (EM63)) and then using the [Browse]
button, locate the EM63 data set (em63.xyz).
5. The remaining parameters can be left to the intelligent defaults. Click the [OK]
button and the Database Projection dialog is displayed.
6. Use the dropdown lists to select the current X and Y channels. For the X
Channel select (longitude) and for the Y Channel select (latitude) and click
the [OK] button. The Longitude and Latitude channels will be set as the
current X and Y channels and the data will be imported to the
(EM63_S1A001.gdb) database and displayed in a spreadsheet window. For
more information, click the Help ( ) button.
Note: The projected channels (X_UTM / Y_UTM) will be added to the database.
These channels contain the X|Y coordinates in the project’s default projection,
calculated from the existing Longitude/Latitude values.
7. View the channel, EM63, which displays your EM63 time gate channels in
array format.
Note: To view the longitude and latitude channels in geographic format, edit the
header cells (right-click and select Edit from the popup menu) and change the
channel Display format to (Geographic). The Time channel can also be
changed to a (Time) format.
Array channels, which contain multiple columns of single channel data, appear as
profile curves in certain cells of a database.
By representing data in an Array channel, all the readings for a single location can be
put into one column of the spreadsheet instead of having several channels for
multiple amplitude readings at each survey location. When dealing with Array
channels, it is important to remember that the channels do not display the data
numerically, but represent the data as a curve.
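A minimal sketch of the idea (plain Python/NumPy, outside Oasis montaj; the station and gate counts are just an example) compares the two layouts:

```python
import numpy as np

# Hypothetical EM63 data: 5 stations x 26 time gates. Stored as separate
# channels this would need 26 columns (Ch1 ... Ch26); stored as an array
# channel, each station keeps all 26 readings in a single cell, which the
# database then draws as a decay curve rather than as numbers.
n_stations, n_gates = 5, 26
em63 = np.random.default_rng(0).uniform(1.0, 300.0, size=(n_stations, n_gates))

print(em63[0])      # the full decay curve at the first station (one cell)
print(em63[:, 0])   # the first time gate along the line (one sub-channel)
```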
Oasis montaj has a number of tools for working with Array data, including an
interactive Array Channel Profile Viewer tool.
The following Array tools are available from the database popup menu (to access
popup menu, right-click on an Array channel cell).
• Array Viewer – Use to view Array data in an interactive display tool.
• Array Display Options – Use to specify the display options for an Array channel,
including the windows to display, base value, and range in values.
• Array Profile Colours – Use to specify a colour zone file to be used to colour
individual profiles in an Array channel.
• Array Profile Plotting – Use to plot an individual Array profile from an Array
channel in the current database.
• Array Section Colours – Use to specify a colour zone file to be used to colour an
Array channel in each database cell, when the ‘section’ display option is chosen (see
Array Profile/Section Options below).
• Array Profile/Section Options – Use to set options for displaying Array channels in
each database cell, using profiles, section colours, or both.
For more information on these, or on any Geosoft UX-Process System menu item,
click the Help ( ) button on the dialog of interest.
The Array Channel Profile Viewer displays the Array in the display window on the
right and provides a number of display options on the left. For more information,
click the [Help] button.
Instrument Dump
The Dump Geonics Instrument to File and the Import Geonics Dump File to
Database options, found under the UX-DataPreparation|Import menu, enable you to
import data in real time. The instrument (i.e. Scintrex Smartmag or Geonics EM61)
must be connected to your computer through the serial port, while you are doing the
survey. Then you can select the option to import the data directly into a Geosoft
database. Two import modes are available: either the data is directly dumped from the
console to the database (Data dump), or it is recorded and saved to the database
(Walk mode).
Note that, because this option requires the instrument to be connected to the computer
throughout the survey, it cannot be demonstrated in this tutorial.
Export
The UX-Process system enables you to export your data to a Geosoft XYZ file, with
options for header comments and export to or import from a Dig Sheet.
The Setup parameters can be included at the top of the exported file as comments.
Only parameters that are defined will be included, and if the projection is set, the
corresponding GXF parameters are also exported to the comment header.
1. Open and select the database you want to export to an XYZ file as
(APGMag_S1A001.gdb).
2. On the UX-DataPreparation menu, select Export|Export XYZ File. The Export
XYZ data dialog is displayed.
3. Specify the name of the new Output XYZ data file as (APGMag_S1A001).
Then using the dropdown menus select to Include Setup Parameters? as (yes)
and Include Comments? as (yes). We will leave the Comments file blank and
click the [Template] button. The Export XYZ Template dialog is displayed.
9. Click the [OK] button and the database file will be exported to XYZ format
and saved in your project directory.
10. To view this file, on the Edit menu, select Edit ASCII file. The Edit file using
(your text editor) dialog is displayed, enabling you to browse for the
APGMag_S1A001.XYZ file (see XYZ file below). For more information on
exporting a XYZ file, click the Help ( ) button on the Export XYZ data
dialog.
The Export To/Import From a Dig Sheet option enables you to export data from an
Oasis montaj database file to a dig sheet file in “Excel” format, or to import it back. This easy-
to-use dialog simplifies the export and import of dig sheet data directly to an Oasis
database.
The standard dig-sheet template, illustrated below, is used in the field. The field crew
may not have access to the UX-Process package, and should be able to work with a
standard spreadsheet. For this purpose we provide export from, and import to, a
standard Excel dig-sheet spreadsheet for our database format.
The illustration above displays the first 11 rows of the dig sheet template file
(DigSheet_Template.xls), which is copied to your current “working directory” and
also can be found in the ‘…|Oasis montaj|etc|’ directory.
Note that, when the dig sheet file is exported there is a title section that is filled as
much as possible with the Project Parameters. The dig sheet basically contains the
entire life cycle of the project: the original picks, the dug targets, the readjustments,
etc. The import does the same in reverse.
3. As this dialog can be used in both Export and Import Mode, you must first
specify your ‘Mode’ by selecting the Export radio button, bottom-right of the
dialog (the data arrows should now all be pointing right to left, database-to-
spreadsheet).
4. Then, using the Browse ( ) buttons, (bottom-left of dialog), locate the
Database file to export as (TruthTable_S1A001.gdb). In the Excel File box,
specify the name of the export file as (APGMag_Targets). The Top Left Cell
is the first data cell (below the general Project Information) to be added to the
export file.
5. Then, from the top of the dialog, for the Dig sheet Column Name (Unique
Target ID), select the Database Channel as (Target_ID).
6. In the center of the dialog, on the Original Survey tab using the Database
Channel(s) dropdown lists, select your channels that correspond to the Dig-
sheet Column Name (see dialog above) and then, select the Dig Results tab.
7. Again, using the Database Channel(s) dropdown lists, select your channels
that correspond to the Dig-sheet Column Name (see dialog above). When you
are satisfied with your selections, click the [OK] button to export the selected
channels to the Excel spreadsheet file, APGMag_Targets.xls.
Oasis montaj provides a Line Selection Tool that enables you to select or deselect
individual, multiple or a range of lines in the current database.
1. Open a database of interest (in this example we are using the database
APGMag_S1A001.gdb).
2. On the Data menu, select Lines|Selection tool. The Line Selection Tool will be
displayed.
3. Using this tool you can highlight single or multiple lines (holding <Shift> or
<Ctrl> keys) and then click the [Select highlight] button to select ( ) the lines
or click the [Deselect highlight] button to deselect ( ) them.
4. To include a range of lines, click the [Highlight by range] button and the
dialog will be expanded. In the text box provided, add a range of lines to be
either added or removed from the highlight.
5. Click the [OK] button to close the tool.
This section describes how to apply USACE Instrument Tests using the DoD UX-
Process system. Topics discussed in this section include:
• Static Test (page 46)
• Instrument Response (page 49)
• Optimum Sensor Height (page 52)
• 6 Line Test (page 56)
• Repeatability (page 60)
• Azimuth Test (page 62)
• Octant Test (page 66)
• Navigation Cross Test (page 67)
For more technical information about the individual instrument tests, click the Help
( ) button on the dialog of interest. This tutorial uses sample data provided with
the installation of the UX-Process system. The files used to demonstrate how to
perform the instrument tests can be found in the “…\Oasis montaj\data\usace”
directory. Copy these files (mag858_statictest.gdb, DynamicResponse.gdb,
sensorheight.gdb, 6line_em.XYZ, 1023QC_mag_final.XYZ (or
1023QC_EM_final.XYZ), RepeatabilityEM.gdb, AzimuthMag.gdb, and
OctantMag.gdb) to your working directory.
Static Test
Static tests are generally performed by collecting readings in automatic mode with the
instrument held in a stationary position. Readings should be collected at roughly the
same sampling frequency as used for the actual data, for a minimum of 3 minutes. It
is suggested to run the instrument for 3 minutes in a non-responsive environment –
then run it for 1 minute over a responsive object, and again for 1 minute in a non-
responsive environment. These tests should be done at the start and the end of every
day. The Static Test dialog automatically generates a profile view of the channel data
in the database, and flags data points residing outside the user-defined acceptable
range. This dialog then stores the static test data in a Master Database, named
StaticNoiseMaster.gdb, located in your working directory. Each instrument has its own
characteristic noise level; for example, the EM61 has acceptance criteria within +/-
2.5 mV, while the acceptance range of the EM61MK2 configuration depends on the
time gates, and decreases for the higher time gates.
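The flagging logic itself is simple. As a hypothetical sketch only (the readings are made up; the EM61-style +/- 2.5 mV range and the centring on the data mean follow the description above, but this is not the Static Test GX itself):

```python
import numpy as np

# Hypothetical static-test readings (mV) with an acceptable range of
# +/- 2.5 mV about the data mean.
readings = np.array([101.2, 100.8, 101.0, 104.1, 100.9, 101.3, 97.9, 101.1])
acceptable = 2.5

mean = readings.mean()
outside = np.abs(readings - mean) > acceptable
print("flagged samples:", np.flatnonzero(outside))
print("percent outside range: %.1f%%" % (100.0 * outside.mean()))
```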
The data file used in this test is the database (mag858_statictest.gdb). Ensure that
you have copied this file to your project directory.
1. Close all databases that are currently open in your project (Data|Close All
Databases) and then using the Data menu, select Open database. The Open
database dialog will be displayed.
2. Select the database (mag858_statictest.gdb) from your working directory and
click the [Open] button. The database will be opened and displayed in a
spreadsheet window in your current project.
3. On the UX-DataPreparation menu, select USACE Instrument Tests|Static Test.
The Static test dialog is displayed.
4. Using the Static calibration line dropdown list, select the line(s) to process as
(<Select 3 lines>). Note that, there are 3 lines in the test data file as described
above.
5. From the Channels to process dropdown list, select (All channels). If you have
set a Project name in the initial project set-up, then the project name will be
displayed (i.e. Aberdeen Proving Grounds), if not specify the name of your
project.
6. Using the Instrument name dropdown list, select (Magnetometer). In the
Instrument threshold (%) box you can specify a threshold percentage; points outside
the acceptable range will be flagged only if their number is above this threshold.
We will leave this blank.
7. In the Default Acceptable range box specify (0.5). Using the Default Display
units dropdown list, select (Instrument units) and we will leave the Grid
location, which is optional, blank. The Operator and the Calibration Date are
filled automatically from the setup parameters dialog. Using the Time
dropdown list, select (AM).
8. The [ChangeParms] button enables you to specify the acceptable range for
multiple channels. Click this button and the Static test dialog is displayed.
9. Using this dialog you can cycle through all the selected channels to modify the
acceptable range as required. The Acceptable range of (0.5) is appropriate for
both the BOTTOM_RDG and the TOP_RDG, therefore click the [Next>]
button until the Channel VRT_GRAD is presented. Change the Acceptable
range for the VRT_GRAD to (0.2), leaving the Display units as Instrument
units.
10. Click the [Finish] button and the Static calibration test dialog is displayed.
11. Using the dropdown list, select the lines as shown above (for more information
on the Static calibration parameters, click the Help ( ) button).
12. Click the [Finish] button and the UCECALIBRATE dialog is displayed.
13. This dialog asks, “Do you want to specify expected value(s)?”. If you know
what the expected values are for the specific channels, both with an object and
without an object, you can click the [Yes] button and the Static test dialog will
be displayed enabling you to select the channel(s) and specify the expected
readings.
14. For this tutorial we will click the [No] button and the database
(StaticNoiseMaster.gdb) will be created along with maps of the data channels,
which are displayed in graphical format. The 3 static test lines for each channel
are displayed on the maps, and as many page-size maps as required are generated.
Note: The acceptable range should include the noise envelope. The units of the
range are standard deviations or instrument units. However, in both cases the
plots are centered on the data mean. Points outside the acceptable range are
highlighted in red.
Instrument Response
The data file used in this test is the database (DynamicResponse.gdb). Ensure that
you have copied this file to your project directory.
1. The dynamic response should ideally be collected twice a day: at the beginning
of the day’s survey, as well as at the end of the day. Both profiles, however, are
imported onto the same database line. It is recommended that the database lines
be named so as to reflect the date of the survey.
2. On the UX-DataPreparation menu, select USACE Instrument Tests|Instrument
Response. The Instrument response test dialog is displayed.
3. Using the dropdown list, select the Type of response test as (Dynamic
Response) and click the [Next] button. The Select Dynamic Response file(s)
dialog is displayed.
4. The Dynamic response file can be in Geosoft (*.gdb) or (*.xyz) format. If in
XYZ format, the Template dialog will be displayed enabling you to specify the
channel parameters to be imported.
5. Select the dynamic response file (DynamicResponse.gdb) from your working
directory and click the [Open] button. The Dynamic Response Test dialog will
be displayed in the foreground and the database will be displayed in the
background.
6. Complete the dialog as shown above and click the [OK] button.
Note: Normally the first line in the database is the reference line to which you will
be comparing the subsequent responses (Line to compare). Select the Target
channel as the channel that contains the response data, and the Position
channel as the channel that reflects the ground position of the buried object(s)
along the line.
7. The results are displayed on a Dynamic Response map as shown below.
Top Profile Graph: The top profile graph displays the reference data in blue and the
data that is being compared in red. The target position markers indicate the location
of the objects. The upper markers pertain to the first profile collected, and if a second
profile is collected on the same day the lower markers delineate its location.
Lower Profile Graph: The lower graph displays all the profiles accumulated to date
highlighting the current comparison response.
Scatter Graph: The scatter graph displayed at the bottom of the map depicts the
deviation of the comparison profile from the reference profile. Ideally the scatter
graph should display all the data points along the diagonal.
The object of this test is to determine the height best suited for a particular survey. A
single test line with known calibration objects will be run with the sensor at three
different heights. This test will display the three profiles superimposed, from which
you can select the test height. The signal-to-noise ratio for each profile is reported
along the right margin, and should be inspected in order to select the optimum height.
The user must supply the position of the calibration object (such as a steel trailer
hitch ball), as well as a position depicting typical noise (such as nails or shells). For
each profile the dialog will relocate the supplied position to the nearest peak before
calculating the signal-to-noise ratio.
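The peak relocation and signal-to-noise calculation can be pictured with the following sketch. It is illustrative only, not the UX-Process algorithm: the function name, the search window, and the assumption of plain distance/response arrays are all hypothetical.

```python
import numpy as np

def signal_to_noise(distance, response, target_pos, noise_pos, window=0.5):
    """Relocate target_pos to the nearest peak within +/- window, then return
    the relocated position and the peak/noise amplitude ratio."""
    distance = np.asarray(distance, float)
    response = np.asarray(response, float)

    near_target = np.abs(distance - target_pos) <= window
    peak_idx = np.flatnonzero(near_target)[np.argmax(response[near_target])]

    near_noise = np.abs(distance - noise_pos) <= window
    noise_amp = np.max(np.abs(response[near_noise]))

    return distance[peak_idx], response[peak_idx] / noise_amp
```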
The Height optimization test menu item enables you to make this comparative profile
plot of three lines in a database. The data used in this test is the (SensorHeight.gdb).
Note: When calculating the distance channel from X, Y you should specify the
Target position along the axis that changes the most, in this case the X
channel. The projection to the distance channel happens internally.
5. Specify the Target position as (402792.3).
Note: The Target position does not need to be exact because the software will
examine the lines to find the closest point that matches the location.
6. Using the Line name dropdown lists ( ) select the lines to use for the
comparison as (L0_1meter, L0_4meter and L0_7meter) and then in the
associated Height text boxes, specify the height for each line as (0.10, 0.40,
0.70) as shown in the dialog above.
7. Using the radio buttons, select Plot surrounds as (Yes), Override background
level as (Yes), and Overwrite map as (Yes).
8. Then specify the Title as (Height Optimization Test). You can also specify a
Horizontal scale (in units/mm) or in this case leave the parameter blank to use
the default scale. A default scale is chosen to make the plot window 15 cm
wide.
9. Click the [OK] button and the Background specification dialog will be
displayed.
10. This dialog explains how to interactively identify the background level using
your cursor and the comparative profile sensor height map. Click the [OK]
button and the profile plot of the three selected lines will be displayed in a map
and your cursor will be in digitizing mode.
12. Click the [Yes] button and the background ratios will be added to the legend of
your map (SensorHeight_L0_1meter_L0_4meter_L0_7meter.map).
Note: The legend describes the amplitude and location of the peak for each profile
as well as the relative signal-to-noise ratios of the profiles. The user may use
this aid to decide on the optimum height to use for the survey. In general the
lowest sensor height would yield the highest signal-to-noise ratio and may
often be considered the better choice. However, when working in an area
with a lot of buried debris of no consequential interest to the survey, selecting
a higher height will effectively diminish the fragment and debris effect to the
surrounding noise level, while the anomalies of interest will still clearly come
through.
6 Line Test
This test should be performed in an area relatively clear of anomalous response. The
test line will be well marked to facilitate data collection over the exact same line each
time the test is performed. Background response over the test line is established in
Lines 1 and 2. A standard test item, such as a steel trailer hitch ball, will be used for
Lines 3 through 6. Heading effects, repeatability of response amplitude, positional
accuracy, and latency are evaluated.
The data set used in this test is the XYZ file (6line_em.XYZ). However, an
alternative Mag data file (1023QC_mag_final.XYZ) is also included in the
“…\Oasis montaj\data\usace” directory for you to use to run the 6 line test.
2. Select the type of input data as (XYZ), and click the [Next>] button. The Input
XYZ file dialog is displayed.
3. Using the Browse ( ) button, select the XYZ file (6line_em.XYZ) and then
click the [Next>] button. The 6line test: Input parameters dialog is displayed.
4. Using the dropdown lists, select the Data Channel as (TG1) and the Main
direction channel as (X). Leave the Orthogonal direction channel (optional)
blank so that the horizontal axis on the output map corresponds to the data X
axis. Note that, when you specify an Orthogonal direction channel a distance
channel will be calculated and the results are then calculated based on the
distance channel.
5. Specify the Target position along main direction as (402784.8) and specify the
Lateral tolerance as (0.35).
6. For the Ignore multi-directional lines? parameter we will accept the default
value (Yes).
7. Specify a Title; the Project should default to the Aberdeen Proving
Grounds. Using the Instrument dropdown list, select (EM-61 Mark II).
8. Click the [OK] button. The 6line test: Line selection dialog is displayed.
Note: Line direction and other relevant information can be found in the XYZ file
header. To view the header information, open the XYZ file (6line_em.XYZ)
using your text editor, for example Notepad. The file header for the XYZ file
(6line_em.XYZ) is shown below.
9. Using the dropdown lists, select the lines as shown in the dialog above based
on the header information in the XYZ file (6line_em.XYZ).
10. Click the [OK] button and the map (6line_em_6LineTest.map) will be
displayed.
11. Since all these lines are superimposed, they are colour coded. The legend
describes the colour coded survey lines, as well as indicating the direction in
which each line was surveyed. The lines that contain an item are compared
and the directional offset is displayed in the legend.
12. Each peak that deviates laterally more than the limit is flagged with a triangle,
and if the amplitude tolerance is exceeded it is flagged with an inverted
triangle.
Note: The latency calculated by the 6-Line test and displayed on the generated map
can be used to lag correct the data.
Repeatability
The Repeatability test dialog enables you to evaluate the repeatability of your data.
The results are plotted against the original survey and compared for anomaly
location consistency.
Note that, it is recommended that before you proceed with the Repeatability test you
visually inspect the data (RepeatabilityEM.gdb) for evidence of a lag or DC shift
and correct the data before you continue.
1. Before you begin the repeatability test you need to open the database to test.
On the Data menu select Open database and the Open Database dialog is
displayed. Select the database file RepeatabilityEM.gdb and click the [Open]
button. The database will be displayed in a spreadsheet window.
2. On the UX-DataPreparation menu, select USACE Instrument
Tests|Repeatability. The Repeatability test dialog is displayed.
3. Using the dropdown lists, select the Data Channel as (ch1) and the Amplitude
Units as (Instrument units).
4. Specify the Amplitude tolerance as (5.0), the Lateral tolerance as (0.2) and the
Project as (Aberdeen Proving Grounds). Then, using the Instrument
dropdown list, select (EM-61 Mark II).
5. Click the [RepeatList] button and the Provide line pairs to process dialog will
be displayed.
Note: This dialog will automatically pick up lines with the same line name but
different versions, as pairs. However, in this example our data does not
conform to this model, and we will proceed to select the pairs of lines to be
processed for repeatability.
6. Using the dropdown lists, select the Base line as (Lgood_east) and the Repeat
line as (Lgood_west). To select another line pair, click the [Next>] button.
7. Repeat the step above adding the Base line as (Lcables_east) and the Repeat
line as (Lcables_west). You can continue adding line pairs, or to process, click
the [Finish] button and the Repeatability test dialog is again displayed.
8. The [ReScale] button enables you to modify the vertical span of the data. For
example, if the default vertical range is not satisfactory and smaller signals of
interest are not well resolved, you can click this button to override the default
range by specifying the minimum and maximum values.
9. Leave the remaining parameters at the intelligent defaults and click the [OK]
button. The map (RepeatabilityEM_1.map) is created and displayed in your
current project.
Azimuth Test
The Azimuth test creates an Azimuth Map file and a Heading Correction Table file,
which can be used for heading corrections, and saves them to your current working
directory.
The essence of this test is the same as the octant test (page 66) with the exception that
the sensor must be maintained in a fixed position while the operator revolves slowly
around it. The instrument is normally put on continuous read mode and the operator
rotates at a constant speed. Markers may be added at each cardinal direction. The
aim of the azimuth test is to investigate if the instrument manifests lateral drop out or
dead zones. The user must take note of these ineffective orientations and strictly
avoid them while performing the survey.
1. Create a new blank database and name it (Azimuth_mag.gdb). (On the Data
menu, select New database. In the New database name box, specify
(Azimuth_mag) and accepting the remaining default parameters, click the
[OK] button. An empty database is opened in your current project).
2. Import all of the channels in the XYZ file (Azimuth_mag.xyz) into the new
database. (On the Data menu, select Import|Geosoft XYZ. From the XYZ data
file dropdown list, select (azimuth_mag.XYZ) and accepting the remaining
default parameters, click the [OK] button. The file is imported into the current
database.)
3. Observe the flagged Marker Channel (MARK). At every 90 degree mark, the
marker has been incremented.
4. Create a new channel called "Azimuth" of data type “double”. (Select an empty
channel header cell, right-click and from the popup menu select New. The
Create Channel dialog is displayed. In the Name box, specify (Azimuth) and
in the Data type box ensure that (Double) is selected and accepting the
remaining default parameters, click the [OK] button. The empty channel is
displayed in the current database.)
5. Set the value of the Azimuth channel at the first MARK = 4 to 360.0, then set
the value of the first MARK = 3 to -90.0, then the first MARK = 2 to -180, and
then the first MARK = 1 to -270. (Note that, displaying the Mark channel in
profile view will assist in locating the incremental change locations.)
6. The next step is to linearly interpolate the “Azimuth” channel allowing for
edge extrapolation and place the results in the "Azimuth" channel. (On the
Utility menu, select Interpolate. Using the Channel to interpolate dropdown
menu, select (Azimuth) and then specify the Output interpolated channel as
(Azimuth). Using the dropdown menus, select the Interpolation method as
(Linear) and to Interpolate dummy edges? as (Yes). Click the [OK] button and
the interpolated results will be calculated.)
Note: If the Azimuth channel is not displayed, select an empty channel header cell,
right-click and from the popup menu, select List. From the channel list select
(Azimuth) and click the [OK] button, the channel is displayed in the current
database.
7. The first point however does not point due north, but rather at 63.91 degrees
east of north. To correct for this the following math expression must be applied
to the azimuth channel: Azimuth=Azimuth+63.91. (Click three times on the
“Azimuth” channel header cell to select the entire database channel and then
press the < = > key. The formula bar will display “Formula=” at the bottom of
the database window. Type the formula “Azimuth=Azimuth+63.91” in the box
provided and press the <Enter> key to calculate the true azimuth for your
data.)
8. You now have an Azimuth database (Azimuth_mag.gdb) to use for the
azimuth test.
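For readers who prefer to see the arithmetic, the interpolation and rotation steps above amount to the following sketch. The fiducial positions of the MARK changes and the number of records are hypothetical; in practice they come from your own Azimuth_mag.gdb database.

```python
import numpy as np
from scipy.interpolate import interp1d

n_rows = 1000                     # hypothetical number of records
azimuth = np.full(n_rows, np.nan)

# Hypothetical fiducials at which the MARK channel first changes value
azimuth[40] = 360.0               # first MARK = 4
azimuth[280] = -90.0              # first MARK = 3
azimuth[520] = -180.0             # first MARK = 2
azimuth[760] = -270.0             # first MARK = 1

# Linear interpolation with edge extrapolation (step 6)
known = np.flatnonzero(~np.isnan(azimuth))
lin = interp1d(known, azimuth[known], kind="linear", fill_value="extrapolate")
azimuth = lin(np.arange(n_rows))

# Equivalent of the expression Azimuth = Azimuth + 63.91 (step 7), applied because
# the first point faces 63.91 degrees east of north rather than due north
azimuth += 63.91
```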
Azimuth Test
The Azimuth database file (Azimuth_mag.gdb) created in the previous step can be
used to perform the Azimuth Test.
3. Click the [Update List] button to update the dropdown lists of the Azimuth test
dialog. Using the dropdown lists, select Azimuth channel as (Azimuth) and
Data channel as (BOTTOM_RDG). Then, specify an Amplitude tolerance of
(0.7). Values deviating from the data mean by more than this tolerance will be
flagged with a red circle on the map.
4. Click the [OK] button, the Azimuth map (APG_S1_Azimuth_Heading.map)
will be displayed in your project and a heading table file
(APG_S1_Azimuth_Heading.tbl) will be generated and saved to your
working directory.
5. The profiles that exceed the amplitude tolerance are flagged along the top or
bottom of the graph with triangle markers. Note the polar diagram of the
instrument response in the margin. If this polar diagram were a circle, that
would signify that there are no heading variations.
6. The Geosoft Heading Correction Table file, which is generated and saved
along with the Azimuth Map file to your working directory, can be viewed
using your text editor program (e.g. Notepad). This table file can be used for
Heading Corrections.
Octant Test
The Octant test is performed in order to determine the heading effects with direction
of data collection. It should be performed in an area relatively clear of anomalous
response. Four lines will be clearly marked on the ground. These lines will intersect
at the same central point. The lines will run relative to the North axis at 0º, 45º, 90º
and 135º angles. Each line is then surveyed in both directions, and the results are
saved in an Oasis montaj database.
The Octant dialog reads all the surveyed lines and calculates the average intersection
point of all lines. It then extracts from each line the closest point to the intersection.
Changes or fluctuations in the sensor readings are saved in a heading table and
displayed on a map.
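In essence the heading table records, for each survey heading, the deviation of the reading closest to the intersection from the mean of all headings. A minimal sketch of that idea is shown below, using hypothetical readings rather than the OctantMag.gdb values; it is not the exact UX-Process implementation.

```python
import numpy as np

def octant_heading_table(readings_by_direction):
    """Build (direction, correction) pairs from the reading extracted closest
    to the common intersection point for each survey heading; the correction
    is the deviation of that reading from the mean of all headings."""
    directions = sorted(readings_by_direction)
    values = np.array([readings_by_direction[d] for d in directions], float)
    mean = values.mean()
    return [(d, v - mean) for d, v in zip(directions, values)]

# Hypothetical sensor readings at the intersection for the eight headings
readings = {0: 51.2, 45: 52.0, 90: 53.7, 135: 55.9,
            180: 57.3, 225: 55.8, 270: 54.6, 315: 54.1}
for direction, correction in octant_heading_table(readings):
    print(f"{direction:4d} {correction:+.2f}")
```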
The Octant dialog is used to calculate and display, in cardinal directions, the octant
test. The data used in this test is the (OctantMag.gdb).
1. Open and select (highlight) the database that contains the data you wish to run
the octant test on (OctantMag.gdb).
2. On the UX-DataPreparation menu, select USACE Instrument Tests and then
select Octant Test. The Octant test dialog is displayed.
6. When you run the Octant test, an octant map (APG_S1_Octant.map) and
table file (APG_S1_Octant_Heading.tbl) are created and saved in your
working directory. The octant table file contains the heading information. This
table file can be used to correct your data for heading errors. An example of the
octant heading table is shown below.
The Navigation Cross Test is used to determine the accuracy of your GPS data; the
results are posted to a summary report and a map showing the pin location is
automatically generated. This test (also known as the Cloverleaf test) determines the
accuracy of the data by crossing over a known location point.
The test consists of putting up to 3 markers on the ground, and surveying directly
over them. The number of markers should match the number of sensors in the array.
The provided example consists of a single marker, and a single EM-61 Mark II
sensor. The operator crosses directly over the marker, then loops and crosses it again
at about a 90 degree angle to the first cross over direction. The navigation cross
process asks the user for the exact location of the pin, and then for all the selected lines
it finds where the line crosses over itself. Then, the distance of the cross over from
the actual pin location is reported. All crossovers outside the specified radius are
ignored. In addition to the distance between the pin and the cross over, the fiducial
and amplitude radial distances are also reported.
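The crossover logic can be pictured as a brute-force self-intersection search over the line path, as sketched below. This is illustrative only (the UX-Process algorithm is not documented here) and assumes plain X/Y coordinate arrays for a single line.

```python
import numpy as np

def _segment_intersection(p1, p2, p3, p4):
    """Intersection point of segments p1-p2 and p3-p4, or None."""
    d1, d2 = p2 - p1, p4 - p3
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None                                   # parallel segments
    t = ((p3[0] - p1[0]) * d2[1] - (p3[1] - p1[1]) * d2[0]) / denom
    u = ((p3[0] - p1[0]) * d1[1] - (p3[1] - p1[1]) * d1[0]) / denom
    return p1 + t * d1 if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0 else None

def crossover_offsets(x, y, pin, radius):
    """Brute-force search for points where the line path crosses itself,
    reporting the distance of each crossover from the known pin location.
    Crossovers farther than `radius` from the pin are ignored."""
    pts = np.column_stack([x, y]).astype(float)
    pin = np.asarray(pin, float)
    hits = []
    for i in range(len(pts) - 1):
        for j in range(i + 2, len(pts) - 1):          # skip adjacent segments
            xo = _segment_intersection(pts[i], pts[i + 1], pts[j], pts[j + 1])
            if xo is not None:
                dist = float(np.hypot(*(xo - pin)))
                if dist <= radius:
                    hits.append((xo, dist))
    return hits
```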
In this case, the known point (calibration grid Rebar #4: X = 402813.51, Y =
4369603.24) had a measuring tape placed on the ground crossing the rebar to help the
operator cross directly over the point. It is recommended that the operator put guides
on the ground to make sure they go over the pin every time.
The data set used for this test is the (NavCross.gdb). This database includes four
lines: Lgood, Lfloat, Loffset and Lbody. Information about each line is outlined
below:
• Lgood: GPS was at 'INT' with 8 satellites; offset is ~5cm (multiple good passes in this line)
• Lfloat: GPS was at 'FLT' accuracy; offset is 9cm
• Loffset: pretty good, but didn't use a tape measure on the ground to keep on line; offset is 38cm
• Lbody: lining up with center of body instead of center of instrument, which was lined up with the operator's shoulder; offset is 24cm
To Perform Navigation Cross Test:
1. Open and select (highlight) the database that contains the data you wish to run
the test on (NavCross.gdb).
2. On the UX-DataPreparation menu, select USACE Instrument Tests|Navigation
Cross Test. The Navigation Cross Test dialog is displayed.
3. Using the dropdown list, select the Input GDB file as (NavCross.gdb) and the
Lines to process as the (Selected lines).
4. In the Distance tolerance box, specify the tolerance in ground units, as (0.2)
and specify the Filename for summary report as (APG_NavCrossTest.txt).
5. Specify the Pin1 X location as (402813.51) and the Pin1 Y location as
(4369603.24) and then select the channel to process as (ch1).
6. As this dataset only has one Pin we can leave the remaining parameters blank
and click the [OK] button. The map file (_navcross.map) is created and
displayed along with the summary report “APG_NavCrossTest.txt”.
Data corrections
Many types of geophysical data contain a time-varying error fundamental to the type
of data being measured. A variety of factors can cause errors: diurnal variations of the
earth's magnetic field, instrument heading, and instrument drift. Tools exist for
correcting these errors.
This section describes how to correct your data. Topics discussed in this section
include:
• Base Station Correction (page 71)
• Heading Correction (page 76)
• Instrument Drift Correction (page 78)
To find more technical information on the data correction parameters, click the Help
( ) button on the dialog of interest. This tutorial uses sample data provided with the
installation of the UX-Process system. Make sure that you have copied the data files
provided into your project directory. The files used to demonstrate how to correct
data are: mag858BaseStation4.xyz, APG_S1_Octant_Heading.tbl, em61Data.gdb,
and the database (APGMag_S1A001.gdb) created earlier in this tutorial.
This section will demonstrate how to apply base station corrections on raw surveyed
magnetic data. This procedure requires the database file (APGMag_S1A001.gdb)
(created in section, Importing Magnetic Data on page 30) that contains the surveyed
data, and the ASCII file (mag858BaseStation4.xyz) that contains the coinciding base
station readings. The cross referencing is performed by using the time (and an
optional date channel). If the base station data contain spikes above a tolerance
specified in the first dialog, then this dialog will locate the largest spike found in the
base station data, display it in the Oasis project and prompt you to filter the base
station data prior to applying the base station correction again. Similarly, if the base
station data manifest data jumps, you can remove them.
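At its core, the base station correction cross-references the two data sets by time and removes the time-varying base field from the survey readings. A minimal sketch is shown below; the assumption that the diurnal variation is measured relative to the base-station mean is illustrative only, and the reference level actually used by UX-Process may differ.

```python
import numpy as np

def base_station_correct(survey_time, raw_mag, base_time, base_mag):
    """Interpolate the base-station field to the survey times (the time-based
    cross-referencing) and remove the diurnal variation about the base mean."""
    survey_time = np.asarray(survey_time, float)
    base_time = np.asarray(base_time, float)
    base_mag = np.asarray(base_mag, float)
    diurnal = np.interp(survey_time, base_time, base_mag)
    return np.asarray(raw_mag, float) - (diurnal - base_mag.mean())
```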
Note: The survey data is comprised of 2 surveys taken at different times. The base
station file covers both time spans. This process may take longer than the
others as it has to cross-reference hours’ worth of data.
1. On the Data menu, select Open database. The Open Database dialog will be
displayed. Select the database (APGMag_S1A001.gdb) from your working
directory, and click the [Open] button. The database will be displayed in a
spreadsheet window.
3. Using the [Browse] button, select the ASCII Base station file
(mag858Basestation4.XYZ) that has been included with the tutorial data.
Note: If the Files of type defaults to file type (*.bas), use the Files of type
dropdown menu to select (Files *.xyz) and display the (*.xyz) files.
4. Leave the Input GDB Date channel (optional) blank, as the data does not
contain this information.
Note: If you omit entering the Input GDB Date channel then the cross referencing is
performed solely using the Input GDB Time channel, and the date is ignored.
5. Using the Input GDB Time channel dropdown list, select (TIME).
6. Specify the acceptable Base station tolerance (nT/sec) as (1.0). The purpose of
the tolerance field is to enable on-the-fly filtering of potential noise in the base
station data. Base station data varies gently with time, and any abrupt
variations are attributed to noise.
7. The process generates an intermediate database of the base station data. You
are offered the option to Save interim base station GDB? By default this
database lives only through the duration of the process; it will have the base
name of the base station data and the extension GDB. For this tutorial we will
select (No).
8. Click the [Next>] button. Note that, if there is more than one header line in the
input XYZ file the process will prompt you with the USEMAGBASE dialog
asking you to identify the base station header line that contains the channel
names.
9. Click the [OK] button and the base station file (mag858Basestation4.XYZ)
will be displayed for you to view, in your default text editor (i.e. Notepad).
10. Examine the file to determine which header line contains the channel names
and then close the file. The Enter field label # or -1 if none dialog is
displayed.
11. Specify the correct line number as (3) and click the [OK] button. The Provide
channel pairs to process dialog is displayed.
12. This dialog is used to specify all the channels that you would like corrected,
along with the corrected channel name. Using the Input GDB Raw mag
channel dropdown list, select the channel to be corrected as (bottom). Then,
click the [DefOuputChannel] button and the name of the Output GDB
Corrected mag channel will be assigned as (bottom_BAS). To specify another
channel to correct, click the [AddChannel] button. The Provide channel pairs
to process dialog is again displayed. Proceed by selecting the raw input
channel as (top) and clicking the [DefOuputChannel] button to assign the
corrected output channel as (top_BAS).
13. Click the [Next>] button. The Base Station Correction file dialog is displayed.
Note: If there are no headings for the fields in the base station correction file, then
the first row of ‘values’ for the fields in the base station file are displayed, and
can be used as a guide.
14. Using the Basemag field dropdown list, select the field name of the magnetic
data in the base station file as (MAG).
Note: If you did not select a date field in the previous dialog, then you will not be
prompted for a date field in the current dialog. If however you have specified
a date field in the previous dialog pertaining to the database entries, and the
base station file does not contain a date field, you can override the date, by
clicking the [Override Date] button, and providing a date for cross-
referencing with the date in the GDB file.
15. Using the Time field dropdown list select the field name of the time data in the
base station file as (TIME).
16. Click the [OK] button. If the base station data exceeds the tolerance specified,
a warning message is displayed.
Note: The base station data that represents the greatest deviation from the specified
tolerance is displayed and highlighted in the database profile window (as
shown above). The base station data is displayed at the survey frequency, and
the correction (if you decide to proceed with it) is performed on the original
base station data, prior to interpolating it to the increment of the survey data.
17. Click the [OK] button to proceed with the corrections. You will be prompted
to de-spike the base station data by applying a non-linear filter.
18. Click the [Yes] button to de-spike/de-step the base station data. You will again
be prompted and asked if you want to walk through the de-spiking/de-stepping
process?
Note: If you click [Yes] the system will walk you through each spike/step that
exceeds the tolerance; however, if you click [No] the de-spiking/de-stepping
will be done in one step.
19. Click the [Yes] button and the dialog is displayed asking, “Would you like to
de-spike/de-step at Fid 2570?”
20. Click the [Yes] button and you will be prompted with the next spike/step in the
data. Continue selecting [Yes] until all of the spikes that exceed the tolerance
are corrected. The data is displayed in the database and the profile window as
shown below.
Note: If you are not satisfied with the filtering that has been applied, you can run the
non-linear filter from the X-Utilities|Filters|Non-linear filter menu and adjust
the filter parameters.
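The exact non-linear filter used by UX-Process is not reproduced here; as a rough stand-in, a rolling-median de-spike conveys the idea of removing abrupt variations that exceed the tolerance. The function below is a hypothetical sketch, not the Geosoft filter.

```python
import numpy as np
from scipy.signal import medfilt

def despike(base_mag, width=11, tolerance=1.0):
    """Replace samples deviating from a rolling median by more than
    `tolerance` with the median value; returns the cleaned data and the
    fiducial indices of the samples that were replaced."""
    base_mag = np.asarray(base_mag, float)
    smooth = medfilt(base_mag, kernel_size=width)     # width must be odd
    spikes = np.abs(base_mag - smooth) > tolerance
    cleaned = base_mag.copy()
    cleaned[spikes] = smooth[spikes]
    return cleaned, np.flatnonzero(spikes)
```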
Heading Correction
The Octant and Azimuth tests, which are covered in the Instrument Tests section,
output the heading correction table. The heading correction table is a small text file
with 2 readings per line, consisting of the azimuth angle and the relative difference of
the reading from the average in that direction. A Geosoft heading table will have a
format similar to:
/ Geosoft Heading Correction Table
/
/= Direction:real:i
/= Correction:real
/
/ Direction Correction
0 -3.25
45 -2.25
90 -0.42
135 +1.98
180 +3.19
225 +1.63
270 +0.48
315 +0.00
360 -3.25
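Applying such a table amounts to interpolating the correction at each survey heading and removing it from the reading. The sketch below uses the example table above; the assumption that the correction is subtracted is for illustration, since the sign convention is not stated in this tutorial.

```python
import numpy as np

def heading_correct(reading, heading_deg, table_dirs, table_corr):
    """Interpolate the heading table at each survey heading and remove the
    heading-dependent deviation from the reading."""
    heading = np.mod(np.asarray(heading_deg, float), 360.0)
    correction = np.interp(heading, table_dirs, table_corr)
    return np.asarray(reading, float) - correction

# The example table shown above
dirs = np.array([0, 45, 90, 135, 180, 225, 270, 315, 360], float)
corr = np.array([-3.25, -2.25, -0.42, 1.98, 3.19, 1.63, 0.48, 0.0, -3.25])
print(heading_correct([48710.0], [63.9], dirs, corr))
```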
This tutorial requires the survey dataset (APGMag_S1A001.gdb) and the table file
(APG_S1_Octant_Heading.tbl) generated earlier.
1. Open and select (highlight) the database from the previous (base station
correction) step (APGMag_S1A001.gdb).
2. On the UX-DataPreparation menu select Data Corrections and then select
Heading Correction (if a database is not already open, you will be prompted to
select one to apply the corrections to). The Heading correction dialog is
displayed.
Data Display Tip
1. Select (highlight) an empty channel header, right click and from the popup
menu, select List.
2. From the List tool, select the channel you wish to view (e.g. bottom_head) and
click the [OK] button. The channel will be displayed in the spreadsheet
window.
1. Select (highlight) the channel header for the channel you wish to view in
profile format (e.g. bottom_head), right click and from the popup menu, select
Show profile.
2. The profile of the channel will be displayed in the profile window.
Without any known controls, the default setting for the drift correction is calculated
to be a non-linear long wavelength signal with a wavelength of 100 seconds, and a
tolerance equal to 1% of the standard deviation of the data. These parameters can be
altered if the result is not satisfactory, by selecting the [Advanced] option of the
Apply instrument drift calculations dialog. If spatial overlapping data is available the
difference at the overlap points can control the long wavelength drift calculation.
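A rough stand-in for this default behaviour is sketched below: the long-wavelength drift is estimated with a moving average of roughly the specified wavelength after clipping extreme values, and then removed. This is illustrative only; it is not the Geosoft non-linear filter, and the clipping percentiles and window choice are assumptions.

```python
import numpy as np

def drift_correct(time_sec, data, wavelength=100.0, ignore_low=5, ignore_high=5):
    """Estimate long-wavelength drift with a moving average of roughly
    `wavelength` seconds after clipping the lowest/highest percentiles of the
    data, then remove the drift while preserving the overall level."""
    data = np.asarray(data, float)
    lo, hi = np.percentile(data, [ignore_low, 100 - ignore_high])
    clipped = np.clip(data, lo, hi)                   # suppress strong anomalies

    dt = np.median(np.diff(np.asarray(time_sec, float)))
    width = max(3, int(round(wavelength / dt)) | 1)   # odd number of samples
    drift = np.convolve(clipped, np.ones(width) / width, mode="same")
    return data - drift + drift.mean(), drift
```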
This procedure requires the EM database file (em61Data.gdb) that is provided with
this tutorial.
1. On the Data menu, select Open database. The Open Database dialog is
displayed. Select the em61Data.gdb from your project directory and click the
[Open] button. The database will be opened and displayed in your project in a
spreadsheet window.
2. On the UX-DataPreparation menu, select Data Corrections|Instrument Drift
Correction. The Apply instrument drift corrections dialog is displayed.
3. Using the Time channel dropdown list, select (TIME), and from the Input data
channel dropdown list, select (M1). In the Output data channel dropdown list
specify a new channel (M1_instr_drift_corr).
4. The Reference/Tie Line Data Channel will not be used in this exercise. We will
accept the default and leave the field blank.
Note: The Reference/Tie Line data consist of tie line and transect intersections. The
reference data correction occurs before the drift correction and if left blank
(the default) the reference data correction will not be applied.
5. Using the %of lowest values to ignore dropdown list, select (5) and from the
%of highest values to ignore list, select (5).
6. Select the Drift correction method as (Non-linear filtering). From the
Despike? dropdown list, select (No). Click the [OK] button and the Non-linear
filter parameters dialog is displayed.
Note: The drift correction methods available are, Non-linear filtering, Zero, First
and Second order trend removal, Median filtering and All five above methods.
7. You can accept the default values and click the [OK] button to apply the
instrument drift.
8. You can display the input channel (M1) as well as the drift corrected channel
(M1_INST_DRIFT_CORR) for inspection. To display data channels in the
profile window, select the channel header, right-click and from the popup
menu, select Show profile.
Note: If the drift as calculated is not satisfactory you can run the Apply instrument
drift correction again and modify the parameters for the drift correction
method selected. The “Non-linear filter” parameters are wavelength and drift
tolerance. The wavelength of the non-linear filter defaults to 100 units.
Depending on the signal, this default may not yield a good representation of the
drift. The drift is a long wavelength smooth signal. A tolerance of 1% of the
standard deviation of the observed data should adequately generate a smooth
drift signal. If however you observe a visible chatter in the calculated drift,
decrease this parameter.
The user may lay out the survey so as to derive a relative knowledge of the drift from the
survey itself. This approach consists of running a number of tie lines that intersect the
transects. At the intersections one will have the amplitude of both the transect and the
tie line. These points are reference points that can be used to control the calculated
long wavelength drift. The frequency of these points depends on the survey design
and the character of the drift. Prior to performing the drift correction, the user should
generate a new channel in the transect database by clicking the [CreateRefCh]
button. This channel will be populated with dummies except at the intersection
points, where the amplitude of the tie lines is saved. The drift correction option then
should be supplied with the name of this reference point channel in order to ensure
that the calculated drift honours the actual drift at the specified intersection points.
Note: The database provided for the Instrument Drift Corrections (em61data.gdb)
does not include X, Y channels and therefore cannot be used to create a reference
channel. However, the channel referencepoint is supplied to use for the
following procedure.
3. In the output data channel specify (M1_drift_ref) and this time for the
Reference data channel, select (referencepoint). This channel contains a
second reading at a given spatial point.
Note: If X, Y data for both transects and tie lines exist, you can click the
[CreateRefCh] button to populate a reference channel with the difference at
each cross over point.
4. Select the Drift correction method as (All five above methods) and whether to
Despike? as (No). Click the [OK] button and the Non-Linear filter parameters
dialog is displayed. We will accept the default values and click [OK]. The
Median filter parameters dialog is displayed.
5. Specify the Rolling window width (sec) as (2) and click the [OK] button to
apply the instrument drift corrections.
6. All 5 filtered channels will be generated and each one will have an appropriate
extension identifying the method that has been applied (e.g. Non-linear
filtering - M1_drift_ref_nl, Zero order trend removal - M1_drift_ref_0tr,
First order trend removal - M1_drift_ref_1tr, Second order trend removal -
M1_drift_ref_2tr, and Median filtering - M1_drift_ref_md).
Note: This time in addition to the long wavelength drift removal, the drift has been
forced to honour the difference at each cross point.
required. The user will display the profile and identify the start and end of each
discontinuity. This information is saved in a channel named "ControlPoints". If the
drift correction GX finds a channel by this name in the database to be corrected, it
will automatically remove the discontinuities.
Scroll to Line 4 of the database (em61data.gdb) using the Database Tools Bar
( ), and notice the unsatisfactory drift correction around the
discontinuity. To overcome this problem, user interaction is required in order to
define the start and end of the discontinuities. For your convenience this information
has been saved in the channel CP. Proceed with the exercise below, or you may opt
to simply rename the channel CP to ControlPoints.
1. Display the profile of the channel M1.
2. On the profile window, place your cursor at the first discontinuity, right click
and from the popup menu select Add control point.
3. Repeat this step at each discontinuity. This process will accumulate a channel
of discontinuity points. When finished selecting the discontinuities, run the
Apply instrument drift correction dialog again, changing the Output data
channel to (M1_drift_control) and change Apply control points? to (Yes).
4. Observe the difference in the drift correction on Line 4 (L4:0).
Path Corrections
The Path Corrections menu options enable you to make any necessary modifications
to your survey path. The topics discussed in this section include:
To access additional technical information on resetting your survey path, click the
Help ( ) button on the dialog of interest. The data files used in this section include
the following files: APG_NavCorr.gdb, MasterMag.gdb, surveypolygon.ply,
APGMag_S1A001.gdb.
Warp a Database
The Warp a Database dialog enables you to fit a survey within a specified
quadrilateral. The area to be warped is determined by 4 points entered by the user.
You may enter the information manually or interactively on a map. In the latter case,
the two quadrilateral areas delineating the original and the warped areas are displayed
on the map and you are prompted to confirm the correctness of the selection, re-select
or cancel. You have to enter all four corners before being prompted.
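The warp transform itself is not documented in this tutorial; one plausible way to picture it is a bilinear mapping fitted from the four original corners to the four new corners, as sketched below. The corner coordinates and function names are hypothetical, and the actual Geosoft warp may use a different formulation.

```python
import numpy as np

def fit_bilinear_warp(src_corners, dst_corners):
    """Fit x' = a0 + a1*x + a2*y + a3*x*y (and likewise y') from the four
    original corners and the four new corners."""
    src = np.asarray(src_corners, float)
    dst = np.asarray(dst_corners, float)
    basis = np.column_stack([np.ones(4), src[:, 0], src[:, 1], src[:, 0] * src[:, 1]])
    return np.linalg.solve(basis, dst[:, 0]), np.linalg.solve(basis, dst[:, 1])

def apply_warp(x, y, ax, ay):
    x, y = np.asarray(x, float), np.asarray(y, float)
    basis = np.column_stack([np.ones(x.size), x, y, x * y])
    return basis @ ax, basis @ ay

# Enlarge the height of a hypothetical area: sides unchanged, top and bottom moved out
src = [(0, 0), (100, 0), (100, 100), (0, 100)]
dst = [(0, -10), (100, -10), (100, 110), (0, 110)]
ax, ay = fit_bilinear_warp(src, dst)
print(apply_warp([50.0], [50.0], ax, ay))
```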
In this exercise we will warp the grid S1A001 to enlarge the height of the area; that is,
we will increase the top and bottom of the area, leaving the sides the same. This
procedure uses the database (APGMag_S1A001.gdb) that was generated earlier in
this tutorial.
Note: Although you will be prompted for the name of a map to open to use for
interactive warping, it is highly recommended to open the map prior to
running the dialog, zoom in to the area of interest, and then invoke the Warp a
Database menu item. We will use the index map for this exercise.
2. Open the map (APG_Index.map) and zoom into the area of interest
(S1A001).
4. Using the [Browse] button you can locate the Database to warp
(APGMag_S1A001.gdb). Note that, if you have opened and selected a
database, it will be the default in the Database to warp box.
Note: Some parameters have both a dropdown list ( ) and a browse button ( ).
This is provided for the user’s convenience. Once a file (*.gdb, *.map, *.grd) has
been opened in a project, it will then be available in the appropriate dropdown
lists. However, if a file has not yet been opened in the current project, then
you will need to use the browse button to locate it.
5. From the Definition mode dropdown list, select the mode as (Interactive). We
will leave the Map to use parameter at the default (the open and selected map).
Click the [OK] button and the Warp rectangular grid dialog is displayed.
6. Following the instructions provided on the Warp rectangular grid dialog, click
the [OK] button and begin selecting the area to warp.
7. When you have finished selecting the 4 point pairs (4 original points and 4
new points), the Warp survey file dialog and a “map” of the original area
(green) and warped area (blue) will be displayed.
Select your data points in a counter-clockwise order, as shown above, selecting 2 points,
first the old point, then the new point, at each of the four locations.
8. If the warped area is acceptable click the [Yes] button. The data will be warped
and displayed on the current map together with the grey trace of the old path.
Note: The green polygon represents the original area and the blue polygon
represents the new warped area.
In this exercise we will demonstrate how to warp your data manually. This procedure
does not require an open map, but does require that you know both the original and
new X and Y values.
For this exercise we will warp our data back to the original position (pre-interactive
warp).
3. Using the [Browse] button you can locate the Database to warp
(APGMag_S1A001.gdb). Note that, if you have opened and selected a
database, it will be the default in the Database to warp box.
4. From the Definition mode dropdown list, select the mode as (Manual). The
Map to use parameter is for Interactive mode only and doesn’t apply to
this procedure. Click the [OK] button and the Enter lower left coordinate dialog is
displayed.
5. Specify the Original X and Y and the New X and Y values as shown above and
click the [Next>] button. Complete the dialogs using the values listed below:
6. When you have finished adding the coordinates on the four dialogs provided,
click the [Finish] button. Your data will be warped back to the original
position (pre-interactive warp).
The geometry of a survey cart is such that there is often a fixed separation between the
centre of measurement, e.g. the centre of a coil, and the location of the GPS device
used to record position. This offset is a function of the orientation of the instrument,
which normally remains fixed relative to the line heading, for instance an instrument
trailer pulled behind an operator.
The Sensor Offset Correction is generally the first correction you would run on your
data, and then you would use the corrected data to run any further corrections.
The Sensor Offset Correction menu item is used to calculate the actual X, Y
coordinates for up to 10 sensors, which are located at fixed offsets from the GPS
receiver. Up to 10 channels of data can be associated with each sensor. As each
sensor will have its unique coordinates, the offset data is copied to a new database.
Each sensor is represented in the new database with the same line number but with a
different line version.
This dialog takes as input for each sensor a fixed offset distance (both along line and
perpendicular to the line direction) and up to 10 channels that are associated with
each sensor. The heading is determined from a smoothed version of the line path; the
user can specify the distance over which to smooth the data prior to calculating the
offset; the longer the distance the smoother the calculated path. The calculated offset
is then added to the original location and the new coordinate set copied, along with
the associated channels, to a new line version.
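The geometry of the correction can be pictured as follows: estimate the heading from a smoothed version of the GPS path, then shift each position by the along-track and across-track offsets. This is only a sketch; the smoothing method and the across-track sign convention here are assumptions, not the exact UX-Process implementation.

```python
import numpy as np

def offset_sensor(x, y, along, across, smooth_npts=11):
    """Shift GPS positions to a sensor at fixed along-track/across-track
    offsets, using the heading of a crudely smoothed version of the path."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    kernel = np.ones(smooth_npts) / smooth_npts
    xs, ys = np.convolve(x, kernel, mode="same"), np.convolve(y, kernel, mode="same")

    dx, dy = np.gradient(xs), np.gradient(ys)
    norm = np.maximum(np.hypot(dx, dy), 1e-9)
    ux, uy = dx / norm, dy / norm          # unit vector along the direction of travel
    vx, vy = uy, -ux                       # across-track unit vector (sign assumed)

    return x + along * ux + across * vx, y + along * uy + across * vy
```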
3. Specify the Number of sensors in array as (1) and the Number of database
channels per sensor as (2). Using the Line to correct dropdown list, select
(Selected lines) and then specify the Output database name as
(APGMAGoffset_S1A001).
Note: You can select up to 10 sensors in an array, which can have up to 10 channels
of data associated with each sensor.
4. Click the [OK] button to continue. The Enter parameters for sensor 1 dialog is
displayed.
5. Using the Data channel 1 dropdown list, select (bottom) and from Data
channel 2, select (top). Then specify the distance between the sensor and the
GPS instrument in a relative Cartesian coordinate system where the Y-axis is
aligned with the direction of travel (i.e. the Sensor offset in direction of travel)
as (1). Then, for the Sensor offset across direction of travel, specify (0).
6. We will specify the Acceptance Threshold as (0.0159) and a Smoothing
interval for heading as (5). This will ensure that the input locations are thinned
to be no more than 5 metres apart, and then smoothly re-interpolated at five
times the average point separation to produce a curve from which the heading
at any point is determined. If left blank, then this interval is calculated to be
the greater of the input offset distance, or two times the average point
separation.
8. Click the [Yes] button and the layout is saved in your working directory as
(Multi-sensor layout.map) and the Enter parameters for sensor 1 dialog is
again displayed. Click the [Finish] button and the UCEOffsetMult dialog is
displayed.
9. This dialog is informing you that, “The layout “Multi-sensor layout.map” has
been saved in your working directory”. Click the [OK] button to correct for the
sensor offset and create and display a new offset database
(APGMAGoffset_S1A001.gdb). The Sensor offset corrections dialog is also
displayed.
10. This dialog is asking if you want to “Display Map?”. Click the [Yes] button
and the sensor offset map (APGMag_S1A001_offset_sensor1.map) will be
displayed in the background and the Local coordinate system dialog is displayed.
11. Using the dropdown lists, select the Local X and Y channel(s) as (loc_x) and
(loc_y). Click the [OK] button, the offset database (APGMAGoffset_S1A001)
and the sensor offset map (APGMag_S1A001_offset_sensor1.map) are
opened and displayed in your current project.
Note: The output database has additional channels named “Combined”. These
channels catch all the offset data, and assuming that the data is surveyed on
the same datum and has been levelled, they enable the user to grid all the data
together. Caution is advised in using these channels for further processing, as
the above 2 conditions are important in being able to use the data from
different sensors homogeneously.
The Instrument Latency Correction option is used to correct lag errors introduced by
instrument timing delays.
Typically the systematic lag could be obtained from the 6 line test. Refer to the 6 line
test map, and note the offset observed by surveying in 2 opposite directions. This
offset is plotted along the margin of the map. The X and Y locations are then offset
accordingly. The new positions are estimated by splining X and Y against time and
then using the values of the splines at the time values offset by the specified delay.
The Time channel must be ordered (increasing or decreasing) and the X, Y, and Time
channels cannot contain dummies.
Note: If you do not have a Time channel, you can use the Fiducial channel, in
which case the units will be in fiducial units (or the data sampling rate).
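The splining step described above can be sketched with SciPy as follows. The sign applied to the delay is an assumption for illustration, and, as noted, the time channel must be strictly ordered and free of dummies.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def latency_correct(time, x, y, delay):
    """Spline X and Y against time and evaluate at (time - delay), assigning
    each reading the position occupied `delay` time units earlier.  The time
    channel must be strictly increasing and free of dummies."""
    time = np.asarray(time, float)
    spline_x = CubicSpline(time, np.asarray(x, float))
    spline_y = CubicSpline(time, np.asarray(y, float))
    shifted = time - delay                 # sign convention assumed for illustration
    return spline_x(shifted), spline_y(shifted)
```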
Use the Instrument Latency Correction menu option to correct lag errors introduced
by instrument timing delays. The database (APGMag_S1A001.gdb) will be used for
this correction.
Note: At the end of this procedure we will discard the instrument latency results and
restore the “X” and “Y” channels to their original values because the data
does not require this correction.
4. From the Reference latency channel dropdown list select (TIME), and then
specify the Delay (instrument delay or lag) as (0.5).
5. From the Lines to correct dropdown list, select (Selected lines) and then
specify the Raw X backup channel and the Raw Y backup channel as (X_Raw)
and (Y_Raw). For more information on this or any parameter, click the Help
( ) button.
6. Click the [OK] button to correct the lag errors that were introduced by
instrument timing delays and a dialog will be displayed asking if you want to
“Display map?” Click the [Yes] button and the database will be corrected and
a map displaying the latency correction (APGMag_S1A001_latency.map)
will be generated.
The original “X” and “Y” channel values (“Raw” data) can be restored by specifying
“0” for the Delay value. The “Raw” data will then be copied back into the current "X"
and "Y" channels.
The Non-Systematic Lag Correction menu item enables you to interactively digitize a
polygonal area on an open map and correct the localized non-systematic lag within
the delineated area. If your data contains a time channel, you should perform the
latency correction first, and then inspect the corrected data for any further presence of
lag. The non-systematic lag can be introduced to your data by a varying data
collection rate and/or by surveying in areas of highly variable slope. The degree of
the non-systematic lag can vary from area to area on the same survey, and must be
first identified on the map prior to applying the correction.
Non-systematic lag should be corrected by examining gridded data and local profiles
to determine the region and distance offset required for the correction. The distance
value would then be entered and the X and Y locations would be offset accordingly.
Note: The Time channel must be ordered (increasing or decreasing). The X, Y and
Time channels cannot contain dummies.
The database (APGMag_S1A001.gdb) and the latency map created in the previous
section (APGMag_S1A001_latency.map) will be used for this correction.
5. Specify the Lag correction distance as (.75) and from the Lines to correct
dropdown list, select the lines to process as (Selected lines).
6. Specify the default names of the Raw X, Y backup channels as above. For more
information on this or any parameter, click the Help ( ) button.
7. Click the [OK] button; your cursor will change to a crosshair. You can now
digitize the area you wish to correct. When you have enclosed the area of
interest, (lower left quadrant) right click, and from the popup menu, select
Done.
8. The non-systematic lag correction will be applied to the data within the
selected area and a dialog is displayed asking if you want to, “Display map?”
The Navigation Error Correction tool was developed to correct GPS navigational
errors. GPS uses an array of up to 28 orbiting satellites to locate positions on the Earth's
surface. To provide precise coordinates in 3 dimensions a minimum of 4 satellites are
required. The satellite signals are received on the ground by a receiver unit that
calculates the time delay of the signal to calculate a distance from the satellite; with
the distances from all available satellites a fix of position is possible. While GPS can
be an extremely precise mode of surveying, there are potential errors that if left
uncorrected can compromise the accuracy of the data. GPS data may contain jumps,
dog-legs, and shifts that will place sensor data at the wrong coordinates.
This navigation correction tool corrects for fix quality (quality of satellite signals
being received), zero positions (no data being collected), positional double hits
(repeated position readings), time double hits (repeated time stamps) and data jumps
(radio signals from the transponders multipath or suffer interference from cultural
radar sources). These corrections can be applied to all survey lines or only user
specified lines. The original X, Y coordinate channels are renamed and saved with
the database.
1. On the Data menu, select Open database and the Open Database dialog is
displayed. Select the database (APGmag_NavCorr.gdb) and click the
[Open] button. The database is opened and displayed in a spreadsheet window
in your current project.
2. On the UX-DataPreparation menu, select Path Corrections|Navigation Error
Correction. The Path correction dialog is displayed.
3. Enter the parameters as shown above and as explained in the section “Applying
Navigation Error Corrections” above.
Note: The Correct navigation for jumps (manual)? parameter now provides three
interactive options, “Straight line”, “Poly line”, or “No correction”. Also, a
new parameter Replace/Dummy corrected points is now available. This
parameter enables you to select “Interpolate” or “Dummy out”. For more
detailed information, click the Help ( ) button.
4. Once you are satisfied with your selections, click the [OK] button. The map
file (APGmag_NavCorr_navcorr.map) is displayed in the background and
the UCENAVCORR dialog is displayed in the foreground.
5. This dialog explains how to manually correct the polyline points. Using your
left mouse button, select the points to move (right-click and select Done
from the popup menu to end the polyline point selection). Then, using the left
mouse button, select the exact same number of points to move the line “to”,
right-click and select Done from the popup menu (or press the
<Esc> key) to finish the line correction. If another line is to be corrected, the
UCENAVCORR dialog will again be displayed, asking you to select points on the
next line.
6. Select the 4th point from the top of L2 to move and then select a location
between the 3rd and 5th points from the top of L2, straightening out the line.
Your final map should look similar to the following map
(APGmag_NavCorr_navcorr.map).
The Split Master Survey Database (GDB) option enables you to split a master
database according to its constituent grids. Often, for convenience in the field,
transects (or lines) are surveyed in a continuous mode through many grid areas. This
option enables you to break down a master database into the corresponding grids.
The data files used in this tutorial are: MasterMag.gdb, APG.csv and
surveypolygon.ply. The APG.csv file is created during the Plan a UXO survey step
and defines the four grid areas of our survey.
Utilities
The Utilities options provide a variety of tools for pipeline detection, selection and
removal, velocity calculations (determining the speed in which the operator walked
the survey) and a 60Hz power line filter to remove any 60 Hz power line interference
in the collected data.
This section describes how to apply the tools discussed above using the UX-Process
system. Topics discussed in this section include:
• Pipeline Detection and Removal (page 103)
• Interactive Pipeline Selection (page 105)
• Velocity Calculation (page 107)
• 60 Hz Power Lines Filter (page 108)
To access additional technical information on the Utilities parameters, click the Help
( ) button on the dialog of interest. The data files used in this section are the database
(NoisyMag858_S1A001.gdb) that was provided with the installation and the Mag
database (APGMag_S1A001) created earlier.
The Pipeline Detection dialog enables you to produce a polygon file (*.ply)
containing linear features extending over 3 survey lines or longer.
To find trending structures, you specify a preferred angle in the data. Separate trends
are located for each continuous feature. The resulting trends are saved in a polygon
file named pipelines.ply and can be overlain on coinciding maps. The pipeline
polygon may later be used to mask the signal out of the database.
The data files used in this tutorial are the (APG_Index.map), and the (pipeline.gdb).
Note that, to properly display the pipeline information on the Index map, hide the
groups previously displayed in the S1A001 area of the Index map.
1. Open the Index map (APG_Index.map) and zoom into the S1A001 grid area.
Then, open and select (highlight) the database to process (pipeline.gdb).
2. On the UX-DataPreparation menu, select Utilities|Pipeline Detection and
Removal. The Pipeline detection and removal dialog is displayed.
3. Using the Survey Data channel dropdown list, select the (STD_4_4) and then
specify the General direction of pipelines as (135).
4. Specify the Average line spacing as (1.6). This entry defaults to the line
spacing specified in the planning stage, which is not satisfactory for our
example. The Average signal width for the typical pipeline signal defaults to
four times the line spacing and should be specified as (6.4). Select the Output
pipeline polygon as (Append) and click the [OK] button. The UCEPipeline
dialog is displayed.
5. This dialog tells you that the “Pipeline.ply” file has been generated and saved
in your working directory. Click the [OK] button and the pipeline.ply file will
be displayed on the APG_Index.map and the UCEPipeline dialog is
displayed.
Note: If the Index map were not already opened and displayed in your project, then
it would be opened and displayed with the UCEPipeline dialog.
6. This dialog asks if after the calculation of the pipelines, you want to remove
them from the data. Click the [Yes] button and the UCEPipeline dialog is
displayed.
7. This dialog tells you that the pipelines, removed from the data, are stored in a
new channel in the database (STR_4_4_pipermv). The pipelines are also
removed from the map (APG_Index.map). Click the [OK] button to continue.
8. In the database window, you can display the profiles before (STD_4_4) and
after (STR_4_4_pipermv) the pipeline removal and observe the difference.
Note: The signal removal can also be executed at a later time by invoking the
ucepipelineremove GX.
The Interactive Pipeline Selection menu item enables you to manually draw
pipelines, save them to a file, display them and remove them if selected.
1. Open and select (highlight) the database to process (pipeline.gdb). Then, open
the Index map (APG_Index.map) and zoom into the S1A001 grid area.
2. On the UX-DataPreparation menu, select Utilities|Interactive Pipeline
Selection. The Interactive pipeline selection dialog is displayed.
3. The Map Name should default to the open selected map (APG_Index.map);
if not, use the dropdown list to select the map.
4. In the Output Pipeline File Name box, specify the name of the output *.ply file
as (Pipeline.ply) and select the Output Mode as (Overwrite). Click the [OK]
button and the Pipeline points dialog is displayed.
5. This dialog instructs you to pick your pipeline polyline points and press the
<Esc> key when done. Repeat this procedure, clicking the [Yes] button on the
UceInterPipeline dialog to add more pipelines and finally clicking the [No]
button when you have completed your pipeline selections.
6. The UceInterPipeline dialog is displayed and asks if after the calculation of the
pipelines, you want to remove them from the data. Click the [Yes] button and
the Interactive pipeline selection dialog is displayed.
7. Use this dialog to select the Database FileName that you want to remove the
pipelines from (pipeline.gdb) and click the [Next] button. The Interactive
pipeline selection dialog is displayed.
8. Using the dropdown list, select the Data channel as (STD_4_4) and in the
Pipeline position Tolerance (m/feet) box, specify (0.2). Click the [OK] button
and the UceInterPipeline dialog is again displayed.
9. This dialog tells you that the pipelines, removed from the data, are stored in a
new channel in the database (STR_4_4_pipermv). The pipelines are also
removed from the map (APG_Index.map). Click the [OK] button to continue.
10. In the database window, you can display the profiles before (STD_4_4) and
after (STR_4_4_piperemv) the pipeline removal and observe the difference.
Velocity Calculation
The Velocity Calculation option enables you to calculate the speed at which the
operator has walked the survey, given ground coordinates and a time channel.
The velocity of the survey at each survey point is calculated from the X, Y and Time
channels. The calculated database channels (Velocity and Point2Point) will be
added to the current database. A map of the velocity can be displayed, which color
codes the average velocity within a tolerance as green, above the tolerance as blue
and below as pink. In the following example we will select the database file
“APGMag_S1A001.gdb” and select “Yes” to display the map. The expected velocity
of the survey should be approximately “1.4 m/sec” and we will specify the acceptable
tolerance as “0.4”.
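The underlying calculation is simple: each Point2Point value is the distance between consecutive survey points, and each Velocity value is that distance divided by the elapsed time. The following is a minimal, illustrative Python sketch of that idea; it is not the Geosoft implementation, and the example channel values are assumptions.

import numpy as np

def velocity_channels(x, y, t):
    """Return point-to-point distance and velocity from X, Y and Time arrays.

    x, y are ground coordinates (m) and t is time (s), one value per survey point.
    The first sample has no previous point, so its values are left as NaN (dummy).
    """
    x, y, t = np.asarray(x, float), np.asarray(y, float), np.asarray(t, float)
    point2point = np.full_like(x, np.nan)
    velocity = np.full_like(x, np.nan)
    dist = np.hypot(np.diff(x), np.diff(y))   # distance between consecutive points
    dt = np.diff(t)
    point2point[1:] = dist
    velocity[1:] = np.where(dt > 0, dist / dt, np.nan)
    return point2point, velocity

# Example: flag samples outside the expected 1.4 +/- 0.4 m/s walking speed
p2p, vel = velocity_channels([0, 1.3, 2.7], [0, 0.1, 0.1], [0.0, 1.0, 2.0])
flags = np.abs(vel - 1.4) > 0.4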
3. Using the dropdown list, select the Time Channel as (Time) and the Velocity
Unit as (m/sec) and click the [OK] button. A dialog is displayed asking,
“Would you like to create Velocity Map?”
4. Click the [Yes] button and the velocity is calculated and put in the “Velocity”
channel in the current database and the Velocity dialog is displayed.
5. Using this dialog specify the Expected Velocity as (1.4) and the Tolerance as
(0.4) and click the [OK] button.
6. The map (APGMag_S1A001_VelocityMap.map) is created and displayed in
your current project and the channels (Velocity and Point2Point) have been
calculated and added to the current project database. Note, to display these
channels, select an empty channel header cell, right-click and from the popup
list select the channel and click the [OK] button.
The 60 Hz Power Line Filter option enables you to remove any 60 Hz power line
interference from your data.
When surveying in the vicinity of a power line, the power line may interfere with the
data and a systematic chatter may be observed. If you do see chatter in the data and
you are aware that the data was surveyed near a power line, please run this filter to
clean up the data. Note that, no other frequencies are affected by the filter.
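As a rough illustration of the idea only (not the UX-Process implementation), a narrow notch filter centred on 60 Hz could be applied as in the Python sketch below; the 960 Hz sample rate and the channel names are assumptions made for the example.

import numpy as np
from scipy.signal import iirnotch, filtfilt

def remove_60hz(data, sample_rate_hz, quality=30.0):
    """Apply a narrow 60 Hz notch filter; other frequencies are left essentially untouched."""
    b, a = iirnotch(w0=60.0, Q=quality, fs=sample_rate_hz)
    return filtfilt(b, a, data)   # zero-phase filtering, so no time shift

# Example with an assumed 960 Hz sampling rate
t = np.arange(0, 2, 1 / 960.0)
noisy_mag = np.sin(2 * np.pi * 3 * t) + 0.2 * np.sin(2 * np.pi * 60 * t)
noisy_mag_filt = remove_60hz(noisy_mag, 960.0)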
3. This dialog enables you to select from dropdown lists the Time channel
(TIME) and the Channel to filter (noisy_mag). Then, specify the Output
channel as (noisy_mag_filt) and click the [OK] button.
4. The channel (noisy_mag) is filtered and the 60 Hz power line interference is
removed from the data and the results are displayed in the output channel
(noisy_mag_filt).
QC QA Tools
The UX-Process system includes Quality Control and Quality Assurance (QC QA)
tools that enable you to determine the quality of your data.
This section describes how to apply the quality control procedures using the UX-
Process system. Topics discussed in this section include:
• Statistics (page 110)
• Subwindow Statistics (page 113)
• Sample Separation (page 115)
• Foot Print Coverage (page 116)
• Noise Threshold (page 118)
• Total Coverage Comparison (page 120)
• Calculate Traverse Gaps (page 122)
The Statistics menu item enables you to calculate and save a statistics summary
report of selected channels and look at the histogram of the selected channels. This
option will produce a statistical summary of the selected lines. You can use the line
selection tools in the Oasis montaj Data menu to select lines to be included or, if you
are using a GS script for batch processing, use the SELECT command.
Channels
You can obtain a summary statistics report for a single channel or for multiple
channels. Single channels can be selected from a dropdown list. Multiple channels
can be entered manually; however, a comma must separate each channel name.
Channels whose type is set to STRING will not be calculated and only the channel
names will be output to the summary file.
Statistics
If X and Y channels are present in the database, then their statistics are always
included in the statistic summary report. Statistics calculated for the X and Y
channels include the minimum and maximum values, total distance and the number of
points and dummy values.
The statistics calculated for each additional channel include the minimum value, the
maximum value, the arithmetic mean, the standard deviation, the mode, the standard
deviation of the first and fourth differences, and the number of valid points. These
statistics are calculated for each line separately and for the entire selected data set.
The mode is useful to check the levelling of the data, while the standard deviation of
the first and fourth differences can be inspected to identify lines with excessive noise.
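For reference, a minimal Python sketch of the per-channel statistics described above (illustrative only, not the Geosoft implementation; dummy values are represented here as NaN) could look like this:

import numpy as np

def channel_stats(values):
    """Summary statistics for one channel of one line, ignoring NaN dummies."""
    v = np.asarray(values, float)
    v = v[~np.isnan(v)]
    d1 = np.diff(v, n=1)              # first differences
    d4 = np.diff(v, n=4)              # fourth differences
    # crude mode estimate taken from a histogram of the valid values
    counts, edges = np.histogram(v, bins=50)
    mode = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
    return {
        "min": v.min(), "max": v.max(), "mean": v.mean(), "std": v.std(),
        "mode": mode,
        "std_1st_diff": d1.std(), "std_4th_diff": d4.std(),
        "n_valid": v.size,
    }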
3. From the Channels to Use dropdown list, select the channels you want to
include in the statistical report as (Choose from List).
4. In the Output filename for statistics summary report box we will accept the
default file name (APGMag_S1A001_stats.txt). Note that, the default file
name is the database name followed by “_stats” and is saved in the current
project directory.
5. From the Display channel histogram(s) dropdown list select (Yes) and click
the [OK] button. The Select Channels to Process tool is displayed.
6. Using the arrow buttons, select the channel(s) to process. In our case, select
the channel (top) and click the [OK] button; the statistics will be calculated
and displayed in a text file in your default text editor and as a histogram.
Subwindow Statistics
The Sub-window Statistics option enables you to interactively select a subset area of
your data from a map window and produce a statistics summary report (see Statistics
- Summary Report, page 110) of the subset data. If you already have a map that you
would like to use for defining the sub-window, please open it prior to running this
dialog.
5. Click the [OK] button to begin selecting the area to subset (e.g. top left
quadrant). When you have completed selecting the area of interest, right click
and from the popup menu, select Done. The statistics will be calculated and
displayed in a histogram and in a text file in your default text editor.
Sample Separation
The Sample Separation menu item enables you to identify data points that exceed a
specified maximum data separation.
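Conceptually the test is a distance check between consecutive points along each line; a minimal, illustrative sketch (assuming simple X and Y arrays with no dummies) is shown below.

import numpy as np

def sample_separation_flags(x, y, max_separation):
    """Flag the gaps between consecutive samples that exceed max_separation.

    Returns a boolean array with one entry per gap (length len(x) - 1).
    """
    sep = np.hypot(np.diff(np.asarray(x, float)), np.diff(np.asarray(y, float)))
    return sep > max_separation

# Example with the 0.12 m tolerance used in this tutorial
flags = sample_separation_flags([0.0, 0.1, 0.3], [0.0, 0.0, 0.0], 0.12)  # -> [False, True]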
3. Using the Line selection dropdown list, select the lines you want to include as
(Selected lines) and specify the Maximum sample separation as (0.12).
4. Using the browse button ( ) locate your Cultural Mask File as
(CulturalMask.ply) and from the New map? dropdown list, select (Create
new map) and then click the [OK] button.
5. The data points that exceed the specified maximum sample separation will be
identified and displayed on the map (APG_DATASEP_S1A001.map).
Note: The points that are further apart than the specified separation are flagged in
blue, along the line. This information is also provided in the legend (as shown
above).
The Footprint Coverage option enables you to shade the area covered by the footprint
of the instrument.
This option grey shades the area covered by the footprint of the instrument, enabling
you to see the surface not covered by the survey. It is recommended that you
superimpose the cultural data on the shaded footprint prior to making any decisions
on the coverage.
Note: In order to clearly see the Footprint Coverage you can hide the Sample
separation information on the (APG_DATASEP_S1A001.map) using the
Map View/Group Manager ( ) tool.
1. Open the database (APGMag_S1A001.gdb) and open and zoom into the
centre of the (APG_DATASEP_S1A001.map).
2. On the UX-DataPreparation menu, select QC QA Tools|Foot Print Coverage.
The Footprint coverage dialog is displayed.
Noise Threshold
The Noise Threshold option enables you to identify sections of the survey that exceed
a specified first or fourth difference noise tolerance or where data is negative. The
negative values test is useful in identifying sections of the survey where instrument
failures are indicated by negative data values.
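As an illustration of the difference test only (not the UX-Process code), noisy or negative samples could be flagged as in the following sketch.

import numpy as np

def noise_flags(values, tolerance, order=4):
    """Flag samples whose n-th difference magnitude exceeds the tolerance.

    The flag array has the same length as the input; samples with negative
    values are also flagged, as in the negative-values test.
    """
    v = np.asarray(values, float)
    diff = np.abs(np.diff(v, n=order))
    flags = np.zeros(v.size, dtype=bool)
    flags[order:] = diff > tolerance   # attribute each difference to its last sample
    flags |= v < 0                     # instrument-failure (negative value) test
    return flags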
Note: In order to clearly see the Noise Threshold you can hide the previous option,
the Footprint Coverage. Using the Map View/Group Manager ( ) tool,
hide the groups (UceFootprintCov) and (Path_).
3. Using the dropdown lists, select the available parameters, as shown above, and
then click the [Next>] button. Depending on the test selected, you may be
prompted with a second dialog (Signal noise test…) that requires a tolerance
threshold. In our case the Signal noise test (4th difference) dialog is displayed.
4. Enter (0.5) for the 4th-difference tolerance and click the [Done] button. The
sections of the survey that exceed this noise threshold will be identified and
displayed on the map and saved to the database in the FLAG_NOISE channel.
5. The noisy data will also be displayed and flagged on the current map
(APG_DATASEP_S1A001.map).
Note: The line path is displayed with a black line on the map, and the sections of the
survey that exceed the noise threshold are displayed with a red dotted line.
The Total Coverage Comparison menu item is used to calculate the total area of
coverage from one or multiple grid areas. The total area is calculated by counting the
number of grid cells with at least one measurement and multiplying that count by the
area of each cell.
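A minimal, illustrative sketch of this cell-counting calculation (not the Geosoft implementation) is:

import numpy as np

def total_coverage_area(x, y, cell_size):
    """Total covered area: number of occupied grid cells times the cell area."""
    ix = np.floor(np.asarray(x, float) / cell_size).astype(int)
    iy = np.floor(np.asarray(y, float) / cell_size).astype(int)
    occupied = {(i, j) for i, j in zip(ix, iy)}   # cells with at least one measurement
    return len(occupied) * cell_size ** 2

# Example with the 0.35 m cell size used in this tutorial
area = total_coverage_area([0.1, 0.2, 0.9], [0.1, 0.15, 0.9], 0.35)  # 2 cells -> 0.245 m^2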
Note: In order to clearly see the Total Coverage Comparison you can hide some (or
all) of the groups on the map. Using the Map View/Group Manager ( )
tool, hide (un-check) the ‘data’ groups displayed on the (APG_Index.map).
3. Using the dropdown lists, select to Use Survey Info as (Yes) and select the
Input Data as (GDBs) and then select (Use current map) for New map?.
4. Click the [Next] button and depending on the Input data selected, a second
dialog will be displayed. In this case, the GDB plot dialog will be displayed.
5. From the Input set dropdown list, select the files to use to calculate the total
coverage area as (Current GDB). Click the [Options] button to set the display
options. The Plot Sample Coverage Grid dialog is displayed.
6. Using the Data channel dropdown list, select (bottom). Then, for the Grid cell
size specify (0.35).
Note: The survey area is divided into cells of this size. Within each cell the number
of valid data points is summed. Optimally, the cell size value should be
slightly smaller than the average line spacing. For more information, click the
Help ( ) button on the Plot sample coverage grid dialog.
7. Use the [Ranges] and [Colours] buttons to specify the parameters for the grid
cells. Then, click the [<Back/Done] button, to return to the GDB plot dialog.
Click the [Next] button to plot the results on the map.
8. The APG_Index.map displays the total coverage grids, which represent the
total coverage of the survey area. The Total Coverage grid information is
displayed in the map legend, as shown below:
Use the Calculate traverse gaps menu option to calculate line segments to fill in any
large gaps between survey lines. The calculated line segments are placed <Nominal
separation> apart between the two survey lines and are numbered by the number of the
survey line to their left plus 1000.x, where x distinguishes between segments that lie
between the same two survey lines (note that, left and right are determined by facing
along the direction of the survey). Each subsequent line segment along the same two
lines is named using the same convention, but its version number is incremented.
Two new channels are generated (Closest_Left and Closest_Right) and added to the
current database. At each survey point these values depict the closest points on the
two adjacent lines. The data file used in this tutorial is APGMag_S1A001.gdb.
3. From the Line selection dropdown list, select (Selected lines). Then, for the
Nominal separation specify (0.3) and for the Minimum segment length specify
(0.6).
4. Using the Recalculate distances? dropdown list, select (Yes) and two new
channels will be added to the database, Closest_Left and Closest_Right. At
each survey point the values in these channels represent the closest points on
the two adjacent lines. For more information, click the Help ( ) button.
5. From the Store which points in segment? dropdown list, select (First and Last
Only) to store only the first and last points of each line segment. Note that you
also have the option of storing every point of each segment in the database by
selecting "All".
6. We will leave the Cultural Mask (optional) parameter blank, and no cultural
mask will be applied. However, if a polygon file (*.ply) is selected the file will
act as a cultural mask for the map and no line segments will be generated
within the polygon(s) defined in the file.
7. Using the New map? dropdown list, select (Create new map) and click the
[OK] button. The two new channels will be calculated and placed in the
current database and the map (APG_FILLGAP_S1A001.map) will be
displayed. However, if your database was not large enough to accommodate all
the calculated data the UCEFILLGAPS dialog will also be displayed in the
foreground of your project.
8. This dialog notifies you that your database has grown (in our case 2 times the
original size) and that all of the new lines have been saved in your current
database. To continue, click the [OK] button to close the dialog.
9. The calculated line segments are placed every <Nominal separation> units
from the left survey line, until the right survey line is reached.
10. The lines are named by taking the number of the line to the left and adding
“1000” (left and right are determined by facing along the direction of the
survey).
Progress Reporting
The Progress Reporting options are administrative tools that enable you to
systematically keep track of the survey progress by date and area. Optional
comments for each entry may also be added. Also available are options to view the
project's audit logs and QC reports. The following options are available for use:
• Create Progress Status Table (page 125)
• Load Progress Status (page 126)
• Update Progress Status (page 127)
• View Audit Log (page 128)
• View QC Report (page 129)
The Create progress status table option should be run at the start of a project to
create a new progress status table or modify the default version. Note that, a warning
dialog will be displayed informing you that any changes you have already made to
your progress status will be lost when this option is run.
The index map (APG_Index.map) can also be used to update the Progress status on
the individual grids of the project by right-clicking on a grid area and selecting
Progress status from the popup menu. To access additional technical information on
the progress status, click the Help ( ) button.
Use the Create progress status table option to create/modify the current (or default)
progress status table file. This option enables you to add new items to the table and
provide a colour for each or modify the colour of the existing table items.
2. You can use this dialog to create/modify the current (default) progress table,
and then click the [SaveTable] button to save it to your local working
directory.
3. To view the original Colour from the default (uceprogresstatus.csv) file, click
the [GetOrigColor] button.
4. To modify the colour, select the Progress table item, click inside the Color box,
use the Color tool to select the new colour, click the [OK] button, and then
click the [UpdateTblColor] button. This new colour will be added to the
progress status table.
5. Click the [Done] button and the Progress table dialog is displayed.
6. To save the current table, click the [Yes] button. The table is saved in your
current working directory.
Use the Load progress status option to load the current (or default) progress status on
the Index map.
The Update Progress Status option enables you to update the progress status directly
on the Index map. You can access the Progress Status options in two ways: either by
selecting the UX-DataPreparation|Progress Reporting|Update Progress Status menu
item, or by placing your mouse over the APG_Index.map, clicking the right mouse
button, and selecting Progress Status from the popup menu.
2. Click the [Yes] button to refresh the legend and the UCEPROGRESSTATUS
dialog will be displayed. This dialog tells you to, “Left mouse click on grid to
update report (ESC when done)”.
3. Click the [OK] button and then click on the grid area of the APG_Index.map
where you want to update the progress status and the Update progress status of
UXO survey dialog will be displayed.
4. Using the Update progress stage dropdown list, select the progress stage, and
in the Date of completion box, specify the date of the completion of that stage
(Note that, today's date is the default date displayed).
5. Click the [OK] button and the map will be updated with the current progress
displayed. To change another grid area click on the map again, or to quit, press
the <Esc> key and the UCEPROGRESSTATUS dialog will be displayed.
6. This dialog informs you that the progress status has been saved in
"APG_Progress_status.csv" in your current project directory.
The Audit Log records all of the parameters used and the processes performed on
every database and is stored within the individual database files. The View Audit
Log menu item copies the contents of the stored log to an RTF file for convenient
viewing, editing and printing. Changes made to the displayed RTF file are not stored
in the database file.
1. Open and select (highlight) a database (*.gdb) file that you want to view the
Audit Log for. Then, on the UX-DataPreparation menu, select Progress
Reporting|View Audit Log menu. The Audit log is displayed in RTF format.
View QC Report
The QC Report contains quality control test parameters and results that can be
used to verify the quality of your project data. The QC report option is set to
(No) by default. The QC report file "QCReport.log" is stored in the current project directory.
The following UX-Process options can be included in the QC Report, if they are run:
Instrument tests
• Static Test... (ucecalibrate.gx)
• Instrument Response... (uceresponsed.gx)
• Optimum Sensor Height... (uceheight.gx)
• Repeatability... (uceposition.gx)
• Azimuth Test... (uceazimuth.gx)
• Octant Test... (uceoctant.gx)
• Navigation Cross Test... (ucenavcross.gx)
Utilities
• Velocity calculation... (ucevelocity.gx)
QC QA tools
• Sample Separation... (ucedatasep.gx)
• Foot Print Coverage... (uceFootprintCov.gx)
• Noise Threshold... (ucenoise.gx)
EM Data
• Check Time Gate Data ... (uceemverify.gx)
• Time Constant Calculation... (uceemtau.gx)
The QC Report is stored in your current project directory. This viewing menu item
displays the contents of the stored log in your default text editor for convenient
viewing and printing. Note that, at least one quality control test (see QC Report
above) must be run before you can view the QC Report.
1. After you have run at least one of the QC options listed above, on the UX-
DataPreparation menu, select the Progress Reporting|View QC Report menu.
The QCReport.log is displayed in your text editor. This file will include the
test results of all the quality control options that you have run in your current
project.
The Meandering Path|Density Analysis option calls the Density Analysis of the
Meandering Path package. For a more detailed, technical description of the
Meandering Path Density Analysis method, see the Ordnance and Explosives
Meandering Path Program (OE_MPP) User Guide provided with the OE_MPP package.
To access additional technical information on the other DoD tools parameters, click
the Help ( ) button on the dialog of interest. This tutorial uses the dataset:
DensityAnalysis.gdb
3. The Distance units for the project data are displayed for reference purposes;
this parameter cannot be modified.
Note: For more information, click the Help ( ) button on the Meandering Path
Density Analysis dialog.
4. The Site boundary definition can be specified by a DXF file, coordinate
corners, or data boundary. From the dropdown list select (Coordinate
corners) and click the [Next>] button. The Site boundaries dialog is displayed.
5. Enter the coordinates (0, 0, 100, 100) and click the [Next>] button. The Survey
information dialog is displayed.
6. Using the [Browse] button, select the Input data GDB (DensityAnalysis.gdb).
Note that, if you have opened and selected a database, it will be the default in
the Input data GDB box.
7. Using the Channel to process dropdown list, select (z). You can leave the
remaining parameters at their default values and click the [Next>] button; the
Output information dialog is displayed.
8. Specify the names of the new Anomaly text file, ADF text file and Grid file as
shown in the dialog above. Note that, if file names are provided, the
corresponding information will be saved in the specified format, otherwise this
information is not saved.
9. We will leave the Smoothing factor and Grid spacing to the default values and
click the [Finish] button to calculate the density analysis.
The AQAPS system is automated as a stand-alone ACCESS database for use and distribution
at all Navy UXO cleanup sites.
The AQAPS Administrator dialog can be called directly from the UX-Process menu.
Please refer to the AQAPS Manual provided with the AQAPS package, installed by
default in “C:\Program files\AQAPS” unless modified by the user during the
installation process. If you do not have AQAPS installed, you will be presented with a
message telling you the program cannot be located.
Invoke AQAPS
EM Data
The EM Data options currently consist of the Geophex and Geonics instrument- and
methodology-specific calculations. In both instances the instrument manufacturers
supplied the formulae. We implemented these calculations under the scope of the
project to facilitate the flow of the processing within Oasis montaj.
This section describes how to apply instrument specific calculations using the DoD
UX-Process system. Topics discussed in this section include:
• Geophex Conductivity/Susceptibility (page 135)
• Check Time Gate Data (page 137)
• Time Constant Calculations (page 139)
• Display Decay Curves (page 141)
Geophex Susceptibility/Conductivity
This process uses the Geophex routines for the calculation of relative magnetic
susceptibility and apparent conductivity from the in-phase and quadrature EM data.
In addition the Q-Sum (the straight sum of the quadrature channels) and Q-Spread
(the sum of the absolute difference between all frequency pairs of the quadrature
channels) are also calculated.
The primary reason for using the quadrature channels for the detection of UXO
targets is that quadrature channels do not respond to magnetic geology or to
ground effects. By definition a UXO detection tool should ignore the geologic noise
and respond strictly to metallic objects. Metallic UXO objects are characterized by a
positive quadrature peak within the operating frequency band.
Q-Sum can detect metallic targets embedded in a magnetic geology very effectively.
UXO targets will have consistent responses on all quadrature channels, and adding
them up into a single channel helps to better delineate the particular targets of
interest.
Q-Spread is effective in identifying metallic objects in soils that are both conductive
and magnetically susceptible. Metallic objects will have a high Q-spread response.
The background response of these soil types correlates to the in-phase and varies
weakly with the frequency.
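The Geophex susceptibility and conductivity routines themselves are supplied by the manufacturer, but the Q-Sum and Q-Spread quantities follow directly from the definitions above; the illustrative Python sketch below (channel names and frequencies are assumptions taken from this example) computes them.

import numpy as np
from itertools import combinations

def q_sum_and_spread(quadrature):
    """Q-Sum and Q-Spread from a dict of quadrature channels keyed by frequency (Hz).

    Q-Sum is the straight sum of the quadrature channels; Q-Spread is the sum of
    the absolute differences between all frequency pairs.
    """
    chans = {f: np.asarray(v, float) for f, v in quadrature.items()}
    q_sum = sum(chans.values())
    q_spread = sum(np.abs(chans[a] - chans[b]) for a, b in combinations(sorted(chans), 2))
    return q_sum, q_spread

# Example with three of the frequencies used in this tutorial
quad = {330: np.array([1.0, 2.0]), 930: np.array([1.1, 1.9]), 2790: np.array([0.9, 2.2])}
qsum, qspread = q_sum_and_spread(quad)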
5. Then, in the first Frequency box specify (330) and the auto-detect will locate
the associated In-phase and Quadrature channels (i.e. I330 and Q330).
If the detected channels are not the ones you intended, then you can
individually specify them using the dropdown lists.
6. Continue specifying the Frequency as (930, 2790, 8190, and 20010). When
you have finished adding all of the frequency information, click the [OK]
button. The relative magnetic susceptibility and apparent conductivity of the
in-phase and quadrature EM data are calculated and saved in the current
database.
7. In addition the QSum and QSpread channels are calculated from the above
input and saved in the current (gem3_S1A001.gdb) database. Note that,
Geophex has provided the algorithms used.
The Check Time Gate Data dialog is used to verify the time gate data. The EM61 data
decays with time; as a result, each subsequent time gate must have a smaller amplitude.
This test simply checks that each subsequent time gate channel data is smaller than
the previous one. If this is not the case, the value of that time gate is set to DUMMY,
because it is assumed that it is noise rather than signal. Note that, the original data is
retained in the database.
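A minimal, illustrative sketch of this decay check (not the Geosoft implementation; DUMMY values are represented here as NaN) is:

import numpy as np

def check_time_gates(gates, threshold):
    """Verify that each subsequent time gate is smaller than the previous one.

    `gates` is a list of arrays ordered from earliest to latest time gate.
    Values that fail the decay test (and are above the threshold) are set to NaN;
    values at or below the threshold are treated as noise and left untouched.
    """
    out = [np.asarray(g, float).copy() for g in gates]
    for i in range(1, len(out)):
        prev, cur = out[i - 1], out[i]
        bad = (cur >= prev) & (cur > threshold)
        cur[bad] = np.nan
    return out

# Example with the Acceptance Threshold of 2 used in this tutorial
chans = check_time_gates([[10.0, 5.0], [8.0, 6.0], [9.0, 1.5]], threshold=2)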
This procedure uses the database (em61mk2_S1A001.gdb) that was created when we
imported the EM61 MK2 data. Note that the EM data is noisy and should be levelled
prior to running the “Check Time Gate data” dialog.
3. Using the dropdown lists, select the Time gate channels as shown above and
specify the Acceptance Threshold as (2). Then, select (Yes) from the Force
verification dropdown list and click the [OK] button.
Note: Values below the Acceptance Threshold are not compared and are left
untouched. The acceptance threshold should be set above the noise level
because the values in the noise will not always show decay.
4. The channels are examined to ensure that each subsequent time gate channel
data is smaller than the previous one. If this is not the case then the value of
that time gate is set to DUMMY. Note that, the original data is retained in the
database as (_Chan1, _Chan2, _Chan3, and _Chan4).
5. Review the results by scrolling down the database and observe the consecutive
channels (Chan1, Chan2, Chan3, and Chan4) and notice that some of the
output data points are set to null (DUMMY) values, as shown below.
This process uses the Geonics routines to calculate the apparent time constant. The
apparent time constant normalizes the time decay curve into a single number. It is
based on the assumption that the target response or part of it is exponential. Although
the assumption is rarely correct, normalizing the target response removes the
difference in the response magnitude and maintains a closer similarity between
targets of the same type but located at different depths.
This tutorial uses the same database as used above in the Check time gate data dialog
(em61mk2_S1A001.gdb).
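The exact Geonics formula is not reproduced here. As an assumption made for illustration only, if the decay between two gates is treated as exponential, V(t) = V0·exp(−t/τ), then τ = (t2 − t1) / ln(V1/V2) for each gate pair, which is what the sketch below computes.

import numpy as np
from itertools import combinations

def apparent_tau(times_us, gates, threshold):
    """Apparent time constants for every pair of time gates.

    Assumes an exponential decay V(t) = V0 * exp(-t / tau), so that for a pair of
    gates tau = (t2 - t1) / ln(V1 / V2).  Values at or below the acceptance
    threshold are ignored (returned as NaN).  `times_us` are the gate times and
    `gates` the corresponding data arrays.
    """
    gates = [np.asarray(g, float) for g in gates]
    taus = {}
    for i, j in combinations(range(len(gates)), 2):
        v1, v2 = gates[i], gates[j]
        valid = (v1 > threshold) & (v2 > threshold) & (v1 > v2)
        tau = np.full(v1.shape, np.nan)
        tau[valid] = (times_us[j] - times_us[i]) / np.log(v1[valid] / v2[valid])
        taus[(times_us[i], times_us[j])] = tau
    return taus

# Example with the gate times used in this tutorial (216, 366, 666 and 1266 microseconds)
taus = apparent_tau([216, 366, 666, 1266],
                    [[100.0], [60.0], [30.0], [10.0]], threshold=2)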
3. Using the dropdown lists, select the Channel of time gate(s) 216, 366, 666,
1266 as (Chan1, Chan2, Chan3, Chan4). Specify the Acceptance Threshold
as (2) and then specify the Output TAU channel prefix as (Tau).
4. Click the [OK] button and the VerifyEM data dialog is displayed. Note that, as
we have already verified the time gate data we can accept the default channels
and click the [Yes] button.
5. The time constants will be calculated for all the channel combination pairs and
the outcome will consist of 1 to 6 channels.
The Display Decay Curves option enables you to plot the decay curve, as an array
channel, in your current database. The slope and intersection are also calculated and
saved in the current database.
The EM instrument may have up to 26 time gates. These time gates can be combined
in a single vector and presented as a decay curve right inside the database. After you
click the [OK] button you will notice an additional “array” channel in the database
named DecayCurve. In addition, the intersection and slope of the log of the decay
curve are also calculated and saved under the _Intersection and _Slope channels.
The Display decay curves option generates three new channels, DecayCurve, _Slope
and _Intersection. The Slope and Intersection channels are calculated from the base-
10 logarithm of the data.
The exponential time-decay is represented as:
S(V,t) = ( Σ i=0..n−2 [ log10(V(t i+1)) − log10(V(t i)) ] / [ log10(t i+1) − log10(t i) ] ) / (n−1)
I(V,t) = ( Σ i=0..n−1 [ log10(V(t i)) − S(V,t) · log10(t i) ] ) / n
where V(t i) is the data value at time gate t i, n is the number of time gates, S is the
slope and I is the intercept of the log decay curve.
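A small, illustrative sketch of this averaging calculation (not the Geosoft implementation; the values are assumed to be positive) is:

import numpy as np

def decay_slope_intercept(times, values):
    """Average slope and intercept of the log10 decay curve over n time gates.

    Follows the averaging formulas quoted above: the slope is the mean of the
    segment-by-segment log-log slopes, and the intercept is the mean of
    log10(V) - S * log10(t).
    """
    t = np.log10(np.asarray(times, float))
    v = np.log10(np.asarray(values, float))
    slopes = np.diff(v) / np.diff(t)
    s = slopes.mean()                 # S(V, t)
    i = (v - s * t).mean()            # I(V, t)
    return s, i

# Example: a decay sampled at the four EM61 MK2 gate times
s, i = decay_slope_intercept([216, 366, 666, 1266], [100.0, 60.0, 30.0, 10.0])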
For more information, click the Help ( ) button on the Display Decay Curves
dialog.
Target Selection
The UX-Process system includes two target selection options, Automated Anomaly
Pick Along Profiles and Find Peak Dipoles. For more detailed information about
these options, click the Help ( ) button on the dialog of interest.
The Automated anomaly pick along profiles option enables you to locate anomalies
(peaks) in line data. Using this option, you specify a base level and a minimum
amplitude; any contiguous group of three or more values that exceed the base level,
where the maximum value is not the first or last value and exceeds the base level
plus the minimum amplitude, is returned as an anomaly. Once the anomaly is located,
the width of the anomaly is also calculated.
The results including the “current” X and Y locations and peak amplitudes are saved
in a new “Target” group (line).
Note: This option will not work for total field magnetic data that displays dipolar
anomalies; it will only pick positive peaks.
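For clarity, the picking rule described above can be sketched as follows (illustrative only, not the GX implementation):

import numpy as np

def pick_anomalies(values, base_level, min_amplitude):
    """Return (start, peak, end) index triples for anomalies in one line of data.

    An anomaly is any contiguous run of three or more values above the base level
    whose maximum is not the first or last value of the run and exceeds
    base_level + min_amplitude.
    """
    v = np.asarray(values, float)
    above = v > base_level
    anomalies = []
    i = 0
    while i < v.size:
        if not above[i]:
            i += 1
            continue
        j = i
        while j < v.size and above[j]:
            j += 1                        # [i, j) is a contiguous run above the base level
        run = v[i:j]
        k = int(np.argmax(run))
        if run.size >= 3 and 0 < k < run.size - 1 and run[k] > base_level + min_amplitude:
            anomalies.append((i, i + k, j - 1))
        i = j
    return anomalies

# Example with the base level (1) and minimum amplitude (25) used in this tutorial
peaks = pick_anomalies([0, 2, 30, 5, 0, 2, 3, 2, 0], base_level=1, min_amplitude=25)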
3. In the Target list box, specify the name of the new (Target) group (line). Then,
using the dropdown list, select the Channel to pick anomaly as (Chan1).
4. In the Base level box, specify (1) and in the Minimum amplitude box specify
(25). Note that, you can select up to three additional channels whose values
will be saved to the target list. If you like you can select (Chan2, Chan3, and
Chan4).
5. Click the [OK] button and the anomalies will be calculated and the results will
be displayed in the new Target database (EM61MK2_S1A001_Targets.gdb)
and the UCEANOMPICK dialog will also be displayed.
6. This dialog informs you of the number of anomaly peaks found. Click the
[OK] button to close the dialog. For more detailed information on the
algorithm or any of the dialog parameters, click the Help ( ) button on the
Pick anomalies in lines dialog.
The Find Peak Dipoles option enables you to grid a data channel (column) and then
search the grid for every positive peak and its corresponding negative peak. For
detailed information on how the corresponding negative peak is found, click the
[Help] button on the Find Peak Dipoles dialog.
2. Using the radio buttons at the top of the dialog, select to (Create new grid).
Then, using the Survey database browse button ( ) locate
(APGMag_S1A001.gdb). Using the Channel to grid dropdown list, which is
populated with channels from the ‘Survey database’, select (bottom). We will
leave the Grid cell size to the default, (i.e. 0.25).
3. In the Max dipole separation box, specify the radius of the search window as
(2.5). Then, in the Output section, select the Target database from your project
directory as (PickedTargets_S1A001.gdb) and then specify the Group name
as (MagPeaks).
4. Click the [OK] button and the data channel is gridded, the positive peaks are
located and their corresponding negative peaks are also located and these
results are posted to the Output Database PickedTargets_S1A001.gdb.
Target Analysis
The Target Analysis options enable you to analyse your magnetic and EM data to
determine signal strength and the signal-to-noise ratio for each detected target,
to display and refine target windows, and to batch fit the targets. For additional
information about each analysis method, click the Help ( ) button on the dialog of
interest.
The Calculate signal strength and SNR option enables you to calculate the signal strength
and signal-to-noise ratio for each detected target, given magnetic data, a search radius
and the background calculation method. This can be used as a discrimination tool to
distinguish larger, deeper anomalies from shallow noise. Both may have the same
peak amplitude, but deeper anomalies will have much higher power due to the
broader nature of the anomaly within the target area.
The background can be calculated in two modes, Batch or Interactive. Batch mode
should be used if the target database encompasses all the targets. In this mode the
vicinity of each target is first removed from the data and then the background is
calculated on the remainder of the data. Interactive mode should be used if you can
observe an area of the data that represents the background well. You will be
prompted to select an area on an existing map that represents the background.
The Signal strength is calculated as the sum of the squares of all the points within the
window and above the background. The Background is set to the mean of the signal
removed data (or the user specified background) minus 3.5 standard deviations. The
formula used to calculate the signal-to-noise ratio is as follows:
Signal2Noise = (signal_strength² − noise_StandardDeviation²) / noise_StandardDeviation²
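Using the definitions above, a minimal, illustrative sketch of the per-target calculation (not the Geosoft implementation; the window values, background and noise standard deviation are assumed to be already extracted) is:

import numpy as np

def signal_strength_and_snr(window_values, background, noise_std):
    """Signal strength and signal-to-noise ratio for one target window.

    Signal strength is the sum of squares of all points in the window that lie
    above the background; the SNR follows the formula quoted above.
    """
    v = np.asarray(window_values, float)
    above = v[v > background]
    signal_strength = np.sum(above ** 2)
    snr = (signal_strength ** 2 - noise_std ** 2) / noise_std ** 2
    return signal_strength, snr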
7. Using the arrow buttons, select the data for processing as (bottom) and click
the [OK] button. The data is processed and three new channels are posted to
your target database (Truthtable_S1A001.gdb); SNR_bottom,
Signal_Strength_bottom and Size_bottom.
The Display target windows option enables you to display a selected target group on a
map. Using this option you can obtain the targets from polygon files in the
"_wrk\plys" directory, or you can specify a window size and all targets in the selected
database will be displayed on the map.
6. Click the [OK] button and the select Target Groups to Display dialog is
displayed. Select (D0) and click the [OK] button and the Local coordinate
system dialog is displayed. Select the Local X channel as (loc_x) and the Local
Y channel as (loc_y) and click the [OK] button.
7. The target windows will be displayed on the new map file (Display
Target.map) as shown below.
The Redefine a target window option enables you to redefine the extents of a target
window on a map.
8. Open and select (highlight) the target map file (Display Target.map), created
earlier.
9. On the UX-ParameterDetermination menu, select Target Analysis|Redefine a
Target Window. The Redefine a target window dialog is displayed.
10. Using the radio buttons, select the map containing your target windows as (Use
current map), since the Display Target.map is currently open and selected in our
project.
11. Click the [OK] button and the UCEREDEFINETARGETWINDOW dialog is
displayed.
12. This dialog tells you to click inside the target window you want to redefine.
Click the [OK] button and select a target window to redefine and the
UCEREDEFINETARGETWINDOW dialog is again displayed.
13. This time the dialog tells you which target you have selected and that it is now
available for editing. Click the [OK] button and, using your mouse, redefine the
target window extents. When you are done, right-click and from the popup
menu, select Done. The new extents for the target window will be displayed on
the map.
The Batch Fit Targets option provides the tools that, given a list of target X and Y
coordinates and a corresponding survey database, fit the targets and
populate the target database with the results. EM61, EM61MKII, and Magnetic data
can be fitted.
The process sorts the survey data into one big bin, by X and Y, for fast access. Then
for each target it extracts a window of data centered on the target of the specified
anomaly size. This data is saved in a subset database and each group belongs to one
target, and is saved under its Target ID name. In addition to this window, the profile
closest to the target is also extracted. All the targets are fit and the target database is
updated with the results.
The fitting software has been developed in collaboration with AETC Incorporated.
To Batch Fit Targets:
2. Using the dropdown lists, select the Survey and Target database names as
(APGMag_S1A001.gdb and TruthTable_S1A001.gdb).
3. In the Target window size box, specify a size for the target window that is
centered on the target location and that encompasses the anomaly data. For our
data we selected (2.5). Note that, choosing an appropriate window size may be
difficult if the data quality/data density is not high enough.
4. Using the Instrument type dropdown list, select (Magnetometers). Then in the
Instrument height above ground box specify (0.3). The Instrument coil
separation (EM) box can be left blank, as this is for EM data only.
5. Click the [Next>] button and the Magnetic target information dialog is
displayed. Note that, the type of target information dialog that is displayed
(Mag or EM) depends on the Instrument type selected in the Fit Window
settings dialog.
6. Using the dropdown lists, select the Mag Channel as (bottom), the Target
group to fit as (calGrid_), the Target ID Channel as (Target_ID) and the
Target Mask channel as (Target_Mask). (Note that, the Altitude channel is
optional.)
7. Specify the Total field, Inclination and Declination as shown above and click
the [OK] button and the fitted targets are saved to a subset database and the
target database is updated with the results.
Target Maps
The Target Maps menu provides a variety of options for creating a target map
including, creating a standard US Army Corps basemap, displaying a grid, contouring
data, adding a site plan in DXF format, creating customized comparative maps and
prove out maps that can be used to compare the picked targets on a prove out grid to
the actual seeded targets.
The Create US Army Corps Basemap option enables you to create a new basemap
(basemap layout now available in either 8.5”x11” or 11”x17” paper size). In addition,
this option enables you to generate a blank map outside of running any other process.
To Create a New Basemap:
3. Specify the Map name as (Basemap) and then using the Map size dropdown
list, select (landscape letter 8.5x11), the default.
4. You can add a Map Title (Basemap) and then click the [Next>] button. The
Data range to map dialog is displayed.
5. Click the [Scan data] button to scan the current database file for the Minimum
and Maximum X, Y values. Click the [OK] button to generate a new map that
includes the standard US Army Corps basemap.
Displaying a Grid
UX-Process enables you to work with a variety of grid (and image) formats. You can
access grids, images and other files (DXF and Geosoft Plot) directly in UX-Process
by displaying these files in a map. You can display a grid to an existing (or current)
map or to a new map.
To Display a Grid:
1. Open and select (highlight) the map (Basemap.map) created in the previous
section.
2. On the UX-TargetManagement|Target Maps menu, select Display Grid. This
menu lists several options for displaying grids; select Colour Shaded Grid to
display a grid with colour shading. The Color-shaded grid image dialog is
displayed.
3. Using the Browse ( ) button, locate the Grid name to display as (bottom.grd).
Then, accepting the intelligent default parameters, as shown above, click the
[Current Map] button to display a colour-shaded grid on the standard US Army
Corps basemap.
Contour
Oasis montaj contouring options are designed to help you make a contour map from
a gridded data file. The basic contouring methodology is to thread contour lines
through constant levels as defined in the grid file. There are a number of options for
contouring:
• Quick - uses default parameters
• Custom – uses parameters specified by user
• Logarithmic - uses a grid that has been gridded logarithmically, that is the grid
values are assumed to be log(Z)
• Control file - ASCII control file enables full cartographic cosmetic control of
plotting options
To Quick Contour:
1. Open and make current (highlight) a map you want to add contours to
(Basemap.map).
2. On the UX-TargetManagement menu, select Target Maps|Contour and then
select Quick. The Contour dialog is displayed.
3. Specify the Grid file (bottom.grd) to use to calculate the contours and click
the [OK] button. The system adds the default quick contours to the map.
4. For more information about Quick contouring click the Help ( ) button on
the Contour dialog.
The Import a DXF file into a map dialog enables you to import vector entities in a
DXF file into a new or existing map. This dialog operates as a wizard.
Note that, there is no DXF file available to import into this project, however the steps
involved in importing a DXF file are provided below.
1. Open and make current (highlight) a map you want to import a DXF file to.
2. On the UX-TargetManagement menu, select Target Maps|Site Plan from DXF
File. The Import a DXF file into a map dialog is displayed.
3. Using the Browse ( ) button, select the DXF file. Then, using the dropdown
list, select where to Plot layers to. Then, specify the Maximum number of pen styles to
use and select to "retain colours" or select a specific colour from the Colour
dropdown list.
4. Select the map to plot the file to, either "New map" or "Current map"; depending
on which map you are plotting to, the Import to "new" or "current" map dialog is
displayed.
5. Use this dialog to specify which view you are importing the dxf file into (e.g.
plan, 3D or section). Click the [Next>] button and the DXF Coordinate system
dialog is displayed. Use this dialog to specify the coordinate system. Click the
[OK] button to plot the DXF file to your map. For more information on these
parameters click the Help ( ) button on the Import a DXF file into a map
dialog.
UX-Process provides a range of options for visualizing your data in three dimensions,
including displaying multiple surfaces, each with its own relief and contents and
its own orientation in 3D space.
The 3D Colour Range Symbols plotting option enables you to plot 3D symbols with
fixed or variable colours to a 3D map. The colours are fixed or varied based on the
data values of the colour data channel.
1. Open the database that includes the data you want to plot using the 3D Colour
Range Symbol option as (TruthTable_S1A001.gdb) and then open and select
the map to plot the 3D Symbols to as (Basemap.map).
2. On the UX-TargetManagement|Target Maps menu, select 3D Colour Range
Symbols. The 3D Coloured symbols plot dialog is displayed.
3. Using the dropdown lists, select the parameters that best suit your data. Note
that, if you want to colour your data using the values from a secondary
channel, select (Variable colour) from the Colour option dropdown list and
then select a Variable colour: Colour data channel. Click the [OK] button and
the 3D Controls dialog is displayed.
4. You can edit the 3D Controls parameters now, during the creation of the 3D
colour range symbol plot, or from within the 3D Viewer while linked to the 3D
Tool in real-time so that changes are displayed interactively.
5. Click the [OK] button to plot the 3D colour range symbols to the current map.
Note that, the 3D Tool and 3D Viewer are now open in the foreground of your
project. The 3D Viewer enables you to change the point of view and work with
all the attributes that make up the 3D View by using the controls in the 3D
Tool.
6. Close the 3D Viewer and the view will appear on the 2D map just as it last
appeared in the 3D Viewer.
7. You can move and resize this view, just like any view in a map to place it at a
better location. Your map should look similar to the following image. For
information on these or any dialog parameters, click the Help ( ) button.
The Generate Comparative Map option enables you to display, on the same map,
registered data from multiple instruments. In order to generate this map it is assumed
that you have grids of the various corrected instruments that properly register
together. For this example we have provided the gridded data of 2 EM61 channels
(ch1.grd & ch4.grd) and one magnetic channel and the vertical gradient (bottom.grd
& vertGrad.grd). This tutorial also uses the map (APG_Index.map).
To Generate a Comparative Map:
5. This dialog enables you to select the target location coordinates from either a
database or interactively from a map. Use the Input Target List dropdown list
and select (Map Interactive).
Note: If you select the Input target list as (Target Database) the comparative map
is run in batch mode. Instead of picking a point interactively you can select a
target database and create a comparative map for each target in the database.
6. Click the [Next] button and the Generate comparative map dialog is again
displayed.
7. Using the dropdown list, select the Target Map as (APG_Index.map) and
click the [Next] button. The Generate comparative map dialog is again
displayed.
8. In the Window size (in ground units) box, specify (4) meters as the size of the
window for each grid displayed in the map. Then, using the dropdown lists
(and/or) [Browse] buttons, select the 4 grids provided as (ch1.grd, ch4.grd,
bottom.grd, vertGrad.grd).
9. Click the [OK] button to continue. The Comparative Map dialog is displayed.
10. This dialog prompts you to pick a point to display. Click the [OK] button, and
then select one of the well defined local signals on the map.
11. A map displaying 4 grids, centered on the selected point will be displayed.
12. If you are not satisfied with the displayed color distribution, you can use the
Histogram tool to modify it. To modify the color distribution, select your
comparative map and then select the View/Group Manager ( ) button.
13. In the View/Group Manager Tool, double click on the grid (AGG_*) that you
would like to modify and the Histogram Tool will be displayed. Modify the
color distribution as desired. Repeat the process for any other grid that is not
satisfactorily displayed.
If you modify the color distribution of the grids you will then need to synchronize the
color bars with the corresponding displayed grids.
2. The color bars will be updated to reflect the colours on the corresponding
grids.
The Interactively move target location option enables users to select a new target
location on the open “Comparative map”. New target X, Y channels will be created
and added to the target database (Note that, the original X, Y channels will be
backed-up and saved in the __X, __Y channels respectively).
4. Using the dropdown lists, select the Target Line as (D0) and the Target
Channel as (Target_ID). Click the [OK] button and the Comparative Map
dialog will be displayed.
5. This dialog tells you to interactively pick a new point for the target location.
The target will be moved on the map and new target X, Y channels will be
created and added to the target database (PickedTargets_s1A001.gdb). (Note
that, the original X, Y channels are backed-up and saved in the __X, __Y
channels respectively).
Prove-Out Map
The Prove-Out Map menu item includes the following sub-menu options, Prove-Out
Results and the Mask Targets on Prove-Out Map.
To access additional technical information about the Prove-Out Map options, click
the Help ( ) button on the dialog of interest. This tutorial uses the following files to
demonstrate the Prove-out tools, TruthTable_S1A001.gdb and
PickedTargets_S1A001.gdb.
Prove-Out Results
The Prove-Out Results menu item compares all the lines in the selected databases to
determine if targets found within a search tolerance range in the “Truth target”
database are also found in the “Picked target” database. The two databases must
contain a “Target_ID” channel. The targets in the two databases are compared and
where a picked target is found within a user defined tolerance of the truth database,
166 Tutorial 3: UX-Target Management
The user may define up to three tolerance levels and associate a weight with each. All
targets found within the largest tolerance will be flagged and the "Target_ID" will be
added to the "Target_Matched" channel; then, depending on their closeness to the
actual seeded target, they will be weighted. The total number of targets found, as well
as the total score of the targets, is reported in the title block. This provides a relative
measure of the goodness of the picked targets.
When multiple targets are found within the user defined maximum tolerance, the
“Target_IDs” are added to the “Multi_Matched” channel. The target closest to the
target in the “Truth target” database is added to the “Target_Matched” channel.
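A simplified, illustrative sketch of this matching logic (it ignores the map output and most of the multi-match bookkeeping) is:

import numpy as np

def prove_out_match(truth_xy, picked_xy, tolerances, weights):
    """Match picked targets to truth targets and compute a relative score.

    `tolerances` are ordered largest to smallest and `weights` normally increase
    toward the innermost tolerance.  Returns a list of (matched_index, weight)
    per truth target, with (None, 0) when no pick falls within the largest radius.
    """
    truth = np.asarray(truth_xy, float)
    picked = np.asarray(picked_xy, float)
    results = []
    for tx, ty in truth:
        d = np.hypot(picked[:, 0] - tx, picked[:, 1] - ty)
        nearest = int(np.argmin(d))
        if d[nearest] > tolerances[0]:
            results.append((None, 0.0))          # not found within the maximum radius
            continue
        weight = 0.0
        for tol, w in zip(tolerances, weights):
            if d[nearest] <= tol:
                weight = w                        # the innermost satisfied tolerance wins
        results.append((nearest, weight))
    return results

# Example with the tolerances (0.5, 0.25, 0.1) used in this tutorial
matches = prove_out_match([(0, 0)], [(0.05, 0.02), (3, 3)],
                          tolerances=[0.5, 0.25, 0.1], weights=[1, 2, 5])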
The data sets used in the Prove-out Result test include the following databases
(TruthTable_S1A001.gdb and PickedTargets_S1A001.gdb). Please ensure that you
have copied the data files provided into your project directory.
Note: Points with data (non-dummy values) in the Target_ID channel are recognized
as Targets.
2. Using the Truth Target Database browse button ( ), select the database that
includes the true target locations (TruthTable_S1A001.gdb) and, using the
Picked Target Database browse button ( ), select the database that includes
the picked target locations (PickedTargets_S1A001.gdb).
3. The Distance units, which cannot be edited, are provided. You can specify up
to 3 Search Tolerances to use to compare target locations. The first tolerance
must be the largest and the last the smallest. Select the radii as shown in the
dialog (0.5, 0.25, and 0.1). The Maximum search radius is mandatory; the next
two are optional.
4. For each of the tolerances enter a weight. Normally the innermost tolerance
must have the highest weight. This weight is used to calculate a relative score.
5. Using the dropdown lists, select (Yes) when asked, Would you like to plot a
map? and select (Yes) when asked, Display non-seeded picks?.
6. Click the [OK] button, the targets in the two databases are compared and
where targets are found in each database, the “Truth target” database’s
“Target_Status” channel is flagged as either “Found within
[innermost|middle] tolerance” or “Not Found”. The “Target_Matched”
channel is also populated with the Target_IDs of the matched targets and in the
case where there are multiple targets matched, within the search radii, the
Target_IDs of those targets are added to the “Multi_Matched” channel.
7. This process also creates a map (Picked Targets.map) on which the seeded
targets are displayed as bulls-eyes, with concentric circles delineating the three
specified tolerances. The found targets are identified with blue crosses and
their target ID is posted. The corresponding seeded targets are displayed with
concentric green circles with the radius equal to the specified tolerances. The
targets that were not matched are displayed in red. The statistics of the picks
are reported in the title block. The non-seeded picks are displayed in grey.
8. Some statistics including the relative score are displayed in the Legend box.
The Mask targets on prove-out map option enables you to display selected layers of
targets (i.e. All, Inner Radius, Middle Radius, Outer Radius, Targets Not Found).
This option requires that the prove-out process be previously applied to the data.
The files used to demonstrate the Mask targets on prove-out map are:
TruthTable_S1A001.gdb and the map created in the previous Prove-out Results Test
(Picked Targets_S1A001.map).
3. Using the [Browse] button for the Truth Target Database locate the
(TruthTable_S1A001.gdb).
4. To mask the target groups on the map, we begin with no targets and then
cumulatively add the groups as required. So, for the Target
Selection select (All) and for Mask select (Exclude) and click the [OK] button.
All of the targets will be excluded from the mask.
5. Display the Mask target groups dialog again.
Target Utilities
The Target Utilities menu provides a variety of target organization and management
utilities, including:
• Generate Composite Target ID (page 170)
• Display Target Windows (page 171)
• Redefine a Target Window (page 172)
• Statistics on Target Removed Data (page 173)
• Target Density Calculation (page 175)
• Calculate Distance to Corners (page 176)
• Calculate Target Location (page 177)
• Shortest Path (page 179)
The Generate Composite Target ID option enables you to generate new unique target
IDs specific for each grid within the survey, given the grid name or ID, target ID
(optional) and output channel.
The new target ID syntax is the grid name (e.g. A01) followed by the number
(_0001). The numbering starts from 1 for each grid.
1. Open and select (highlight) the Target database that includes target IDs and X,
Y locations for each target as (TruthTable_S1A001.gdb).
2. On the UX-TargetManagement menu, select the Target Utilities|Generate
Composite Target ID. The Generate combined ID dialog is displayed.
4. Then, using the dropdown list, select the Grid Channel as (GridLoc); we can
select another channel from the dropdown list for the Optional suffix or leave it
blank. Then, specify a new name for the Output Channel, for example
(Comb_ID) and click the [OK] button. The new channel (Comb_ID) that
includes the new unique target ID is added to the current database
(TruthTable_S1A001.gdb).
The Display target windows option enables you to display a selected target group to a
map. Using this option you can obtain the targets from polygon files in the
“_wrk\plys” directory or you can specify a window size and all targets in the selected
database will be displayed on the map.
3. In the Map Display area, select the map on which to display the target
windows as, (Create new map). Then, in the Map name box, specify (Display
Target.map) and from the Display channel dropdown list, select (bottom);
note that, this channel will be gridded on your new map.
4. Use the Target Windows area to specify the source of the target windows,
either from polygon files or as user defined size. For our purposes we will
select the (Specify a window size) radio button and in the text box provided,
specify (2).
5. Click the [OK] button and the select Target Groups to Display dialog is
displayed. Select (D0) and click the [OK] button and the target windows will
be displayed on the new map file (Display Target.map) as shown below.
The Redefine a target window option enables you to redefine the extents of a target
window on a map.
3. Use this dialog to select the map containing the target windows: select (Use
current map) and click the [OK] button and the
UCEREDEFINETARGETWINDOW dialog is displayed.
4. This dialog tells you to click inside the target window you want to redefine.
Click the [OK] button and select a target window to redefine and the
UCEREDEFINETARGETWINDOW dialog is again displayed.
5. This time the dialog tells you which target you have selected and that it is now
available for editing. Click the [OK] button and, using your mouse, redefine the
target window extents. When you are done, right-click and from the popup
menu, select Done. The new extents for the target window will be displayed on
the map.
The Statistics on target-removed data option enables you to mask out the targets in
your original data, within a specified radius, and then run statistics on the remainder
to evaluate the background values.
4. From the dropdown lists, select the Site database channel to process as
(bottom), the Mask the channel? as (Yes) and Target database groups to
process as (calGrid_).
5. Click the [OK] button and the background statistics for the bottom channel
will be displayed.
The Target density calculation menu item counts the number of targets in a specific
area defined as a polygon on a map.
3. Using the dropdown lists, specify the Target Channel as (Target_ID) and the
Mask Channel as (Target_Matched). Then, using the Pick Mode dropdown
list, select (interactive) as the method we will use to select an area. Click the
[OK] button; the Density Report dialog is displayed.
4. Click the [OK] button, and digitize a polygon around the lower left quadrant of
the S1A001 area for the density analysis. Finish by clicking the right
mouse button and selecting Done from the popup menu.
5. The result of the calculation will be displayed in the Target Density Report
dialog.
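In essence, the calculation counts the targets that fall inside the digitized polygon and
relates that count to the polygon area. The ray-casting sketch below illustrates the
point-in-polygon test with invented coordinates; it is not the code used by UX-Process.

    def point_in_polygon(x, y, poly):
        """Ray-casting test: is (x, y) inside the closed polygon 'poly'?"""
        inside = False
        n = len(poly)
        for i in range(n):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % n]
            if (y1 > y) != (y2 > y):
                if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                    inside = not inside
        return inside

    # Count targets inside a rectangular quadrant and report a simple density
    polygon = [(0, 0), (50, 0), (50, 50), (0, 50)]
    targets = [(10, 10), (40, 45), (60, 60)]
    count = sum(point_in_polygon(x, y, polygon) for x, y in targets)
    area = 50.0 * 50.0  # square project units for this rectangular example
    print(f"{count} targets, density = {count / area:.4f} per square unit")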
The Calculate distance to corners option calculates the distance of each target to the
four corners of the grid it resides in. The input database is complemented with 5 new
channels:
GridName: containing the grid name in which the target resides
Distance_SW: the distance of the target in the project units from the lower left
corner of the grid
Distance_SE: the distance of the target in the project units from the lower right
corner of the grid
Distance_NW: the distance of the target in the project units from the upper left
corner of the grid
Distance_NE: the distance of the target in the project units from the upper right
corner of the grid
The file used to demonstrate the Calculate distance to corners option is
TruthTable_S1A001.gdb.
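Each of the four new distance channels is simply the Euclidean distance from the
target to the corresponding grid corner. A minimal sketch of the calculation, assuming
made-up corner coordinates for a 30 m grid:

    import math

    def distances_to_corners(x, y, corners):
        """Return the SW, SE, NW and NE distances for one target."""
        return {name: math.hypot(x - cx, y - cy)
                for name, (cx, cy) in corners.items()}

    # Illustrative corner coordinates in local grid units
    corners = {
        "Distance_SW": (0.0, 0.0),
        "Distance_SE": (30.0, 0.0),
        "Distance_NW": (0.0, 30.0),
        "Distance_NE": (30.0, 30.0),
    }
    print(distances_to_corners(16.0, 21.9, corners))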
2. Using the [Browse] button, select the Target database from your working
directory (TruthTable_S1A001.gdb).
3. Click the [OK] button and the five new channels will be calculated and
displayed in the Target database.
The interactive Calculate target location option enables users to input the distance
of a dig location to at least one corner of a specified quadrilateral grid, and obtain the
coordinates of that location in the grid Cartesian coordinate system.
Often in the field, when a target is dug, the operator measures its distance from at
least two corners of the grid in order to locate it. These distances can be triangulated
with the absolute coordinates of the grid corners and translated into ground coordinates
in the same system. The terms “LowerLeft”, “LowerRight”, “UpperLeft” and
“UpperRight” are used in the absolute sense, with North being directly ahead.
The results are accumulated in a text CSV file.
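Mathematically, each measured distance places the target on a circle centred on the
corresponding corner, and the intersection of two or more such circles fixes the dig
location. The least-squares sketch below illustrates that trilateration; the corner layout
is assumed, and this is not the internal UX-Process algorithm.

    import numpy as np

    def trilaterate(corners, distances):
        """Least-squares position from distances to two or more known corners."""
        (x0, y0), d0 = corners[0], distances[0]
        a, b = [], []
        for (xi, yi), di in zip(corners[1:], distances[1:]):
            # Subtracting circle equations gives a linear system in (x, y)
            a.append([2 * (xi - x0), 2 * (yi - y0)])
            b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
        sol, *_ = np.linalg.lstsq(np.array(a), np.array(b), rcond=None)
        return sol  # local grid x, y of the dig location

    # Assumed 30 x 30 grid; distances are generated from a known point
    corners = [(0.0, 0.0), (30.0, 0.0), (0.0, 30.0), (30.0, 30.0)]
    truth = (16.0, 22.0)
    distances = [float(np.hypot(truth[0] - cx, truth[1] - cy)) for cx, cy in corners]
    print(trilaterate(corners, distances))  # recovers approximately (16.0, 22.0)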
2. Using the Calculation options dropdown list, select (Single seed item
calculation). Note you also have the option of selecting the ‘Multiple seed
items calculation’.
3. Specify the Target CSV file as (TargetPosition) in the box provided. If you
already have a Target CSV file, use the browse button to locate this file and the
results will be appended at the end of the file.
4. Specify the Grid name as (S1A001) and the Distance Tolerance as (0.2). Click
the [OK] button and the second Calculate target locations dialog will be
displayed.
5. Use this dialog to identify the target and locate its position. In the Output
Target ID name box, specify (8) and then enter the Distance to grid corner
values: LL (27.22), LR (30.60), UL (16.53), UR (21.83), as shown above. Note
that for a target inside the grid you should enter at least 3 points, and for a
target along one edge you can enter 2 points.
Note: The [Check Table] button enables you to change a distance entry after the
target has already been triangulated. You can load the pre-existing values for
that target number by clicking this button.
6. Click the [OK] button and the UCETriangulate dialog is displayed.
7. Notice that the coordinates of this target are reported in UTM (402786.82,
4369621.68) and local coordinates (16.00, 21.93).
8. Click the [OK] button and the target coordinates are saved in the target CSV
file (TargetPositions.csv).
Shortest Path
Use the Shortest Path (Traveling Salesman Problem) dialog to create an index
channel which gives the shortest round-trip path to selected targets. This dialog will
also produce an output file containing an ordered index list.
1. Open and select the target database (TruthTable_S1A001.gdb). Note that this
database must contain X/Y channels that define where your targets are located.
2. On the UX-TargetManagement menu, select Target Utilities|Shortest Path.
The Shortest Path dialog will be displayed.
3. Complete the parameters as shown in the dialog above, and click the [OK]
button. An index channel will be created that provides the shortest round-trip
path to the selected targets; an output file containing the index list in ascending
order, along with the X, Y points, will also be created.
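Ordering a round trip through all targets is the classic travelling salesman problem,
which is expensive to solve exactly for many targets. The nearest-neighbour sketch
below shows one simple way to produce a visiting order; it is only an illustration of the
idea, not the heuristic used by the Shortest Path tool.

    import math

    def nearest_neighbour_order(points, start=0):
        """Greedy visiting order over target (x, y) points, starting at 'start'."""
        unvisited = set(range(len(points)))
        order = [start]
        unvisited.remove(start)
        while unvisited:
            cx, cy = points[order[-1]]
            nxt = min(unvisited,
                      key=lambda i: math.hypot(points[i][0] - cx, points[i][1] - cy))
            order.append(nxt)
            unvisited.remove(nxt)
        return order  # index list; the path closes back to the starting target

    targets = [(0, 0), (10, 1), (2, 8), (9, 9)]
    print(nearest_neighbour_order(targets))  # [0, 2, 3, 1]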
Target Classification
The Target classification option enables you to classify the target data either in
increasing or decreasing order of up to 4 channels, or by providing up to 4
expressions in order of importance.
TO CLASSIFY TARGETS:
5. Using the dropdown lists and the associated radio buttons, select the Primary,
Second, Third and Fourth Channels and whether each channel is Ascending or
Descending. Click the [OK] button; the data channels will be sorted based on
the parameters specified.
6. If we are looking for very shallow targets with good coherence in the fitting
process, we can classify the data again, this time using Expression as the
prioritization type. The Classification expressions dialog will be displayed.
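Whether it is driven by channels or by expressions, classification amounts to a
multi-key sort in which each key is taken ascending or descending in order of priority.
A small sketch of that idea with hypothetical channel names (depth and coherence):

    # Sort target rows by depth ascending, then by fit coherence descending.
    targets = [
        {"Target_ID": "A1", "depth": 0.35, "coherence": 0.91},
        {"Target_ID": "A2", "depth": 0.10, "coherence": 0.78},
        {"Target_ID": "A3", "depth": 0.10, "coherence": 0.95},
    ]
    ranked = sorted(targets, key=lambda r: (r["depth"], -r["coherence"]))
    for rank, row in enumerate(ranked, start=1):
        print(rank, row["Target_ID"], row["depth"], row["coherence"])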
Image Manipulations
The Image Manipulations menu includes a number of options for working with
project images. For more information on the Image Manipulations parameters, click
the Help ( ) button on the specific dialog of interest. This tutorial uses sample data
provided with the installation of the UX-Process system.
The Attach Image to Dig Sheet option enables you to display post-operation images of
dug targets on the prove-out map.
For this option you must have a dig sheet database as well as a corresponding map. If
these items are not already opened in your project, you will be prompted to provide
them. The selected database is scanned for the existence of the key channel
"Target_ID", in the absence of which the process is terminated.
You can accumulate a list of images in a given directory; for convenience, a few
images are provided with the project. This option runs in a loop. To end the process,
press the <Esc> key. If you select a target that already has an attached image, you will
be asked if you intend to remove the image from the map.
The files used to demonstrate the Attach images to dig sheet option are: the
PickedTargets_S1A001.gdb, PickedTargets_S1A001.map and the following image
files provided with this tutorial on your CD: UXO_roofingNails.jpg,
UXO_blu25.bmp, UXO_81mm.jpg, UXO_75mm.jpg, UXO_20mm.bmp,
UXO_150mm.jpg and UXO_105mm.bmp.
3. The Dig sheet database and the Map to use parameters should default to the
open database and map, as selected above. If they were not opened, open the
PickedTargets_S1A001 database and the PickedTargets_s1A001 map file.
Click the [OK] button and the Attach target images dialog is displayed.
4. Using the [Browse] button, select the Directory where images reside (i.e.
D:\DoD\) and then specify the Image file prefix (UXO_).
5. Click the [OK] button; the Attach images dialog is displayed.
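The image lookup itself is essentially a filename match: files in the chosen directory
whose names start with the specified prefix become the candidates for attachment. A
hypothetical sketch of gathering such a list (the directory path and extensions are
assumptions, not part of the product):

    from pathlib import Path

    def list_target_images(directory, prefix="UXO_"):
        """Collect candidate image files that carry the chosen prefix."""
        extensions = {".jpg", ".bmp", ".png"}
        return sorted(p.name for p in Path(directory).iterdir()
                      if p.suffix.lower() in extensions and p.name.startswith(prefix))

    # For the tutorial images this would list UXO_105mm.bmp, UXO_150mm.jpg, ...
    print(list_target_images(r"D:\DoD"))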
Dig/No Dig
The Dig/No Dig (or Attach dig info to dig sheet) option requires a dig sheet database
and a corresponding target map. This option enables you to set, remove or replace
target dig information (Dig or No Dig) in the “Target_dig” channel of the current
database.
3. Using the dropdown and/or browse menus, select the Dig sheet database as
(PickedTargets_s1A001.gdb) and the Map to use as
(PickedTargets_s1A001.map) and click the [OK] button. The Attach dig info
dialog is displayed.
4. This dialog tells you how to Pick a target with the left mouse button, and to
press the <Esc> key to terminate the process. Click the [OK] button and your
cursor will change to a cross-hair ( ).
5. Select a Target on the map and the Attach dig info to dig sheet dialog will be
displayed.
6. Using the Set dig info dropdown list, select either Dig or No Dig and click the
[OK] button. The Target information will be posted to the Target_dig channel
in the current database and the Attach dig info dialog will be displayed.
7. This dialog tells you to select your next target, or to press the <Esc> key to end
the process. Continue selecting and defining your Targets until you are
satisfied with your selections.
8. If you select a target that already has a Target_dig parameter set, the Attach dig
info dialog will be displayed. Depending on whether the parameter is set to Dig
or No Dig, one of the following messages will appear.
Dig
No Dig
9. To change the parameter settings, click the [Yes] button; to continue with
another target, click [No]; and to stop the process, click the [Cancel] button.
The Target Attributes option enables you to display the attributes for selected targets
and plot them to a map.
Once a target database is selected, a list of the channels in the database is displayed.
Any number of channels can be selected, and values from these channels will be
shown as target attributes when the Target Attributes dialog is displayed.
2. Using the dropdown lists or the browse buttons, select the Target Database
and Target Map from your working directory and click the [OK] button. The
Select target attributes to display dialog is displayed.
3. Using the arrow buttons, select the target attributes that you would like to
display and click the [OK] button. The Select Target dialog is displayed.
4. This dialog tells you that you can select a target using the left mouse button,
and to press the <Esc> key to exit. Click the [OK] button and select a location
on the (PickedTargets_s1A001.map); the closest target to the selected
location is found and the Target Attributes dialog will be displayed.
5. This dialog displays the target attributes of the selected target. To select
another target, click the [Select Target] button and the Target Attribute dialog
will close and the target map will be displayed.
6. Click the [Plot] button and the Plot target attributes and image on a map
dialog is displayed.
7. Specify a Title (Blue 26 (20 mm)) and accept the default Target ID and click
the [New Map] button and the target attributes will be plotted on a new map.
The View target image option is provided to enable you to view the target images that
have been attached to the dig sheet database.
Dig Sheet Analysis
The Dig Sheet Analysis menu includes a number of options for analyzing dig sheet
data. For more
information on the Dig Sheet Analysis parameters, click the Help ( ) button on the
specific dialog of interest. This tutorial uses sample data provided with the
installation of the UX-Process system.
The Post Dig Verification of Unsubstantiated Picks option looks at the dig status
channel and searches for two strings, “No contact” and “Not reproducible”. It also
looks at the signal strength (or amplitude) channel. You can then dismiss the “No
contact” and “Not reproducible” targets only if the amplitude (or signal strength) is
below a user-specified threshold. The output channel is populated with “Pass” or
“Fail” flags.
3. The Target Database should default to the current open and selected database
(PostDig_S1A001.gdb); if not, use the ( ) browse button to locate it in your
project directory and select it.
4. Using the dropdown lists, select the Dig Status Channel as (Dig_Status) and
the Amplitude Channel as (amplitude). Then specify the Amplitude minimum
in the box provided as (110) and click the [OK] button.
5. The Status_Results channel will be added to the database and will be
populated with “Pass” or “*” (fail) results. If the amplitude is below the
specified Amplitude Min. the status will be “Pass”; if above, it will be “*”
(fail). Note that “Pass” means we accept that there is no anomaly at the
given X and Y location.
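Put another way, a “No contact” or “Not reproducible” pick is only dismissed (flagged
“Pass”) when its amplitude sits below the threshold; otherwise it is flagged “*” (fail)
and remains of interest. The decision rule is sketched below; the function and its
treatment of other dig-status values are illustrative assumptions, not the UX-Process
code.

    def verify_unsubstantiated(dig_status, amplitude, amplitude_min=110.0):
        """Pass means no anomaly is expected here, so the pick can be dismissed."""
        unsubstantiated = dig_status.strip().lower() in ("no contact", "not reproducible")
        if unsubstantiated and amplitude < amplitude_min:
            return "Pass"
        return "*"  # fail: the pick still needs attention

    print(verify_unsubstantiated("No contact", 45.0))         # Pass
    print(verify_unsubstantiated("Not reproducible", 180.0))  # * (fail)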
The Post dig verification of targets signal ranges option enables you to select a
response channel, a descriptor channel and an output channel, and then specify an
acceptable signal strength range for each ordnance type in the descriptor channel.
(Note that the descriptor information is often user-supplied (not generated by
Oasis montaj), for example the Ordnance type.) The system then scans the target
database and, if the signal strength is within the user-specified range for that
ordnance type, flags it with a “Pass”; if not, it is flagged as “Fail”. This information
is saved in the channel “Test_results”.
5. Use this dialog to select ( ) the Ordnance Type (e.g. 20mm M55) and then
specify the Low value (e.g. 80) and the High value (e.g. 95).
6. Click the [Update] button to update the values in the dialog window and then
click the [Done] button and the data will be analyzed and posted to the new
Output Channel (Sig_Verify).
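Conceptually, the check looks up the low/high range defined for the target’s ordnance
type and flags the response accordingly. A brief sketch of that lookup with assumed
range values (the ranges, and the handling of descriptors without a range, are
illustrative only):

    # Assumed acceptable response ranges per ordnance type
    signal_ranges = {"20mm M55": (80.0, 95.0), "81mm": (150.0, 400.0)}

    def verify_signal(ordnance_type, response):
        """Flag Pass when the response sits inside the range for its ordnance type."""
        if ordnance_type not in signal_ranges:
            return ""  # no range defined for this descriptor
        low, high = signal_ranges[ordnance_type]
        return "Pass" if low <= response <= high else "Fail"

    print(verify_signal("20mm M55", 88.0))   # Pass
    print(verify_signal("20mm M55", 120.0))  # Fail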