Sparameter and Power Data

This notebook demonstrates how to open many data sources, arrange the data and combine it into a single csv file. Note, because of the size of the data sources involved they are not included. For more information please contact Aric Sanders

Import and transformation of data files to a database

There are several different sources of structured data each one with formatting differences

Check Standard data sources

  1. One Port Raw files that have been converted using Ron Ginley's BDAT -> Ascii converter
  2. One Port files already stored in a legacy SAS database exported into csv by Jolene Spett
  3. Two Port Raw files that have been converted using Ron Ginley's BDAT -> Ascii converter
  4. Two Port files already stored in a legacy SAS database exported into csv by Jolene Spett
  5. Two Port Non-Reciprocal files that have been converted using Ron Ginley's BDAT -> Ascii converter
  6. Power Raw files files that have been converted using Ron Ginley's BDAT -> Ascii converter
  7. Power files already stored in a legacy SAS database exported into csv by Jolene Spett

DUT data sources, already analyzed using various versions of the Calrep HP Basic program

  1. One Port .asc files stored in ascii.dut folder
  2. Two Port .asc files stored in ascii.dut folder
  3. Power .asc files with 4 error columns per s-parameter and power value stored in ascii.dut folder
  4. Power .asc files with 3 error columns per power value stored in ascii.dut

Conversion of these files requires: opening, parsing, and standardization of the data

In [1]:
# import of needed libraries
import os
import re
import datetime
import pandas
from types import *
# import of pyMez to change to import pyMez when __init__.py is changed
from pyMez.Code.DataHandlers.NISTModels import *
from pyMez.Code.Utils.Names import *
import numpy as np
import matplotlib.pyplot as plt
The module smithplot was not found,please put it on the python path
In [2]:
# Location of the various data sources
#input data sources
CONVERTED_RAW_FILE_DIRECTORY=r'C:\Share\Ck_Std_raw_ascii'
SAS_ONE_PORT=os.path.join(TESTS_DIRECTORY,'onechks.csv')
SAS_TWO_PORT=os.path.join(TESTS_DIRECTORY,'twochks.csv')
SAS_POWER=os.path.join(TESTS_DIRECTORY,'powchks.csv')
DUT_TOP_DIRECTORY=r'C:\Share\ascii.dut'
# output data 
ONE_PORT_CHKSTD_CSV=r"C:\Share\Converted_Check_Standard\One_Port_Check_Standard.csv"
TWO_PORT_CHKSTD_CSV=r"C:\Share\Converted_Check_Standard\Two_Port_Check_Standard.csv"
TWO_PORT_NR_CHKSTD_CSV=r"C:\Share\Converted_Check_Standard\Two_Port_NR_Check_Standard.csv"
POWER_CHKSTD_CSV=r"C:\Share\Converted_Check_Standard\Power_Check_Standard.csv"
COMBINED_ONE_PORT_CHKSTD_CSV=r"C:\Share\Converted_Check_Standard\Combined_One_Port_Check_Standard.csv"
COMBINED_TWO_PORT_CHKSTD_CSV=r"C:\Share\Converted_Check_Standard\Combined_Two_Port_Check_Standard.csv"
COMBINED_POWER_CHKSTD_CSV=r"C:\Share\Converted_Check_Standard\Combined_Power_Check_Standard.csv"
ONE_PORT_CALREP_CSV=r"C:\Share\Converted_DUT\One_Port_DUT.csv"
TWO_PORT_CALREP_CSV=r"C:\Share\Converted_DUT\Two_Port_DUT.csv"
POWER_3TERM_CALREP_CSV=r"C:\Share\Converted_DUT\Power_3Term_DUT.csv"
POWER_4TERM_CALREP_CSV=r"C:\Share\Converted_DUT\Power_4Term_DUT.csv"
SQL_DATABASE=r"C:\Share\Sql_DUT_Checkstandard\sqlite_sparameter_power.db"

Creating import lists by type for the converted raw data sources

  1. We read all the files in the top folder
  2. The 5th line in the header determines the data type (python is zero indexed so it is element 4)
  3. We create 4 lists of all files matching the various types (One-port, Two-Port, Two-PortNR, Power)
In [3]:
# We first get all files in the desired directory
file_names=os.listdir(CONVERTED_RAW_FILE_DIRECTORY)
# The loop runs quicker if we create lists and then add to them
# We create lists of the full path name for each of the data types
raw_files=[]
one_port_raw_files=[]
two_port_raw_files=[]
two_port_NR_raw_files=[]
power_raw_files=[]
# We iterate through the fiel names using the 5 th line to sort into our types
for index,file_name in enumerate(file_names[:]):
    in_file=open(os.path.join(CONVERTED_RAW_FILE_DIRECTORY,file_name),'r')
    lines=[]
    for line in in_file:
        lines.append(line)
    in_file.close()
    #print index,file_name
    if re.search('1-port',lines[4],re.IGNORECASE):
        one_port_raw_files.append(os.path.join(CONVERTED_RAW_FILE_DIRECTORY,file_name))
    elif re.search('2-port',lines[4],re.IGNORECASE) and not re.search('2-portNR',lines[4],re.IGNORECASE):
        two_port_raw_files.append(os.path.join(CONVERTED_RAW_FILE_DIRECTORY,file_name))
    elif re.search('2-portNR',lines[4],re.IGNORECASE):
        two_port_NR_raw_files.append(os.path.join(CONVERTED_RAW_FILE_DIRECTORY,file_name))
    elif re.search('Thermistor|Dry Cal',lines[4],re.IGNORECASE):
        power_raw_files.append(os.path.join(CONVERTED_RAW_FILE_DIRECTORY,file_name))
        
# This loop takes about 10 seconds
In [4]:
# Now we can check if the loop worked properly 
print("There are %s total files"%len(file_names))
print("There are %s one port raw files"%len(one_port_raw_files))
print("There are %s two port raw files"%len(two_port_raw_files))
print("There are %s two port NR raw files"%len(two_port_NR_raw_files))
print("There are %s power raw files"%len(power_raw_files))
total_binned_files=(len(one_port_raw_files)+len(two_port_raw_files)+len(two_port_NR_raw_files)+len(power_raw_files))
if len(file_names)==total_binned_files:
    print("All Files Have Been Acounted For")
else:
    print("{0} out of {1} files were binned".format(total_binned_files,len(file_names)) )
There are 6561 total files
There are 3636 one port raw files
There are 1574 two port raw files
There are 210 two port NR raw files
There are 1141 power raw files
All Files Have Been Acounted For
In [5]:
# Now each data source has to be parsed and converted to a common form
# One large issue is checking for data overlap in the data sources that can be solved by analysing the timestamp
# Also after trying several ways of doing the conversion, the best seems to be create a small csv and then add on 
def build_csv_from_raw_script(input_file_names_list,output_file_name,model_name):
    """Build csv from raw script takes a list of file names conforming to model and builds a single csv. 
    It is intentioned to accept raw files from the sparameter power project that have been converted from bdat
    using Ron Ginely's convertor (modified calrep program). The output is a single csv file with metadata added
    as extra columns (ie a denormalized table)"""
    try:
        # our current definition of metadata keys for all of the raw models
        metadata_keys=["System_Id","System_Letter","Connector_Type_Calibration","Connector_Type_Measurement",
              "Measurement_Type","Measurement_Date","Measurement_Time","Program_Used","Program_Revision","Operator",
              "Calibration_Name","Calibration_Date","Port_Used","Number_Connects","Number_Repeats","Nbs",
              "Number_Frequencies","Start_Frequency",
              "Device_Description","Device_Id"]
        # import the first file
        model=globals()[model_name]
        initial_file=model(input_file_names_list[0])
        # Add the metadata columns and replace any commas with - 
        for column_name in metadata_keys:
            initial_file.add_column(column_name=column_name,column_type='str',
                            column_data=[initial_file.metadata[column_name].replace(',','-') 
                                         for row in initial_file.data])
        # We also add a column at the end that is Measurement_Timestamp, that is 
        # Measurement_Time+Measurement_Date in isoformat
        timestamp=initial_file.metadata["Measurement_Date"]+" "+initial_file.metadata["Measurement_Time"]
        datetime_timestamp=datetime.datetime.strptime(timestamp,'%d %b %Y %H:%M:%S')
        measurement_timestamp=datetime_timestamp.isoformat(' ')
        initial_file.add_column(column_name="Measurement_Timestamp",column_type='str',
                            column_data=[measurement_timestamp
                                         for row in initial_file.data])
        # now we save the intial file with its column names but not its header
        initial_file.header=None
        initial_file.save(output_file_name)
        
        # Now we re-open this file in the append mode and read-in each new file and append it. This seems to work
        # for very large data sets, where as keeping a single object in memory fails
        out_file=open(output_file_name,'a')
        # now we do the same thing over and over and add it to the out file
        for file_name in input_file_names_list[1:]:
            
            model=globals()[model_name]
            parsed_file=model(file_name)
            for column_name in metadata_keys:
                parsed_file.add_column(column_name=column_name,column_type='str',
                            column_data=[parsed_file.metadata[column_name].replace(',','-') 
                                         for row in parsed_file.data])
            timestamp=parsed_file.metadata["Measurement_Date"]+" "+parsed_file.metadata["Measurement_Time"]
            datetime_timestamp=datetime.datetime.strptime(timestamp,'%d %b %Y %H:%M:%S')
            measurement_timestamp=datetime_timestamp.isoformat(' ')
            parsed_file.add_column(column_name="Measurement_Timestamp",column_type='str',
                            column_data=[measurement_timestamp
                                         for row in parsed_file.data])
            # add an endline before appending
            out_file.write('\n')
            # now we only want the data string
            data=parsed_file.get_data_string()
            out_file.write(data)
        # close the file after  loop
        out_file.close()
    # Catch any errors
    except:
            raise
        
In [7]:
# Now we can try it for a subset of one ports
build_csv_from_raw_script(one_port_raw_files[:10],ONE_PORT_CHKSTD_CSV,"OnePortRawModel")
# we re-import the csv using pandas to make sure it worked
one_port_raw_data_frame=pandas.read_csv(ONE_PORT_CHKSTD_CSV)
one_port_raw_data_frame[:5]
Out[7]:
Frequency Direction Connect mag arg System_Id System_Letter Connector_Type_Calibration Connector_Type_Measurement Measurement_Type ... Calibration_Date Port_Used Number_Connects Number_Repeats Nbs Number_Frequencies Start_Frequency Device_Description Device_Id Measurement_Timestamp
0 0.0001 1 1 0.7560 -39.35 HP8510 K NaN N 1-port ... NaN 1 3 1 1 73 7 00080 w/HP432A 80 2012-07-25 19:25:47
1 0.0001 1 2 0.7561 -39.35 HP8510 K NaN N 1-port ... NaN 1 3 1 1 73 7 00080 w/HP432A 80 2012-07-25 19:25:47
2 0.0001 1 3 0.7561 -39.35 HP8510 K NaN N 1-port ... NaN 1 3 1 1 73 7 00080 w/HP432A 80 2012-07-25 19:25:47
3 0.0002 1 1 0.5091 -57.61 HP8510 K NaN N 1-port ... NaN 1 3 1 1 73 7 00080 w/HP432A 80 2012-07-25 19:25:47
4 0.0002 1 2 0.5091 -57.62 HP8510 K NaN N 1-port ... NaN 1 3 1 1 73 7 00080 w/HP432A 80 2012-07-25 19:25:47

5 rows × 26 columns

In [8]:
%timeit build_csv_from_raw_script(one_port_raw_files[:10],ONE_PORT_CHKSTD_CSV,"OnePortRawModel")
10 loops, best of 3: 152 ms per loop
In [36]:
# This loop takes ~ 16.3 ms to run for a file so 16.3ms * num_files
16.3*10**-3*len(one_port_raw_files)
Out[36]:
59.26680000000001
In [9]:
# Let's do all of the files and time it, this loop will be the worst case senario
import_list=one_port_raw_files[:]
start_time=datetime.datetime.now()
build_csv_from_raw_script(import_list,ONE_PORT_CHKSTD_CSV,"OnePortRawModel")
stop_time=datetime.datetime.now()
diff=stop_time-start_time
print("{0} files were converted to a single csv in {1} seconds".format(len(import_list),diff.total_seconds()))
3636 files were converted to a single csv in 122.528 seconds
In [11]:
%matplotlib notebook
# Now let's check the integrity of the data by re-importing, selecting and plotting some of it
one_port_raw_data_frame=pandas.read_csv(ONE_PORT_CHKSTD_CSV)
test_subset=one_port_raw_data_frame[one_port_raw_data_frame["Device_Id"]==80]
if COMBINE_S11_S22:
    test_subset.plot(x="Frequency",y="mag")
else:
    test_subset.plot(x="Frequency",y="magS11")
plt.show()
In [44]:
# now 2 port

import_list=two_port_raw_files[:]
start_time=datetime.datetime.now()
build_csv_from_raw_script(import_list,TWO_PORT_CHKSTD_CSV,"TwoPortRawModel")
stop_time=datetime.datetime.now()
diff=stop_time-start_time
print("{0} files were converted to a single csv in {1} seconds".format(len(import_list),diff.total_seconds()))
1574 files were converted to a single csv in 31.946 seconds
In [45]:
# now 2 port NR

import_list=two_port_NR_raw_files[:]
start_time=datetime.datetime.now()
build_csv_from_raw_script(import_list,TWO_PORT_NR_CHKSTD_CSV,"TwoPortNRRawModel")
stop_time=datetime.datetime.now()
diff=stop_time-start_time
print("{0} files were converted to a single csv in {1} seconds".format(len(import_list),diff.total_seconds()))
210 files were converted to a single csv in 10.689 seconds
In [46]:
# now power

import_list=power_raw_files[:]
start_time=datetime.datetime.now()
build_csv_from_raw_script(import_list,POWER_CHKSTD_CSV,"PowerRawModel")
stop_time=datetime.datetime.now()
diff=stop_time-start_time
print("{0} files were converted to a single csv in {1} seconds".format(len(import_list),diff.total_seconds()))
1141 files were converted to a single csv in 7.684 seconds

Now that the conversion of the raw data files is finished, we need to import data from the legacy SAS database, check to see if it is in the csv file, and if it is not add it. It should be noted that the data in the SAS database has a different number of columns, so that some of them need to be translated. In addition, there is no SAS equivelent to two port NR.

Adition of data process

  1. import tables of new raw and SAS types
  2. rename any columns that are equivelent
  3. create any columns that are converted forms (dates)
  4. delete any extra columns
  5. add empty columns for undefined values
  6. exclude any that appear in new raw data set
  7. export new joined file
In [12]:
# step 1: import data sets
raw_one_port=pandas.read_csv(ONE_PORT_CHKSTD_CSV)
sas_one_port=pandas.read_csv(SAS_ONE_PORT)
In [13]:
# step 2: rename any columns that are the same with different names
same={"spid":"System_Id","SP":"Port_Used","ctype":"Connector_Type_Measurement","checkid":"Device_Id",
     "MGAMA":"magS11","PGAMA":"argS11","CON":"Connect","FREQ":"Frequency"}
if COMBINE_S11_S22:
    same["MGAMA"]="mag"
    same["PGAMA"]="arg"
new=sas_one_port.rename(columns=same)
In [14]:
# step 3: create derived columns
def date_conversion(date_sas_format):
    datetime_format=datetime.datetime.strptime(date_sas_format,'%d%b%y:%H:%M:%S')
    return datetime_format.isoformat(" ")
def to_measurement_date(date_sas_format):
    datetime_format=datetime.datetime.strptime(date_sas_format,'%d%b%y:%H:%M:%S')
    return datetime_format.strftime("%d %b %y")
def to_measurement_time(date_sas_format):
    datetime_format=datetime.datetime.strptime(date_sas_format,'%d%b%y:%H:%M:%S')
    return datetime_format.strftime("%H:%M:%S")
def to_calibration_date(date_sas_format):
    if type(date_sas_format) is StringType:
        datetime_format=datetime.datetime.strptime(str(date_sas_format),'%d%b%y:%H:%M:%S')
        return datetime_format.strftime("%d %b %y")
    else:
        return date_sas_format
new["Measurement_Timestamp"]=new["MEASDATE"].map(date_conversion)
new["Measurement_Date"]=new["MEASDATE"].map(to_measurement_date)
new["Measurement_Time"]=new["MEASDATE"].map(to_measurement_time)
new["Calibration_Date"]=new["CALDATE"].map(to_calibration_date)
if COMBINE_S11_S22:
    pass
else:
    new["magS22"]=0.0
    new["argS22"]=0.0
new["Measurement_Type"]='1-port'
In [15]:
# step 4: delete any extra columns
del new["CALDATE"]
del new["MEASDATE"]
del new["CAL"]
In [16]:
# check our progress
new[:5]
Out[16]:
Connector_Type_Measurement System_Id Device_Id Frequency Connect Port_Used mag arg Measurement_Timestamp Measurement_Date Measurement_Time Calibration_Date Measurement_Type
0 7 mm System 2,6 C07101 2.0 1 1 0.19937 35.400 1993-08-10 08:31:01 10 Aug 93 08:31:01 02 Aug 93 1-port
1 7 mm System 2,6 C07101 2.0 2 1 0.19932 35.401 1993-08-10 08:31:01 10 Aug 93 08:31:01 02 Aug 93 1-port
2 7 mm System 2,6 C07101 2.0 3 1 0.19934 35.397 1993-08-10 08:31:01 10 Aug 93 08:31:01 02 Aug 93 1-port
3 7 mm System 2,6 C07101 3.0 1 1 0.20243 -37.995 1993-08-10 08:31:01 10 Aug 93 08:31:01 02 Aug 93 1-port
4 7 mm System 2,6 C07101 3.0 2 1 0.20252 -37.986 1993-08-10 08:31:01 10 Aug 93 08:31:01 02 Aug 93 1-port
In [17]:
# step 5: add empty columns
empty_columns=[ u'Direction',  u'System_Letter',
       u'Connector_Type_Calibration', u'Program_Used', u'Program_Revision', u'Operator', u'Calibration_Name',
       u'Number_Connects',
       u'Number_Repeats', u'Nbs', u'Number_Frequencies', u'Start_Frequency',
       u'Device_Description']
for empty_column in empty_columns:
    new[empty_column]=None
In [18]:
# Now check that the column names are the same and order them
raw_columns=raw_one_port.columns
print raw_columns
new=new[raw_columns]
new_columns=new.columns
print new_columns
raw_columns==new_columns
Index([u'Frequency', u'Direction', u'Connect', u'mag', u'arg', u'System_Id',
       u'System_Letter', u'Connector_Type_Calibration',
       u'Connector_Type_Measurement', u'Measurement_Type', u'Measurement_Date',
       u'Measurement_Time', u'Program_Used', u'Program_Revision', u'Operator',
       u'Calibration_Name', u'Calibration_Date', u'Port_Used',
       u'Number_Connects', u'Number_Repeats', u'Nbs', u'Number_Frequencies',
       u'Start_Frequency', u'Device_Description', u'Device_Id',
       u'Measurement_Timestamp'],
      dtype='object')
Index([                 u'Frequency',                  u'Direction',
                          u'Connect',                        u'mag',
                              u'arg',                  u'System_Id',
                    u'System_Letter', u'Connector_Type_Calibration',
       u'Connector_Type_Measurement',           u'Measurement_Type',
                 u'Measurement_Date',           u'Measurement_Time',
                     u'Program_Used',           u'Program_Revision',
                         u'Operator',           u'Calibration_Name',
                 u'Calibration_Date',                  u'Port_Used',
                  u'Number_Connects',             u'Number_Repeats',
                              u'Nbs',         u'Number_Frequencies',
                  u'Start_Frequency',         u'Device_Description',
                        u'Device_Id',      u'Measurement_Timestamp'],
      dtype='object')
Out[18]:
array([ True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True,  True,
        True,  True,  True,  True,  True,  True,  True,  True], dtype=bool)
In [19]:
# step 6: exclude any files that already exist
unique_timestamps=raw_one_port["Measurement_Timestamp"].unique()
new=new[-new["Measurement_Timestamp"].isin(unique_timestamps)]
In [20]:
# step7: add the files and save as csv, note at this point we can write to a db also
combined=pandas.concat([raw_one_port,new])
# combined["mag"]=combined["magS11"]+combined["magS22"]
# combined["arg"]=combined["argS11"]+combined["argS22"]
# del combined["magS11"]
# del combined["magS22"]
# del combined["argS11"]
# del combined["argS22"]
# column_order=[u'Frequency', u'Direction', u'Connect', u'mag', u'arg',  u'System_Id', u'System_Letter',
#        u'Connector_Type_Calibration', u'Connector_Type_Measurement',
#        u'Measurement_Type', u'Measurement_Date', u'Measurement_Time',
#        u'Program_Used', u'Program_Revision', u'Operator', u'Calibration_Name',
#        u'Calibration_Date', u'Port_Used', u'Number_Connects',
#        u'Number_Repeats', u'Nbs', u'Number_Frequencies', u'Start_Frequency',
#        u'Device_Description', u'Device_Id', u'Measurement_Timestamp']
combined.to_csv(COMBINED_ONE_PORT_CHKSTD_CSV,index=False)
In [21]:
# Finally we check that the files are all what we expect
number_measurements_raw=len(raw_one_port["Measurement_Timestamp"].unique())
number_measurements_sas=len(sas_one_port["MEASDATE"].unique())
number_new=len(new["Measurement_Timestamp"].unique())
number_combined=len(combined["Measurement_Timestamp"].unique())
print("There were {0} measurements in the raw one port files".format(number_measurements_raw))
print("There were {0} measurements in the sas one port files".format(number_measurements_sas))
print("{0} measurements did not overlap".format(number_new))
print("This resulted in {0} unique measurements".format(number_combined))
print("The statement that the number of raw + non-overlapping measurements is equal to the number of" 
      "combined measurements is {0}, resulting in {1} rows of"
      "data".format(number_new+number_measurements_raw==number_combined,len(combined)))
There were 3407 measurements in the raw one port files
There were 1959 measurements in the sas one port files
1679 measurements did not overlap
This resulted in 5086 unique measurements
The statement that the number of raw + non-overlapping measurements is equal to the number ofcombined measurements is True, resulting in 1700761 rows ofdata
In [22]:
# show a detailed row count, showing how many values are empty
combined.count()
Out[22]:
Frequency                     1700761
Direction                     1579726
Connect                       1700761
mag                           1700761
arg                           1700761
System_Id                     1700761
System_Letter                 1579726
Connector_Type_Calibration      50468
Connector_Type_Measurement    1700761
Measurement_Type              1700761
Measurement_Date              1700761
Measurement_Time              1700761
Program_Used                  1579726
Program_Revision              1579726
Operator                      1579726
Calibration_Name               643520
Calibration_Date               171502
Port_Used                     1700761
Number_Connects               1579726
Number_Repeats                1579726
Nbs                           1579726
Number_Frequencies            1579726
Start_Frequency               1579726
Device_Description            1579726
Device_Id                     1700761
Measurement_Timestamp         1700761
dtype: int64
In [23]:
# Finaly check the data by importing it 
start_time=datetime.datetime.now()
combined_csv=pandas.read_csv(COMBINED_ONE_PORT_CHKSTD_CSV)
stop_time=datetime.datetime.now()
diff=stop_time-start_time
print("{0} files were imported as a single csv in {1} seconds".format(len(combined_csv),diff.total_seconds()))
combined_csv.count()
1700761 files were imported as a single csv in 5.585 seconds
C:\Anaconda2\lib\site-packages\IPython\core\interactiveshell.py:2902: DtypeWarning: Columns (6,7,8,12,14,15,16,23,24) have mixed types. Specify dtype option on import or set low_memory=False.
  interactivity=interactivity, compiler=compiler, result=result)
Out[23]:
Frequency                     1700761
Direction                     1579726
Connect                       1700761
mag                           1700761
arg                           1700761
System_Id                     1700761
System_Letter                 1579726
Connector_Type_Calibration      50468
Connector_Type_Measurement    1700761
Measurement_Type              1700761
Measurement_Date              1700761
Measurement_Time              1700761
Program_Used                  1579726
Program_Revision              1579726
Operator                      1579726
Calibration_Name               643520
Calibration_Date               171502
Port_Used                     1700761
Number_Connects               1579726
Number_Repeats                1579726
Nbs                           1579726
Number_Frequencies            1579726
Start_Frequency               1579726
Device_Description            1579726
Device_Id                     1700761
Measurement_Timestamp         1700761
dtype: int64
In [24]:
number_standards=len(combined_csv["Device_Id"].unique())
print("The number of 1-port check standards is {0}".format(number_standards))
The number of 1-port check standards is 292

Repeat for 2-ports

In [5]:
# todo: make this a stand alone script
# step 1: import data sets
raw_two_port=pandas.read_csv(TWO_PORT_CHKSTD_CSV)
sas_two_port=pandas.read_csv(SAS_TWO_PORT)
# step 2: rename any columns that are the same with different names
same={"spid":"System_Id","SP":"Port_Used","ctype":"Connector_Type_Measurement","checkid":"Device_Id",
     "MS11":"magS11","PS11":"argS11","PS12":"argS21","MS22":"magS22","PS22":"argS22",
      "CON":"Connect","FREQ":"Frequency"}
new=sas_two_port.rename(columns=same)
# step 3: create derived columns
def date_conversion(date_sas_format):
    datetime_format=datetime.datetime.strptime(date_sas_format,'%d%b%y:%H:%M:%S')
    return datetime_format.isoformat(" ")
def to_measurement_date(date_sas_format):
    datetime_format=datetime.datetime.strptime(date_sas_format,'%d%b%y:%H:%M:%S')
    return datetime_format.strftime("%d %b %y")
def to_measurement_time(date_sas_format):
    datetime_format=datetime.datetime.strptime(date_sas_format,'%d%b%y:%H:%M:%S')
    return datetime_format.strftime("%H:%M:%S")
def to_calibration_date(date_sas_format):
    if type(date_sas_format) is StringType:
        datetime_format=datetime.datetime.strptime(str(date_sas_format),'%d%b%y:%H:%M:%S')
        return datetime_format.strftime("%d %b %y")
    else:
        return date_sas_format
def to_linear(s12_sas_format):
    return 10.**(-1.*s12_sas_format/20.)
new["Measurement_Timestamp"]=new["MEASDATE"].map(date_conversion)
new["Measurement_Date"]=new["MEASDATE"].map(to_measurement_date)
new["Measurement_Time"]=new["MEASDATE"].map(to_measurement_time)
new["Calibration_Date"]=new["CALDATE"].map(to_calibration_date)
new["magS21"]=new["MS12"].map(to_linear)
new["Measurement_Type"]='2-port'
# step 4: delete any extra columns
del new["CALDATE"]
del new["MEASDATE"]
del new["CAL"]
del new["MS12"]
# step 5: add empty columns
empty_columns=[ u'Direction',u'System_Letter',
       u'Connector_Type_Calibration', u'Program_Used', u'Program_Revision', u'Operator', u'Calibration_Name',
       u'Number_Connects',
       u'Number_Repeats', u'Nbs', u'Number_Frequencies', u'Start_Frequency',
       u'Device_Description']
for empty_column in empty_columns:
    new[empty_column]=None
# Now check that the column names are the same and order them
raw_columns=raw_two_port.columns
print raw_columns
new=new[raw_columns]
new_columns=new.columns
print new_columns
raw_columns==new_columns
# step 6: exclude any files that already exist
unique_timestamps=raw_two_port["Measurement_Timestamp"].unique()
new=new[-new["Measurement_Timestamp"].isin(unique_timestamps)]
# step7: add the files and save as csv, note at this point we can write to a db also
combined=pandas.concat([raw_two_port,new])
combined.to_csv(COMBINED_TWO_PORT_CHKSTD_CSV,index=False)
# Finally we check that the files are all what we expect
number_measurements_raw=len(raw_two_port["Measurement_Timestamp"].unique())
number_measurements_sas=len(sas_two_port["MEASDATE"].unique())
number_new=len(new["Measurement_Timestamp"].unique())
number_combined=len(combined["Measurement_Timestamp"].unique())
print("There were {0} measurements in the raw two port files".format(number_measurements_raw))
print("There were {0} measurements in the sas two port files".format(number_measurements_sas))
print("{0} measurements did not overlap".format(number_new))
print("This resulted in {0} unique measurements".format(number_combined))
print("The statement that the number of raw + non-overlapping measurements is equal to the number of " 
      "combined measurements is {0}, resulting in {1} rows of"
      "data".format(number_new+number_measurements_raw==number_combined,len(combined)))
# Finaly check the data by importing it 
start_time=datetime.datetime.now()
combined_csv=pandas.read_csv(COMBINED_TWO_PORT_CHKSTD_CSV)
stop_time=datetime.datetime.now()
diff=stop_time-start_time
print("{0} files were imported as a single csv in {1} seconds".format(len(combined_csv),diff.total_seconds()))
number_standards=len(combined_csv["Device_Id"].unique())
print("The number of 2-port check standards is {0}".format(number_standards))
Index([u'Frequency', u'Direction', u'Connect', u'magS11', u'argS11', u'magS21',
       u'argS21', u'magS22', u'argS22', u'System_Id', u'System_Letter',
       u'Connector_Type_Calibration', u'Connector_Type_Measurement',
       u'Measurement_Type', u'Measurement_Date', u'Measurement_Time',
       u'Program_Used', u'Program_Revision', u'Operator', u'Calibration_Name',
       u'Calibration_Date', u'Port_Used', u'Number_Connects',
       u'Number_Repeats', u'Nbs', u'Number_Frequencies', u'Start_Frequency',
       u'Device_Description', u'Device_Id', u'Measurement_Timestamp'],
      dtype='object')
Index([                 u'Frequency',                  u'Direction',
                          u'Connect',                     u'magS11',
                           u'argS11',                     u'magS21',
                           u'argS21',                     u'magS22',
                           u'argS22',                  u'System_Id',
                    u'System_Letter', u'Connector_Type_Calibration',
       u'Connector_Type_Measurement',           u'Measurement_Type',
                 u'Measurement_Date',           u'Measurement_Time',
                     u'Program_Used',           u'Program_Revision',
                         u'Operator',           u'Calibration_Name',
                 u'Calibration_Date',                  u'Port_Used',
                  u'Number_Connects',             u'Number_Repeats',
                              u'Nbs',         u'Number_Frequencies',
                  u'Start_Frequency',         u'Device_Description',
                        u'Device_Id',      u'Measurement_Timestamp'],
      dtype='object')
There were 1571 measurements in the raw two port files
There were 1812 measurements in the sas two port files
1585 measurements did not overlap
This resulted in 3156 unique measurements
The statement that the number of raw + non-overlapping measurements is equal to the number of combined measurements is True, resulting in 454882 rows ofdata
454882 files were imported as a single csv in 1.803 seconds
The number of 2-port check standards is 81

Repeat for power

In [107]:
raw_power=pandas.read_csv(POWER_CHKSTD_CSV)
sas_power=pandas.read_csv(SAS_POWER)
print raw_power.columns
print sas_power.columns
Index([u'Frequency', u'Direction', u'Connect', u'magS11', u'argS11',
       u'Efficiency', u'Calibration_Factor', u'System_Id', u'System_Letter',
       u'Connector_Type_Calibration', u'Connector_Type_Measurement',
       u'Measurement_Type', u'Measurement_Date', u'Measurement_Time',
       u'Program_Used', u'Program_Revision', u'Operator', u'Calibration_Name',
       u'Calibration_Date', u'Port_Used', u'Number_Connects',
       u'Number_Repeats', u'Nbs', u'Number_Frequencies', u'Start_Frequency',
       u'Device_Description', u'Device_Id', u'Measurement_Timestamp'],
      dtype='object')
Index([u'ctype', u'spid', u'checkid', u'CALDATE', u'MEASDATE', u'FREQ', u'CON',
       u'SP', u'MGAMA', u'PGAMA', u'KP', u'EFF', u'CAL'],
      dtype='object')
In [115]:
unique_cal=raw_power["Calibration_Factor"].unique()
test=sas_power[-sas_power["KP"].isin(unique_cal)]
print test
         ctype        spid checkid           CALDATE          MEASDATE   FREQ  \
0         WR15  HIJ-WR22/1  B15P04  12MAR02:14:52:19  12MAR02:17:51:58  50.00   
1         WR15  HIJ-WR22/1  B15P04  12MAR02:14:52:19  12MAR02:17:51:58  50.00   
2         WR15  HIJ-WR22/1  B15P04  12MAR02:14:52:19  12MAR02:17:51:58  50.00   
3         WR15  HIJ-WR22/1  B15P04  12MAR02:14:52:19  12MAR02:17:51:58  50.00   
4         WR15  HIJ-WR22/1  B15P04  13MAR02:12:58:16  13MAR02:17:22:12  55.00   
5         WR15  HIJ-WR22/1  B15P04  13MAR02:12:58:16  13MAR02:17:22:12  55.00   
6         WR15  HIJ-WR22/1  B15P04  13MAR02:12:58:16  13MAR02:17:22:12  55.00   
7         WR15  HIJ-WR22/1  B15P04  13MAR02:12:58:16  13MAR02:17:22:12  55.00   
8         WR15  HIJ-WR22/1  B15P04  14MAR02:10:37:33  14MAR02:15:50:24  56.00   
9         WR15  HIJ-WR22/1  B15P04  14MAR02:10:37:33  14MAR02:15:50:24  56.00   
10        WR15  HIJ-WR22/1  B15P04  14MAR02:10:37:33  14MAR02:15:50:24  56.00   
11        WR15  HIJ-WR22/1  B15P04  14MAR02:10:37:33  14MAR02:15:50:24  56.00   
12        WR15  HIJ-WR22/1  B15P04  15MAR02:12:30:51  15MAR02:16:42:00  57.00   
13        WR15  HIJ-WR22/1  B15P04  15MAR02:12:30:51  15MAR02:16:42:00  57.00   
14        WR15  HIJ-WR22/1  B15P04  15MAR02:12:30:51  15MAR02:16:42:00  57.00   
15        WR15  HIJ-WR22/1  B15P04  15MAR02:12:30:51  15MAR02:16:42:00  57.00   
16        WR15  HIJ-WR22/1  B15P04  18MAR02:12:09:33  18MAR02:15:16:28  58.00   
17        WR15  HIJ-WR22/1  B15P04  18MAR02:12:09:33  18MAR02:15:16:28  58.00   
18        WR15  HIJ-WR22/1  B15P04  18MAR02:12:09:33  18MAR02:15:16:28  58.00   
19        WR15  HIJ-WR22/1  B15P04  18MAR02:12:09:33  18MAR02:15:16:28  58.00   
20        WR15  HIJ-WR22/1  B15P04  19MAR02:12:36:42  19MAR02:17:59:35  60.00   
21        WR15  HIJ-WR22/1  B15P04  19MAR02:12:36:42  19MAR02:17:59:35  60.00   
22        WR15  HIJ-WR22/1  B15P04  19MAR02:12:36:42  19MAR02:17:59:35  60.00   
23        WR15  HIJ-WR22/1  B15P04  19MAR02:12:36:42  19MAR02:17:59:35  60.00   
24        WR15  HIJ-WR22/1  B15P04  20MAR02:11:29:13  20MAR02:16:50:15  61.00   
25        WR15  HIJ-WR22/1  B15P04  20MAR02:11:29:13  20MAR02:16:50:15  61.00   
26        WR15  HIJ-WR22/1  B15P04  20MAR02:11:29:13  20MAR02:16:50:15  61.00   
27        WR15  HIJ-WR22/1  B15P04  20MAR02:11:29:13  20MAR02:16:50:15  61.00   
28        WR15  HIJ-WR22/1  B15P04  21MAR02:11:02:12  21MAR02:13:04:45  62.00   
29        WR15  HIJ-WR22/1  B15P04  21MAR02:11:02:12  21MAR02:13:04:45  62.00   
...        ...         ...     ...               ...               ...    ...   
139287  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.00   
139288  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.00   
139289  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.00   
139290  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.00   
139291  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.00   
139292  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.00   
139293  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.25   
139294  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.25   
139295  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.25   
139296  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.25   
139297  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.25   
139298  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.25   
139299  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.50   
139300  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.50   
139301  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.50   
139302  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.50   
139303  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.50   
139304  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.50   
139305  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.75   
139306  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.75   
139307  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.75   
139308  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.75   
139309  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.75   
139310  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  17.75   
139311  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  18.00   
139312  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  18.00   
139313  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  18.00   
139314  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  18.00   
139315  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  18.00   
139316  Type N  Direct Com  CTNP44  12NOV97:14:56:16  13NOV97:16:32:54  18.00   

        CON  SP    MGAMA    PGAMA       KP      EFF  CAL  
0         1   1  0.33224   18.345  0.61721  0.94071    1  
1         2   1  0.33072   19.292  0.61721  0.93994    1  
2         3   1  0.32938   18.440  0.61721  0.94030    1  
3         4   1  0.32905   18.623  0.61721  0.93996    1  
4         1   1  0.00783 -150.425  0.41125  0.95622    1  
5         2   1  0.00770 -146.303  0.41125  0.95633    1  
6         3   1  0.00795 -144.081  0.41125  0.95656    1  
7         4   1  0.00795 -142.485  0.41125  0.95671    1  
8         1   1  0.03898 -107.794  0.36607  0.95514    1  
9         2   1  0.04120 -109.139  0.36607  0.95479    1  
10        3   1  0.04608 -111.247  0.36607  0.95475    1  
11        4   1  0.04507 -110.581  0.36607  0.95360    1  
12        1   1  0.05327 -139.428  0.39110  0.95843    1  
13        2   1  0.05327 -140.316  0.39110  0.95886    1  
14        3   1  0.05367 -140.214  0.39110  0.95927    1  
15        4   1  0.05347 -140.707  0.39110  0.95853    1  
16        1   1  0.07439  176.432  0.37048  0.95756    1  
17        2   1  0.07441  176.305  0.37048  0.95770    1  
18        3   1  0.07431  177.947  0.37048  0.95814    1  
19        4   1  0.07480  175.822  0.37048  0.95804    1  
20        1   1  0.10205   94.512  0.33806  0.95971    1  
21        2   1  0.10111   94.802  0.33806  0.95919    1  
22        3   1  0.10283   94.991  0.33806  0.95955    1  
23        4   1  0.10220   94.227  0.33806  0.95901    1  
24        1   1  0.10873   52.417  0.35886  0.95584    1  
25        2   1  0.10895   52.056  0.35886  0.95582    1  
26        3   1  0.10866   51.806  0.35886  0.95670    1  
27        4   1  0.10918   51.938  0.35886  0.95631    1  
28        1   1  0.11330    8.554  0.33303  0.95647    1  
29        2   1  0.11385    9.139  0.33303  0.95622    1  
...     ...  ..      ...      ...      ...      ...  ...  
139287    1   1  0.13410   44.410  3.49719  0.95134    1  
139288    2   1  0.13410   44.410  3.49719  0.95173    1  
139289    3   1  0.13410   44.410  3.49719  0.95151    1  
139290    4   1  0.13410   44.410  3.49719  0.95153    1  
139291    5   1  0.13410   44.410  3.49719  0.95128    1  
139292    6   1  0.13410   44.410  3.49719  0.95106    1  
139293    1   1  0.14500   26.200  3.70532  0.95023    1  
139294    2   1  0.14500   26.200  3.70532  0.95067    1  
139295    3   1  0.14500   26.200  3.70532  0.95000    1  
139296    4   1  0.14500   26.200  3.70532  0.95076    1  
139297    5   1  0.14500   26.200  3.70532  0.95022    1  
139298    6   1  0.14500   26.200  3.70532  0.95047    1  
139299    1   1  0.15680    8.300  3.74479  0.94908    1  
139300    2   1  0.15680    8.300  3.74479  0.94964    1  
139301    3   1  0.15680    8.300  3.74479  0.94937    1  
139302    4   1  0.15680    8.300  3.74479  0.94934    1  
139303    5   1  0.15680    8.300  3.74479  0.94912    1  
139304    6   1  0.15680    8.300  3.74479  0.94890    1  
139305    1   1  0.16900   -9.590  3.64068  0.94647    1  
139306    2   1  0.16900   -9.590  3.64068  0.94661    1  
139307    3   1  0.16900   -9.590  3.64068  0.94669    1  
139308    4   1  0.16900   -9.590  3.64068  0.94747    1  
139309    5   1  0.16900   -9.590  3.64068  0.94692    1  
139310    6   1  0.16900   -9.590  3.64068  0.94672    1  
139311    1   1  0.18190  -27.390  3.26670  0.94423    1  
139312    2   1  0.18190  -27.390  3.26670  0.94625    1  
139313    3   1  0.18190  -27.390  3.26670  0.94566    1  
139314    4   1  0.18190  -27.390  3.26670  0.94597    1  
139315    5   1  0.18190  -27.390  3.26670  0.94616    1  
139316    6   1  0.18190  -27.390  3.26670  0.94617    1  

[122518 rows x 13 columns]
In [111]:
raw_power["Efficiency"].unique()
Out[111]:
array([ 0.94455,  0.94473,  0.94493, ...,  0.99964,  0.99957,  0.99956])
In [118]:
# todo: make this a stand alone script
# step 1: import data sets
raw_power=pandas.read_csv(POWER_CHKSTD_CSV)
sas_power=pandas.read_csv(SAS_POWER)
# step 2: rename any columns that are the same with different names
same={"spid":"System_Id","SP":"Port_Used","ctype":"Connector_Type_Measurement","checkid":"Device_Id",
     "MGAMA":"magS11","PGAMA":"argS11","EFF":"Efficiency","KP":"Calibration_Factor",
      "CON":"Connect","FREQ":"Frequency"}
new=sas_power.rename(columns=same)
# step 3: create derived columns
def date_conversion(date_sas_format):
    datetime_format=datetime.datetime.strptime(date_sas_format,'%d%b%y:%H:%M:%S')
    return datetime_format.isoformat(" ")
def to_measurement_date(date_sas_format):
    datetime_format=datetime.datetime.strptime(date_sas_format,'%d%b%y:%H:%M:%S')
    return datetime_format.strftime("%d %b %y")
def to_measurement_time(date_sas_format):
    datetime_format=datetime.datetime.strptime(date_sas_format,'%d%b%y:%H:%M:%S')
    return datetime_format.strftime("%H:%M:%S")
def to_calibration_date(date_sas_format):
    if type(date_sas_format) is StringType:
        datetime_format=datetime.datetime.strptime(str(date_sas_format),'%d%b%y:%H:%M:%S')
        return datetime_format.strftime("%d %b %y")
    else:
        return date_sas_format
new["Measurement_Timestamp"]=new["MEASDATE"].map(date_conversion)
new["Measurement_Date"]=new["MEASDATE"].map(to_measurement_date)
new["Measurement_Time"]=new["MEASDATE"].map(to_measurement_time)
new["Calibration_Date"]=new["CALDATE"].map(to_calibration_date)
new["Measurement_Type"]='power'
# step 4: delete any extra columns
del new["CALDATE"]
del new["MEASDATE"]
del new["CAL"]
# step 5: add empty columns
empty_columns=[ u'Direction',  u'magS22',
       u'argS22',  u'System_Letter',
       u'Connector_Type_Calibration', u'Program_Used', u'Program_Revision', u'Operator', u'Calibration_Name',
       u'Number_Connects',
       u'Number_Repeats', u'Nbs', u'Number_Frequencies', u'Start_Frequency',
       u'Device_Description']
for empty_column in empty_columns:
    new[empty_column]=None
# Now check that the column names are the same and order them
raw_columns=raw_power.columns
print raw_columns
new=new[raw_columns]
new_columns=new.columns
print new_columns
raw_columns==new_columns
# step 6: exclude any files that already exist
unique_timestamps=raw_power["Measurement_Timestamp"].unique()
new=new[-new["Measurement_Timestamp"].isin(unique_timestamps)]
# step7: add the files and save as csv, note at this point we can write to a db also
combined=pandas.concat([raw_power,new])
combined.to_csv(COMBINED_POWER_CHKSTD_CSV,index=False)
# Finally we check that the files are all what we expect
number_measurements_raw=len(raw_power["Measurement_Timestamp"].unique())
number_measurements_sas=len(sas_power["MEASDATE"].unique())
number_new=len(new["Measurement_Timestamp"].unique())
number_combined=len(combined["Measurement_Timestamp"].unique())
print("There were {0} measurements in the raw power files".format(number_measurements_raw))
print("There were {0} measurements in the sas power files".format(number_measurements_sas))
print("{0} measurements did not overlap".format(number_new))
print("This resulted in {0} unique measurements".format(number_combined))
print("The statement that the number of raw + non-overlapping measurements is equal to the number of " 
      "combined measurements is {0}, resulting in {1} rows of "
      "data".format(number_new+number_measurements_raw==number_combined,len(combined)))
# Finaly check the data by importing it 
start_time=datetime.datetime.now()
combined_csv=pandas.read_csv(COMBINED_POWER_CHKSTD_CSV)
stop_time=datetime.datetime.now()
diff=stop_time-start_time
print("{0} files were imported as a single csv in {1} seconds".format(len(combined_csv),diff.total_seconds()))
number_standards=len(combined_csv["Device_Id"].unique())
print("The number of power check standards is {0}".format(number_standards))
Index([u'Frequency', u'Direction', u'Connect', u'magS11', u'argS11',
       u'Efficiency', u'Calibration_Factor', u'System_Id', u'System_Letter',
       u'Connector_Type_Calibration', u'Connector_Type_Measurement',
       u'Measurement_Type', u'Measurement_Date', u'Measurement_Time',
       u'Program_Used', u'Program_Revision', u'Operator', u'Calibration_Name',
       u'Calibration_Date', u'Port_Used', u'Number_Connects',
       u'Number_Repeats', u'Nbs', u'Number_Frequencies', u'Start_Frequency',
       u'Device_Description', u'Device_Id', u'Measurement_Timestamp'],
      dtype='object')
Index([                 u'Frequency',                  u'Direction',
                          u'Connect',                     u'magS11',
                           u'argS11',                 u'Efficiency',
               u'Calibration_Factor',                  u'System_Id',
                    u'System_Letter', u'Connector_Type_Calibration',
       u'Connector_Type_Measurement',           u'Measurement_Type',
                 u'Measurement_Date',           u'Measurement_Time',
                     u'Program_Used',           u'Program_Revision',
                         u'Operator',           u'Calibration_Name',
                 u'Calibration_Date',                  u'Port_Used',
                  u'Number_Connects',             u'Number_Repeats',
                              u'Nbs',         u'Number_Frequencies',
                  u'Start_Frequency',         u'Device_Description',
                        u'Device_Id',      u'Measurement_Timestamp'],
      dtype='object')
There were 867 measurements in the raw power files
There were 2502 measurements in the sas power files
2400 measurements did not overlap
This resulted in 3267 unique measurements
The statement that the number of raw + non-overlapping measurements is equal to the number of combined measurements is True, resulting in 212058 rows of data
212058 files were imported as a single csv in 0.731 seconds
The number of power check standards is 91

Now we import and shape the DUT files (.asc)

In [3]:
one_port_files=[]
two_port_files=[]
power_files=[]
for root,directory,file_names in os.walk(DUT_TOP_DIRECTORY):
    #print file_names
    for file_name in file_names:
        match=re.search('.asc',file_name,re.IGNORECASE)
        
        try:
            if re.search('.txt',file_name,re.IGNORECASE):raise
            if match:
                in_file=open(os.path.join(root,file_name),'r')
                contents=in_file.read()
                in_file.close()
                if re.search('table 1',contents,re.IGNORECASE) and re.search('table 2',contents,re.IGNORECASE) and re.search('table 3',contents,re.IGNORECASE):
                    two_port_files.append(os.path.join(root,file_name))
                elif re.search('table 1',contents,re.IGNORECASE) and re.search('table 2',contents,re.IGNORECASE):
                    power_files.append(os.path.join(root,file_name))
                elif re.search('table 1',contents,re.IGNORECASE):
                    one_port_files.append(os.path.join(root,file_name))
                else:
                    pass
        except:pass
In [4]:
# check the files
print("There are %s one port calrep files"%len(one_port_files))
print("There are %s two port calrep files"%len(two_port_files))
print("There are %s power calrep files"%len(power_files))
There are 364 one port calrep files
There are 514 two port calrep files
There are 901 power calrep files
In [7]:
# We parse the file and extract Analysis_Date and Device_Id
start_time=datetime.datetime.now()
initial_file=OnePortCalrepModel(one_port_files[0])
device_id=initial_file.header[0].rstrip().lstrip()
print("{0} is {1}".format('device_id',device_id))
analysis_date=initial_file.header[1].rstrip().lstrip()
print("{0} is {1}".format('analysis_date',analysis_date))
initial_file.options["data_delimiter"]=","
initial_file.add_column(column_name='Device_Id',column_type='str',
                        column_data=[device_id for row in initial_file.data[:]])
initial_file.add_column(column_name='Analysis_Date',column_type='str',
                        column_data=[analysis_date for row in initial_file.data[:]])
#print initial_file
initial_file.header=None
initial_file.save(ONE_PORT_CALREP_CSV)
del initial_file
out_file=open(ONE_PORT_CALREP_CSV,'a')
file_list=one_port_files[1:]
for index,file_name in enumerate(file_list):
    try:
        print("Processing File Number {0}, {1}".format(index,file_name))
        one_port_table=OnePortCalrepModel(file_name)
        device_id=one_port_table.header[0].rstrip().lstrip()
        analysis_date=one_port_table.header[1].rstrip().lstrip()
        one_port_table.options["data_delimiter"]=","
        one_port_table.add_column(column_name='Device_Id',
                                  column_type='str',
                                  column_data=[device_id for row in one_port_table.data[:]])
        one_port_table.add_column(column_name='Analysis_Date',
                                  column_type='str',
                                  column_data=[analysis_date for row in one_port_table.data[:]])
        #print one_port_table
        out_file.write('\n')
        data=one_port_table.get_data_string()
        out_file.write(data)
        print one_port_table.header
        if index==len(file_list)-1:
            print("Last File")
        else:
            print("Next file is {0}".format(one_port_files[index+1]))
    except DataDimensionError:
        print("{0} was passed due to a data dimensioning problem".format(file_name))
        pass
    except AttributeError:
        print("{0} was passed due to a loading issue".format(file_name))
    except TypeError:
        print("{0} was passed due to an unkown issue".format(file_name))
    except TypeConversionError:
        print("{0} was passed due to improper number of columns".format(file_name))
    except ValueError:
        print("{0} was passed due to improper number of columns".format(file_name))
    except:raise
out_file.close()
stop_time=datetime.datetime.now()
diff=stop_time-start_time
print("{0} files were converted to a single csv in {1} seconds".format(len(file_list),diff.total_seconds()))
device_id is 02806
analysis_date is 9 Feb 2016
Processing File Number 0, C:\Share\ascii.dut\052101.asc
['052101', '10 Mar 2016', '\r']
Next file is C:\Share\ascii.dut\052101.asc
Processing File Number 1, C:\Share\ascii.dut\060127.asc
['060127', '10 Mar 2016', '\r']
Next file is C:\Share\ascii.dut\060127.asc
Processing File Number 2, C:\Share\ascii.dut\08046A.asc
['08046A', '29 Jan 2016', '\r']
Next file is C:\Share\ascii.dut\08046A.asc
Processing File Number 3, C:\Share\ascii.dut\08047A.asc
['08047A', '29 Jan 2016', '\r']
Next file is C:\Share\ascii.dut\08047A.asc
Processing File Number 4, C:\Share\ascii.dut\M105P1.asc
['M105P1', '18 Apr 2016', '\r']
Next file is C:\Share\ascii.dut\M105P1.asc
Processing File Number 5, C:\Share\ascii.dut\M105P2.asc
['M105P2', '20 Apr 2016', '\r']
Next file is C:\Share\ascii.dut\M105P2.asc
Processing File Number 6, C:\Share\ascii.dut\M110P2.asc
['M110P2', '20 Apr 2016', '\r']
Next file is C:\Share\ascii.dut\M110P2.asc
Processing File Number 7, C:\Share\ascii.dut\N101P1.asc
['N101P1', '20 Apr 2016', '\r']
Next file is C:\Share\ascii.dut\N101P1.asc
Processing File Number 8, C:\Share\ascii.dut\N101P2.asc
['N101P2', '20 Apr 2016', '\r']
Next file is C:\Share\ascii.dut\N101P2.asc
Processing File Number 9, C:\Share\ascii.dut\N110P1.asc
['N110P1', '20 Apr 2016', '\r']
Next file is C:\Share\ascii.dut\N110P1.asc
Processing File Number 10, C:\Share\ascii.dut\N35101.asc
['N35101', '26 Jan 2016', '\r']
Next file is C:\Share\ascii.dut\N35101.asc
Processing File Number 11, C:\Share\ascii.dut\NTN103.asc
['NTN103', '10 Mar 2016', '\r']
Next file is C:\Share\ascii.dut\NTN103.asc
Processing File Number 12, C:\Share\ascii.dut\NTN103.asc.old
C:\Share\ascii.dut\NTN103.asc.old was passed due to an unkown issue
Processing File Number 13, C:\Share\ascii.dut\NTN103OLD.asc
['NTN103', '11 Feb 2016', '\r']
Next file is C:\Share\ascii.dut\NTN103OLD.asc
Processing File Number 14, C:\Share\ascii.dut\NTN104.asc
['NTN104', '21 Apr 2016', '\r']
Next file is C:\Share\ascii.dut\NTN104.asc
Processing File Number 15, C:\Share\ascii.dut\NTN104_old.asc
['NTN104', '10 Mar 2016', '\r']
Next file is C:\Share\ascii.dut\NTN104_old.asc
Processing File Number 16, C:\Share\ascii.dut\2013\700150 unedited CN24.asc
['700150', '10 Jul 2013', '\r']
Next file is C:\Share\ascii.dut\2013\700150 unedited CN24.asc
Processing File Number 17, C:\Share\ascii.dut\2013\922074.asc
Convert row could not convert [12.4, 0.0482, 0.0017, 0.0015, 0.0014, 0.005, 0.0036, 1.0071, 0.00016] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\2013\922074.asc was passed due to improper number of columns
Processing File Number 18, C:\Share\ascii.dut\2013\922075.asc
Convert row could not convert [18.0, 0.0521, 0.0012, 0.0015, 0.0002, 0.0038, 0.0039, 1.0078, 0.00017] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\2013\922075.asc was passed due to improper number of columns
Processing File Number 19, C:\Share\ascii.dut\2013\922093.asc
Convert row could not convert [18.0, 0.0989, 0.0012, 0.0015, 0.0003, 0.0038, 0.0033, 1.0066, 0.00038] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\2013\922093.asc was passed due to improper number of columns
Processing File Number 20, C:\Share\ascii.dut\2013\922526.asc
Convert row could not convert [8.2, 0.1958, 0.0018, 0.0015, 0.0001, 0.0047, 0.0033, 1.0067, 0.0006] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\2013\922526.asc was passed due to improper number of columns
Processing File Number 21, C:\Share\ascii.dut\2013\922562.asc
['922562', '15 Mar 2013']
Next file is C:\Share\ascii.dut\2013\922562.asc
Processing File Number 22, C:\Share\ascii.dut\2013\922563.asc
['922563', '15 Mar 2013']
Next file is C:\Share\ascii.dut\2013\922563.asc
Processing File Number 23, C:\Share\ascii.dut\2013\922566.asc
['922566', '15 Mar 2013']
Next file is C:\Share\ascii.dut\2013\922566.asc
Processing File Number 24, C:\Share\ascii.dut\2013\922568.asc
['922568', '15 Mar 2013']
Next file is C:\Share\ascii.dut\2013\922568.asc
Processing File Number 25, C:\Share\ascii.dut\2013\922589.asc
['922589', '20 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922589.asc
Processing File Number 26, C:\Share\ascii.dut\2013\922589.asc 8510
['922589', '20 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922589.asc 8510
Processing File Number 27, C:\Share\ascii.dut\2013\922589.asc 8753
['922589', ' 6 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922589.asc 8753
Processing File Number 28, C:\Share\ascii.dut\2013\922589.asc no LFs
['922589', '20 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922589.asc no LFs
Processing File Number 29, C:\Share\ascii.dut\2013\922590.asc
['922590', '20 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922590.asc
Processing File Number 30, C:\Share\ascii.dut\2013\922590.asc 8518
['922590', '20 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922590.asc 8518
Processing File Number 31, C:\Share\ascii.dut\2013\922590.asc 8753
['922590', ' 6 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922590.asc 8753
Processing File Number 32, C:\Share\ascii.dut\2013\922590.asc no LFs
['922590', '20 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922590.asc no LFs
Processing File Number 33, C:\Share\ascii.dut\2013\922591.asc
['922591', '20 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922591.asc
Processing File Number 34, C:\Share\ascii.dut\2013\922591.asc 8510
['922591', '20 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922591.asc 8510
Processing File Number 35, C:\Share\ascii.dut\2013\922591.asc 8753
['922591', ' 6 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922591.asc 8753
Processing File Number 36, C:\Share\ascii.dut\2013\922591.asc no LFs
['922591', '20 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922591.asc no LFs
Processing File Number 37, C:\Share\ascii.dut\2013\922592.asc
['922592', '20 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922592.asc
Processing File Number 38, C:\Share\ascii.dut\2013\922592.asc 8510
['922592', '20 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922592.asc 8510
Processing File Number 39, C:\Share\ascii.dut\2013\922592.asc 8753
['922592', ' 6 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922592.asc 8753
Processing File Number 40, C:\Share\ascii.dut\2013\922592.asc no LFs
['922592', '20 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922592.asc no LFs
Processing File Number 41, C:\Share\ascii.dut\2013\922643.asc
Convert row could not convert [26.5, 0.0455, 0.0012, 0.0015, 0.0005, 0.0039, 0.0047, 1.0095, 0.00074] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\2013\922643.asc was passed due to improper number of columns
Processing File Number 42, C:\Share\ascii.dut\2013\922709.asc
['922709', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922709.asc
Processing File Number 43, C:\Share\ascii.dut\2013\922709.asc 8510
['922709', '11 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922709.asc 8510
Processing File Number 44, C:\Share\ascii.dut\2013\922709.asc 8753
['922709', ' 5 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922709.asc 8753
Processing File Number 45, C:\Share\ascii.dut\2013\922709.asc no LFs
['922709', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922709.asc no LFs
Processing File Number 46, C:\Share\ascii.dut\2013\922710.asc
['922710', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922710.asc
Processing File Number 47, C:\Share\ascii.dut\2013\922710.asc 8510
['922710', '11 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922710.asc 8510
Processing File Number 48, C:\Share\ascii.dut\2013\922710.asc 8753
['922710', ' 5 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922710.asc 8753
Processing File Number 49, C:\Share\ascii.dut\2013\922710.asc no LFs
['922710', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922710.asc no LFs
Processing File Number 50, C:\Share\ascii.dut\2013\922711.asc
['922711', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922711.asc
Processing File Number 51, C:\Share\ascii.dut\2013\922711.asc 8510
['922711', '11 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922711.asc 8510
Processing File Number 52, C:\Share\ascii.dut\2013\922711.asc 8753
['922711', ' 5 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922711.asc 8753
Processing File Number 53, C:\Share\ascii.dut\2013\922711.asc no LFs
['922711', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922711.asc no LFs
Processing File Number 54, C:\Share\ascii.dut\2013\922712.asc
['922712', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922712.asc
Processing File Number 55, C:\Share\ascii.dut\2013\922712.asc 8510
['922712', '11 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922712.asc 8510
Processing File Number 56, C:\Share\ascii.dut\2013\922712.asc 8753
['922712', ' 5 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922712.asc 8753
Processing File Number 57, C:\Share\ascii.dut\2013\922712.asc no LFs
['922712', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922712.asc no LFs
Processing File Number 58, C:\Share\ascii.dut\2013\922719.asc
['922719', '25 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922719.asc
Processing File Number 59, C:\Share\ascii.dut\2013\922720.asc
['922720', '25 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922720.asc
Processing File Number 60, C:\Share\ascii.dut\2013\922722.asc
['922722', '25 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922722.asc
Processing File Number 61, C:\Share\ascii.dut\2013\922723.asc
['922723', '25 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922723.asc
Processing File Number 62, C:\Share\ascii.dut\2013\922724.asc
['922724', '25 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922724.asc
Processing File Number 63, C:\Share\ascii.dut\2013\922725.asc
['922725', '25 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922725.asc
Processing File Number 64, C:\Share\ascii.dut\2013\922730.asc
['922730', '25 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922730.asc
Processing File Number 65, C:\Share\ascii.dut\2013\922762.asc
['922762', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922762.asc
Processing File Number 66, C:\Share\ascii.dut\2013\922762.asc 8510
['922762', '11 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922762.asc 8510
Processing File Number 67, C:\Share\ascii.dut\2013\922762.asc 8753
['922762', ' 4 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922762.asc 8753
Processing File Number 68, C:\Share\ascii.dut\2013\922763.asc
['922763', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922763.asc
Processing File Number 69, C:\Share\ascii.dut\2013\922763.asc 8510
['922763', '11 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922763.asc 8510
Processing File Number 70, C:\Share\ascii.dut\2013\922763.asc 8753
['922763', ' 5 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922763.asc 8753
Processing File Number 71, C:\Share\ascii.dut\2013\922764.asc
['922764', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922764.asc
Processing File Number 72, C:\Share\ascii.dut\2013\922764.asc 8510
['922764', '11 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922764.asc 8510
Processing File Number 73, C:\Share\ascii.dut\2013\922764.asc 8753
['922764', ' 5 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922764.asc 8753
Processing File Number 74, C:\Share\ascii.dut\2013\922765.asc
['922765', '11 Jun 2013']
Next file is C:\Share\ascii.dut\2013\922765.asc
Processing File Number 75, C:\Share\ascii.dut\2013\922765.asc 8510
['922765', '11 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922765.asc 8510
Processing File Number 76, C:\Share\ascii.dut\2013\922765.asc 8753
['922765', ' 5 Jun 2013', '\r']
Next file is C:\Share\ascii.dut\2013\922765.asc 8753
Processing File Number 77, C:\Share\ascii.dut\2014\60127.asc
['60127', '26 Jun 2014', '\r']
Next file is C:\Share\ascii.dut\2014\60127.asc
Processing File Number 78, C:\Share\ascii.dut\2014\700597.asc
['700597', '15 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\700597.asc
Processing File Number 79, C:\Share\ascii.dut\2014\700598.asc
['700598', '15 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\700598.asc
Processing File Number 80, C:\Share\ascii.dut\2014\700599.asc
['700599', '15 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\700599.asc
Processing File Number 81, C:\Share\ascii.dut\2014\700633.asc
['700633', 'Apr 9 2014', '']
Next file is C:\Share\ascii.dut\2014\700633.asc
Processing File Number 82, C:\Share\ascii.dut\2014\922032.asc
Convert row could not convert [12.4, 0.051, 0.0017, 0.0015, 0.0003, 0.0046, 0.0045, 1.009, 0.0001] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\2014\922032.asc was passed due to improper number of columns
Processing File Number 83, C:\Share\ascii.dut\2014\923072.asc
['923072', '29 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\923072.asc
Processing File Number 84, C:\Share\ascii.dut\2014\923073.asc
['923073', '29 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\923073.asc
Processing File Number 85, C:\Share\ascii.dut\2014\923074.asc
['923074', '29 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\923074.asc
Processing File Number 86, C:\Share\ascii.dut\2014\923075.asc
['923075', '29 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\923075.asc
Processing File Number 87, C:\Share\ascii.dut\2014\923076.asc
['923076', '25 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\923076.asc
Processing File Number 88, C:\Share\ascii.dut\2014\923077.asc
['923077', '25 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\923077.asc
Processing File Number 89, C:\Share\ascii.dut\2014\923078.asc
['923078', '25 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\923078.asc
Processing File Number 90, C:\Share\ascii.dut\2014\C35P09.asc
['C35P09', '31 Mar 2014', '\r']
Next file is C:\Share\ascii.dut\2014\C35P09.asc
Processing File Number 91, C:\Share\ascii.dut\2014\CTN102.asc
['CTN102', ' 8 Jan 2014', '\r']
Next file is C:\Share\ascii.dut\2014\CTN102.asc
Processing File Number 92, C:\Share\ascii.dut\2014\CTN112.asc
['CTN112', '17 Sep 2014', '\r']
Next file is C:\Share\ascii.dut\2014\CTN112.asc
Processing File Number 93, C:\Share\ascii.dut\2014\N24101.asc
['N24101', ' 3 Jan 2014', '\r']
Next file is C:\Share\ascii.dut\2014\N24101.asc
Processing File Number 94, C:\Share\ascii.dut\2014\N24102.asc
['N24102', ' 2 Jan 2014', '\r']
Next file is C:\Share\ascii.dut\2014\N24102.asc
Processing File Number 95, C:\Share\ascii.dut\2015\C22101.asc
['C22101', '27 Apr 2015', '\r']
Next file is C:\Share\ascii.dut\2015\C22101.asc
Processing File Number 96, C:\Share\ascii.dut\2015\C22102.asc
['C22102', ' 5 May 2015', '\r']
Next file is C:\Share\ascii.dut\2015\C22102.asc
Processing File Number 97, C:\Share\ascii.dut\2015\C22103.asc
['C22103', ' 5 May 2015', '\r']
Next file is C:\Share\ascii.dut\2015\C22103.asc
Processing File Number 98, C:\Share\ascii.dut\ascii.old\2001\700014.asc
Convert row could not convert [1.0, 0.0026, 0.0017, 0.0005, 0.0012, 0.0039, 0.9943, 349.75, 0.0116] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2001\700014.asc was passed due to improper number of columns
Processing File Number 99, C:\Share\ascii.dut\ascii.old\2001\700015.asc
Convert row could not convert [2.0, 0.0029, 0.0034, 0.0006, 0.0004, 0.007, 0.9913, 229.18, 0.01462] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2001\700015.asc was passed due to improper number of columns
Processing File Number 100, C:\Share\ascii.dut\ascii.old\2001\700015SS.asc
C:\Share\ascii.dut\ascii.old\2001\700015SS.asc was passed due to improper number of columns
Processing File Number 101, C:\Share\ascii.dut\ascii.old\2001\812208.asc
['812208', '18 Jan 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\812208.asc
Processing File Number 102, C:\Share\ascii.dut\ascii.old\2001\814099_1.asc
['814099', '27 Nov 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\814099_1.asc
Processing File Number 103, C:\Share\ascii.dut\ascii.old\2001\814099_2.asc
['814099', '27 Nov 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\814099_2.asc
Processing File Number 104, C:\Share\ascii.dut\ascii.old\2001\814402.asc
['814402', '12 Sep 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\814402.asc
Processing File Number 105, C:\Share\ascii.dut\ascii.old\2001\922405.asc
C:\Share\ascii.dut\ascii.old\2001\922405.asc was passed due to improper number of columns
Processing File Number 106, C:\Share\ascii.dut\ascii.old\2001\922584.asc
C:\Share\ascii.dut\ascii.old\2001\922584.asc was passed due to improper number of columns
Processing File Number 107, C:\Share\ascii.dut\ascii.old\2001\922589.asc
['922589', '31 Aug 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922589.asc
Processing File Number 108, C:\Share\ascii.dut\ascii.old\2001\922590.asc
['922590', '31 Aug 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922590.asc
Processing File Number 109, C:\Share\ascii.dut\ascii.old\2001\922591.asc
['922591', '31 Aug 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922591.asc
Processing File Number 110, C:\Share\ascii.dut\ascii.old\2001\922592.asc
['922592', '31 Aug 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922592.asc
Processing File Number 111, C:\Share\ascii.dut\ascii.old\2001\922719.asc
['922719', ' 7 Aug 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922719.asc
Processing File Number 112, C:\Share\ascii.dut\ascii.old\2001\922720.asc
['922720', ' 7 Aug 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922720.asc
Processing File Number 113, C:\Share\ascii.dut\ascii.old\2001\922721.asc
['922721', ' 7 Aug 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922721.asc
Processing File Number 114, C:\Share\ascii.dut\ascii.old\2001\922722.asc
['922722', ' 7 Aug 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922722.asc
Processing File Number 115, C:\Share\ascii.dut\ascii.old\2001\922723.asc
['922723', ' 7 Aug 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922723.asc
Processing File Number 116, C:\Share\ascii.dut\ascii.old\2001\922724.asc
['922724', ' 7 Aug 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922724.asc
Processing File Number 117, C:\Share\ascii.dut\ascii.old\2001\922725.asc
['922725', ' 7 Aug 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922725.asc
Processing File Number 118, C:\Share\ascii.dut\ascii.old\2001\922730.asc
['922730', ' 7 Aug 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\922730.asc
Processing File Number 119, C:\Share\ascii.dut\ascii.old\2001\951968.asc
Convert row could not convert [26.5, 0.0902, 0.0012, 0.0015, 0.0012, 0.0041, 0.0045, 1.0091, 0.00044] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2001\951968.asc was passed due to improper number of columns
Processing File Number 120, C:\Share\ascii.dut\ascii.old\2001\952047.asc
Convert row could not convert [8.2, 0.0902, 0.0017, 0.0015, 0.001, 0.0048, 0.0032, 1.0063, 0.00083] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2001\952047.asc was passed due to improper number of columns
Processing File Number 121, C:\Share\ascii.dut\ascii.old\2001\952048.asc
Convert row could not convert [12.4, 0.0893, 0.0017, 0.0015, 0.0007, 0.0047, 0.0017, 1.0034, 0.00051] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2001\952048.asc was passed due to improper number of columns
Processing File Number 122, C:\Share\ascii.dut\ascii.old\2001\C07108.asc
['C07108', '17 May 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\C07108.asc
Processing File Number 123, C:\Share\ascii.dut\ascii.old\2001\C07P03.asc
['C07P03', '26 May 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\C07P03.asc
Processing File Number 124, C:\Share\ascii.dut\ascii.old\2001\C14103.asc
['C14103', '20 Aug 2001', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\C14103.asc
Processing File Number 125, C:\Share\ascii.dut\ascii.old\2001\C35P05.asc
['C35P05', ' 3 Oct 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\C35P05.asc
Processing File Number 126, C:\Share\ascii.dut\ascii.old\2001\CTNP12.asc
['CTNP12', ' 2 Jun 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\CTNP12.asc
Processing File Number 127, C:\Share\ascii.dut\ascii.old\2001\CTNP13.asc
['CTNP13', ' 2 Jun 2000', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2001\CTNP13.asc
Processing File Number 128, C:\Share\ascii.dut\ascii.old\2002\700086.asc
['700086', '23 Jan 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\700086.asc
Processing File Number 129, C:\Share\ascii.dut\ascii.old\2002\700149.asc
['700149', '12 Nov 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\700149.asc
Processing File Number 130, C:\Share\ascii.dut\ascii.old\2002\921006.asc
C:\Share\ascii.dut\ascii.old\2002\921006.asc was passed due to improper number of columns
Processing File Number 131, C:\Share\ascii.dut\ascii.old\2002\921241.asc
Convert row could not convert [12.4, 0.1003, 0.0017, 0.0015, 0.0011, 0.0049, 0.004, 1.0081, 0.00015] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2002\921241.asc was passed due to improper number of columns
Processing File Number 132, C:\Share\ascii.dut\ascii.old\2002\922032.asc
Convert row could not convert [12.4, 0.0518, 0.0017, 0.0015, 0.0003, 0.0046, 0.0043, 1.0086, 0.00014] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2002\922032.asc was passed due to improper number of columns
Processing File Number 133, C:\Share\ascii.dut\ascii.old\2002\922467.asc
Convert row could not convert [8.2, 0.0478, 0.0017, 0.0015, 0.0001, 0.0046, 0.0014, 1.0028, 0.00028] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2002\922467.asc was passed due to improper number of columns
Processing File Number 134, C:\Share\ascii.dut\ascii.old\2002\922546.asc
C:\Share\ascii.dut\ascii.old\2002\922546.asc was passed due to improper number of columns
Processing File Number 135, C:\Share\ascii.dut\ascii.old\2002\922684.asc
C:\Share\ascii.dut\ascii.old\2002\922684.asc was passed due to improper number of columns
Processing File Number 136, C:\Share\ascii.dut\ascii.old\2002\922694.asc
['922694', ' 9 Sep 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\922694.asc
Processing File Number 137, C:\Share\ascii.dut\ascii.old\2002\922695.asc
['922695', ' 9 Sep 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\922695.asc
Processing File Number 138, C:\Share\ascii.dut\ascii.old\2002\922696.asc
['922696', ' 9 Sep 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\922696.asc
Processing File Number 139, C:\Share\ascii.dut\ascii.old\2002\922697.asc
['922697', ' 9 Sep 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\922697.asc
Processing File Number 140, C:\Share\ascii.dut\ascii.old\2002\C29102.asc
['C29102', '19 Feb 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\C29102.asc
Processing File Number 141, C:\Share\ascii.dut\ascii.old\2002\CN21.asc
['CN21', '22 Oct 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\CN21.asc
Processing File Number 142, C:\Share\ascii.dut\ascii.old\2002\CN26.asc
['CN26', '22 Oct 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\CN26.asc
Processing File Number 143, C:\Share\ascii.dut\ascii.old\2002\CTN106.asc
['CTN106', '19 Jun 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\CTN106.asc
Processing File Number 144, C:\Share\ascii.dut\ascii.old\2002\CTN112.asc
['CTN112', '19 Jun 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\CTN112.asc
Processing File Number 145, C:\Share\ascii.dut\ascii.old\2002\NTN103.asc
['NTN103', '19 Jun 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\NTN103.asc
Processing File Number 146, C:\Share\ascii.dut\ascii.old\2002\NTN104.asc
['NTN104', '19 Jun 2002', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2002\NTN104.asc
Processing File Number 147, C:\Share\ascii.dut\ascii.old\2003\700209.asc
C:\Share\ascii.dut\ascii.old\2003\700209.asc was passed due to improper number of columns
Processing File Number 148, C:\Share\ascii.dut\ascii.old\2003\700210.asc
C:\Share\ascii.dut\ascii.old\2003\700210.asc was passed due to improper number of columns
Processing File Number 149, C:\Share\ascii.dut\ascii.old\2003\814514.asc
['814514', ' 8 Aug 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\814514.asc
Processing File Number 150, C:\Share\ascii.dut\ascii.old\2003\922007.asc
C:\Share\ascii.dut\ascii.old\2003\922007.asc was passed due to improper number of columns
Processing File Number 151, C:\Share\ascii.dut\ascii.old\2003\922074.asc
Convert row could not convert [12.4, 0.0485, 0.0017, 0.0015, 0.0003, 0.0046, 0.0036, 1.0072, 0.00017] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2003\922074.asc was passed due to improper number of columns
Processing File Number 152, C:\Share\ascii.dut\ascii.old\2003\922392.asc
C:\Share\ascii.dut\ascii.old\2003\922392.asc was passed due to improper number of columns
Processing File Number 153, C:\Share\ascii.dut\ascii.old\2003\922526.asc
Convert row could not convert [8.2, 0.195, 0.0018, 0.0015, 0.0003, 0.0047, 0.0036, 1.0071, 0.00026] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2003\922526.asc was passed due to improper number of columns
Processing File Number 154, C:\Share\ascii.dut\ascii.old\2003\951835.asc
['951835', '12 Dec 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\951835.asc
Processing File Number 155, C:\Share\ascii.dut\ascii.old\2003\952334.asc
['952334', '12 Dec 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\952334.asc
Processing File Number 156, C:\Share\ascii.dut\ascii.old\2003\952525.asc
['952525', '25 Nov 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\952525.asc
Processing File Number 157, C:\Share\ascii.dut\ascii.old\2003\C10P02.asc
['C10P02', '11 Sep 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\C10P02.asc
Processing File Number 158, C:\Share\ascii.dut\ascii.old\2003\C10P03.asc
['C10P03', '10 Sep 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\C10P03.asc
Processing File Number 159, C:\Share\ascii.dut\ascii.old\2003\C10P04.asc
['C10P04', '10 Sep 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\C10P04.asc
Processing File Number 160, C:\Share\ascii.dut\ascii.old\2003\C10P05.asc
['C10P05', '10 Sep 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\C10P05.asc
Processing File Number 161, C:\Share\ascii.dut\ascii.old\2003\C10P06.asc
['C10P06', '10 Sep 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\C10P06.asc
Processing File Number 162, C:\Share\ascii.dut\ascii.old\2003\C10P07.asc
['C10P07', '11 Sep 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\C10P07.asc
Processing File Number 163, C:\Share\ascii.dut\ascii.old\2003\C10P08.asc
['C10P08', '10 Sep 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\C10P08.asc
Processing File Number 164, C:\Share\ascii.dut\ascii.old\2003\C62102.asc
['C62102', '12 Feb 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\C62102.asc
Processing File Number 165, C:\Share\ascii.dut\ascii.old\2003\CN08.asc
['CN08', '31 Mar 2003', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2003\CN08.asc
Processing File Number 166, C:\Share\ascii.dut\ascii.old\2004\700085.asc
['700085', '12 May 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\700085.asc
Processing File Number 167, C:\Share\ascii.dut\ascii.old\2004\700233.asc
C:\Share\ascii.dut\ascii.old\2004\700233.asc was passed due to improper number of columns
Processing File Number 168, C:\Share\ascii.dut\ascii.old\2004\700234.asc
C:\Share\ascii.dut\ascii.old\2004\700234.asc was passed due to improper number of columns
Processing File Number 169, C:\Share\ascii.dut\ascii.old\2004\700342.asc
['700342', '18 May 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\700342.asc
Processing File Number 170, C:\Share\ascii.dut\ascii.old\2004\700343.asc
['700343', '28 May 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\700343.asc
Processing File Number 171, C:\Share\ascii.dut\ascii.old\2004\700365.asc
['700365', '22 Jul 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\700365.asc
Processing File Number 172, C:\Share\ascii.dut\ascii.old\2004\700366.asc
['700366', '22 Jul 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\700366.asc
Processing File Number 173, C:\Share\ascii.dut\ascii.old\2004\700367.asc
['700367', '22 Jul 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\700367.asc
Processing File Number 174, C:\Share\ascii.dut\ascii.old\2004\700368.asc
['700368', '22 Jul 2004']
Next file is C:\Share\ascii.dut\ascii.old\2004\700368.asc
Processing File Number 175, C:\Share\ascii.dut\ascii.old\2004\700369.asc
['700369', '22 Jul 2004']
Next file is C:\Share\ascii.dut\ascii.old\2004\700369.asc
Processing File Number 176, C:\Share\ascii.dut\ascii.old\2004\700370.asc
['700370', '22 Jul 2004']
Next file is C:\Share\ascii.dut\ascii.old\2004\700370.asc
Processing File Number 177, C:\Share\ascii.dut\ascii.old\2004\700371.asc
['700371', '22 Jul 2004']
Next file is C:\Share\ascii.dut\ascii.old\2004\700371.asc
Processing File Number 178, C:\Share\ascii.dut\ascii.old\2004\700372.asc
['700372', '22 Jul 2004']
Next file is C:\Share\ascii.dut\ascii.old\2004\700372.asc
Processing File Number 179, C:\Share\ascii.dut\ascii.old\2004\700381.asc1
['700381', ' 6 Oct 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\700381.asc1
Processing File Number 180, C:\Share\ascii.dut\ascii.old\2004\811285.asc
['811285', '12 May 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\811285.asc
Processing File Number 181, C:\Share\ascii.dut\ascii.old\2004\812142.ascold
['812142', '28 Oct 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\812142.ascold
Processing File Number 182, C:\Share\ascii.dut\ascii.old\2004\921060.asc
C:\Share\ascii.dut\ascii.old\2004\921060.asc was passed due to improper number of columns
Processing File Number 183, C:\Share\ascii.dut\ascii.old\2004\922073.asc
Convert row could not convert [8.2, 0.048, 0.0017, 0.0015, 0.0016, 0.0051, 0.004, 1.0081, 5e-05] using ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float']
C:\Share\ascii.dut\ascii.old\2004\922073.asc was passed due to improper number of columns
Processing File Number 184, C:\Share\ascii.dut\ascii.old\2004\922562.asc
['922562', '22 Mar 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\922562.asc
Processing File Number 185, C:\Share\ascii.dut\ascii.old\2004\922563.asc
['922563', '22 Mar 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\922563.asc
Processing File Number 186, C:\Share\ascii.dut\ascii.old\2004\922566.asc
['922566', '22 Mar 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\922566.asc
Processing File Number 187, C:\Share\ascii.dut\ascii.old\2004\922568.asc
['922568', '22 Mar 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\922568.asc
Processing File Number 188, C:\Share\ascii.dut\ascii.old\2004\922704.asc
['922704', '11 Feb 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\922704.asc
Processing File Number 189, C:\Share\ascii.dut\ascii.old\2004\922709.asc
['922709', '22 Mar 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\922709.asc
Processing File Number 190, C:\Share\ascii.dut\ascii.old\2004\922710.asc
['922710', '22 Mar 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\922710.asc
Processing File Number 191, C:\Share\ascii.dut\ascii.old\2004\922711.asc
['922711', '22 Mar 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\922711.asc
Processing File Number 192, C:\Share\ascii.dut\ascii.old\2004\922712.asc
['922712', '22 Mar 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\922712.asc
Processing File Number 193, C:\Share\ascii.dut\ascii.old\2004\CN45.ASC
['CN45', '30 Jul 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\CN45.ASC
Processing File Number 194, C:\Share\ascii.dut\ascii.old\2004\CN52.asc
['CN52', '30 Jul 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\CN52.asc
Processing File Number 195, C:\Share\ascii.dut\ascii.old\2004\NTN1012004.asc
['NTN101', ' 9 Aug 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\NTN1012004.asc
Processing File Number 196, C:\Share\ascii.dut\ascii.old\2004\NTN102-1999.asc
['NTN102', '12 Aug 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\NTN102-1999.asc
Processing File Number 197, C:\Share\ascii.dut\ascii.old\2004\NTN102-2004.asc
['NTN102', '12 Aug 2004', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2004\NTN102-2004.asc
Processing File Number 198, C:\Share\ascii.dut\ascii.old\2005\000151.asc
['000151', '22 Apr 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\000151.asc
Processing File Number 199, C:\Share\ascii.dut\ascii.old\2005\000152.asc
['000152', '22 Apr 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\000152.asc
Processing File Number 200, C:\Share\ascii.dut\ascii.old\2005\000153.asc
['000153', '22 Apr 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\000153.asc
Processing File Number 201, C:\Share\ascii.dut\ascii.old\2005\000156.asc
['000156', '22 Apr 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\000156.asc
Processing File Number 202, C:\Share\ascii.dut\ascii.old\2005\000157.asc
['000157', '22 Apr 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\000157.asc
Processing File Number 203, C:\Share\ascii.dut\ascii.old\2005\00CN53.asc
['00CN53', '20 Sep 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\00CN53.asc
Processing File Number 204, C:\Share\ascii.dut\ascii.old\2005\700163.asc
C:\Share\ascii.dut\ascii.old\2005\700163.asc was passed due to improper number of columns
Processing File Number 205, C:\Share\ascii.dut\ascii.old\2005\700164.asc
C:\Share\ascii.dut\ascii.old\2005\700164.asc was passed due to improper number of columns
Processing File Number 206, C:\Share\ascii.dut\ascii.old\2005\700380.asc1
['700380', ' 6 Oct 2004']
Next file is C:\Share\ascii.dut\ascii.old\2005\700380.asc1
Processing File Number 207, C:\Share\ascii.dut\ascii.old\2005\700395.asc
['700395', ' 1 Feb 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\700395.asc
Processing File Number 208, C:\Share\ascii.dut\ascii.old\2005\700436.asc
['700436', '29 Sep 2005']
Next file is C:\Share\ascii.dut\ascii.old\2005\700436.asc
Processing File Number 209, C:\Share\ascii.dut\ascii.old\2005\813552.asc
['813552', '25 Feb 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\813552.asc
Processing File Number 210, C:\Share\ascii.dut\ascii.old\2005\923062.asc
['923062', ' 7 Sep 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\923062.asc
Processing File Number 211, C:\Share\ascii.dut\ascii.old\2005\952169.asc
['952169', ' 8 Jul 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\952169.asc
Processing File Number 212, C:\Share\ascii.dut\ascii.old\2005\952507.asc
['952507', ' 4 Aug 2005']
Next file is C:\Share\ascii.dut\ascii.old\2005\952507.asc
Processing File Number 213, C:\Share\ascii.dut\ascii.old\2005\952508.asc
['952508', ' 4 Aug 2005']
Next file is C:\Share\ascii.dut\ascii.old\2005\952508.asc
Processing File Number 214, C:\Share\ascii.dut\ascii.old\2005\952508.ascold
['952508', '14 Jun 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\952508.ascold
Processing File Number 215, C:\Share\ascii.dut\ascii.old\2005\953016.asc
['953016', '13 Jul 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\953016.asc
Processing File Number 216, C:\Share\ascii.dut\ascii.old\2005\C14102.asc
['C14102', '27 May 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\C14102.asc
Processing File Number 217, C:\Share\ascii.dut\ascii.old\2005\C24117.asc
['C24117', ' 7 Feb 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\C24117.asc
Processing File Number 218, C:\Share\ascii.dut\ascii.old\2005\C24119.asc
['C24119', ' 7 Feb 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\C24119.asc
Processing File Number 219, C:\Share\ascii.dut\ascii.old\2005\C24131.asc
['C24131', ' 7 Feb 2005', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2005\C24131.asc
Processing File Number 220, C:\Share\ascii.dut\ascii.old\2006\700150.asc
['700150', '15 Dec 2006', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2006\700150.asc
Processing File Number 221, C:\Share\ascii.dut\ascii.old\2006\CN24.asc
['CN24', ' 5 Dec 2006', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2006\CN24.asc
Processing File Number 222, C:\Share\ascii.dut\ascii.old\2007\000002.asc
['000002', '17 Dec 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\000002.asc
Processing File Number 223, C:\Share\ascii.dut\ascii.old\2007\000003.asc
['000003', '17 Dec 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\000003.asc
Processing File Number 224, C:\Share\ascii.dut\ascii.old\2007\000004.asc
['000004', '17 Dec 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\000004.asc
Processing File Number 225, C:\Share\ascii.dut\ascii.old\2007\000005.asc
['000005', '17 Dec 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\000005.asc
Processing File Number 226, C:\Share\ascii.dut\ascii.old\2007\100001.asc
['100001', '20 Jul 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\100001.asc
Processing File Number 227, C:\Share\ascii.dut\ascii.old\2007\100189.asc
['100189', '20 Jul 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\100189.asc
Processing File Number 228, C:\Share\ascii.dut\ascii.old\2007\100313.asc
['100313', '20 Jul 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\100313.asc
Processing File Number 229, C:\Share\ascii.dut\ascii.old\2007\100520.asc
['100520', '20 Jul 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\100520.asc
Processing File Number 230, C:\Share\ascii.dut\ascii.old\2007\15C1.asc
['15C1', '17 Dec 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\15C1.asc
Processing File Number 231, C:\Share\ascii.dut\ascii.old\2007\700004.asc
['700004', '16 Nov 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\700004.asc
Processing File Number 232, C:\Share\ascii.dut\ascii.old\2007\700085.asc
['700085', ' 9 May 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\700085.asc
Processing File Number 233, C:\Share\ascii.dut\ascii.old\2007\700380.asc
['700380', '27 Aug 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\700380.asc
Processing File Number 234, C:\Share\ascii.dut\ascii.old\2007\700487.asc
['700487', ' 9 May 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\700487.asc
Processing File Number 235, C:\Share\ascii.dut\ascii.old\2007\700487.asc 8753
['700487', '12 Feb 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\700487.asc 8753
Processing File Number 236, C:\Share\ascii.dut\ascii.old\2007\700514.asc
['700514', '20 Jul 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\700514.asc
Processing File Number 237, C:\Share\ascii.dut\ascii.old\2007\806101.asc
['806101', '23 Jul 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\806101.asc
Processing File Number 238, C:\Share\ascii.dut\ascii.old\2007\814120.asc
['814120', ' 7 Oct 2009', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\814120.asc
Processing File Number 239, C:\Share\ascii.dut\ascii.old\2007\922762.asc sys 1
['922762', '12 Feb 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\922762.asc sys 1
Processing File Number 240, C:\Share\ascii.dut\ascii.old\2007\922762.asc.t
C:\Share\ascii.dut\ascii.old\2007\922762.asc.t was passed due to an unkown issue
Processing File Number 241, C:\Share\ascii.dut\ascii.old\2007\923071.asc
['923071', '14 Aug 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\923071.asc
Processing File Number 242, C:\Share\ascii.dut\ascii.old\2007\C07104.asc
['C07104', '12 Feb 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\C07104.asc
Processing File Number 243, C:\Share\ascii.dut\ascii.old\2007\C07107.asc
['C07107', '12 Feb 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\C07107.asc
Processing File Number 244, C:\Share\ascii.dut\ascii.old\2007\C07108.asc
['C07108', '12 Feb 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\C07108.asc
Processing File Number 245, C:\Share\ascii.dut\ascii.old\2007\C07109.asc
['C07109', '12 Feb 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\C07109.asc
Processing File Number 246, C:\Share\ascii.dut\ascii.old\2007\CTN102.asc
['CTN102', '25 Jan 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\CTN102.asc
Processing File Number 247, C:\Share\ascii.dut\ascii.old\2007\CTN106.asc
['CTN106', ' 8 Feb 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\CTN106.asc
Processing File Number 248, C:\Share\ascii.dut\ascii.old\2007\CTN115.asc
['CTN115', ' 8 Feb 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\CTN115.asc
Processing File Number 249, C:\Share\ascii.dut\ascii.old\2007\HOS1.asc
['HOS1', '17 Dec 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\HOS1.asc
Processing File Number 250, C:\Share\ascii.dut\ascii.old\2007\MOS1.asc
['MOS1', '17 Dec 2007', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2007\MOS1.asc
Processing File Number 251, C:\Share\ascii.dut\ascii.old\2008\700553.asc
['700553', '3 Dec 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700553.asc
Processing File Number 252, C:\Share\ascii.dut\ascii.old\2008\700553.asc 100K-2G old 8753T
['700553', '24 Oct 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700553.asc 100K-2G old 8753T
Processing File Number 253, C:\Share\ascii.dut\ascii.old\2008\700553.asc 8510
['700553', '25 Nov 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700553.asc 8510
Processing File Number 254, C:\Share\ascii.dut\ascii.old\2008\700553.asc new 8753T
['700553', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700553.asc new 8753T
Processing File Number 255, C:\Share\ascii.dut\ascii.old\2008\700553.asc old 8753T
['700553', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700553.asc old 8753T
Processing File Number 256, C:\Share\ascii.dut\ascii.old\2008\700554.asc
['700554', '3 Dec 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700554.asc
Processing File Number 257, C:\Share\ascii.dut\ascii.old\2008\700554.asc 100K-2G old 8753T
['700554', '24 Oct 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700554.asc 100K-2G old 8753T
Processing File Number 258, C:\Share\ascii.dut\ascii.old\2008\700554.asc 8510
['700554', '25 Nov 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700554.asc 8510
Processing File Number 259, C:\Share\ascii.dut\ascii.old\2008\700554.asc new 8753T
['700554', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700554.asc new 8753T
Processing File Number 260, C:\Share\ascii.dut\ascii.old\2008\700554.asc old 8753T
['700554', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700554.asc old 8753T
Processing File Number 261, C:\Share\ascii.dut\ascii.old\2008\700555.asc
['700555', '3 Dec 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700555.asc
Processing File Number 262, C:\Share\ascii.dut\ascii.old\2008\700555.asc 100K-2G old 8753T
['700555', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700555.asc 100K-2G old 8753T
Processing File Number 263, C:\Share\ascii.dut\ascii.old\2008\700555.asc 8510
['700555', '25 Nov 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700555.asc 8510
Processing File Number 264, C:\Share\ascii.dut\ascii.old\2008\700555.asc new 8753T
['700555', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700555.asc new 8753T
Processing File Number 265, C:\Share\ascii.dut\ascii.old\2008\700555.asc old 8753T
['700555', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700555.asc old 8753T
Processing File Number 266, C:\Share\ascii.dut\ascii.old\2008\700556.asc
['700556', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700556.asc
Processing File Number 267, C:\Share\ascii.dut\ascii.old\2008\700556.asc 100K-2G old 8753T
['700556', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700556.asc 100K-2G old 8753T
Processing File Number 268, C:\Share\ascii.dut\ascii.old\2008\700556.asc 8510
['700556', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700556.asc 8510
Processing File Number 269, C:\Share\ascii.dut\ascii.old\2008\700556.asc new 8753T
['700556', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700556.asc new 8753T
Processing File Number 270, C:\Share\ascii.dut\ascii.old\2008\700556.asc old 8753T
['700556', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700556.asc old 8753T
Processing File Number 271, C:\Share\ascii.dut\ascii.old\2008\700557.asc
['700557', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700557.asc
Processing File Number 272, C:\Share\ascii.dut\ascii.old\2008\700557.asc 100K-2G old 8753T
['700557', '24 Oct 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700557.asc 100K-2G old 8753T
Processing File Number 273, C:\Share\ascii.dut\ascii.old\2008\700557.asc 8510
['700557', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700557.asc 8510
Processing File Number 274, C:\Share\ascii.dut\ascii.old\2008\700557.asc new 8753T
['700557', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700557.asc new 8753T
Processing File Number 275, C:\Share\ascii.dut\ascii.old\2008\700557.asc old 8753T
['700557', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700557.asc old 8753T
Processing File Number 276, C:\Share\ascii.dut\ascii.old\2008\700558.asc
['700558', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700558.asc
Processing File Number 277, C:\Share\ascii.dut\ascii.old\2008\700558.asc 100K-2G old 8753T
['700558', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700558.asc 100K-2G old 8753T
Processing File Number 278, C:\Share\ascii.dut\ascii.old\2008\700558.asc 8510
['700558', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700558.asc 8510
Processing File Number 279, C:\Share\ascii.dut\ascii.old\2008\700558.asc new 8753T
['700558', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\700558.asc new 8753T
Processing File Number 280, C:\Share\ascii.dut\ascii.old\2008\700558.asc old 8753T
['700558', '25 Nov 2008']
Next file is C:\Share\ascii.dut\ascii.old\2008\700558.asc old 8753T
Processing File Number 281, C:\Share\ascii.dut\ascii.old\2008\923072.asc
['923072', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\923072.asc
Processing File Number 282, C:\Share\ascii.dut\ascii.old\2008\923073.asc
['923073', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\923073.asc
Processing File Number 283, C:\Share\ascii.dut\ascii.old\2008\923074.asc
['923074', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\923074.asc
Processing File Number 284, C:\Share\ascii.dut\ascii.old\2008\923075.asc
['923075', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\923075.asc
Processing File Number 285, C:\Share\ascii.dut\ascii.old\2008\923076.asc
['923076', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\923076.asc
Processing File Number 286, C:\Share\ascii.dut\ascii.old\2008\923077.asc
['923077', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\923077.asc
Processing File Number 287, C:\Share\ascii.dut\ascii.old\2008\923078.asc
['923078', ' 3 Dec 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\923078.asc
Processing File Number 288, C:\Share\ascii.dut\ascii.old\2008\CTNP11.asc
['CTNP11', '21 Aug 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\CTNP11.asc
Processing File Number 289, C:\Share\ascii.dut\ascii.old\2008\CTNP12.asc
['CTNP12', '21 Aug 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\CTNP12.asc
Processing File Number 290, C:\Share\ascii.dut\ascii.old\2008\CTNP13.asc
['CTNP13', '21 Aug 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\CTNP13.asc
Processing File Number 291, C:\Share\ascii.dut\ascii.old\2008\FS0119.asc
['FS0119', '24 Jul 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\FS0119.asc
Processing File Number 292, C:\Share\ascii.dut\ascii.old\2008\LOADC2.asc
['LOADC2', ' 5 Jun 2008', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2008\LOADC2.asc
Processing File Number 293, C:\Share\ascii.dut\ascii.old\2009\951556.asc
C:\Share\ascii.dut\ascii.old\2009\951556.asc was passed due to improper number of columns
Processing File Number 294, C:\Share\ascii.dut\ascii.old\2009\952042.asc
C:\Share\ascii.dut\ascii.old\2009\952042.asc was passed due to improper number of columns
Processing File Number 295, C:\Share\ascii.dut\ascii.old\2009\952157.asc
C:\Share\ascii.dut\ascii.old\2009\952157.asc was passed due to improper number of columns
Processing File Number 296, C:\Share\ascii.dut\ascii.old\2009\953005.asc
['953005', '17 Dec 2009', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2009\953005.asc
Processing File Number 297, C:\Share\ascii.dut\ascii.old\2009\CTN102.asc
['CTN102', '10 Jun 2009', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2009\CTN102.asc
Processing File Number 298, C:\Share\ascii.dut\ascii.old\2009\CTN112.asc
['CTN112', '29 Oct 2009', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2009\CTN112.asc
Processing File Number 299, C:\Share\ascii.dut\ascii.old\2009\CTN114.asc
['CTN114', '10 Jun 2009', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2009\CTN114.asc
Processing File Number 300, C:\Share\ascii.dut\ascii.old\2010\700375.asc
['700375', '25 Mar 2010', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2010\700375.asc
Processing File Number 301, C:\Share\ascii.dut\ascii.old\2010\923062.asc
['923062', ' 6 Jan 2010', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2010\923062.asc
Processing File Number 302, C:\Share\ascii.dut\ascii.old\2010\952505.asc
C:\Share\ascii.dut\ascii.old\2010\952505.asc was passed due to improper number of columns
Processing File Number 303, C:\Share\ascii.dut\ascii.old\2010\953042.asc
['953042', ' 6 Jan 2010', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2010\953042.asc
Processing File Number 304, C:\Share\ascii.dut\ascii.old\2010\CTN112.asc
['CTN112', '15 Dec 2010', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2010\CTN112.asc
Processing File Number 305, C:\Share\ascii.dut\ascii.old\2010\ISODLD.asc
['ISODLD', '15 Dec 2010', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2010\ISODLD.asc
Processing File Number 306, C:\Share\ascii.dut\ascii.old\2010\ISOELB.asc
['ISOELB', '15 Dec 2010', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2010\ISOELB.asc
Processing File Number 307, C:\Share\ascii.dut\ascii.old\2010\ISOELD.asc
['ISOELD', '15 Dec 2010', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2010\ISOELD.asc
Processing File Number 308, C:\Share\ascii.dut\ascii.old\2010\ISOESH.asc
['ISOESH', '15 Dec 2010', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2010\ISOESH.asc
Processing File Number 309, C:\Share\ascii.dut\ascii.old\2011\700395.asc
['700395', '29 Aug 2011', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2011\700395.asc
Processing File Number 310, C:\Share\ascii.dut\ascii.old\2011\700597.asc
['700597', '30 Aug 2011', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2011\700597.asc
Processing File Number 311, C:\Share\ascii.dut\ascii.old\2011\700598.asc
['700598', '30 Aug 2011', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2011\700598.asc
Processing File Number 312, C:\Share\ascii.dut\ascii.old\2011\700599.asc
['700599', '30 Aug 2011', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2011\700599.asc
Processing File Number 313, C:\Share\ascii.dut\ascii.old\2012\926A21.asc
['926A2', '28 Feb 2012', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2012\926A21.asc
Processing File Number 314, C:\Share\ascii.dut\ascii.old\2012\926A22.asc
['926A2', '28 Feb 2012', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2012\926A22.asc
Processing File Number 315, C:\Share\ascii.dut\ascii.old\2012\926A23.asc
['926A2', '28 Feb 2012', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2012\926A23.asc
Processing File Number 316, C:\Share\ascii.dut\ascii.old\2012\926A24.asc
['926A2', '28 Feb 2012', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2012\926A24.asc
Processing File Number 317, C:\Share\ascii.dut\ascii.old\2012\926A25.asc
['926A2', '28 Feb 2012', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2012\926A25.asc
Processing File Number 318, C:\Share\ascii.dut\ascii.old\2012\926A26.asc
['926A2', '28 Feb 2012', '\r']
Next file is C:\Share\ascii.dut\ascii.old\2012\926A26.asc
Processing File Number 319, C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922562.asc.asc
['922562', ' 6 Mar 2013']
Next file is C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922562.asc.asc
Processing File Number 320, C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922563.asc.asc
['922563', '12 Mar 2013']
Next file is C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922563.asc.asc
Processing File Number 321, C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922566.asc
['922566', '15 Mar 2013', '\r']
Next file is C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922566.asc
Processing File Number 322, C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922566H.asc.asc
['922566', '12 Mar 2013']
Next file is C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922566H.asc.asc
Processing File Number 323, C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922566Q.asc
['922566', '15 Mar 2013', '\r']
Next file is C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922566Q.asc
Processing File Number 324, C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922568.asc.asc
['922568', ' 6 Mar 2013']
Next file is C:\Share\ascii.dut\ascii.old\Army Data Temp Hold\922568.asc.asc
Processing File Number 325, C:\Share\ascii.dut\ILC SOLT 2013\L1CL4.asc
['L1CL4', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC SOLT 2013\L1CL4.asc
Processing File Number 326, C:\Share\ascii.dut\ILC SOLT 2013\L1CL5.asc
['L1CL5', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC SOLT 2013\L1CL5.asc
Processing File Number 327, C:\Share\ascii.dut\ILC SOLT 2013\L1CL6.asc
['L1CL6', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC SOLT 2013\L1CL6.asc
Processing File Number 328, C:\Share\ascii.dut\ILC SOLT 2013\L1CL7.asc
['L1CL7', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC SOLT 2013\L1CL7.asc
Processing File Number 329, C:\Share\ascii.dut\ILC SOLT 2013\L2CL4.asc
['L2CL4', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC SOLT 2013\L2CL4.asc
Processing File Number 330, C:\Share\ascii.dut\ILC SOLT 2013\L2CL5.asc
['L2CL5', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC SOLT 2013\L2CL5.asc
Processing File Number 331, C:\Share\ascii.dut\ILC SOLT 2013\L2CL6.asc
['L2CL6', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC SOLT 2013\L2CL6.asc
Processing File Number 332, C:\Share\ascii.dut\ILC SOLT 2013\L2CL7.asc
['L2CL7', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC SOLT 2013\L2CL7.asc
Processing File Number 333, C:\Share\ascii.dut\ILC TRL 2013\L1CL4.asc
['L1CL4', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC TRL 2013\L1CL4.asc
Processing File Number 334, C:\Share\ascii.dut\ILC TRL 2013\L1CL5.asc
['L1CL5', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC TRL 2013\L1CL5.asc
Processing File Number 335, C:\Share\ascii.dut\ILC TRL 2013\L1CL6.asc
['L1CL6', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC TRL 2013\L1CL6.asc
Processing File Number 336, C:\Share\ascii.dut\ILC TRL 2013\L1CL7.asc
['L1CL7', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC TRL 2013\L1CL7.asc
Processing File Number 337, C:\Share\ascii.dut\ILC TRL 2013\L2CL4.asc
['L2CL4', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC TRL 2013\L2CL4.asc
Processing File Number 338, C:\Share\ascii.dut\ILC TRL 2013\L2CL5.asc
['L2CL5', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC TRL 2013\L2CL5.asc
Processing File Number 339, C:\Share\ascii.dut\ILC TRL 2013\L2CL6.asc
['L2CL6', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC TRL 2013\L2CL6.asc
Processing File Number 340, C:\Share\ascii.dut\ILC TRL 2013\L2CL7.asc
['L2CL7', '17 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC TRL 2013\L2CL7.asc
Processing File Number 341, C:\Share\ascii.dut\ILC Type N 2013\ILCN1.asc
['ILCN1', '31 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC Type N 2013\ILCN1.asc
Processing File Number 342, C:\Share\ascii.dut\ILC Type N 2013\ILCN2.asc
['ILCN2', '31 May 2013', '\r']
Next file is C:\Share\ascii.dut\ILC Type N 2013\ILCN2.asc
Processing File Number 343, C:\Share\ascii.dut\Old\08046A.asc
['08046A', '23 Dec 2015', '\r']
Next file is C:\Share\ascii.dut\Old\08046A.asc
Processing File Number 344, C:\Share\ascii.dut\Old\08047A.asc
['08047A', '23 Dec 2015', '\r']
Next file is C:\Share\ascii.dut\Old\08047A.asc
Processing File Number 345, C:\Share\ascii.dut\Old\700437.asc
['700437\t\t\t\t\t\t\t\t\t\t', '16-Oct-15\t\t\t\t\t\t\t\t\t\t']
Next file is C:\Share\ascii.dut\Old\700437.asc
Processing File Number 346, C:\Share\ascii.dut\Old\700437.asc_tmf
['700437', '16 Oct 2015', '\r']
Next file is C:\Share\ascii.dut\Old\700437.asc_tmf
Processing File Number 347, C:\Share\ascii.dut\Old\C35104.asc
['C35104', ' 7 Jan 2016', '\r']
Next file is C:\Share\ascii.dut\Old\C35104.asc
Processing File Number 348, C:\Share\ascii.dut\Old\CTN112.asc
['CTN112', '20 Nov 2015', '\r']
Next file is C:\Share\ascii.dut\Old\CTN112.asc
Processing File Number 349, C:\Share\ascii.dut\Old\N35102.asc
['N35102', '26 Jan 2016', '\r']
Next file is C:\Share\ascii.dut\Old\N35102.asc
Processing File Number 350, C:\Share\ascii.dut\Temp Holding\N35101.asc
['N35101', '23 Dec 2015', '\r']
Next file is C:\Share\ascii.dut\Temp Holding\N35101.asc
Processing File Number 351, C:\Share\ascii.dut\Temp Holding\N35102.asc
['N35102', '23 Dec 2015', '\r']
Next file is C:\Share\ascii.dut\Temp Holding\N35102.asc
Processing File Number 352, C:\Share\ascii.dut\Trans 082711\1870.asc
['1870', '18 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\1870.asc
Processing File Number 353, C:\Share\ascii.dut\Trans 082711\47141.asc
['47141', '18 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\47141.asc
Processing File Number 354, C:\Share\ascii.dut\Trans 082711\47144.asc
['47144', '18 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\47144.asc
Processing File Number 355, C:\Share\ascii.dut\Trans 082711\69329.asc 5-18-2011
['69329', '18 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\69329.asc 5-18-2011
Processing File Number 356, C:\Share\ascii.dut\Trans 082711\69333.asc
['69333', '18 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\69333.asc
Processing File Number 357, C:\Share\ascii.dut\Trans 082711\71369.asc
['71369', '18 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\71369.asc
Processing File Number 358, C:\Share\ascii.dut\Trans 082711\82306.asc
['82306', '18 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\82306.asc
Processing File Number 359, C:\Share\ascii.dut\Trans 082711\83087.asc 5-18-2011
['83087', '18 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\83087.asc 5-18-2011
Processing File Number 360, C:\Share\ascii.dut\Trans 082711\83646.asc 5-18-2011
['83646', '18 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\83646.asc 5-18-2011
Processing File Number 361, C:\Share\ascii.dut\Trans 082711\8807A.asc
['8807A', '16 May 2011', '\r']
Next file is C:\Share\ascii.dut\Trans 082711\8807A.asc
Processing File Number 362, C:\Share\ascii.dut\Trans 082711\C07109.asc
['C07109', '11 Feb 2011', '\r']
Last File
363 files were converted to a single csv in 3.936 seconds
In [8]:
#check the data
one_port_calrep_data_frame=pandas.read_csv(ONE_PORT_CALREP_CSV)
one_port_calrep_data_frame[:20]
Out[8]:
Frequency mag uMb uMa uMd uMg arg uAb uAa uAd uAg Device_Id Analysis_Date
0 0.05 1.0002 0.0020 0.0005 0.0001 0.0041 -2.21 0.11 0.03 0.00 0.23 02806 9 Feb 2016
1 0.06 1.0002 0.0019 0.0005 0.0000 0.0039 -2.67 0.11 0.03 0.00 0.22 02806 9 Feb 2016
2 0.07 1.0001 0.0018 0.0005 0.0000 0.0038 -3.12 0.11 0.03 0.00 0.22 02806 9 Feb 2016
3 0.08 1.0000 0.0018 0.0005 0.0000 0.0037 -3.58 0.10 0.03 0.00 0.21 02806 9 Feb 2016
4 0.09 1.0000 0.0018 0.0005 0.0000 0.0037 -4.03 0.10 0.03 0.00 0.21 02806 9 Feb 2016
5 0.10 1.0001 0.0018 0.0005 0.0001 0.0037 -4.49 0.10 0.03 0.02 0.21 02806 9 Feb 2016
6 0.15 1.0001 0.0017 0.0005 0.0000 0.0036 -6.73 0.10 0.03 0.00 0.21 02806 9 Feb 2016
7 0.20 1.0000 0.0017 0.0005 0.0000 0.0036 -8.98 0.10 0.03 0.00 0.20 02806 9 Feb 2016
8 0.25 0.9999 0.0017 0.0005 0.0000 0.0035 -11.23 0.10 0.03 0.00 0.20 02806 9 Feb 2016
9 0.30 1.0000 0.0017 0.0005 0.0000 0.0035 -13.49 0.10 0.03 0.00 0.20 02806 9 Feb 2016
10 0.35 1.0001 0.0017 0.0005 0.0000 0.0035 -15.74 0.10 0.03 0.00 0.20 02806 9 Feb 2016
11 0.40 1.0000 0.0017 0.0005 0.0000 0.0035 -17.99 0.10 0.03 0.00 0.20 02806 9 Feb 2016
12 0.45 1.0000 0.0017 0.0005 0.0000 0.0035 -20.25 0.10 0.03 0.00 0.20 02806 9 Feb 2016
13 0.50 1.0000 0.0017 0.0005 0.0000 0.0035 -22.50 0.10 0.03 0.00 0.20 02806 9 Feb 2016
14 0.55 1.0000 0.0017 0.0005 0.0000 0.0035 -24.75 0.10 0.03 0.00 0.20 02806 9 Feb 2016
15 0.60 1.0001 0.0017 0.0005 0.0000 0.0035 -27.01 0.10 0.03 0.00 0.20 02806 9 Feb 2016
16 0.65 1.0001 0.0017 0.0005 0.0000 0.0035 -29.26 0.10 0.03 0.00 0.20 02806 9 Feb 2016
17 0.70 1.0000 0.0017 0.0005 0.0000 0.0035 -31.52 0.10 0.03 0.00 0.20 02806 9 Feb 2016
18 0.75 1.0000 0.0017 0.0005 0.0000 0.0035 -33.77 0.10 0.03 0.00 0.20 02806 9 Feb 2016
19 0.80 0.9999 0.0017 0.0005 0.0000 0.0035 -36.03 0.10 0.03 0.00 0.20 02806 9 Feb 2016
In [12]:
# We parse the file and extract Analysis_Date and Device_Id
PRINT_REPORT=False
start_time=datetime.datetime.now()
initial_file=TwoPortCalrepModel(two_port_files[0])
device_id=initial_file.joined_table.header[0].rstrip().lstrip()
if PRINT_REPORT:
    print("{0} is {1}".format('device_id',device_id))
try:
    analysis_date=initial_file.joined_table.header[1].rstrip().lstrip()
except:
    analysis_date=""
if PRINT_REPORT:
    print("{0} is {1}".format('analysis_date',analysis_date))
initial_file.joined_table.options["data_delimiter"]=","
initial_file.joined_table.add_column(column_name='Device_Id',column_type='str',
                        column_data=[device_id for row in initial_file.joined_table.data[:]])
initial_file.joined_table.add_column(column_name='Analysis_Date',column_type='str',
                        column_data=[analysis_date for row in initial_file.joined_table.data[:]])
#print initial_file
initial_file.joined_table.header=None
initial_file.joined_table.save(TWO_PORT_CALREP_CSV)
del initial_file
out_file=open(TWO_PORT_CALREP_CSV,'a')
file_list=two_port_files[1:]
for index,file_name in enumerate(file_list):
    try:
        if PRINT_REPORT:
            print("Processing File Number {0}, {1}".format(index,file_name))
        two_port_table=TwoPortCalrepModel(file_name)
        device_id=two_port_table.joined_table.header[0].rstrip().lstrip()
        if PRINT_REPORT:
            print("{0} is {1}".format('device_id',device_id))
        try:
            analysis_date=two_port_table.joined_table.header[1].rstrip().lstrip()
        except:
            analysis_date=""
        if PRINT_REPORT:
            print("{0} is {1}".format('analysis_date',analysis_date))
        two_port_table.joined_table.options["data_delimiter"]=","
        two_port_table.joined_table.add_column(column_name='Device_Id',column_type='str',
                                column_data=[device_id for row in two_port_table.joined_table.data[:]])
        two_port_table.joined_table.add_column(column_name='Analysis_Date',column_type='str',
                                column_data=[analysis_date for row in two_port_table.joined_table.data[:]])
        out_file.write("\n")
        data=two_port_table.joined_table.get_data_string()
        out_file.write(data)
        if PRINT_REPORT:
            print two_port_table.joined_table.header
            if index==len(file_list)-1:
                print("Last File")
            else:
                print("Next file is {0}".format(two_port_files[index+1]))
    except DataDimensionError:
        print("{0} was passed due to a data dimensioning problem".format(file_name))
        pass
    except AttributeError:
        print("{0} was passed due to a loading issue".format(file_name))
    except TypeError:
        print("{0} was passed due to an unkown issue".format(file_name))
    except TypeConversionError:
        print("{0} was passed due to improper number of columns".format(file_name))
    except ValueError:
        print("{0} was passed due to improper number of columns".format(file_name))
    except:raise
out_file.close()
stop_time=datetime.datetime.now()
diff=stop_time-start_time
print("{0} files were converted to a single csv in {1} seconds".format(len(file_list),diff.total_seconds()))
C:\Share\ascii.dut\ascii.old\2003\922596.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2003\922597.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2003\922598.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2003\922599.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2003\922600.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2003\922601.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700312.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700313.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700314.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700315.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700316.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700318.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700324.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700325.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700326.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700327.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2004\700364l.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2008\06364.asc was passed due to a data dimensioning problem
C:\Share\ascii.dut\ascii.old\2010\C15205.1asc was passed due to a loading issue
C:\Share\ascii.dut\Old\N35202.asc.bak was passed due to a loading issue
C:\Share\ascii.dut\Old\N35202.asc.old was passed due to a loading issue
513 files were converted to a single csv in 22.242 seconds
In [14]:
#check the data
two_port_calrep_data_frame=pandas.read_csv(TWO_PORT_CALREP_CSV)
two_port_calrep_data_frame[:10]
Out[14]:
Frequency magS11 uMbS11 uMaS11 uMdS11 uMgS11 argS11 uAbS11 uAaS11 uAdS11 ... uMaS22 uMdS22 uMgS22 argS22 uAbS22 uAaS22 uAdS22 uAgS22 Device_Id Analysis_Date
0 0.10 0.0023 0.0035 0.0005 0.0001 0.0072 29.36 64.96 0.03 1.38 ... 0.0005 0.0000 0.0072 21.67 72.00 0.03 2.02 144.01 000146 29 Jan 2016
1 0.15 0.0024 0.0035 0.0005 0.0000 0.0070 23.03 61.07 0.03 0.62 ... 0.0005 0.0001 0.0070 17.13 60.29 0.03 1.56 120.60 000146 29 Jan 2016
2 0.20 0.0029 0.0034 0.0005 0.0001 0.0069 16.46 53.31 0.03 0.90 ... 0.0005 0.0001 0.0069 8.78 57.30 0.03 0.53 114.60 000146 29 Jan 2016
3 0.25 0.0029 0.0034 0.0005 0.0001 0.0069 6.96 52.45 0.03 1.24 ... 0.0005 0.0001 0.0069 3.49 51.65 0.03 0.75 103.29 000146 29 Jan 2016
4 0.30 0.0033 0.0034 0.0005 0.0001 0.0069 0.93 48.14 0.03 1.22 ... 0.0005 0.0001 0.0069 -3.51 50.43 0.03 0.85 100.86 000146 29 Jan 2016
5 0.35 0.0033 0.0034 0.0005 0.0001 0.0069 -5.09 48.05 0.03 1.39 ... 0.0005 0.0001 0.0069 -11.51 47.75 0.03 0.78 95.49 000146 29 Jan 2016
6 0.40 0.0034 0.0034 0.0005 0.0001 0.0069 -10.24 47.13 0.03 1.48 ... 0.0005 0.0001 0.0069 -18.81 46.78 0.03 0.54 93.55 000146 29 Jan 2016
7 0.45 0.0034 0.0034 0.0005 0.0000 0.0068 -17.22 47.31 0.03 1.74 ... 0.0005 0.0001 0.0068 -26.59 45.48 0.03 0.83 90.96 000146 29 Jan 2016
8 0.50 0.0035 0.0034 0.0005 0.0000 0.0068 -25.99 45.65 0.03 1.75 ... 0.0005 0.0001 0.0068 -31.45 45.88 0.03 0.65 91.76 000146 29 Jan 2016
9 0.55 0.0036 0.0034 0.0005 0.0000 0.0068 -30.70 45.58 0.03 1.90 ... 0.0005 0.0001 0.0068 -40.05 45.54 0.03 0.71 91.09 000146 29 Jan 2016

10 rows × 33 columns

In [5]:
# First we separate the power list between the two types
power_3term=[]
power_4term=[]
for file_name in power_files:
    try:
        new_table=PowerCalrepModel(file_name)
        number_columns=len(new_table.joined_table.data[0])
        if number_columns == 19:
            power_3term.append(file_name)
        elif number_columns ==21:
            power_4term.append(file_name)
        else:
            print("{0} does not conform".format(file_name))
    except:
        print("{0} caused an error".format(file_name))
        pass
C:\Share\ascii.dut\ascii.old\700095.ASC dircomp caused an error
begin_line is 36
C:\Share\ascii.dut\ascii.old\2001\811671low.ASC caused an error
C:\Share\ascii.dut\ascii.old\2001\812069low.asc caused an error
C:\Share\ascii.dut\ascii.old\2002\814116.ASC caused an error
C:\Share\ascii.dut\ascii.old\2003\700194.ASC caused an error
C:\Share\ascii.dut\ascii.old\2003\810004.asc caused an error
C:\Share\ascii.dut\ascii.old\2003\813504.ASC caused an error
C:\Share\ascii.dut\ascii.old\2003\814114.ASC caused an error
C:\Share\ascii.dut\ascii.old\2003\814115.ASC caused an error
C:\Share\ascii.dut\ascii.old\2006\700091.ASC.1-5MHzdircompbiased caused an error
C:\Share\ascii.dut\ascii.old\2006\700091.ASCdircomp .0001-.100 GHzbiased caused an error
C:\Share\ascii.dut\ascii.old\2006\700185.ascBIASED caused an error
C:\Share\ascii.dut\ascii.old\2006\700185.ascUNbiased caused an error
C:\Share\ascii.dut\ascii.old\2006\700237low.asc biased Aug. 24,2006 caused an error
C:\Share\ascii.dut\ascii.old\2006\700237low.asc biased Nov. 3, 2006 caused an error
C:\Share\ascii.dut\ascii.old\2006\700390.ASC caused an error
C:\Share\ascii.dut\ascii.old\2006\813462.asc Aug. 2006 caused an error
C:\Share\ascii.dut\ascii.old\2006\813462low.ascAug. 2006 caused an error
C:\Share\ascii.dut\ascii.old\2006\86214.low asc 2-15-2006 caused an error
C:\Share\ascii.dut\ascii.old\2006\953023.ASC100-12.4 caused an error
C:\Share\ascii.dut\ascii.old\2007\700398.asc Clrp7.1 unbiased caused an error
C:\Share\ascii.dut\ascii.old\2008\700174.ASC 12.2dc caused an error
C:\Share\ascii.dut\ascii.old\2008\700174.ASC 12.2dcA caused an error
C:\Share\ascii.dut\ascii.old\2008\700546.ASC 12.2dc caused an error
C:\Share\ascii.dut\ascii.old\2008\700546.ASC 12.2dcA caused an error
C:\Share\ascii.dut\ascii.old\2008\952480low.asc 6ports Oct. 30, 2008 caused an error
C:\Share\ascii.dut\ascii.old\2009\700385.asc 8.5 caused an error
C:\Share\ascii.dut\ascii.old\2009\700576.ASC Dec. 2009 caused an error
C:\Share\ascii.dut\ascii.old\2010\915355.ASC .1-18 GHz caused an error
C:\Share\ascii.dut\ascii.old\2011\700373.oASC caused an error
C:\Share\ascii.dut\ascii.old\2011\700515.oASC caused an error
C:\Share\ascii.dut\ascii.old\2012\700606.ASC Mar. 2012 caused an error
begin_line is 18
C:\Share\ascii.dut\eff tests\CN49.asc caused an error
begin_line is 18
C:\Share\ascii.dut\eff tests\CN50.asc caused an error
C:\Share\ascii.dut\Old\700579.ASC caused an error
C:\Share\ascii.dut\Trans 082711\700581.ASC calrep 12.2dc caused an error
C:\Share\ascii.dut\Trans 082711\700581.ASC calrep 14.1dc caused an error
C:\Share\ascii.dut\Trans 082711\700581.ASC orig calrep 14.1dc caused an error
In [6]:
print("There are {0} three term files".format(len(power_3term)))
print("There are {0} four term files".format(len(power_4term)))
There are 694 three term files
There are 169 four term files
In [21]:
def power_calrep_to_csv_script(power_file_list,output_file,print_report=False):
    """ Script converts all of the files in power_file_list to a single csv file (output_file).
    Option to print a detailed report print_report=True."""
    PRINT_REPORT=print_report
    # start timer for analysis
    start_time=datetime.datetime.now()
    # seed file for format and column naems
    initial_file=PowerCalrepModel(power_file_list[0])
    # device id assumed to be the first line of header
    device_id=initial_file.joined_table.header[0].rstrip().lstrip()
    if PRINT_REPORT:
        print("{0} is {1}".format('device_id',device_id))
    # try and find the date in the header, since some of the dates are in different lines
    # flatten the header and remove the device id. If it fails just make analysis date blank.
    try:
        header=string_list_collapse(initial_file.joined_table.header[:],string_delimiter="")
        header=header.rstrip().lstrip().replace(device_id,"")
        analysis_date=header
    except:
        analysis_date=""
    if PRINT_REPORT:
        print("{0} is {1}".format('analysis_date',analysis_date))
    # insure that the data delimiter is a comma.
    initial_file.joined_table.options["data_delimiter"]=","
    # Add columns with device id and analysis date in them
    initial_file.joined_table.add_column(column_name='Device_Id',column_type='str',
                            column_data=[device_id for row in initial_file.joined_table.data[:]])
    initial_file.joined_table.add_column(column_name='Analysis_Date',column_type='str',
                            column_data=[analysis_date for row in initial_file.joined_table.data[:]])

                                         
    #print initial_file
    # remove the header for output purposes
    initial_file.joined_table.header=None
    # save the seed file with column names in csv format
    initial_file.joined_table.save(output_file)
    # clear the object from memory
    del initial_file    
    # now the initial write is completed open the file in append mode and append each file 
    out_file=open(output_file,'a')
    for index,file_name in enumerate(power_file_list):
        try:
            if PRINT_REPORT:
                print("Processing File Number {0}, {1}".format(index,file_name))
            table=PowerCalrepModel(file_name)
            # device id assumed to be the first line of header
            device_id=table.joined_table.header[0].rstrip().lstrip()
            if PRINT_REPORT:
                print("{0} is {1}".format('device_id',device_id))
            # try and find the date in the header, since some of the dates are in different lines
            # flatten the header and remove the device id. If it fails just make analysis date blank.
            try:
                header=string_list_collapse(table.joined_table.header[:],string_delimiter="")
                header=header.rstrip().lstrip().replace(device_id,"")
                analysis_date=header
            except:
                analysis_date=""
            if PRINT_REPORT:
                print("{0} is {1}".format('analysis_date',analysis_date))
            # insure that the data delimiter is a comma.
                print table.joined_table.header 
                print table.joined_table.column_names
                print("Data is {0} rows x {1} columns".format(len(table.joined_table.data),
                                                              len(table.joined_table.data[0])))
                if index==len(power_file_list)-1:
                    print("Last File")
                else:
                    print("Next file is {0}".format(power_files[index+1]))
            table.joined_table.options["data_delimiter"]=","
            # Add columns with device id and analysis date in them
            table.joined_table.add_column(column_name='Device_Id',column_type='str',
                                    column_data=[device_id for row in table.joined_table.data[:]])
            table.joined_table.add_column(column_name='Analysis_Date',column_type='str',
                                    column_data=[analysis_date for row in table.joined_table.data[:]])
            # write to out_file
            out_file.write("\n")
            data=table.joined_table.get_data_string()
            out_file.write(data)
        except DataDimensionError:
            print("{0} was passed due to a data dimensioning problem".format(file_name))
            pass
        except AttributeError:
            print("{0} was passed due to a loading issue".format(file_name))
        except ValueError:
            print("{0} was passed due to a column size issue".format(file_name))
        except:raise
    # Close out the script
    out_file.close()
    stop_time=datetime.datetime.now()
    diff=stop_time-start_time
    print("{0} files were converted to a single csv in {1} seconds".format(len(power_file_list),diff.total_seconds()))
                                         
In [22]:
# run the script for the two power types
power_calrep_to_csv_script(power_3term,POWER_3TERM_CALREP_CSV,print_report=False)
694 files were converted to a single csv in 10.393 seconds
In [23]:
power_calrep_to_csv_script(power_4term,POWER_4TERM_CALREP_CSV)
169 files were converted to a single csv in 1.16 seconds