Hi, I have a problem with merging data: on auto download, the server always creates a backup.
Is there a secret attribute that LoggerNet looks at whilst checking the file?
For that matter, what does LoggerNet look at when appending data into a .dat file?
I built a script in Excel to take a new .dat file data set, where I have added more columns to a table (say, from 2 columns to 4), merge the files, and drop the result back into the download location. This works repeatedly in a test setup with a CR6 logger. My only limitation is that the first timestamp in the new master .dat must be later than the last timestamp of the previous .dat file.
I can rearrange the table columns, or remove some in the new file, and the downloads still carry on with the newly modified file containing the merged time-period data (same column header, e.g. temp), now covering two time periods from an updated table.
In testing it worked over and over.
In practice, when I tried to use it on a CR1000X with 1-hour data (24 columns), if I edit/add one row or add a group of hourly data with matching columns, the next download fails and creates a backup.
The only difference I can see is that we have daylight saving; currently the logger is 1 hour behind the PC system timestamps. Can the editing time have an effect or trip an error?
I.e. do Created, Modified and Accessed have to be in order, or an error will occur?
Note: I have even tried joining two backups and new data into one file, where both contain perfect timestamps, record sequences and columns, and this still fails.
Why does everything work when I test on the bench, but fail in the real world? I can't find any difference in method.
Thanks for any help or advice in advance.
Whenever LoggerNet finds it necessary to create a backup of a data file because the file headers do not match, it will record a message in its Transaction log (Tran$.log, Tran1.log, ...) noting the event as well as the reason for that decision. I assume that the format that you are using here is the default of TOA5. The checks that are done are as follows:
- The number of fields in the first line must match the expectation for the file format. For TOA5 files, this line must have at least 8 fields.
- The file type designation must match. For TOA5, the first field of the first line must be "TOA5".
- The number of column names (second line in the header) must match.
- Each column name must match.
- The number of units fields in the third line of the header must match.
- The number of process fields in the fourth line of the header must match.
Essentially, the purpose of these checks is to ensure that the header of an already existing file adequately describes the data that we want to append to that data file.
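Outside LoggerNet, those checks can be approximated with a small script. The sketch below is in Python rather than VBA, purely as an illustration of the rules above (it is not Campbell Scientific code), and assumes each header is passed as the file's first four lines of TOA5 text:

```python
import csv

def toa5_headers_compatible(existing_header, new_header):
    """Apply the append checks listed above to two TOA5 headers,
    each given as a list of the file's first four lines:
    environment line, column names, units, process fields.
    Returns (ok, reason)."""
    parse = lambda line: next(csv.reader([line]))
    env_old, env_new = parse(existing_header[0]), parse(new_header[0])
    # Check 1: the environment line needs at least 8 fields
    if len(env_new) < 8:
        return False, "environment line has fewer than 8 fields"
    # Check 2: the file type designation must be TOA5
    if env_old[0] != "TOA5" or env_new[0] != "TOA5":
        return False, "file type designation is not TOA5"
    # Checks 3 and 4: column counts and every column name must match
    names_old, names_new = parse(existing_header[1]), parse(new_header[1])
    if len(names_old) != len(names_new):
        return False, "column counts differ"
    for a, b in zip(names_old, names_new):
        if a != b:
            return False, "column name mismatch: %r vs %r" % (a, b)
    # Checks 5 and 6: units and process lines need matching field counts
    for i, what in ((2, "units"), (3, "process")):
        if len(parse(existing_header[i])) != len(parse(new_header[i])):
            return False, what + " field counts differ"
    return True, "headers compatible"
```

A merged file that passes a check like this should append cleanly; one that fails it will trigger the backup behaviour described above.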
Hi, thanks for the guidance.
The source of the problem lay in corruption of the .dat file, either by manually opening/editing/saving/closing it or by the XLSM import.
The problem was due to the UNITS set via the ET table output: the superscript square (²) was getting damaged and not transferred through a basic manual edit or the XLSM import.
The issue in the XLS source lay in the open-and-import line, where the XLSM Origin defaulted to "437 OEM United States".
I used "65000 Unicode (UTF-7)" and this kept the square formatting correct.
I looked at "65001 Unicode (UTF-8)", but this alters the formatting; "Windows US ASCII" also alters the formatting.
Windows (ANSI) appears to keep the formatting, although I haven't tested it yet.
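The effect of those Origin settings can be reproduced outside Excel. As an illustration (Python here rather than VBA, and assuming the .dat file stores the units line as UTF-8 bytes), decoding the superscript with a mismatched code page is exactly what damages it:

```python
units = "MJ/m²"
raw = units.encode("utf-8")   # b'MJ/m\xc2\xb2': the superscript is two bytes
# Decoding those bytes with the wrong code page mangles the character:
print(raw.decode("cp1252"))   # MJ/mÂ²  (a stray A-like character appears)
print(raw.decode("cp437"))    # MJ/m┬▓  (OEM United States, Origin 437)
# Decoding with the matching code page round-trips cleanly:
print(raw.decode("utf-8"))    # MJ/m²
```

Whichever Origin matches the bytes actually in the file is the one that preserves the square; any other setting rewrites it on save.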
For those who are interested, here is part of my XLSM import script for merging .dat files. It imports the main .dat file twice, once delimited and once not, to preserve the logger timestamp formatting; the important part is the Workbooks.OpenText call and the Origin setting.
The correct Origin may be one of the other options, if CS can advise on that; otherwise "65000 Unicode (UTF-7)" works.
Sub GetMasterDAT()
    Set src = Worksheets("Home").Range("A1:B10")
    With src
        .ClearContents
    End With
    Set src = Worksheets("Home").Range("H5:H10")
    With src
        .ClearContents
    End With
    Set src = Worksheets("Home").Range("C4")
    With src
        With src.Interior
            .Color = 65280
            .Pattern = xlSolid
        End With
    End With
    ' Remove all sheets except "Home" before importing
    Dim xWs As Worksheet
    Application.DisplayAlerts = False
    For Each xWs In Application.ActiveWorkbook.Worksheets
        If xWs.Name <> "Home" Then
            xWs.Delete
        End If
    Next
    Application.DisplayAlerts = True
    On Error GoTo ErrorHandler
    ' First import: delimited, with Origin set so the units survive
    Workbooks.OpenText Filename:=Application.GetOpenFilename _
        , Origin:=65000, StartRow:=1, DataType:=xlDelimited, TextQualifier:=xlDoubleQuote _
        , ConsecutiveDelimiter:=False, Tab:=True, Semicolon:=False, Comma:=True _
        , Space:=False, Other:=False, TrailingMinusNumbers:=True
    ActiveSheet.Move Before:=Workbooks("mergeDAT.xlsm").Sheets(1)
    masterDAT = ActiveSheet.Name
    Sheets("Home").Select
    Range("A3").Select
    Selection.Offset(1, 0) = masterDAT
    Selection.Offset(1, 1) = "masterDAT"
    ' Second import: not delimited, to preserve the logger timestamp formatting
    Workbooks.OpenText Filename:=masterDAT & ".dat", Origin:=65000 _
        , StartRow:=1, DataType:=xlDelimited, TextQualifier:=xlNone, _
        ConsecutiveDelimiter:=False, Tab:=False, Semicolon:=False, Comma:=False _
        , Space:=False, Other:=False, FieldInfo:=Array(1, 1), _
        TrailingMinusNumbers:=True
    ActiveSheet.Move Before:=Workbooks("mergeDAT.xlsm").Sheets(1)
    masterDAT2 = ActiveSheet.Name
    Sheets("Home").Select
    Range("A4").Select
    Selection.Offset(1, 0) = masterDAT2
    Selection.Offset(1, 1) = "masterDAT2"
    With src
        With src.Interior
            .Color = 3381555
            .Pattern = xlSolid
        End With
    End With
    MsgBox "Success: " & masterDAT & " imported"
    Set src = Worksheets("Home").Range("C8")
    With src
        With src.Interior
            .Color = 65280
            .Pattern = xlSolid
        End With
    End With
    Exit Sub
ErrorHandler:
    With src
        With src.Interior
            .ColorIndex = 3
            .Pattern = xlSolid
        End With
    End With
    MsgBox "ERROR: DAT File collection failed"
    Exit Sub
End Sub
Ouch, to add to the confusion: I just merged my main .dat files and, whilst checking the units, found that I had to revert to "65001 Unicode (UTF-8)".
I don't know why, but a fresh custom download of a few records, merged with an older-timestamp file, reacted differently to the "MJ/m²" (it came through as MJ/mA²) using Origin 65000 UTF-7, so I reverted to UTF-8 and the .dat file imported as "MJ/m²".
Something seems to be flip-flopping, or am I dealing with damaged .dat files?
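One way to answer that question is to compare the header bytes of a merged file against a known-good backup before dropping it into the download location. A rough sketch (a hypothetical Python helper, not part of the XLSM script; it reads raw bytes so that re-encoded characters show up even when both files look the same in Excel):

```python
def header_diff(path_a, path_b, header_lines=4):
    """Compare the first header lines of two TOA5 files byte-for-byte.
    Returns a list of (line_number, bytes_a, bytes_b) for lines that
    differ, so an encoding change (e.g. a damaged superscript) is
    visible before LoggerNet rejects the file and creates a backup."""
    def read_header(path):
        with open(path, "rb") as f:
            return [f.readline().rstrip(b"\r\n") for _ in range(header_lines)]
    return [
        (n, a, b)
        for n, (a, b) in enumerate(
            zip(read_header(path_a), read_header(path_b)), start=1
        )
        if a != b
    ]
```

If the only difference reported is on the units line, the data itself is probably intact and the damage happened during the edit/import round trip.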
Of course the main problem is with custom Unicode characters in headers and units.
Personally, I would find it useful to have a units on/off output option for table calcs like the ET calc, which outputs the units automatically.
These files are used in Vista Data Vision continuous data sets. When I update programmes, adding and removing columns in tables, this sort of problem causes a lot of frustration. It has taken two weeks to find and sort this one.