Working with CSV files

#1 (In Topic #375)
stevedee is in the usergroup ‘Regular’
The CSV (Comma Separated Values) file is my favourite format when working with data; it's simple to understand and it's been around forever. CSV files can be opened in any text editor or imported into a spreadsheet.

Each value is separated from the next by a comma, except the last value on a line. Multiline CSV files normally contain ordered data, where each line represents a data record and each value is the data in a field. The first line may contain field headers.
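As an illustration, a three-field file like the one used in the example project later in this thread might look like this (the data rows here are invented):

Code

name,rank,serialNo
"Mainwaring","Captain",23
"Wilson","Sergeant",24
"Pike","Private",25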

I must admit I've almost always written code from scratch to deal with CSVs. I seem to find it relaxing working with string commands like Mid, InStr & Replace.

However, I really don't need to do this every time because there is a CsvFile class hidden away in gb.Util. I've only just bothered to look at it, so I thought I'd share my notes.

In a new project, I started by creating a text file and adding some CSV data. I then declared an object based on the CsvFile class and added my code to a Button_Click():

Code (gambas)

Public myCSVfile As CsvFile

Public Sub Button1_Click()

  Dim aFields As String[]
  Dim iRecords As Integer
  Dim colRecords As New Collection

  myCSVfile = New CsvFile(Application.Path & "/" & "csvTest.csv")
  Do Until myCSVfile.Eof
    colRecords[myCSVfile.Line] = myCSVfile.Read()   'read every line into a collection
    Inc iRecords                                    'count the number of records
  Loop
  aFields = myCSVfile.Fields      'get the field headers (i.e. "name", "rank" & "serialNo")

As you can see, you only need a few lines of code to 'decode' the CSV file and make the data easily accessible.

Then I wrote some nonsense to display the data:

Code (gambas)

  'display some data from the CSV file
  Dim thisRecord As Integer

  thisRecord = Rnd(iRecords) + 2  '1st line=1, but this is the header row, so offset by 2
  Me.Text = "Dad's Army Roll Call: " & "Field count: " & aFields.Count & ", Record count: " & iRecords
  Label1.Text = aFields[0] & ": " & colRecords[CStr(thisRecord)]["rank"]
  Label1.Text &= " " & colRecords[CStr(thisRecord)]["name"]
  Label1.Text &= " #" & colRecords[CStr(thisRecord)]["serialno"]

End

If your delimited file doesn't use commas, no problem!
You can specify the delimiter when you instantiate the object:-

Code (gambas)

myHashSVFile = New CsvFile ( Path As String [ , Separator As String, Escape As String ] )
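For instance, to read a semicolon-separated file with double quotes as the escape character, you might write something like this (the file name here is just an illustration):

Code (gambas)

myHashSVFile = New CsvFile(Application.Path & "/" & "data.csv", ";", Chr(34))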



I've also attached an oven-ready project, including the CSV file, to save you some time.

#2
GrayGhost is in the usergroup ‘Enthusiast’
Thank you for the work and the listing… you must have been looking over my shoulder, watching my struggle with file handling. I will be looking at your code to see how I can use it :D

At the risk of insulting the teacher  :o
I was informed a few days ago to use   &/  instead of  & "/" &

http://gambaswiki.org/wiki/lang/stringop?nh

#3
stevedee is in the usergroup ‘Regular’

grayghost4 said

 … you must have been looking over my shoulder watching my struggle with file handling…
Talking of file handling, take a look at the File class in gb.Util. You can get the contents of a file like this:-

Code (gambas)

Public myFile As File
...
  myFile = New File
  TextArea1.Text = myFile.Load(Application.Path & "/csvTest.csv")

…and if you edit it, you can save it like this:-

Code (gambas)

  myFileCopy.Save(Application.Path & "/csvTest.copy", TextArea1.Text)

Really simple! Much easier than the old way.

…I was informed a few days ago to use   &/  instead of  & "/" &

Don't you worry, I have a very thick skin!

Neither way is right or wrong. I often prefer "wordy" to "terse" code. Long live:-

Code (gambas)

Let integerWithAVeryLongName = integerWithAVeryLongName + 1

So just go with what you are happy with.

#4
GrayGhost is in the usergroup ‘Enthusiast’
:evil: Thank you for your post, I have changed my program to use a CSV file.
It took a while to learn to implement it, but it has made loading a group of boxes on the form a snap. :D
I think I am starting to learn a little OOP. :D
It seems that rich text is not supported here; I tried to use <small> but it did not work.

I have discovered that you don't have to count the records as you read them:
the property myCSVfile.Line gives the last read line. So

Code (gambas)

iRecords = myCSVfile.Line
'  then

For icount = 2 To iRecords   ' + 1 not needed
  ComboBox1.Add(colRecords[CStr(icount)]["accname"])
  ComboBox1.Insert("************")
Next

index = ComboBox1.Index + 2
For Each oObj In accountinfoBoxes.Children
  oObj.Text = colRecords[index][aFields[icount]]
  Inc icount
Next

#5
Serban is in the usergroup ‘Regular’
Thanks!
I am a huge fan of text files and string manipulation.
Most of the work done by DirLister is, in fact, string manipulation.
I'm thinking about how to write a book on the solutions applied in DirLister.
There is a lot of code, and lots of solutions that can be copy/pasted. Well, almost "copy/paste".
Why do I need the ideas outlined here?
I'm planning to move to a different level with DirLister: build a statistics module to show me aggregated data on the time spent on the various operations, the amount of data processed and, finally, the total time consumed. Maybe even convert that into some currency, to show the money saved on storage media with DirLister.
Unfortunately, there is still a long way to go to get there…
Greetings! :)

“The only thing necessary for the triumph of evil is for good men to do nothing.” ― Edmund Burke
It's easy to die for an idea. It is way harder to LIVE for your idea! (Me)

#6
cogier is in the usergroup ‘Guru’
I am a fan of CSV files. Over time I have discovered the power of Gambas, so I can now load a CSV file, ignoring blank lines and dealing with the inverted commas, in one line.

Code (gambas)

Dim sData As String[] = Split(File.Load("MyFile.csv"), gb.NewLine, Chr(34), True)
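As a hedged sketch of what you might then do with the result (the field positions are assumed), each loaded line can itself be split into fields with the same escape character:

Code (gambas)

Dim sLine As String

For Each sLine In sData
  ' split each record into its fields; Chr(34) keeps quoted commas intact
  Print Split(sLine, ",", Chr(34))[0]
Next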

#7
Serban is in the usergroup ‘Regular’

cogier said

[…] in one line.

Code (gambas)

Dim sData As String[] = Split(File.Load("MyFile.csv"), gb.NewLine, Chr(34), True)
I guess this is the only efficient approach. Now we can cycle through the items and sort them according to our needs; as [sData] is suitable for any kind of loop, we can apply multiple filtering conditions, up to the point we want to reach.
The filtering code might get a little fat, but this does the job! :D

“The only thing necessary for the triumph of evil is for good men to do nothing.” ― Edmund Burke
It's easy to die for an idea. It is way harder to LIVE for your idea! (Me)

#8
01McAc is in the usergroup ‘Regular’
I have a question re CSV files, so I need to reopen this old thread:
when I read the CSV file (myCSVfile.Read()), the first line with the headers is completely in lower case, although the first letter of each word is an uppercase letter. The DB refuses to update the data, as the field names are case-sensitive in sqlite3. Any ideas?

Code

For Each sFile In Dir(DirChooser1.SelectedPath, "*.exp").Sort()

  I = 0
  Debug sFile & "  Len: " & Stat(DirChooser1.SelectedPath &/ sFile).Size
  myCSVfile = New CsvFile(DirChooser1.SelectedPath &/ sFile)
  Do Until myCSVfile.Eof
    colRecords[myCSVfile.Line] = myCSVfile.Read()   'read every line into a collection
    Inc iRecords                                    'count the number of records
  Loop

  aFields = myCSVfile.Fields      'get the field headers (i.e. "name", "rank" & "serialNo")

  Debug "Colcounts: " & colRecords.Count

  For ii = 2 To colRecords.Count + 1                ' go through every line
    sSQL = "SELECT " & aFields[0] & " FROM Lenses WHERE " & aFields[0] & " = " & colRecords[CStr(ii)][aFields[0]]
    Debug "SQL: " & sSQL
    $Rec = DBS.$Con.Exec(sSQL)
    If $Rec.Available Then ' change/edit record
      $Query = " ID = &1"
      $Rec = DBS.$Con.Edit("Lenses", $Query, colRecords[CStr(ii)][aFields[0]])
    Else 'new record
      $Rec = DBS.$Con.Create("Lenses")
    Endif

    For i = 0 To myCSVfile.Fields.Count - 1         ' go through the whole data set within the line
      Debug "Field: " & aFields[i] & "  Content: " & colRecords[CStr(ii)][aFields[i]]
      If aFields[i] <> "DateCreated" And aFields[i] <> "DateChanged" Then
        $Rec[aFields[i]] = colRecords[CStr(ii)][aFields[i]]
      Endif
    Next
  Next

#9
PJBlack is in the usergroup ‘Enthusiast’
something like so ?

Code (gambas)

Print Chr(Asc(Left(sString, 1)) - 32) & Right(sString, -1)

#10
01McAc is in the usergroup ‘Regular’
Looks good, but it doesn't work for fields like "DateChanged" or "LensVariants". Most of my DB fields have just one uppercase letter at the beginning, but some have two. My fault, I should have used just one naming convention in the DB design. Too late to change it.

#11
stevedee is in the usergroup ‘Regular’

01McAc said

…Too late to change it…

…then I think you will just have to access the headers like this:-

Code (gambas)

  strHeaders = File.Load("/home/steve/Gambas/camera_or_whatever.csv")
  strHeaders = Mid(strHeaders, 1, InStr(strHeaders, "\n"))

…then carry on using the CSV code.

For File.Load you will need gb.Util
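If you then need the individual header names, the loaded line can be split into an array; a minimal sketch, assuming plain comma-separated headers with no embedded quotes:

Code (gambas)

Dim aOriginalHeaders As String[]

aOriginalHeaders = Split(Trim(strHeaders), ",")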

#12
01McAc is in the usergroup ‘Regular’
Thank you, works like a charm.

#13
01McAc is in the usergroup ‘Regular’
One more question re the format.

The original CSV file looks like this (text is always in double quotes):
ID,ID_Lens,Purchase_date,Purchase_price,Purchase_serial_no,Purchase_from,Purchase_lens_colour,Purchase_lens_condition,Purchase__comment,DateCreated,DateChanged
14,100,2019-04-25,"700€",159357,"Fotoshop Berlin",Black,Rosetta,,"2021-02-25 10:41:37","2021-02-25 10:42:51"


When I read the lines with ..

Code (gambas)

Do Until myCSVfile.Eof
  colRecords[myCSVfile.Line] = myCSVfile.Read()  'read every line into a collection

…there are obviously no more double quotes in colRecords[myCSVfile.Line]. Can I get a line of the CSV file with the original double quotes?

#14
stevedee is in the usergroup ‘Regular’

01McAc said

…Can I get a line of the CSV file with the original double quotes?

I haven't looked into this in any detail, but I don't think you can preserve quotes ( " ) using the CsvFile class. If you simply read the file using the File.Load method, you could work through the data looking for quotes, but it's starting to become a Horrible Hack rather than a nicely designed program.

If I were you I would take a second look at the database, and assess how much work would be required to correct the problems there. How did you populate it with quoted strings? Did you first create the data in a spreadsheet, then export as a quoted string csv file?

My other suggestion is that you take a look at the Quote() method. Maybe, if you know which fields are quoted, you could re-quote them before sending them back to the database… but that also sounds like a pretty dreadful idea to me.

#15
01McAc is in the usergroup ‘Regular’
stevedee,
thanks for your time and effort. A short follow-up to describe the intention: the simple idea is to export new and changed data from tables into a CSV file, using sqlite3 syntax, starting from a specific date, because a new version of my little project Lens Register has been released.

My code generates the following shell script …

Code (gambas)

cmdLine = "sqlite3 -header -csv \"" & myDBLocation & "\"  \"SELECT * FROM " & sElement & " WHERE DateChanged > '"
Shell cmdLine
… and then exports the data into a CSV file with quotes. This is a very simple and fast way to generate a CSV file. Unfortunately, the structure of the CSV file (e.g. quotes, headers etc.) is static and cannot be changed in sqlite.

After installing a new Lens Register release, which includes the standard database, users can easily import the data they have added to the database since the last release. But this turned out not to be so simple :shock:
sqlite provides a (non-standard SQL) UPSERT syntax. An UPSERT is an ordinary INSERT statement followed by the special ON CONFLICT clause: it inserts a record and, in case of a PK violation, it updates the record instead. Exactly what I need, but the fields and values have to be assigned one by one and, depending on the values (integer or string), they need quotes, e.g.:
INSERT INTO Lenses (id, focallength, lensmaker_id)
  VALUES(7, '50mm', 3)
  ON CONFLICT(id)
  DO UPDATE SET
    focallength = excluded.focallength,
    lensmaker_id = excluded.lensmaker_id;

I presume that I can do a lookup on the table (Lenses), check the field names and types (string or integer) and save them in a local array. In the case of a string field, I manually add quotes to the UPSERT statement.
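One way to do that lookup is sqlite's PRAGMA table_info, which returns the name and declared type of every column. A rough, untested sketch, reusing the DBS.$Con connection object from the code above:

Code (gambas)

Dim $Rec As Result
Dim sTextFields As New String[]

$Rec = DBS.$Con.Exec("PRAGMA table_info(Lenses)")
For Each $Rec
  ' collect the names of the TEXT columns, so their values can be quoted later
  If $Rec!type = "TEXT" Then sTextFields.Add($Rec!name)
Next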

#16
Quincunxian is in the usergroup ‘Expert’
Extract data to a CSV file with the table field names as headers.

KeyTable is the name of the target table in the database
AG.DateFormatFiles = "YYYYMMDD"
The rest should be fairly obvious.

Code (gambas)

Public Sub ExtractDataAsCSV()

  Dim OutFile As File
  Dim TmpLine As String
  Dim Delimiter As String = ","
  Dim $Rec As Result
  Dim TmpTable As Table
  Dim TmpField As Field

  OutFile = Open AG.ApplicationFolder &/ KeyTable & "_" & Format(Now, AG.DateFormatFiles) & ".csv" For Create

  'Use the field names as the column headers
  TmpTable = DB.$Con.Tables[KeyTable]
  For Each TmpField In TmpTable.Fields
    TmpLine &= TmpField.Name & Delimiter
  Next
  Print #OutFile, TmpLine

  $Rec = DB.$Con.Exec("SELECT * FROM " & KeyTable)
  'Write the records
  If Not IsNull($Rec) And $Rec.Available Then
    For Each $Rec
      TmpLine = Str($Rec!Id) & Delimiter
      TmpLine &= Chr(34) & $Rec!LinkTypeName & Chr(34) & Delimiter  'Enclose in quotes so that any commas in the text are managed
      TmpLine &= Chr(34) & $Rec!Information & Chr(34) & Delimiter
      TmpLine &= Str($Rec!DateCreated) & Delimiter
      TmpLine &= Str($Rec!DateChanged)
      Print #OutFile, TmpLine
    Next
  Endif
  Close #OutFile

End

Cheers - Quin.
I code therefore I am

#17
01McAc is in the usergroup ‘Regular’
An export into a CSV file is not the problem, since the database comes with a built-in solution. The import (insert if the record does not exist; update if it exists) is imho quite tricky.
If someone is interested in the details, my use case is:
  1. the database is released with a default set of lens records as part of the application. Data baseline is let's say 1-1-2021 but could be any other date
  2. user adds new and personal data about lenses, details etc. while using the app
  3. a new version of the software Lens Register has been released (by me) with the default database (see #1) - means: no personal data in there
  4. the user exports the new and personal data since 1-1-2021 up to today into csv files (form based in the app)
  5. the user installs the new release at home (including the default database)
  6. the user imports the CSV-files (insert if the record does not exist; update if exists) in order to get back his added personal data about lenses etc.

Re #6 I need a so-called upsert clause in sqlite. It looks like this:
INSERT INTO Lenses(
ID,MakerId,MountId,MaxAperture,MinAperture,FocalLength,LensType,LensLabel,Order_No,Production_era,LensVariants,Number_of_lenses_groups,MFD,Diaphragm_setting_type,Angle_of_view,Filter_type,Accessories,Materials,Dimensions_diameter,Weight,Inscriptions,Designer,Information,DateCreated,DateChanged,Aperture,Smallest_object_field,Largest_reproduction,Position_of_entrance_pupil,Scales,Viewfinder,Length_to_bayonet_flange,Largest_diameter,Tech_Data,Focusing_range,Lens_mount_description,Maximum_magnification,Compatibility,Bayonet,Dimension,Special_editions,Known_as,Lens_Coding,FocalLength_sort,Focus_throw,Type_of_focus_ring,R_lens_spec)
VALUES(104,6,1,'1.4','16.0','35mm',NULL,'35mm ZEISS Distagon T* 1.4/35 ZM',NULL,NULL,NULL,'10 | 7','0,61 m (24.01") - ∞',NULL,'Angular field** (diag. horiz. vert.): 62° | 53° | 37°','Filter thread:

M49 x 0.75',NULL,NULL,'43 mm (1.69") / Diameter max.
63 mm (2.49") / Diameter of focusing ring:
60 mm (2.39")','381 g (13.4 oz)',NULL,NULL,'Coverage at close range (MOD): 609 x 406 mm (24.00 x 16.00")
Flange focal distance: 28 mm (1.09")','2021-02-03 13:52:46','2021-02-24 15:48:04','f/1.4 – f/16',NULL,NULL,'63 mm (2.48")',NULL,NULL,NULL,NULL,'https://www.zeiss.com/consumer-products/int/photography/zm/distagon-1435-zm.html#accessories','0,70 m (27.56") - ∞',NULL,NULL,'Full Frame',NULL,'Length (with lens caps): 87 mm (3.30") / Length (without lens caps):
65 mm (2.57")',NULL,'35mm Distagon, The Fat',NULL,35,'Rotation angle of focusing ring: 90°',NULL,0)
ON CONFLICT (ID) DO UPDATE SET
MakerId = 6,
MountId = 1,
MaxAperture = '1.4',
MinAperture = '16.0',
FocalLength = '35mm',
LensType = NULL,
LensLabel = '35mm ZEISS Distagon T* 1.4/35 ZM',
Order_No = NULL,
Production_era = NULL,
LensVariants = NULL,
Number_of_lenses_groups = '10 | 7',
MFD = '0,61 m (24.01") - ∞',
Diaphragm_setting_type = NULL,
Angle_of_view = 'Angular field** (diag. horiz. vert.): 62° | 53° | 37°',
Filter_type = 'Filter thread:

M49 x 0.75',
Accessories = NULL,
Materials = NULL,
Dimensions_diameter = '43 mm (1.69") / Diameter max.
63 mm (2.49") / Diameter of focusing ring:
60 mm (2.39")',
Weight = '381 g (13.4 oz)',
Inscriptions = NULL,
Designer = NULL,
Information = 'Coverage at close range (MOD): 609 x 406 mm (24.00 x 16.00")
Flange focal distance: 28 mm (1.09")',
DateCreated = '2021-02-03 13:52:46',
DateChanged = '2021-02-24 15:48:04',
Aperture = 'f/1.4 – f/16',
Smallest_object_field = NULL,
Largest_reproduction = NULL,
Position_of_entrance_pupil = '63 mm (2.48")',
Scales = NULL,
Viewfinder = NULL,
Length_to_bayonet_flange = NULL,
Largest_diameter = NULL,
Tech_Data = 'https://www.zeiss.com/consumer-products/int/photography/zm/distagon-1435-zm.html#accessories',
Focusing_range = '0,70 m (27.56") - ∞',
Lens_mount_description = NULL,
Maximum_magnification = NULL,
Compatibility = 'Full Frame',
Bayonet = NULL,
Dimension = 'Length (with lens caps): 87 mm (3.30") / Length (without lens caps):
65 mm (2.57")',
Special_editions = NULL,
Known_as = '35mm Distagon, The Fat',
Lens_Coding = NULL,
FocalLength_sort = 35,
Focus_throw = 'Rotation angle of focusing ring: 90°',
Type_of_focus_ring = NULL,
R_lens_spec = 0
WHERE ID = 104

My Sub ImportCSVFile, which generates the upsert clause above, looks quite nasty, but it works.

Code (gambas)

Public Sub ImportCSVFile()

  Dim DBS As New Cls_SQL_DataBase
  Dim $Rec As Result
  Dim sSQL As String
  Dim aFields, sFilenamepart As String[]
  Dim iRecords, i, ii, iii As Integer
  Dim colRecords As New Collection
  Dim sFile, sHeader, sTablename, sUpsert, sAddon, sUpdate, sWhere As String
  Dim sPattern As String = "*.exp"

  For Each sFile In Dir(DirChooser1.SelectedPath, sPattern).Sort()
    If Stat(DirChooser1.SelectedPath &/ sFile).Size <> 0 Then
      sHeader = File.Load(DirChooser1.SelectedPath &/ sFile)
      sHeader = Mid(sHeader, 1, InStr(sHeader, "\n") - 1)
      aFields = Split(sHeader, ",")

      sFilenamepart = Split(sFile, "-")
      sTablename = sFilenamepart[1]
      getFields(sTablename)

      i = 0
      iRecords = 0
      Try myCSVfile.Close
      myCSVfile = New CsvFile(DirChooser1.SelectedPath &/ sFile)

      Do Until myCSVfile.Eof
        colRecords[myCSVfile.Line] = myCSVfile.Read()  'read every line into a collection
        Inc iRecords                                   'count the number of records for later usage
      Loop

      For ii = 2 To iRecords + 1  'go through every line (colRecords.Count shows false values here!)
        sUpdate = ""
        sUpsert = "INSERT INTO " & sTablename & "(" & gb.Lf & sHeader & ") " & gb.Lf
        sUpsert &= "VALUES("

        For i = 0 To myCSVfile.Fields.Count - 1
          sAddon = FieldIsTypeOf(aFields[i], colRecords[CStr(ii)][aFields[i]])
          If IsNull(colRecords[CStr(ii)][aFields[i]]) Or colRecords[CStr(ii)][aFields[i]] = "" Then
            sUpsert &= "NULL" & ","
          Else
            sUpsert &= sAddon & colRecords[CStr(ii)][aFields[i]] & sAddon & ","
          Endif

          If i = 0 Then ' ID (always zero!) doesn't need an update, but is needed for the WHERE clause
            sWhere = "WHERE " & aFields[i] & " = " & colRecords[CStr(ii)][aFields[i]] & gb.Lf
          Else
            sUpdate &= aFields[i] & " = "
            If IsNull(sAddon) And colRecords[CStr(ii)][aFields[i]] = "" Then
              sUpdate &= "NULL" & "," & gb.Lf
            Else
              sUpdate &= sAddon & colRecords[CStr(ii)][aFields[i]] & sAddon & "," & gb.Lf
            Endif
          Endif
        Next
        sUpsert = Left(sUpsert, -1) & ")" & gb.Lf
        sUpsert &= "ON CONFLICT (ID) DO UPDATE SET" & gb.Lf

        sUpdate = Left(sUpdate, -2)
        sUpsert &= sUpdate & gb.Lf & sWhere
        File.Save(User.Home &/ sFile & "-" & Format(Now, "dd-mm-yyyy hh-nn-ss-uu") & "-upsert.sql", sUpsert)  'no "/" in the file name
        $Rec = DBS.$Con.Exec(sUpsert)
      Next

    Endif  'file size <> 0
  Next

  DBS.$Con.Commit

End

#18
cage is in the usergroup ‘Regular’
How can you break CSV data down into individual parts and assign them to strings? I have tried different ways with Split but never get the results I need. For example:

Code (gambas)

Dim String1 As New String[100]
Dim String2 As New String[100]
Dim String3 As New String[100]

' a line like: "Data1", "Data2", "Data3"

String1[0] = Data1
String2[0] = Data2
String3[0] = Data3
ListBox1.Add(String1[0])

This is the basics of what I need. Thanks in advance for any assistance.

#19
Quincunxian is in the usergroup ‘Expert’
Extract any database table to CSV with the fields as headers. Works well with SQLite; I have not tried it with MySQL, but it should be generic enough to work.
You must have a connection to your database open ($Con in the example below).

Code (gambas)

Public Sub ExtractTableToCSV(InSavePath As String, InTable As String, AutoOpen As Boolean)

  Dim TmpLine As String
  Dim TmpTable As Table
  Dim TmpField As Field
  Dim FileCSV As File
  Dim FileName As String = InSavePath &/ InTable & "_" & Format(Now, "ddmmyyyy") & ".csv"
  Dim Delimiter As String = ","
  Dim $Rec As Result

  FileCSV = Open FileName For Create
  TmpTable = $Con.Tables[InTable]
  '------------Write the headers------------
  For Each TmpField In TmpTable.Fields
    TmpLine &= String.UCaseFirst(TmpField.Name) & Delimiter
  Next
  Print #FileCSV, TmpLine

  '------------Write the records-------------------
  $Rec = $Con.Exec("SELECT * FROM " & InTable)
  If $Rec.Available Then
    For Each $Rec
      TmpLine = ""
      For Each TmpField In TmpTable.Fields
        TmpLine &= Chr(34) & $Rec[TmpField.Name] & Chr(34) & Delimiter
      Next
      Print #FileCSV, TmpLine
    Next
  Endif

  Close #FileCSV

  '------------Open the file for review if desired-------------------
  If AutoOpen Then Exec ["xdg-open", FileName]

Catch
  Message(Error.Text)

End

Cheers - Quin.
I code therefore I am

#20
cage is in the usergroup ‘Regular’
Thanks Quincunxian. SQL is a bit of overkill for what I needed. I found the solution in another post and had to modify it to make it work for my needs. Thanks again for quickly responding to my question.