CSV file import – SLP not deleting dropped locations


This topic contains 7 replies, has 2 voices, and was last updated by  Doug 1 year, 9 months ago.

Viewing 8 posts - 1 through 8 (of 8 total)
  • #27026

    Doug
    Participant

    SLP (Ver 2.2.23) is set to file import hourly from a URL.

    Using pro-pack 4.2.03

The first import works great.  All 1,000 locations import very quickly from the CSV file (hosted on another server).  However, if a record is dropped (a store location is deleted) from the source CSV file, it does not get deleted from the SLP database on the website.

The existing records appear to be updating, but deletions in the source CSV do not seem to carry over.  (The CSV file is formatted with a header row per the documentation.)

     

Second question/confirmation:

Do I understand correctly that if FILE IMPORT is used, then the file settings for Skip First Line, First Line Has Field Name, Skip Geocoding, and Duplicates Handling are ignored?  Setting them has no effect on the import; they only apply when CSV FILE URL + CRON is used.

     

The CSV file import via FILE URL on the cron job works great and updates records, but it will not delete them.

     

    Thanks so much.

    #27032

    Doug
    Participant

    Addendum,

I changed an existing entry, and on the next scheduled import SLP added the changed record from the source CSV file as a new location but kept the old listing.

    I added (change) to the store entry in the sl_store column.

Also, the checkboxes for Skip First Line, First Line Has Field Name, and Skip Geocoding won’t stay “ticked”; however, Duplicates Handling remains at UPDATE.

    • This reply was modified 1 year, 11 months ago by  Doug.
    #27037

    Doug
    Participant

    Okay,

    I understand it now but with one problem.

1. On the very first import into a clean SLP database, the source CSV file can’t have the sl_id column.
2. Import locations from the source CSV file manually – that works fine.
3. Export from SLP as CSV to get the SLP-generated sl_id information.
4. Any changes to the CSV file, now carrying the sl_id column with the SLP-generated ids, will be picked up by the cron.

    But this is where it falls apart for me unless I’m missing something.

1. My source db on another server will create a CSV file with all the necessary data except the sl_id.
2. SLP does pick up that file; however, without the sl_id column and the SLP-generated number after the first import, it adds changes as new records instead of updating and/or deleting records.
3. Note:  I did try creating my own sl_id numbers, hoping SLP would use those instead, but it just bumped all the info one column over in favor of the SLP-generated numbers and fouled up the import.

If my source db changes by adding, deleting, or revising records and generates the CSV file automatically, and SLP is set to pick up that CSV file on a cron, then I have the problem of getting the sl_id back from SLP into my source db so it can re-output the CSV file and have the changes picked up properly.

    And that is a manual operation.

    Am I missing something?
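What I am describing amounts to a set comparison between the source feed and SLP’s current locations, keyed on something stable that the source controls. A minimal Python sketch of that matching logic, assuming a hypothetical `store_code` column maintained by the source database (this is not an SLP field, just an illustration of how the round-trip could be avoided):

```python
def diff_locations(source_rows, slp_rows, key="store_code"):
    """Compare the source feed against SLP's current locations using a
    stable external key ('store_code' is a hypothetical column kept by
    the source db), so SLP's generated sl_id never has to round-trip."""
    src = {row[key]: row for row in source_rows}
    cur = {row[key]: row for row in slp_rows}
    to_add    = [src[k] for k in src.keys() - cur.keys()]   # new at source
    to_delete = [cur[k] for k in cur.keys() - src.keys()]   # dropped at source
    to_update = [src[k] for k in src.keys() & cur.keys() if src[k] != cur[k]]
    return to_add, to_update, to_delete
```

With a key like that, adds, updates, and deletes all fall out of one pass; the catch is that SLP would have to honor the external key instead of its own sl_id.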

     

    Sorry for multiple posts but I wanted you to have all the info.

    #27262

    Lance Cleveland
    Keymaster

    The import does not have a delete locations option.  It is purely additive.

I need to think about a deletion method for the case where a remote URL holds the current list of locations and a dropped location should be reflected.

    There are several options for handling this.

The option I like the least is adding an “action” column with words like “DELETE” or “UPDATE” or “ADD”, as it takes away the advantage of the remote source being a straight reflection of the active locations.

A better option, IMO, is to fire off a process once the last location from the CSV file has been processed that says “delete any location not updated on/after the timestamp of the last location in the CSV file”.  The problem with that approach is that if the CSV import breaks for some reason, the locations aren’t deleted.  Or worse, some corner case causes the delete process to fire when only 10 of 10,000 locations were imported, and the entire location set is deleted, which makes for a nightmare on the next import when Google limits geocoding to 2,500 locations.
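A rough sketch of that timestamp approach, with a guard for the broken-import corner case, might look like the following. This is an illustrative Python sketch; the field names (`sl_id`, `last_updated`) and the 90% completeness threshold are assumptions, not SLP internals:

```python
def stale_location_ids(locations, import_started_at, imported, expected, min_ratio=0.9):
    """Return ids of locations whose last_updated predates this import run,
    i.e. locations absent from the CSV feed.  Refuses to delete anything
    when the import looks truncated (far fewer rows imported than expected)."""
    if expected == 0 or imported / expected < min_ratio:
        return []  # import likely broke mid-way; keep everything
    return [loc["sl_id"] for loc in locations
            if loc["last_updated"] < import_started_at]
```

The guard turns the “10 of 10,000 imported, everything deleted” disaster into a no-op, at the cost of needing a reliable expected row count from the feed.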

    #27267

    Doug
    Participant

    Thanks for looking into this Lance,

Let’s make the assumption that geocoding has been/will be done at the source before generating the CSV file at the remote location, essentially bypassing geocoding in SLP and solving that part of the problem, but only when the remote CSV function is used.  I geocoded all the data prior to import and will geocode new/revised data at the source, not through SLP.

Perhaps the SLP_ID could be generated by the source and used by SLP on import instead of being generated internally, but only when using the remote CSV function.  SLP would then compare against the CSV and make additions/corrections.

     

The following may be a bit clunky, but let’s assume that if the remote CSV function is being used, there is someone who can set up the source output with the proper flags in place.

To use less processing time, the source CSV file would have a new rev/delete/add column identifying whether data has been changed or added, combined with the SLP_ID field generated by the source.  If the column is blank, the row is ignored.

That way SLP would look only at the entries flagged for change and make the revision/addition/deletion to the record.
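The flag-column scheme could be processed roughly like this. A Python sketch with hypothetical `action` and `slp_id` column names; SLP has no such column today, so this only illustrates the proposal:

```python
import csv
from io import StringIO

def apply_actions(csv_text):
    """Split feed rows into add/update/delete buckets based on a
    hypothetical 'action' column; rows with a blank action are skipped."""
    adds, updates, deletes = [], [], []
    for row in csv.DictReader(StringIO(csv_text)):
        action = (row.get("action") or "").strip().upper()
        if action == "ADD":
            adds.append(row)
        elif action == "UPDATE":
            updates.append(row)
        elif action == "DELETE":
            deletes.append(row["slp_id"])
        # blank action: row unchanged, nothing to do
    return adds, updates, deletes
```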

The challenge is that my clients have their own (clunky) systems and rather poor “find a location” systems in place, or none at all.  This is the missing piece of the puzzle for me, and I’d gladly pay for a new Remote CSV plugin in addition to the others.


    • This reply was modified 1 year, 10 months ago by  Doug.
    #27401

    Lance Cleveland
    Keymaster

    Contact Extender has an “identifier” field that is used to match the SLP location data with an external source.

    I know, an odd place for it, but it was a client customization project that they paid for along with the field additions, so that is where it ended up.

    That identifier field, if present in your import while Contact Extender is installed & active, will be used to match up records.  It will only update records as necessary.

    It will NOT delete missing records, however.

     

SLP will NOT geocode any import location that already has its lat/long set.  It also will not geocode any import location whose street + street 2 + city + state + zip + country fields have not changed (any change, even spaces, will trigger a geocode lookup).  Those are the 2 cases where geocoding is bypassed.
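Those two bypass rules amount to a simple per-row decision. A Python sketch of the logic as described, with loosely assumed field names; the real check lives inside SLP’s PHP import code, so the concatenation and comparison details here are illustrative only:

```python
def needs_geocode(row, stored):
    """Skip geocoding when lat/long are already set, or when the combined
    address fields are unchanged from the stored record (any difference,
    even whitespace, re-triggers the lookup)."""
    if row.get("sl_latitude") and row.get("sl_longitude"):
        return False  # rule 1: coordinates supplied in the import
    fields = ("street", "street2", "city", "state", "zip", "country")
    def addr(r):
        return "|".join(r.get(f, "") for f in fields)
    # rule 2: geocode only if there is no stored record or the address changed
    return stored is None or addr(row) != addr(stored)
```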

     

    However, those two pieces of the puzzle will not address the issue of deleting locations during import.


Maybe I can set it so that if the sl_id field is present OR the identifier field (from Contact Extender) is present and the rest of the record is blank, it deletes the record.    No use for all-blank locations…
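Detecting such a delete marker per row could be as simple as the following Python sketch. This is hypothetical behavior; the thread leaves it undecided whether SLP will actually do this:

```python
def is_delete_marker(row, id_fields=("sl_id", "identifier")):
    """True when an id field is set but every other column is blank,
    i.e. the row exists only to request deletion of that record."""
    has_id = any((row.get(f) or "").strip() for f in id_fields)
    others_blank = all(not (v or "").strip()
                       for k, v in row.items() if k not in id_fields)
    return has_id and others_blank
```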

    #27403

    Doug
    Participant

    The request may sound simple but I know it isn’t.  I appreciate you looking into it.  I would buy that added feature in a second plus the other needed add-ons to make it work.

I just want to be able to sync a source CSV with SLP on an ongoing basis.   If data in the name, address, hours, long, lat, etc. changes, or if a store is deleted or a new one is added in the source CSV…I’d like that to sync to SLP (CSV -> SLP) via the cron feature you put in.

I, of course, would output the CSV to the specifications you set, including geocoded data, unless SLP will geocode automatically when long/lat is missing.  I do realize that I could overwhelm Google if I tried to geocode 500+ locations on a single sync…I’d be geocoding 2 to 10…weekly…maybe…on a sync.

    My client loves SLP but has all the data in another system for internal/intranet use and just doesn’t want to manually update records after the initial ingest.  (and yes…even if it is as simple as a single change once a week.)

    I do appreciate you looking into this.

    PS:  I’ll just zip over and buy the contact extender now and wait for that update…hopefully.

     

    • This reply was modified 1 year, 10 months ago by  Doug.
    #27814

    Doug
    Participant

    Feb 18, 2015 Update

Contact Extender is installed on a clean install and database, but the extended fields will not populate via the Pro Pack cron job.  Manually importing the CSV file works as advertised, adding the extended information (sl_identifier, sl_contact, sl_first_name, etc.).

    I am running tests to see if changes to pre-existing data will sync from CSV to SLP on cron.

     

