ITrack/Pro/DSI Integration
Exporting from ITrack Pro to DSI
Batch file and SQL script in SVN: ITrack\ITrackPro\Utilities\DSIExport\
How it works
A batch file is run (ideally via a scheduled task) that does the following:
- Navigates to the DSI export directory
- Runs a Python script (the script connects to the database, runs a query to get the data, then writes that data to a CSV)
- Uploads the CSV file to a directory on DSI's FTP server using ncftp (ftp://ftp.ncftp.com/ncftp/binaries/Setup%20NcFTP%203.2.5.msi)
- Deletes the local copy of the CSV file
The batch file
"<path to dsi export directory>"
python DSI_Exporter.py
ncftpput -u <FTP_Server_Username> -p <FTP_Server_Password> <FTP_Server_Name_Or_IP> <path on remote ftp server to put the file> "<local DSI Export directory>\output.csv"
DEL "<local server DSI Export directory>\output.csv"
- Make sure the DEL line points at the same output.csv file that the export script produces.
- Create a scheduled task that runs the batch file.
- If you are given an FTP address in the form ftp://ftp.something.com, remove the leading ftp://; it is unnecessary and will confuse the exporter.
- If the FTP site does not use a specific directory for uploading, substitute an empty string ("") for <path on remote ftp server to put the file>; otherwise the exporter will get confused.
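For reference, here is a hedged example of the batch file with every placeholder filled in. The C:\DSIExport path, the credentials, the host name, and the /ISOFT-OUT/ remote directory below are hypothetical values, not ones supplied by DSI:

<source lang="dos">
REM Example only - the path, credentials, host, and remote directory below are placeholders
cd /d "C:\DSIExport"
python DSI_Exporter.py
ncftpput -u exampleuser -p examplepass ftp.example.com /ISOFT-OUT/ "C:\DSIExport\output.csv"
DEL "C:\DSIExport\output.csv"
</source>

The scheduled task mentioned above can then be created with something like schtasks /create /tn "DSI Export" /tr "C:\DSIExport\DSIExport.bat" /sc DAILY /st 02:00 (the task name, batch file path, and time are likewise placeholders).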
The Python Script
The new exporter does not use a SQL script because writing the CSV directly from MySQL (e.g. with SELECT ... INTO OUTFILE) is blocked by the FILE privilege as well as the secure_file_priv setting in the MySQL config.

- The SELECT portion of the old SQL script is reused in the query run by the Python script below.
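If you need to confirm that the file-based approach really is blocked on a given server before falling back to the Python script, the relevant settings can be checked directly in MySQL. This check is informational only and is not part of the export:

<source lang="mysql">
-- NULL disables SELECT ... INTO OUTFILE entirely; a directory path restricts it
-- to that directory; an empty value means no restriction
SHOW VARIABLES LIKE 'secure_file_priv';
-- Shows whether the current account has the FILE privilege
SHOW GRANTS FOR CURRENT_USER();
</source>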
<source lang="python">
import mysql.connector
from mysql.connector import Error
import csv


def connect():
    conn = None
    try:
        # Enter connection details (the <...> values are placeholders)
        conn = mysql.connector.connect(
            host="<hostname>",
            database="<database>",
            user="<user>",
            password="<password>",
            port="<port>")

        # Confirm that the connection succeeded and note it in the log
        if conn.is_connected():
            log = open("log.txt", "a")
            log.write("\n")
            log.write("Connected to MySQL database")
            log.close()
    # Log any connection error
    except Error as e:
        log = open("log.txt", "a")
        log.write("\n")
        log.write(str(e))
        log.close()
    finally:
        if conn is not None and conn.is_connected():
            curs = conn.cursor()
            query = 'SELECT partuse.part AS parttype, inventory.partnum AS SKU, inventory.tagnum AS tagnumber, inventory.typenum, inventory.stocknum, inventory.vinnum AS vin, inventory.make, inventory.model, inventory.year, inventory.bodystyle, inventory.location, replace(inventory.description, "\r\n", "") as description, inventory.core AS coreprice, inventory.interchangenum, inventory.status, inventory.suggestedprice AS retailprice, inventory.bottomprice AS wholesaleprice, inventory.replenish, inventory.deplete, inventory.quantity, inventory.dateentered, inventory.label1, inventory.data1, inventory.label2, inventory.data2, inventory.label3, inventory.data3, inventory.label4, inventory.data4, inventory.cost, inventory.pmanufacturer AS partmanufacturer, inventory.pmodel AS partmodel, inventory.minquantity, inventory.maxquantity, inventory.taxable, inventory.oemnum, inventory.condition, inventory.serialnum, inventory.side, inventory.category, inventory.listprice, replace(inventory.notes, "\r\n", "") as notes, inventory.parentpartnum, inventory.weight, invmaster.trackingnum AS trackingnumber FROM inventory LEFT JOIN partuse ON inventory.typenum = partuse.typenum LEFT JOIN invmaster ON inventory.stocknum = invmaster.stocknum LEFT JOIN locations ON inventory.partnum = locations.partnum WHERE inventory.status IN("A", "H");'
            count = 0
            curs.execute(query)
            # Set up the csv write location and parameters
            with open('output.csv', 'w', newline='') as csvfile:
                filewriter = csv.writer(csvfile, delimiter=',', quotechar='"', quoting=csv.QUOTE_MINIMAL)
                # Add column headers
                filewriter.writerow(['parttype', 'SKU', 'tagnumber', 'typenum', 'stocknum', 'vin', 'make', 'model', 'year', 'bodystyle', 'location', 'description', 'coreprice', 'interchangenum', 'status', 'retailprice', 'wholesaleprice', 'replenish', 'deplete', 'quantity', 'dateentered', 'label1', 'data1', 'label2', 'data2', 'label3', 'data3', 'label4', 'data4', 'cost', 'partmanufacturer', 'partmodel', 'minquantity', 'maxquantity', 'taxable', 'oemnum', 'condition', 'serialnum', 'side', 'category', 'listprice', 'notes', 'parentpartnum', 'weight', 'trackingnumber'])

                # Loop through each result row and append the data to the csv
                for (parttype, SKU, tagnumber, typenum, stocknum, vin, make, model, year, bodystyle, location, description, coreprice, interchangenum, status, retailprice, wholesaleprice, replenish, deplete, quantity, dateentered, label1, data1, label2, data2, label3, data3, label4, data4, cost, partmanufacturer, partmodel, minquantity, maxquantity, taxable, oemnum, condition, serialnum, side, category, listprice, notes, parentpartnum, weight, trackingnumber) in curs:
                    filewriter.writerow([parttype, SKU, tagnumber, typenum, stocknum, vin, make, model, year, bodystyle, location, description, coreprice, interchangenum, status, retailprice, wholesaleprice, replenish, deplete, quantity, dateentered, label1, data1, label2, data2, label3, data3, label4, data4, cost, partmanufacturer, partmodel, minquantity, maxquantity, taxable, oemnum, condition, serialnum, side, category, listprice, notes, parentpartnum, weight, trackingnumber])
                    count = count + 1

            # Log how many rows were exported, then close the connection
            log = open("log.txt", "a")
            log.write("\n")
            log.write(str(count) + ' rows were added to output.csv')
            log.close()
            conn.close()


if __name__ == '__main__':
    connect()
</source>
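The script assumes the mysql-connector-python package is available to the Python interpreter that the batch file invokes; if it is not already installed, it can be added with pip:

<source lang="dos">
python -m pip install mysql-connector-python
</source>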
Rock and Dirt Export
This script and batch file can also be used to export to other FTP sites, for example Rock & Dirt; the process is functionally the same. The pre-configured files for Rock and Dirt are in svn\ITrackPro\Utilities\RockAndDirt. You will still need to create accounts and fill in the scripts with the system-appropriate paths.
Importing from DSI to ITrack Pro
Batch file and DSI Import application in SVN: ITrack\ITrackPro\Utilities\DSIImport\
How it works
A batch file is run via a scheduled task that does the following:
- Using a command-line FTP client, connects to a DSI-provided FTP server
  - Downloads all .xml and .XML files from a certain directory on the FTP server and puts them in the DSI Import application installation directory
- Runs the ISoft-built DSI Import application (you will most likely need a developer to compile the most recent version for you; the application also requires libmysql.dll)
  - Connects to a MySQL server and ITrack Pro database using database connection info that must be supplied in a host.ini file (this can be copied from the customer's ITrack Pro client software installation)
  - When the application runs, it scans its own directory for any and all .xml files
  - It then attempts to import all of the information (invoices, invoice line items, and customers) from the XML files and depletes the quantity of the corresponding inventory records
  - Upon successful import, it deletes the .xml files from the DSI Import application directory
The batch file
<source lang="dos">
ncftpget -DD -u <FTP_Username> -p <FTP_Password> <server name or IP address> "<Local directory of DSI Import Application>" <path on server where the xml file will be>/*.xml
ncftpget -DD -u <FTP_Username> -p <FTP_Password> <server name or IP address> "<Local directory of DSI Import Application>" <path on server where the xml file will be>/*.XML
"<Local directory of DSI Import Application>\DSIImport.exe"
exit
</source>
Note: <path on server where the xml file will be> is relative to the root. In many cases this is something like /ISOFT-IN/
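As a hedged, filled-in illustration of the above, with hypothetical values (the credentials, host name, and C:\DSIImport local path are placeholders; /ISOFT-IN/ is simply the common example from the note):

<source lang="dos">
REM Example only - credentials, host, and local path are placeholders
ncftpget -DD -u exampleuser -p examplepass ftp.example.com "C:\DSIImport" /ISOFT-IN/*.xml
ncftpget -DD -u exampleuser -p examplepass ftp.example.com "C:\DSIImport" /ISOFT-IN/*.XML
"C:\DSIImport\DSIImport.exe"
exit
</source>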
The DSI Import application
The DSI Importer currently:
- Connects to an ITrack Pro database using information from a host.ini file in its directory
- Opens all .xml files in its directory
- For each one of those files, it:
  - Reads in invoices based on the specs we were given by DSI long ago
  - For each one of the invoices that it reads in, it:
    - Creates a customer if the DSI customer on that invoice hasn't been seen before
    - Creates a new invoice using the f_create_invoice database function
    - For every lineitem on that imported invoice, it:
      - Depletes quantity from inventory by looking for a part with a tag number matching the part number that was typed into DSI for that lineitem (see the query sketch after this list)
      - Creates a new lineitem using the f_create_lineitem function
    - Creates a new lineitem on that invoice (using f_create_lineitem) to store the tax amount
  - Deletes the file
- Exits, returning 0 if everything went fine and non-zero if some error happened
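The f_create_invoice and f_create_lineitem database functions referenced above already exist in ITrack Pro and are not reproduced here. For illustration only, the tag-number lookup amounts conceptually to a query like the following; the column names are taken from the export query earlier on this page, and the importer is compiled code whose actual SQL may differ:

<source lang="mysql">
-- Hypothetical sketch: find the ITrack part whose tag number matches the
-- part number entered in DSI, so its quantity can be depleted
SELECT partnum, quantity, status
FROM inventory
WHERE tagnum = '<part number entered in DSI>';
</source>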
Note: In order for the importer to work correctly, the function f_adjust_inventory MUST be present in the customer's database, or the importer will not correctly adjust inventory when a file is processed. If a customer using the DSI import does not have the function in their database, log in to the database as their normal MySQL user and run the following:
<source lang="mysql">
DELIMITER $$
CREATE FUNCTION `f_adjust_inventory`(in_partnum INT UNSIGNED, in_store SMALLINT UNSIGNED, in_quantity_change INT) RETURNS INT(11)
BEGIN
    DECLARE nNewQuantity INT;
    DECLARE cNewStatus CHAR(1);
    DECLARE nOldQuantity INT;
    DECLARE cOldStatus CHAR(1);
    DECLARE bReplenish ENUM('False','True');

    SELECT `quantity`, `status`, `replenish`
    INTO nOldQuantity, cOldStatus, bReplenish
    FROM inventory
    WHERE `partnum` = in_partnum AND `store` = in_store
    LOCK IN SHARE MODE;

    SET nNewQuantity = nOldQuantity + in_quantity_change;

    IF nOldQuantity <= 0 AND cOldStatus = 'S' AND nNewQuantity > 0 THEN
        SET cNewStatus = 'A';
    ELSEIF nNewQuantity <= 0 THEN
        SET cNewStatus = 'S';
    END IF;

    UPDATE `inventory`
    SET `quantity` = nNewQuantity, `status` = IFNULL(cNewStatus, `status`), `datemodified` = NOW()
    WHERE `partnum` = in_partnum AND `store` = in_store;

    RETURN nNewQuantity;
END $$
DELIMITER ;
</source>
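Once created, the function can be spot-checked by calling it directly. For example, depleting one unit of a part; the part number 12345 and store 1 below are placeholders, and note that this call actually modifies the inventory row (quantity, status, and datemodified):

<source lang="mysql">
-- Placeholder partnum and store; depletes one unit and returns the new quantity
SELECT f_adjust_inventory(12345, 1, -1);
</source>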
Additional Info
- The DSI integration matches parts between the two systems using inventory.notes in ITrack, which is expected to contain the DSI part ID. If this information is missing, or if additional information is added to inventory.notes, sales in DSI will not be successfully updated in ITrack (a sample audit query appears after this list).
- Currently, the DSI system does not send over XML documents if it cannot find any ITrack parts that were sold. It was requested that Ben Cramer change this so that DSI sends over XML files even if they are empty.
- Because the process automatically deletes the files once they are used, it is difficult to keep logs on what has happened. Archiving has been set up in the DSI Integration so that we now keep a log of all XML files that we have received.
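As a hedged illustration of the first point, a query like the following (using columns from the export query above) would flag active or on-hold parts whose notes field is empty and which therefore could not be matched to a DSI part ID:

<source lang="mysql">
-- Hypothetical audit query: parts with no DSI part ID stored in notes
SELECT partnum, tagnum, description
FROM inventory
WHERE status IN ('A', 'H')
  AND (notes IS NULL OR notes = '');
</source>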