Apache Tomcat Webserver
Apache Tomcat is an application server designed for Jakarta Servlets. In the current deployment it provides an extensible platform with an accessible, easy-to-use interface for users.
- Tomcat is launched automatically on server startup by the Windows task StartStirlingWebServer. The task runs under the system user (ID 0), so no interface is shown. To launch the server with real-time logging, do not use the Windows task; instead, launch it from a command prompt using the script C:\DeploymentTomcat\bin\startup.bat.
- Logs will always be saved in C:\DeploymentTomcat\logs.
- The live version of Tomcat is deployed on port 8082.
- Settings for the server deployment are contained in C:\DeploymentTomcat\conf. User settings are in tomcat-users.xml; server settings are in server.xml.
- To restart the server, run C:\DeploymentTomcat\bin\shutdown.bat and then run the StartStirlingWebServer task from Windows Task Scheduler.
- WAR files should be deployed to C:\DeploymentTomcat\webapps.
- Tomcat is accessible from any device on the same network and contains sensitive data. The server should NEVER be connected to an unsecured network. If additional security is required, a login system should be implemented.
Tomcat integrates with the Eclipse IDE for rapid development. Eclipse is currently configured to create a test environment with a second Tomcat deployment located in C:\SecondaryTomcat.
- This server is deployed on port 8081 and will normally only be accessible during live testing.
- While working in Eclipse, this server is started automatically when debugging begins and stopped when debugging is complete.
Eclipse Servlets
Eclipse is used to create and modify Java servlets that can be packaged in a WAR file and rapidly deployed to the live Tomcat server.
- The server will load index.html as the landing page.
- Servlets can be requested by adding the servlet name to the server URL. (ex. *Server IP*/StirlingWebServer/InvoiceQBOrders)
- Created servlets will have a doGet method for GET requests and a doPost method for POST requests by default (see the sketch below).
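A minimal sketch of what a generated servlet looks like. This assumes the project uses the javax.servlet API (newer Tomcat releases use jakarta.servlet instead); the class name and output are placeholders, not code from the StirlingWebServer project:
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// The class name doubles as the request path, e.g. *Server IP*/StirlingWebServer/ExampleServlet
@WebServlet("/ExampleServlet")
public class ExampleServlet extends HttpServlet {
    private static final long serialVersionUID = 1L;

    // Handles GET requests
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        PrintWriter out = response.getWriter();
        out.println("GET received");
    }

    // Handles POST requests; the Eclipse stub typically delegates to doGet
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        doGet(request, response);
    }
}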
Creating a Servlet
- Open Eclipse IDE for Enterprise
- If not opened by default open the StirlingWebServer Project
- In the project explorer under Java Resources right click the desired package or location for the new servlet
- Click New->Servlet
- The only required field is the class name, which is the name the servlet will be accessible under.
- Once the class name and any other options are filled in, click Finish to create the servlet.
Deploying an Update
- Open Eclipse IDE for Enterprise
- If not opened by default open the StirlingWebServer Project
- In the project explorer right click the base StirlingWebServer
- Select Export->WAR file
- In the Destination drop down select C:\DeploymentTomcat\webapps\StirlingWebServer.war
- Click Finish
- Restart the live Tomcat deployment to see changes.
- Run C:\DeploymentTomcat\bin\shutdown.bat
- Open Windows Task Scheduler and navigate to the admin folder
- Run the StartStirlingWebServer task from Task Scheduler
Advice and Tips
- To connect to the SQLEXPRESS Microsoft SQL Server instance, create a connection string and import java.sql.Connection, java.sql.DriverManager, java.sql.ResultSet, java.sql.SQLException, and java.sql.Statement. Inside a try block, create the connection and a Statement object (connectionUrl is the connection string):
Connection connection = DriverManager.getConnection(connectionUrl);
Statement statement = connection.createStatement();
Create a query String and execute it with the Statement object. Use execute if no result rows are required; use executeQuery if rows will be returned:
statement.execute(selectSql);
or
ResultSet resultSet = statement.executeQuery(selectSql);
To access returned rows read each row of the returned result set:
PrintWriter out = response.getWriter();
int columnCount = resultSet.getMetaData().getColumnCount();
while (resultSet.next()) {
    for (int i = 1; i <= columnCount; i++) {
        out.print(resultSet.getString(i));
    }
}
The result set has a different get method for different types of data. Make sure to use the correct one!
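Putting the pieces above together, here is a minimal, self-contained sketch of the query pattern. The connection string, table name, and use of System.out in place of a servlet's PrintWriter are illustrative assumptions, and the Microsoft JDBC driver is assumed to be on the classpath:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class QueryExample {
    public static void main(String[] args) {
        // Placeholder connection string; the real instance, database, and credentials differ.
        String connectionUrl = "jdbc:sqlserver://localhost\\SQLEXPRESS;databaseName=ExampleDB;integratedSecurity=true;";
        String selectSql = "SELECT * FROM ExampleTable"; // illustrative query
        // try-with-resources closes the connection, statement, and result set automatically
        try (Connection connection = DriverManager.getConnection(connectionUrl);
             Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery(selectSql)) {
            int columnCount = resultSet.getMetaData().getColumnCount();
            while (resultSet.next()) {
                for (int i = 1; i <= columnCount; i++) {
                    // getString works for a quick dump; use getInt, getDate, etc. for typed access
                    System.out.print(resultSet.getString(i) + "\t");
                }
                System.out.println();
            }
        } catch (SQLException ex) {
            ex.printStackTrace(); // in a servlet this could be written to the response instead
        }
    }
}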
- To return HTML responses, use a Java StringBuilder to build the HTML response and print it with the PrintWriter. If the returned HTML shows database data and needs unique action links for each item, create URLs to the desired servlet and add a data flag in the URL for the GET method. EX:
rows.append("<td><input type=\"checkbox\" name=\"" + resultSet.getString(6) + "__send\" value=\"send\" class=\"send\" >");
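For the per-item action links mentioned above, a small sketch of the pattern. The servlet name InvoiceQBOrders and the order parameter are used purely as an illustration, not as the deployed markup:
public class ActionLinkExample {
    // Build a per-row action link that passes the row's key as a GET parameter.
    static String invoiceLink(String orderNumber) {
        return "<td><a href=\"/StirlingWebServer/InvoiceQBOrders?order="
                + orderNumber + "\">Invoice</a></td>";
    }
}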
- To run a Windows command, create an ArrayList of commands and run them with ProcessBuilder. Errors should be printed, so call redirectErrorStream(true) and read the combined output with a BufferedReader:
PrintWriter out = response.getWriter();
List<String> commands = new ArrayList<String>();
commands.add("powershell");
commands.add("\"AssignCostcoM2H_run.ps1\"");
ProcessBuilder builder = new ProcessBuilder();
builder.command(commands);
builder.redirectErrorStream(true);
String docType =
    "<!doctype html public \"-//w3c//dtd html 4.0 " + "transitional//en\">\n";
out.print(docType +
    "<html>\n" +
    "<body>\n" +
    "<table>\n");
try {
    Process p = builder.start();
    BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
    String readline;
    while ((readline = reader.readLine()) != null) {
        if (!readline.contains("NOTE: Picked up JDK_JAVA_OPTIONS")) {
            out.println("<tr><td>" + readline + "</td></tr>\n");
        }
    }
    reader.close();
    p.waitFor();
    p.destroy();
} catch (IOException | InterruptedException ex) {
}
out.print("</table>\n" +
    "</body>" +
    "</html>");
- To display table data use the JavaScript DataTables plugin: https://datatables.net/
- DataTables allows easy filtering and has many optional settings to fit the table.
Talend ETL
Talend is open-source ETL software built with Java that allows easy integration between data sources and can perform complex data mapping. Completed jobs can be exported as executable scripts and run as needed. Talend has been integrated into the EDI document flow and performs the steps of extracting EDI data from QuickBooks, mapping fields to EDI standards, and generating an XML version of the document to be sent.
- The flow of data in Talend is divided between main flows and iterations. A data flow will perform each step on the entirety of the data before moving to the next step. Iteration flows will perform all steps on each piece of data before moving to the next step.
- Documentation is available online at: https://help.talend.com/r/aEfbAPXLVkzYRF28OR1Jww/root
- To catch and report errors that occur in scripts include the tLogCatcher component connected to the job LogJobErrors. The error reporting job will send out an email with error logs to the system admin. An email will only be sent once per hour per job, so if a job is continually failing the admin will only be notified once per hour.
- By updating the version number of jobs after deployment, Talend provides basic version control and previous versions can be viewed.
- If a job is accidentally deleted it will be moved to the Recycle bin and can be restored.
- Commonly used components can be stored in the Metadata folder for easy use in multiple jobs.
- The Java live generated code can be viewed by toggling between the Designer and Code tab in the job editor window.
- Errors that occur during runtime when testing a job will reference the component that caused the error. Errors that prevent the job from compiling will be reported in the log and highlighted in the raw Java code.
- If a job that is used as a sub-job in other jobs is updated, each job that uses it also needs to be updated.
Deploying a Job Update
- Open Talend Open Studio
- Ensure that the job has been saved and is closed in the job viewer
- Right click the job in the repository viewer
- Select Build Job
- Do not change any settings
- Click Finish
- Once Talend has built the updated job right click the job in the repository viewer
- Click Edit Properties
- Click “m” in the Version settings. This will increment the version number.
Advice and Tips
- There are always multiple ways to perform a function in Talend.
- tDBInput allows for a select query or even multiple queries. The final query must return a dataset.
- tDBOutput allows for inserts or updates on tables without writing a specific query. If an update is required the primary key must be selected in Talend.
- tDBRow allows a query to be executed without returning a dataset.
- To perform actions on grouped data from a table load the rows and use tUniqRow to filter out any duplicates. The unique rows can be sent through a tFlowToIterate component to allow data iteration.
- Use tMap to perform Java checks or changes on data. This is especially useful if, for example, a later component cannot accept null values; nulls can be changed to blank values (see the expression sketch after this list).
- If a query needs to be dynamically constructed from a list of values use tDenormalize to create a string of values that can be used dynamically in a tDBRow query.
- In the tMap component, when two inputs are needed, the secondary input should usually be loaded once the primary input reaches the tMap. By default, tMap loads the secondary input when the script starts. To change this behavior, open the tMap, click the settings for the secondary input, and change the Lookup Model to "Reload at each row".
- tSetGlobalVar can be used to store important values for use later in the job. Most commonly in iterate loops for references to a value being iterated on.
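As an illustration of the tMap null check mentioned in the tips above, an expression like the following can be placed on a tMap output column (the flow name row1 and the column name are placeholders):
// Replace a null value with an empty string so a later component does not reject the row
row1.CustomerName == null ? "" : row1.CustomerName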
Microsoft SQL
The deployed version of Microsoft SQL Server is v15.0.2. Two databases are used to group common data together: EDIDatabase for EDI-related data and OrderTracking for QuickBooks data manipulation and online order data. QODBC is an adapter for QuickBooks that allows QuickBooks data to be queried using SQL. Unfortunately, QODBC cannot handle complex queries; instead of returning an error, it will crash. Do not perform queries with more than two left joins or ANY subqueries. To perform a query requiring these features, extract the required data out of QODBC with a simple query and store it in a temporary table in Microsoft SQL; any complex queries can be performed from there. Microsoft SQL can access QODBC directly using OPENQUERY. OPENQUERY queries are constructed from strings, so be careful with escaping apostrophes. If the query needs to be dynamically constructed, the WHOLE query must be enclosed in a string, which means each apostrophe needs to be escaped twice, i.e., ' -> ''''. Examples:
A static query (apostrophes are escaped once):
SELECT * FROM OPENQUERY(qremote, '
SELECT
InvoiceLine.RefNumber,
InvoiceLinkedTxn.LinkedTxnRefNumber,
InvoiceLine.PONumber,
InvoiceLine.CustomerRefFullName,
InvoiceLine.InvoiceLineItemRefFullName,
InvoiceLine.InvoiceLineQuantity,
InvoiceLine.SalesTaxTotal,
InvoiceLine.CustomFieldShippingVia
FROM
InvoiceLine,
InvoiceLinkedTxn
WHERE InvoiceLine.TxnDate >= ({fn CURDATE()}-30)
AND InvoiceLine.CustomerRefFullName IN
(''Costco:Costco.ca'', ''Walmart.ca'', ''Home Depot.ca'', ''Lowes.ca'', ''Rona/Reno Depot'')
AND InvoiceLine.TxnID = InvoiceLinkedTxn.TxnID
AND InvoiceLinkedTxn.LinkedTxnTxnType = ''SalesOrder''');
A dynamic query (apostrophes are escaped twice):
DECLARE @sql nvarchar(MAX)
SET @sql = 'SELECT * FROM OPENQUERY(qremote, ''
SELECT
SalesOrderLinkedTxn.TxnID,
InvoiceLine.InvoiceLineItemRefFullName,
InvoiceLine.InvoiceLineAmount
FROM
SalesOrderLinkedTxn,
InvoiceLine
WHERE SalesOrderLinkedTxn.CustomFieldOther = '''''+@ConsignmentNumber+'''''
AND SalesOrderLinkedTxn.LinkedTxnTxnID = InvoiceLine.TxnID
AND InvoiceLine.InvoiceLineAmount <> 0 '')'
EXEC sp_executesql @sql;
QODBC also does not have robust error reporting. Due to the difficulty of tracking down errors in complex queries, extracting the raw data out of QODBC into MSSQL before performing the final query will allow use of MSSQL’s complete error logging. Queries run in Microsoft SQL Server Management Studio will show error messages in the log. If a script or 3rd party program interacting with the database is reporting errors then a real-time logger should be used for debugging.
Realtime Logging for MSSQL
- Start SQL Server Profiler
- Click File->New Trace
- Select the Server from the Server name dropdown
- Click Connect
- Leave the Trace Properties as the default properties
- Click Run
- New connections will be shown and all runtime logs can be dynamically viewed in the trace viewer
- To close the trace, click the Stop button
QODBC
QODBC is a 3rd party connector for QuickBooks that allows applications to interact with QuickBooks as an ODBC data source. QODBC uses an optimizer file to construct QuickBooks data in a proprietary data format for rapid querying.
- The optimizer file is located in C:\Users\Administrator\AppData\Roaming\QODBC Driver for QuickBooks\Optimizer.
- There is a delay for each application to connect with QODBC and a second delay for QODBC to connect with QuickBooks. To keep connected applications responsive, they should use as few queries as possible and, if possible, query tables with the command NOSYNC. This stops QODBC from optimizing the table, saving time.
- QODBC cannot perform complex queries. Do not perform queries with more than two left joins and do not use subqueries. To perform advanced queries use MSSQL.
- More than 3 concurrent connections (queries) to QODBC can cause the driver to crash.
- Sometimes the connector will get stuck connecting to QuickBooks but will not crash.
- The Talend script CheckAndRestartQODBC runs every minute and checks whether QODBC is stuck connecting or is not running. In either case the script will start or restart QODBC, keeping downtime under 2 minutes.
- Sometimes the optimizer file will become corrupt and require a rebuild. If the QuickBooks company file is rebuilt, the QODBC optimizer file must also be rebuilt.
Rebuilding the Optimizer File
- Delete the old optimizer file located in C:\Users\Administrator\AppData\Roaming\QODBC Driver for QuickBooks\Optimizer
- Click start and search VB Demo -> start VB Demo
- Click Connections->Add New Connection
- Click Machine Data Source tab
- Select QuickBooks Data QRemote
- Click OK
- Remove the placeholder query and type sp_optimizefullsync all
- Execute the query
- NOTE: The rebuild will take about 8 hours and no queries can be run while it is being done. Please factor this into deciding when to rebuild. Rebuilding a single table can take 30 minutes and may be a better option in a pinch. Use the query sp_optimizefullsync <<table name>> to optimize a single table.
SAP Crystal Server
Crystal Reports is used to generate documents not included with QuickBooks and to create complex reports.
- The reports are created in SAP Crystal Reports 2020 and are published to SAP BusinessObjects Enterprise Server (Crystal Reports Server).
- The server MUST be able to access every data source used in a Crystal Report and must also have a compatible driver.
- All data sources must be 64-bit to be accessed by the reports on the Crystal Reports Server.
- If login credentials are required to access the data source used in a Crystal Report, they must be entered into the Central Management Console. Each time the report is updated the credentials have to be re-entered.
- Reports can be scheduled to run automatically in the instance manager of the Crystal Reports Server. Report parameters can be configured in the instance manager.
- Crystal Reports Server has been configured to connect with an SMTP server to deliver emails.
- Crystal Reports Server is deployed on port 8080. Access is password protected but the server should still never be connected to an unsecured network.
Entering Data Source Credentials
- Click Start and scroll down the application list to SAP Business Intelligence
- Open the SAP Business Intelligence folder and start SAP Crystal Server Central Management Console
- Login using the Administrator credentials
- Under the Organize menu select Folders
- Open the StirlingReports folder
- Right click the desired report and select Database Configuration
- Change the dropdown for When viewing and scheduling report to “Use same database logon as when report is run”
- In the Database logon information click “Use original database logon information from the report”
- Enter the database credentials into the User and Password
- Click Update to save the settings
Uploading a Report to Crystal Reports Server
- Open the report to be uploaded in Crystal Reports
- Click File->Save As
- Select Enterprise
- Sign in using the Administrator Profile
- Expand the StirlingReports folder
- Edit the file name and save the report
Creating an Automatic Report Instance
- Open a browser
- Navigate to the page *Server IP*:8080/BOE/BI
- Login using the Administrator credentials
- Click the home icon
- Click Folders
- Click Public Folders->StirlingReports
- Click the 3 dots to the right of the report to be automated->Click Schedule
- Title the report instance
- If the report needs to be delivered by email, click Add in Destinations
- Select Email
- From = SMTP email address
- To = Destination Address
- Fill in the Subject and Message as needed
- Confirm
- Set the Recurrence of the report.
- DO NOT allow retries
- Turn on report notifications if needed
- Click the Report Features Tab at the top
- Change the Format to PDF
- Put in any required report parameters
- Put in Database Logon details if required
- Click Schedule to save the report instance
Adding a new Crystal Reports Server Destination
- Click Start and scroll down the application list to SAP Business Intelligence
- Open the SAP Business Intelligence folder and start SAP Crystal Server Central Management Console
- Login using the Administrator credentials
- Under the Organize menu select Servers
- Open Service Categories->Core Services
- Right Click STIRLINGSERVER0.AdaptiveJobServer and click Properties
- Select Destination
- Add or modify existing Destinations
- Click Save & Close
Advice and Tips
- Report filters used in Crystal Reports will not always be transformed into the database query. If the filter requires two passes on data (performing a calculation on a column in reference to another column) or if the criteria is not transferable then Crystal Reports will perform the filtering after the data is loaded. It is important to include filter criteria that will limit the data loaded for report optimization.
- Sub-reports can be inserted into a report to show new data or perform dynamic reporting on grouped data by passing rows to the sub-report.
- Use the group or section expert to start a new page before or after a section. To prevent creating a blank page at the end of the report use the new page formula and put in: if not OnLastRecord then TRUE else FALSE
- Individual sections can be rotated landscape or portrait for flexible page layouts.
- By using a spreadsheet with rows matching the different pages of a report, it is possible to run multiple reports together on the same data. Include the report data and the spreadsheet data but do not link them. This will cause the report to loop through the report rows once for each spreadsheet row. By suppressing sections based on the spreadsheet row value, multiple runs can be done on the same data in one report. For example, a bill of lading, a packing slip, and a master packing list can all be run in one report.
- Sometimes the Crystal Reports query builder will not be able to construct the required query or will construct an un-optimized query. To get around this, a command can be used on a data source in the Database source menu. This allows the query to be written directly in SQL and can even include dynamic parameters that integrate into the report.
Bots Open-Source EDI Translator
Bots is powerful open-source software that can perform all parts of EDI document translation and transmission. Bots has been configured to transform XML documents into EDI documents, send the transformed documents, and provide automatic 997 confirmations for documents received. In the production environment, Talend creates the XML documents from data loops and processes the XML documents generated by bots from incoming EDI documents.
- When the bots webserver is running it is deployed on port 8099.
- Incoming documents are translated as follows: when a document is sent to bots, the trading partner and message type are used to select which document structure to use and where the resulting document will be sent.
- A document that is not properly constructed will fail to send if it does not match the required document structure. This does not stop properly structured messages that do not match the intended trading partner mapping from being sent.
- Trading partners requiring AS2 document transmission can be configured in the pyAS2 server. Documents sent to that trading partner can be dropped off and picked up via SMTP on the AS2 server. Documents dropped off on the server will be automatically processed every 15 minutes.
- The bots webserver should only be turned on for troubleshooting and for configuration. It should not be turned on outside of those scenarios. The web portal is password protected but should not be used on an unsecured network.
- To process pending documents run: C:\python27\Scripts\bots-engine.py --new in python3
- Bots can be configured to send or request confirmation 997 documents based on bots Route, Channel, From Partner, To Partner or Message type.
- A bots channel is a location or destination that bots can direct files to.
- A bots route is the way documents will be moved from one channel to another. Multiple routes can be created for one partner to filter out certain message types or to return 997 documents.
- Each document has two IDs. One is created at the time of document creation and the other is assigned when the message is sent by bots. The ID assigned by bots is used along with the section ID to track message status.
Adding a Trading Partner
- Start the bots web server and log into the web interface
- Go to the configuration tab->Partners
- Click Add partner
- Check the “Active” option
- Enter the partner identification – This is the partner ISA ID / EDI ID
- Give the partner a name
- Click Save
- Three Channels will have to be created to cover document sending and receiving.
- Go to the configuration tab->Channels
- Click Add channel
- Name the channel ID using PARTNERNAME_Inbound_IN
- Select In/Out as in
- Configure the type to match the partner method for document transmission
- If information for the document communication is needed fill in the required information
- Enter Filename as *
- Add an Archive path to C:\EDIDocuments\PartnerName\InboundArchive
- Save the new channel
- Add another channel
- Name the channel ID using PARTNERNAME_Outbound_IN
- Select In/Out as in
- Choose type file
- Enter Path as C:\EDIDocuments\PartnerName\Outbound
- Enter Filename as *.xml
- Add an Archive path to C:\EDIDocuments\PartnerName\Outbound\OutboundArchive\XML
- Save the new channel
- Add another channel
- Name the channel ID using PARTNERNAME_Outbound_OUT
- Select In/Out as out
- Configure the type to match the partner method for document transmission
- If information for the document communication is needed fill in the required information
- Enter Filename as PARTNERNAME_{messagetype}_*.x12
- Optionally an archive folder can be added, but if it is, 2 copies of each sent document will be archived: one from the XML file and one from the x12 file. Bots keeps records of all documents sent in the previous 30 days regardless of the archive setting.
- An inbound and outbound route need to be created to complete the partner creation.
- Go to the configuration tab->Routes
- Click Add route
- Check the “Active” option
- Enter the Idroute as PARTNERNAME_Inbound
- Enter Sequence as 1
- Select the Incoming Channel as PARTNERNAME_Inbound_IN
- Select Fromeditype as x12
- Enter Frommessagetype as x12_one2one
- Set Translate as Nothing
- Save the route
- Add another route
- Check the “Active” option
- Enter the Idroute as PARTNERNAME_Inbound
- Enter Sequence as 10
- Leave the Incoming channel, Fromeditype, Frommessagetype, and Outgoing channel empty
- Set Translate to Translate
- Save the route
- Add another route
- Check the “Active” option
- Enter the Idroute as PARTNERNAME_Inbound
- Enter Sequence as 50
- Leave the Incoming channel, Fromeditype and Frommessagetype empty
- Set Translate to Nothing
- Set the Outgoing channel to PARTNERNAME_Outbound_OUT
- Open the Filtering for outchannel
- Set Toeditype to x12
- Enter Tomessagetype as 997004010
- Save the route
- Add another route
- Check the “Active” option
- Enter the Idroute as PARTNERNAME_Inbound
- Enter Sequence as 100
- Leave the Incoming channel, Fromeditype and Frommessagetype empty
- Set Translate to Nothing
- Set the Outgoing channel to Inbound_OUT
- Save the route
- Add another route
- Check the “Active” option
- Enter the Idroute as PARTNERNAME_Outbound
- Enter Sequence as 1
- Set Incoming channel as PARTNERNAME_Outbound_IN
- Enter Fromeditype as xmlnocheck
- Set Frommessagetype as x12_one2one
- Set Translate to Nothing
- Set the Outgoing channel to Inbound_OUT
- Save the route
- Add another route
- Check the “Active” option
- Enter the Idroute as PARTNERNAME_Outbound
- Enter Sequence as 10
- Leave the Incoming channel, Fromeditype, Frommessagetype and Outgoing channel empty
- Set Translate to Translate
- Save the route
- Add another route
- Check the “Active” option
- Enter the Idroute as PARTNERNAME_Outbound
- Enter Sequence as 100
- Leave the Incoming channel, Fromeditype and Frommessagetype empty
- Set Translate to Nothing
- Set the Outgoing channel to PARTNERNAME_Outbound_OUT
- Save the route
- If any specific formatting is required for the partner, create a file named with the partner's ISA ID (PARTNERISAID.py) in C:\Python27\Lib\site-packages\bots\usersys\partners\x12. Use the syntax from https://bots.readthedocs.io/en/latest/configuration/partner/partner-syntax/.
The partner has now been added in bots. Documents will still have to be mapped and tested to complete the connection.
Selenium – Python Automation
Selenium is a browser automation library. It is used to automate repetitive actions performed in a browser: specifically, scraping order data for automatic order entry, submitting order tracking information, and invoicing orders.
- All of the scripts interact with Commerce Hub, an online vendor marketplace network.
- The Commerce Hub password used to access the site needs to be changed every 3 months.
- Selenium uses XPath to find web elements for interaction.
Website Order Entry
Basic web forms are used to allow sales agents to enter sales orders, service tickets and part slips. The data is stored on the website before being processed by Talend and inserted into QuickBooks. Data that is entered this way has an authorization tag allowing order administrators to review the data before confirmation.
- Sales agents can request sales order confirmations for customers. The confirmation will be sent once the order is authorized.
- Customer and model data is synced from QuickBooks.
AS2 Document Server
AS2 is a data exchange protocol. Most documents sent using the AS2 protocol are encrypted and signed by the sender. The receiver will use a paired key to decrypt the message. In AS2 protocol the sender system will initiate the connection with the receiving system. This means that in order to receive messages the receiving system must be continuously listening for message transmissions. The production AS2 server is hosted on Digital Ocean. The production AS2 server is pyAS2, an open-source AS2 server built on the Django framework.
- Most AS2 partners will request an MDN in return. The AS2 server supports both synchronous and asynchronous MDNs.
- Each partner will have a folder automatically generated for sending and a folder for receiving.
- Documents placed in the outgoing folder will be automatically processed and sent to the corresponding partner.
- Confirmation messages can be customized to meet partner requirements.
- Limited documentation for pyAS2 can be found at: https://django-pyas2.readthedocs.io/en/latest/.
- The server will send documents placed in the partner folders every 15 minutes starting 5 minutes after the hour.
Adding an AS2 Partner
- Go to the Public certificates section
- Click ADD PUBLIC CERTIFICATE
- Click Choose File and upload the partners public certificate
- Click Save
- Go To the Partner section
- Click ADD PARTNER
- Enter the partner name
- Enter the partner’s AS2 Identifier
- Entering the email address is optional
- Enter the Target URL for the partner’s AS2 server
- Add a subject for messages. This will be sent in each message header.
- Set the Content type
- Enter the Confirmation Message: MDN for Message-ID: $messageid From: $sender To: $receiver | Status: Received | This is not a guarantee that the message contents have been processed.
- Click on Security Settings
- Set the encryption settings to match the partner’s specifications and select the partner public certificate
- Click on MDN Settings
- Check Request MDN
- Set MDN Mode to what the partner specifies
- Set Request Signed MDN to SHA-1
- Click on Advanced Settings
- Make sure that Keep Original Filename is NOT checked
- Click Save
The partner is now added. The outbound and inbound folders will need to have permissions modified to allow bots to access the folders via SMTP.
Extended Warranty Store Module
The extended warranty add-on for the webstore allows users to register product warranties and purchase an extended warranty. The module is built for OpenCart, an open-source web store. Customers who register a warranty will receive a confirmation email with a link allowing them to purchase an extended warranty for the serial number they registered. Automated reminder emails go out 30 days after registration and 30 days before the warranty expires, reminding customers to purchase an extended warranty. Customers can opt out of receiving these automated emails. Each product that a warranty can be registered for has a unique set of questions to be answered when the warranty is registered. Different extended warranty pricing is used for each product category.
- Warranty confirmations can be resent if required.
- Emails sent from the web store go through a Mailjet SMTP relay. Mailjet tracks email delivery and link-click analytics. URLs used in the email are replaced with tracking links that record clicks and redirect to the original URL.
Adding an Extended Warranty Category
- In the OpenCart admin page, navigate to System->Localisation->Warranty Periods
- Click Add
- Enter the category name
- Make sure Unlimited is unchecked
- Enter the period the warranty will be extended by
- Leave sort order as 0
- Click Save
- Navigate to Catalog->Products
- Click the “+” to add a product
- Enter the Product Name. This name is what the customer will see.
- Enter the description
- Enter a Meta Tag Title
- Enter tags for each product this extended warranty will apply to. This will make it easier for customers to search for the extended warranty.
- Go to the Data tab
- Enter the Model number. This must be unique.
- Enter the warranty Price
- Select the Tax Class as Taxable Goods
- Set Subtract Stock as No
- Set Requires Shipping as No
- Set Status as Enabled
- Go to the Links Tab
- Add the Extended Warranty Category
- Go to the Warranty Tab
- Select Is Extended Warranty as Yes
- Select the Warranty Period that was added previously as the Extended Warranty Period
- Select Is Warranty as No
- Save the new product
Adding a New Question Sheet for Warranty Registration
- In the OpenCart admin page, navigate to System->Localisation->Warranty Questions
- If new questions need to be created for this sheet click Add
- Enter the Question
- Select the Type of question: Text, Select One, Multi Select, Yes/No or Multi Select with Additional Info
- If more information is required to complete the question enter it now
- Save the question
- Once all Questions have been added navigate to System->Localisation->Warranty Sheets
- Click Add
- Enter the name of the Sheet. This is for internal reference only.
- Add all the questions for the sheet. The Sort Order is the order in which the questions appear to the user.
- Click Save
Adding a Registerable Product
- In the OpenCart admin page, navigate to System->Localisation->Warranty Periods
- If the basic warranty period for the product is not created click Add
- Enter the name of the warranty
- Enter the base Warranty Period
- Click Save
- Navigate to Catalog->Products
- Click the “+” to add a product
- Enter the Model Number as the Product Name
- Enter the Meta Tag as the Model Number
- Go to the Data Tab
- Enter the Model Number
- Set the Status as Disabled
- Go to the Warranty Tab
- Select Is Extended Warranty as No
- Set Extended Warranty Period as the corresponding extended warranty
- Select Is Warranty as Yes
- Select the Standard Warranty Period
- Select the Warranty Sheet
- Click Save
Module Descriptions and Notes
Apache Tomcat Webserver
SyncCATEDIOrders
This servlet will pop up a window and call GetCAT940DetailsFromQB. If any errors are generated during the script run they will be logged in the window. A close window button will appear upon script completion.
Calls
GetCAT940DetailsFromQB
Called By
Tomcat Index
PrepCATEDI
This Servlet will load all unsent orders from EDIDatabase.CAT into an HTML table. Each order row has a checkbox to send or update. When Submit Selected is clicked all the checked orders will be sent via GET to SendCATEDI.
Calls
SendCATEDI
Called By
Tomcat Index
SendCATEDI
This Servlet parses the orders in the get parameters into a list of orders to update and a list to send. The script CATUpdateEDIOrderData is called and the orders to update are sent as a contextual parameter. Any errors that occur in the script will be logged in the returned HTML. Next the table EDIDatabase.CAT will be updated with each order to send being flagged with ToSendFlag=1. Then CATGenerateShipRequestXML will be called and any errors will be logged in the returned HTML.
Calls
CATUpdateEDIOrderData
CATGenerateShipRequestXML
Called By
PrepCATEDI
CATOrderStatus
This Servlet will load data from EDIDatabase.CAT joined with EDIDatabase.MessageTracking and EDIDatabase.CAT_945 into an HTML table. This data will show the status of orders sent and any returned warehouse shipping advices. Orders can be archived from this screen to clean it up if desired.
Calls
ArchiveCat940
Called By
Tomcat Index
ArchiveCat940
This Servlet will parse all orders sent as post parameters and update the table EDIDatabase.CAT to swap archive status for each order.
Calls
Called By
CATOrderStatus
ArchivedCATOrderStatus
ArchivedCATOrderStatus
This Servlet is the same as CATOrderStatus but shows orders that have been archived.
Calls
ArchiveCat940
Called By
Tomcat Index
CATShipConfirmations
This Servlet loads data from EDIDatabase.CAT_945 into an HTML table. The data can be archived by clicking the checkbox next to the row and submitting.
Calls
ArchiveCatShip
Called By
Tomcat Index
ArchiveCatShip
This Servlet will swap the archive status for each shipment parsed from the posted orders in EDIDatabase.CAT_945.
Calls
Called By
CATShipConfirmations
ArchivedCATShipConfirmations
ArchivedCATShipConfirmations
This Servlet is the same as CATShipConfirmations but shows shipments that are archived.
Calls
ArchiveCatShip
Called By
Tomcat Index
PrepCANTREXEDI
This Servlet loads invoices and credits from EDIDatabase.CANTREX_810 into an HTML table. The details for each transaction can be opened from this page and each order can be sent, updated, or archived by checking the corresponding box.
Calls
ArchiveCantrex810Unsent
SendCANTREXEDI
Cantrex810Details
Called By
Tomcat Index
ArchiveCantrex810Unsent
This Servlet swaps the archive status of unsent transactions in EDIDatabase.CANTREX_810.
Calls
Called By
PrepCANTREXEDI
ArchivedCantrexUnsent810
ArchivedCantrexUnsent810
This Servlet loads invoices and credits from EDIDatabase.CANTREX_810 into an HTML table. Only transactions that have been archived will be loaded.
Calls
ArchiveCantrex810Unsent
Called By
Tomcat Index
SendCANTREXEDI
This Servlet parses transactions sent in GET parameters into lists of transactions to send and transactions to update. First the script CANTREXUpdate810DetailsFromQB is called and the orders to update are sent as a context parameter. Then the table EDIDatabase.CANTREX_810 is updated and all the transactions to send are flagged for sending. Then CANTREXGenerate810XML is run.
Calls
CANTREXUpdated810DetailsFromQB
CANTREXGenerate810XML
Called By
PrepCANTREXEDI
SentCantrex810
This Servlet loads data from EDIDatabase.CANTREX_810_SENT and EDIDatabase.MessageTracking into an HTML table. Transactions can be viewed and archived.
Calls
Cantrex810Details
ArchiveCantrex810Sent
Called By
Tomcat Index
ArchiveCantrex810Sent
This Servlet is the same as ArchiveCantrex810Unsent but works on the sent tables.
Calls
Called By
ArchivedSentCantrex810
SentCantrex810
ArchivedSentCantrex810
This Servlet is similar to SentCantrex810 but shows transactions from EDIDatabase.CANTREX_810_SENT that have been archived.
Calls
ArchiveCantrex810Unsent
Called By
Tomcat Index
ArchiveCantrex810Unsent
This Servlet swaps the archive status of unsent transactions in EDIDatabase.CANTREX_810_SENT.
Calls
Called By
ArchivedSentCantrex810
Cantrex810Details
This Servlet shows a basic Invoice/Credit interface. The details are loaded from EDIDatabase.CANTREX_810 and EDIDatabase.CANTREX_810_SAC or the corresponding sent tables. If the transaction has been sent it is possible to flag it for re-sending from this page.
Calls
FlagCantrex810Resend
Called By
PrepCANTREXEDI
SentCantrex810
FlagCantrex810Resend
This Servlet swaps the status allowing a transaction to be re-loaded in the EDIDatabase.CANTREX_810_SENT.
Calls
Called By
Cantrex810Details
GetCantrexInvoices
This Servlet runs the script CANTREXGet810DetailsFromQB and logs any errors in the returned HTML.
Calls
CANTREXGet810DetailsFromQB
Called By
Tomcat Index
GetCantrexCredits
This Servlet runs the script CANTREXGet810DetailsFromQB_Credit and logs any errors in the returned HTML.
Calls
CANTREXGet810DetailsFromQB_Credit
Called By
Tomcat Index
OrdersToInvoice
This Servlet runs the stored procedure OrderTracking.UpdateOrdersToInvoice. Next the updated data from OrderTracking.OrdersToInvoice is loaded into an HTML table. Orders can be filtered and sorted. Each order row has a checkbox that when checked will add it to a list of orders to be invoiced when submitted. There is a date field at the bottom that will set what invoice date to use for the submitted orders.
Calls
OrderTracking.UpdateOrdersToInvoice
InvoiceQBOrders
Called By
Tomcat Index
InvoiceQBOrders
This Servlet parses all the order numbers from the GET parameters and runs a SQL query to pull order data and re-insert it as an invoice. The newly created invoices are then queried to return the invoice reference numbers matched to the order numbers for reference.
Calls
Called By
OrdersToInvoice
OrdersToTrack
This Servlet runs the stored procedure OrderTracking.updateOrdersToTrack. Next, all the updated data is loaded from OrderTracking.ToTrack. Orders that meet the criteria for tracking are available to be selected. The requirements for an order to be automatically trackable are that there is only one tracking number matching the PO number in the database, that there is only a single item on the order, and that the reference number has not already been tracked OR has been flagged to allow a retry. Selected orders can be submitted to add them to the queue for automatic tracking.
Calls
OrderTracking.updateOrdersToTrack
ArchiveOrdersToTrack
SubmitOrdersToTrack
Called By
Tomcat Index
SubmitOrdersToTrack
This Servlet parses order numbers from GET parameters. This list is used to load data from OrderTracking.ToTrack into OrderTracking.PendingTracking. Any data moved is then deleted from OrderTracking.ToTrack.
Calls
Called By
OrdersToTrack
ArchiveOrdersToTrack
This Servlet swaps the archive status for orders in OrderTracking.ToTrack.
Calls
Called By
OrdersToTrack
TrackedOrders
This Servlet loads data from OrderTracking.TrackedOrders into an HTML table. Orders can be flagged for retrying which allows them to be selected in the OrdersToTrack Servlet. If there are any non-critical errors that prevent the script from completely tracking an order it will show here.
Calls
RetryTracking
Called By
Tomcat Index
RetryTracking
This Servlet sets the retry flag to 1 in OrderTracking.TrackedOrders for the order number in the GET parameter.
Calls
Called By
TrackedOrders
AddTrackingInfo
This is NOT a Servlet; it is an HTML page that parses a table out of Excel data pasted into a text field. The parsed data is submitted to a further data parser to complete the tracking data submission.
Calls
ParseTrackingData
Called By
Tomcat Index
ParseTrackingData
This Servlet reads an array of data in from the GET parameters. The first and second columns are checked to see if they are PO numbers. If only one column is filled with PO numbers then the other column is inserted into OrderTracking.TrackingNumbers matched to the PO numbers. Tracking numbers are also checked to prevent duplicate entries.
Calls
Called By
AddTrackingInfo
TrackingNumbers
This Servlet loads data from OrderTracking.TrackingNumbers into an HTML table. If there is more than one tracking number for a PO number then the tracking numbers can be opened for that PO number.
Calls
TrackingNumberDetails
Called By
Tomcat Index
TrackingNumberDetails
This Servlet loads a PO number from the GET parameters. The PO number is used to get all valid tracking numbers from OrderTracking.TrackingNumbers and show them in an HTML table. Tracking numbers can be selected and submitted for deletion.
Calls
DeleteTrackingNumbers
Called By
TrackingNumbers
DeleteTrackingNumbers
This Servlet deletes rows from OrderTracking.TrackingNumbers where the ID is in the GET parameters.
Calls
Called By
TrackingNumberDetails
SyncWebsiteCustomerItems
This Servlet runs the script SyncOnlineDatabase to update the information used by the sales agent forms.
Calls
SyncOnlineDatabase
Called By
Tomcat Index
AssignCostco
This Servlet runs the script AssignCostcoM2H which assigns ship locations for orders being shipped via Metro2Home.
Calls
AssignCostcoM2H
Called By
Tomcat Index
PrepUPS
This Servlet executes the stored procedure OrderTracking.GetCostcoUPS, which loads data from QuickBooks into a table configured for import into UPS WorldShip.
Calls
OrderTracking.GetCostcoUPS
Called By
Tomcat Index
Tomcat Index
The main page served by the Tomcat server.
Calls
SyncCATEDIOrders
PrepCATEDI
CATOrderStatus
ArchivedCATOrderStatus
CATShipConfirmations
ArchivedCATShipConfirmations
PrepCANTREXEDI
ArchivedCantrexUnsent810
SentCantrex810
ArchivedSentCantrex810
GetCantrexInvoices
GetCantrexCredits
OrdersToInvoice
OrdersToTrack
TrackedOrders
AddTrackingInfo
TrackingNumbers
SyncWebsiteCustomerItems
AssignCostco
PrepUPS
Called By
Talend ETL
AssignCostcoM2H
This job runs the AssignCostco.py script and sends the script output in an email to involved users. If there is an error the script admin will be notified.
Calls
AssignCostco.py
Called By
AssignCostco
CheckAndRestartQODBC
This job is scheduled to run every minute. The job will check if QODBC is running and restart it if it is not. The log file will also be read and if QODBC has been attempting to connect for 2 minutes it will be restarted.
Calls
restart_qodbc.bat
check_qodbc.bat
Called By
Task Scheduler
CopyOldQODBCLogs
This job will run once a day and copy log files from QODBC into a master log file. The old logs will then be deleted.
Calls
Called By
Task Scheduler
ImportCHUBOrders
This job will run the script ImportOrders.py. The script output will be emailed to involved users. The script will generate two spreadsheets that get emailed along with a string of order data that is used to insert new orders into QODBC. This job is run once each morning.
Calls
ImportOrders.py
Called By
Task Scheduler
Last15DayOrderReport
This job, despite the name, will generate two spreadsheets with data from QuickBooks cross referenced with data from EDIDatabase.TrackedOrders, EDIDatabase.PendingTracking and EDIDatabase.ChubOrderStatus. One of the spreadsheets is a filtered version showing only the orders requiring immediate user action.
Calls
Called By
Task Scheduler
LogJobErrors
This job is included as a sub-job in all other jobs. If an error occurs in other jobs this job will be called and passed details of the error. Those details will be logged in a log file and a notification sent to the script admin. If a notice has been sent in the last hour then the error will be logged but no notice will be sent.
Calls
Called By
All Jobs
TrackCHUBOrders
This job runs the script TrackOrders.py. The output of the script is emailed to the involved users.
Calls
TrackOrders.py
Called By
Task Scheduler
Run_Bots
This job runs bots to send pending documents. Documents that have failed to send are also retried. After bots has run the incoming documents are read by the first document parser: ParseEDITypePartner. Any messages that have been sent have the MessageID updated with the ID generated by bots during document transmission. This job is run every 15 minutes.
Calls
UpdateEDIMessageInfo
ParseEDITypePartner
Called By
Task Scheduler
ParseEDITypePartner
This job reads in the list of inbound XML documents generated by bots. Each file is broken apart according to EDI standards and the transaction envelopes are sent to the corresponding document parser.
Calls
CANTREXParseDoctype
CATParseDoctype
StirlingParseDoctype
Called By
Run_Bots
UpdateEDIMessageInfo
This job reads the data file generated by bots and updates the table EDIDatabase.MessageTracking accordingly.
Calls
Called By
Run_Bots
CANTREXParseDoctype
This job parses the passed EDI envelope to break apart transaction types. In the case of this partner, only 997 documents will be transmitted so the document is sent to the Parse997 job.
Calls
Parse997
Called By
ParseEDITypePartner
CATParseDoctype
This job works like the other partner document parsers. This partner will send 945 transactions as well as 997.
Calls
Parse997
CAT_Parse945
Called By
ParseEDITypePartner
StirlingParseDoctype
This job is used for sending and receiving test documents. It works the same as the other document parsers.
Calls
Parse997
Called By
ParseEDITypePartner
Parse997
This job reads in the Section ID, Message ID and the acknowledgment status. These are used to update the EDIDatabase.MessageTracking table.
Calls
Called By
CATParseDoctype
CANTREXParseDoctype
StirlingParseDoctype
CAT_Parse945
This job maps the 945 warehouse shipping advices into the table EDIDatabase.CAT_945.
Calls
Called By
CATParseDoctype
ProcessNew945s
This job runs after each bots run but is not triggered by the Run_Bots job. It loads all the recent 945 confirmations and compares them with the expected shipment requests. If the requested shipment matches what was confirmed as shipped, the order will be invoiced automatically. If an order is invoiced and also fits the automatic tracking requirements, it will be added to the tracking queue automatically.
Calls
Called By
Task Scheduler
CANTREXGenerate810XML
This job reads in data from EDIDatabase.CANTREX_810 and EDIDatabase.CANTREX_810_SAC. Transactions will be mapped into an XML document using the EDI structure and element names for an 810 transaction. The XML document will loop on the Reference Number with different transaction rows. The SAC fields are written to an XML segment, converted to a string and injected into the main document. This was done because the Talend tWriteXML component can only have a single XML loop. In the next implementation of an XML/EDI mapping the component tAdvancedFileOutputXML should be used. That component allows multiple XML loops to be combined in one document and would cut out 7 steps of the current document generation process.
Calls
EDIDatabase.SENDCANTREXINVOICE
Called By
SendCANTREXEDI
CANTREXGet810DetailsFromQB_Credit
This job loads credit data from QODBC.CreditMemoLine and maps it to EDIDatabase.CANTREX_810 and EDIDatabase.CANTREX_810_SAC. The table EDIDatabase.CANTREX_810 stores the main transaction data and has a different row for each transaction line. The SAC component of the 810 structure is a separate loop and is stored in the EDIDatabase.CANTREX_810_SAC table. The primary reference key between the tables is the QuickBooks transaction reference number, which means that only one version of each transaction can be stored in each table at one time. Data loaded from QuickBooks is checked to make sure the reference number does not already exist in the table.
Calls
Called By
GetCantrexCredits
CANTREXGet810DetailsFromQB
This job is the same as CANTREXGet810DetailsFromQB_Credit but loads data from QODBC.InvoiceLine.
Calls
Called By
GetCantrexInvoices
CANTREXUpdated810DetailsFromQB
This job receives a comma-separated list of reference numbers as a contextual parameter. These reference numbers are loaded from QODBC.CreditMemoLine and QODBC.InvoiceLine and replace any rows with matching reference numbers. This job is used to update transaction information that was loaded from QuickBooks and found to be incorrect before it was sent. The transaction can be corrected in QuickBooks and then updated via this job.
Calls
Called By
GetCantrexCredits
CATGenerateShipRequestXML
This job generates 940 warehouse ship requests from the table EDIDatabase.CAT. There is only one data loop for this document. Unlike the Cantrex data flow, CAT does not have a table for unsent data and a table for sent data. All the data is stored on a single table with the column “SentEDI” used to indicate the send status.
Calls
Called By
SendCATEDI
CATUpdateEDIOrderData
This job uses a list of comma separated reference numbers to load data from SalesOrderLine. The loaded data is used to update EDIDatabase.CAT. Rows matching the reference numbers will be updated.
Calls
Called By
SendCATEDI
GetCAT940DetailsFromQB
This job loads data from QODBC.SalesOrderLine into the table EDIDatabase.CAT. The table EDIDatabase.CAT is structured to make it easier to map the columns to the EDI 945 document structure.
Calls
Called By
SyncCATEDIOrders
ManualConfirmationEmail
This job is used to send out sales agent confirmation emails for orders. If an order needs to be changed after the email has been sent, this job can be used to resend the confirmation. To select the correct SA number, change the value in tSetGlobalVar_2; to select the email address, change the email in tFixedFlowInput_1.
Calls
Called By
SendConfirmationEmails
This job checks the online StirlingMySQL.orders table for all orders that have a pending requested confirmation. If the order has been authorized in QuickBooks the order confirmation email will be sent to the sales agent that placed the order.
Calls
Called By
SyncStirlingMarathonComOrders
SyncOnlineDatabase
This job loads QODBC.Item into StirlingMySQL.warranty_items; QODBC.Item into StirlingMySQL.item; and QODBC.Customer into StirlingMySQL.customer. This job is used to update the information used by sales agents to complete orders and service requests in online forms.
Calls
Called By
SyncWebsiteCustomerItems
SyncStirlingMarathonComOrders
This job loads orders entered online by sales agents into QuickBooks. The data comes from StirlingMySQL.orders and is loaded into QODBC.SalesOrderLine. QuickBooks allows users to create a limited number of custom fields. These custom fields cannot be inserted into QODBC in the same table as SalesOrderLine. Instead, they are inserted after the initial data insert into QODBC.CustomField with a reference to the transaction ID.
Calls
SendConfirmationEmails
Called By
Task Scheduler
DownloadNewPartSlips
This job loads data from StirlingMySQL.partslips into QODBC.EstimateLine. A notification email goes out for each part slip that is imported.
Calls
Called By
Task Scheduler
DownloadNewServiceTickets
This job loads data from StirlingMySQL.servicetickets into QODBC.EstimateLine. A notification email goes out for each service ticket that is imported.
Calls
Called By
Task Scheduler
Microsoft SQL
EDIDatabase.SENDCANTREXINVOICE
This stored procedure is called to move data from EDIDatabase.CANTREX_810 and EDIDatabase.CANTREX_810_SAC to the corresponding sent tables. 3 parameters are used to call the procedure. The parameter BIG02_InvNumber is used as the primary key to query data. The two other parameters, InternalMessageID and SectionID are added to the data and all 3 parameters are inserted into the table EDIDatabase.MessageTracking. Once the data has been inserted all the old data is cleared out.
Calls
Called By
CANTREXGenerate810XML
OrderTracking.GetConsignmentOrderInfo
This stored procedure loads data from QODBC.SalesOrderLine and QODBC.InvoiceLine. The data is cross-referenced to determine the total dollar value for the consolidation group of orders that remains to be shipped. This procedure is used by a crystal report to help order administrators determine which consolidation groups to ship.
Calls
Called By
Crystal ConsolidationTotal
OrderTracking.GetConsignmentOrderInfoMulti
This stored procedure is the same as GetConsignmentOrderInfo but will take multiple consolidation numbers instead of one.
Calls
Called By
Crystal ConsolidationTotalMulti
OrderTracking.GetCostcoUPS
This stored procedure loads data from QODBC.SalesOrder into OrderTracking.CostcoUPS. The data in this table can be imported into UPS WorldShip.
Calls
Called By
PrepUPS
OrderTracking.MoveTrackedOrder
This stored procedure moves order data from OrderTracking.PendingTracking to OrderTracking.TrackedOrders. It also inserts a note column with details on the success or failure of the order that was tracked.
Calls
Called By
TrackOrders.py
OrderTracking.UpdateOrdersToInvoice
This stored procedure loads data from QODBC.SalesOrder into OrderTracking.OrdersToInvoice. The data can then be used to select which orders to invoice.
Calls
Called By
OrdersToInvoice
OrderTracking.updateOrdersToTrack
This stored procedure loads data from QODBC.InvoiceLine into OrderTracking.ToTrack. Invoices that meet the criteria to be automatically tracked will be stored with Trackable=1. This will be used to allow the invoice to be selected and submitted for tracking. Only invoices with a date in the previous 30 days will be loaded.
Calls
Called By
OrdersToTrack
Python Selenium
TrackOrders.py
This script loads all the queued orders from OrderTracking.PendingTracking and uses the data to automatically invoice and track orders in Commerce Hub. The script uses the Selenium library to step through the web forms and fill in information. If an error occurs during order entry, a note will be added to the order data in OrderTracking.TrackedOrders when the script ends. The script uses the Selenium Chrome webdriver, and the chromedriver.exe file will need to be updated when Chrome updates. Commerce Hub requires passwords to be reset every 3 months. When it is time to reset the password, a different page will load and the script will be unable to log in. To prevent this, the password should be changed every 2 to 2.5 months and updated in the script accordingly. Product mappings must also be used to ensure that the correct product was shipped.
Calls
OrderTracking.MoveTrackedOrder
Called By
TrackCHUBOrders
ImportOrders.py
This script reads all the orders in the “download” queue on Commerce Hub. The data is then written to an Excel file and as a string for use in the insert SQL query. Any orders that fail to be read will be written to a separate Excel file with the reason for the failure. The two spreadsheets will be emailed to the involved users by the Talend script ImportCHUBOrders.
Calls
Called By
ImportCHUBOrders
CheckOrders.py
This script loads orders for customers serviced through Commerce Hub that are invoiced and not confirmed as tracked in OrderTracking.TrackedOrders. Each order is opened in Commerce Hub to check if it has been closed. The status of the order is saved in OrderTracking.ChubOrderStatus. This information is used to generate reports on what orders require user action.
Calls
Called By
Task Scheduler
AssignCostco.py
This script loads orders being shipped via Metro 2 Home. A list of orders that are expected to ship this way is available on Commerce Hub. These shipments must be assigned a ship location. The location assigned in QuickBooks is used to assign the location in Commerce Hub. A confirmation email will be sent out by the Talend script AssignCostcoM2H.
Calls
Called By
AssignCostcoM2H