
Exchanging Data with Triumph Intelligence

How to send us your data so we can train your machine learning model


Overview

To give you the most accurate and precise predictions possible, the Triumph Rates team needs to train a machine learning model on historical data from your brokerage. This is how we make sure the rates you see are tailored to your brokerage's buying power.


What Data to Send

We'll need a .csv or .xls file with your historical data. The easiest way to send it is to email the file to your customer success manager or business analyst. The more historical loads you include, the better: two years or more of load data is ideal. At a minimum, the file must list 544 loads, with at least 32 of them from the last two weeks.

Your .csv or .xls file should have the following fields:

  • Pick up city*

  • Pick up state*

  • Pick up zip

  • Drop off city*

  • Drop off state*

  • Drop off zip

  • Mileage

  • Equipment type*

  • Actual buy rate (Linehaul + Fuel)*

  • Weight

  • Booked date + time*

  • Pickup date + time*

  • Covered date + time*

  • Commodity description

* = mandatory field
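
If it helps to see the expected layout, below is a minimal sketch (in Python) that writes a .csv with these columns and one illustrative row. The exact header spellings, file name, and sample values are our assumptions, not a required template; match the fields to what your TMS actually exports.

```python
import csv

# Hypothetical output file name; use whatever naming your team prefers.
OUTPUT_FILE = "historical_loads.csv"

# Column headers matching the fields listed above (spellings are illustrative).
FIELDNAMES = [
    "Pick up city", "Pick up state", "Pick up zip",
    "Drop off city", "Drop off state", "Drop off zip",
    "Mileage", "Equipment type", "Actual buy rate (Linehaul + Fuel)",
    "Weight", "Booked date + time", "Pickup date + time",
    "Covered date + time", "Commodity description",
]

# A single illustrative row; in practice these come from your TMS export.
sample_row = {
    "Pick up city": "Dallas",
    "Pick up state": "TX",
    "Pick up zip": "75201",
    "Drop off city": "Atlanta",
    "Drop off state": "GA",
    "Drop off zip": "30303",
    "Mileage": "781",
    "Equipment type": "Van",
    "Actual buy rate (Linehaul + Fuel)": "1850.00",
    "Weight": "42000",
    "Booked date + time": "2024-01-02 09:15",
    "Pickup date + time": "2024-01-03 08:00",
    "Covered date + time": "2024-01-02 14:30",
    "Commodity description": "Paper products",
}

with open(OUTPUT_FILE, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
    writer.writeheader()
    writer.writerow(sample_row)
```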


How to Send Data

Historical Data

You can send your historical data to Triumph in a number of different ways:

  • Email it to your customer success manager or business analyst.

  • Send it using SFTP file transfer (see the upload sketch after this list)

  • Through your API

  • Through your customer success manager via manual upload

  • By custom integration
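
For the SFTP option, here is a minimal upload sketch using the Python paramiko library. The host name, credentials, and destination path are placeholders; your customer success manager or business analyst will provide the real connection details.

```python
import paramiko

# Placeholder connection details -- your business analyst or customer success
# manager will supply the actual host, credentials, and destination folder.
HOST = "sftp.example.com"
USERNAME = "your-sftp-user"
PASSWORD = "your-sftp-password"
LOCAL_FILE = "historical_loads.csv"
REMOTE_PATH = "/uploads/historical_loads.csv"

transport = paramiko.Transport((HOST, 22))
try:
    transport.connect(username=USERNAME, password=PASSWORD)
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.put(LOCAL_FILE, REMOTE_PATH)  # upload the export
    sftp.close()
finally:
    transport.close()
```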

Continuous Data

Once Triumph has your initial historical data, you can transfer future data through an API, SFTP, or TMS integration.

Maintaining an accurate model requires a steady flow of data, so if API, SFTP, or TMS integration isn't available, you'll need to send regular updates by email. Make sure your business analyst knows the email address you'll be sending files from so they can set up the automatic upload process. Send an update to import_load@greenscreens.ai at least once a week so your data stays ahead of the 30-day rolling look-back.
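
If you do send weekly email updates, the sketch below shows one way to cut a rolling 30-day file from a larger export. The in-memory loads list, date format, and output file name are illustrative assumptions; in practice you would pull the rows from your TMS.

```python
import csv
from datetime import datetime, timedelta

# Hypothetical in-memory export; each dict uses the same column names
# as the historical file (only a few fields shown here for brevity).
loads = [
    {"Pick up city": "Dallas", "Pick up state": "TX",
     "Drop off city": "Atlanta", "Drop off state": "GA",
     "Equipment type": "Van",
     "Actual buy rate (Linehaul + Fuel)": "1850.00",
     "Booked date + time": "2024-01-02 09:15",
     "Pickup date + time": "2024-01-03 08:00",
     "Covered date + time": "2024-01-02 14:30"},
]

cutoff = datetime.now() - timedelta(days=30)

# Keep only loads booked inside the 30-day rolling window.
recent = [
    row for row in loads
    if datetime.strptime(row["Booked date + time"], "%Y-%m-%d %H:%M") >= cutoff
]

with open("weekly_update.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(loads[0].keys()))
    writer.writeheader()
    writer.writerows(recent)
```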


Data Flow
