# How to Import CSV Files

### About this export

| Field | Value |
| --- | --- |
| **content_type** | lesson |
| **platform** | contentstack-academy |
| **source_url** | https://www.contentstack.com/academy/courses/use-cases-and-tutorials/how-to-import-csv-files |
| **course_slug** | use-cases-and-tutorials |
| **lesson_slug** | how-to-import-csv-files |
| **markdown_file_url** | /academy/md/courses/use-cases-and-tutorials/how-to-import-csv-files.md |
| **generated_at** | 2026-04-28T06:55:50.127Z |

> Part of **[Use Cases and Tutorials](https://www.contentstack.com/academy/courses/use-cases-and-tutorials)** on Contentstack Academy. **Academy MD v3** — structured for retrieval; no quiz or assessment keys.

<!-- ai_metadata: {"lesson_id":"13","type":"video","duration_seconds":315,"video_url":"https://cdn.jwplayer.com/previews/SODo168W","thumbnail_url":"https://cdn.jwplayer.com/v2/media/SODo168W/poster.jpg?width=720","topics":["How","Import","CSV","Files"]} -->

#### Video details

#### At a glance

- **Title:** How To Import CSV Files
- **Duration:** 5m 15s
- **Media link:** https://cdn.jwplayer.com/previews/SODo168W
- **Publish date (unix):** 1751777052

#### Streaming renditions

- application/vnd.apple.mpegurl
- audio/mp4 · AAC audio · 113670 bps (~114 kbps)
- video/mp4 · 180p (316×180) · 155585 bps (~156 kbps)
- video/mp4 · 270p (476×270) · 186145 bps (~186 kbps)
- video/mp4 · 360p (634×360) · 211750 bps (~212 kbps)
- video/mp4 · 406p (712×406) · 225927 bps (~226 kbps)

#### Timed text tracks (delivery)

- **thumbnails:** `https://cdn.jwplayer.com/strips/SODo168W-120.vtt`

#### Transcript

We'll discuss how to import CSV files into Lytics. This is one of many common workflows for importing user information from various sources, and it can be particularly helpful for unique datasets or for systems that don't yet have a built-in integration with Lytics. This can be useful for bringing in offline data from sources like trade show visits, for campaign lists targeting known consumers, for uploading enrichment information about existing users, or for historical data during migrations away from existing CRMs, CMSs, or other marketing systems.

Our system can connect to your existing data storage locations, including your SFTP file server, AWS S3 buckets, Google Cloud Storage, or a Lytics-managed SFTP directory. The import workflow can be configured to import files with custom delimiters, and PGP encryption can be used to ensure privacy-compliant data transfers. CSV imports can be set up in two ways depending on your needs: either as a one-time import or as an automated continuous import.

There are a couple of requirements for any data import. First, we must have a unique identifier for the user, like an email, a cookie, or a Salesforce ID, so that we can merge these user attributes into the right profile. This is covered in more detail in our videos on Identity Resolution. A second requirement is that any user attributes you import must be mapped in the Lytics query language (LQL) fields on your account. LQL field mapping is covered in more detail in another video, but the quick rule of thumb is that if no users have already had the attribute mapped, and it doesn't show up on the schema audit page, then we will need to map it before importing the file. The schema audit page provides a full listing of the mapped fields on your account. The last requirement is that the file must have a header row that labels which fields are which.

So I'm going to show you how to import a CSV file from an Amazon AWS S3 bucket.
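Before the walkthrough, the import requirements described above (a unique identifier, mapped attributes, and a header row) can be sketched as a quick pre-flight check. This is not Lytics code, just a minimal illustration; the set of identifier column names is an assumption based on the examples given in the video (email, cookie, Salesforce ID).

```python
import csv
import io

# Hypothetical identifier columns; Lytics merges rows into profiles
# by a unique identifier such as email (see Identity Resolution).
IDENTIFIER_FIELDS = {"email", "cookie", "salesforce_id"}

def check_import_file(text: str) -> list[str]:
    """Return a list of problems that would block a CSV import."""
    problems = []
    reader = csv.DictReader(io.StringIO(text))
    header = reader.fieldnames or []
    if not header:
        problems.append("file has no header row")
    elif not IDENTIFIER_FIELDS & {h.strip().lower() for h in header}:
        problems.append("no unique identifier column found")
    return problems

sample = "email,first_name,city\njane@example.com,Jane,Austin\n"
print(check_import_file(sample))  # → []
```

A file that passed this check would still need its non-identifier columns mapped in LQL before import, which this sketch cannot verify locally.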
While logged into the account, I'm going to navigate to Data, and then to the Integrations tab. From there, I'm going to select the Amazon AWS tile, and we'll see that I have four previously run workflows. If you have not run any workflows on the account yet, the first thing you may need to do is add an authorization. There are several different ways you can authorize your Amazon account, including AWS keys, delegating access using AWS IAM management, or adding AWS keys with an additional PGP key to encrypt or decrypt data transfers. I already have an authorization on this account, so I'm going to go back to the workflows.

Now I'm going to configure the workflow. By selecting New Workflow, I get a list of all the different options we can use for importing or exporting data from Lytics. I'm going to use the Import CSV workflow here. We'll select the authorization that has been added to the account, and then we have a couple of options that we need to choose. First, we need to choose a stream that this data will be imported to. The fields that I am importing are mapped on the default stream, so I'm going to select the default stream here. Next, we can choose which bucket we're importing from. I'm going to use the Lytics CSV demo bucket. And then the last option that we're required to choose is the file. I only have one file in my bucket, so I'm going to select the only one there.

Now from here, we could click Start Import, and it will run and start processing. However, there are a few other options we could configure if needed. If you were using a pipe-delimited or tab-delimited file, we could change that option here. A date or datetime field can be selected if the records that we're importing need to be tied back to when the event occurred. We don't need to configure that for this file, but it is an option if you want it. You can select which fields to import or which ones to exclude if needed, but by default all will be included.
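The delimiter and date-field options above have direct analogues in how such a file is parsed. This is a purely illustrative sketch using Python's standard `csv` module, with hypothetical column names; it shows what a pipe-delimited file with an event timestamp column looks like, not how Lytics parses it internally.

```python
import csv
import io
from datetime import datetime

# A pipe-delimited export with an event timestamp column.
# The "custom delimiter" workflow option corresponds to the
# delimiter argument; the "date field" option corresponds to
# picking which column carries the event time.
raw = "email|plan|signed_up\njane@example.com|pro|2023-01-10T14:30:00\n"

reader = csv.DictReader(io.StringIO(raw), delimiter="|")
for row in reader:
    event_time = datetime.fromisoformat(row["signed_up"])
    print(row["email"], event_time.date())  # → jane@example.com 2023-01-10
```

Tab-delimited files work the same way with `delimiter="\t"`; the key point is that the delimiter must be declared up front, just as in the workflow configuration.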
And then there are also options for continuous imports, where you can import multiple files across many different days. These advanced options are discussed in more detail in the documentation linked on the left-hand side of the screen. So I'm going to click Start Import, and we can see in the dialog box at the bottom left of the screen that the workflow has been successfully started. We can click on the active workflow now and check its status. Once it loads, we can see that this workflow has synced and added three user profiles to our account. It has successfully completed. That's how you import a CSV file into Lytics from your S3 bucket. Thanks for watching!
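The continuous-import option mentioned above typically implies watching a location for new files that follow a naming convention. The sketch below is purely illustrative and assumes a hypothetical daily-drop pattern like `users_YYYY-MM-DD.csv`; the actual file-selection logic in a Lytics continuous import may differ, so consult the documentation linked in the workflow screen.

```python
import fnmatch

# Hypothetical daily-drop convention: one users_ file per day lands
# in the bucket, alongside files from other feeds that this import
# should ignore.
pattern = "users_*.csv"
keys = [
    "users_2023-01-10.csv",
    "users_2023-01-11.csv",
    "orders_2023-01-10.csv",  # different feed, not picked up
]
to_import = [k for k in keys if fnmatch.fnmatch(k, pattern)]
print(to_import)  # → ['users_2023-01-10.csv', 'users_2023-01-11.csv']
```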

#### Subtitles (WebVTT)

```webvtt
WEBVTT

1
00:00:00.000 --> 00:00:04.700
We'll discuss how to import CSV files into Lytics.

2
00:00:04.700 --> 00:00:08.680
This is one of the many common workflows for importing user information from various sources,

3
00:00:08.680 --> 00:00:13.360
and can be particularly helpful for unique datasets, or for systems that don't yet have

4
00:00:13.360 --> 00:00:15.840
a built-in integration into Lytics.

5
00:00:15.840 --> 00:00:19.640
This can be useful for bringing in offline data from sources like trade show visits,

6
00:00:19.640 --> 00:00:24.040
for campaign lists for targeting known consumers, uploading enrichment information about existing

7
00:00:24.040 --> 00:00:30.880
users, or for historical data during migrations away from existing CRMs, CMSs, or other marketing

8
00:00:30.880 --> 00:00:33.120
systems.

9
00:00:33.120 --> 00:00:39.400
Our system can connect to your existing data storage locations, including your SFTP file

10
00:00:39.400 --> 00:00:48.180
server, AWS S3 buckets, Google Cloud Storage, or a Lytics-managed SFTP directory.

11
00:00:48.180 --> 00:00:52.280
The import workflow can be configured to import files with custom delimiters, and PGP encryption

12
00:00:52.280 --> 00:00:56.060
can be used to ensure privacy-compliant data transfers.

13
00:00:56.060 --> 00:01:02.540
CSV imports can be set up in two ways depending on your needs, either as a one-time import

14
00:01:02.540 --> 00:01:06.300
or as an automated continuous import.

15
00:01:06.300 --> 00:01:09.860
There will be a couple requirements that we have for any data import.

16
00:01:09.860 --> 00:01:16.500
First, we must have a unique identifier for the user, like an email, a cookie, or a Salesforce

17
00:01:16.500 --> 00:01:21.060
ID, so that we can merge these user attributes to the right profile.

18
00:01:21.060 --> 00:01:24.560
This is covered in more detail in our videos on Identity Resolution.

19
00:01:24.560 --> 00:01:31.240
A second requirement is that any user attributes you import must be mapped in the Lytics query

20
00:01:31.240 --> 00:01:33.280
language fields on your account.

21
00:01:33.280 --> 00:01:37.720
LQL field mapping is covered in more detail in another video, but the quick rule of thumb

22
00:01:37.720 --> 00:01:42.040
is that if no users have already had the attribute mapped, and it doesn't show up in the schema

23
00:01:42.040 --> 00:01:47.320
audit page, then we will need to map it before importing the file.

24
00:01:47.320 --> 00:01:54.100
The schema audit page provides a full listing of the mapped fields on your account.

25
00:01:54.100 --> 00:01:57.900
The last requirement for our file to import is that it must have a header that labels

26
00:01:57.900 --> 00:02:01.560
which fields are which.

27
00:02:01.560 --> 00:02:08.020
So I'm going to show you how to import a CSV file from an Amazon AWS S3 bucket.

28
00:02:08.020 --> 00:02:12.180
While logged into the account, I'm going to navigate to Data, and then to the Integrations

29
00:02:12.180 --> 00:02:13.580
tab.

30
00:02:13.580 --> 00:02:20.280
From there, I'm going to select the Amazon AWS tile, and we'll see that I have four previously

31
00:02:20.280 --> 00:02:23.660
run workflows.

32
00:02:23.660 --> 00:02:28.160
If you have not run any workflows on the account yet, the first thing you may need to do is

33
00:02:28.160 --> 00:02:30.480
add an authorization.

34
00:02:30.480 --> 00:02:36.440
There are several different ways you can authorize your Amazon account, including AWS keys, delegating

35
00:02:36.460 --> 00:02:45.460
access using AWS IAM management, or adding AWS keys with an additional PGP key to encrypt

36
00:02:45.460 --> 00:02:51.140
or decrypt data transfers.

37
00:02:51.140 --> 00:02:54.940
I already have an authorization on this account, so I'm going to go back to the workflows.

38
00:02:54.940 --> 00:02:58.800
Now I'm going to configure the workflow.

39
00:02:58.800 --> 00:03:04.100
By selecting New Workflow, I get a list of all the different options we can use for importing

40
00:03:04.100 --> 00:03:07.080
or exporting data from Lytics.

41
00:03:07.080 --> 00:03:10.840
I'm going to use the Import CSV workflow here.

42
00:03:10.840 --> 00:03:16.280
We'll select the authorization that has been added to the account, and then we have a couple

43
00:03:16.280 --> 00:03:18.040
options that we need to choose.

44
00:03:18.040 --> 00:03:22.200
First, we need to choose a stream that this data will be imported to.

45
00:03:22.200 --> 00:03:27.280
The fields that I am importing are mapped on the default stream, so I'm going to select

46
00:03:27.280 --> 00:03:29.480
the default stream here.

47
00:03:29.480 --> 00:03:33.000
Next, we can choose which bucket we're importing from.

48
00:03:33.000 --> 00:03:36.820
I'm going to use the Lytics CSV demo bucket.

49
00:03:36.820 --> 00:03:42.460
And then the last option that we're required to choose is the file.

50
00:03:42.460 --> 00:03:47.980
I only have one file in my bucket, so I'm going to select the only one there.

51
00:03:47.980 --> 00:03:52.380
Now from here, we could click Start Import, and it will run and start processing that.

52
00:03:52.380 --> 00:03:57.020
However, there are a few other options we could configure if needed.

53
00:03:57.020 --> 00:04:01.640
If you were using a pipe delimited or tab delimited file, we could change that option

54
00:04:01.640 --> 00:04:02.640
here.

55
00:04:02.640 --> 00:04:06.400
A date or datetime field can be selected if the records that we're importing need to be

56
00:04:06.400 --> 00:04:09.120
tied back to when the event occurred.

57
00:04:09.120 --> 00:04:15.220
We don't need to configure that for this file, but it is an option if you wanted it.

58
00:04:15.220 --> 00:04:20.240
You can select which fields to import or which ones to exclude if needed, but by default

59
00:04:20.240 --> 00:04:21.240
all will be included.

60
00:04:22.220 --> 00:04:27.900
And then there are also options for continuous imports, where you can import multiple files

61
00:04:27.900 --> 00:04:30.740
across many different days.

62
00:04:30.740 --> 00:04:36.980
These advanced options are discussed in more detail in the documentation on the left-hand

63
00:04:36.980 --> 00:04:39.940
side of the screen.

64
00:04:39.940 --> 00:04:45.420
So I'm going to click Start Import, and we can see that the workflow has been successfully

65
00:04:45.420 --> 00:04:50.620
started in the screen dialog box on the bottom left-hand side of the screen.

66
00:04:50.620 --> 00:04:57.540
We can click on the active workflow now and check the status of it.

67
00:04:57.540 --> 00:05:04.180
Once it loads, we can see that this workflow has synced and added three user profiles to

68
00:05:04.180 --> 00:05:05.180
our account.

69
00:05:05.180 --> 00:05:10.740
It has successfully completed.

70
00:05:10.740 --> 00:05:14.840
That's how you import a CSV file into Lytics from your S3 bucket.

71
00:05:14.840 --> 00:05:15.520
Thanks for watching!

```


#### Lesson text

Quick tutorial on how to import user data into Lytics via CSV upload.

## Importing Custom Data

### Video Tutorial

**Note:** On January 10, 2023, we upgraded our UI with a new, refreshed interface. All of the underlying functionality is the same, but you will notice that things look a little different from this Academy guide. The most notable change is that the navigation menu has moved from the top of the app to the left side. We appreciate your patience as we work on updating our Academy.

In this tutorial, we'll show you how to import CSV files into Lytics. This is a common workflow to import user data from various sources, and it's particularly useful for uploading unique datasets or connecting to providers that don't have a pre-built integration in Lytics.

Watch the "How to Import CSV Files" video (5 mins) for a walkthrough of setting up a one-time or recurring CSV upload from SFTP or other secure locations.

**Note:** The integrations list now lives in the **Jobs** section of the Lytics UI (formerly the "Integrations" tab). Navigate to **Data > Jobs > New Job** to find the list of integrations in your account.

### Knowledge Check

**Custom delimiters can be selected for CSV file imports.**

A. True

B. False

Answer: A - This is true. Custom delimiters can be chosen for imports, including commonly used delimiters like tabs and pipes.

**Which systems can Lytics import CSV files from? Select all that apply.**

A. Amazon Web Services (AWS) S3 buckets

B. Google Cloud Storage

C. SFTP servers

D. A collection of floppy disks in my basement

Answer: A, B, C

**PGP encryption is required for all data imports.**

A. True

B. False

Answer: B - This is false. Importing PGP encrypted files is supported, but not required.

**Can files be automatically imported on an ongoing or recurring basis?**

A. Yes, this can be configured by selecting "Keep Updated".

B. No, you must import each file manually.

Answer: A - Yes, file import workflows can be scheduled for recurring imports.

### Next Steps

Here are recommended resources to continue learning about importing custom data into Lytics.

#### Academy Courses

*   Connecting Integrations
*   Jobs and Authorizations

#### Documentation

*   [Custom Integrations](https://learn.lytics.com/documentation/product/integrations/custom-integrations/overview) (for more information on using CSV or JSON imports)
*   [Custom Data Sources](https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/custom-data-sources)

#### Key takeaways

- Connect **How to Import CSV Files** back to your stack configuration before moving to the next module.
- Capture one concrete artifact (screenshot, Postman call, or code snippet) that proves the step works in your environment.
- Re-read the delivery versus management boundary for anything you changed in the entry model.

## Supplement for indexing

### Content summary

How to Import CSV Files. Quick tutorial on how to import user data into Lytics via CSV upload. Importing Custom Data Video Tutorial Note: On January 10, 2023, we upgraded our UI with a new, refreshed interface. All of the underlying functionality is the same, but you will notice that things look a little different from this Academy guide. The most notable change is that the navigation menu has moved from the top of the app to the left side. We appreciate your patience as we work on updating our Academy. In this tutorial, we'll show you how to import CSV files into Lytics. This is a common workflow to import user data from various sources, and it's particularly useful for uploading unique datasets or connecting to providers that don't have a pre-built integration in Lytics.

### Retrieval tags

- How
- Import
- CSV
- Files
- use-cases-and-tutorials
- lesson 13
- How to Import CSV Files
- use-cases-and-tutorials lesson

### Indexing notes

Index this lesson as a primary chunk tagged with lesson_id "13" and topics: [How, Import, CSV, Files].
Parent course slug: use-cases-and-tutorials. Use asset_references URLs as thumbnail hints in search results when present.
Never surface LMS quiz content or assessment answers from this file.

### Asset references

| Label | URL |
| --- | --- |
| Video thumbnail: How to Import CSV Files | `https://cdn.jwplayer.com/v2/media/SODo168W/poster.jpg?width=720` |

### External links

| Label | URL |
| --- | --- |
| Contentstack Academy home | `https://www.contentstack.com/academy/` |
| Training instance setup | `https://www.contentstack.com/academy/training-instance` |
| Academy playground (GitHub) | `https://github.com/contentstack/contentstack-academy-playground` |
| Contentstack documentation | `https://www.contentstack.com/docs/` |
| Custom Integrations | `https://learn.lytics.com/documentation/product/integrations/custom-integrations/overview` |
| Custom Data Sources | `https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/custom-data-sources` |
