# Lytics JavaScript Tag

### About this export

| Field | Value |
| --- | --- |
| **content_type** | lesson |
| **platform** | contentstack-academy |
| **source_url** | https://www.contentstack.com/academy/courses/beyond-the-basics/lytics-javascript-tag |
| **course_slug** | beyond-the-basics |
| **lesson_slug** | lytics-javascript-tag |
| **markdown_file_url** | /academy/md/courses/beyond-the-basics/lytics-javascript-tag.md |
| **generated_at** | 2026-04-28T06:55:35.246Z |

> Part of **[Beyond the Basics](https://www.contentstack.com/academy/courses/beyond-the-basics)** on Contentstack Academy. **Academy MD v3** — structured for retrieval; no quiz or assessment keys.

<!-- ai_metadata: {"lesson_id":"02","type":"video","duration_seconds":363,"video_url":"https://cdn.jwplayer.com/previews/d4p068ys","thumbnail_url":"https://cdn.jwplayer.com/v2/media/d4p068ys/poster.jpg?width=720","topics":["Lytics","JavaScript","Tag"]} -->

#### Video details

#### At a glance

- **Title:** Lytics JavaScript Tag
- **Duration:** 6m 3s
- **Media link:** https://cdn.jwplayer.com/previews/d4p068ys
- **Publish date (unix):** 1751881581

#### Streaming renditions

- application/vnd.apple.mpegurl
- video/mp4 · 180p · 248p · 150480 kbps

#### Timed text tracks (delivery)

- **thumbnails:** `https://cdn.jwplayer.com/strips/d4p068ys-120.vtt`

#### Transcript

In this video we're going to cover the JS tag and how to audit the data that's coming in from the JS tag in Lytics. The first place that we're going to start is talking about how data comes in from the web and how to install that JS tag on your site. There are a couple of different ways to install the JS tag so that data passes into our data streams within Lytics. The first way to install the tag is through a tag manager such as Google Tag Manager or Tealium. There's usually going to be a Lytics tag in the store itself, where you can just select the tag, put your account ID in, and install the tag. However, if you have a more custom installation that requires a little more finesse to get that information in, you also have the option to manually add the script to the web page and configure additional features, such as if you have a single-page app or if you want to pass additional information in when the tag loads. Additionally, if you have any extra information that you want to pass to Lytics outside of what passes in the default data layer or the default tag itself, you have the option of using the jstag.send functionality to get that information into Lytics. So downstream there's the default information that can pass into the data streams, into what we call the default stream, which is just a stream of data coming from the web. You also have the option, as I mentioned before, to use the jstag.send method to get additional information into Lytics. Both of those can be really useful when bringing data into Lytics. The next place that we're going to talk about is the data streams. The reason we start there is that's the first place the data hits when it comes into Lytics. The data stream is going to show the raw values that we see coming in from the web.
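The jstag.send flow described above can be sketched like this. The stream name (`purchases`) and the event fields are illustrative, not a required schema, and `jstag` is assumed to be the globally loaded Lytics V3 tag:

```javascript
// Build a custom event payload for Lytics. The field names here
// (purchase_value, plan_tier) are illustrative, not a required schema.
function buildPurchaseEvent(userEmail, value, tier) {
  return {
    email: userEmail,      // identity field used for profile stitching
    purchase_value: value,
    plan_tier: tier,
  };
}

// In the browser, hand the payload to the loaded tag. Sending to a named
// stream keeps custom events separate from the "default" web stream.
function sendToLytics(payload, stream) {
  if (typeof window !== "undefined" && window.jstag) {
    window.jstag.send(stream, payload);
    return true;
  }
  return false; // tag not loaded yet, or running outside a browser
}
```

Keeping custom sends in their own stream makes them easier to audit later, since each stream is inspected separately in the Data Streams tab.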
This doesn't necessarily mean these are only the values that we want to bring in and map to the profiles, but any values that are available from the JS tag. That could include your default data layer, a more custom data layer that's been set up in your tag, or any custom sends and JSON payloads that are being sent using the jstag.send method. In the Data Streams tab in Lytics, you're able to see the raw fields, the frequency with which they come in, and the cardinality of each field. This gives us a ton of information on whether that field is coming in, how often it's coming in, and whether there's any additional information associated with it. You can also see sample information in the raw data stream, a sample set of what we can expect to see in that value. This is super helpful for data auditing because it allows us to see if there's any sort of garbage data coming in: information that's not valuable, nulls, or special characters that we don't want to see in that field. This is a window into the raw information. The next place the data flows is the data schema. The data schema is important because it contains only fields that we've explicitly mapped in the LQL itself. LQL, as a reminder, stands for Lytics Query Language. In the Lytics query file, we pull in information that we want to map to the profile, and only information we want to map to the profile. What that means is we may be receiving raw information in the data stream that never gets mapped and never gets surfaced in the UI. That is acceptable and can happen. However, that information will not show up in the data schema. The data schema can be super useful because it tells you a couple of different things.
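As a concrete, illustrative example of the auditing described above, here is roughly what a raw event in the default stream might look like, plus a small helper that flags the kind of garbage values (nulls, empty strings, non-printable characters) a stream audit surfaces. The field names and values are assumptions, not the exact Lytics collection schema:

```javascript
// A raw web event roughly like one the default stream might receive.
// Field names and values are illustrative.
const rawEvent = {
  url: "https://www.example.com/pricing",
  _uid: "9c2a4f77b2e14d0c", // anonymous visitor ID (made up here)
  email: null,              // the kind of "garbage" value an audit surfaces
  title: "Pricing",
};

// Flag fields whose values are null, empty, or contain characters
// outside the printable ASCII range.
function flagSuspectFields(event) {
  return Object.entries(event)
    .filter(([, v]) => v === null || v === "" || /[^\x20-\x7E]/.test(String(v)))
    .map(([k]) => k);
}
```

Running `flagSuspectFields(rawEvent)` flags only `email`, mirroring the manual check of sample values in the Data Streams tab.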
It tells you what fields we've mapped in the LQL, the data types for those fields, how many users have those fields, whether those fields are used in any audiences, and whether there's any overlap between the different sources for those fields. So for instance, if we have something like an email, that's going to be seen throughout the different sources, and it will show up in the data schema as something that's frequently used across different sources. This will also give you a health score, a percentage of how much of your data is actually shared across multiple sources. The final destination of your data, once it's come through the data schema, is the User Fields tab in Lytics. The User Fields tab is really important because it shows only fields that have actually shown up on user profiles themselves. If a field is in the data schema but not in the user fields, that means we've mapped it in the LQL but have yet to see it on any user profiles. The user fields show you the type of data showing up in each field and how much of that data is showing up. They also show you visually what sort of data shows up more frequently in those fields, using different types of graphs. Depending on the field type, whether it's a Boolean, a String, or an Int, the visuals are going to be a little different, so that's something to note. From the User Fields tab, you can also create audiences. So if you find a field there that you're really interested in, and you want to hash out what it is, how many users have it, and how often it's seen, you can go directly to the audience builder with that field and find out that information. Something really important to remember about both the data schema and the user fields is that they can be broken down per source.
So you can see, per stream, what information is available and what data has been mapped or has shown up on the user fields themselves.
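The mapping step described above can be sketched in LQL. The stream, the field names, and the `email()` cast are illustrative; check the exact syntax against the LQL documentation before using it:

```sql
-- Map two raw fields from the "default" web stream onto the user profile,
-- identifying profiles by email. Only fields SELECTed here reach the schema.
SELECT
    email(email)  AS email     SHORTDESC "Email Address"
  , url           AS last_url  SHORTDESC "Last Page Visited"
FROM default
INTO user BY email
```

Fields arriving in the stream but not selected in a query like this stay raw and never surface in the data schema, exactly as noted above.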

#### Subtitles (WebVTT)

```webvtt
WEBVTT

1
00:00:00.000 --> 00:00:05.000
In this video we're going to cover the JS tag and how to audit the data that's

2
00:00:05.000 --> 00:00:08.920
coming in from the JS tag in Lytics. So the first place that we're going to

3
00:00:08.920 --> 00:00:15.080
start is talking about how data comes in from the web and how to install that JS

4
00:00:15.080 --> 00:00:20.000
tag on your site. So there are going to be a couple different ways to install the JS

5
00:00:20.000 --> 00:00:25.640
tag so that that data passes into our data streams within Lytics. The first way

6
00:00:25.640 --> 00:00:30.240
to install the tag is through a tag manager such as Google Tag Manager or

7
00:00:30.240 --> 00:00:35.240
Tealium. There's usually going to be the Lytics tag in the store itself where you

8
00:00:35.240 --> 00:00:39.320
can just select the tag, put your account ID in there, and then install the tag.

9
00:00:39.320 --> 00:00:44.000
However, if you have a more custom installation that requires a little bit

10
00:00:44.000 --> 00:00:49.680
more finesse to get that information in, you also have the option to manually

11
00:00:49.680 --> 00:00:54.880
upload the script onto the web page and add those additional features such as if

12
00:00:54.880 --> 00:00:59.680
you have a single page app or if you want to pass additional information in

13
00:00:59.680 --> 00:01:06.880
for when that tag loads. Additionally, if you have any extra information that you

14
00:01:06.880 --> 00:01:11.240
want to pass to Lytics outside of what passes in the default data layer or in

15
00:01:11.240 --> 00:01:17.680
the default tag itself, you have options using the jstag.send functionality

16
00:01:17.680 --> 00:01:23.200
to get that information to pass into Lytics. So downstream there's the default

17
00:01:23.200 --> 00:01:27.640
information that can pass into the data streams into what we call the default

18
00:01:27.640 --> 00:01:32.280
stream, which is just a stream of data coming from the web. You also have the

19
00:01:32.280 --> 00:01:38.040
option, as I mentioned before, to use the jstag.send method to get additional

20
00:01:38.040 --> 00:01:43.480
information into Lytics. So both of those can be really, really useful when

21
00:01:43.480 --> 00:01:48.400
bringing data into Lytics. So the next place that we're going to look or talk

22
00:01:48.400 --> 00:01:52.920
about is the data streams. The reason that we start there is that's the first

23
00:01:52.920 --> 00:01:57.000
place the data is going to hit when it comes into Lytics. So this data stream is

24
00:01:57.000 --> 00:02:01.600
going to be the raw values that we see coming in from the web. This doesn't

25
00:02:01.600 --> 00:02:05.520
necessarily mean that it's going to be only the values that we want to

26
00:02:05.520 --> 00:02:10.080
bring in and map to the profiles, but any values that are available from the JS

27
00:02:10.080 --> 00:02:14.720
tag. That could include your default data layer, a more custom data layer that's

28
00:02:14.720 --> 00:02:19.760
been set up in your tag, or any custom sends and JSON payloads that are being

29
00:02:19.800 --> 00:02:25.880
sent using the jstag.send method. In the data streams tab in Lytics, you're

30
00:02:25.880 --> 00:02:31.840
able to see the raw fields, the frequency of which they come in, and any sort of

31
00:02:31.840 --> 00:02:38.000
cardinality with that field. This gives us a ton of information on if that

32
00:02:38.000 --> 00:02:42.040
field's coming in, how often that field is coming in, and then if there's any

33
00:02:42.040 --> 00:02:46.360
additional information associated to that field. You can also see sample

34
00:02:46.360 --> 00:02:51.760
information in the raw data stream of a sample set of what we can expect to see

35
00:02:51.760 --> 00:02:56.600
in that value. This is going to be super helpful for data auditing because it

36
00:02:56.600 --> 00:03:00.480
will allow us to see if there's any sort of garbage data coming in, information

37
00:03:00.480 --> 00:03:05.080
that's not super valuable, nulls, special characters that we don't want to see in

38
00:03:05.080 --> 00:03:10.800
that field. This is sort of a window into that raw information. The next place that

39
00:03:10.800 --> 00:03:15.800
the data is going to flow is the data schema. So the data schema is important

40
00:03:15.800 --> 00:03:19.800
because it's going to be only fields that we've explicitly mapped in the LQL

41
00:03:19.800 --> 00:03:26.600
itself. LQL, as a reminder, stands for Lytics Query Language. In the Lytics

42
00:03:26.600 --> 00:03:31.800
Query file, we are going to pull in information that we want to map to the

43
00:03:31.800 --> 00:03:37.360
profile and only information we want to map to the profile. What that means is we

44
00:03:37.360 --> 00:03:41.720
may be receiving raw information in the data stream that never gets mapped and

45
00:03:41.760 --> 00:03:47.320
never gets surfaced in the UI. That is acceptable and can happen. However, that

46
00:03:47.320 --> 00:03:52.400
information will not show up in the data schema. The data schema can be super

47
00:03:52.400 --> 00:03:56.640
useful because it's going to talk to you about a couple different things. It's

48
00:03:56.640 --> 00:04:00.120
going to tell you what fields we've mapped in the LQL, the data types for

49
00:04:00.120 --> 00:04:05.560
those fields, how many users have those fields, if those fields are used in any

50
00:04:05.560 --> 00:04:10.280
audiences, and then also if there's any overlap between the different sources and

51
00:04:10.280 --> 00:04:14.600
those fields. So for instance, if we have something like an email, that's

52
00:04:14.600 --> 00:04:18.440
going to be seen throughout the different sources and that will show up

53
00:04:18.440 --> 00:04:22.960
in the data schema as something that's frequently used across different

54
00:04:22.960 --> 00:04:27.920
sources. This will also give you a health score or a percentage of how much of

55
00:04:27.920 --> 00:04:33.120
your data is actually shared across multiple sources. The final destination

56
00:04:33.120 --> 00:04:36.960
of your data, once it's come into the data schema, is going to be the user

57
00:04:36.960 --> 00:04:42.040
fields tab in Lytics. The reason that the user fields tab is really important is

58
00:04:42.040 --> 00:04:45.720
it's going to show you only fields that have showed up in user profiles

59
00:04:45.720 --> 00:04:51.480
themselves. If it's in the data schema but not in the user fields, that means

60
00:04:51.480 --> 00:04:56.600
that we've mapped the fields in the LQL but we have yet to see them on any user

61
00:04:56.600 --> 00:05:01.160
profiles. The user fields are going to show you the type of data that's showing

62
00:05:01.160 --> 00:05:05.840
up in that field and then also how much of that data is showing up in that

63
00:05:05.840 --> 00:05:10.560
field. It will also show you through visuals what sort of data shows up more

64
00:05:10.560 --> 00:05:15.200
frequently in those fields using different types of graphs. Depending on

65
00:05:15.200 --> 00:05:19.160
the field, whether it be a Boolean, a String, or an Int, the type of visuals

66
00:05:19.160 --> 00:05:23.520
are going to be a little bit different. So that's something to note. From the

67
00:05:23.520 --> 00:05:27.840
user fields tab, you can also create audiences. So if you find that there's a

68
00:05:27.840 --> 00:05:32.200
field there that you're really really interested in sort of hashing out what

69
00:05:32.200 --> 00:05:37.120
it is and how many users have it, how often it's seen, then you can directly go

70
00:05:37.120 --> 00:05:41.680
to the audience builder with that field and find out that information. Something

71
00:05:41.680 --> 00:05:46.240
really important to remember about both the data schema and the user

72
00:05:46.240 --> 00:05:51.200
fields is that they're going to be able to break down per source. So you can see per

73
00:05:51.200 --> 00:05:57.160
stream what information is available and what data has been mapped or has

74
00:05:57.160 --> 00:06:01.480
also showed up on the user fields themselves.

```


#### Lesson text

Learn about the Lytics JavaScript tag and how it passes information from your website into Lytics user profiles.

## Video Tutorial

### What will I learn?  

*   What default data and events are collected?
*   How raw event inputs are sent through the [Data Streams](https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/data-streams).
*   How LQL mapping creates the [Data Schema](https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/schema-audit).
*   How [User Fields](https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/user-fields) are stored in the user profile.

Watch the "Lytics JavaScript Tag" video (6 mins) for an introduction to the Lytics JavaScript tag and how it passes information from your website into Lytics user profiles.

### LQL Mapping

The video briefly mentions LQL mapping functions. See the Lytics Query Language course for more details.

## Installing the Lytics JS Tag

To find the Lytics JavaScript tag in the Lytics UI, click your Account Name at the bottom of the left-hand navigation menu, then click [JavaScript Snippet](https://app.lytics.com/connect?view=v3) in the expanded menu.

Make sure to use Version 3 of the Lytics tag. See the documentation for more information: 

*   [Installation & Configuration](https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/using-version-3/installation-configuration)
*   [Version 3 documentation](https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/version-3-improvements)

You'll probably find it easier to install the Lytics JS Tag using a Tag Manager, like Google Tag Manager. See details at: [Working with Tag Managers](https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/working-with-tag-managers)
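For a manual install, the loader URL follows a predictable shape. The sketch below is an assumption about that shape, not the verbatim snippet; copy the real one from the JavaScript Snippet screen, since the official loader does more than set a URL:

```javascript
// Build the script URL the Lytics V3 loader points at. The accountId
// value is a placeholder for your own Lytics account ID.
function lyticsTagSrc(accountId) {
  return "https://c.lytics.io/api/tag/" + accountId + "/latest.min.js";
}

// A manual install then boils down to (shape only; use the real snippet
// from the Lytics UI):
//
//   jstag.init({ src: lyticsTagSrc("YOUR_ACCOUNT_ID") });
//   jstag.pageView();
//
// Single-page apps trigger only one full page load, so re-fire
// jstag.pageView() on every client-side route change.
```
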

## Knowledge Check

**Where can raw data fields be seen in the Lytics UI?**

Answer: Data Streams

**Map the section of the data flow with the data points involved**

Data Stream: Includes raw values coming in from the web

Data Schema: Includes mapped fields, even if no data

User Fields: Includes mapped data on user profiles

**Which data stream contains web data collected by the Lytics JavaScript Tag?**

A. Experiences stream

B. Default stream

C. Custom stream

Answer: B

**Where can you find the percentage of user data being used in audiences?**

A. Data Streams

B. User Fields

C. Schema Audit

Answer: C

**New fields added via the jstag.send method will automatically be added to a user's profile.**

A. True

B. False

Answer: B (False). Raw event data must first be mapped using Lytics Query Language (LQL) into the Data Schema before it appears on user profiles.

## More Resources

### Documentation

Here are some recommended resources to continue learning about the Lytics JS Tag.

*   [Lytics JavaScript Tag introduction](https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/introduction)
*   [Onboarding Web Data](https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/onboarding-web-data)
*   [Collecting Data with V3 Tag](https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/using-version-3/collecting-data)
*   [Troubleshooting: Verifying Data is sent to Lytics](https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/troubleshooting#verify-data-is-being-sent-to-lytics)
*   [Installing Lytics Image Pixel](https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/onboarding-web-data#lytics-image-pixel)

#### Key takeaways

- Connect **Lytics JavaScript Tag** back to your stack configuration before moving to the next module.
- Capture one concrete artifact (screenshot, Postman call, or code snippet) that proves the step works in your environment.
- Re-read the delivery versus management boundary for anything you changed in the entry model.

## Supplement for indexing

### Content summary

Lytics JavaScript Tag. Learn about the Lytics JavaScript tag and how it passes information from your website into Lytics user profiles. Video Tutorial What will I learn? What default data and events are collected? How raw event inputs are sent through the Data Streams (https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/data-streams). How LQL mapping creates the Data Schema (https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/schema-audit). How User Fields (https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/user-fields) are stored in the user profile. Watch the "Lytics JavaScript Tag" video (6 mins) for an int

### Retrieval tags

- Lytics
- JavaScript
- Tag
- beyond-the-basics
- lesson 02
- Lytics JavaScript Tag
- beyond-the-basics lesson

### Indexing notes

Index this lesson as a primary chunk tagged with lesson_id "02" and topics: [Lytics, JavaScript, Tag].
Parent course slug: beyond-the-basics. Use asset_references URLs as thumbnail hints in search results when present.
Never surface LMS quiz content or assessment answers from this file.

### Asset references

| Label | URL |
| --- | --- |
| Video thumbnail: Lytics JavaScript Tag | `https://cdn.jwplayer.com/v2/media/d4p068ys/poster.jpg?width=720` |

### External links

| Label | URL |
| --- | --- |
| Contentstack Academy home | `https://www.contentstack.com/academy/` |
| Training instance setup | `https://www.contentstack.com/academy/training-instance` |
| Academy playground (GitHub) | `https://github.com/contentstack/contentstack-academy-playground` |
| Contentstack documentation | `https://www.contentstack.com/docs/` |
| Data Streams | `https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/data-streams` |
| Data Schema | `https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/schema-audit` |
| User Fields | `https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/user-fields` |
| JavaScript Snippet | `https://app.lytics.com/connect?view=v3` |
| Installation & Configuration | `https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/using-version-3/installation-configuration` |
| Version 3 documentation | `https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/version-3-improvements` |
| Working with Tag Managers | `https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/working-with-tag-managers` |
| Lytics JavaScript Tag introduction | `https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/introduction` |
| Onboarding Web Data | `https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/onboarding-web-data` |
| Collecting Data with V3 Tag | `https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/using-version-3/collecting-data` |
| Troubleshooting: Verifying Data is sent to Lytics | `https://learn.lytics.com/documentation/product/features/lytics-javascript-tag/troubleshooting#verify-data-is-being-sent-to-lytics` |
| Installing Lytics Image Pixel | `https://learn.lytics.com/documentation/product/features/data-onboarding-and-management/onboarding-web-data#lytics-image-pixel` |
