Usage Analytics in Analysis Workspace
How to measure your organization's usage of Adobe Analytics

What is Usage Analytics, aka Analytics on Analytics?

Usage Analytics is the data collected about your usage of Adobe Analytics. Think of Adobe Analytics as the website you manage, where “visitors” are the users of Analytics at your organization.

This data can help justify your investment in Adobe Analytics to your leadership team, provide insight into which users leverage data in their roles, and show you the effectiveness of your training sessions & shared projects. Use this data to answer questions such as:

  • How many users are accessing Analytics each month?
  • Are our users logging in more since our training session?
  • Which users are still using Reports & Analytics?
  • Are people using the projects we build?
  • Who deleted my segment??

This page shows you how to start analyzing your organization's Adobe Analytics usage IN Analysis Workspace today, by following six steps: 1) Create, 2) Download, 3) Transform, 4) Upload, 5) Enrich, and 6) Analyze.

A couple notes before getting started:

  • The Adobe Analytics Product team is evaluating adding native usage analytics in the product. As such, the process outlined on this page is meant to be an interim approach to Usage Analytics. You can stay up-to-date on the progress of the native usage analytics project here.
  • The Adobe Analytics team is also researching adding component usage data to the product, e.g. 'Last used date' for segments. The process outlined on this page will contain component management data, e.g. Created, Deleted, Updated, and Shared information. If you would like to measure component usage today, Wayne Verner (a member of the Adobe Analytics community) has documented how to do this for segments here.


Create a report suite meant for usage analysis, to keep the data separate from your digital data. Go to Admin > Report Suites > Create New.


Download the Usage & Access log data, along with your user list.

Usage Data (2 options)

  • Leverage the 2.0 API /auditlogs/usage method. Learn more here and try it out on Swagger here.
  • Or, download data via the browser under Analytics > Admin > Logs > Usage & Access > Download Report.
Sample Usage & Access log download, with corresponding column headers.
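If you take the API route, a request to the 2.0 API's /auditlogs/usage method can be scripted. Below is a minimal sketch using only the Python standard library; the base URL, header names, and query parameters follow the 2.0 API conventions, but the company ID and credentials shown are placeholders you must replace with your own.

```python
import json
import urllib.parse
import urllib.request

# 2.0 API base; your company ID comes from your Adobe Analytics configuration.
API_BASE = "https://analytics.adobe.io/api"

def build_usage_request(company_id, access_token, api_key, start_date, end_date, page=0):
    """Assemble the URL and headers for GET /auditlogs/usage."""
    query = urllib.parse.urlencode({
        "startDate": start_date,   # e.g. "2020-01-01T00:00:00-07"
        "endDate": end_date,
        "limit": 100,
        "page": page,
    })
    url = f"{API_BASE}/{company_id}/auditlogs/usage?{query}"
    headers = {
        "Authorization": f"Bearer {access_token}",
        "x-api-key": api_key,
    }
    return url, headers

def fetch_usage_page(company_id, access_token, api_key, start_date, end_date, page=0):
    """Fetch one page of usage logs and return the parsed JSON body."""
    url, headers = build_usage_request(company_id, access_token, api_key,
                                       start_date, end_date, page)
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Loop over `page` until the response indicates the last page to pull a full date range.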

User List

Download user data via the browser under: Experience Cloud > Admin Console > Users > Export Users list to CSV.


Transform the logs into eVars, props, and events for importing into Analytics. There are several tools you can use to transform the logs, e.g. Processing Rules, Excel, or Python. Below is an example of the format to transform the log data into; this is the exact format used by the Bulk Data Insertion API. We recommend this import method over Full Processing Data Sources, which will soon be deprecated.

Bulk Data Insertion API example: reportSuiteID, timestamp (Time), marketingCloudVisitorID (Login name), pageName (Event Category;Event Detail), userAgent (filler text), eVar1 (Login name), eVar2 (Report suite), eVar3 (Event Category;Event Detail), events list

Dimensions to Create

You can create several dimensions from the logs. Below are the recommended dimensions to start with. In Step 5 'Enrich', there are additional classifications that you can create from the dimensions below.

  • reportSuiteID (Required): This is needed to route your data to the correct report suite.
  • Timestamp (Required): Set with log field 'Time'. The format must be ISO 8601 or Unix Time format. You can test with a small sample to validate your format or use Bulk Data Insertion API’s validation endpoint for test files. Note: Rows within files must be ordered from oldest to newest.
  • marketingCloudVisitorID (Required) and 1 eVar: Set with log field 'Login'. NOTE: if your organization uses email addresses as login IDs, scrub the email addresses at this time before inserting into your input file. Replace with first & last name, or simply remove the "@abc.com" portion of the string.
  • pageName (Required) and 1 eVar: Set with 'Event Type;Event Detail'. If Event Type is numeric, you can find a friendly name lookup table here.
  • userAgent (Required): Provide a filler value. This field is not exported by the Usage API, but is required for Bulk Data Insertion.
  • 1 eVar: Set with log field 'Report Suite'. Note: the report suite is sparsely populated throughout the logs, so you may want to massage this column a bit.
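Pulling the dimensions above together, one row of the downloaded log can be mapped onto the Bulk Data Insertion columns with a small transform function. This is a sketch, not a definitive implementation: the dict keys for the log row assume the column headers shown in the Usage & Access download, the timestamp conversion assumes a 'YYYY-MM-DD HH:MM:SS' format in the 'Time' field, and the filler userAgent value is arbitrary.

```python
def scrub_login(login):
    """Drop the email domain so addresses aren't imported
    (e.g. 'jdoe@abc.com' -> 'jdoe'), per the note above."""
    return login.split("@", 1)[0]

def transform_row(log_row, report_suite_id, delimiter=";"):
    """Map one Usage & Access log row (a dict keyed by the download's
    column headers) onto the Bulk Data Insertion columns."""
    login = scrub_login(log_row["Login"])
    event = f"{log_row['Event Type']}{delimiter}{log_row['Event Detail']}"
    return {
        "reportSuiteID": report_suite_id,
        # Assumes the download's 'Time' looks like '2020-01-15 13:45:22';
        # swapping the space for 'T' yields an ISO 8601 timestamp.
        "timestamp": log_row["Time"].replace(" ", "T"),
        "marketingCloudVisitorID": login,
        "pageName": event,
        "userAgent": "usage-import",   # filler value; the column is required
        "eVar1": login,
        "eVar2": log_row.get("Report Suite", ""),
        "eVar3": event,
    }
```

Run every downloaded row through this function before building your upload file.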

Events to Create

You can create a variety of events from the logs. To do this, you want to look for certain words in the Event Category;Event Detail concatenated field. See the table below for 43 suggested events. Two notes when creating the event column for your file upload:

  • If you want to set multiple events, the format is "event1,event2,event3=10". Note that multiple events must be enclosed in quotes in order to be accepted by the Bulk Data Insertion API.
  • Do not create ‘Unsharing’ events. The logs do not capture ‘unsharing’ actions correctly today, across any component; it is recommended to omit those for now.

Alternatively, you can use processing rules for event creation instead (see next section).

List of events to create from the Event Details log field
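The keyword-search approach above can be sketched as a simple lookup. The keyword strings and event numbers below are placeholders; substitute the full mapping from your events table. The quoting behavior for multiple events follows the note above.

```python
# Hypothetical keyword-to-event mapping -- replace with the full table above.
EVENT_KEYWORDS = {
    "project created": "event1",
    "project viewed": "event2",
    "segment created": "event3",
    "logged in": "event4",
}

def events_for(event_field):
    """Return the Bulk Data Insertion events column for an
    'Event Category;Event Detail' value. Multiple matches are
    comma-joined and wrapped in quotes, as the API requires."""
    text = event_field.lower()
    hits = [ev for kw, ev in EVENT_KEYWORDS.items() if kw in text]
    if not hits:
        return ""
    joined = ",".join(hits)
    return f'"{joined}"' if len(hits) > 1 else joined
```
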

Use processing rules to create events

Instead of adding an event column to your upload file, you can omit that column and use processing rules instead. With this approach, set up one processing rule with many rule criteria in it (one criterion per event you want to create).

It is important that this processing rule be set up before your first file upload. Processing rules run against your Data Sources or Bulk Data Insertion uploads just as they do against regular hits that come in through data collection.

Use a processing rule to define your custom events


Upload the transformed data into Analytics using Bulk Data Insertion API. See documentation for detailed directions and information about required fields.

Note: Imported rows of data are charged as primary server calls. However, the size of usage data should be far less than typical digital traffic.
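The Bulk Data Insertion API accepts gzip-compressed CSV files with a header row, ordered oldest to newest. A minimal sketch of writing one, assuming the rows are the dicts produced in the Transform step (the column order shown is illustrative; match it to the fields you actually set):

```python
import csv
import gzip

# Illustrative column order -- the header row must match your transformed fields.
COLUMNS = ["reportSuiteID", "timestamp", "marketingCloudVisitorID",
           "pageName", "userAgent", "eVar1", "eVar2", "eVar3", "events"]

def write_bdia_file(rows, path):
    """Write transformed rows (a list of dicts) to a gzipped CSV,
    sorted oldest to newest as the API requires."""
    rows = sorted(rows, key=lambda r: r["timestamp"])
    with gzip.open(path, "wt", newline="", encoding="utf-8") as fh:
        writer = csv.DictWriter(fh, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)
```

The resulting file can then be POSTed per the Bulk Data Insertion documentation.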


(Optional) Enrich the imported data with Classifications & Virtual report suites.


Event Category;Event Detail dimension: This dimension can be classified into Event Category, Event Detail, Component Name, and Component ID using Admin > Classification Rules. See the 3 sets of rules to create below.

1. Parse apart Event Category & Event Detail based on the delimiter you set. Here are the rules you can set up:

Parse Event Category;Event Detail

2. Improve the completeness of Event Category by looking for keywords such as Calculated Metric, Project, etc. Here are the rules we applied:

Improve Event Category

3. Define Component ID & Component Name with RegEx. Components include Projects, Segments, Calculated Metrics, Date Ranges, and Virtual Report Suites. Here are the rules you can set up:

Create Component ID & Component Name
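To illustrate the RegEx approach, the snippet below parses a component name and ID out of an Event Detail string. The pattern assumes a hypothetical format like 'Weekly KPIs (id: 5e8f2a)'; your logs' exact format will differ, so inspect real Event Detail values and adjust the expression before building classification rules from it.

```python
import re

# Hypothetical pattern -- adjust to match your actual Event Detail strings.
COMPONENT_RE = re.compile(r"^(?P<name>.+?)\s*\(id:\s*(?P<cid>[^)]+)\)$")

def parse_component(event_detail):
    """Split an Event Detail string into (component name, component ID),
    or (None, None) when the string doesn't reference a component."""
    m = COMPONENT_RE.match(event_detail.strip())
    if not m:
        return None, None
    return m.group("name"), m.group("cid")
```
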

Login Name Classifications: You can group your users into as many buckets as you’d like, then upload a manual classification file under Admin > Classification Importer. Some suggested classifications are 1) Team, 2) Admin vs Non-Admin, 3) Product profile (from Experience Cloud Admin Console), and 4) custom User Types (Executive, Marketer, Analyst, etc).
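Building that classification file can be scripted from the Admin Console user export. The sketch below assumes an 'Email' column in the export and a hand-maintained login-to-team mapping (both assumptions; check your export's actual headers), and emits tab-delimited rows keyed by login name so they match the eVar values set in the Transform step.

```python
import csv
import io

def build_classification_rows(user_csv_text, team_map):
    """Join the Admin Console user export (assumed 'Email' column) with a
    login -> (team, admin flag) mapping, returning tab-delimited rows
    keyed by login name for the Classification Importer."""
    out = io.StringIO()
    writer = csv.writer(out, delimiter="\t")
    writer.writerow(["Key", "Team", "Admin vs Non-Admin"])
    for row in csv.DictReader(io.StringIO(user_csv_text)):
        login = row["Email"].split("@", 1)[0]   # scrub domain, matching Step 3
        team, is_admin = team_map.get(login, ("Unknown", "Non-Admin"))
        writer.writerow([login, team, is_admin])
    return out.getvalue()
```
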

Virtual Report Suites (VRS)

You can set up a VRS to define a flexible session length and rename dimensions/metrics with friendlier names. Some suggested renames are:

  • “Unique visitors” to “Login users”
  • “Visits” to “Sessions”
  • “Occurrences” to “All actions”


Use Analysis Workspace to analyze the usage data. The 3 fundamental traffic measures in Analytics represent the following:

  • Unique visitors = counts your login names, since that was set into VisitorID
  • Visits = sessions delimited by 30 minutes of inactivity, based on your timestamp column. The session timeout can be redefined as needed in a Virtual Report Suite.
  • Occurrences = rows in the file you uploaded. This represents the number of actions taken by your users.

Overall Usage

View overall usage trends to see how often your users are logging in, how many sessions each user has, and how that compares to historical usage.

Overall usage analysis

Usage by Team, User & Role

Look at usage by teams or roles (Admin/User). If you recently trained a team, look at their behavior before & after training to see if they have started using the tool more.

Team/user usage analysis

Common Journeys Through Analytics

View common journeys users are taking through Analytics. See which users are logging in and going straight to Reports & Analytics, to find opportunities to suggest Analysis Workspace training.

Journey usage analysis

Workspace Project Usage

Dive into details about Analysis Workspace, from project creation & shares, to project views. Break down project names by login name to see which users are accessing your projects. Use project views to understand if shared dashboards are actually being opened & viewed by the intended audience.

Workspace project usage analysis

Component Management

View management metrics for all your components - Projects, Segments, Calculated metrics, VRS, Date ranges, and Alerts - to better understand which users are creating, updating, deleting and sharing within your organization.

Note: Currently, component usage is not available through the logs, e.g. last used date or where used. If you would like to measure component usage today, Wayne Verner (a member of the Adobe Analytics community) has documented how to do this for segments here.

Advanced Feature Usage

Although the log tracking is more sparse, you can measure certain features used within Workspace based on the log events that are registered. Features include Contribution Analysis, Segment Comparison, Histograms, Right-click > Attribution IQ comparison added, and Right-click > Date Comparisons.

Advanced feature usage analysis

Like this idea?

If you want to see Usage Analytics built more directly into Adobe Analytics, upvote the idea on Adobe Forums to show your support.

Or, ready for more content?

Visit adobe.ly/aaresources for a full list of Adobe Analytics Spark pages & other helpful resources.

Created By
Jen Lasser