
Glue

Certified

Important Capabilities

  • Detect Deleted Entities: Enabled by default when stateful ingestion is turned on.
  • Domains: Supported via the domain config field
  • Platform Instance: Enabled by default
  • Table-Level Lineage: Enabled by default

Note: if you also have files in S3 that you'd like to ingest, we recommend you use Glue's built-in data catalog. See here for a quick guide on how to set up a crawler on Glue and ingest the outputs with DataHub.

This plugin extracts the following:

  • Tables in the Glue catalog
  • Column types associated with each table
  • Table metadata, such as owner, description and parameters
  • Jobs and their component transformations, data sources, and data sinks

IAM permissions

For ingesting datasets, the following IAM permissions are required:

{
    "Effect": "Allow",
    "Action": [
        "glue:GetDatabases",
        "glue:GetTables"
    ],
    "Resource": [
        "arn:aws:glue:$region-id:$account-id:catalog",
        "arn:aws:glue:$region-id:$account-id:database/*",
        "arn:aws:glue:$region-id:$account-id:table/*"
    ]
}

For ingesting jobs (extract_transforms: True), the following additional permissions are required:

{
    "Effect": "Allow",
    "Action": [
        "glue:GetDataflowGraph",
        "glue:GetJobs"
    ],
    "Resource": "*"
}

plus s3:GetObject for the job script locations.
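
For example, a statement along these lines grants that access. This is a sketch that assumes the job scripts live under s3://my-glue-script-bucket/glue-scripts/; substitute the script locations configured for your own jobs:

{
    "Effect": "Allow",
    "Action": [
        "s3:GetObject"
    ],
    "Resource": "arn:aws:s3:::my-glue-script-bucket/glue-scripts/*"
}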

CLI based Ingestion

Install the Plugin

pip install 'acryl-datahub[glue]'

Starter Recipe

Check out the following recipe to get started with ingestion! See below for full configuration options.

For general pointers on writing and running a recipe, see our main recipe guide.

source:
  type: glue
  config:
    # Coordinates
    aws_region: "my-aws-region"

sink:
  # sink configs

Config Details

Note that a . is used to denote nested fields in the YAML recipe.

Field / Description
aws_region (required)
string
AWS region code.
aws_access_key_id
string
AWS access key ID. Can be auto-detected, see the AWS boto3 docs for details.
aws_advanced_config
object
Advanced AWS configuration options. These are passed directly to botocore.config.Config.
aws_endpoint_url
string
The AWS service endpoint. This is normally constructed automatically, but can be overridden here.
aws_profile
string
Named AWS profile to use. Only used if access key / secret are unset. If not set, the default profile will be used.
aws_proxy
map(str,string)
aws_secret_access_key
string
AWS secret access key. Can be auto-detected, see the AWS boto3 docs for details.
aws_session_token
string
AWS session token. Can be auto-detected, see the AWS boto3 docs for details.
catalog_id
string
The AWS account ID where the target Glue catalog lives. If None, DataHub will ingest Glue from the AWS caller's account.
emit_s3_lineage
boolean
Whether to emit S3-to-Glue lineage.
Default: False
extract_owners
boolean
When enabled, extracts ownership from Glue directly and overwrites existing owners. When disabled, ownership is left empty for datasets.
Default: True
extract_transforms
boolean
Whether to extract Glue transform jobs.
Default: True
glue_s3_lineage_direction
string
If upstream, S3 is upstream to Glue. If downstream, S3 is downstream to Glue.
Default: upstream
ignore_resource_links
boolean
If set to True, ignore database resource links.
Default: False
ignore_unsupported_connectors
boolean
Whether to ignore unsupported connectors. If disabled, an error will be raised.
Default: True
platform
string
The platform to use for the dataset URNs. Must be one of ['glue', 'athena'].
Default: glue
platform_instance
string
The instance of the platform that all assets produced by this recipe belong to
read_timeout
number
The timeout for reading from the connection (in seconds).
Default: 60
use_s3_bucket_tags
boolean
Whether S3 bucket tags should be created for the tables ingested by Glue. Note that this will not apply tags to any folders ingested, only the files.
Default: False
use_s3_object_tags
boolean
Whether S3 object tags should be created for the tables ingested by Glue.
Default: False
env
string
The environment that all assets produced by this connector belong to
Default: PROD
aws_role
One of: string, or array of union(string, AwsAssumeRoleConfig)
AWS roles to assume. If using the string format, the role ARN can be specified directly. If using the object format, the role can be specified in the RoleArn field and additional available arguments are documented at https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html?highlight=assume_role#STS.Client.assume_role
aws_role.RoleArn (required)
string
ARN of the role to assume.
aws_role.ExternalId
string
External ID to use when assuming the role.
database_pattern
AllowDenyPattern
Regex patterns for databases to filter in ingestion.
Default: {'allow': ['.*'], 'deny': [], 'ignoreCase': True}
database_pattern.allow
array(string)
database_pattern.deny
array(string)
database_pattern.ignoreCase
boolean
Whether to ignore case sensitivity during pattern matching.
Default: True
domain
map(str,AllowDenyPattern)
Mapping of domain to allow/deny regex patterns for the assets that should be assigned to that domain.
domain.key.allow
array(string)
domain.key.deny
array(string)
domain.key.ignoreCase
boolean
Whether to ignore case sensitivity during pattern matching.
Default: True
table_pattern
AllowDenyPattern
Regex patterns for tables to filter in ingestion.
Default: {'allow': ['.*'], 'deny': [], 'ignoreCase': True}
table_pattern.allow
array(string)
table_pattern.deny
array(string)
table_pattern.ignoreCase
boolean
Whether to ignore case sensitivity during pattern matching.
Default: True
profiling
GlueProfilingConfig
Configs to ingest data profiles from Glue tables.
profiling.column_count
string
The parameter name for column count in the Glue table.
profiling.max
string
The parameter name for the max value of a column.
profiling.mean
string
The parameter name for the mean value of a column.
profiling.median
string
The parameter name for the median value of a column.
profiling.min
string
The parameter name for the min value of a column.
profiling.null_count
string
The parameter name for the count of null values in a column.
profiling.null_proportion
string
The parameter name for the proportion of null values in a column.
profiling.row_count
string
The parameter name for row count in the Glue table.
profiling.stdev
string
The parameter name for the standard deviation of a column.
profiling.unique_count
string
The parameter name for the count of unique values in a column.
profiling.unique_proportion
string
The parameter name for the proportion of unique values in a column.
profiling.operation_config
OperationConfig
Experimental feature for specifying operation configs.
profiling.operation_config.lower_freq_profile_enabled
boolean
Whether to do profiling at a lower frequency. This does not do any scheduling; it just adds additional checks for when not to run profiling.
Default: False
profiling.operation_config.profile_date_of_month
integer
Number between 1 and 31 for the day of the month (both inclusive). If not specified, this field does not take effect.
profiling.operation_config.profile_day_of_week
integer
Number between 0 and 6 for the day of the week (both inclusive). 0 is Monday and 6 is Sunday. If not specified, this field does not take effect.
profiling.partition_patterns
AllowDenyPattern
Regex patterns for filtering partitions to profile. The pattern should be a string like: "{'key':'value'}".
Default: {'allow': ['.*'], 'deny': [], 'ignoreCase': True}
profiling.partition_patterns.allow
array(string)
profiling.partition_patterns.deny
array(string)
profiling.partition_patterns.ignoreCase
boolean
Whether to ignore case sensitivity during pattern matching.
Default: True
stateful_ingestion
StatefulStaleMetadataRemovalConfig
Base specialized config for Stateful Ingestion with stale metadata removal capability.
stateful_ingestion.enabled
boolean
Whether or not to enable stateful ingestion.
Default: False
stateful_ingestion.remove_stale_metadata
boolean
Soft-deletes the entities present in the last successful run but missing in the current run with stateful_ingestion enabled.
Default: True
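
To illustrate how several of these options combine, here is a sketch of a fuller recipe. The pipeline name, region, role ARN, and regex patterns are placeholder values chosen for illustration, not defaults:

pipeline_name: glue-ingestion-example  # required for stateful ingestion
source:
  type: glue
  config:
    aws_region: "us-east-1"  # placeholder region
    aws_role: "arn:aws:iam::123456789012:role/datahub-glue-read"  # placeholder role ARN (string form)
    extract_transforms: true
    emit_s3_lineage: true
    glue_s3_lineage_direction: "upstream"
    database_pattern:
      allow:
        - "analytics_.*"  # placeholder: only ingest databases starting with analytics_
    table_pattern:
      deny:
        - ".*_tmp"  # placeholder: skip temporary tables
    stateful_ingestion:
      enabled: true
      remove_stale_metadata: true

sink:
  # sink configs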

Concept Mapping

  • "glue" → Data Platform
  • Glue Database → Container (subtype: Database)
  • Glue Table → Dataset (subtype: Table)
  • Glue Job → Data Flow
  • Glue Job Transform → Data Job
  • Glue Job Data source → Dataset
  • Glue Job Data sink → Dataset

Compatibility

To capture lineage across Glue jobs and databases, a requirement must be met; otherwise, the AWS API is unable to report any lineage. The job must be created in Glue Studio with the "Generate classic script" option turned on (this option can be accessed in the "Script" tab). Any custom scripts that do not have the proper annotations will not have lineage reported.

Code Coordinates

  • Class Name: datahub.ingestion.source.aws.glue.GlueSource
  • Browse on GitHub

Questions

If you've got any questions on configuring ingestion for Glue, feel free to ping us on our Slack.