23/09/22 03:12:03 INFO DriverDaemon$: Started Log4j2
23/09/22 03:12:06 INFO DatabricksMain$$anon$1: Configured feature flag data source LaunchDarkly
23/09/22 03:12:06 INFO DatabricksMain$$anon$1: Load feature flag from LaunchDarkly
23/09/22 03:12:06 WARN DatabricksMain$$anon$1: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/09/22 03:12:06 INFO DriverDaemon$: Current JVM Version 1.8.0_362
23/09/22 03:12:06 INFO DriverDaemon$: ========== driver starting up ==========
23/09/22 03:12:06 INFO DriverDaemon$: Java: Azul Systems, Inc. 1.8.0_362
23/09/22 03:12:06 INFO DriverDaemon$: OS: Linux/amd64 5.15.0-1042-azure
23/09/22 03:12:06 INFO DriverDaemon$: CWD: /databricks/driver
23/09/22 03:12:06 INFO DriverDaemon$: Mem: Max: 6.3G loaded GCs: PS Scavenge, PS MarkSweep
23/09/22 03:12:06 INFO DriverDaemon$: Logging multibyte characters: ✓
23/09/22 03:12:06 INFO DriverDaemon$: 'publicFile.rolling.rewrite' appender in root logger: class org.apache.logging.log4j.core.appender.rewrite.RewriteAppender
23/09/22 03:12:06 INFO DriverDaemon$: == Modules:
23/09/22 03:12:08 INFO DriverDaemon$: Starting prometheus metrics log export timer
23/09/22 03:12:08 INFO DriverConf: Configured feature flag data source LaunchDarkly
23/09/22 03:12:08 INFO DriverConf: Load feature flag from LaunchDarkly
23/09/22 03:12:08 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/09/22 03:12:08 INFO DriverDaemon$: Loaded JDBC drivers in 190 ms
23/09/22 03:12:08 INFO DriverDaemon$: Universe Git Hash: 1e4a70bdafc31fab94e8e4a9c01a52855f6e151d
23/09/22 03:12:08 INFO DriverDaemon$: Spark Git Hash: 6deb9aa8cd233e381216b0ac25d7cfb153f8af95
23/09/22 03:12:08 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
23/09/22 03:12:08 WARN RunHelpers$: Missing tag isolation client: java.util.NoSuchElementException: key not found: TagDefinition(clientType,The client type for a request, used for isolating resources for the request.,false,false,List(),DATA_LABEL_UNSPECIFIED)
23/09/22 03:12:08 INFO DatabricksILoop$: Creating throwaway interpreter
23/09/22 03:12:08 INFO MetastoreMonitor$: Internal metastore configured
23/09/22 03:12:08 INFO DataSourceFactory$: DataSource Jdbc URL: jdbc:mariadb://consolidated-westus2-prod-metastore-addl-2.mysql.database.azure.com:3306/organization4679476628690204?useSSL=true&sslMode=VERIFY_CA&disableSslHostnameVerification=true&trustServerCertificate=false&serverSslCert=/databricks/common/mysql-ssl-ca-cert.crt
23/09/22 03:12:08 INFO ConcurrentRateLimiterConfParser$: No additional configuration supplied to the concurrent rate-limiter. Defaults would be used.
23/09/22 03:12:08 INFO ConcurrentRateLimiterConfParser$: Service com.databricks.backend.daemon.driver.DriverCorral concurrent rate-limiter ConcurrentRateLimitConfig - Dry-Run: false | Dimension: WORKSPACE | API: DEFAULT | High: 100 | Low: 50
23/09/22 03:12:08 INFO ConcurrentRateLimiterConfParser$: Service com.databricks.backend.daemon.driver.DriverCorral concurrent rate-limiter ConcurrentRateLimitConfig - Dry-Run: false | Dimension: ACCOUNT_ID | API: DEFAULT | High: 100 | Low: 50
23/09/22 03:12:08 INFO DriverCorral: Creating the driver context
23/09/22 03:12:08 INFO DatabricksILoop$: Class Server Dir: /local_disk0/tmp/repl/spark-4347861282214610666-415cbfc1-bc72-4ecc-8182-d24eda276af6
23/09/22 03:12:09 INFO HikariDataSource: metastore-monitor - Starting...
23/09/22 03:12:09 INFO HikariDataSource: metastore-monitor - Start completed.
23/09/22 03:12:09 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
23/09/22 03:12:09 WARN SparkConf: The configuration key 'spark.akka.frameSize' has been deprecated as of Spark 1.6 and may be removed in the future. Please use the new key 'spark.rpc.message.maxSize' instead.
23/09/22 03:12:09 INFO SparkContext: Running Spark version 3.3.0
23/09/22 03:12:10 INFO ResourceUtils: ==============================================================
23/09/22 03:12:10 INFO ResourceUtils: No custom resources configured for spark.driver.
23/09/22 03:12:10 INFO ResourceUtils: ==============================================================
23/09/22 03:12:10 INFO SparkContext: Submitted application: Databricks Shell
23/09/22 03:12:10 INFO HikariDataSource: metastore-monitor - Shutdown initiated...
23/09/22 03:12:10 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 7284, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
23/09/22 03:12:10 INFO HikariDataSource: metastore-monitor - Shutdown completed.
23/09/22 03:12:10 INFO ResourceProfile: Limiting resource is cpu
23/09/22 03:12:10 INFO ResourceProfileManager: Added ResourceProfile id: 0
23/09/22 03:12:10 INFO MetastoreMonitor: Metastore healthcheck successful (connection duration = 1514 milliseconds)
23/09/22 03:12:10 INFO SecurityManager: Changing view acls to: root
23/09/22 03:12:10 INFO SecurityManager: Changing modify acls to: root
23/09/22 03:12:10 INFO SecurityManager: Changing view acls groups to: 
23/09/22 03:12:10 INFO SecurityManager: Changing modify acls groups to: 
23/09/22 03:12:10 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
23/09/22 03:12:10 INFO Utils: Successfully started service 'sparkDriver' on port 40381.
23/09/22 03:12:10 INFO SparkEnv: Registering MapOutputTracker
23/09/22 03:12:11 INFO SparkEnv: Registering BlockManagerMaster
23/09/22 03:12:11 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/09/22 03:12:11 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
23/09/22 03:12:11 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
23/09/22 03:12:11 INFO DiskBlockManager: Created local directory at /local_disk0/blockmgr-d9671de5-06a4-41c4-a396-6164c52e9d6e
23/09/22 03:12:11 INFO MemoryStore: MemoryStore started with capacity 3.3 GiB
23/09/22 03:12:11 INFO SparkEnv: Registering OutputCommitCoordinator
23/09/22 03:12:11 INFO SparkContext: Spark configuration:
eventLog.rolloverIntervalSeconds=900
libraryDownload.sleepIntervalSeconds=5
libraryDownload.timeoutSeconds=180
spark.akka.frameSize=256
spark.app.name=Databricks Shell
spark.app.startTime=1695352329292
spark.cleaner.referenceTracking.blocking=false
spark.databricks.acl.client=com.databricks.spark.sql.acl.client.SparkSqlAclClient
spark.databricks.acl.dfAclsEnabled=false
spark.databricks.acl.enabled=false
spark.databricks.acl.provider=com.databricks.sql.acl.ReflectionBackedAclProvider
spark.databricks.acl.scim.client=com.databricks.spark.sql.acl.client.DriverToWebappScimClient
spark.databricks.automl.serviceEnabled=true
spark.databricks.cloudProvider=Azure
spark.databricks.cloudfetch.hasRegionSupport=true
spark.databricks.cloudfetch.requesterClassName=*********(redacted)
spark.databricks.clusterSource=UI
spark.databricks.clusterUsageTags.attribute_tag_budget=
spark.databricks.clusterUsageTags.attribute_tag_dust_execution_env=
spark.databricks.clusterUsageTags.attribute_tag_dust_maintainer=
spark.databricks.clusterUsageTags.attribute_tag_dust_suite=
spark.databricks.clusterUsageTags.attribute_tag_service=
spark.databricks.clusterUsageTags.autoTerminationMinutes=15
spark.databricks.clusterUsageTags.azureSubscriptionId=a4f54399-8db8-4849-adcc-a42aed1fb97f
spark.databricks.clusterUsageTags.cloudProvider=Azure
spark.databricks.clusterUsageTags.clusterAllTags=[{"key":"Vendor","value":"Databricks"},{"key":"Creator","value":"jason.yip@tredence.com"},{"key":"ClusterName","value":"jason.yip@tredence.com's Cluster"},{"key":"ClusterId","value":"0808-055325-43kdx9a4"},{"key":"Environment","value":"POC"},{"key":"Project","value":"SI"},{"key":"DatabricksEnvironment","value":"workerenv-4679476628690204"}]
spark.databricks.clusterUsageTags.clusterAvailability=SPOT_WITH_FALLBACK_AZURE
spark.databricks.clusterUsageTags.clusterCreator=Webapp
spark.databricks.clusterUsageTags.clusterFirstOnDemand=1
spark.databricks.clusterUsageTags.clusterGeneration=58
spark.databricks.clusterUsageTags.clusterId=0808-055325-43kdx9a4
spark.databricks.clusterUsageTags.clusterLastActivityTime=1695351309363
spark.databricks.clusterUsageTags.clusterLogDeliveryEnabled=true
spark.databricks.clusterUsageTags.clusterLogDestination=dbfs:/cluster-logs
spark.databricks.clusterUsageTags.clusterMaxWorkers=2
spark.databricks.clusterUsageTags.clusterMetastoreAccessType=RDS_DIRECT
spark.databricks.clusterUsageTags.clusterMinWorkers=1
spark.databricks.clusterUsageTags.clusterName=jason.yip@tredence.com's Cluster
spark.databricks.clusterUsageTags.clusterNoDriverDaemon=false
spark.databricks.clusterUsageTags.clusterNodeType=Standard_DS3_v2
spark.databricks.clusterUsageTags.clusterNumCustomTags=0
spark.databricks.clusterUsageTags.clusterNumSshKeys=0
spark.databricks.clusterUsageTags.clusterOwnerOrgId=4679476628690204
spark.databricks.clusterUsageTags.clusterOwnerUserId=*********(redacted)
spark.databricks.clusterUsageTags.clusterPinned=false
spark.databricks.clusterUsageTags.clusterPythonVersion=3
spark.databricks.clusterUsageTags.clusterResourceClass=default
spark.databricks.clusterUsageTags.clusterScalingType=autoscaling
spark.databricks.clusterUsageTags.clusterSizeType=VM_CONTAINER
spark.databricks.clusterUsageTags.clusterSku=STANDARD_SKU
spark.databricks.clusterUsageTags.clusterSpotBidMaxPrice=-1.0
spark.databricks.clusterUsageTags.clusterState=Pending
spark.databricks.clusterUsageTags.clusterStateMessage=Starting Spark
spark.databricks.clusterUsageTags.clusterTargetWorkers=1
spark.databricks.clusterUsageTags.clusterUnityCatalogMode=*********(redacted)
spark.databricks.clusterUsageTags.clusterWorkers=1
spark.databricks.clusterUsageTags.containerType=LXC
spark.databricks.clusterUsageTags.dataPlaneRegion=westus2
spark.databricks.clusterUsageTags.driverContainerId=5603f7b1e1d64b3fb68a6cbede3b5d75
spark.databricks.clusterUsageTags.driverContainerPrivateIp=10.11.115.134
spark.databricks.clusterUsageTags.driverInstanceId=48c9146699474759853905f5e39b09cf
spark.databricks.clusterUsageTags.driverInstancePrivateIp=10.11.115.198
spark.databricks.clusterUsageTags.driverNodeType=Standard_DS3_v2
spark.databricks.clusterUsageTags.effectiveSparkVersion=11.3.x-cpu-ml-scala2.12
spark.databricks.clusterUsageTags.enableCredentialPassthrough=*********(redacted)
spark.databricks.clusterUsageTags.enableDfAcls=false
spark.databricks.clusterUsageTags.enableElasticDisk=true
spark.databricks.clusterUsageTags.enableGlueCatalogCredentialPassthrough=*********(redacted)
spark.databricks.clusterUsageTags.enableJdbcAutoStart=true
spark.databricks.clusterUsageTags.enableJobsAutostart=true
spark.databricks.clusterUsageTags.enableLocalDiskEncryption=false
spark.databricks.clusterUsageTags.enableSqlAclsOnly=false
spark.databricks.clusterUsageTags.hailEnabled=false
spark.databricks.clusterUsageTags.ignoreTerminationEventInAlerting=false
spark.databricks.clusterUsageTags.instanceWorkerEnvId=workerenv-4679476628690204
spark.databricks.clusterUsageTags.instanceWorkerEnvNetworkType=vnet-injection
spark.databricks.clusterUsageTags.isDpCpPrivateLinkEnabled=false
spark.databricks.clusterUsageTags.isIMv2Enabled=false
spark.databricks.clusterUsageTags.isServicePrincipalCluster=false
spark.databricks.clusterUsageTags.isSingleUserCluster=*********(redacted)
spark.databricks.clusterUsageTags.managedResourceGroup=databricks-rg-SI-ADB-runlzayanl524
spark.databricks.clusterUsageTags.ngrokNpipEnabled=true
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2=1
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Abfss=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Dbfs=1
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2File=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Gcs=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2S3=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Volumes=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Workspace=0
spark.databricks.clusterUsageTags.numPerGlobalInitScriptsV2=0
spark.databricks.clusterUsageTags.orgId=4679476628690204
spark.databricks.clusterUsageTags.privateLinkEnabled=false
spark.databricks.clusterUsageTags.region=westus2
spark.databricks.clusterUsageTags.runtimeEngine=STANDARD
spark.databricks.clusterUsageTags.sparkEnvVarContainsBacktick=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsDollarSign=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsDoubleQuotes=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsEscape=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsNewline=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsSingleQuotes=false
spark.databricks.clusterUsageTags.sparkImageLabel=release__11.3.x-snapshot-cpu-ml-scala2.12__databricks-universe__11.3.20__1e4a70b__6deb9aa__jenkins__2a43af3__format-3
spark.databricks.clusterUsageTags.sparkMasterUrlType=*********(redacted)
spark.databricks.clusterUsageTags.sparkVersion=11.3.x-cpu-ml-scala2.12
spark.databricks.clusterUsageTags.userId=*********(redacted)
spark.databricks.clusterUsageTags.userProvidedRemoteVolumeCount=*********(redacted)
spark.databricks.clusterUsageTags.userProvidedRemoteVolumeSizeGb=*********(redacted)
spark.databricks.clusterUsageTags.userProvidedRemoteVolumeType=*********(redacted)
spark.databricks.clusterUsageTags.userProvidedSparkVersion=*********(redacted)
spark.databricks.clusterUsageTags.workerEnvironmentId=workerenv-4679476628690204
spark.databricks.credential.aws.secretKey.redactor=*********(redacted)
spark.databricks.credential.redactor=*********(redacted)
spark.databricks.credential.scope.fs.adls.gen2.tokenProviderClassName=*********(redacted)
spark.databricks.credential.scope.fs.gs.auth.access.tokenProviderClassName=*********(redacted)
spark.databricks.credential.scope.fs.impl=*********(redacted)
spark.databricks.credential.scope.fs.s3a.tokenProviderClassName=*********(redacted)
spark.databricks.delta.logStore.crossCloud.fatal=true
spark.databricks.delta.multiClusterWrites.enabled=true
spark.databricks.delta.preview.enabled=true
spark.databricks.driverNfs.clusterWidePythonLibsEnabled=true
spark.databricks.driverNfs.enabled=true
spark.databricks.driverNfs.pathSuffix=.ephemeral_nfs
spark.databricks.driverNodeTypeId=Standard_DS3_v2
spark.databricks.enablePublicDbfsFuse=false
spark.databricks.eventLog.dir=eventlogs
spark.databricks.eventLog.enabled=true
spark.databricks.eventLog.listenerClassName=com.databricks.backend.daemon.driver.DBCEventLoggingListener
spark.databricks.io.directoryCommit.enableLogicalDelete=false
spark.databricks.managedCatalog.clientClassName=com.databricks.managedcatalog.ManagedCatalogClientImpl
spark.databricks.metrics.filesystem_io_metrics=true
spark.databricks.mlflow.autologging.enabled=true
spark.databricks.overrideDefaultCommitProtocol=org.apache.spark.sql.execution.datasources.SQLHadoopMapReduceCommitProtocol
spark.databricks.passthrough.adls.gen2.tokenProviderClassName=*********(redacted)
spark.databricks.passthrough.adls.tokenProviderClassName=*********(redacted)
spark.databricks.passthrough.enabled=false
spark.databricks.passthrough.glue.credentialsProviderFactoryClassName=*********(redacted)
spark.databricks.passthrough.glue.executorServiceFactoryClassName=*********(redacted)
spark.databricks.passthrough.oauth.refresher.impl=*********(redacted)
spark.databricks.passthrough.s3a.threadPoolExecutor.factory.class=com.databricks.backend.daemon.driver.aws.S3APassthroughThreadPoolExecutorFactory
spark.databricks.passthrough.s3a.tokenProviderClassName=*********(redacted)
spark.databricks.preemption.enabled=true
spark.databricks.privateLinkEnabled=false
spark.databricks.python.defaultPythonRepl=ipykernel
spark.databricks.redactor=com.databricks.spark.util.DatabricksSparkLogRedactorProxy
spark.databricks.repl.enableClassFileCleanup=true
spark.databricks.secret.envVar.keys.toRedact=*********(redacted)
spark.databricks.secret.sparkConf.keys.toRedact=*********(redacted)
spark.databricks.service.dbutils.repl.backend=com.databricks.dbconnect.ReplDBUtils
spark.databricks.service.dbutils.server.backend=com.databricks.dbconnect.SparkServerDBUtils
spark.databricks.session.share=false
spark.databricks.sparkContextId=4347861282214610666
spark.databricks.sql.configMapperClass=com.databricks.dbsql.config.SqlConfigMapperBridge
spark.databricks.tahoe.logStore.aws.class=com.databricks.tahoe.store.MultiClusterLogStore
spark.databricks.tahoe.logStore.azure.class=com.databricks.tahoe.store.AzureLogStore
spark.databricks.tahoe.logStore.class=com.databricks.tahoe.store.DelegatingLogStore
spark.databricks.tahoe.logStore.gcp.class=com.databricks.tahoe.store.GCPLogStore
spark.databricks.unityCatalog.credentialManager.apiTokenProviderClassName=*********(redacted)
spark.databricks.unityCatalog.credentialManager.tokenRefreshEnabled=*********(redacted)
spark.databricks.unityCatalog.enabled=false
spark.databricks.workerNodeTypeId=Standard_DS3_v2
spark.databricks.workspaceUrl=*********(redacted)
spark.databricks.wsfs.workspacePrivatePreview=true
spark.databricks.wsfsPublicPreview=true
spark.delta.sharing.profile.provider.class=*********(redacted)
spark.driver.allowMultipleContexts=false
spark.driver.extraJavaOptions=-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED
spark.driver.host=10.11.115.134
spark.driver.maxResultSize=4g
spark.driver.port=40381
spark.driver.tempDirectory=/local_disk0/tmp
spark.eventLog.enabled=false
spark.executor.extraClassPath=/databricks/spark/dbconf/log4j/executor:/databricks/spark/dbconf/jets3t/:/databricks/spark/dbconf/hadoop:/databricks/hive/conf:/databricks/jars/*
spark.executor.extraJavaOptions=-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED -Djava.io.tmpdir=/local_disk0/tmp -XX:ReservedCodeCacheSize=512m -XX:+UseCodeCacheFlushing -XX:PerMethodRecompilationCutoff=-1 -XX:PerBytecodeRecompilationCutoff=-1 -Djava.security.properties=/databricks/spark/dbconf/java/extra.security -XX:-UseContainerSupport -XX:+PrintFlagsFinal -XX:+PrintGCDateStamps -XX:+PrintGCDetails -verbose:gc -Xss4m -Djava.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni -Djavax.xml.datatype.DatatypeFactory=com.sun.org.apache.xerces.internal.jaxp.datatype.DatatypeFactoryImpl -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl -Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl -Djavax.xml.validation.SchemaFactory:http://www.w3.org/2001/XMLSchema=com.sun.org.apache.xerces.internal.jaxp.validation.XMLSchemaFactory -Dorg.xml.sax.driver=com.sun.org.apache.xerces.internal.parsers.SAXParser -Dorg.w3c.dom.DOMImplementationSourceList=com.sun.org.apache.xerces.internal.dom.DOMXSImplementationSourceImpl -Djavax.net.ssl.sessionCacheSize=10000 -Dscala.reflect.runtime.disable.typetag.cache=true -Dcom.google.cloud.spark.bigquery.repackaged.io.netty.tryReflectionSetAccessible=true -Dlog4j2.formatMsgNoLookups=true -Ddatabricks.serviceName=spark-executor-1
spark.executor.id=driver
spark.executor.memory=7284m
spark.executor.tempDirectory=/local_disk0/tmp
spark.extraListeners=io.openlineage.spark.agent.OpenLineageSparkListener
spark.files.fetchFailure.unRegisterOutputOnHost=true
spark.files.overwrite=true
spark.files.useFetchCache=false
spark.hadoop.databricks.dbfs.client.version=v2
spark.hadoop.databricks.fs.perfMetrics.enable=true
spark.hadoop.databricks.s3.amazonS3Client.cache.enabled=true
spark.hadoop.databricks.s3.create.deleteUnnecessaryFakeDirectories=false
spark.hadoop.databricks.s3.verifyBucketExists.enabled=false
spark.hadoop.databricks.s3commit.client.sslTrustAll=false
spark.hadoop.fs.AbstractFileSystem.gs.impl=shaded.databricks.com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS
spark.hadoop.fs.abfs.impl=shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemHadoop3
spark.hadoop.fs.abfs.impl.disable.cache=true
spark.hadoop.fs.abfss.impl=shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.SecureAzureBlobFileSystemHadoop3
spark.hadoop.fs.abfss.impl.disable.cache=true
spark.hadoop.fs.adl.impl=com.databricks.adl.AdlFileSystem
spark.hadoop.fs.adl.impl.disable.cache=true
spark.hadoop.fs.azure.authorization.caching.enable=false
spark.hadoop.fs.azure.cache.invalidator.type=com.databricks.encryption.utils.CacheInvalidatorImpl
spark.hadoop.fs.azure.skip.metrics=true
spark.hadoop.fs.azure.user.agent.prefix=*********(redacted)
spark.hadoop.fs.cpfs-abfss.impl=*********(redacted)
spark.hadoop.fs.cpfs-abfss.impl.disable.cache=true
spark.hadoop.fs.cpfs-adl.impl=*********(redacted)
spark.hadoop.fs.cpfs-adl.impl.disable.cache=true
spark.hadoop.fs.cpfs-s3.impl=*********(redacted)
spark.hadoop.fs.cpfs-s3a.impl=*********(redacted)
spark.hadoop.fs.cpfs-s3n.impl=*********(redacted)
spark.hadoop.fs.dbfs.impl=com.databricks.backend.daemon.data.client.DbfsHadoop3
spark.hadoop.fs.dbfsartifacts.impl=com.databricks.backend.daemon.data.client.DBFSV1
spark.hadoop.fs.fcfs-abfs.impl=*********(redacted)
spark.hadoop.fs.fcfs-abfs.impl.disable.cache=true
spark.hadoop.fs.fcfs-abfss.impl=*********(redacted)
spark.hadoop.fs.fcfs-abfss.impl.disable.cache=true
spark.hadoop.fs.fcfs-s3.impl=*********(redacted)
spark.hadoop.fs.fcfs-s3.impl.disable.cache=true
spark.hadoop.fs.fcfs-s3a.impl=*********(redacted)
spark.hadoop.fs.fcfs-s3a.impl.disable.cache=true
spark.hadoop.fs.fcfs-s3n.impl=*********(redacted)
spark.hadoop.fs.fcfs-s3n.impl.disable.cache=true
spark.hadoop.fs.fcfs-wasb.impl=*********(redacted)
spark.hadoop.fs.fcfs-wasb.impl.disable.cache=true
spark.hadoop.fs.fcfs-wasbs.impl=*********(redacted)
spark.hadoop.fs.fcfs-wasbs.impl.disable.cache=true
spark.hadoop.fs.file.impl=com.databricks.backend.daemon.driver.WorkspaceLocalFileSystem
spark.hadoop.fs.gs.impl=shaded.databricks.com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemHadoop3
spark.hadoop.fs.gs.impl.disable.cache=true
spark.hadoop.fs.gs.outputstream.upload.chunk.size=16777216
spark.hadoop.fs.idbfs.impl=com.databricks.io.idbfs.IdbfsFileSystem
spark.hadoop.fs.mcfs-s3a.impl=com.databricks.sql.acl.fs.ManagedCatalogFileSystem
spark.hadoop.fs.mlflowdbfs.impl=com.databricks.mlflowdbfs.MlflowdbfsFileSystem
spark.hadoop.fs.s3.impl=shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystemHadoop3
spark.hadoop.fs.s3.impl.disable.cache=true
spark.hadoop.fs.s3a.assumed.role.credentials.provider=*********(redacted)
spark.hadoop.fs.s3a.attempts.maximum=10
spark.hadoop.fs.s3a.block.size=67108864
spark.hadoop.fs.s3a.connection.maximum=200
spark.hadoop.fs.s3a.connection.timeout=50000
spark.hadoop.fs.s3a.fast.upload=true
spark.hadoop.fs.s3a.fast.upload.active.blocks=32
spark.hadoop.fs.s3a.fast.upload.default=true
spark.hadoop.fs.s3a.impl=shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystemHadoop3
spark.hadoop.fs.s3a.impl.disable.cache=true
spark.hadoop.fs.s3a.max.total.tasks=1000
spark.hadoop.fs.s3a.multipart.size=10485760
spark.hadoop.fs.s3a.multipart.threshold=104857600
spark.hadoop.fs.s3a.retry.limit=20
spark.hadoop.fs.s3a.retry.throttle.interval=500ms
spark.hadoop.fs.s3a.threads.max=136
spark.hadoop.fs.s3n.impl=shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystemHadoop3
spark.hadoop.fs.s3n.impl.disable.cache=true
spark.hadoop.fs.stage.impl=com.databricks.backend.daemon.driver.managedcatalog.PersonalStagingFileSystem
spark.hadoop.fs.stage.impl.disable.cache=true
spark.hadoop.fs.wasb.impl=shaded.databricks.org.apache.hadoop.fs.azure.NativeAzureFileSystem
spark.hadoop.fs.wasb.impl.disable.cache=true
spark.hadoop.fs.wasbs.impl=shaded.databricks.org.apache.hadoop.fs.azure.NativeAzureFileSystem
spark.hadoop.fs.wasbs.impl.disable.cache=true
spark.hadoop.hive.hmshandler.retry.attempts=10
spark.hadoop.hive.hmshandler.retry.interval=2000
spark.hadoop.hive.server2.enable.doAs=false
spark.hadoop.hive.server2.idle.operation.timeout=7200000
spark.hadoop.hive.server2.idle.session.timeout=900000
spark.hadoop.hive.server2.keystore.password=*********(redacted)
spark.hadoop.hive.server2.keystore.path=/databricks/keys/jetty-ssl-driver-keystore.jks
spark.hadoop.hive.server2.session.check.interval=60000
spark.hadoop.hive.server2.thrift.http.cookie.auth.enabled=false
spark.hadoop.hive.server2.thrift.http.port=10000
spark.hadoop.hive.server2.transport.mode=http
spark.hadoop.hive.server2.use.SSL=true
spark.hadoop.hive.warehouse.subdir.inherit.perms=false
spark.hadoop.mapred.output.committer.class=com.databricks.backend.daemon.data.client.DirectOutputCommitter
spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version=2
spark.hadoop.parquet.abfs.readahead.optimization.enabled=true
spark.hadoop.parquet.block.size.row.check.max=10
spark.hadoop.parquet.block.size.row.check.min=10
spark.hadoop.parquet.filter.columnindex.enabled=false
spark.hadoop.parquet.memory.pool.ratio=0.5
spark.hadoop.parquet.page.metadata.validation.enabled=true
spark.hadoop.parquet.page.size.check.estimate=false
spark.hadoop.parquet.page.verify-checksum.enabled=true
spark.hadoop.parquet.page.write-checksum.enabled=true
spark.hadoop.spark.databricks.io.parquet.verifyChecksumOnWrite.enabled=false
spark.hadoop.spark.databricks.io.parquet.verifyChecksumOnWrite.throwsException=false
spark.hadoop.spark.driverproxy.customHeadersToProperties=*********(redacted)
spark.hadoop.spark.hadoop.aws.glue.cache.db.size=1000
spark.hadoop.spark.hadoop.aws.glue.cache.db.ttl-mins=30
spark.hadoop.spark.hadoop.aws.glue.cache.table.size=1000
spark.hadoop.spark.hadoop.aws.glue.cache.table.ttl-mins=30
spark.hadoop.spark.sql.parquet.output.committer.class=org.apache.spark.sql.parquet.DirectParquetOutputCommitter
spark.hadoop.spark.sql.sources.outputCommitterClass=com.databricks.backend.daemon.data.client.MapReduceDirectOutputCommitter
spark.home=/databricks/spark
spark.logConf=true
spark.master=spark://10.11.115.134:7077
spark.metrics.conf=/databricks/spark/conf/metrics.properties
spark.openlineage.endpoint=api/v1/lineage
spark.openlineage.namespace=adb-5445974573286168.8#default
spark.openlineage.url=*********(redacted)
spark.openlineage.url.param.code=*********(redacted)
spark.r.backendConnectionTimeout=604800
spark.r.numRBackendThreads=1
spark.rdd.compress=true
spark.repl.class.outputDir=/local_disk0/tmp/repl/spark-4347861282214610666-415cbfc1-bc72-4ecc-8182-d24eda276af6
spark.rpc.message.maxSize=256
spark.scheduler.listenerbus.eventqueue.capacity=20000
spark.scheduler.mode=FAIR
spark.serializer.objectStreamReset=100
spark.shuffle.manager=SORT
spark.shuffle.memoryFraction=0.2
spark.shuffle.reduceLocality.enabled=false
spark.shuffle.service.enabled=true
spark.shuffle.service.port=4048
spark.sparklyr-backend.threads=1
spark.sparkr.use.daemon=false
spark.speculation=false
spark.speculation.multiplier=3
spark.speculation.quantile=0.9
spark.sql.allowMultipleContexts=false
spark.sql.hive.convertCTAS=true
spark.sql.hive.convertMetastoreParquet=true
spark.sql.hive.metastore.jars=/databricks/databricks-hive/*
spark.sql.hive.metastore.sharedPrefixes=org.mariadb.jdbc,com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,microsoft.sql.DateTimeOffset,microsoft.sql.Types,com.databricks,com.codahale,com.fasterxml.jackson,shaded.databricks
spark.sql.hive.metastore.version=0.13.0
spark.sql.legacy.createHiveTableByDefault=false
spark.sql.parquet.cacheMetadata=true
spark.sql.parquet.compression.codec=snappy
spark.sql.sources.commitProtocolClass=com.databricks.sql.transaction.directory.DirectoryAtomicCommitProtocol
spark.sql.sources.default=delta
spark.sql.streaming.checkpointFileManagerClass=com.databricks.spark.sql.streaming.DatabricksCheckpointFileManager
spark.sql.streaming.stopTimeout=15s
spark.sql.warehouse.dir=*********(redacted)
spark.storage.blockManagerTimeoutIntervalMs=300000
spark.storage.memoryFraction=0.5
spark.streaming.driver.writeAheadLog.allowBatching=true
spark.streaming.driver.writeAheadLog.closeFileAfterWrite=true
spark.task.reaper.enabled=true
spark.task.reaper.killTimeout=60s
spark.ui.port=40001
spark.ui.prometheus.enabled=true
spark.worker.aioaLazyConfig.dbfsReadinessCheckClientClass=com.databricks.backend.daemon.driver.NephosDbfsReadinessCheckClient
spark.worker.aioaLazyConfig.iamReadinessCheckClientClass=com.databricks.backend.daemon.driver.NephosIamRoleCheckClient
spark.worker.cleanup.enabled=false
23/09/22 03:12:11 WARN MetricsSystem: Using default name SparkStatusTracker for source because neither spark.metrics.namespace nor spark.app.id is set.
23/09/22 03:12:11 INFO log: Logging initialized @14416ms to org.eclipse.jetty.util.log.Slf4jLog
23/09/22 03:12:11 INFO Server: jetty-9.4.46.v20220331; built: 2022-03-31T16:38:08.030Z; git: bc17a0369a11ecf40bb92c839b9ef0a8ac50ea18; jvm 1.8.0_362-b09
23/09/22 03:12:11 INFO Server: Started @14686ms
23/09/22 03:12:12 INFO AbstractConnector: Started ServerConnector@544300a6{HTTP/1.1, (http/1.1)}{10.11.115.134:40001}
23/09/22 03:12:12 INFO Utils: Successfully started service 'SparkUI' on port 40001.
23/09/22 03:12:12 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@18fa5af6{/,null,AVAILABLE,@Spark}
23/09/22 03:12:12 WARN FairSchedulableBuilder: Fair Scheduler configuration file not found so jobs will be scheduled in FIFO order. To use fair scheduling, configure pools in fairscheduler.xml or set spark.scheduler.allocation.file to a file that contains the configuration.
23/09/22 03:12:12 INFO FairSchedulableBuilder: Created default pool: default, schedulingMode: FIFO, minShare: 0, weight: 1
23/09/22 03:12:12 INFO DatabricksEdgeConfigs: serverlessEnabled : false
23/09/22 03:12:12 INFO DatabricksEdgeConfigs: perfPackEnabled : false
23/09/22 03:12:12 INFO DatabricksEdgeConfigs: classicSqlEnabled : false
23/09/22 03:12:12 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://10.11.115.134:7077...
23/09/22 03:12:12 INFO TransportClientFactory: Successfully created connection to /10.11.115.134:7077 after 111 ms (0 ms spent in bootstraps)
23/09/22 03:12:13 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20230922031213-0000
23/09/22 03:12:13 INFO TaskSchedulerImpl: Task preemption enabled.
23/09/22 03:12:13 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44293.
23/09/22 03:12:13 INFO NettyBlockTransferService: Server created on 10.11.115.134:44293
23/09/22 03:12:13 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
23/09/22 03:12:13 INFO BlockManager: external shuffle service port = 4048
23/09/22 03:12:13 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.11.115.134, 44293, None)
23/09/22 03:12:13 INFO BlockManagerMasterEndpoint: Registering block manager 10.11.115.134:44293 with 3.3 GiB RAM, BlockManagerId(driver, 10.11.115.134, 44293, None)
23/09/22 03:12:13 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.11.115.134, 44293, None)
23/09/22 03:12:13 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20230922031213-0000/0 on worker-20230922031209-10.11.115.133-34159 (10.11.115.133:34159) with 4 core(s)
23/09/22 03:12:13 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.11.115.134, 44293, None)
23/09/22 03:12:13 INFO StandaloneSchedulerBackend: Granted executor ID app-20230922031213-0000/0 on hostPort 10.11.115.133:34159 with 4 core(s), 7.1 GiB RAM
23/09/22 03:12:13 INFO DatabricksUtils: Disabling Databricks event logging listener because spark.extraListeners does not contain the Databricks event logger class
23/09/22 03:12:13 INFO SparkContext: Registered listener io.openlineage.spark.agent.OpenLineageSparkListener
23/09/22 03:12:13 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20230922031213-0000/0 is now RUNNING
23/09/22 03:12:14 INFO ContextHandler: Stopped o.e.j.s.ServletContextHandler@18fa5af6{/,null,STOPPED,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@b307030{/jobs,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@263f6e96{/jobs/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@39acf187{/jobs/job,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@dd3e1e3{/jobs/job/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7878459f{/stages,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5d24703e{/stages/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@17554316{/stages/stage,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5af1b221{/stages/stage/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7d49fe37{/stages/pool,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@231c521e{/stages/pool/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1be3a294{/storage,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@729d1428{/storage/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1728d307{/storage/rdd,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3f0b5619{/storage/rdd/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@36ce9eaf{/environment,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1f27f354{/environment/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4425b6ed{/executors,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5039c2cf{/executors/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5ca006ac{/executors/threadDump,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1372696b{/executors/threadDump/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@206e5183{/executors/heapHistogram,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@32eb38e5{/executors/heapHistogram/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@21539796{/static,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@68ea1eb5{/,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@d7c00de{/api,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6046fba0{/metrics,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@755033c5{/jobs/job/kill,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5b49b1df{/stages/stage/kill,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@118c1faa{/metrics/json,null,AVAILABLE,@Spark}
23/09/22 03:12:14 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
23/09/22 03:12:14 INFO SparkContext: Loading Spark Service RPC Server. Classloader stack:List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@512dc0e0, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@556e4588, sun.misc.Launcher$AppClassLoader@1c53fd30, sun.misc.Launcher$ExtClassLoader@35a9782c)
23/09/22 03:12:15 INFO SparkServiceRPCServer: Initializing Spark Service RPC Server. Classloader stack: List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@512dc0e0, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@556e4588, sun.misc.Launcher$AppClassLoader@1c53fd30, sun.misc.Launcher$ExtClassLoader@35a9782c)
23/09/22 03:12:15 INFO SparkServiceRPCServer: Starting Spark Service RPC Server
23/09/22 03:12:15 INFO SparkServiceRPCServer: Starting Spark Service RPC Server. Classloader stack: List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@512dc0e0, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@556e4588, sun.misc.Launcher$AppClassLoader@1c53fd30, sun.misc.Launcher$ExtClassLoader@35a9782c)
23/09/22 03:12:15 INFO Server: jetty-9.4.46.v20220331; built: 2022-03-31T16:38:08.030Z; git: bc17a0369a11ecf40bb92c839b9ef0a8ac50ea18; jvm 1.8.0_362-b09
23/09/22 03:12:15 INFO AbstractConnector: Started ServerConnector@457d3f54{HTTP/1.1, (http/1.1)}{0.0.0.0:15001}
23/09/22 03:12:15 INFO Server: Started @18100ms
23/09/22 03:12:15 INFO DatabricksILoop$: Finished creating throwaway interpreter
23/09/22 03:12:15 INFO DatabricksILoop$: Successfully registered spark metrics in Prometheus registry
23/09/22 03:12:15 INFO DatabricksILoop$: Successfully initialized SparkContext
23/09/22 03:12:16 INFO SharedState: Scheduler stats enabled.
23/09/22 03:12:16 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir.
23/09/22 03:12:16 INFO SharedState: Warehouse path is 'dbfs:/user/hive/warehouse'.
23/09/22 03:12:16 INFO AsyncEventQueue: Process of event SparkListenerApplicationStart(Databricks Shell,Some(app-20230922031213-0000),1695352329292,root,None,None,None) by listener OpenLineageSparkListener took 1.933345606s.
23/09/22 03:12:16 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@134ec0f3{/storage/iocache,null,AVAILABLE,@Spark}
23/09/22 03:12:16 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@491f3fb0{/storage/iocache/json,null,AVAILABLE,@Spark}
23/09/22 03:12:16 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2a738d47{/SQL,null,AVAILABLE,@Spark}
23/09/22 03:12:16 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@8bd9d08{/SQL/json,null,AVAILABLE,@Spark}
23/09/22 03:12:16 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@d8a2b1b{/SQL/execution,null,AVAILABLE,@Spark}
23/09/22 03:12:16 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3328db4f{/SQL/execution/json,null,AVAILABLE,@Spark}
23/09/22 03:12:16 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@23169374{/static/sql,null,AVAILABLE,@Spark}
23/09/22 03:12:16 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
23/09/22 03:12:16 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
23/09/22 03:12:19 INFO DriverConf: Configured feature flag data source LaunchDarkly
23/09/22 03:12:19 INFO DriverConf: Load feature flag from LaunchDarkly
23/09/22 03:12:19 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/09/22 03:12:21 INFO StandaloneSchedulerBackend$StandaloneDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.11.115.133:57974) with ID 0, ResourceProfileId 0
23/09/22 03:12:21 INFO DatabricksMountsStore: Mount store initialization: Attempting to get the list of mounts from metadata manager of DBFS
23/09/22 03:12:21 INFO log: Logging initialized @24438ms to shaded.v9_4.org.eclipse.jetty.util.log.Slf4jLog
23/09/22 03:12:21 INFO DynamicRpcConf: Configured feature flag data source LaunchDarkly
23/09/22 03:12:21 INFO DynamicRpcConf: Load feature flag from LaunchDarkly
23/09/22 03:12:21 WARN DynamicRpcConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/09/22 03:12:22 INFO TypeUtil: JVM Runtime does not support Modules
23/09/22 03:12:22 INFO DatabricksMountsStore: Mount store initialization: Received a list of 9 mounts accessible from metadata manager of DBFS
23/09/22 03:12:22 INFO DatabricksMountsStore: Updated mounts cache. Changes: List((+,DbfsMountPoint(s3a://databricks-datasets-california/, /databricks-datasets)), (+,DbfsMountPoint(uc-volumes:/Volumes, /Volumes)), (+,DbfsMountPoint(unsupported-access-mechanism-for-path--use-mlflow-client:/, /databricks/mlflow-tracking)), (+,DbfsMountPoint(wasbs://dbstorage32gi53vs6kgpo.blob.core.windows.net/4679476628690204, /databricks-results)), (+,DbfsMountPoint(unsupported-access-mechanism-for-path--use-mlflow-client:/, /databricks/mlflow-registry)), (+,DbfsMountPoint(dbfs-reserved-path:/uc-volumes-reserved, /Volume)), (+,DbfsMountPoint(dbfs-reserved-path:/uc-volumes-reserved, /volumes)), (+,DbfsMountPoint(wasbs://dbstorage32gi53vs6kgpo.blob.core.windows.net/4679476628690204, /)), (+,DbfsMountPoint(dbfs-reserved-path:/uc-volumes-reserved, /volume)))
23/09/22 03:12:22 INFO BlockManagerMasterEndpoint: Registering block manager 10.11.115.133:45037 with 3.6 GiB RAM, BlockManagerId(0, 10.11.115.133, 45037, None)
23/09/22 03:12:22 INFO DatabricksFileSystemV2Factory: Creating wasbs file system for wasbs://root@dbstorage32gi53vs6kgpo.blob.core.windows.net
23/09/22 03:12:23 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:12:23 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:12:23 INFO DbfsHadoop3: Initialized DBFS with DBFSV2 as the delegate.
23/09/22 03:12:23 INFO HiveConf: Found configuration file file:/databricks/hive/conf/hive-site.xml
23/09/22 03:12:23 INFO SessionManager: HiveServer2: Background operation thread pool size: 100
23/09/22 03:12:23 INFO SessionManager: HiveServer2: Background operation thread wait queue size: 100
23/09/22 03:12:23 INFO SessionManager: HiveServer2: Background operation thread keepalive time: 10 seconds
23/09/22 03:12:23 INFO AbstractService: Service:OperationManager is inited.
23/09/22 03:12:23 INFO AbstractService: Service:SessionManager is inited.
23/09/22 03:12:23 INFO SparkSQLCLIService: Service: CLIService is inited.
23/09/22 03:12:23 INFO AbstractService: Service:ThriftHttpCLIService is inited.
23/09/22 03:12:23 INFO HiveThriftServer2: Service: HiveServer2 is inited.
23/09/22 03:12:23 INFO AbstractService: Service:OperationManager is started.
23/09/22 03:12:23 INFO AbstractService: Service:SessionManager is started.
23/09/22 03:12:23 INFO SparkSQLCLIService: Service: CLIService is started.
23/09/22 03:12:23 INFO AbstractService: Service:ThriftHttpCLIService is started.
23/09/22 03:12:23 INFO ThriftCLIService: HTTP Server SSL: adding excluded protocols: [SSLv2, SSLv3]
23/09/22 03:12:23 INFO ThriftCLIService: HTTP Server SSL: SslContextFactory.getExcludeProtocols = [SSL, SSLv2, SSLv2Hello, SSLv3]
23/09/22 03:12:23 INFO Server: jetty-9.4.46.v20220331; built: 2022-03-31T16:38:08.030Z; git: bc17a0369a11ecf40bb92c839b9ef0a8ac50ea18; jvm 1.8.0_362-b09
23/09/22 03:12:23 INFO session: DefaultSessionIdManager workerName=node0
23/09/22 03:12:23 INFO session: No SessionScavenger set, using defaults
23/09/22 03:12:23 INFO session: node0 Scavenging every 660000ms
23/09/22 03:12:23 WARN SecurityHandler: ServletContext@o.e.j.s.ServletContextHandler@3c6c87fa{/,null,STARTING} has uncovered http methods for path: /*
23/09/22 03:12:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3c6c87fa{/,null,AVAILABLE}
23/09/22 03:12:23 INFO SslContextFactory: x509=X509@12f85dc8(1,h=[az-westus.workers.prod.ns.databricks.com],a=[],w=[]) for Server@54a04eae[provider=null,keyStore=file:///databricks/keys/jetty-ssl-driver-keystore.jks,trustStore=null]
23/09/22 03:12:23 INFO AbstractConnector: Started ServerConnector@40f49d72{SSL, (ssl, http/1.1)}{0.0.0.0:10000}
23/09/22 03:12:23 INFO Server: Started @26472ms
23/09/22 03:12:23 INFO ThriftCLIService: Started ThriftHttpCLIService in https mode on port 10000 path=/cliservice/* with 5...500 worker threads
23/09/22 03:12:23 INFO AbstractService: Service:HiveServer2 is started.
23/09/22 03:12:23 INFO HiveThriftServer2: HiveThriftServer2 started
23/09/22 03:12:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@150eab74{/sqlserver,null,AVAILABLE,@Spark}
23/09/22 03:12:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@43f670f3{/sqlserver/json,null,AVAILABLE,@Spark}
23/09/22 03:12:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@95f61c2{/sqlserver/session,null,AVAILABLE,@Spark}
23/09/22 03:12:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@41f05f1{/sqlserver/session/json,null,AVAILABLE,@Spark}
23/09/22 03:12:23 INFO LibraryResolutionManager: Preferred maven central mirror is configured to https://maven-central.storage-download.googleapis.com/maven2/
23/09/22 03:12:23 INFO DriverCorral: Creating the driver context
23/09/22 03:12:23 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
23/09/22 03:12:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@fcd300e{/StreamingQuery,null,AVAILABLE,@Spark}
23/09/22 03:12:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5e9c6d8a{/StreamingQuery/json,null,AVAILABLE,@Spark}
23/09/22 03:12:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@24cdc97f{/StreamingQuery/statistics,null,AVAILABLE,@Spark}
23/09/22 03:12:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@466d87a1{/StreamingQuery/statistics/json,null,AVAILABLE,@Spark}
23/09/22 03:12:24 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1215982f{/static/sql,null,AVAILABLE,@Spark}
23/09/22 03:12:24 INFO JettyServer$: Creating thread pool with name ...
23/09/22 03:12:24 INFO JettyServer$: Thread pool created
23/09/22 03:12:24 INFO JettyServer$: Creating thread pool with name ...
23/09/22 03:12:24 INFO JettyServer$: Thread pool created
23/09/22 03:12:24 INFO DriverDaemon: Starting driver daemon...
23/09/22 03:12:24 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
23/09/22 03:12:24 WARN SparkConf: The configuration key 'spark.akka.frameSize' has been deprecated as of Spark 1.6 and may be removed in the future. Please use the new key 'spark.rpc.message.maxSize' instead.
23/09/22 03:12:24 INFO DriverDaemon$: Attempting to run: 'set up ttyd daemon'
23/09/22 03:12:24 INFO DriverDaemon$: Attempting to run: 'Configuring RStudio daemon'
23/09/22 03:12:24 INFO DriverDaemon$: Resetting the default python executable
23/09/22 03:12:24 INFO Utils: resolved command to be run: List(virtualenv, /local_disk0/.ephemeral_nfs/cluster_libraries/python, -p, /databricks/python/bin/python, --no-download, --no-setuptools, --no-wheel)
23/09/22 03:12:26 INFO DatabricksUtils: created python virtualenv: /local_disk0/.ephemeral_nfs/cluster_libraries/python
23/09/22 03:12:26 INFO Utils: resolved command to be run: List(/databricks/python/bin/python, -c, import sys; dirs=[p for p in sys.path if 'package' in p]; print(' '.join(dirs)))
23/09/22 03:12:26 INFO Utils: resolved command to be run: List(/local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python, -c, from distutils.sysconfig import get_python_lib; print(get_python_lib()))
23/09/22 03:12:26 INFO DatabricksUtils: created sites.pth at /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages/sites.pth
23/09/22 03:12:26 INFO ClusterWidePythonEnvManager: Registered /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages with the WatchService sun.nio.fs.LinuxWatchService$LinuxWatchKey@1ab6a093
23/09/22 03:12:26 INFO DriverDaemon$: Attempting to run: 'Update root virtualenv'
23/09/22 03:12:26 INFO Utils: resolved command to be run: WrappedArray(getconf, PAGESIZE)
23/09/22 03:12:26 INFO DriverDaemon$: Finished updating /etc/environment
23/09/22 03:12:26 INFO DriverDaemon$$anon$1: Message out thread ready
23/09/22 03:12:26 INFO Server: jetty-9.4.46.v20220331; built: 2022-03-31T16:38:08.030Z; git: bc17a0369a11ecf40bb92c839b9ef0a8ac50ea18; jvm 1.8.0_362-b09
23/09/22 03:12:26 INFO AbstractConnector: Started ServerConnector@59960ae9{HTTP/1.1, (http/1.1)}{0.0.0.0:6061}
23/09/22 03:12:26 INFO Server: Started @28920ms
23/09/22 03:12:26 INFO Server: jetty-9.4.46.v20220331; built: 2022-03-31T16:38:08.030Z; git: bc17a0369a11ecf40bb92c839b9ef0a8ac50ea18; jvm 1.8.0_362-b09
23/09/22 03:12:26 INFO SslContextFactory: x509=X509@7cd50c3d(1,h=[az-westus.workers.prod.ns.databricks.com],a=[],w=[]) for Server@7d97a1a0[provider=null,keyStore=null,trustStore=null]
23/09/22 03:12:26 WARN config: Weak cipher suite TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA enabled for Server@7d97a1a0[provider=null,keyStore=null,trustStore=null]
23/09/22 03:12:26 WARN config: Weak cipher suite TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA enabled for Server@7d97a1a0[provider=null,keyStore=null,trustStore=null]
23/09/22 03:12:26 INFO AbstractConnector: Started ServerConnector@2d4eba14{SSL, (ssl, http/1.1)}{0.0.0.0:6062}
23/09/22 03:12:26 INFO Server: Started @28983ms
23/09/22 03:12:26 INFO DriverDaemon: Started comm channel server
23/09/22 03:12:26 INFO DriverDaemon: Driver daemon started.
23/09/22 03:12:26 INFO DynamicInfoServiceConf: Configured feature flag data source LaunchDarkly
23/09/22 03:12:26 INFO DynamicInfoServiceConf: Load feature flag from LaunchDarkly
23/09/22 03:12:26 WARN DynamicInfoServiceConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/09/22 03:12:26 INFO FeatureFlagRegister$$anon$1: Configured feature flag data source LaunchDarkly
23/09/22 03:12:26 INFO FeatureFlagRegister$$anon$1: Load feature flag from LaunchDarkly
23/09/22 03:12:26 WARN FeatureFlagRegister$$anon$1: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/09/22 03:12:26 WARN FeatureFlagRegister$$anon$2: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/09/22 03:12:27 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
23/09/22 03:12:27 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
23/09/22 03:12:27 INFO DriverCorral: Loading the root classloader
23/09/22 03:12:27 INFO DriverCorral: Starting sql repl ReplId-3e938-b2918-a40bd-0
23/09/22 03:12:27 INFO DriverCorral: Starting sql repl ReplId-4866c-e0496-c4e7c-0
23/09/22 03:12:27 INFO DriverCorral: Starting sql repl ReplId-5be81-765d5-a450d-b
23/09/22 03:12:27 INFO SQLDriverWrapper: setupRepl:ReplId-5be81-765d5-a450d-b: finished to load
23/09/22 03:12:27 INFO SQLDriverWrapper: setupRepl:ReplId-4866c-e0496-c4e7c-0: finished to load
23/09/22 03:12:28 INFO DriverCorral: Starting sql repl ReplId-7fb2e-41a1e-7bb98-6
23/09/22 03:12:28 INFO SQLDriverWrapper: setupRepl:ReplId-3e938-b2918-a40bd-0: finished to load
23/09/22 03:12:28 INFO SQLDriverWrapper: setupRepl:ReplId-7fb2e-41a1e-7bb98-6: finished to load
23/09/22 03:12:28 INFO DriverCorral: Starting sql repl ReplId-366bb-6c8fc-7e848-1
23/09/22 03:12:28 INFO SQLDriverWrapper: setupRepl:ReplId-366bb-6c8fc-7e848-1: finished to load
23/09/22 03:12:28 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
23/09/22 03:12:28 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
23/09/22 03:12:28 INFO DriverCorral: Starting r repl ReplId-37c67-11e71-085f2-b
23/09/22 03:12:28 INFO ROutputStreamHandler: Connection succeeded on port 33759
23/09/22 03:12:28 INFO ROutputStreamHandler: Connection succeeded on port 38659
23/09/22 03:12:28 INFO RDriverLocal: 1. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: object created with for ReplId-37c67-11e71-085f2-b.
23/09/22 03:12:28 INFO RDriverLocal: 2. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: initializing ...
23/09/22 03:12:28 INFO RDriverLocal: 3. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: started RBackend thread on port 44567
23/09/22 03:12:28 INFO RDriverLocal: 4. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: waiting for SparkR to be installed ...
23/09/22 03:12:31 WARN DriverDaemon: ShouldUseAutoscalingInfo exception thrown, not logging stack trace. This is used for control flow and is ok to ignore
23/09/22 03:12:45 INFO RDriverLocal$: SparkR installation completed.
23/09/22 03:12:45 INFO RDriverLocal: 5. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: launching R process ...
23/09/22 03:12:45 INFO RDriverLocal: 6. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: cgroup isolation disabled, not placing R process in REPL cgroup.
23/09/22 03:12:45 INFO RDriverLocal: 7. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: starting R process on port 1100 (attempt 1) ...
23/09/22 03:12:45 INFO RDriverLocal$: Debugging command for R process builder: SIMBASPARKINI=/etc/simba.sparkodbc.ini R_LIBS=/local_disk0/.ephemeral_nfs/envs/rEnv-f2101999-5405-42d1-9a13-54e56b10c595:/databricks/spark/R/lib:/local_disk0/.ephemeral_nfs/cluster_libraries/r LD_LIBRARY_PATH=/opt/simba/sparkodbc/lib/64/ SPARKR_BACKEND_CONNECTION_TIMEOUT=604800 DB_STREAM_BEACON_STRING_START=DATABRICKS_STREAM_START-ReplId-37c67-11e71-085f2-b DB_STDOUT_STREAM_PORT=33759 SPARKR_BACKEND_AUTH_SECRET=66ff8ddb37f72244f65671addc6c280315e049a38bc9f2d69956c1351b9dff0a DB_STREAM_BEACON_STRING_END=DATABRICKS_STREAM_END-ReplId-37c67-11e71-085f2-b EXISTING_SPARKR_BACKEND_PORT=44567 ODBCINI=/etc/odbc.ini DB_STDERR_STREAM_PORT=38659 /bin/bash /local_disk0/tmp/_startR.sh2234149851276446982resource.r /local_disk0/tmp/_rServeScript.r3565608712795707271resource.r 1100 None
23/09/22 03:12:45 INFO RDriverLocal: 8. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: setting up BufferedStreamThread with bufferSize: 1000.
23/09/22 03:12:47 INFO RDriverLocal: 9. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: R process started with RServe listening on port 1100.
23/09/22 03:12:47 INFO RDriverLocal: 10. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: starting interpreter to talk to R process ...
23/09/22 03:12:47 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
23/09/22 03:12:48 INFO ROutputStreamHandler: Successfully connected to stdout in the RShell.
23/09/22 03:12:48 INFO ROutputStreamHandler: Successfully connected to stderr in the RShell.
23/09/22 03:12:48 INFO RDriverLocal: 11. RDriverLocal.9f9878f0-af22-4610-8015-9ba9cba97f56: R interpreter is connected.
23/09/22 03:12:48 INFO RDriverWrapper: setupRepl:ReplId-37c67-11e71-085f2-b: finished to load
23/09/22 03:13:34 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
23/09/22 03:13:34 INFO DriverCorral: Starting python repl ReplId-285c6-06788-c5eb5-e
23/09/22 03:13:34 INFO JupyterDriverLocal: Starting gateway server for repl ReplId-285c6-06788-c5eb5-e
23/09/22 03:13:34 INFO PythonPy4JUtil: Using pinned thread mode in Py4J
23/09/22 03:13:35 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
23/09/22 03:13:35 INFO DynamicTracingConf: Configured feature flag data source LaunchDarkly
23/09/22 03:13:35 INFO DynamicTracingConf: Load feature flag from LaunchDarkly
23/09/22 03:13:35 WARN DynamicTracingConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/09/22 03:13:35 INFO DriverCorral: Starting sql repl ReplId-7a2d7-f0b9e-5e69d-c
23/09/22 03:13:35 INFO SQLDriverWrapper: setupRepl:ReplId-7a2d7-f0b9e-5e69d-c: finished loading
23/09/22 03:13:35 INFO CommChannelWebSocket: onWebSocketConnect: websocket connected with session: WebSocketSession[websocket=JettyAnnotatedEventDriver[com.databricks.backend.daemon.driver.CommChannelWebSocket@30cb2f61],behavior=SERVER,connection=WebSocketServerConnection@1fc76509::DecryptedEndPoint@1a0baa83{l=/10.11.115.134:6062,r=/10.11.115.198:55930,OPEN,fill=-,flush=-,to=160/7200000},remote=WebSocketRemoteEndpoint@3e406be6[batching=true],incoming=JettyAnnotatedEventDriver[com.databricks.backend.daemon.driver.CommChannelWebSocket@30cb2f61],outgoing=ExtensionStack[queueSize=0,extensions=[],incoming=org.eclipse.jetty.websocket.common.WebSocketSession,outgoing=org.eclipse.jetty.websocket.server.WebSocketServerConnection]]
23/09/22 03:13:35 INFO OutgoingDirectNotebookMessageBuffer: Start MessageSendTask with session: 162088433
23/09/22 03:13:37 INFO VirtualenvCloneHelper: Creating notebook-scoped virtualenv for b418d423-c52b-4877-8abc-07050e47b11d
23/09/22 03:13:37 INFO VirtualenvCloneHelper: Creating notebook-scoped virtualenv for f74596f4-5304-42b9-9f73-fa7bc858b89c
23/09/22 03:13:37 INFO Utils: resolved command to be run: List(virtualenv, /local_disk0/.ephemeral_nfs/envs/pythonEnv-f74596f4-5304-42b9-9f73-fa7bc858b89c, -p, /local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python, --no-download, --no-setuptools, --no-wheel)
23/09/22 03:13:37 INFO Utils: resolved command to be run: List(virtualenv, /local_disk0/.ephemeral_nfs/envs/pythonEnv-b418d423-c52b-4877-8abc-07050e47b11d, -p, /local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python, --no-download, --no-setuptools, --no-wheel)
23/09/22 03:13:37 INFO DatabricksUtils: created python virtualenv: /local_disk0/.ephemeral_nfs/envs/pythonEnv-f74596f4-5304-42b9-9f73-fa7bc858b89c
23/09/22 03:13:37 INFO Utils: resolved command to be run: List(/local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python, -c, import sys; dirs=[p for p in sys.path if 'package' in p]; print(' '.join(dirs)))
23/09/22 03:13:37 INFO DatabricksUtils: created python virtualenv: /local_disk0/.ephemeral_nfs/envs/pythonEnv-b418d423-c52b-4877-8abc-07050e47b11d
23/09/22 03:13:37 INFO Utils: resolved command to be run: List(/local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python, -c, import sys; dirs=[p for p in sys.path if 'package' in p]; print(' '.join(dirs)))
23/09/22 03:13:37 INFO Utils: resolved command to be run: List(/local_disk0/.ephemeral_nfs/envs/pythonEnv-f74596f4-5304-42b9-9f73-fa7bc858b89c/bin/python, -c, from distutils.sysconfig import get_python_lib; print(get_python_lib()))
23/09/22 03:13:37 INFO Utils: resolved command to be run: List(/local_disk0/.ephemeral_nfs/envs/pythonEnv-b418d423-c52b-4877-8abc-07050e47b11d/bin/python, -c, from distutils.sysconfig import get_python_lib; print(get_python_lib()))
23/09/22 03:13:37 INFO DatabricksUtils: created sites.pth at /local_disk0/.ephemeral_nfs/envs/pythonEnv-f74596f4-5304-42b9-9f73-fa7bc858b89c/lib/python3.9/site-packages/sites.pth
23/09/22 03:13:37 INFO NotebookScopedPythonEnvManager: Time spent to start virtualenv /local_disk0/.ephemeral_nfs/envs/pythonEnv-f74596f4-5304-42b9-9f73-fa7bc858b89c is 462 ms
23/09/22 03:13:37 INFO NotebookScopedPythonEnvManager: Registered /local_disk0/.ephemeral_nfs/envs/pythonEnv-f74596f4-5304-42b9-9f73-fa7bc858b89c/lib/python3.9/site-packages with the WatchService sun.nio.fs.LinuxWatchService$LinuxWatchKey@75a9c171
23/09/22 03:13:37 INFO DatabricksUtils: created sites.pth at /local_disk0/.ephemeral_nfs/envs/pythonEnv-b418d423-c52b-4877-8abc-07050e47b11d/lib/python3.9/site-packages/sites.pth
23/09/22 03:13:37 INFO NotebookScopedPythonEnvManager: Time spent to start virtualenv /local_disk0/.ephemeral_nfs/envs/pythonEnv-b418d423-c52b-4877-8abc-07050e47b11d is 495 ms
23/09/22 03:13:37 INFO NotebookScopedPythonEnvManager: Registered /local_disk0/.ephemeral_nfs/envs/pythonEnv-b418d423-c52b-4877-8abc-07050e47b11d/lib/python3.9/site-packages with the WatchService sun.nio.fs.LinuxWatchService$LinuxWatchKey@1df8f4ae
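
The sequence above is the notebook-scoped environment setup: clone a bare virtualenv off the cluster Python (--no-download --no-setuptools --no-wheel), ask the new interpreter for its site-packages via distutils, then drop a sites.pth there so cluster-wide libraries remain importable. A rough Python equivalent with illustrative paths; the cluster site-packages entry written in step 3 is an assumption, since the log does not show the file's contents:

    import subprocess
    from pathlib import Path

    base_python = "/local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python"
    env_dir = Path("/local_disk0/.ephemeral_nfs/envs/pythonEnv-<uuid>")  # illustrative

    # 1. Clone a bare env on top of the cluster Python.
    subprocess.run(
        ["virtualenv", str(env_dir), "-p", base_python,
         "--no-download", "--no-setuptools", "--no-wheel"],
        check=True,
    )

    # 2. Ask the new interpreter where its site-packages lives.
    site = subprocess.run(
        [str(env_dir / "bin" / "python"), "-c",
         "from distutils.sysconfig import get_python_lib; print(get_python_lib())"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

    # 3. A .pth file keeps cluster-wide packages importable from the new env
    #    (the exact entries Databricks writes are not shown in the log).
    Path(site, "sites.pth").write_text(
        "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages\n"
    )
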
23/09/22 03:13:37 INFO IpykernelUtils$: Python process builder: [bash, /local_disk0/.ephemeral_nfs/envs/pythonEnv-f74596f4-5304-42b9-9f73-fa7bc858b89c/python_start_f74596f4-5304-42b9-9f73-fa7bc858b89c.sh, /databricks/spark/python/pyspark/wrapped_python.py, root, /local_disk0/.ephemeral_nfs/envs/pythonEnv-f74596f4-5304-42b9-9f73-fa7bc858b89c/bin/python, /databricks/python_shell/scripts/db_ipykernel_launcher.py, -f, /databricks/kernel-connections/7979f0a86bbf7b0beee0790480e55b2f93f4ffb25c936be5716b2ec62f608d01.json]
23/09/22 03:13:37 INFO IpykernelUtils$: Cgroup isolation disabled, not placing python process in repl cgroup
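
The ipykernel launch above follows the standard Jupyter kernel protocol: the -f flag points db_ipykernel_launcher.py at a connection file holding the five ZeroMQ ports and the HMAC signing key, so any Jupyter client can attach. A sketch using the stock jupyter_client API against the connection file named in the log (assuming the kernel is still alive):

    from jupyter_client import BlockingKernelClient

    kc = BlockingKernelClient()
    # The connection file written for this REPL's kernel (shell/iopub/stdin/
    # control/heartbeat ports plus the HMAC key).
    kc.load_connection_file(
        "/databricks/kernel-connections/"
        "7979f0a86bbf7b0beee0790480e55b2f93f4ffb25c936be5716b2ec62f608d01.json"
    )
    kc.start_channels()
    kc.execute("print('hello from the notebook kernel')")
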
23/09/22 03:13:37 INFO ProgressReporter$: Added result fetcher for 8803832534457543132_7062199902851827812_65e2f9e7-9eb1-4d20-b3b7-bcf8c99891cf
23/09/22 03:13:38 INFO ClusterLoadMonitor: Added query with execution ID:0. Current active queries:1
23/09/22 03:13:38 INFO LogicalPlanStats: Setting LogicalPlanStats visitor to com.databricks.sql.optimizer.statsEstimation.DatabricksLogicalPlanStatsVisitor$
23/09/22 03:13:38 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 0.0, New Ema: 1.0 
23/09/22 03:13:40 INFO SecuredHiveExternalCatalog: creating hiveClient from java.lang.Throwable
	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:79)
	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:77)
	at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:113)
	at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
	at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:377)
	at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:363)
	at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:313)
	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:263)
	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:253)
	at org.apache.spark.sql.internal.SharedState.$anonfun$globalTempViewManager$1(SharedState.scala:336)
	at org.apache.spark.sql.internal.SharedState.$anonfun$globalTempViewExternalCatalogNameCheck$1(SharedState.scala:308)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at scala.util.Try$.apply(Try.scala:213)
	at org.apache.spark.sql.internal.SharedState.globalTempViewExternalCatalogNameCheck(SharedState.scala:308)
	at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:336)
	at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:332)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$hiveCatalog$2(HiveSessionStateBuilder.scala:78)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.globalTempViewManager$lzycompute(SessionCatalog.scala:554)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.globalTempViewManager(SessionCatalog.scala:554)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.setCurrentDatabaseWithoutCheck(SessionCatalog.scala:831)
	at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.setCurrentDatabaseWithoutCheck(ManagedCatalogSessionCatalog.scala:503)
	at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.setCurrentCatalog(ManagedCatalogSessionCatalog.scala:366)
	at com.databricks.sql.DatabricksCatalogManager.setCurrentCatalog(DatabricksCatalogManager.scala:135)
	at org.apache.spark.sql.execution.command.SetCatalogCommand.run(SetCatalogCommand.scala:30)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:229)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:249)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:399)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:194)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
	at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:148)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:349)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:229)
	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:214)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:227)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:220)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:298)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:294)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:220)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:354)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:220)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:174)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:165)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:107)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:104)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:820)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:815)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:695)
	at com.databricks.backend.daemon.driver.SQLDriverLocal.$anonfun$executeSql$1(SQLDriverLocal.scala:91)
	at scala.collection.immutable.List.map(List.scala:293)
	at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:37)
	at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:145)
	at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$24(DriverLocal.scala:740)
	at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
	at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$21(DriverLocal.scala:723)
	at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:403)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:147)
	at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:401)
	at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:398)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:62)
	at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:446)
	at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:431)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:62)
	at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:700)
	at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:622)
	at scala.util.Try$.apply(Try.scala:213)
	at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:614)
	at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:533)
	at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:568)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:438)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:381)
	at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:232)
	at java.lang.Thread.run(Thread.java:750)

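The stack above is informational, not a failure: SecuredHiveExternalCatalog constructs a Throwable solely to record which code path first forced the lazy Hive client, and logs its stack trace at INFO. The same trick in Python, for reference (a sketch, not Databricks code):

    import logging
    import traceback

    logging.basicConfig(level=logging.INFO)

    def create_client():
        # Capture the current call stack without raising anything -- the Python
        # analogue of newing up a Throwable just for its stack trace.
        logging.info("creating client from\n%s", "".join(traceback.format_stack()))

    create_client()
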
23/09/22 03:13:40 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
23/09/22 03:13:40 INFO HiveUtils: Initializing HiveMetastoreConnection version 0.13.0 using file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-common--org.apache.hive.shims__hive-shims-common__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.xerial.snappy--snappy-java--org.xerial.snappy__snappy-java__1.0.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.mortbay.jetty--jetty--org.mortbay.jetty__jetty__6.1.26.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.codehaus.jackson--jackson-mapper-asl--org.codehaus.jackson__jackson-mapper-asl__1.9.13.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.servlet--servlet-api--javax.servlet__servlet-api__2.5.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.slf4j--slf4j-api--org.slf4j__slf4j-api__1.7.36.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-common--org.apache.hive__hive-common__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.avro--avro--org.apache.avro__avro__1.7.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.httpcomponents--httpclient--org.apache.httpcomponents__httpclient__4.4.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.commons--commons-lang3--org.apache.commons__commons-lang3__3.4.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--stax--stax-api--stax__stax-api__1.0.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.thrift--libfb303--org.apache.thrift__libfb303__0.9.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-common-secure--org.apache.hive.shims__hive-shims-common-secure__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.antlr--stringtemplate--org.antlr__stringtemplate__3.2.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-serde--org.apache.hive__hive-serde__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--net.sf.jpam--jpam--net.sf.jpam__jpam__1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.google.code.findbugs--jsr305--com.google.code.findbugs__jsr305__1.3.9.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-exec--org.apache.hive__hive-exec__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.datanucleus--datanucleus-rdbms--org.datanucleus__datanucleus-rdbms__4.1.19.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.geronimo.specs--geronimo-jaspic_1.0_spec--org.apache.geronimo.specs__geronimo-jaspic_1.0_spec__1.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.ow2.asm--asm--or
g.ow2.asm__asm__4.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-cli--org.apache.hive__hive-cli__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--asm--asm--asm__asm__3.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--antlr--antlr--antlr__antlr__2.7.7.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.mortbay.jetty--servlet-api--org.mortbay.jetty__servlet-api__2.5-20081211.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-0.20S--org.apache.hive.shims__hive-shims-0.20S__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.ant--ant-launcher--org.apache.ant__ant-launcher__1.9.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.velocity--velocity--org.apache.velocity__velocity__1.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.mail--mail--javax.mail__mail__1.4.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.thrift--libthrift--org.apache.thrift__libthrift__0.9.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.google.guava--guava--com.google.guava__guava__11.0.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.mortbay.jetty--jetty-util--org.mortbay.jetty__jetty-util__6.1.26.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-service--org.apache.hive__hive-service__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-slf4j-impl--org.apache.logging.log4j__log4j-slf4j-impl__2.18.0.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-api--org.apache.logging.log4j__log4j-api__2.18.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--junit--junit--junit__junit__3.8.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.commons--commons-compress--org.apache.commons__commons-compress__1.9.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-logging--commons-logging--commons-logging__commons-logging__1.1.3.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.zookeeper--zookeeper--org.apache.zookeeper__zookeeper__3.4.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.jdo--jdo-api--javax.jdo__jdo-api__3.0.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-0.20--org.apache.hive.shims__hive-shims-0.20__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.ant--ant--org.apache.ant__ant__1.9.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.objenesis--objenesis--org.objenesis__objenesis__1.2.jar:file:/databricks/databricks-hive/----ws_
3_3--maven-trees--hive-metastore-databricks-log4j2--asm--asm-commons--asm__asm-commons__3.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-io--commons-io--commons-io__commons-io__2.5.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.thoughtworks.paranamer--paranamer--com.thoughtworks.paranamer__paranamer__2.8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.esotericsoftware.reflectasm--reflectasm-shaded--com.esotericsoftware.reflectasm__reflectasm-shaded__1.07.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-core--org.apache.logging.log4j__log4j-core__2.18.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.transaction--jta--javax.transaction__jta__1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--jline--jline--jline__jline__0.9.94.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.eclipse.jetty.aggregate--jetty-all--org.eclipse.jetty.aggregate__jetty-all__7.6.0.v20120127.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.datanucleus--datanucleus-core--org.datanucleus__datanucleus-core__4.1.17.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-httpclient--commons-httpclient--commons-httpclient__commons-httpclient__3.0.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.antlr--antlr-runtime--org.antlr__antlr-runtime__3.4.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-ant--org.apache.hive__hive-ant__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.antlr--ST4--org.antlr__ST4__4.0.4.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--oro--oro--oro__oro__2.0.8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-jdbc--org.apache.hive__hive-jdbc__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-beeline--org.apache.hive__hive-beeline__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.transaction--transaction-api--javax.transaction__transaction-api__1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-lang--commons-lang--commons-lang__commons-lang__2.4.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-cli--commons-cli--commons-cli__commons-cli__1.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.esotericsoftware.kryo--kryo--com.esotericsoftware.kryo__kryo__2.21.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive.shims--hive-shims-0.23--org.apache.hive.shims__hive-shims-0.23__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.httpcomponents--httpcore--org.apache.httpcomponents__httpcore__4.2.5.jar:file:/dat
abricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.codehaus.jackson--jackson-core-asl--org.codehaus.jackson__jackson-core-asl__1.9.13.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--asm--asm-tree--asm__asm-tree__3.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.esotericsoftware.minlog--minlog--com.esotericsoftware.minlog__minlog__1.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.geronimo.specs--geronimo-annotation_1.0_spec--org.apache.geronimo.specs__geronimo-annotation_1.0_spec__1.1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-codec--commons-codec--commons-codec__commons-codec__1.8.jar:file:/databricks/databricks-hive/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-1.2-api--org.apache.logging.log4j__log4j-1.2-api__2.18.0.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.codehaus.groovy--groovy-all--org.codehaus.groovy__groovy-all__2.1.6.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.datanucleus--javax.jdo--org.datanucleus__javax.jdo__3.2.0-m3.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-shims--org.apache.hive__hive-shims__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--commons-collections--commons-collections--commons-collections__commons-collections__3.2.2.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--javax.activation--activation--javax.activation__activation__1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.geronimo.specs--geronimo-jta_1.1_spec--org.apache.geronimo.specs__geronimo-jta_1.1_spec__1.1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.zaxxer--HikariCP--com.zaxxer__HikariCP__2.5.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.derby--derby--org.apache.derby__derby__10.10.1.1.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.apache.hive--hive-metastore--org.apache.hive__hive-metastore__0.13.1-databricks-8.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--com.jolbox--bonecp--com.jolbox__bonecp__0.8.0.RELEASE.jar:file:/databricks/databricks-hive/----ws_3_3--maven-trees--hive-metastore-databricks-log4j2--org.datanucleus--datanucleus-api-jdo--org.datanucleus__datanucleus-api-jdo__4.2.4.jar:file:/databricks/databricks-hive/manifest.jar:file:/databricks/databricks-hive/bonecp-configs.jar
23/09/22 03:13:40 INFO PoolingHiveClient: Hive metastore connection pool implementation is HikariCP
23/09/22 03:13:40 INFO LocalHiveClientsPool: Create Hive Metastore client pool of size 20
23/09/22 03:13:40 INFO HiveClientImpl: Warehouse location for Hive client (version 0.13.1) is dbfs:/user/hive/warehouse
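
dbfs:/user/hive/warehouse above is the resolved spark.sql.warehouse.dir, the default location for managed tables. It can be confirmed from any session (sketch; `spark` is the session Databricks predefines in notebooks):

    # `spark` is the session Databricks predefines in notebooks.
    print(spark.conf.get("spark.sql.warehouse.dir"))  # dbfs:/user/hive/warehouse
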
23/09/22 03:13:41 INFO HiveMetaStore: 0: Opening raw store with implementation class: org.apache.hadoop.hive.metastore.ObjectStore
23/09/22 03:13:41 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:13:40.231Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"START","run":{"runId":"83806b2b-6e39-49a4-a6f2-8efdc67da215","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.execution.command.SetCatalogCommand","num-children":0,"catalogName":"hive_metastore"}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.execute_set_catalog_command","facets":{}},"inputs":[],"outputs":[]}
23/09/22 03:13:41 INFO AsyncEventQueue: Process of event SparkListenerSQLExecutionStart(executionId=0, ...) by listener OpenLineageSparkListener took 1.055753891s.
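
The ConsoleTransport JSON above is an OpenLineage RunEvent for the SetCatalogCommand, emitted by the OpenLineageSparkListener attached to this cluster. To get the same console output on a plain Spark 3.3 session, the configuration would look roughly like this; the key names are assumed from OpenLineage 1.2.x conventions rather than taken from this cluster's setup:

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .config("spark.jars.packages", "io.openlineage:openlineage-spark:1.2.2")
        .config("spark.extraListeners",
                "io.openlineage.spark.agent.OpenLineageSparkListener")
        .config("spark.openlineage.transport.type", "console")  # -> ConsoleTransport
        .config("spark.openlineage.namespace", "default")
        .getOrCreate()
    )
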
23/09/22 03:13:41 INFO ObjectStore: ObjectStore, initialize called
23/09/22 03:13:41 INFO Persistence: Property datanucleus.fixedDatastore unknown - will be ignored
23/09/22 03:13:41 INFO Persistence: Property datanucleus.connectionPool.idleTimeout unknown - will be ignored
23/09/22 03:13:41 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
23/09/22 03:13:41 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
23/09/22 03:13:41 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:13:41 INFO HikariDataSource: HikariPool-1 - Started.
23/09/22 03:13:42 INFO HikariDataSource: HikariPool-2 - Started.
23/09/22 03:13:42 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
23/09/22 03:13:43 INFO ObjectStore: Initialized ObjectStore
23/09/22 03:13:43 INFO HiveMetaStore: Added admin role in metastore
23/09/22 03:13:43 INFO HiveMetaStore: Added public role in metastore
23/09/22 03:13:43 INFO HiveMetaStore: No user was added to the admin role, since the config is empty
23/09/22 03:13:44 INFO HiveMetaStore: 0: get_database: default
23/09/22 03:13:44 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: default	
23/09/22 03:13:44 INFO HiveMetaStore: 0: get_database: global_temp
23/09/22 03:13:44 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: global_temp	
23/09/22 03:13:44 ERROR RetryingHMSHandler: NoSuchObjectException(message:There is no database named global_temp)
	at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:508)
	at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:519)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
	at com.sun.proxy.$Proxy86.getDatabase(Unknown Source)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_database(HiveMetaStore.java:796)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
	at com.sun.proxy.$Proxy88.get_database(Unknown Source)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:949)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
	at com.sun.proxy.$Proxy89.getDatabase(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1165)
	at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1154)
	at org.apache.spark.sql.hive.client.Shim_v0_12.databaseExists(HiveShim.scala:619)
	at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$databaseExists$1(HiveClientImpl.scala:440)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:337)
	at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:236)
	at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:274)
	at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:228)
	at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:317)
	at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:440)
	at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$databaseExists$1(PoolingHiveClient.scala:321)
	at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$databaseExists$1$adapted(PoolingHiveClient.scala:320)
	at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
	at org.apache.spark.sql.hive.client.PoolingHiveClient.databaseExists(PoolingHiveClient.scala:320)
	at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:313)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
	at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:154)
	at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
	at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
	at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:377)
	at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:363)
	at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:313)
	at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.databaseExists(ExternalCatalogWithListener.scala:77)
	at org.apache.spark.sql.internal.SharedState.$anonfun$globalTempViewExternalCatalogNameCheck$1(SharedState.scala:308)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at scala.util.Try$.apply(Try.scala:213)
	at org.apache.spark.sql.internal.SharedState.globalTempViewExternalCatalogNameCheck(SharedState.scala:308)
	at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:336)
	at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:332)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$hiveCatalog$2(HiveSessionStateBuilder.scala:78)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.globalTempViewManager$lzycompute(SessionCatalog.scala:554)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.globalTempViewManager(SessionCatalog.scala:554)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.setCurrentDatabaseWithoutCheck(SessionCatalog.scala:831)
	at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.setCurrentDatabaseWithoutCheck(ManagedCatalogSessionCatalog.scala:503)
	at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.setCurrentCatalog(ManagedCatalogSessionCatalog.scala:366)
	at com.databricks.sql.DatabricksCatalogManager.setCurrentCatalog(DatabricksCatalogManager.scala:135)
	at org.apache.spark.sql.execution.command.SetCatalogCommand.run(SetCatalogCommand.scala:30)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:229)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:249)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:399)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:194)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
	at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:148)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:349)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:229)
	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:214)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:227)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:220)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:298)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:294)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:220)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:354)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:220)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:174)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:165)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:107)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:104)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:820)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:815)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:695)
	at com.databricks.backend.daemon.driver.SQLDriverLocal.$anonfun$executeSql$1(SQLDriverLocal.scala:91)
	at scala.collection.immutable.List.map(List.scala:293)
	at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:37)
	at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:145)
	at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$24(DriverLocal.scala:740)
	at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
	at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$21(DriverLocal.scala:723)
	at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:403)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:147)
	at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:401)
	at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:398)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:62)
	at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:446)
	at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:431)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:62)
	at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:700)
	at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:622)
	at scala.util.Try$.apply(Try.scala:213)
	at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:614)
	at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:533)
	at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:568)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:438)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:381)
	at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:232)
	at java.lang.Thread.run(Thread.java:750)

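This ERROR is expected noise: on first use of the session catalog, Spark probes the metastore for a database literally named global_temp so it can warn if a real database shadows the reserved global-temp-view namespace, and NoSuchObjectException is the normal answer. The namespace itself is only reachable through views, e.g. (sketch, with `spark` predefined):

    # global_temp is the reserved namespace for global temp views -- session
    # metadata, not a Hive database, which is why the lookup above fails.
    df = spark.range(3)
    df.createOrReplaceGlobalTempView("demo")
    spark.sql("SELECT * FROM global_temp.demo").show()
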
23/09/22 03:13:44 INFO HiveMetaStore: 0: get_database: default
23/09/22 03:13:44 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: default	
23/09/22 03:13:44 INFO HiveMetaStore: 0: get_database: default
23/09/22 03:13:44 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: default	
23/09/22 03:13:44 INFO ClusterLoadMonitor: Removed query with execution ID:0. Current active queries:0
23/09/22 03:13:44 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:13:44.19Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"COMPLETE","run":{"runId":"83806b2b-6e39-49a4-a6f2-8efdc67da215","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.execution.command.SetCatalogCommand","num-children":0,"catalogName":"hive_metastore"}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.execute_set_catalog_command","facets":{}},"inputs":[],"outputs":[]}
23/09/22 03:13:44 WARN SimpleFunctionRegistry: The function getargument replaced a previously registered function.
23/09/22 03:13:44 INFO ClusterLoadMonitor: Added query with execution ID:1. Current active queries:1
23/09/22 03:13:44 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:13:44 INFO HiveMetaStore: 0: get_databases: *
23/09/22 03:13:44 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_databases: *	
23/09/22 03:13:44 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:13:44.623Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"START","run":{"runId":"eacbd5dc-0514-4ec2-b963-f7dae875fdf3","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.catalyst.plans.logical.ShowNamespaces","num-children":1,"namespace":0,"output":[[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"databaseName","dataType":"string","nullable":false,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":6,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}]]},{"class":"org.apache.spark.sql.catalyst.analysis.ResolvedNamespace","num-children":0,"catalog":null,"namespace":[]}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.show_namespaces","facets":{}},"inputs":[],"outputs":[]}
23/09/22 03:13:46 INFO PythonDriverWrapper: setupRepl:ReplId-285c6-06788-c5eb5-e: finished loading
23/09/22 03:13:46 INFO ProgressReporter$: Added result fetcher for 2908305457167067998_7192583573582421287_a94f2305c01146bdabe8f83549508a51
23/09/22 03:13:46 INFO AsyncEventQueue: Process of event SparkListenerQueryProfileParamsReady(executionId=0, ...) by listener QueryProfileListener took 1.437714923s.
23/09/22 03:13:46 INFO CodeGenerator: Code generated in 1131.434114 ms
23/09/22 03:13:46 INFO ClusterLoadMonitor: Removed query with execution ID:1. Current active queries:0
23/09/22 03:13:46 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:13:46.546Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"COMPLETE","run":{"runId":"eacbd5dc-0514-4ec2-b963-f7dae875fdf3","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.catalyst.plans.logical.ShowNamespaces","num-children":1,"namespace":0,"output":[[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"databaseName","dataType":"string","nullable":false,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":6,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}]]},{"class":"org.apache.spark.sql.catalyst.analysis.ResolvedNamespace","num-children":0,"catalog":null,"namespace":[]}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.show_namespaces","facets":{}},"inputs":[],"outputs":[]}
23/09/22 03:13:46 INFO ClusterLoadMonitor: Added query with execution ID:2. Current active queries:1
23/09/22 03:13:46 INFO CodeGenerator: Code generated in 62.950143 ms
23/09/22 03:13:47 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerSQLExecutionStart
23/09/22 03:13:47 INFO ClusterLoadMonitor: Removed query with execution ID:2. Current active queries:0
23/09/22 03:13:47 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerSQLExecutionEnd
23/09/22 03:13:47 INFO ProgressReporter$: Removed result fetcher for 2908305457167067998_7192583573582421287_a94f2305c01146bdabe8f83549508a51
23/09/22 03:13:47 INFO CodeGenerator: Code generated in 34.004039 ms
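
The CodeGenerator timings above are Catalyst compiling whole-stage-codegen Java with Janino; the first compilation in a fresh JVM routinely takes about a second (1131 ms here), later ones tens of milliseconds. The generated code for any plan can be inspected directly (sketch):

    # Prints the Java source Catalyst hands to Janino for this plan.
    spark.range(1000).selectExpr("id * 2 AS doubled").explain("codegen")
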
23/09/22 03:13:47 INFO ProgressReporter$: Removed result fetcher for 8803832534457543132_7062199902851827812_65e2f9e7-9eb1-4d20-b3b7-bcf8c99891cf
23/09/22 03:13:47 INFO ProgressReporter$: Added result fetcher for 2908305457167067998_5655166856849056603_5a8498e54dc6435896f9d354ad4dc411
23/09/22 03:13:47 INFO ProgressReporter$: Removed result fetcher for 2908305457167067998_5655166856849056603_5a8498e54dc6435896f9d354ad4dc411
23/09/22 03:13:47 INFO ProgressReporter$: Added result fetcher for 2908305457167067998_5289524564745408939_d346c9547ff042428a53259a1692d220
23/09/22 03:13:47 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 1.0, New Ema: 0.85 
23/09/22 03:13:47 INFO ClusterLoadMonitor: Added query with execution ID:3. Current active queries:1
23/09/22 03:13:47 INFO LogicalPlanStats: Setting LogicalPlanStats visitor to com.databricks.sql.optimizer.statsEstimation.DatabricksLogicalPlanStatsVisitor$
23/09/22 03:13:47 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerSQLExecutionStart
23/09/22 03:13:47 INFO HiveMetaStore: 1: get_database: journey
23/09/22 03:13:47 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: journey	
23/09/22 03:13:47 INFO HiveMetaStore: 1: Opening raw store with implementation class: org.apache.hadoop.hive.metastore.ObjectStore
23/09/22 03:13:47 INFO ObjectStore: ObjectStore, initialize called
23/09/22 03:13:47 INFO ObjectStore: Initialized ObjectStore
23/09/22 03:13:47 INFO HiveMetaStore: 1: get_database: journey
23/09/22 03:13:47 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: journey	
23/09/22 03:13:47 INFO ClusterLoadMonitor: Removed query with execution ID:3. Current active queries:0
23/09/22 03:13:47 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerSQLExecutionEnd
23/09/22 03:13:47 INFO ClusterLoadMonitor: Added query with execution ID:4. Current active queries:1
23/09/22 03:13:47 INFO ClusterLoadMonitor: Removed query with execution ID:4. Current active queries:0
23/09/22 03:13:47 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerSQLExecutionStart
23/09/22 03:13:47 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerSQLExecutionEnd
23/09/22 03:13:48 INFO ProgressReporter$: Removed result fetcher for 2908305457167067998_5289524564745408939_d346c9547ff042428a53259a1692d220
23/09/22 03:13:48 INFO ProgressReporter$: Added result fetcher for 2908305457167067998_8129229420424498214_3fae42fffd6144fca582f98dbc9b4746
23/09/22 03:13:48 INFO ProgressReporter$: Removed result fetcher for 2908305457167067998_8129229420424498214_3fae42fffd6144fca582f98dbc9b4746
23/09/22 03:13:48 INFO ProgressReporter$: Added result fetcher for 2908305457167067998_6796555818560213290_0c0092c6b28541e7b10544b4b1cad76d
23/09/22 03:13:48 INFO HiveMetaStore: 1: get_table : db=journey tbl=transactions
23/09/22 03:13:48 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_table : db=journey tbl=transactions	
23/09/22 03:13:48 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:48 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:48 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:48 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:48 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:48 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:49 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:49 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:49 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:49 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
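
The repeated AzureNativeFileSystemStore/NativeAzureFileSystem pairs are hadoop-azure initializing the wasbs:// (TLS-wrapped Azure Blob) filesystem for the table paths that follow. Outside a pre-configured workspace this usually needs an account key or SAS in the Hadoop configuration; a sketch with a placeholder key and the account/container taken from this log's paths:

    # Placeholder key; the account and container are the ones in this log.
    spark.conf.set(
        "fs.azure.account.key.clororetaildevadls.blob.core.windows.net",
        "<storage-account-key>",
    )
    spark.read.format("delta").load(
        "wasbs://studio@clororetaildevadls.blob.core.windows.net"
        "/examples/data/csv/completejourney/silver/transactions"
    ).limit(5).show()
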
23/09/22 03:13:50 INFO DeltaLog: Loading version 16 starting from checkpoint version 10.
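
"Loading version 16 starting from checkpoint version 10" is the Delta log replay rule: rebuild the snapshot from the newest checkpoint parquet (version 10) plus the JSON commits after it (11 through 16), which is exactly the LogSegment listed below. The same history can be inspected with (sketch):

    # DESCRIBE HISTORY walks the same _delta_log the snapshot was built from.
    spark.sql(
        "DESCRIBE HISTORY delta.`wasbs://studio@clororetaildevadls.blob.core"
        ".windows.net/examples/data/csv/completejourney/silver/transactions`"
    ).select("version", "operation", "timestamp").show()
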
23/09/22 03:13:50 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 0.85, New Ema: 0.0 
23/09/22 03:13:51 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:51 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:51 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:51 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:51 INFO SnapshotEdge: [tableId=88997f34-e6ae-4a52-8e90-beab2ca48dfb] Created snapshot SnapshotEdge(path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log, version=16, metadata=Metadata(e409515a-4f0e-4b35-908c-3a8c6591a14f,null,null,Format(parquet,Map()),{"type":"struct","fields":[{"name":"household_id","type":"integer","nullable":true,"metadata":{}},{"name":"basket_id","type":"long","nullable":true,"metadata":{}},{"name":"day","type":"integer","nullable":true,"metadata":{}},{"name":"product_id","type":"integer","nullable":true,"metadata":{}},{"name":"quantity","type":"integer","nullable":true,"metadata":{}},{"name":"sales_amount","type":"float","nullable":true,"metadata":{}},{"name":"store_id","type":"integer","nullable":true,"metadata":{}},{"name":"discount_amount","type":"float","nullable":true,"metadata":{}},{"name":"transaction_time","type":"integer","nullable":true,"metadata":{}},{"name":"week_no","type":"integer","nullable":true,"metadata":{}},{"name":"coupon_discount","type":"float","nullable":true,"metadata":{}},{"name":"coupon_discount_match","type":"float","nullable":true,"metadata":{}}]},List(),Map(),Some(1694676659851)), logSegment=LogSegment(wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log,16,WrappedArray(FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000011.json; isDirectory=false; length=6616; replication=1; blocksize=536870912; modification_time=1695274264000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000012.json; isDirectory=false; length=11239; replication=1; blocksize=536870912; modification_time=1695274677000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000013.json; isDirectory=false; length=8080; replication=1; blocksize=536870912; modification_time=1695276655000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000014.json; isDirectory=false; length=6616; replication=1; blocksize=536870912; modification_time=1695346578000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000015.json; isDirectory=false; length=6616; replication=1; blocksize=536870912; modification_time=1695347164000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, 
FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000016.json; isDirectory=false; length=6616; replication=1; blocksize=536870912; modification_time=1695351300000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}),WrappedArray(FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000010.checkpoint.parquet; isDirectory=false; length=34444; replication=1; blocksize=536870912; modification_time=1695273438000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}),Some(10),1695351300000), checksumOpt=Some(VersionChecksum(20222849,4,1,1,Protocol(1,2),Metadata(e409515a-4f0e-4b35-908c-3a8c6591a14f,null,null,Format(parquet,Map()),{"type":"struct","fields":[{"name":"household_id","type":"integer","nullable":true,"metadata":{}},{"name":"basket_id","type":"long","nullable":true,"metadata":{}},{"name":"day","type":"integer","nullable":true,"metadata":{}},{"name":"product_id","type":"integer","nullable":true,"metadata":{}},{"name":"quantity","type":"integer","nullable":true,"metadata":{}},{"name":"sales_amount","type":"float","nullable":true,"metadata":{}},{"name":"store_id","type":"integer","nullable":true,"metadata":{}},{"name":"discount_amount","type":"float","nullable":true,"metadata":{}},{"name":"transaction_time","type":"integer","nullable":true,"metadata":{}},{"name":"week_no","type":"integer","nullable":true,"metadata":{}},{"name":"coupon_discount","type":"float","nullable":true,"metadata":{}},{"name":"coupon_discount_match","type":"float","nullable":true,"metadata":{}}]},List(),Map(),Some(1694676659851)),Some(FileSizeHistogram(Vector(0, 8192, 16384, 32768, 65536, 131072, 262144, 524288, 1048576, 2097152, 4194304, 8388608, 12582912, 16777216, 20971520, 25165824, 29360128, 33554432, 37748736, 41943040, 50331648, 58720256, 67108864, 75497472, 83886080, 92274688, 100663296, 109051904, 117440512, 125829120, 130023424, 134217728, 138412032, 142606336, 146800640, 150994944, 167772160, 184549376, 201326592, 218103808, 234881024, 251658240, 268435456, 285212672, 301989888, 318767104, 335544320, 352321536, 369098752, 385875968, 402653184, 419430400, 436207616, 452984832, 469762048, 486539264, 503316480, 520093696, 536870912, 553648128, 570425344, 587202560, 603979776, 671088640, 738197504, 805306368, 872415232, 939524096, 1006632960, 1073741824, 1140850688, 1207959552, 1275068416, 1342177280, 1409286144, 1476395008, 1610612736, 1744830464, 1879048192, 2013265920, 2147483648, 2415919104, 2684354560, 2952790016, 3221225472, 3489660928, 3758096384, 4026531840, 4294967296, 8589934592, 17179869184, 34359738368, 68719476736, 137438953472, 
274877906944),[J@15334883,[J@76dae8bb)),Some(b0819991-eddc-4afd-bd64-1591bc13547f),Some(List(AddFile(part-00000-dac72f33-722d-4e3f-9497-6046eeadaf78-c000.snappy.parquet,Map(),5283951,1695351297000,false,{"numRecords":672132,"minValues":{"household_id":1,"basket_id":26984851472,"day":1,"product_id":25671,"quantity":0,"sales_amount":0.0,"store_id":1,"discount_amount":-79.36,"transaction_time":0,"week_no":1,"coupon_discount":-29.99,"coupon_discount_match":-2.7},"maxValues":{"household_id":2500,"basket_id":30532627350,"day":240,"product_id":12949845,"quantity":85055,"sales_amount":505.0,"store_id":32124,"discount_amount":0.0,"transaction_time":2359,"week_no":35,"coupon_discount":0.0,"coupon_discount_match":0.0},"nullCount":{"household_id":0,"basket_id":0,"day":0,"product_id":0,"quantity":0,"sales_amount":0,"store_id":0,"discount_amount":0,"transaction_time":0,"week_no":0,"coupon_discount":0,"coupon_discount_match":0}},Map(INSERTION_TIME -> 1695351296000000, MIN_INSERTION_TIME -> 1695351296000000, MAX_INSERTION_TIME -> 1695351296000000, OPTIMIZE_TARGET_SIZE -> 268435456),null), AddFile(part-00003-8ff9238c-f34e-4d10-b70e-fcccd74a1e6d-c000.snappy.parquet,Map(),4537572,1695351296000,false,{"numRecords":587632,"minValues":{"household_id":1,"basket_id":40314850434,"day":568,"product_id":27160,"quantity":0,"sales_amount":0.0,"store_id":2,"discount_amount":-180.0,"transaction_time":0,"week_no":82,"coupon_discount":-31.46,"coupon_discount_match":-2.7},"maxValues":{"household_id":2500,"basket_id":42305362535,"day":711,"product_id":18316298,"quantity":45475,"sales_amount":631.8,"store_id":34280,"discount_amount":0.77,"transaction_time":2359,"week_no":102,"coupon_discount":0.0,"coupon_discount_match":0.0},"nullCount":{"household_id":0,"basket_id":0,"day":0,"product_id":0,"quantity":0,"sales_amount":0,"store_id":0,"discount_amount":0,"transaction_time":0,"week_no":0,"coupon_discount":0,"coupon_discount_match":0}},Map(INSERTION_TIME -> 1695351296000003, MIN_INSERTION_TIME -> 1695351296000003, MAX_INSERTION_TIME -> 1695351296000003, OPTIMIZE_TARGET_SIZE -> 268435456),null), AddFile(part-00002-618b4fff-77ad-4663-aaec-dbd5769515b1-c000.snappy.parquet,Map(),5238927,1695351296000,false,{"numRecords":667618,"minValues":{"household_id":1,"basket_id":32956680859,"day":401,"product_id":25671,"quantity":0,"sales_amount":0.0,"store_id":26,"discount_amount":-90.05,"transaction_time":0,"week_no":58,"coupon_discount":-37.93,"coupon_discount_match":-5.8},"maxValues":{"household_id":2500,"basket_id":40314850434,"day":568,"product_id":16809685,"quantity":89638,"sales_amount":329.99,"store_id":34016,"discount_amount":2.09,"transaction_time":2359,"week_no":82,"coupon_discount":0.0,"coupon_discount_match":0.0},"nullCount":{"household_id":0,"basket_id":0,"day":0,"product_id":0,"quantity":0,"sales_amount":0,"store_id":0,"discount_amount":0,"transaction_time":0,"week_no":0,"coupon_discount":0,"coupon_discount_match":0}},Map(INSERTION_TIME -> 1695351296000002, MIN_INSERTION_TIME -> 1695351296000002, MAX_INSERTION_TIME -> 1695351296000002, OPTIMIZE_TARGET_SIZE -> 268435456),null), 
AddFile(part-00001-894cd31a-620c-4f5e-9ea8-cb25d4193b6e-c000.snappy.parquet,Map(),5162399,1695351296000,false,{"numRecords":668350,"minValues":{"household_id":1,"basket_id":30532627350,"day":230,"product_id":25671,"quantity":0,"sales_amount":0.0,"store_id":2,"discount_amount":-129.98,"transaction_time":0,"week_no":34,"coupon_discount":-55.93,"coupon_discount_match":-7.7},"maxValues":{"household_id":2500,"basket_id":32956680859,"day":403,"product_id":14077546,"quantity":38348,"sales_amount":840.0,"store_id":33923,"discount_amount":3.99,"transaction_time":2359,"week_no":58,"coupon_discount":0.0,"coupon_discount_match":0.0},"nullCount":{"household_id":0,"basket_id":0,"day":0,"product_id":0,"quantity":0,"sales_amount":0,"store_id":0,"discount_amount":0,"transaction_time":0,"week_no":0,"coupon_discount":0,"coupon_discount_match":0}},Map(INSERTION_TIME -> 1695351296000001, MIN_INSERTION_TIME -> 1695351296000001, MAX_INSERTION_TIME -> 1695351296000001, OPTIMIZE_TARGET_SIZE -> 268435456),null))))))
23/09/22 03:13:51 INFO ClusterLoadMonitor: Added query with execution ID:5. Current active queries:1
23/09/22 03:13:51 INFO HiveMetaStore: 1: get_table : db=journey tbl=transactions
23/09/22 03:13:51 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_table : db=journey tbl=transactions	
23/09/22 03:13:51 INFO HiveClientImpl: Warehouse location for Hive client (version 0.13.1) is dbfs:/user/hive/warehouse
23/09/22 03:13:51 INFO HiveMetaStore: No user is added in admin role, since config is empty
23/09/22 03:13:51 INFO HiveMetaStore: 2: get_table : db=journey tbl=transactions
23/09/22 03:13:51 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_table : db=journey tbl=transactions	
23/09/22 03:13:51 INFO HiveMetaStore: 2: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
23/09/22 03:13:51 INFO ObjectStore: ObjectStore, initialize called
23/09/22 03:13:51 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:51 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:52 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:52 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:52 INFO ObjectStore: Initialized ObjectStore
23/09/22 03:13:52 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:52 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:52 INFO HiveMetaStore: 1: get_table : db=journey tbl=transactions
23/09/22 03:13:52 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_table : db=journey tbl=transactions	
23/09/22 03:13:52 INFO HiveMetaStore: 1: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
23/09/22 03:13:52 INFO ObjectStore: ObjectStore, initialize called
23/09/22 03:13:52 INFO ObjectStore: Initialized ObjectStore
23/09/22 03:13:52 INFO HiveMetaStore: 3: get_database: journey
23/09/22 03:13:52 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: journey	
23/09/22 03:13:52 INFO HiveMetaStore: 3: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
23/09/22 03:13:52 INFO ObjectStore: ObjectStore, initialize called
23/09/22 03:13:52 INFO ObjectStore: Initialized ObjectStore
23/09/22 03:13:52 INFO HiveMetaStore: 3: get_multi_table : db=journey tbls=transactions
23/09/22 03:13:52 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_multi_table : db=journey tbls=transactions	
23/09/22 03:13:52 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:52 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:52 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:13:51.932Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"START","run":{"runId":"4d1903f6-f932-4e4c-a79c-ba66a376f72c","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.execution.command.DropTableCommand","num-children":0,"tableName":{"product-class":"org.apache.spark.sql.catalyst.TableIdentifier","table":"transactions","database":"journey","catalog":"spark_catalog"},"ifExists":true,"isView":false,"purge":false,"materialized":false}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.execute_drop_table_command.silver_transactions","facets":{}},"inputs":[],"outputs":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/silver/transactions","facets":{"dataSource":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/DatasourceDatasetFacet.json#/$defs/DatasourceDatasetFacet","name":"wasbs://studio@clororetaildevadls.blob.core.windows.net","uri":"wasbs://studio@clororetaildevadls.blob.core.windows.net"},"symlinks":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/SymlinksDatasetFacet.json#/$defs/SymlinksDatasetFacet","identifiers":[{"namespace":"/examples/data/csv/completejourney/silver","name":"journey.transactions","type":"TABLE"}]},"lifecycleStateChange":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/LifecycleStateChangeDatasetFacet.json#/$defs/LifecycleStateChangeDatasetFacet","lifecycleStateChange":"DROP"}},"outputFacets":{}}]}
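The START event above captures a DropTableCommand on journey.transactions with ifExists=true, emitted by the OpenLineage Spark listener. A minimal sketch of the kind of statement that produces such an event; the listener/transport config keys shown in the comments are the standard openlineage-spark 1.2.2 ones and are assumptions, not values read from this log:

# Sketch, not the job's actual code: the statement whose plan appears in the
# event above ("class": DropTableCommand, ifExists=true, table journey.transactions).
# Assumes openlineage-spark 1.2.2 is attached, e.g. via
#   --conf spark.extraListeners=io.openlineage.spark.agent.OpenLineageSparkListener
#   --conf spark.openlineage.transport.type=console   (matches the ConsoleTransport lines here)
spark.sql("DROP TABLE IF EXISTS journey.transactions")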
23/09/22 03:13:52 INFO HiveMetaStore: 1: get_table : db=journey tbl=transactions
23/09/22 03:13:52 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_table : db=journey tbl=transactions	
23/09/22 03:13:52 INFO HiveMetaStore: 1: get_database: journey
23/09/22 03:13:52 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: journey	
23/09/22 03:13:52 INFO HiveMetaStore: 1: get_table : db=journey tbl=transactions
23/09/22 03:13:52 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_table : db=journey tbl=transactions	
23/09/22 03:13:52 INFO HiveMetaStore: 1: drop_table : db=journey tbl=transactions
23/09/22 03:13:52 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=drop_table : db=journey tbl=transactions	
23/09/22 03:13:52 INFO HiveMetaStore: 1: get_table : db=journey tbl=transactions
23/09/22 03:13:52 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_table : db=journey tbl=transactions	
23/09/22 03:13:53 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:53 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:53 INFO ClusterLoadMonitor: Removed query with execution ID:5. Current active queries:0
23/09/22 03:13:53 INFO HiveMetaStore: 2: get_table : db=journey tbl=transactions
23/09/22 03:13:53 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_table : db=journey tbl=transactions	
23/09/22 03:13:53 INFO HiveMetaStore: 2: get_database: journey
23/09/22 03:13:53 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: journey	
23/09/22 03:13:53 WARN DropTableCommandVisitor: Unable to find table by identifier `spark_catalog`.`journey`.`transactions` - Table or view 'transactions' not found in database 'journey'
23/09/22 03:13:53 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:13:53.535Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"COMPLETE","run":{"runId":"4d1903f6-f932-4e4c-a79c-ba66a376f72c","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.execution.command.DropTableCommand","num-children":0,"tableName":{"product-class":"org.apache.spark.sql.catalyst.TableIdentifier","table":"transactions","database":"journey","catalog":"spark_catalog"},"ifExists":true,"isView":false,"purge":false,"materialized":false}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.execute_drop_table_command.silver_transactions","facets":{}},"inputs":[],"outputs":[]}
23/09/22 03:13:53 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:53 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:54 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:54 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:54 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 1; threshold: 32
23/09/22 03:13:54 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:54 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:54 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 0; threshold: 32
23/09/22 03:13:54 INFO InMemoryFileIndex: It took 126 ms to list leaf files for 1 paths.
23/09/22 03:13:55 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:55 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:55 INFO ClusterLoadMonitor: Added query with execution ID:6. Current active queries:1
23/09/22 03:13:55 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:55 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:55 INFO DeltaLog: Loading version 16 starting from checkpoint version 10.
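The line above summarizes how the snapshot is assembled: start from the version-10 checkpoint and replay the JSON commits for versions 11 through 16, exactly the LogSegment listed in the SnapshotEdge entries. A self-contained sketch of the file-name arithmetic (the 20-digit zero-padded names match the _delta_log paths in this log):

# Sketch: which _delta_log files a reader needs for version 16 given a
# checkpoint at version 10 (names follow Delta's zero-padded convention).
checkpoint, version = 10, 16
files = [f"{checkpoint:020d}.checkpoint.parquet"] + [
    f"{v:020d}.json" for v in range(checkpoint + 1, version + 1)
]
print(files)
# ['00000000000000000010.checkpoint.parquet', '00000000000000000011.json',
#  ..., '00000000000000000016.json']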
23/09/22 03:13:55 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:13:55.374Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"START","run":{"runId":"09b465e3-ef2c-452a-be68-6bcb8d01fe80","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand","num-children":0,"query":[{"class":"org.apache.spark.sql.execution.datasources.LogicalRelation","num-children":0,"relation":null,"output":[[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"household_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":75,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"basket_id","dataType":"long","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":76,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"day","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":77,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"product_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":78,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"quantity","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":79,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"sales_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":80,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"store_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":81,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"discount_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":82,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"transaction_time","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":83,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"or
g.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"week_no","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":84,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":85,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount_match","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":86,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}]],"isStreaming":false}],"dataSource":null,"options":null,"mode":null}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.execute_save_into_data_source_command.silver_transactions","facets":{}},"inputs":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","facets":{"dataSource":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/DatasourceDatasetFacet.json#/$defs/DatasourceDatasetFacet","name":"wasbs://studio@clororetaildevadls.blob.core.windows.net","uri":"wasbs://studio@clororetaildevadls.blob.core.windows.net"},"schema":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/SchemaDatasetFacet.json#/$defs/SchemaDatasetFacet","fields":[{"name":"household_id","type":"integer"},{"name":"basket_id","type":"long"},{"name":"day","type":"integer"},{"name":"product_id","type":"integer"},{"name":"quantity","type":"integer"},{"name":"sales_amount","type":"float"},{"name":"store_id","type":"integer"},{"name":"discount_amount","type":"float"},{"name":"transaction_time","type":"integer"},{"name":"week_no","type":"integer"},{"name":"coupon_discount","type":"float"},{"name":"coupon_discount_match","type":"float"}]}},"inputFacets":{}}],"outputs":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/silver/transactions","facets":{"dataSource":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/DatasourceDatasetFacet.json#/$defs/DatasourceDatasetFacet","name":"wasbs://studio@clororetaildevadls.blob.core.windows.net","uri":"wasbs://studio@clororetaildevadls.blob.core.windows.net"},"schema":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.
io/spec/facets/1-0-0/SchemaDatasetFacet.json#/$defs/SchemaDatasetFacet","fields":[{"name":"household_id","type":"integer"},{"name":"basket_id","type":"long"},{"name":"day","type":"integer"},{"name":"product_id","type":"integer"},{"name":"quantity","type":"integer"},{"name":"sales_amount","type":"float"},{"name":"store_id","type":"integer"},{"name":"discount_amount","type":"float"},{"name":"transaction_time","type":"integer"},{"name":"week_no","type":"integer"},{"name":"coupon_discount","type":"float"},{"name":"coupon_discount_match","type":"float"}]},"columnLineage":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-1/ColumnLineageDatasetFacet.json#/$defs/ColumnLineageDatasetFacet","fields":{"household_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"household_id"}]},"basket_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"basket_id"}]},"day":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"day"}]},"product_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"product_id"}]},"quantity":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"quantity"}]},"sales_amount":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"sales_amount"}]},"store_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"store_id"}]},"discount_amount":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"discount_amount"}]},"transaction_time":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"transaction_time"}]},"week_no":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"week_no"}]},"coupon_discount":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"coupon_discount"}]},"coupon_discount_match":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"coupon_discount_match"}]}}},"lifecycleStateChange":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/LifecycleStateChangeDatasetFacet.json#/$defs/LifecycleStateChangeDatasetFacet","lifecycleStateChange":"OVERWRITE"}},"outputFacets":{}}]}
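The event above records the write that replaces the dropped table: a SaveIntoDataSourceCommand reading transaction_data.csv and overwriting the Delta location silver/transactions, with per-column lineage mapping each output field back to the same-named CSV field. A hypothetical reconstruction in PySpark; the paths, schema, and overwrite mode are taken from the event, while the CSV reader options are assumptions, since the event does not record them:

# Hypothetical reconstruction of the recorded job (lifecycleStateChange: OVERWRITE).
df = (spark.read
      .schema("household_id INT, basket_id BIGINT, day INT, product_id INT, "
              "quantity INT, sales_amount FLOAT, store_id INT, discount_amount FLOAT, "
              "transaction_time INT, week_no INT, coupon_discount FLOAT, "
              "coupon_discount_match FLOAT")
      .option("header", "true")  # assumption: not recorded in the event
      .csv("wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/transaction_data.csv"))
(df.write
   .format("delta")
   .mode("overwrite")
   .save("wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions"))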
23/09/22 03:13:55 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:55 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:55 INFO SnapshotEdge: [tableId=92982fea-9dbe-4e68-848c-022fb5257783] Created snapshot SnapshotEdge(path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log, version=16, metadata=Metadata(e409515a-4f0e-4b35-908c-3a8c6591a14f,null,null,Format(parquet,Map()),{"type":"struct","fields":[{"name":"household_id","type":"integer","nullable":true,"metadata":{}},{"name":"basket_id","type":"long","nullable":true,"metadata":{}},{"name":"day","type":"integer","nullable":true,"metadata":{}},{"name":"product_id","type":"integer","nullable":true,"metadata":{}},{"name":"quantity","type":"integer","nullable":true,"metadata":{}},{"name":"sales_amount","type":"float","nullable":true,"metadata":{}},{"name":"store_id","type":"integer","nullable":true,"metadata":{}},{"name":"discount_amount","type":"float","nullable":true,"metadata":{}},{"name":"transaction_time","type":"integer","nullable":true,"metadata":{}},{"name":"week_no","type":"integer","nullable":true,"metadata":{}},{"name":"coupon_discount","type":"float","nullable":true,"metadata":{}},{"name":"coupon_discount_match","type":"float","nullable":true,"metadata":{}}]},List(),Map(),Some(1694676659851)), logSegment=LogSegment(wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log,16,WrappedArray(FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000011.json; isDirectory=false; length=6616; replication=1; blocksize=536870912; modification_time=1695274264000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000012.json; isDirectory=false; length=11239; replication=1; blocksize=536870912; modification_time=1695274677000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000013.json; isDirectory=false; length=8080; replication=1; blocksize=536870912; modification_time=1695276655000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000014.json; isDirectory=false; length=6616; replication=1; blocksize=536870912; modification_time=1695346578000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000015.json; isDirectory=false; length=6616; replication=1; blocksize=536870912; modification_time=1695347164000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}, 
FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000016.json; isDirectory=false; length=6616; replication=1; blocksize=536870912; modification_time=1695351300000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}),WrappedArray(FileStatus{path=wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/00000000000000000010.checkpoint.parquet; isDirectory=false; length=34444; replication=1; blocksize=536870912; modification_time=1695273438000; access_time=0; owner=root; group=supergroup; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false}),Some(10),1695351300000), checksumOpt=Some(VersionChecksum(20222849,4,1,1,Protocol(1,2),Metadata(e409515a-4f0e-4b35-908c-3a8c6591a14f,null,null,Format(parquet,Map()),{"type":"struct","fields":[{"name":"household_id","type":"integer","nullable":true,"metadata":{}},{"name":"basket_id","type":"long","nullable":true,"metadata":{}},{"name":"day","type":"integer","nullable":true,"metadata":{}},{"name":"product_id","type":"integer","nullable":true,"metadata":{}},{"name":"quantity","type":"integer","nullable":true,"metadata":{}},{"name":"sales_amount","type":"float","nullable":true,"metadata":{}},{"name":"store_id","type":"integer","nullable":true,"metadata":{}},{"name":"discount_amount","type":"float","nullable":true,"metadata":{}},{"name":"transaction_time","type":"integer","nullable":true,"metadata":{}},{"name":"week_no","type":"integer","nullable":true,"metadata":{}},{"name":"coupon_discount","type":"float","nullable":true,"metadata":{}},{"name":"coupon_discount_match","type":"float","nullable":true,"metadata":{}}]},List(),Map(),Some(1694676659851)),Some(FileSizeHistogram(Vector(0, 8192, 16384, 32768, 65536, 131072, 262144, 524288, 1048576, 2097152, 4194304, 8388608, 12582912, 16777216, 20971520, 25165824, 29360128, 33554432, 37748736, 41943040, 50331648, 58720256, 67108864, 75497472, 83886080, 92274688, 100663296, 109051904, 117440512, 125829120, 130023424, 134217728, 138412032, 142606336, 146800640, 150994944, 167772160, 184549376, 201326592, 218103808, 234881024, 251658240, 268435456, 285212672, 301989888, 318767104, 335544320, 352321536, 369098752, 385875968, 402653184, 419430400, 436207616, 452984832, 469762048, 486539264, 503316480, 520093696, 536870912, 553648128, 570425344, 587202560, 603979776, 671088640, 738197504, 805306368, 872415232, 939524096, 1006632960, 1073741824, 1140850688, 1207959552, 1275068416, 1342177280, 1409286144, 1476395008, 1610612736, 1744830464, 1879048192, 2013265920, 2147483648, 2415919104, 2684354560, 2952790016, 3221225472, 3489660928, 3758096384, 4026531840, 4294967296, 8589934592, 17179869184, 34359738368, 68719476736, 137438953472, 
274877906944),[J@4a157934,[J@aa60254)),Some(b0819991-eddc-4afd-bd64-1591bc13547f),Some(List(AddFile(part-00000-dac72f33-722d-4e3f-9497-6046eeadaf78-c000.snappy.parquet,Map(),5283951,1695351297000,false,{"numRecords":672132,"minValues":{"household_id":1,"basket_id":26984851472,"day":1,"product_id":25671,"quantity":0,"sales_amount":0.0,"store_id":1,"discount_amount":-79.36,"transaction_time":0,"week_no":1,"coupon_discount":-29.99,"coupon_discount_match":-2.7},"maxValues":{"household_id":2500,"basket_id":30532627350,"day":240,"product_id":12949845,"quantity":85055,"sales_amount":505.0,"store_id":32124,"discount_amount":0.0,"transaction_time":2359,"week_no":35,"coupon_discount":0.0,"coupon_discount_match":0.0},"nullCount":{"household_id":0,"basket_id":0,"day":0,"product_id":0,"quantity":0,"sales_amount":0,"store_id":0,"discount_amount":0,"transaction_time":0,"week_no":0,"coupon_discount":0,"coupon_discount_match":0}},Map(INSERTION_TIME -> 1695351296000000, MIN_INSERTION_TIME -> 1695351296000000, MAX_INSERTION_TIME -> 1695351296000000, OPTIMIZE_TARGET_SIZE -> 268435456),null), AddFile(part-00003-8ff9238c-f34e-4d10-b70e-fcccd74a1e6d-c000.snappy.parquet,Map(),4537572,1695351296000,false,{"numRecords":587632,"minValues":{"household_id":1,"basket_id":40314850434,"day":568,"product_id":27160,"quantity":0,"sales_amount":0.0,"store_id":2,"discount_amount":-180.0,"transaction_time":0,"week_no":82,"coupon_discount":-31.46,"coupon_discount_match":-2.7},"maxValues":{"household_id":2500,"basket_id":42305362535,"day":711,"product_id":18316298,"quantity":45475,"sales_amount":631.8,"store_id":34280,"discount_amount":0.77,"transaction_time":2359,"week_no":102,"coupon_discount":0.0,"coupon_discount_match":0.0},"nullCount":{"household_id":0,"basket_id":0,"day":0,"product_id":0,"quantity":0,"sales_amount":0,"store_id":0,"discount_amount":0,"transaction_time":0,"week_no":0,"coupon_discount":0,"coupon_discount_match":0}},Map(INSERTION_TIME -> 1695351296000003, MIN_INSERTION_TIME -> 1695351296000003, MAX_INSERTION_TIME -> 1695351296000003, OPTIMIZE_TARGET_SIZE -> 268435456),null), AddFile(part-00002-618b4fff-77ad-4663-aaec-dbd5769515b1-c000.snappy.parquet,Map(),5238927,1695351296000,false,{"numRecords":667618,"minValues":{"household_id":1,"basket_id":32956680859,"day":401,"product_id":25671,"quantity":0,"sales_amount":0.0,"store_id":26,"discount_amount":-90.05,"transaction_time":0,"week_no":58,"coupon_discount":-37.93,"coupon_discount_match":-5.8},"maxValues":{"household_id":2500,"basket_id":40314850434,"day":568,"product_id":16809685,"quantity":89638,"sales_amount":329.99,"store_id":34016,"discount_amount":2.09,"transaction_time":2359,"week_no":82,"coupon_discount":0.0,"coupon_discount_match":0.0},"nullCount":{"household_id":0,"basket_id":0,"day":0,"product_id":0,"quantity":0,"sales_amount":0,"store_id":0,"discount_amount":0,"transaction_time":0,"week_no":0,"coupon_discount":0,"coupon_discount_match":0}},Map(INSERTION_TIME -> 1695351296000002, MIN_INSERTION_TIME -> 1695351296000002, MAX_INSERTION_TIME -> 1695351296000002, OPTIMIZE_TARGET_SIZE -> 268435456),null), 
AddFile(part-00001-894cd31a-620c-4f5e-9ea8-cb25d4193b6e-c000.snappy.parquet,Map(),5162399,1695351296000,false,{"numRecords":668350,"minValues":{"household_id":1,"basket_id":30532627350,"day":230,"product_id":25671,"quantity":0,"sales_amount":0.0,"store_id":2,"discount_amount":-129.98,"transaction_time":0,"week_no":34,"coupon_discount":-55.93,"coupon_discount_match":-7.7},"maxValues":{"household_id":2500,"basket_id":32956680859,"day":403,"product_id":14077546,"quantity":38348,"sales_amount":840.0,"store_id":33923,"discount_amount":3.99,"transaction_time":2359,"week_no":58,"coupon_discount":0.0,"coupon_discount_match":0.0},"nullCount":{"household_id":0,"basket_id":0,"day":0,"product_id":0,"quantity":0,"sales_amount":0,"store_id":0,"discount_amount":0,"transaction_time":0,"week_no":0,"coupon_discount":0,"coupon_discount_match":0}},Map(INSERTION_TIME -> 1695351296000001, MIN_INSERTION_TIME -> 1695351296000001, MAX_INSERTION_TIME -> 1695351296000001, OPTIMIZE_TARGET_SIZE -> 268435456),null))))))
23/09/22 03:13:55 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:55 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:56 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 398.7 KiB, free 3.3 GiB)
23/09/22 03:13:56 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 151.9 KiB, free 3.3 GiB)
23/09/22 03:13:56 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 18.1 KiB, free 3.3 GiB)
23/09/22 03:13:56 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.11.115.134:44293 (size: 18.1 KiB, free: 3.3 GiB)
23/09/22 03:13:56 INFO SparkContext: Created broadcast 1 from writeExternal at ObjectOutputStream.java:1459
23/09/22 03:13:56 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 13.8 KiB, free 3.3 GiB)
23/09/22 03:13:56 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.11.115.134:44293 (size: 13.8 KiB, free: 3.3 GiB)
23/09/22 03:13:56 INFO SparkContext: Created broadcast 0 from broadcast at Snapshot.scala:119
23/09/22 03:13:56 INFO DeltaLogFileIndex: Created DeltaLogFileIndex(Parquet, numFilesInSegment: 1, totalFileSize: 34444)
23/09/22 03:13:56 INFO DeltaLogFileIndex: Created DeltaLogFileIndex(JSON, numFilesInSegment: 6, totalFileSize: 45783)
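These two index sizes can be checked directly against the LogSegment above: the single checkpoint file is 34444 bytes, and the six commit files sum to 6616 + 11239 + 8080 + 6616 + 6616 + 6616 = 45783 bytes, matching totalFileSize for the JSON index.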
23/09/22 03:13:56 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:13:56 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:13:56 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 0.0, New Ema: 1.0 
23/09/22 03:13:58 INFO CodeGenerator: Code generated in 12.782887 ms
23/09/22 03:13:58 INFO CodeGenerator: Code generated in 28.714197 ms
23/09/22 03:13:58 INFO CodeGenerator: Code generated in 284.497949 ms
23/09/22 03:13:58 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 31.6 KiB, free 3.3 GiB)
23/09/22 03:13:58 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 14.1 KiB, free 3.3 GiB)
23/09/22 03:13:58 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 10.11.115.134:44293 (size: 14.1 KiB, free: 3.3 GiB)
23/09/22 03:13:58 INFO SparkContext: Created broadcast 2 from toRdd at StateCache.scala:61
23/09/22 03:13:58 INFO CodeGenerator: Code generated in 140.11686 ms
23/09/22 03:13:58 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 38.5 KiB, free 3.3 GiB)
23/09/22 03:13:58 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 12.2 KiB, free 3.3 GiB)
23/09/22 03:13:58 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on 10.11.115.134:44293 (size: 12.2 KiB, free: 3.3 GiB)
23/09/22 03:13:58 INFO SparkContext: Created broadcast 3 from toRdd at StateCache.scala:61
23/09/22 03:13:58 INFO FileSourceStrategy: Pushed Filters: 
23/09/22 03:13:58 INFO FileSourceStrategy: Post-Scan Filters: 
23/09/22 03:13:58 INFO FileSourceStrategy: Output Data Schema: struct<txn: struct<appId: string, version: bigint, lastUpdated: bigint ... 1 more fields>, add: struct<path: string, partitionValues: map<string,string>, size: bigint, modificationTime: bigint, dataChange: boolean ... 6 more fields>, remove: struct<path: string, deletionTimestamp: bigint, dataChange: boolean, extendedFileMetadata: boolean, partitionValues: map<string,string> ... 6 more fields>, metaData: struct<id: string, name: string, description: string, format: struct<provider: string, options: map<string,string>>, schemaString: string ... 6 more fields>, protocol: struct<minReaderVersion: int, minWriterVersion: int, readerFeatures: array<string>, writerFeatures: array<string> ... 2 more fields> ... 5 more fields>
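The scan schema above is the union of Delta's log action types (txn, add, remove, metaData, protocol, ...), which is what state reconstruction reads out of the checkpoint and commit files. A sketch of inspecting the same actions by hand, assuming read access to the _delta_log path; this is illustrative, not how DeltaLog itself is invoked:

# Sketch: read the commit JSON directly and project a few action columns.
log = spark.read.json("wasbs://studio@clororetaildevadls.blob.core.windows.net/examples/data/csv/completejourney/silver/transactions/_delta_log/*.json")
log.select("add.path", "add.size", "metaData.schemaString").show(truncate=False)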
23/09/22 03:13:59 INFO NativeAzureFileSystem: WASB Filesystem wasbs://studio@clororetaildevadls.blob.core.windows.net is closed with isClosed = false
23/09/22 03:13:59 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:13:59 INFO CodeGenerator: Code generated in 112.834173 ms
23/09/22 03:13:59 INFO MemoryStore: Block broadcast_4 stored as values in memory (estimated size 491.3 KiB, free 3.3 GiB)
23/09/22 03:13:59 INFO MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 17.7 KiB, free 3.3 GiB)
23/09/22 03:13:59 INFO BlockManagerInfo: Added broadcast_4_piece0 in memory on 10.11.115.134:44293 (size: 17.7 KiB, free: 3.3 GiB)
23/09/22 03:13:59 INFO SparkContext: Created broadcast 4 from $anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604
23/09/22 03:13:59 INFO FileSourceScanExec: Planning scan with bin packing, max split size: 134217728 bytes, max partition size: 4194304, open cost is considered as scanning 4194304 bytes.
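[editor's note: the split size reported above follows the open-source Spark calculation in FilePartition.maxSplitBytes. A minimal sketch of that formula, assuming the stock spark.sql.files.* defaults rather than any Databricks-specific overrides; the file sizes and parallelism below are hypothetical:]

    // Sketch of the OSS Spark bin-packing split-size calculation
    // (org.apache.spark.sql.execution.datasources.FilePartition.maxSplitBytes).
    // 134217728 = 128 MiB default for spark.sql.files.maxPartitionBytes;
    // 4194304 = 4 MiB default for spark.sql.files.openCostInBytes, matching
    // the "open cost" in the log line above.
    val defaultMaxSplitBytes = 134217728L      // spark.sql.files.maxPartitionBytes
    val openCostInBytes      = 4194304L        // spark.sql.files.openCostInBytes
    val minPartitionNum      = 8               // assumed: cluster default parallelism
    val fileSizes            = Seq(150000000L) // hypothetical selected-file sizes in bytes
    val totalBytes           = fileSizes.map(_ + openCostInBytes).sum
    val bytesPerCore         = totalBytes / minPartitionNum
    val maxSplitBytes        = math.min(defaultMaxSplitBytes,
                                        math.max(openCostInBytes, bytesPerCore))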
23/09/22 03:13:59 INFO CodeGenerator: Code generated in 50.220344 ms
23/09/22 03:14:00 INFO DAGScheduler: Registering RDD 6 ($anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604) as input to shuffle 0
23/09/22 03:14:00 INFO DAGScheduler: Got map stage job 0 ($anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604) with 5 output partitions
23/09/22 03:14:00 INFO DAGScheduler: Final stage: ShuffleMapStage 0 ($anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604)
23/09/22 03:14:00 INFO DAGScheduler: Parents of final stage: List()
23/09/22 03:14:00 INFO DAGScheduler: Missing parents: List()
23/09/22 03:14:00 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:14:00 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:14:00 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[6] at $anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604), which has no missing parents
23/09/22 03:14:00 INFO DAGScheduler: Jars for session None: Map()
23/09/22 03:14:00 INFO DAGScheduler: Files for session None: Map()
23/09/22 03:14:00 INFO DAGScheduler: Archives for session None: Map()
23/09/22 03:14:00 INFO DAGScheduler: Submitting 5 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[6] at $anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
23/09/22 03:14:00 INFO TaskSchedulerImpl: Adding task set 0.0 with 5 tasks resource profile 0
23/09/22 03:14:00 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:14:00.237Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"START","run":{"runId":"09b465e3-ef2c-452a-be68-6bcb8d01fe80","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand","num-children":0,"query":[{"class":"org.apache.spark.sql.execution.datasources.LogicalRelation","num-children":0,"relation":null,"output":[[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"household_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":75,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"basket_id","dataType":"long","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":76,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"day","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":77,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"product_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":78,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"quantity","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":79,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"sales_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":80,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"store_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":81,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"discount_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":82,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"transaction_time","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":83,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"or
g.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"week_no","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":84,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":85,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount_match","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":86,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}]],"isStreaming":false}],"dataSource":null,"options":null,"mode":null}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"spark_properties":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","properties":{"spark.master":"spark://10.11.115.134:7077","spark.app.name":"Databricks Shell"}},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"},"environment-properties":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","environment-properties":{"spark.databricks.clusterUsageTags.clusterName":"jason.yip@tredence.com's Cluster","spark.databricks.clusterUsageTags.azureSubscriptionId":"a4f54399-8db8-4849-adcc-a42aed1fb97f","spark.databricks.notebook.path":"/Repos/jason.yip@tredence.com/segmentation/01_Data Prep","mountPoints":[{"mountPoint":"/databricks-datasets","source":"databricks-datasets"},{"mountPoint":"/Volumes","source":"UnityCatalogVolumes"},{"mountPoint":"/databricks/mlflow-tracking","source":"databricks/mlflow-tracking"},{"mountPoint":"/databricks-results","source":"databricks-results"},{"mountPoint":"/databricks/mlflow-registry","source":"databricks/mlflow-registry"},{"mountPoint":"/Volume","source":"DbfsReserved"},{"mountPoint":"/volumes","source":"DbfsReserved"},{"mountPoint":"/","source":"DatabricksRoot"},{"mountPoint":"/volume","source":"DbfsReserved"}],"spark.databricks.clusterUsageTags.clusterAllTags":"[{\"key\":\"Vendor\",\"value\":\"Databricks\"},{\"key\":\"Creator\",\"value\":\"jason.yip@tredence.com\"},{\"key\":\"ClusterName\",\"value\":\"jason.yip@tredence.com's 
Cluster\"},{\"key\":\"ClusterId\",\"value\":\"0808-055325-43kdx9a4\"},{\"key\":\"Environment\",\"value\":\"POC\"},{\"key\":\"Project\",\"value\":\"SI\"},{\"key\":\"DatabricksEnvironment\",\"value\":\"workerenv-4679476628690204\"}]","spark.databricks.clusterUsageTags.clusterOwnerOrgId":"4679476628690204","user":"jason.yip@tredence.com","userId":"4768657035718622","orgId":"4679476628690204"}}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.execute_save_into_data_source_command.silver_transactions","facets":{}},"inputs":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","facets":{"dataSource":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/DatasourceDatasetFacet.json#/$defs/DatasourceDatasetFacet","name":"wasbs://studio@clororetaildevadls.blob.core.windows.net","uri":"wasbs://studio@clororetaildevadls.blob.core.windows.net"},"schema":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/SchemaDatasetFacet.json#/$defs/SchemaDatasetFacet","fields":[{"name":"household_id","type":"integer"},{"name":"basket_id","type":"long"},{"name":"day","type":"integer"},{"name":"product_id","type":"integer"},{"name":"quantity","type":"integer"},{"name":"sales_amount","type":"float"},{"name":"store_id","type":"integer"},{"name":"discount_amount","type":"float"},{"name":"transaction_time","type":"integer"},{"name":"week_no","type":"integer"},{"name":"coupon_discount","type":"float"},{"name":"coupon_discount_match","type":"float"}]}},"inputFacets":{}}],"outputs":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/silver/transactions","facets":{"dataSource":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/DatasourceDatasetFacet.json#/$defs/DatasourceDatasetFacet","name":"wasbs://studio@clororetaildevadls.blob.core.windows.net","uri":"wasbs://studio@clororetaildevadls.blob.core.windows.net"},"schema":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/SchemaDatasetFacet.json#/$defs/SchemaDatasetFacet","fields":[{"name":"household_id","type":"integer"},{"name":"basket_id","type":"long"},{"name":"day","type":"integer"},{"name":"product_id","type":"integer"},{"name":"quantity","type":"integer"},{"name":"sales_amount","type":"float"},{"name":"store_id","type":"integer"},{"name":"discount_amount","type":"float"},{"name":"transaction_time","type":"integer"},{"name":"week_no","type":"integer"},{"name":"coupon_discount","type":"float"},{"name":"coupon_discount_match","type":"float"}]},"columnLineage":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-1/ColumnLineageDatasetFacet.json#/$defs/ColumnLineageDatasetFacet","fields":{"household_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"household_id"}]},"basket_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv",
"field":"basket_id"}]},"day":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"day"}]},"product_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"product_id"}]},"quantity":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"quantity"}]},"sales_amount":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"sales_amount"}]},"store_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"store_id"}]},"discount_amount":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"discount_amount"}]},"transaction_time":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"transaction_time"}]},"week_no":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"week_no"}]},"coupon_discount":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"coupon_discount"}]},"coupon_discount_match":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"coupon_discount_match"}]}}},"lifecycleStateChange":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/LifecycleStateChangeDatasetFacet.json#/$defs/LifecycleStateChangeDatasetFacet","lifecycleStateChange":"OVERWRITE"}},"outputFacets":{}}]}
23/09/22 03:14:00 WARN FairSchedulableBuilder: A job was submitted with scheduler pool 2908305457167067998, which has not been configured. This can happen when the file that pools are read from isn't set, or when that file doesn't contain 2908305457167067998. Created 2908305457167067998 with default configuration (schedulingMode: FIFO, minShare: 0, weight: 1)
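[editor's note: this WARN means a job asked for scheduler pool 2908305457167067998 that no allocation file defines, so Spark created it on the fly with FIFO defaults. A sketch of how a pool is normally selected and declared; `sc` is assumed to be the active SparkContext:]

    // Selecting a pool for the jobs submitted from this thread:
    sc.setLocalProperty("spark.scheduler.pool", "2908305457167067998")

    // To avoid the WARN, the pool must appear in the allocation file named by
    // spark.scheduler.allocation.file, e.g. a fairscheduler.xml like:
    //   <allocations>
    //     <pool name="2908305457167067998">
    //       <schedulingMode>FAIR</schedulingMode>
    //       <minShare>0</minShare>
    //       <weight>1</weight>
    //     </pool>
    //   </allocations>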
23/09/22 03:14:00 INFO FairSchedulableBuilder: Added task set TaskSet_0.0 tasks to pool 2908305457167067998
23/09/22 03:14:00 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0) (10.11.115.133, executor 0, partition 0, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:00 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1) (10.11.115.133, executor 0, partition 1, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:00 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2) (10.11.115.133, executor 0, partition 2, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:00 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3) (10.11.115.133, executor 0, partition 3, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:00 INFO MemoryStore: Block broadcast_5 stored as values in memory (estimated size 275.9 KiB, free 3.3 GiB)
23/09/22 03:14:00 INFO MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 75.6 KiB, free 3.3 GiB)
23/09/22 03:14:00 INFO BlockManagerInfo: Added broadcast_5_piece0 in memory on 10.11.115.134:44293 (size: 75.6 KiB, free: 3.3 GiB)
23/09/22 03:14:00 INFO SparkContext: Created broadcast 5 from broadcast at TaskSetManager.scala:622
23/09/22 03:14:01 INFO BlockManagerInfo: Added broadcast_5_piece0 in memory on 10.11.115.133:45037 (size: 75.6 KiB, free: 3.6 GiB)
23/09/22 03:14:01 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.11.115.133:45037 (size: 18.1 KiB, free: 3.6 GiB)
23/09/22 03:14:02 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:14:02 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4) (10.11.115.133, executor 0, partition 4, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:02 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 2074 ms on 10.11.115.133 (executor 0) (1/5)
23/09/22 03:14:02 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 2080 ms on 10.11.115.133 (executor 0) (2/5)
23/09/22 03:14:02 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 2081 ms on 10.11.115.133 (executor 0) (3/5)
23/09/22 03:14:02 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 37 ms on 10.11.115.133 (executor 0) (4/5)
23/09/22 03:14:02 INFO BlockManagerInfo: Added broadcast_4_piece0 in memory on 10.11.115.133:45037 (size: 17.7 KiB, free: 3.6 GiB)
23/09/22 03:14:05 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:14:08 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:14:08 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.11.115.133:45037 (size: 13.8 KiB, free: 3.6 GiB)
23/09/22 03:14:09 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 8252 ms on 10.11.115.133 (executor 0) (5/5)
23/09/22 03:14:09 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 2908305457167067998
23/09/22 03:14:09 INFO DAGScheduler: ShuffleMapStage 0 ($anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604) finished in 8.708 s
23/09/22 03:14:09 INFO DAGScheduler: looking for newly runnable stages
23/09/22 03:14:09 INFO DAGScheduler: running: Set()
23/09/22 03:14:09 INFO DAGScheduler: waiting: Set()
23/09/22 03:14:09 INFO DAGScheduler: failed: Set()
23/09/22 03:14:09 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:14:09 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:14:09 INFO CodeGenerator: Code generated in 104.675309 ms
23/09/22 03:14:09 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:14:09.058Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"COMPLETE","run":{"runId":"09b465e3-ef2c-452a-be68-6bcb8d01fe80","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand","num-children":0,"query":[{"class":"org.apache.spark.sql.execution.datasources.LogicalRelation","num-children":0,"relation":null,"output":[[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"household_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":75,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"basket_id","dataType":"long","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":76,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"day","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":77,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"product_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":78,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"quantity","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":79,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"sales_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":80,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"store_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":81,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"discount_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":82,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"transaction_time","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":83,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":
"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"week_no","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":84,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":85,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount_match","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":86,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}]],"isStreaming":false}],"dataSource":null,"options":null,"mode":null}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.execute_save_into_data_source_command.silver_transactions","facets":{}},"inputs":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","facets":{"dataSource":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/DatasourceDatasetFacet.json#/$defs/DatasourceDatasetFacet","name":"wasbs://studio@clororetaildevadls.blob.core.windows.net","uri":"wasbs://studio@clororetaildevadls.blob.core.windows.net"},"schema":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/SchemaDatasetFacet.json#/$defs/SchemaDatasetFacet","fields":[{"name":"household_id","type":"integer"},{"name":"basket_id","type":"long"},{"name":"day","type":"integer"},{"name":"product_id","type":"integer"},{"name":"quantity","type":"integer"},{"name":"sales_amount","type":"float"},{"name":"store_id","type":"integer"},{"name":"discount_amount","type":"float"},{"name":"transaction_time","type":"integer"},{"name":"week_no","type":"integer"},{"name":"coupon_discount","type":"float"},{"name":"coupon_discount_match","type":"float"}]}},"inputFacets":{}}],"outputs":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/silver/transactions","facets":{"dataSource":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/DatasourceDatasetFacet.json#/$defs/DatasourceDatasetFacet","name":"wasbs://studio@clororetaildevadls.blob.core.windows.net","uri":"wasbs://studio@clororetaildevadls.blob.core.windows.net"},"schema":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlinea
ge.io/spec/facets/1-0-0/SchemaDatasetFacet.json#/$defs/SchemaDatasetFacet","fields":[{"name":"household_id","type":"integer"},{"name":"basket_id","type":"long"},{"name":"day","type":"integer"},{"name":"product_id","type":"integer"},{"name":"quantity","type":"integer"},{"name":"sales_amount","type":"float"},{"name":"store_id","type":"integer"},{"name":"discount_amount","type":"float"},{"name":"transaction_time","type":"integer"},{"name":"week_no","type":"integer"},{"name":"coupon_discount","type":"float"},{"name":"coupon_discount_match","type":"float"}]},"columnLineage":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-1/ColumnLineageDatasetFacet.json#/$defs/ColumnLineageDatasetFacet","fields":{"household_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"household_id"}]},"basket_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"basket_id"}]},"day":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"day"}]},"product_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"product_id"}]},"quantity":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"quantity"}]},"sales_amount":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"sales_amount"}]},"store_id":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"store_id"}]},"discount_amount":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"discount_amount"}]},"transaction_time":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"transaction_time"}]},"week_no":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"week_no"}]},"coupon_discount":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"coupon_discount"}]},"coupon_discount_match":{"inputFields":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","field":"coupon_discount_match"}]}}},"lifecycleStateChange":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/LifecycleStateChangeDatasetFacet.json#/$defs/LifecycleStateChangeDatasetFacet","lifecycleStateChange":"OVERWRITE"}},"outputFacets":{"outputStatistics":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/Ou
tputStatisticsOutputDatasetFacet.json#/$defs/OutputStatisticsOutputDatasetFacet","rowCount":0,"size":0}}}]}
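[editor's note: these RunEvents are printed to the driver log by the OpenLineage Spark integration's ConsoleTransport. A minimal sketch of the Spark conf that would produce them, assuming io.openlineage:openlineage-spark 1.2.2 is on the cluster classpath; the namespace value is copied from the events above:]

    // Enabling the OpenLineage 1.2.x listener with the console transport.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("Databricks Shell")
      .config("spark.extraListeners", "io.openlineage.spark.agent.OpenLineageSparkListener")
      .config("spark.openlineage.transport.type", "console") // -> ConsoleTransport
      .config("spark.openlineage.namespace", "adb-5445974573286168.8#default")
      .getOrCreate()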
23/09/22 03:14:09 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerSQLExecutionStart
23/09/22 03:14:09 INFO CodeGenerator: Code generated in 59.190001 ms
23/09/22 03:14:10 INFO DAGScheduler: Registering RDD 16 ($anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604) as input to shuffle 1
23/09/22 03:14:10 INFO DAGScheduler: Got map stage job 1 ($anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604) with 1 output partitions
23/09/22 03:14:10 INFO DAGScheduler: Final stage: ShuffleMapStage 2 ($anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604)
23/09/22 03:14:10 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 1)
23/09/22 03:14:10 INFO DAGScheduler: Missing parents: List()
23/09/22 03:14:10 INFO DAGScheduler: Submitting ShuffleMapStage 2 (MapPartitionsRDD[16] at $anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604), which has no missing parents
23/09/22 03:14:10 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerJobStart
23/09/22 03:14:10 INFO DAGScheduler: Jars for session None: Map()
23/09/22 03:14:10 INFO DAGScheduler: Files for session None: Map()
23/09/22 03:14:10 INFO DAGScheduler: Archives for session None: Map()
23/09/22 03:14:10 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 2 (MapPartitionsRDD[16] at $anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604) (first 15 tasks are for partitions Vector(0))
23/09/22 03:14:10 INFO TaskSchedulerImpl: Adding task set 2.0 with 1 tasks resource profile 0
23/09/22 03:14:10 INFO FairSchedulableBuilder: Added task set TaskSet_2.0 tasks to pool 2908305457167067998
23/09/22 03:14:10 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 5) (10.11.115.133, executor 0, partition 0, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:10 INFO MemoryStore: Block broadcast_6 stored as values in memory (estimated size 417.6 KiB, free 3.3 GiB)
23/09/22 03:14:10 INFO MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 113.2 KiB, free 3.3 GiB)
23/09/22 03:14:10 INFO BlockManagerInfo: Added broadcast_6_piece0 in memory on 10.11.115.134:44293 (size: 113.2 KiB, free: 3.3 GiB)
23/09/22 03:14:10 INFO SparkContext: Created broadcast 6 from broadcast at TaskSetManager.scala:622
23/09/22 03:14:10 INFO BlockManagerInfo: Added broadcast_6_piece0 in memory on 10.11.115.133:45037 (size: 113.2 KiB, free: 3.6 GiB)
23/09/22 03:14:10 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 10.11.115.133:57974
23/09/22 03:14:11 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on 10.11.115.133:45037 (size: 12.2 KiB, free: 3.6 GiB)
23/09/22 03:14:11 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:14:11 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 10.11.115.133:45037 (size: 14.1 KiB, free: 3.6 GiB)
23/09/22 03:14:11 INFO BlockManagerInfo: Added rdd_13_0 in memory on 10.11.115.133:45037 (size: 5.0 KiB, free: 3.6 GiB)
23/09/22 03:14:12 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 5) in 2488 ms on 10.11.115.133 (executor 0) (1/1)
23/09/22 03:14:12 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool 2908305457167067998
23/09/22 03:14:12 INFO DAGScheduler: ShuffleMapStage 2 ($anonfun$withThreadLocalCaptured$1 at CompletableFuture.java:1604) finished in 2.872 s
23/09/22 03:14:12 INFO DAGScheduler: looking for newly runnable stages
23/09/22 03:14:12 INFO DAGScheduler: running: Set()
23/09/22 03:14:12 INFO DAGScheduler: waiting: Set()
23/09/22 03:14:12 INFO DAGScheduler: failed: Set()
23/09/22 03:14:12 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerJobEnd
23/09/22 03:14:12 INFO SparkContext: Starting job: first at Snapshot.scala:238
23/09/22 03:14:12 INFO DAGScheduler: Got job 2 (first at Snapshot.scala:238) with 1 output partitions
23/09/22 03:14:12 INFO DAGScheduler: Final stage: ResultStage 5 (first at Snapshot.scala:238)
23/09/22 03:14:12 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 4)
23/09/22 03:14:12 INFO DAGScheduler: Missing parents: List()
23/09/22 03:14:12 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerJobStart
23/09/22 03:14:12 INFO DAGScheduler: Submitting ResultStage 5 (MapPartitionsRDD[18] at first at Snapshot.scala:238), which has no missing parents
23/09/22 03:14:13 INFO DAGScheduler: Jars for session None: Map()
23/09/22 03:14:13 INFO DAGScheduler: Files for session None: Map()
23/09/22 03:14:13 INFO DAGScheduler: Archives for session None: Map()
23/09/22 03:14:13 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 5 (MapPartitionsRDD[18] at first at Snapshot.scala:238) (first 15 tasks are for partitions Vector(0))
23/09/22 03:14:13 INFO TaskSchedulerImpl: Adding task set 5.0 with 1 tasks resource profile 0
23/09/22 03:14:13 INFO FairSchedulableBuilder: Added task set TaskSet_5.0 tasks to pool 2908305457167067998
23/09/22 03:14:13 INFO TaskSetManager: Starting task 0.0 in stage 5.0 (TID 6) (10.11.115.133, executor 0, partition 0, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:13 INFO MemoryStore: Block broadcast_7 stored as values in memory (estimated size 363.2 KiB, free 3.3 GiB)
23/09/22 03:14:13 INFO MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 104.0 KiB, free 3.3 GiB)
23/09/22 03:14:13 INFO BlockManagerInfo: Added broadcast_7_piece0 in memory on 10.11.115.134:44293 (size: 104.0 KiB, free: 3.3 GiB)
23/09/22 03:14:13 INFO SparkContext: Created broadcast 7 from broadcast at TaskSetManager.scala:622
23/09/22 03:14:13 INFO BlockManagerInfo: Added broadcast_7_piece0 in memory on 10.11.115.133:45037 (size: 104.0 KiB, free: 3.6 GiB)
23/09/22 03:14:13 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 1 to 10.11.115.133:57974
23/09/22 03:14:14 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:14:15 INFO TaskSetManager: Finished task 0.0 in stage 5.0 (TID 6) in 2487 ms on 10.11.115.133 (executor 0) (1/1)
23/09/22 03:14:15 INFO TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool 2908305457167067998
23/09/22 03:14:15 INFO DAGScheduler: ResultStage 5 (first at Snapshot.scala:238) finished in 2.554 s
23/09/22 03:14:15 INFO DAGScheduler: Job 2 is finished. Cancelling potential speculative or zombie tasks for this job
23/09/22 03:14:15 INFO TaskSchedulerImpl: Killing all running tasks in stage 5: Stage finished
23/09/22 03:14:15 INFO DAGScheduler: Job 2 finished: first at Snapshot.scala:238, took 2.576761 s
23/09/22 03:14:15 INFO CodeGenerator: Code generated in 44.3349 ms
23/09/22 03:14:15 INFO SparkSQLExecutionContext: OpenLineage received Spark event that is configured to be skipped: SparkListenerSQLExecutionEnd
23/09/22 03:14:15 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:14:15 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:14:15 INFO FileSourceStrategy: Pushed Filters: 
23/09/22 03:14:15 INFO FileSourceStrategy: Post-Scan Filters: 
23/09/22 03:14:15 INFO FileSourceStrategy: Output Data Schema: struct<household_id: int, basket_id: bigint, day: int, product_id: int, quantity: int ... 10 more fields>
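[editor's note: the truncated struct above is the transaction scan schema. For reference, the same schema written out as an explicit Scala StructType, with field names and types taken from the schema facet in the lineage events:]

    // The logged scan schema as an explicit StructType.
    import org.apache.spark.sql.types._

    val transactionSchema = StructType(Seq(
      StructField("household_id", IntegerType),
      StructField("basket_id", LongType),
      StructField("day", IntegerType),
      StructField("product_id", IntegerType),
      StructField("quantity", IntegerType),
      StructField("sales_amount", FloatType),
      StructField("store_id", IntegerType),
      StructField("discount_amount", FloatType),
      StructField("transaction_time", IntegerType),
      StructField("week_no", IntegerType),
      StructField("coupon_discount", FloatType),
      StructField("coupon_discount_match", FloatType)
    ))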
23/09/22 03:14:15 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:14:15 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:14:16 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:14:16 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:14:16 INFO DeltaParquetFileFormat: Using user defined output committer for Parquet: org.apache.spark.sql.parquet.DirectParquetOutputCommitter
23/09/22 03:14:16 INFO MemoryStore: Block broadcast_8 stored as values in memory (estimated size 405.9 KiB, free 3.3 GiB)
23/09/22 03:14:16 INFO MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 14.5 KiB, free 3.3 GiB)
23/09/22 03:14:16 INFO BlockManagerInfo: Added broadcast_8_piece0 in memory on 10.11.115.134:44293 (size: 14.5 KiB, free: 3.3 GiB)
23/09/22 03:14:16 INFO SparkContext: Created broadcast 8 from execute at DeltaInvariantCheckerExec.scala:74
23/09/22 03:14:16 INFO FileSourceScanExec: Planning scan with bin packing, max split size: 36484162 bytes, max partition size: 36484162, open cost is considered as scanning 4194304 bytes.
23/09/22 03:14:16 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:14:15.962Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"START","run":{"runId":"ee45de3c-b839-4347-92cc-89f766d073c3","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.execution.datasources.LogicalRelation","num-children":0,"relation":null,"output":[[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"household_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":75,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"basket_id","dataType":"long","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":76,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"day","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":77,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"product_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":78,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"quantity","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":79,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"sales_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":80,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"store_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":81,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"discount_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":82,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"transaction_time","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":83,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"week_no","dataType":"
integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":84,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":85,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount_match","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":86,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}]],"isStreaming":false}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.scan_csv ","facets":{}},"inputs":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","facets":{"dataSource":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/DatasourceDatasetFacet.json#/$defs/DatasourceDatasetFacet","name":"wasbs://studio@clororetaildevadls.blob.core.windows.net","uri":"wasbs://studio@clororetaildevadls.blob.core.windows.net"},"schema":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/SchemaDatasetFacet.json#/$defs/SchemaDatasetFacet","fields":[{"name":"household_id","type":"integer"},{"name":"basket_id","type":"long"},{"name":"day","type":"integer"},{"name":"product_id","type":"integer"},{"name":"quantity","type":"integer"},{"name":"sales_amount","type":"float"},{"name":"store_id","type":"integer"},{"name":"discount_amount","type":"float"},{"name":"transaction_time","type":"integer"},{"name":"week_no","type":"integer"},{"name":"coupon_discount","type":"float"},{"name":"coupon_discount_match","type":"float"}]}},"inputFacets":{}}],"outputs":[]}
23/09/22 03:14:16 INFO SparkContext: Starting job: write at WriteIntoDeltaCommand.scala:70
23/09/22 03:14:16 INFO DAGScheduler: Got job 3 (write at WriteIntoDeltaCommand.scala:70) with 4 output partitions
23/09/22 03:14:16 INFO DAGScheduler: Final stage: ResultStage 6 (write at WriteIntoDeltaCommand.scala:70)
23/09/22 03:14:16 INFO DAGScheduler: Parents of final stage: List()
23/09/22 03:14:16 INFO DAGScheduler: Missing parents: List()
23/09/22 03:14:16 INFO DAGScheduler: Submitting ResultStage 6 (MapPartitionsRDD[20] at execute at DeltaInvariantCheckerExec.scala:74), which has no missing parents
23/09/22 03:14:16 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
23/09/22 03:14:16 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
23/09/22 03:14:16 INFO DAGScheduler: Jars for session None: Map()
23/09/22 03:14:16 INFO DAGScheduler: Files for session None: Map()
23/09/22 03:14:16 INFO DAGScheduler: Archives for session None: Map()
23/09/22 03:14:16 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 6 (MapPartitionsRDD[20] at execute at DeltaInvariantCheckerExec.scala:74) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
23/09/22 03:14:16 INFO TaskSchedulerImpl: Adding task set 6.0 with 4 tasks resource profile 0
23/09/22 03:14:16 INFO FairSchedulableBuilder: Added task set TaskSet_6.0 tasks to pool 2908305457167067998
23/09/22 03:14:16 INFO TaskSetManager: Starting task 0.0 in stage 6.0 (TID 7) (10.11.115.133, executor 0, partition 0, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:16 INFO TaskSetManager: Starting task 1.0 in stage 6.0 (TID 8) (10.11.115.133, executor 0, partition 1, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:16 INFO TaskSetManager: Starting task 2.0 in stage 6.0 (TID 9) (10.11.115.133, executor 0, partition 2, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:16 INFO TaskSetManager: Starting task 3.0 in stage 6.0 (TID 10) (10.11.115.133, executor 0, partition 3, PROCESS_LOCAL, taskResourceAssignments Map())
23/09/22 03:14:16 INFO MemoryStore: Block broadcast_9 stored as values in memory (estimated size 234.0 KiB, free 3.3 GiB)
23/09/22 03:14:16 INFO MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 82.5 KiB, free 3.3 GiB)
23/09/22 03:14:16 INFO BlockManagerInfo: Added broadcast_9_piece0 in memory on 10.11.115.134:44293 (size: 82.5 KiB, free: 3.3 GiB)
23/09/22 03:14:16 INFO SparkContext: Created broadcast 9 from broadcast at TaskSetManager.scala:622
23/09/22 03:14:16 INFO BlockManagerInfo: Added broadcast_9_piece0 in memory on 10.11.115.133:45037 (size: 82.5 KiB, free: 3.6 GiB)
23/09/22 03:14:16 INFO ConsoleTransport: {"eventTime":"2023-09-22T03:14:16.29Z","producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunEvent","eventType":"START","run":{"runId":"ee45de3c-b839-4347-92cc-89f766d073c3","facets":{"spark.logicalPlan":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","plan":[{"class":"org.apache.spark.sql.execution.datasources.LogicalRelation","num-children":0,"relation":null,"output":[[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"household_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":75,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"basket_id","dataType":"long","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":76,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"day","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":77,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"product_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":78,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"quantity","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":79,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"sales_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":80,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"store_id","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":81,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"discount_amount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":82,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"transaction_time","dataType":"integer","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":83,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"week_no","dataType":"i
nteger","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":84,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":85,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}],[{"class":"org.apache.spark.sql.catalyst.expressions.AttributeReference","num-children":0,"name":"coupon_discount_match","dataType":"float","nullable":true,"metadata":{},"exprId":{"product-class":"org.apache.spark.sql.catalyst.expressions.ExprId","id":86,"jvmId":"cf1b65cb-72d4-4826-9aa0-ea3aa307592f"},"qualifier":[]}]],"isStreaming":false}]},"spark_version":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","spark-version":"3.3.0","openlineage-spark-version":"1.2.2"},"spark_properties":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","properties":{"spark.master":"spark://10.11.115.134:7077","spark.app.name":"Databricks Shell"}},"processing_engine":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-1-0/ProcessingEngineRunFacet.json#/$defs/ProcessingEngineRunFacet","version":"3.3.0","name":"spark","openlineageAdapterVersion":"1.2.2"},"environment-properties":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/2-0-2/OpenLineage.json#/$defs/RunFacet","environment-properties":{"spark.databricks.clusterUsageTags.clusterName":"jason.yip@tredence.com's Cluster","spark.databricks.clusterUsageTags.azureSubscriptionId":"a4f54399-8db8-4849-adcc-a42aed1fb97f","spark.databricks.notebook.path":"/Repos/jason.yip@tredence.com/segmentation/01_Data Prep","mountPoints":[{"mountPoint":"/databricks-datasets","source":"databricks-datasets"},{"mountPoint":"/Volumes","source":"UnityCatalogVolumes"},{"mountPoint":"/databricks/mlflow-tracking","source":"databricks/mlflow-tracking"},{"mountPoint":"/databricks-results","source":"databricks-results"},{"mountPoint":"/databricks/mlflow-registry","source":"databricks/mlflow-registry"},{"mountPoint":"/Volume","source":"DbfsReserved"},{"mountPoint":"/volumes","source":"DbfsReserved"},{"mountPoint":"/","source":"DatabricksRoot"},{"mountPoint":"/volume","source":"DbfsReserved"}],"spark.databricks.clusterUsageTags.clusterAllTags":"[{\"key\":\"Vendor\",\"value\":\"Databricks\"},{\"key\":\"Creator\",\"value\":\"jason.yip@tredence.com\"},{\"key\":\"ClusterName\",\"value\":\"jason.yip@tredence.com's Cluster\"},{\"key\":\"ClusterId\",\"value\":\"0808-055325-43kdx9a4\"},{\"key\":\"Environment\",\"value\":\"POC\"},{\"key\":\"Project\",\"value\":\"SI\"},{\"key\":\"DatabricksEnvironment\",\"value\":\"workerenv-4679476628690204\"}]","spark.databricks.clusterUsageTags.clusterOwnerOrgId":"4679476628690204","user":"jason.yip@tredence.com","userId":"4768657035718622","orgId":"4679476628690204"}}}},"job":{"namespace":"adb-5445974573286168.8#default","name":"adb-4679476628690204.4.azuredatabricks.net.scan_csv 
","facets":{}},"inputs":[{"namespace":"wasbs://studio@clororetaildevadls.blob.core.windows.net","name":"/examples/data/csv/completejourney/transaction_data.csv","facets":{"dataSource":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/DatasourceDatasetFacet.json#/$defs/DatasourceDatasetFacet","name":"wasbs://studio@clororetaildevadls.blob.core.windows.net","uri":"wasbs://studio@clororetaildevadls.blob.core.windows.net"},"schema":{"_producer":"https://github.com/OpenLineage/OpenLineage/tree/1.2.2/integration/spark","_schemaURL":"https://openlineage.io/spec/facets/1-0-0/SchemaDatasetFacet.json#/$defs/SchemaDatasetFacet","fields":[{"name":"household_id","type":"integer"},{"name":"basket_id","type":"long"},{"name":"day","type":"integer"},{"name":"product_id","type":"integer"},{"name":"quantity","type":"integer"},{"name":"sales_amount","type":"float"},{"name":"store_id","type":"integer"},{"name":"discount_amount","type":"float"},{"name":"transaction_time","type":"integer"},{"name":"week_no","type":"integer"},{"name":"coupon_discount","type":"float"},{"name":"coupon_discount_match","type":"float"}]}},"inputFacets":{}}],"outputs":[]}
23/09/22 03:14:16 INFO BlockManagerInfo: Added broadcast_8_piece0 in memory on 10.11.115.133:45037 (size: 14.5 KiB, free: 3.6 GiB)
23/09/22 03:14:17 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:14:20 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:14:23 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0 
23/09/22 03:14:24 INFO TaskSetManager: Finished task 3.0 in stage 6.0 (TID 10) in 8029 ms on 10.11.115.133 (executor 0) (1/4)
23/09/22 03:14:24 INFO TaskSetManager: Finished task 1.0 in stage 6.0 (TID 8) in 8458 ms on 10.11.115.133 (executor 0) (2/4)